Just under a year and a half ago, I received a video of a Tesla that had been spotted with a LiDAR array in Leuven, Belgium. Until recently it was unclear who was behind it, but now I have learned the name of the company.
IVEX.ai tests the safety of driver-assistance and autonomous-driving systems with its own test platform, called Carvex. It uses vehicles such as a Tesla Model 3 to collect data, in this case to examine how Tesla Autopilot reacts. The company published the results of such tests on its blog a few days ago.
In total, IVEX.ai collected 40 gigabytes of driving data from over 15,000 kilometers of driving. A special focus was on hard braking events, which Tesla owners repeatedly report: their cars brake heavily at high speeds for no apparent reason, which is why many call it phantom or ghost braking. IVEX defines hard braking as a deceleration of 4.0 m/s² (13.12 ft/s²) or more.
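As a rough illustration of that threshold, here is a minimal sketch (not IVEX's actual pipeline; the function name, sampling rate, and data layout are all assumptions) that flags samples in a speed log where deceleration meets the 4.0 m/s² criterion:

```python
# Illustrative sketch only: flags samples where deceleration between
# consecutive speed readings exceeds the 4.0 m/s^2 hard-braking
# threshold. The sampling setup is an assumption for illustration.
HARD_BRAKE_THRESHOLD = 4.0  # m/s^2

def hard_brake_indices(speeds_mps, dt=0.1):
    """Return indices where the deceleration between consecutive speed
    samples (in m/s, taken every dt seconds) meets the threshold."""
    events = []
    for i in range(1, len(speeds_mps)):
        decel = (speeds_mps[i - 1] - speeds_mps[i]) / dt
        if decel >= HARD_BRAKE_THRESHOLD:
            events.append(i)
    return events

# Speed drops from 30 m/s to 29.5 m/s in 0.1 s -> 5.0 m/s^2 deceleration
print(hard_brake_indices([30.0, 29.5, 29.4], dt=0.1))  # [1]
```

In a real dataset the speed signal would be noisy and need filtering before differencing, but the thresholding idea is the same.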
Observing such spontaneous braking events is interesting for two reasons. First, it could be that the driver disengages the autopilot after an emergency braking event, which would mean that the autopilot cannot handle a situation because it may be outside its operational design domain (ODD). Second, the emergency braking could be a false positive event (emergency braking in a situation where emergency braking is not warranted), which is dangerous because unexpected, false positive emergency braking could result in a rear-end collision.
The Carvex test system for the Tesla included RTK GNSS, LiDAR, radar, and cameras, as well as a camera mounted in the passenger compartment that filmed the Tesla dashboard. IVEX.ai presented three examples of emergency braking.
In the first example, a truck on the right side of the Tesla crosses the lane marking, causing the vehicle to brake hard.
In the second example, the Tesla brakes heavily before an intersection where the traffic light turns yellow. This is problematic because traffic behind the Tesla might not be prepared for such heavy braking. As it turned out, however, the driver took over from Autopilot and applied the brakes, because the Autopilot version in use did not recognize traffic lights.
In the third example, the vehicle brakes heavily before entering a traffic circle. A traffic circle does not seem to be within the operational design domain of the tested Autopilot version, which the average Tesla owner may interpret as phantom braking.
At least two conclusions can be drawn from these examples:
- In events that are not a part of the Tesla Autopilot ODD, the autopilot disengages, which can require the driver to brake sharply
- There are events that the autopilot perceives as risky, such as a potential obstacle cutting in, and responds to with hard braking, even when that perception does not correspond to actual events
More details about the test can be found on the IVEX.ai blog.
This article was also published in German.