Tesla delivers ‘Full Self-Driving’ beta version 9

Tesla began sending out over-the-air updates for beta version 9 of its long-awaited "Full Self-Driving" software, the definitely-not-autonomous-but-certainly-advanced driver-assist system.

As promised by Elon Musk, the software update (2021.4.18.12) began rolling out after midnight on Friday, giving thousands of Tesla owners who have purchased the FSD option access to the feature, which lets drivers use many of Autopilot's advanced driver-assist features on local, non-highway streets.

Musk has been promising v9 of the software for, well, a while now. He said in 2018 that the "long awaited" version of FSD would begin rolling out that August. He made a similar promise in 2019, proclaiming that "a year from now" there would be "over a million cars with full self-driving, software, everything." Earlier this month, he claimed that "FSD 9 beta is shipping soon." So to say that Tesla fans have been anticipating this update would be an understatement.

The real question is whether it's ready for primetime. To that, Musk gave a typically muddled response, tweeting that "Beta 9 addresses most known issues, but there will be unknown issues, so please be paranoid." He added, "Safety is always top priority at Tesla." Release notes included with the update warn testers that "it may do the wrong thing at the worst time" and to avoid complacency. They also mention improvements to the cabin camera's driver monitoring to check for attentiveness, along with updated, larger visualizations on the in-car display.

There's no question that Tesla is more willing than its competitors to test beta versions of its Autopilot driver-assist feature on its customers in the interest of gathering data and working out any bugs in the system. And Tesla customers are mostly fine with this, routinely flooding Musk's mentions begging to be admitted into the company's Early Access Program for beta testers. This has helped cement Tesla's public reputation as a leader in autonomous driving, despite its vehicles consistently falling short of what most experts would agree defines a self-driving car.

Tesla warns that drivers need to keep their eyes on the road and hands on the wheel at all times, though the automaker famously refuses to include a more robust driver-monitoring system (like infrared eye tracking, for example) to ensure its customers are following safety protocols (although maybe that's changing). Autopilot is considered a Level 2 "partially automated" system under the Society of Automotive Engineers' standards (and by Tesla's lawyers), meaning the driver must stay engaged and ready to take over at all times.

However, consumer advocates have shown that Tesla's system can easily be tricked into thinking someone is in the driver's seat. That weakness gained renewed attention in the aftermath of a fatal crash in Texas involving a Tesla, in which authorities said there was no one behind the steering wheel.

But this hasn't stopped some Tesla owners from abusing Autopilot, sometimes going so far as to film and publicize the results. Drivers have been caught sleeping in the passenger seat or backseat of their Teslas while the vehicle sped down a crowded highway. A Canadian man was charged with reckless driving last year after being pulled over for sleeping while traveling at 93 mph.

Since Tesla introduced Autopilot in 2015, there have been at least 11 deaths in nine crashes in the US that involved the driver assistance system. Internationally, there have been at least another nine deaths in seven additional crashes.

Meanwhile, the US government now requires car companies to report crashes involving autonomous vehicles or advanced driver-assist systems, often within a day of the incident. The requirement is a major change that signals a tougher stance on these partially automated systems by regulators.