On March 6, Elon Musk tweeted that a “Download Beta” on-screen button would be added to Tesla cars’ touchscreens “in 10 days.” While that button remains stuck in limbo, adrift on the flow of EMT (Elon’s Master Time), the Tesla CEO also disclosed that the next big software release for the automaker’s vehicles will arrive in April, so we can at least expect FSD beta availability to expand beyond today’s select group of owner-testers and a handful of employees.

The forthcoming “Download Beta” option for all is huge news. Customers who paid thousands of dollars for the FSD sensor hardware necessary to support fully automated driving (and the promise of later fully autonomous travel activated by software update) can now try out a skeletal outline of that capability in “beta,” or incomplete test, form. (The system can guide so-equipped Teslas along a navigation route, making lane changes, executing full right and left turns, and heeding traffic signals.) Those same eager Teslarati should take note: It’s not all hands-free, kick-back-and-let-the-car-do-the-work from here on out, and they can lose their FSD preview if they aren’t careful.

How? Well, Mr. Musk has revealed that some drivers had their FSD beta access removed because they were not paying sufficient attention to the road with the system engaged.

How did Tesla determine who was being naughty? Every Tesla equipped with the Level 2 Autopilot driver assist already detects force applied to the steering wheel and shuts Autopilot off if the driver fails to make an input at the wheel every so often (a de facto “check-in”). Recent models also have an in-cabin camera that can keep tabs on things; for a time, that camera was simply disabled. But Tesla has recently started using it for driver attention monitoring, an additional way to combat misuse of Autopilot, which is not intended to be a hands- and attention-free system. Again, it’s only a Level 2 semi-autonomous setup, capable of accelerating, braking, and steering in certain controlled environments such as freeways.
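For the curious, the steering-wheel “check-in” boils down to a timer that resets whenever enough torque is felt at the wheel. The sketch below is a minimal illustration under our own assumptions; the class name, thresholds, and timeout are invented for this example and do not reflect Tesla’s actual firmware.

```python
# Hypothetical sketch of a steering-wheel "check-in" timer like the one
# described above. Names and numbers are invented for illustration only.
import time

CHECK_IN_INTERVAL_S = 30.0   # assumed window within which a wheel input is required
TORQUE_THRESHOLD_NM = 0.5    # assumed minimum torque that counts as a "check-in"

class WheelCheckIn:
    def __init__(self):
        self.last_input_time = time.monotonic()

    def report_torque(self, torque_nm: float) -> None:
        """Called whenever the steering sensor reports applied torque."""
        if abs(torque_nm) >= TORQUE_THRESHOLD_NM:
            self.last_input_time = time.monotonic()

    def should_disengage(self) -> bool:
        """True if the driver has gone too long without touching the wheel."""
        return time.monotonic() - self.last_input_time > CHECK_IN_INTERVAL_S
```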

It seems that, so far, getting kicked out of FSD beta testing requires being caught by this in-car camera not watching the road ahead, on top of failing the usual steering-input tracking Tesla has long employed.

Tesla hacker “green” (@greentheonly) was able to gather footage from the cabin camera (which is mounted just above the rear-view mirror) and figured out what the computer is looking for when it monitors the driver. The system mostly tracks the driver’s head, eyes, and sunglasses. Interestingly, it also tries to detect “phone use,” keeping a virtual eye out for drivers holding and looking at a phone, a common cause of distracted driving. The percentage readouts in the video represent the system’s “confidence level”; Tesla famously leans on AI for image recognition, and here, the higher the percentage, the more likely it is that the driver is using a phone. The hacker also tried placing physical photo printouts (including a photo of Elon Musk) in various locations to trick the system, and yes, it can be tricked. It’s an interesting watch; take a look at the YouTube clip below:
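To picture what those percentage readouts could feed into, here’s a toy example of thresholding per-class confidence scores. The class names and the cutoff value are assumptions for this sketch, not Tesla’s actual model outputs.

```python
# Toy illustration of thresholding per-class confidence scores, like the
# percentages visible in green's footage. Class names and the cutoff are assumed.
PHONE_USE_THRESHOLD = 0.8  # assumed cutoff for flagging likely phone use

def flag_phone_use(confidences: dict) -> bool:
    """Return True when the camera model is fairly sure the driver is on a phone."""
    return confidences.get("phone_use", 0.0) >= PHONE_USE_THRESHOLD

# Example frame: high phone-use confidence; sunglasses detected but not a distraction flag.
print(flag_phone_use({"phone_use": 0.92, "sunglasses": 0.40, "eyes_visible": 0.88}))  # True
```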

Obviously, the system is still in its infancy. At this point, Tesla has three ways to gauge a driver’s attention level: steering force, the seat sensor, and the in-cabin camera.
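If you were to fuse those three signals, a conservative rule might look something like the sketch below. The all-or-nothing logic is purely our assumption; Tesla hasn’t published how it actually combines them.

```python
# Minimal sketch of combining the three attention signals listed above.
# The "any red flag counts against you" rule is an assumption for illustration.
from dataclasses import dataclass

@dataclass
class AttentionInputs:
    recent_wheel_torque: bool  # steering force seen within the check-in window
    driver_in_seat: bool       # seat (occupancy) sensor
    camera_flags: tuple        # e.g. ("phone_use",) from the cabin camera

def attention_ok(inputs: AttentionInputs) -> bool:
    """Conservative rule: any missing signal or camera flag counts against the driver."""
    return (
        inputs.driver_in_seat
        and inputs.recent_wheel_torque
        and len(inputs.camera_flags) == 0
    )

print(attention_ok(AttentionInputs(True, True, ())))              # True: attentive
print(attention_ok(AttentionInputs(True, True, ("phone_use",))))  # False: flagged
```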

Autonomous driving (and where responsibility lies in case of an accident) is an extremely difficult problem to work out. The road to fully autonomous driving, Full Self Driving or otherwise, is very long, but the technology will likely get there. Tesla is definitely pushing the boundaries of testing such systems by releasing an “autonomous-adjacent” feature to the public and gathering data on how owners use it. But it cannot be stressed enough: No matter the FSD system’s capability, it requires an attentive pilot to monitor whether the system is actually doing its job. This is why tech companies and other automakers pay trained individuals to keep tabs on self-driving prototype cars and to be ready to jump in and take control if a situation warrants it.

So, a reminder, Tesla owners: FSD is merely a limited-capability beta, and Autopilot is anything but what its name implies. Both setups can make stupid decisions and still require an attentive driver ready to take control in many situations. All of which is to say: When the “Download FSD Beta” button finally appears on your EV’s touchscreen, please use it responsibly and operate the vehicle safely, keeping your eyes on the road. Tesla is watching.
