Full disclosure: I think all of these systems—that is, every sort of partial self-driving system you can get in a car today—are stupid. Not because the technology is necessarily bad, but because these systems, like Tesla’s Autopilot, are just not compatible with how human brains work. If you don’t believe me, please stare gape-mouthed at this video of a new Tesla Model 3 driving itself on Autopilot with no one in the driver’s seat.
Here’s the video, in the reply to this tweet:
That Tesla Model 3 Performance is owned by Alex Choi, famous for his videos, among other things, I guess, and what’s going on in this video is deeply, lavishly stupid.
It’s stupid on Alex Choi’s part (who appears to be videoing from the back seat), stupid on the part of whoever that guy is in the passenger seat, and also stupid on Tesla’s part for selling a driver-assist system with descriptions that mislead consumers into thinking that pulling shit like this is somehow okay.
Let’s just be absolutely clear here: this is not, in any way, okay. Tesla’s Autopilot requires someone to be ready to take over driving at a moment’s notice; if that happens while you’re in the passenger seat or back seat, you’re boned, along with anyone unfortunate enough to be around you, caught in your sphere of idiot-bonery.
The video is only eight seconds long, and Tesla’s Autopilot doesn’t immediately demand that you put your hands back on the wheel. So, that leaves enough time to shoot an idiotic video like this.
It’s strange that Tesla isn’t checking the occupancy sensor in the driver’s seat (the one almost all cars have to decide whether someone’s in a seat and, if so, to beep at you and illuminate the seat belt warning light) as a failsafe to keep Autopilot from being engaged if the driver’s seat is unoccupied. That should be pretty damn easy to implement, and all the hardware to do so is already in the car.
So why aren’t they doing that? That would keep dangerous bullshit like this from happening.
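Just to show how trivial that interlock would be, here’s a purely hypothetical sketch. None of these names or signals come from Tesla’s actual software; the point is only that once the seat’s occupancy sensor and belt switch are readable, the logic is a couple of boolean checks.

```python
# Hypothetical driver's-seat interlock for a driver-assist system.
# Purely illustrative; not Tesla's code. Assumes the car already
# exposes the driver's seat occupancy sensor and belt buckle switch.

from dataclasses import dataclass


@dataclass
class SeatStatus:
    occupied: bool        # weight/occupancy sensor in the driver's seat
    belt_fastened: bool   # seat belt buckle switch


def autopilot_may_engage(driver_seat: SeatStatus) -> bool:
    """Allow engagement only with a present, belted driver."""
    return driver_seat.occupied and driver_seat.belt_fastened


def autopilot_should_disengage(driver_seat: SeatStatus) -> bool:
    """If the driver's seat empties mid-drive, bail out
    (in a real car: alarm, slow down, pull over safely)."""
    return not driver_seat.occupied


# The stunt in the video: nobody in the driver's seat.
empty_seat = SeatStatus(occupied=False, belt_fastened=False)
print(autopilot_may_engage(empty_seat))        # False: never engages
print(autopilot_should_disengage(empty_seat))  # True: hand back control
```

The hard part of a real implementation isn’t this check, it’s the safe-disengagement behavior; but refusing to engage with an empty driver’s seat costs essentially nothing.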
Videos like this, or of people asleep at the wheel of Autopilot-engaged Teslas, should be a big fat wake-up call that these systems are way too easy for determined or sleepy idiots to abuse, and sooner or later, wrecks will happen. These systems are not designed to be used like this; they can stop working at any time, for any number of reasons. They require an attentive human ready to correct them. They are not built for this.
I reached out to Tesla for comment, and they pointed me to the same thing they always say in these circumstances, which basically boils down to “don’t do this.” Here’s Tesla’s boilerplate reminder:
“While Autopilot is designed to become more capable over time, in its current form it does not allow the driver to abdicate responsibility. When used properly, Autopilot reduces a driver’s overall workload, and the redundancy of eight external cameras, radar and 12 ultrasonic sensors provides an additional layer of safety that two eyes alone would not have.”
If someone wants to do this on a private track and see what happens, have at it. We’ll probably even run the video. But doing this on public roads, with non-idiots driving around with no idea what’s happening, is irresponsible and a terrible idea.
So knock it off, dummies.