By Katherine Bindley and Rebecca Elliott
May 20, 2021
Tesla Inc. Chief Executive Elon Musk for years has been championing his vehicles’ driver-assistance system called Autopilot and forecasting that self-driving cars are an emerging reality. Some would-be social media stars and Tesla owners can’t seem to wait.
Param Sharma, 25, has posted multiple videos to Instagram in which he appears to operate a Tesla while in the back seat with nobody at the wheel. Police in California arrested Mr. Sharma on May 10 for alleged reckless driving after an officer said he saw him operating a Tesla Model 3 from the back seat on a Bay Area highway.
Similar videos abound on social media, even though Tesla’s technology is intended only as a way to assist drivers, who are instructed to keep their hands on the wheel. Echoing Mr. Musk’s penchant for pushing the envelope, some Tesla drivers over the years have created an online-video genre out of testing what’s possible with their vehicles, in some cases appearing to override safety functions to perform stunts that they post to YouTube or TikTok.
One TikTok user shared a video last year that appeared to depict a Tesla going more than 60 miles an hour on a highway with no one in the driver’s seat while its passengers drank hard seltzer and sang along to Justin Bieber. The video, which refers to the car as the designated driver, has 1.7 million likes. The video’s poster didn’t respond to requests for comment.
Tesla’s Autopilot has features designed to make hands-on driving easier and safer by helping with tasks such as steering and maintaining appropriate distance from others on the road. The company tells drivers repeatedly in its user manuals to remain engaged.
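At a general level, the distance-keeping part of such a system works like adaptive cruise control: the car tries to hold a set time gap to the vehicle ahead and gently accelerates or brakes to close the error. The sketch below is a generic illustration of that idea only; it is not Tesla’s code, and the target gap, gain, and acceleration limits are assumptions chosen for readability.

```python
# Illustrative sketch only: a generic time-gap following-distance controller,
# not Tesla's implementation. The target gap, gain, and limits are assumptions.

TARGET_GAP_S = 2.0   # assumed: desired time headway to the lead vehicle, in seconds
KP = 0.5             # assumed: proportional gain on the gap error

def desired_acceleration(own_speed_mps: float, gap_m: float) -> float:
    """Accelerate or brake so the gap to the lead car stays near TARGET_GAP_S."""
    desired_gap_m = TARGET_GAP_S * own_speed_mps
    gap_error_m = gap_m - desired_gap_m
    # Positive error (too far back) -> speed up; negative (too close) -> brake.
    # Clamp to assumed comfort limits of -3 m/s^2 (braking) and +2 m/s^2.
    return max(-3.0, min(2.0, KP * gap_error_m / max(own_speed_mps, 1.0)))

# Example: at 27 m/s (about 60 mph) with a 40 m gap, the controller asks for gentle braking.
print(round(desired_acceleration(own_speed_mps=27.0, gap_m=40.0), 2))
```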
Tesla’s public messaging has at times appeared inconsistent with those instructions. A 2019 video that the company posted to YouTube, for example, shows a Tesla operating for well over a minute with the driver’s hands not on the wheel. Mr. Musk also drove a Tesla hands-free in a 2018 “60 Minutes” interview.
“I’m highly confident the car will be able to drive itself with a reliability in excess of humans this year,” Mr. Musk said in January.
Mr. Sharma, the Tesla driver arrested this past week, said in an interview that he has been inspired by Mr. Musk. He said he regularly operates his vehicle from the back seat, occasionally touching the steering wheel with his foot to keep the vehicle from coming to a stop. A spokesperson for the local district attorney said that no decision has been made on whether to formally charge Mr. Sharma.
“If Elon Musk is right about self-driving cars, then what I’m doing, like by next year it’ll be normal,” Mr. Sharma said. He added that he doesn’t view the behavior as risky because of his perception of the vehicle’s capabilities.
Tesla and Mr. Musk didn’t respond to requests for comment. Tesla has repeatedly said the features it offers make driving safer. Last month, Mr. Musk tweeted that “Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle.”
Advanced driver-assistance systems are widely seen as providing safety benefits, though some authorities and lawmakers wonder whether they also introduce new risks. The National Transportation Safety Board and safety advocates have said Tesla isn’t doing enough to prevent misuse.
The California Department of Motor Vehicles said Monday it was reviewing whether Tesla violated a state regulation that bars companies from falsely advertising vehicles as autonomous. The Los Angeles Times first reported the review.
Tesla vehicles are designed to monitor force applied to the steering wheel to ensure driver engagement, but people have found ways to trick aspects of the system.
Last month, for example, Consumer Reports rigged a Tesla Model Y to operate on Autopilot while no one was in the driver’s seat by attaching a weighted chain to the wheel. (The person slid into the passenger’s seat without opening any doors, which would have shut off the assistance system.)
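In general terms, torque-based engagement checks of the kind described above reset a timer whenever the steering sensor reads enough torque, and escalate warnings when it doesn’t. A dead weight or chain hung on the wheel produces a constant torque that looks like a hand. The sketch below is a generic illustration of that logic, not Tesla’s implementation; the thresholds, timings, and names are assumptions.

```python
# Illustrative sketch only: a generic torque-based driver-engagement monitor.
# This is NOT Tesla's implementation; thresholds, timings, and names are assumptions.

from dataclasses import dataclass

NAG_AFTER_S = 30.0         # assumed: seconds without detected torque before a reminder
DISENGAGE_AFTER_S = 60.0   # assumed: seconds before the system escalates and disengages
TORQUE_THRESHOLD_NM = 0.5  # assumed: minimum steering torque treated as "hands on wheel"

@dataclass
class EngagementMonitor:
    seconds_without_torque: float = 0.0

    def update(self, steering_torque_nm: float, dt_s: float) -> str:
        """Return the monitor's state after one control-loop tick."""
        if abs(steering_torque_nm) >= TORQUE_THRESHOLD_NM:
            # Any sufficient torque resets the timer -- which is why a weight
            # hung on the wheel (constant torque) can fool this kind of check.
            self.seconds_without_torque = 0.0
            return "engaged"
        self.seconds_without_torque += dt_s
        if self.seconds_without_torque >= DISENGAGE_AFTER_S:
            return "disengage_assist"   # escalate: slow down / disable assistance
        if self.seconds_without_torque >= NAG_AFTER_S:
            return "alert_driver"       # flash a "hands on wheel" reminder
        return "engaged"

# Example: a constant 0.6 Nm dead-weight reading keeps the state "engaged" indefinitely.
monitor = EngagementMonitor()
print(monitor.update(steering_torque_nm=0.6, dt_s=0.1))  # -> "engaged"
```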
Not all the videos of drivers apparently misusing their cars are what they seem. Trevor Laird climbed into the back seat of his blue Tesla Model 3, stretched out on a plush blanket and told the car to start driving. In the TikTok video Mr. Laird shared with his roughly 360,000 followers, the sedan is shown to pull away with no one in the driver’s seat.
Mr. Laird was in fact at a private ranch along the Rio Grande and had spliced together footage from multiple videos to make it appear as though he was embarking on a road trip with no one at the wheel. “It’s a bit of parody and my own fantasy of where these systems will go,” he said.
But others have made a business out of selling tools to circumvent safety features or providing advice on how to use or not use them.
Christopher Allessi, a YouTuber, says he has made around 2,000 Tesla-related videos. “I kind of built my whole channel around testing and pushing the limits of Tesla vehicles,” he said.
In one video, Mr. Allessi reviews a so-called defeat device that can cost $180 and is intended to trick the system into believing the driver’s hands are on the wheel. He calls reminders to grip the wheel a “pain in the butt” and says the car’s alerts can come on when the driver’s hands rest on the wheel too lightly. In the video, Mr. Allessi shows how to use the device to stop that from happening. He does urge drivers to keep their hands on the wheel and says driving hands-free can be hazardous.
“If you’re someone that’s going to abuse it and read a book or go through your daily emails, no, you need to get your license taken away,” he said.
Other YouTubers show viewers how to get around the alerts to drive hands free. They post videos demonstrating tricks like wedging an orange in the steering wheel or attaching a weight so the system thinks someone is touching the wheel.
Craig Merwitzer, a Tesla enthusiast who is an administrator of a Facebook group that shares tips on how to appropriately use the cars, said he thinks the company has done enough to deter misuse. The names Tesla uses for its driver-assistance features—Full Self-Driving in addition to Autopilot—could be misleading, he said, but the owner’s manual and in-car notifications are clear.
“The bottom line is I think the car is safe. I think people are dangerous,” said Mr. Merwitzer, who drives a Tesla Model S.
[Next you'll hear him say, "autonomous cars don't kill people. People kill people."]
Tyron Louw, a senior research fellow at the University of Leeds who studies how humans interact with automated driving systems, said people have a hard time understanding the limits of advanced driver-assistance features. Some of these videos of seeming misuse, he said, could lead to copycat behavior with unintended consequences.
“People are impressionable,” Dr. Louw said. While owners bear responsibility for their choices, it’s on Tesla to prevent misuse in the first place, he said.
“If they know that their system is prone to hacks, then like with software engineering, they need to figure out when hacks are invading their system,” Dr. Louw said.