A software CEO is asking Tesla to ban its “self-driving” feature until the company can show that the software will detect children.
In defense of the company, some Tesla enthusiasts used their children as props to prove that the feature really works.
YouTube removed two videos from its platform showing Tesla drivers using their own children to conduct vehicle safety tests.
The tests were meant to prove that Tesla’s Autopilot and “full self-driving” (FSD) beta software — the automaker’s advanced driver assistance systems, which have automated driving features but do not enable the cars to drive themselves — would automatically detect pedestrians, including children, walking or standing in the road and avoid hitting them.
A YouTube spokesperson told CNBC, which first reported the news, that the social media platform removed the videos because YouTube doesn’t allow content showing a minor participating in dangerous activities or encouraging minors to do dangerous activities. YouTube is a division of parent company Alphabet, which also owns autonomous vehicle company Waymo.
The videos posted by Tesla investors were in part a response to a TV ad by the Dawn Project, an organization aiming to ban unsafe software from safety-critical systems, that showed Tesla’s FSD software repeatedly hitting child-sized mannequins on a test track. The Dawn Project, which is headed by Dan O’Dowd, CEO of Green Hills Software, also posted a full-page ad in The New York Times in January calling FSD “the worst software ever sold by a Fortune 500 company.”
Tad Park, a Tesla owner and investor and CEO of Volt Equity, posted a video on August 14 that showed him driving a Model 3 at eight miles per hour toward one of his children on a San Francisco road. The video had tens of thousands of views before YouTube took it down.
Park told CNBC his kids were never in danger and that he was prepared to take over at any time. The video he posted showed the car slowing down and stopping without striking his child, or anyone or anything else.
Tesla vehicles come standard with Autopilot, an advanced driver assistance system (ADAS) that includes features like traffic-aware cruise control, steering assist within clearly marked lanes and pedestrian detection at crosswalks. FSD is Tesla’s more advanced ADAS, and it includes the parking feature Summon as well as Navigate on Autopilot, which navigates a car from highway on-ramp to off-ramp and is now operational on city streets. All of these capabilities require a human driver to stay focused and take control of the vehicle when needed.
A series of accidents involving Tesla vehicles in which one of the ADAS features may have been engaged has prompted investigations by the National Highway Traffic Safety Administration (NHTSA). Last week, the federal agency updated an ongoing probe into 830,000 Tesla vehicles equipped with Autopilot to learn more about how Tesla’s cabin camera determines whether a driver isn’t paying attention while Autopilot is engaged, and how it sends alerts.
Autopilot and FSD have also come under fire at the state level recently. In late July, the California Department of Motor Vehicles filed complaints alleging that Tesla falsely advertised the capabilities of its ADAS, overstating what the systems can safely do.
CEO Elon Musk tweeted Sunday that FSD’s price would increase in North America from a one-time payment of $12,000 to $15,000 starting September 5.