Tesla owners are already reporting issues including crashes and near misses with their car’s new “Smart Summon” feature, which is designed to make a Tesla vehicle drive to its owner’s location autonomously.
Jalopnik reports that owners are encountering problems with Tesla vehicles’ new Smart Summon feature just days after the release of Elon Musk’s Version 10 software update. The update allows Tesla owners to play the video game Cuphead and watch Netflix while parked, and also enables the new Smart Summon feature, which lets users summon their vehicles to drive to them without human control.
While the feature sounds impressive in theory, soon after the update rolled out, users took to Twitter to discuss problems they had faced with it. Many reported that their vehicles collided with objects and sustained damage while using the feature:
Other party thinks that I was actually driving because I ran to my car before he got out. Please give me some advise. @LikeTeslaKim @TesLatino @Model3Owners @teslaownersSV @teslamodel3fan pic.twitter.com/ScE12wHqA9
— David F Guajardo (@DavidFe83802184) September 28, 2019
Be forewarned @Tesla @elonmusk Enhanced summon isn’t safe or production ready. Tried in my empty drive way. Car went forward and ran into the side of garage. Love the car but saddened. #Tesla #TeslaModel3 pic.twitter.com/tRZ88DmXAW
— AB (@abgoswami) September 28, 2019
In another video, a man’s Tesla vehicle can be seen almost driving directly into traffic:
— Roddie Hasan – راضي (@eiddor) September 28, 2019
Tesla has specified that the feature is currently in beta, but it appears to be nowhere near ready for public use. Tesla has promoted the possible future self-driving capabilities of its vehicles as a major selling point, but many have questioned whether Tesla vehicles are as capable of autonomous driving as the firm claims.
David Friedman, Vice President of Advocacy for Consumer Reports, said in a press release in April:
Technology has the potential to shape future transportation to be safer, less expensive, and more accessible. Yet, safety must always come first. Today’s driver assistance technologies have helped deliver on safety, but the marketplace is full of bold claims about self-driving capabilities that overpromise and underdeliver. For instance, Tesla’s current driver-assist system, ‘Autopilot,’ is no substitute for a human driver. It can’t dependably navigate common road situations on its own, and fails to keep the driver engaged exactly when it is needed most.
We’ve heard promises of self-driving vehicles being just around the corner from Tesla before. Claims about the company’s driving automation systems and safety are not backed up by the data, and it seems today’s presentations had more to do with investors than consumers’ safety. We agree that Tesla, and every other car company, has a moral imperative to make transportation safer, and all companies should embrace the most important principle: preventing harm and saving lives.
But instead of treating the public like guinea pigs, Tesla must clearly demonstrate a driving automation system that is substantially safer than what is available today, based on rigorous evidence that is transparently shared with regulators and consumers, and validated by independent third-parties. In the meantime, the company should focus on making sure that proven crash avoidance technologies on Tesla vehicles, such as automatic emergency braking with pedestrian detection, are as effective as possible.
In a series of tweets, the PAVE (Partners for Automated Vehicle Education) campaign posted “reminders” to consumers, policymakers, and the media, stating:
Most vehicles available for sale today offer driver assistance features; in all vehicles available for sale today, even those with the most advanced of these aids, the driver must always monitor and be prepared to control the vehicle. It is damaging to public discussion about advanced vehicle technologies – and potentially unsafe – to refer to vehicles now available for sale to the public using inaccurate terms.
This includes terms such as “fully automated,” “full self-driving,” “fully autonomous,” “auto pilot” or “driverless,” which can create an inaccurate impression of vehicle capabilities that can put drivers and other road users at risk. There are vehicles from several companies now on the roads that feature “self driving” or “autonomous” capability. These vehicles currently include safety drivers or engineers at the controls.
Many of these vehicles are being designed to operate as part of a ride hailing or goods delivery service, so they will not be available for sale to the general public in the near term. PAVE will speak out publicly in instances where industry participants, the media, or public officials mischaracterize the current state of technology.
Breitbart News will continue to follow Tesla and Elon Musk.