A Tesla in Full Self-Driving mode makes a left turn from the middle lane on a busy San Francisco street. It swerves into a bus lane where it isn’t supposed to be. It rounds a corner and nearly plows into parked vehicles, forcing the driver to grab the wheel. These scenes were captured by the YouTube channel AI Addict, and similar clips keep popping up on YouTube. One could argue these are the kinds of mistakes a distracted human driver might make while on a cell phone. But we expect more from our AI overlords.
Earlier this month, Tesla began sending out over-the-air updates of its Full Self-Driving (FSD) beta version 9 software, an advanced driver assistance system that relies only on cameras, unlike previous versions of Tesla’s ADAS, which relied on both cameras and radar.
In response to videos showing unsafe driving behavior, such as hazardous left turns, and to other reports from Tesla owners, Consumer Reports issued a statement on Tuesday saying the software upgrade isn’t safe enough for public roads, and that it will independently test the update on its own Model Y SUV once it receives the necessary software.
The consumer organization said it is concerned that Tesla is using its current owners and their vehicles as guinea pigs for testing new features. Making the point for them, Tesla CEO Elon Musk urged drivers not to become complacent while driving because “there will be unknown issues, so please be paranoid.” Many Tesla owners know what they’re getting themselves into because they signed up for Tesla’s Early Access program, which provides beta software in exchange for feedback, but other road users have not consented to being part of these tests.
Tesla’s updates are shipped to drivers across the country. The electric vehicle company did not respond to a request for more information about whether it takes into account self-driving regulations in specific states — 29 states have enacted laws related to autonomous driving, but they vary wildly from state to state. Other self-driving technology companies, such as Cruise, Waymo, and Argo AI, told CR that they either test their software on private tracks or use trained safety drivers as monitors.
“Car technology is advancing really quickly, and automation has a lot of potential, but policymakers need to step in to get strong, sensible safety rules in place,” William Wallace, manager of safety policy at CR, said in a statement. “Otherwise, some companies will treat our public roads as if they were private proving grounds, with little holding them accountable for safety.”
In June, the National Highway Traffic Safety Administration issued a Standing General Order requiring manufacturers and operators of vehicles equipped with SAE Level 2 ADAS or SAE Level 3, 4, or 5 automated driving systems to report crashes.
“NHTSA’s core mission is safety. By mandating crash reporting, the agency will have access to critical data that will help quickly identify safety issues that could emerge in these automated systems,” NHTSA Acting Administrator Dr. Steven Cliff said in a statement. “In fact, gathering data will help instill public confidence that the federal government is closely overseeing the safety of automated vehicles.”
The FSD beta 9 software adds features that automate more driving tasks, such as navigating intersections and city streets under the driver’s supervision. But with detailed graphics showing where the car is in relation to other road users, down to a woman on a scooter nearby, drivers may end up more distracted by the very technology that is supposed to assist them in critical moments.
“Tesla just asking people to pay attention isn’t enough — the system needs to make sure people are engaged when the system is operating,” Jake Fisher, senior director of CR’s Auto Test Center, said in a statement. “We already know that testing developing self-driving systems without adequate driver support can – and will – end in fatalities.”
Fisher said Tesla should implement an in-car driver monitoring system to ensure drivers keep their eyes on the road, and to avoid crashes like the one in 2018, when an Uber self-driving test vehicle hit and killed a woman as she crossed the street in the Phoenix area.