A Tesla in Full Self-Driving mode made a left turn out of the middle lane on a San Francisco street, then jumped into a bus lane. It then turned a corner and nearly ran into several parked vehicles, prompting the driver to finally grab the steering wheel. These are actual scenes captured by the automotive reviewer AI Addict, and many similar videos are being posted to YouTube. Some might suggest these are mistakes any distracted driver could make. After all the hype, however, it’s reasonable to expect more from artificial intelligence in passenger cars.
In July 2021, Tesla began sending over-the-air software updates for its Full Self-Driving (FSD) beta version 9 software. FSD is an advanced driver assistance system (ADAS) that relies on cameras alone, rather than the cameras and radar used by Tesla’s previous ADAS. In response to videos showing unsafe driving behavior, such as unprotected left turns, and to other reports from Tesla owners, Consumer Reports issued a statement saying the software upgrade does not appear to be safe enough for public roads.
Consumer Reports is concerned that Tesla is using existing owners and their vehicles as guinea pigs for testing new features, and it is using the words of Tesla’s own CEO to make the point. Elon Musk urged drivers not to become complacent behind the wheel because “there will be unknown issues, so please be paranoid.” Many Tesla owners know what they are getting into, having signed up for Tesla’s Early Access Program, which delivers beta software in exchange for feedback; other road users, however, have not consented to such trials. Consumer Reports warns that if NHTSA doesn’t address this now, some companies will simply treat our public roads as private proving grounds, with little holding them accountable for safety.
In June 2021, the National Highway Traffic Safety Administration (NHTSA) issued an order requiring manufacturers and operators of vehicles equipped with SAE Level 2 ADAS or SAE Level 3, 4, or 5 automated driving systems to report crashes. “NHTSA’s core mission is safety. By mandating crash reporting, the agency will have access to critical data that will help quickly identify safety issues that could emerge in these automated systems,” said Dr. Steven Cliff, NHTSA’s acting administrator, in a statement. “In fact, gathering data will help instill public confidence that the federal government is closely overseeing the safety of automated vehicles.”
The FSD beta 9 software adds features that automate more driving tasks, such as navigating intersections and city streets under the driver’s supervision. But with graphics detailed enough to show where the car is in relation to other road users, down to a woman passing by on a scooter, drivers may be more distracted by the very technology meant to assist them at crucial moments. “Tesla just asking people to pay attention isn’t enough – the system needs to make sure people are engaged when the system is operational,” said Jake Fisher, senior director of CR’s Auto Test Center, in a statement. “We already know that testing developing self-driving systems without adequate driver support can – and will – end in fatalities.”
GB Legal represents people injured because of the careless and reckless acts of others. At the end of the day, your case can only be settled once, and you need to know all of the facts beforehand. The reason insurance companies have paid our clients in excess of $130,000,000.00 is that we get the facts and are not intimidated by the prospect of going to trial when insurance companies fail to offer full compensation. Serious injuries require serious representation. We are the Law Offices of Guenard & Bozarth, LLP. Our attorneys have more than 60 years of experience representing only injured people. Call GB Legal 24/7/365 at 888-809-1075 or visit www.gblegal.com. We would be honored to represent you!