Tesla and Honda’s 363 Accidents Show Why Self-Driving Cars May Be Decades Away from Safety

A new report states that Tesla and Honda vehicles were involved in 363 reported crashes in roughly a year while partially automated driving systems were engaged.

We were promised a near future in which autonomous machines would serve our needs and car ownership would become unnecessary: robots would deliver our orders quickly and efficiently, and we could squeeze in a few extra hours of work or sleep while being chauffeured around in self-driving cars.

Progress has been made on at least some of this. Small food-delivery robots have certainly become an expanding presence on university campuses and in cities across North America, and new partnerships have recently been announced to develop and test the safety of self-driving trucks.

Progress toward autonomous or self-driving consumer cars, on the other hand, has come to a screeching halt. In 2021, leading industry experts acknowledged that developing safe autonomous driving systems was not as easy as expected. Among them, Elon Musk conceded that building the technology needed to deliver safe self-driving vehicles had proved more difficult than he had believed.

Automation paradox

This week brought more bad news, when the U.S. National Highway Traffic Safety Administration (NHTSA) released figures showing that Tesla vehicles accounted for nearly 70 percent of the reported crashes involving so-called SAE Level 2 cars.

Some cars are fully autonomous and can drive without any input from a human driver. For instance, Waymo One, in Phoenix, Ariz., is a ride-hailing service that currently operates autonomous vehicles on a test route.

SAE Level 2 driving automation systems, like Tesla’s Autopilot, require human drivers to remain attentive at all times, even while the system temporarily controls steering and speed. Whenever traffic or road conditions are unsuitable for the system to operate, control is handed back to the driver, who must immediately take over manual control of the car.

Human factors engineering is an interdisciplinary research field that studies how people interact with vehicle technology. For years, its researchers have highlighted the safety risks of automated driving, especially when the system requires the driver to compensate for the technology’s shortcomings in order to operate safely.

This is the scenario known as the automation paradox: the more automated the car, the harder it is for the humans inside it to operate it properly.

Overestimating vehicle capabilities

One of the most notable dangers of operating SAE Level 2 cars arises when drivers misunderstand the capabilities of the automated system. This often leads to hazardous behaviors such as reading a book or napping while the car is in motion.

In 2021, reports of risky behavior at the wheel of Level 2 cars became so frequent that the NHTSA required manufacturers to begin reporting crashes that occurred while these systems were engaged.

The initial findings, released in June 2022, revealed that since 2021, Tesla and Honda vehicles had been involved in 273 and 90 reported crashes, respectively, while these systems were engaged. Most of the crashes occurred in Texas and California.

While these data paint a disappointing picture of the safety of these systems, they pale in comparison to the more than 40,000 fatal crashes reported in the United States in 2021 alone.

In the same report, the NHTSA itself highlights some of the data’s methodological limitations: from the incompleteness of some of the source information to the failure to account for each manufacturer’s total vehicle volume or the distance those vehicles traveled.

For the skeptics, this does not mean the end of autonomous cars. It does, however, confirm that the widespread deployment of safe self-driving vehicles is not years but decades in the making.


Read the original article on The Conversation.
