
11 more crash deaths tied to vehicles using automated driving systems, US government data shows


Eleven people were killed in US crashes involving vehicles using automated driving systems during a four-month period earlier this year, according to newly released government data, part of an alarming pattern of incidents linked to the technology.

Ten of the deaths involved vehicles made by Tesla, although it is unclear from National Highway Traffic Safety Administration data whether the technology itself was at fault or whether driver error may have been responsible.

The 11th death involved a Ford pickup. The automaker said it reported the fatal crash to the government quickly, as required, but later determined that the truck was not equipped with its partially automated driving system.

The deaths include four crashes involving motorcycles that occurred during the spring and summer: two in Florida and one each in California and Utah. Safety advocates note that deaths of motorcyclists in crashes involving Tesla vehicles using automated driver-assist systems such as Autopilot are increasing.

The new fatal crashes are recorded in a database NHTSA is building in an effort to broadly assess the safety of automated driving systems, which, led by Tesla, are rapidly growing in use. Tesla alone has more than 830,000 vehicles with the systems on US roads. The agency is requiring automakers and technology companies to report all crashes involving self-driving vehicles as well as cars with driver-assist systems that can take over some driving tasks from people.

Eleven new fatal crashes, reported between May and September, were included in the statistics released Monday by the agency. In June, the agency released data it had collected between July of last year and May 15.

Figures released in June showed that six people died in crashes involving the automated systems and five were seriously injured. Of the deaths, five occurred in Teslas and one in a Ford. In each case, the database states that advanced driver-assist systems were in use at the time of the crash.

Michael Brooks, executive director of the nonprofit Center for Auto Safety, said he was baffled by NHTSA's continued investigations and by what he called a general lack of action since problems with Autopilot began to surface in 2016.

“I think there’s a pretty clear pattern of bad behavior by Tesla when it comes to following (federal) safety act orders, and NHTSA just sits there,” he said. “How many more deaths of motorcyclists do we need to see?”

Brooks noted that the crashes are increasingly harming people who are not in the Teslas.

“You’re seeing innocent people who had no choice in the matter being injured or killed,” he said.

A message seeking a response was left with NHTSA on Tuesday.

Tesla’s crash count may appear high because the company uses telematics to monitor its vehicles and obtain real-time crash reports. Other automakers lack such capabilities, so their crash reports may emerge more slowly or may go unreported, NHTSA has said.

NHTSA has been investigating Autopilot since last August, following a string of crashes dating to 2018 in which Teslas collided with emergency vehicles parked along roadways with flashing lights. That investigation moved a step closer to a possible recall in June, when it was upgraded to what is called an engineering analysis.

In documents, the agency raised questions about the system, finding that the technology was being used in areas where its capabilities are limited and that many drivers failed to take steps to avoid collisions despite warnings from their vehicles.

NHTSA also reported that it had documented 16 crashes in which vehicles with automated systems in use struck emergency vehicles and trucks displaying warning signs, causing 15 injuries and one death.

The National Transportation Safety Board, which has also investigated a number of Tesla crashes dating to 2016, has recommended that NHTSA and Tesla limit Autopilot’s use to areas where it can operate safely. The NTSB also recommended that NHTSA require Tesla to improve its systems to ensure that drivers are paying attention. NHTSA has yet to act on the recommendations. (The NTSB can only make recommendations to other federal agencies.)

A message seeking comment was also left with Tesla on Tuesday. At the company’s artificial intelligence day in September, CEO Elon Musk contended that, based on crash rates and total miles driven, Tesla’s automated systems are safer than human drivers – a position some safety experts dispute.

“At the point where you believe that adding autonomy reduces injuries and deaths, I think you have a moral obligation to deploy it, even though you’re going to get sued and blamed by a lot of people,” Musk said. “Because the people whose lives you’ve saved don’t know that their lives were saved. And those who do occasionally die or get injured, they definitely know, or their state does, that there was a problem with Autopilot.”

Musk said that Teslas operating with the automated systems now number more than 3 million on the road.

“That’s a lot of miles driven every day. And it’s not going to be perfect. But what matters is that it is very clearly safer than not deploying it.”

In addition to Autopilot, Tesla also sells a “Full Self-Driving” system, although the company says cars cannot drive themselves and drivers must be ready to intervene at all times.

The number of deaths involving vehicles with automated systems is small compared with the total number of traffic deaths in the US. Nearly 43,000 people were killed on American roads last year, the highest number in 16 years, as Americans returned to the roads after the pandemic eased. Authorities blamed reckless behavior such as speeding and driving while impaired by drugs or alcohol.

© Thomson Reuters 2022

