This week, a US Department of Transportation report detailed the crashes involving advanced driver assistance systems over the past year or so. Tesla's advanced features, including Autopilot and Full Self-Driving, accounted for 70 percent of the nearly 400 reported incidents, far more than previously known. But researchers say the report may raise more questions about this safety technology than it answers, owing to blind spots in the data.
The report covered systems that promise to take over some of the tedious or dangerous parts of driving: automatically changing lanes, keeping the car in its lane, braking before a collision, slowing down before sharp curves, and, in some cases, operating on highways without driver intervention. These systems include Tesla's Autopilot, Ford's BlueCruise, General Motors' Super Cruise, and Nissan's ProPilot Assist. While the report shows that these systems are not perfect, there is still much to learn about how this new generation of road safety features actually performs.
That is largely because automakers report crash data to the federal government in vastly different ways. Some, such as Tesla, BMW, and GM, can pull detailed data from their vehicles wirelessly after a crash has occurred, which lets them quickly meet the government's 24-hour reporting requirement. Others, such as Toyota and Honda, cannot. Chris Martin, a spokesperson for American Honda, said in a statement that the automaker's reports to the DOT are based on "unverified customer claims" about whether their advanced driver assistance systems were active at the time of the crash. The automaker can later extract black-box data from its vehicles, but only with the customer's permission or at the request of law enforcement, and only with specialized wired equipment.
Of the 426 crashes detailed in the government report, only about 60 percent were reported through the vehicles' telematics systems. The other 40 percent came via customer reports and claims, sometimes filtered through disparate dealer networks, media reports, and law enforcement. As a result, the report doesn't allow anyone to make apples-to-apples comparisons of safety features, says Bryan Reimer, who studies vehicle automation and safety at MIT's AgeLab.
Even the data the government does collect lacks full context. The government, for example, doesn't know how many miles cars travel with these advanced assistance features engaged, so it can't determine how often crashes occur per mile driven. The National Highway Traffic Safety Administration, which published the report, cautioned that some incidents may appear more than once in the data set. And automakers with large market shares and good reporting systems, especially Tesla, are likely overrepresented in crash reports simply because they have more cars on the road.
It's important that the NHTSA report not discourage automakers from providing more comprehensive data, says Jennifer Homendy, chair of the National Transportation Safety Board, the federal crash-investigation agency. "The last thing we want is to punish manufacturers who collect reliable safety data," she said in a statement. "What we really need is data that tells us what safety improvements need to be made."
Without that transparency, it can be hard for drivers to understand, compare, and even use the features their cars come with, and for regulators to keep track of who is doing what. "As we gather more data, NHTSA will be better able to identify any emerging risks or trends and learn more about how these technologies are performing in the real world," agency administrator Steven Cliff said in a statement.
Outside of NHTSA, that information is nearly impossible to find. Police reports and insurance claims can surface problems with advanced safety features, says David Kidd, a senior fellow at the nonprofit Insurance Institute for Highway Safety. But accurate police reports depend on officers recognizing and understanding the many different systems offered by different automakers. And insurance claims may show whether a vehicle involved in a crash was equipped with a safety system, but not whether the system was engaged at the time of the crash.
Tesla offers some degree of self-reporting, but it has for years relied on statistics that NHTSA called misleading in 2018. The company's quarterly Autopilot safety reports omit important context, such as how often the system is used off the highway, or whether drivers using the feature are actually safer than drivers of other luxury cars. Tesla did not respond to a request for comment on the new DOT report.
The concern, Kidd says, is that new safety systems "could lead to different types of accidents and possibly new failures that create different types of safety problems." The DOT, for example, is investigating incidents in which Teslas crashed into stopped emergency vehicles, killing at least one person and injuring 15 others. The agency is also investigating reports of cars braking suddenly on Autopilot without warning and for no apparent reason. People "can handle a lot of weird driving situations with ease," Kidd says. But some automated systems are "not flexible enough, not innovative enough to handle what's happening on the roads today."
Beyond specific technologies, safety researchers wonder whether driver assistance systems have fundamental flaws. Automakers warn that drivers should keep their hands on the wheel and their eyes on the road even when the systems are engaged, but decades of research show that it is hard for people to stay attentive to the driving task when the car is doing most of the work. Consumer Reports ranked GM's Super Cruise and Ford's BlueCruise as the safest driver assistance systems because both automakers use in-car cameras to make sure drivers are looking at the road ahead. A study by Reimer's team at MIT found that drivers using Autopilot are more likely to take their eyes off the road when the system is on.
Reimer sees the DOT report and its data set as a call to action. "With automation comes a new level of complexity," he says. "There are many risks and many rewards." The trick will be minimizing those risks, and doing so will require much better data.
Credit: www.wired.com