"Automakers and technology companies say they are making
driving safer, but verifying these claims is difficult.
Every three months, Tesla publishes a safety report that
provides the number of miles between crashes when drivers use the company’s
driver-assistance system, Autopilot, and the number of miles between crashes
when they do not.
These figures always show that accidents are less frequent with Autopilot, a collection of technologies that can steer, brake and accelerate Tesla vehicles on their own.
But the numbers are misleading. Autopilot is used mainly for
highway driving, which is generally twice as safe as driving on city streets,
according to the Department of Transportation. Fewer crashes may occur with
Autopilot merely because it is typically used in safer situations.
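To see how such a comparison can mislead, consider a toy calculation with entirely hypothetical numbers, not Tesla's actual figures: if crashes are half as frequent per mile on highways as on city streets, and Autopilot miles are mostly highway miles while manually driven miles are mostly city miles, the aggregate "miles between crashes" will favor Autopilot even if the system itself changes nothing. The short Python sketch below, using made-up crash rates and mileage splits, illustrates the arithmetic.

    # Hypothetical illustration of the selection effect described above.
    # Assumes highway driving produces one crash every 2 million miles and
    # city driving one every 1 million miles, regardless of whether
    # Autopilot is engaged. These numbers are invented for illustration.

    def miles_between_crashes(highway_miles, city_miles,
                              highway_rate=1 / 2_000_000,  # crashes per mile (hypothetical)
                              city_rate=1 / 1_000_000):    # crashes per mile (hypothetical)
        crashes = highway_miles * highway_rate + city_miles * city_rate
        return (highway_miles + city_miles) / crashes

    # Autopilot miles skew toward highways; manual miles skew toward city streets.
    autopilot = miles_between_crashes(highway_miles=9_000_000, city_miles=1_000_000)
    manual = miles_between_crashes(highway_miles=2_000_000, city_miles=8_000_000)

    print(f"With Autopilot: one crash every {autopilot:,.0f} miles")
    print(f"Without Autopilot: one crash every {manual:,.0f} miles")
    # The gap comes entirely from where the miles were driven,
    # not from any safety effect of the system itself.

In this toy model, Autopilot shows about 64 percent more miles between crashes purely because of where the miles were driven, which is exactly the kind of gap an aggregate report cannot distinguish from a genuine safety benefit.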
Tesla has not provided data that would allow a comparison of
Autopilot’s safety on the same kinds of roads. Neither have other carmakers
that offer similar systems.
Autopilot has been on public roads since 2015. General
Motors introduced Super Cruise in 2017, and Ford Motor brought out BlueCruise
last year. But publicly available data that reliably measures the safety of
these technologies is scant. American drivers — whether using these systems or
sharing the road with them — are effectively guinea pigs in an experiment whose
results have not yet been revealed.
Carmakers and tech companies keep adding vehicle features that they say improve safety, but those claims are hard to verify. All the while, fatalities on the country's highways and streets have been climbing in recent years, reaching a 16-year high in 2021. Whatever additional safety these technological advances provide, it does not appear to be offsetting poor decisions by drivers behind the wheel.
“There is a lack of data that would give the public the confidence that these systems, as deployed, live up to their expected safety benefits,” said J. Christian Gerdes, a professor of mechanical engineering and co-director of Stanford University’s Center for Automotive Research, who was the first chief innovation officer for the Department of Transportation.
G.M. collaborated with the University of Michigan on a study that explored the potential safety benefits of Super Cruise, but the researchers concluded that they did not have enough data to determine whether the system reduced crashes.
A year ago, the National Highway Traffic Safety
Administration, the government’s auto safety regulator, ordered companies to
report potentially serious crashes involving advanced driver-assistance systems
along the lines of Autopilot within a day of learning about them. The order
said the agency would make the reports public, but it has not yet done so.
The safety agency declined to comment on what information it
had collected so far but said in a statement that the data would be released
“in the near future.”
Tesla and its chief executive, Elon Musk, did not respond to
requests for comment. G.M. said it had reported two incidents involving Super
Cruise to NHTSA: one in 2018 and one in 2020. Ford declined to comment.
The agency’s data is unlikely to provide a complete picture
of the situation, but it could encourage lawmakers and drivers to take a much
closer look at these technologies and ultimately change the way they are
marketed and regulated.
“To solve a problem, you first have to understand it,” said
Bryant Walker Smith, an associate professor in the University of South
Carolina’s law and engineering schools who specializes in emerging
transportation technologies. “This is a way of getting more ground truth as a
basis for investigations, regulations and other actions.”
Despite its abilities, Autopilot does not remove
responsibility from the driver. Tesla tells drivers to stay alert and be ready
to take control of the car at all times. The same is true of BlueCruise and
Super Cruise.
But many experts worry that these systems, because they
enable drivers to relinquish active control of the car, may lull them into
thinking that their cars are driving themselves. Then, when the technology
malfunctions or cannot handle a situation on its own, drivers may be unprepared
to take control as quickly as needed.
Older technologies, such as automatic emergency braking and
lane departure warning, have long provided safety nets for drivers by slowing
or stopping the car or warning drivers when they drift out of their lane. But
newer driver-assistance systems flip that arrangement by making the driver the
safety net for the technology.
Safety experts are particularly concerned about Autopilot
because of the way it is marketed. For years, Mr. Musk has said the company’s
cars were on the verge of true autonomy — driving themselves in practically any
situation. The system’s name also implies automation that the technology has
not yet achieved.
This may lead to driver complacency. Autopilot has played a
role in many fatal crashes, in some cases because drivers were not prepared to
take control of the car.
Mr. Musk has long promoted Autopilot as a way of improving
safety, and Tesla’s quarterly safety reports seem to back him up. But a recent
study from the Virginia Transportation Research Council, an arm of the Virginia
Department of Transportation, shows that these reports are not what they seem.
“We know cars using Autopilot are crashing less often than
when Autopilot is not used,” said Noah Goodall, a researcher at the council who
explores safety and operational issues surrounding autonomous vehicles. “But
are they being driven in the same way, on the same roads, at the same time of
day, by the same drivers?”
Analyzing police and insurance data, the Insurance Institute
for Highway Safety, a nonprofit research organization funded by the insurance
industry, has found that older technologies like automatic emergency braking
and lane departure warning have improved safety. But the organization says
studies have not yet shown that driver-assistance systems provide similar
benefits.
Part of the problem is that police and insurance data do not
always indicate whether these systems were in use at the time of a crash.
The federal auto safety agency has ordered companies to provide data on crashes in which driver-assistance technologies were in use within 30 seconds of impact. That data could provide a broader picture of how these systems are performing.
But even with that data, safety experts said, it will be
difficult to determine whether using these systems is safer than turning them
off in the same situations.
The Alliance for Automotive Innovation, a trade group for
car companies, has warned that the federal safety agency’s data could be
misconstrued or misrepresented. Some independent experts express similar
concerns.
“My big worry is that we will have detailed data on crashes
involving these technologies, without comparable data on crashes involving conventional
cars,” said Matthew Wansley, a professor at the Cardozo School of Law in New York
who specializes in emerging automotive technologies and was previously general
counsel at an autonomous vehicle start-up called nuTonomy. “It could
potentially look like these systems are a lot less safe than they really are.”
For this and other reasons, carmakers may be reluctant to
share some data with the agency. Under its order, companies can ask it to
withhold certain data by claiming it would reveal business secrets.
The agency is also collecting crash data on automated
driving systems — more advanced technologies that aim to completely remove
drivers from cars. These systems are often referred to as “self-driving cars.”
For the most part, this technology is still being tested in
a relatively small number of cars with drivers behind the wheel as a backup.
Waymo, a company owned by Google’s parent, Alphabet, operates a service without
drivers in the suburbs of Phoenix, and similar services are planned in cities
like San Francisco and Miami.
Companies are already required to report crashes involving
automated driving systems in some states. The federal safety agency’s data,
which will cover the whole country, should provide additional insight in this
area, too.
But the more immediate concern is the safety of Autopilot
and other driver-assistance systems, which are installed on hundreds of
thousands of vehicles.
“There is an open question: Is Autopilot increasing crash
frequency or decreasing it?” Mr. Wansley said. “We might not get a complete
answer, but we will get some useful information.”