Can Tesla data help us understand car crashes?

Just before 2 p.m. on a clear day in July 2020, Tracy Forth was driving near Tampa, Florida, when her white Tesla Model S was hit from behind by another vehicle in the left lane of Interstate 275.

It’s the kind of accident that happens thousands of times a day on American highways. When the two cars collided, Ms. Forth’s car slid into the median as the other, a blue Acura SUV, spun across the highway and onto the far shoulder.

After the collision, Ms. Forth told police officers that Autopilot, Tesla’s driver-assistance system that can steer, brake and accelerate cars, had suddenly activated the brakes for no apparent reason. She was unable to regain control, according to the police report, before the Acura crashed into the back of her car.

But her description is not the only record of the accident. Tesla recorded just about every detail, right down to the angle of the steering wheel in the milliseconds before the collision. Captured by cameras and other sensors mounted on the vehicle, this data provides an astonishingly detailed account of what happened, including video from the front and back of Ms. Forth’s car.

It shows that 10 seconds before the accident, Autopilot was in control as the Tesla traveled down the highway at 77 mph. Then she prompted Autopilot to change lanes.

The data collected by Ms. Forth’s Model S was not an anomaly. Tesla and other automakers are increasingly capturing this information to operate and improve their driving technologies.

Automakers rarely share this data with the public. This has clouded the understanding of the risks and rewards of driver assistance systems, which have been involved in hundreds of accidents over the past year.

But experts say this data could fundamentally change the way regulators, police departments, insurance companies and other organizations investigate anything that happens on the road, making such investigations more accurate and less costly.

It could also improve the way cars are regulated, giving government officials a clearer idea of what should and shouldn’t be allowed. The death toll on the country’s highways and streets has risen in recent years, hitting a 20-year high in the first three months of this year, and regulators are trying to find ways to reverse the trend.

“This can help separate technology-related accidents from driver-error-related accidents,” said Bryan Reimer, a research scientist at MIT who specializes in driver assistance systems and automated vehicles.

This data is significantly more comprehensive and specific than information collected by event data recorders, also known as “black boxes,” which have long been installed on cars. These devices collect data in the few seconds before, during, and after a crash.

By contrast, Tesla’s data is a continuous stream of information that includes video of the vehicle’s surroundings as well as statistics, sometimes called vehicle performance data or telematics, that describe its behavior millisecond by millisecond.

This provides a comprehensive view of the vehicle collecting the data, as well as insight into the behavior of other cars and anything else on the road.
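
To illustrate the difference, here is a minimal sketch in Python of what one frame of such a continuous stream might contain, and how an investigator could scan it for the moment the brakes were applied. The field names and structure are hypothetical assumptions for illustration, not Tesla’s actual recording format.

# A hypothetical sketch of one frame in a continuous telemetry stream.
# Field names and types are illustrative assumptions, not Tesla's format.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TelemetryFrame:
    timestamp_ms: int           # milliseconds since recording began
    speed_mph: float            # vehicle speed
    steering_angle_deg: float   # steering wheel angle
    brake_applied: bool         # whether the brakes were engaged
    autopilot_engaged: bool     # whether driver assistance was active

def first_brake_event(frames: List[TelemetryFrame]) -> Optional[TelemetryFrame]:
    """Return the earliest frame in which the brakes were applied."""
    for frame in sorted(frames, key=lambda f: f.timestamp_ms):
        if frame.brake_applied:
            return frame
    return None

A conventional “black box” would preserve only a few seconds of such frames around a crash; a continuous stream like the one described above preserves the entire drive.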

The video alone provides a level of insight into malfunctions that was rarely available in the past. In April, a motorcyclist was killed after colliding with a Tesla in Jacksonville, Florida. At first, the Tesla’s owner, Chuck Cook, told the police that he had no idea what had happened: the motorcycle had crashed into the back of his car, out of his field of vision. But video captured by his Tesla showed that the accident occurred because the motorcycle lost a wheel. The culprit was a loose nut.

When detailed statistics are paired with such video, the effect can be even more powerful.

Matthew Wansley, a professor at the Cardozo School of Law in New York who specializes in emerging automotive technologies, saw this power during his time at a self-driving car company in the late 2010s. He said the data collected from cameras and other sensors provided extraordinary insight into the causes of crashes and other traffic incidents.

“Not only did we know what our car was doing at any given moment, down to milliseconds, we knew what other vehicles, pedestrians and cyclists were doing,” he said. “Forget the testimony of eyewitnesses.”

In a new academic paper, he argues that all automakers should be required to collect this type of data and share it publicly with regulators whenever an accident occurs. With this data in hand, he believes, the National Highway Traffic Safety Administration can improve road safety in ways that were previously impossible.

The agency, the nation’s top auto safety regulator, is collecting small amounts of this data from Tesla as it investigates a series of accidents involving Autopilot. The agency said in a statement that such data “reinforces the results of our investigation and can often be useful in understanding crashes.”

Others say this data could have an even bigger impact. Ms. Forth’s lawyer, Mike Nelson, is building a company around it.

Mike Nelson in a Tesla. (Hana Yeon for The New York Times)

Backed by the data from her Tesla, Ms. Forth eventually decided to sue the driver and owner of the vehicle that hit her, alleging that it had attempted to pass her at an unsafe speed. (An attorney representing the owner of the other vehicle declined to comment.) But Mr. Nelson says such data has more important uses.

His recent startup, QuantivRisk, aims to collect driving data from Tesla and other automakers, analyze it and sell the results to police departments, insurance companies, law firms and research labs. “We expect to sell to everyone,” said Mr. Nelson, himself a Tesla driver. “This is a way to gain a better understanding of technology and improve safety.”

Mr. Nelson has obtained data on about 100 collisions involving Tesla vehicles, but expanding to much larger numbers could be challenging. Because of Tesla’s policies, his company can collect the data only with the consent of each individual car owner.

Tesla CEO Elon Musk and a Tesla attorney did not respond to requests for comment for this article. But Mr. Nelson says he believes that Tesla and other automakers will eventually agree to share such data more widely. It might expose when their cars malfunction, he says, but it also shows when the cars behave as advertised, and when drivers or other vehicles are at fault.

“Driving-related data needs to be more open to those who need to understand how accidents happen,” said Mr. Nelson.

Mr. Wansley and other experts say that sharing data publicly in this way may require a new legal framework. At the moment, it is not always clear to whom the data belongs – the automaker or the owner of the car. And if automakers start sharing data without car owners’ consent, it could raise privacy concerns.

“For the data on safety, the case for public sharing of that data is very strong,” Mr. Wansley said. “But there will be a privacy cost.”

Mr. Reimer, of the Massachusetts Institute of Technology, also cautions that this data is not infallible. Although very detailed, it may be incomplete or open to interpretation.

With the crash in Tampa, for example, Tesla supplied Mr. Nelson with data covering only a short period of time. And it remains unclear why Autopilot suddenly hit the brakes, although a truck on the side of the road may have been a factor.

But Mr. Reimer and others also say the video and other digital data collected by companies like Tesla could be a great advantage.

“When you have objective data, opinions don’t matter,” he said.
