Black boxes in cars and the expectation of privacy

Yesterday’s Times brought to the fore an important safe-streets topic: using data from “black boxes” in cars to determine what happened in a crash. This has flown under the popular radar for a while now, so it’s good to see it in the mainstream press.

The piece casts these devices in a battle between public interest and personal privacy. Manufacturers have used black boxes for over two decades to monitor performance, but now outside agencies are using the data for other, albeit very specific, purposes.

The National Highway Traffic Safety Administration, the regulatory body that may mandate black boxes in all cars within the next few years, cites a reasonable use for the data they collect.

The black boxes “provide critical safety information that might not otherwise be available to N.H.T.S.A. to evaluate what happened during a crash — and what future steps could be taken to save lives and prevent injuries,” David L. Strickland, the safety agency’s administrator, said in a statement.

If you’re unsure what actually happened in a collision, you check with an objective witness. Seems fair.

So who’s against this?

Let’s see what the other side has to say.

But to consumer advocates, the data is only the latest example of governments and companies having too much access to private information. Once gathered, they say, the data can be used against car owners, to find fault in accidents or in criminal investigations.

I stared blankly at that last sentence for about three minutes, trying to comprehend what it means. It sounds like these “consumer advocates” want to protect the “right” of drivers to get away with murder, er, vehicular manslaughter.

Privacy is nice, sure, but if investigators have tools to accurately determine fault, they should have the ability to use them. (This issue is different from the NSA surveillance program in that data are only retrieved after a crash has taken place, not preemptively.) Take, for example, the opening anecdote about a politician with a hazy memory.

When Timothy P. Murray crashed his government-issued Ford Crown Victoria in 2011, he was fortunate, as car accidents go. Mr. Murray, then the lieutenant governor of Massachusetts, was not seriously hurt, and he told the police he was wearing a seat belt and was not speeding.

But a different story soon emerged. Mr. Murray was driving over 100 miles an hour and was not wearing a seat belt, according to the computer in his car that tracks certain actions. He was given a $555 ticket; he later said he had fallen asleep.

Look, if you want to win a Darwin Award, more power to you. Just make sure you’re not in a populated area when you perform your moronic act. And then don’t lie about it.

[Side note: the writer uses the term “accident” three times in the article. The NYPD retired the term in March in favor of the more-accurate “collision”; I emailed the Public Editor at the Times to see if the paper has (or is considering) a similar policy. I’ll let you know if I hear anything back.]

What’s more, consumer advocates say, government officials have yet to provide consistent guidelines on how the data should be used.

“There are no clear standards that say, this is a permissible use of the data and this is not,” [Khaliah Barnes of the Electronic Privacy Information Center] said.

Here are some suggested clear standards, which the N.H.T.S.A. is welcome to adopt: if you are involved in a car crash, you can expect investigators to take any available data that might help them in their review. And the findings can be used in court.

“It’s data that has not been shown to be absolutely reliable,” [lawyer Daniel] Ryan said. “It’s not black and white.”

OK, Mr. BSD. Let’s play a little guessing game. What else is not absolutely reliable?

That’s right! A driver’s testimony. You can re-read the story above if that’s unclear. This is especially true when the other party is unable to share his version of events because, oh, I don’t know, maybe he was killed in the collision?

But privacy advocates have expressed concern that the data collected will only grow to include a wider time frame and other elements like GPS and location-based services.

Classic example of the slippery slope fallacy. If you can’t answer the question posed, create an apocalyptic scenario that will supposedly result from a chain of events started by this first one.

And compare this idea with the previous one. Privacy advocates are (1) concerned about the accuracy of data but (2) against taking any steps to make them more accurate. Pick one – you can’t argue both sides.

Come on, guys. No one is advocating constant surveillance of drivers. We only want more-accurate information to investigate crashes. If you’re scared of Big Brother, say so – not that you’re worried that the data can be used against car owners.

Just curious, advocates: what are your feelings on subpoenas for phone data after a crash?

“For most of the 100-year history of the car, it used to be ‘he said, she said,’” [data-recorder expert Thomas] Kowalick said. “That’s no longer going to be the way.”

For context, Kowalick was pivotal in getting manufacturers to install black boxes, but now hawks a device to lock down your data.

Replace the car in his quote with baseball, football, tennis, or pretty much any other sport. They went a long time without instant replay. Oh, how the purists howled when replay was proposed! Taking away the integrity and beauty of human errors of judgment for the sake of “getting things right”!

But video review is now considered an integral part of professional play. And determining ball position is small potatoes compared to ensuring an accurate reconstruction of a crash, particularly when innocent lives are involved.
