Bogus statistics on drug use among drivers (Shallow Thoughts)

Akkana's Musings on Open Source, Science, and Nature.

Sat, 31 Jul 2010

Bogus statistics on drug use among drivers

The "Roadshow" column in yesterday's Merc had some pretty ... odd ... statistics involving marijuana and driving.

It quotes "an NHTSA report" as saying:

contrary to popular belief, marijuana has been found to play a significant role in car accidents across the United States, with as much as 33 percent of drivers arrested at the scene of the accident being positive for marijuana and another 12 percent testing positive for marijuana and cocaine. Every year, 28 percent of drivers in the U.S. will attempt to drive within two hours after ingesting alcohol or illicit drugs. Marijuana is the drug used most often — 70 percent — by drivers who drove after drug use and is a major factor why crashes are the leading cause of death for American young people.

Whoa. Let's play that back again: 45 percent of all drivers arrested at accident scenes (33 plus another 12) test positive for marijuana? Nearly half?

Mr. Roadshow, you don't really believe that number, do you?

I didn't. So I did some searching, looking for the NHTSA source.

When I searched for large portions of the quoted phrase, I didn't find anything from the NHTSA. The Roadshow quote appears to come from an article on friendsdrivesober.org (I'm sure that's an unbiased source). (Here's their MS Word file, or Google's cached HTML version.) The same article is also available as a PDF at prevnet.org, and there are lots of other pages making reference to it.

The friendsdrivesober.org article cites "Brookoff, Cook & Mann, 1994; Sonderstrom, Dischinger, Kerns & Trillis, 1995." for the 33% number. There's no citation offered for the "28% will attempt to drive...". They credit "NHTSA, 2000" for "Marijuana is the drug used most often ... by drivers who drove after drug use", but that one's not important because it says nothing about prevalence in accidents, merely that it's used more often than other drugs (no surprise there).

The NHTSA weighs in

Googling on a more general set of terms, I found my way to an October 2000 NHTSA report, Field Test of On-Site Drug Detection Devices. It's a roundup of many different studies, with drug use numbers all over the map, though none larger than the 33% figure and certainly nothing near 45%. That 33% figure is near the bottom:

Brookoff et al. (1994) used on-site testing devices in a study that found a 58% prevalence rate for drugs in subjects arrested for reckless driving (who were not found to be impaired by alcohol). The Brookoff team found that 33% of their sample tested positive for marijuana, 13% for cocaine, or 12% for both. (Because of sampling flaws in the study, these drug test rates should not be interpreted as drug prevalence rates for reckless drivers.) Interestingly, the on-site device (Microline) used by Brookoff and his colleagues generated a significant false positive rate for marijuana when compared to GC/MS results.

The horse's mouth

So what about the original study? I wasn't able to find Sonderstrom, Dischinger, Kerns & Trillis, but here's Brookoff et al. at the New England Journal of Medicine: Testing Reckless Drivers for Cocaine and Marijuana (cookies required).

A couple of important notes on the study: the figures represent the percentage of drivers arrested for "reckless driving that would constitute probable cause to suspect intoxication by drugs", who were not considered to be under the influence of alcohol, and who were suspected of being under the influence of marijuana or cocaine ("all patrol officers were told that they could summon [the testing van] if they stopped a person suspected of driving recklessly under the influence of cocaine or marijuana"). Moreover, not all drivers consented to be tested, and the percentages are only for those who were tested.

Seems like a perfectly valid study, as far as it goes (though there's been some mild criticism of the test they used). It's mostly interesting as a study of how marijuana and cocaine use correlate with visible intoxication and sobriety test results. It's not a study of the prevalence of drugs on the road: the NHTSA report is right about that. The numbers it reports are useless in that context.

So the jump from that study to what friendsdrivesober.org and Roadshow implied -- that 45% of people involved in car accidents test positive for marijuana -- is quite a leap, and attributing that leap to the NHTSA seems especially odd since they explicitly say the study shouldn't be used for those purposes.

What really happened here?

Brookoff, Cook, Williams and Mann publish a study on the behavior of reckless drivers under the influence of drugs.

NHTSA makes a brief and dismissive reference to it in a long survey paper.

Then friendsdrivesober.org writes an article that references the study but entirely misinterprets the numbers. That article gets picked up and referenced by other sites, out of context.

Then somehow the paragraph from friendsdrivesober.org shows up in Roadshow, attributed to the NHTSA. How did that happen?

If you look at the friendsdrivesober.org article, the paragraph cites Brookoff in its first sentence, then goes on to other unrelated claims, citing an NHTSA study at the end of the paragraph. I suppose it's possible (though hard to understand) that one could miss the first reference, and take the NHTSA reference at the end of the paragraph as the reference for the whole paragraph. That's the best guess I can come up with. Just another example of the game of telephone.

Nobody with any sense thinks it's a good idea to drive under the influence of marijuana or other intoxicants. But bogus statistics don't help make your point. They just cast doubt on everything else you say.
