Robert Gehl writes that the devil is in the details. Or more specifically, the devil is in the statistics.
If you can get people to believe the statistics you’re throwing out, you can change hearts, minds, and even laws.
That’s why reporters and people in the news media need to be very careful when citing statistics, quoting them accurately and in context.
Sadly, this doesn’t happen much of the time.
Take, for example, a statistic that was tossed around stating that 4.2 percent of American children have witnessed a shooting in the past year.
This was widely reported in the media – including The Washington Post and the Dallas Morning News. It was based on a national study about kids and guns.
A free-thinking reader of the Dallas paper questioned that number. “Really?” Steve Doud wrote. “Does it really sound believable that one kid out of every 24 has witnessed a shooting in the last year? I think not, unless it was on TV, in a movie, or in a video game. In that case it would probably be more like 100 percent.”
So the paper decided to look into the statistic. And guess what? It’s all bogus.
“Here is the unfortunate story of how a couple of teams of researchers and a whole bunch of news organizations, including this one, unintentionally but thoroughly misinformed the public,” the paper’s editor, Mike Wilson, wrote.
It all started in 2015, when University of New Hampshire sociology professor David Finkelhor and two colleagues published a study called “Prevalence of Childhood Exposure to Violence, Crime, and Abuse.” They gathered data by conducting phone interviews with parents and kids around the country.
The Finkelhor study included a table showing the percentage of kids “witnessing or having indirect exposure” to different kinds of violence in the past year. The figure under “exposure to shooting” was 4 percent.
Those words — exposure to shooting — are going to become a problem in just a minute.
Earlier this month, researchers from the CDC and the University of Texas published a nationwide study of gun violence in the journal Pediatrics. They reported that, on average, 7,100 children under 18 were shot each year from 2012 to 2014, and that about 1,300 a year died. No one has questioned those stats.
The CDC-UT researchers also quoted the “exposure to shooting” statistic from the Finkelhor study, changing the wording — and, for some reason, the stat — just slightly:
“Recent evidence from the National Survey of Children’s Exposure to Violence indicates that 4.2 percent of children aged 0 to 17 in the United States have witnessed a shooting in the past year.”
And it’s the wording that makes all the difference.
The Washington Post wrote a piece about the CDC-UT study. Why not? Fascinating stuff! The story included the line about all those kids witnessing shootings.
The Dallas Morning News picked up a version of the Washington Post story.
And Steve Doud sent editor Mike Wilson an email.
Wilson continues:
When I got it, I asked editorial writer Michael Lindenberger to do some research. He contacted Finkelhor, who explained the origin of the 4 percent “exposure to shooting” stat.
According to Finkelhor, the actual question the researchers asked was, “At any time in (your child’s/your) life, (was your child/were you) in any place in real life where (he/she/you) could see or hear people being shot, bombs going off, or street riots?”
So the question was about much more than just shootings. But you never would have known from looking at the table.
Finkelhor said he understood why “exposure to shooting” might have misled the CDC-UT researchers even though his team provided the underlying question in the appendices. Linda Dahlberg, a CDC violence prevention researcher and co-author of the study featured in The Post and this newspaper, said her team didn’t notice anything indicating the statistic covered other things.
Then again, the Finkelhor study didn’t say anything about kids “witnessing” shootings; that wording was added by the CDC-UT team. Dahlberg said she’ll ask Pediatrics about running a correction.
All of this matters because scientific studies — and the way journalists report on them — can affect public opinion and ultimately public policy. The idea that one in 25 kids witnessed a shooting in the past year was reported around the world, and some of the world probably believed it.
No matter where you stand on guns or any other issue, we ought to be making decisions based on good information.
Finkelhor’s team caused confusion by mislabeling a complicated stat. The CDC-UT researchers should have found the information suspect. The Washington Post should have asked more questions about that line from the CDC-UT study.
And we should have been as skeptical of the Washington Post report as Steve Doud was.
Wording and framing matter in journalism; the media at large needs to remember that and hold itself to a higher standard.