The Physicists of Journalism

This Alberto Cairo piece on “data journalism” has been kicking around for a while, and it’s taken me some time to pin down what bugs me about it. I think my problem with it ultimately comes down to the first two section headers, in which he identifies problems with FiveThirtyEight and Vox:

1. Data and explanatory journalism cannot be done on the cheap.

2. Data and explanatory journalism cannot be produced in a rush.

The implication here is that “data and explanatory journalism” is necessarily a weighty and complicated thing, something extremely Serious to be approached only with great care. But that seems to me to miss the entire point of FiveThirtyEight and Vox, and the thing that makes them great.

That is, the best feature of these sites, to my mind, is that they don’t view “data journalism” as Serious and unapproachable. And that’s a very good thing, because if the people producing it view it as a difficult and weighty enterprise, that gets through to the readers. Who will then view data-driven reporting as something Serious, only to be approached when you’re in the mood for some heavy slogging.

And that attitude already exists, and it’s one of the biggest problems facing science communication and serious policy discussions. People view math and science as Difficult and Serious, and flinch away when they come up. Which canny and cynical politicians exploit to create an illusion of confusion around policy issues where the science is actually perfectly clear-cut.

The biggest potential benefit of sites like FiveThirtyEight and Vox is that they might break that association between “data-driven” and “hard work.” Yeah, a lot of their stuff is kind of superficial, but I’d say that’s a feature, not a bug: if you can ease people into thinking a bit more quantitatively through somewhat superficial treatments of popular subjects, that’s all to the good. The kind of lightweight stories that you can do on the cheap and in a rush are exactly what we need. If the only data-driven stories we get are massive, exhaustively researched pieces about weighty topics, then people will continue to think that statistics are hard and boring, and nothing will change.

But then, I would say that sort of thing, given that I’m a physicist and a blogger (also, full disclosure, Nate “FiveThirtyEight” Silver figures prominently in one chapter of my forthcoming book). Physicists are famous for our love of simplified fast-and-cheap approximate models (“spherical cows” and all that), and bloggers are all about generating content quickly. So reading FiveThirtyEight and Vox is sort of like reading Dot Physics if Rhett wrote about current events. And if I ever lost my mind and decided to give up the tenured professor gig for the glamorous life of a full-time journalist, that’s the sort of thing I’d be most likely to go for.
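To make the spherical-cow mindset concrete, here’s the classic Fermi-problem warm-up (roughly how many piano tuners work in Chicago?) sketched out in a few lines of Python. Every number in it is a rough guess I’ve supplied purely for illustration; the point of the exercise is the order of magnitude, not the exact answer.

```python
# A toy Fermi estimate: roughly how many piano tuners work in Chicago?
# Every input below is a rough guess chosen for illustration, not real data.

population = 3_000_000                    # people in Chicago, order of magnitude
people_per_household = 2.5                # rough average household size
fraction_of_households_with_piano = 0.05  # guess: one household in twenty
tunings_per_piano_per_year = 1            # a piano gets tuned about once a year
tunings_per_tuner_per_year = 2 * 5 * 50   # ~2 tunings/day, 5 days/week, 50 weeks/year

pianos = population / people_per_household * fraction_of_households_with_piano
tuners = pianos * tunings_per_piano_per_year / tunings_per_tuner_per_year

print(f"~{pianos:,.0f} pianos, so on the order of {tuners:.0f} piano tuners")
```

Crude inputs, transparent arithmetic, and an answer that’s good enough to reason with: that’s the spirit of the fast-and-cheap approach, and it’s the spirit I’d like to see more of in journalism.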

So, while I occasionally have issues with things that they write, as a general matter, I love what FiveThirtyEight and Vox are doing. Nate Silver and Ezra Klein are the physicists of the journalistic world, and I think we need more of that. We need “data journalism” that’s fast and cheap and most of all fun.

——

(Disclaimer: I don’t read everything that FiveThirtyEight and Vox post, and I don’t like everything that I do read, so please spare me the “gotcha” comments pointing to some stupid thing that one of their writers posted. I’m talking about the general approach, here, which I like very much.)

Comments

  1. We need “data journalism” that’s fast and cheap and most of all fun.

    We need “data journalism” of all kinds, but yes, we definitely need some of the easy stuff that’s cheap enough that hungry journalists (to say nothing of corporate bean counters) are willing to take the risk to do it and support it financially.

    Anything but the “wholesale returns of conjecture out of such a trifling investment of fact” that Mark Twain associated with science; these days that description applies much more strongly to the pundit-driven news cycle we have now.

    BBC News has finally come to its senses and pulled Mark Mardell off its web-based reporting on US politics; he wasn’t bringing anything to the table, other than a BBC English accent (which is irrelevant on the Web), that couldn’t be found elsewhere. I hope other major news networks follow suit and consign most of their pundits to the unemployment line.

  2. The problem with fast and cheap data journalism is that there is a certain mindset in the non-scientist community (heck, even within the scientist community) which basically amounts to “It’s got numbers – it must be right!”

    Fermi estimations are great and all, but they become dangerous (especially in the social/political sphere) when they’re interpreted by readers as something more than a quick back-of-the-envelope idea-checker/generator, and instead are taken to be a definitive analysis. “No, you’re wrong – 538 did an analysis on this and proved that there’s no correlation.”

  3. I agree that over-weighting “scientific” analysis is a possible problem, but I don’t think the solution to that is to demand that only exhaustive and rigorous analyses be presented, any more than the solution to hype is to insist that nothing can be discussed with the media until peer review has run its course. What we need is more of this stuff, so people can get used to the idea of back-of-the-envelope estimates as a real thing that scientists do. And, more importantly, something that non-scientists can understand, and even do.

  4. Perhaps it’s not so much that statistics is difficult as that it’s unintuitive.
