Lessons from Covering the Gulf Oil Leak

Posted Wednesday, July 7, 2010 at 2:56 p.m. by Chris Amico in Frameworks for Reporting, Lessons From... and Projects about JavaScript, journalism and programming

Cross-posted at MediaShift

The oil spill in the Gulf of Mexico has lasted more than two months now. It is the worst spill in US history, and it is likely to continue until at least August. And in covering it, the NewsHour has broken every traffic record it ever had.

So, what have we learned here?

(Quick note: A lot of the thinking behind this post comes from a debriefing at work with my colleagues Vanessa Dennis, Travis Daub and Katie Kleinman, and from conversations about the spill and our coverage with other people in and out of the newsroom. Just so no one thinks this is all coming out of my head. Now then...)

Embrace the uncertainty

It's incumbent on [journalists] to level with the users. If that means backing up to say, "Actually, it's hard to tell what happened here," or, "I'll share with you what I know, but I don't know who's right," this may be unsatisfying to some, but it may also be the best an honest reporter can do. Portraying conflicting accounts or clashing interpretations is an exacting skill, which does require a certain detachment. But there is no necessary connection between that skill, or that kind of detachment, and the ritualized avoidance of all conclusions, such as we find in He Said, She Said and the View from Nowhere.

Jay Rosen: Fixing The Ideology Problem in Our Political Press

Rosen is talking about political journalism, but I think it applies very well here (and there are plenty of political facets to this story). As I said in my earlier post on the spill, part of what made me hesitant to make that now-famous ticker was having to choose a flow rate when there were so many conflicting reports.

Uncertainty is part of the story here. Sometimes it's a huge part. A lot of journalists are probably uncomfortable saying, explicitly, "We don't know, and neither does anyone else," but that's what the story is.

Commit to the story

The NewsHour has gotten good at this kind of story of late. For big, complicated events, where lots of people are watching, where knowing what happened is easy but knowing what it means is hard, the NewsHour has learned how to tell the ongoing story and, critically, to stick to it.

We don't do this for most stories. There are lots of one-off blog posts and features, and plenty that can be told with one segment on the show. But good stuff happens on the stories where we can dive in and hang on.

Also, putting it all in one place is helpful.

Give users tools to answer their own questions

Here's what I told Poynter's Al Tompkins:

The "NewsHour" is a public media company, and I think part of our mission is to give the public tools to understand the news better. People see this and have different reactions, and by letting them embed it on their own sites, we allow the conversation to spread beyond areas we can think up ourselves.

There are questions we'll never think of. That's true of the NewsHour, and it's true of the New York Times. And even if we could think of every possible angle to a story, there is no guarantee that we'll answer your particular question. Building tools our users (and reporters) can use gives us a way to catch more of those questions and find more of those answers.

Build things that make your reporting better

Here's what I'm most proud of about the widget I didn't want to build: It made our reporting better.

If we were going to estimate how much oil had flowed into the Gulf, it was vital that we knew what the estimates were, how they were made and what numbers were defensible. I've rewritten the JavaScript a handful of times as the situation has changed, and tracked those changes. My colleague Lea Winerman has gone back to scientists repeatedly to get their read on the latest data. We can stand by our math.
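To make that concrete, here is a minimal sketch of how a running spill counter like ours could work. The variable names and structure are my illustration, not the widget's actual code; the 20,000 barrels-per-day figure is the government's low-end estimate mentioned later in this post, and the standard conversion is 42 US gallons per barrel.

```javascript
// Illustrative sketch of a running spill counter. The start date is the
// Deepwater Horizon explosion; the rate is a parameter so the math can
// be rerun whenever the official estimates change.
var SPILL_START = Date.UTC(2010, 3, 20); // April 20, 2010 (months are 0-indexed)
var GALLONS_PER_BARREL = 42;

// Total barrels spilled at a given rate, as of a given timestamp (ms).
function barrelsSpilled(barrelsPerDay, nowMs) {
  var elapsedDays = (nowMs - SPILL_START) / (24 * 60 * 60 * 1000);
  return barrelsPerDay * elapsedDays;
}

// Human-readable estimate in barrels and gallons.
function formatEstimate(barrelsPerDay, nowMs) {
  var barrels = barrelsSpilled(barrelsPerDay, nowMs);
  return Math.round(barrels) + " barrels (" +
         Math.round(barrels * GALLONS_PER_BARREL) + " gallons)";
}
```

Keeping the rate as a parameter, rather than hard-coding it, is what makes it cheap to rewrite the numbers every time the estimates shift.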

Most of this is just good beat reporting, but having a constant, visual reminder that we need to be right is a nice prod.

Do something new

Probably obvious, but still...

Clarity, FTW

I've written a lot of blog posts about math. I try to make these as readable as possible, but it's math. And I think it's important to explain where we're coming from and how we reach the numbers and conclusions we reach.

This comes back to embracing uncertainty. Here's what we said a week ago, as we struggled to find out whether more oil was coming out of the ruptured well after BP cut the riser pipe:

Did the flow rate increase significantly after June 3, when BP cut the riser pipe in order to put the current containment dome in place? And if the flow rate did increase, by how much?

We haven't found a clear answer to that question. An Interior Department official said that preliminary analysis suggested a modest increase, but that they didn't have definitive information to measure the change.

And Ira Leifer, a researcher at the University of California-Santa Barbara and a member of the flow estimate panel, told us in an e-mail that the scientists can't be sure whether there was an increase because BP didn't provide enough data from before the riser cut to get a good estimate of the flow then.

Given that uncertainty, we initially left the minimum flow rate in our Gulf Leak Meter at 20,000 barrels per day, reflecting what the government's Flow Rate Technical Group reported on June 10 -- an estimate they based on data from before the riser was cut.

But today, BP says it captured 16,020 barrels of oil and flared another 9,270, for a total of 25,290 barrels (1,062,180 gallons) diverted from the Gulf.

(I say "we" in this case because parts of that post were written by me, and parts by Lea Winerman.)
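The arithmetic in that excerpt is simple enough to check yourself, using the standard conversion of 42 US gallons per barrel:

```javascript
// Verifying BP's figures from the excerpt above: barrels captured plus
// barrels flared, then converted to gallons.
var captured = 16020; // barrels piped up to the surface ship
var flared = 9270;    // barrels burned off at the surface
var diverted = captured + flared;
console.log(diverted);      // 25290 barrels
console.log(diverted * 42); // 1062180 gallons
```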

This is getting awfully long, so in keeping with the above principles, I'm going to open it up from here. What other lessons should we learn from covering the spill?


