This article sets out a 10-point guide to data journalism. I particularly like point 5:

"Data journalism is 80% perspiration, 10% great idea, 10% output."

The Prezi under point 5 explains how data is used to support news stories, the angles to consider when mashing datasets together, the technical challenges of working with data, and the iterative calculation and QA process, which finally gets turned into beautiful output with the various (mostly free) visualisation tools.
This is practically the same process that an Operational Research consulting project takes - or any application of OR or Science in general:
- Understand what the problem/question is
- Create a hypothesis to be proven or disproved
- Define what data is needed to answer the question
- Get the data
- Clean it and manipulate/wrangle it so it's usable for analysis
- Analyse/calculate to come to some conclusion - hence proving or disproving the hypothesis
- Compare it to subject matter experts' view of what the likely answer should be (sanity check; see the sketch after this list)
- Refine the analysis until satisfied
- Shape the output message so it can be easily understood by the audience
- Communicate the findings
- Throughout the process, keep communicating with the audience to make sure they stay engaged and understand, at least in principle, what you're trying to do, so that they are not unpleasantly surprised when the final answer is presented
- Better yet, if your solution is to be implemented, ensure smooth change management by working closely with the end users from the start of the design, through implementation and testing, so that they believe in the solution because they were part of creating it.
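
To make the middle steps concrete, here is a minimal Python/pandas sketch of the get → clean → analyse → sanity-check loop. The file name, column names, hypothesis and the expert's expectation are all invented for illustration; the point is the shape of the process, not a definitive implementation.

```python
# Minimal sketch of the get -> clean -> analyse -> sanity-check loop.
# File name, column names and the expert's expectation are hypothetical.
import pandas as pd

# Get the data
raw = pd.read_csv("incidents.csv")

# Clean and wrangle: drop rows that can't be used, parse dates
df = raw.dropna(subset=["region", "reported_at"]).copy()
df["reported_at"] = pd.to_datetime(df["reported_at"], errors="coerce")
df = df.dropna(subset=["reported_at"])

# Analyse: do a handful of regions account for most incidents?
per_region = df.groupby("region").size().sort_values(ascending=False)
top_share = per_region.head(5).sum() / per_region.sum()

# Sanity check against a subject-matter expert's expectation (say, ~60%);
# if the gap is large, go back and refine the analysis rather than publish
expected = 0.60
if abs(top_share - expected) > 0.10:
    print(f"Top-5 share is {top_share:.0%}, expected ~{expected:.0%}: refine the analysis")
else:
    print(f"Top-5 share is {top_share:.0%}, in line with expectations")
```

In practice this loop runs several times: each refinement tightens the cleaning rules or the calculation until the result survives the sanity check, and only then does it get shaped into the output for the audience.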
For those interested in the how of data journalism, read this about the work that went into reporting on the 2011 London Riots. Fascinating social media analytics at work. Not easy. Impressive and very interdisciplinary.
P.S. Most of this post has been sitting as a draft since the summer, hence the references to 'old' news. It's still relevant, so why not.