A new data working group from BBC News, a data library adds uploading capabilities, and a timeline of data journalism.
BBC News is the latest media company to create a working group tasked with developing “innovative and experimental” journalism projects. The BBC ‘NewsLabs’ team will focus on data journalism and data visualization. The Guardian calls it a ‘back to the future’ move by the BBC’s new managing editor, James Harding.
After Washington Post owner Jeff Bezos announced this week that Amazon may soon be making customer deliveries by drone, USA TODAY wondered whether newspaper delivery boys in Bezos’ jurisdiction should be worried.
The New York Times is replacing Nate Silver’s FiveThirtyEight blog (which Silver took to ESPN back in July) with a brand new site intended to “produce clear analytical reporting and writing on opinion polls, economic indicators, politics, policy, education, and sports.” The venture will be headed by D.C. bureau chief David Leonhardt, who also helmed the search committee and selected himself for the job. Naturally, his colleagues are teasing Leonhardt for “pulling a Dick Cheney.” The new team will also include presidential historian Michael Beschloss, Nate Cohn of The New Republic, and economist Justin Wolfers.
Take it from me: if you are short on time, do not even attempt to play around on the new Spending Stories website. Developed by the folks at the Open Knowledge Foundation and Journalism++, Spending Stories is intended to help journalists understand and contextualize spending data through easy comparisons to other datasets. For example, using the site, I was able to see that US$15,000 is equal to 3% of private ambulance costs in Yorkshire, England; 0.02% of the cost of the contract awarded to IT company CGI to implement healthcare.gov; and 90% of UK government spending per person per year in 2012. It’s a fun tool!
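Under the hood, comparisons like these are simple percentage arithmetic: divide the figure in your story by each reference figure. A minimal sketch of that idea, assuming nothing about how Spending Stories actually implements it (the function name and all reference figures below are hypothetical):

```python
def contextualize(amount, references):
    """Express `amount` as a percentage of each named reference figure."""
    return {name: round(100 * amount / total, 2)
            for name, total in references.items()}

# Hypothetical reference figures for illustration only; not taken from the site.
comparisons = contextualize(15_000, {
    "regional_ambulance_costs": 500_000,   # made-up figure
    "it_contract_total": 500_000_000,      # made-up figure
})
print(comparisons)
```

The interesting part of a tool like this is not the arithmetic but curating a library of reference figures that readers actually care about.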
The ProPublica Nerd Blog this week features an article by Hassel Fallas, a data journalist at La Nación in Costa Rica. Fallas was a 2013 Fellow at the International Center for Journalists, where she studied up on Data-Driven Journalism’s Secrets. Spoiler alert: The secret is…don’t keep secrets.
Over at the Data Driven Journalism blog, the post A Fundamental Way Data Repositories Must Change includes some fascinating examples of how data has been manipulated in Romania and Rwanda, both historically and in the present day.
Knoema, a new Google Chrome extension, provides access to more than 500 data repositories, along with visualization tools for working with those databases. Knoema’s CTO says the platform can be used solely as a data source, but more importantly, journalists can use it to create embeddable visualizations. Pretty cool.
Data journalism and social media merge, a call-out for ‘crap’ data journalism, and tips for creating a data resume.
I suppose it was only a matter of time before the worlds of data journalism and social media cozied up and got comfortable. The London office of the Trinity Mirror announced that their new initiative, Mysterious Project Y, will focus on creating data journalism that will be compelling to share on the social Web. The site will focus on visualizations; “charts, graphs, facts, and figures” that “people care passionately about.”
You may have heard the statistic floating around lately that it is more difficult to land a job at a new location of the supermarket chain Wegmans than it is to gain admission to Harvard University. The dragonflyeye blog refutes the numbers and calls the story an example of “crap data journalism.” Ouch.
As an aspiring journalist, I worked in the Washington Post newsroom in an entry-level position that used to be known as “copy boy” (later updated to the more inclusive “copy aide”). I loved taking in the energy of the reporters, especially when they had pulled off a “scoop,” or a story that the other papers didn’t have yet. A pall fell over the newsroom when other papers “scooped” us and published a story that the Post reporters had been too slow to report.
Finnish data journalist Esa Makinen says that data visualizations are journalism’s “new scoop.” Text stories can be quickly re-published by competitors, Makinen told journalism.co.uk, but data visualizations cannot be copied. Makinen works on the data desk at Finland’s daily paper and website, Helsingin Sanomat, and spoke this week at the Digital Journalism Days conference in Warsaw.
A new tablet-first investigative publication is in the works from a team of data journalists around the world. Acuerdo (an old Spanish word for ‘agreement’) bills itself as “long-form journalism for pissed-off readers.” The first edition will be published next month in three languages. If you self-identify as a pissed-off reader, consider making a contribution to Acuerdo’s Kickstarter campaign.
Making corrections to data stories, a Brazilian hackday, and the ‘truth’ about big data and agnostic storytelling.
A few weeks ago in this space, I wrote about efforts to create a corrections policy for data journalists. It turns out, the Toronto Star needed this policy sooner rather than later, after a summer intern pitched and created a project that featured a searchable database of banned license plates, which included material from another Star reporter’s article three years prior. The Star published a public editor’s note about the issue. Does the problem of plagiarism become more complicated when it includes previously-reported data?
For those who are interested in the field of data journalism but unsure of where to start, The Data Journalism Heist offers a quick introduction. The e-book’s tagline: “How to get in, get the data, and get the story out – and make sure nobody gets hurt.”
When a government shutdown renders government data websites useless, what’s a data journalist to do? This week, reporters hoping to gather data from sites like the US Census Bureau, the USDA’s Food and Nutrition Service, and the Bureau of Economic Analysis were out of luck, as access to most online government data was blocked due to the government shutdown.
The Pew Research Center offered a mostly comprehensive list of the data casualties of the shutdown.
As is the case with practitioners of most emerging and rapidly expanding fields, data journalists are finding it increasingly necessary to generate a code of sorts to deal with ethical issues and problems. In The Times Regrets the Programmer Error, a newsroom developer at the New York Times asks whether it’s time to create a detailed and explicit corrections policy for data.
And Paul Bradshaw of Birmingham City University imagines what a code of ethics for data journalism would look like. Ethical guidelines are necessary because of the sheer volume of data available in public databases, he says.
Finalists for the Gannett Award for Technical Innovation in Digital Journalism were announced by the Online Journalism Association this week. They were the data visualization tool D3.js; Quartz, a digitally native news site for business people; and Tarbell, a content management system created by the Chicago Tribune News Applications Team (and named after muckraking journalist Ida Tarbell).
Data journalism’s ‘secret weapon’, data newswires, and the newest data-scraping tools for journalists.
When investigative reporter and journalism instructor Chad Skelton needed help writing a curriculum for a data journalism course, he turned to NICAR-L, the email listserv of the National Institute for Computer-Assisted Reporting, for advice. Skelton says that virtually every data journalist in North America is plugged in to the NICAR listserv, making it data journalism’s “secret weapon.”
In 5 tips for a data journalism workflow, the Online Journalism Blog advises newsrooms to find and tap into “data newswires” in the same way newsrooms have used traditional newswires like AP and Reuters.
Latin America’s Media Party, Tow/Knight Research Projects, and why necessity is the mother of invention.
Fun fact: Over the last two years, the Buenos Aires chapter of Hacks/Hackers has grown to be the second largest in the world, with more than 2,200 members. (New York City is the largest.) This weekend, the city hosts the second annual Media Party, one of the biggest events in the Americas for newsroom programmers and data journalists. Featured guests include NPR news apps editor Brian Boyer; Jacqui Maher, assistant editor for interactive news at The New York Times; and Media Factory, Latin America’s first venture capital fund for emerging news organizations.
In fact, developing countries around the world have been hosting a brand-new crop of data journalism initiatives. Most recently, the Canadian nonprofit Journalists for Human Rights collaborated with media outlets in Ghana to report several data-driven stories, like one that examined the frequency of bribe-paying in Ghana. The Data Driven Journalism blog reported on the progress of the Kenya Open Data Initiative (KODI) two years after its launch. And 12 reporters from seven nations are learning data journalism as part of ‘Flag It’, a training course designed by Ecolab in partnership with the European Youth Press.