ENTRIES TAGGED "Gov 2.0"

Tracking the data storm around Hurricane Sandy

When natural disasters loom, public open government data feeds become critical infrastructure.

Just over fourteen months ago, social, mapping and mobile data told the story of Hurricane Irene. As a larger, more unusual late October storm churns its way up the East Coast, the people in its path are once again acting as sensors and media, creating crisis data as this “Frankenstorm” moves over them.

Hurricane Sandy is seen on the East Coast of the United States in this NASA handout satellite image taken at 0715 GMT, October 29, 2012.

[Photo Credit: NASA]

As citizens look for hurricane information online, government websites are in high demand. In late 2012, media, government, the private sector and citizens all play important roles in sharing information about what’s happening and providing help to one another.

In that context, it’s key to understand that it’s government weather data, gathered and shared from satellites high above the Earth, that a huge number of infomediaries use to forecast, predict and instruct people about what to expect and what to do. In perhaps the most impressive mashup of social and government data now online, an interactive Google Crisis Map for Hurricane Sandy tracks the “Frankenstorm” and its predicted path in real time, with a NYC-specific version as well.

If you’re looking for a great example of public data used for the public good, maps like Weather Underground’s interactive are a canonical example of what’s possible.
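
To make the idea concrete, here is a minimal sketch of what consuming an open government weather feed looks like in Python. It uses the National Weather Service’s public JSON API as it exists today; the api.weather.gov endpoints and response fields are assumptions based on the current service, not the 2012-era feeds these apps were built on:

```python
import requests

# The National Weather Service asks clients to identify themselves
# with a User-Agent header on every request.
HEADERS = {"User-Agent": "open-data-demo (you@example.com)"}

# Ask the API which forecast endpoint covers a given point
# (here: lower Manhattan).
point = requests.get(
    "https://api.weather.gov/points/40.71,-74.01",
    headers=HEADERS,
    timeout=10,
)
point.raise_for_status()
forecast_url = point.json()["properties"]["forecast"]

# Fetch the forecast itself and print the next few periods.
forecast = requests.get(forecast_url, headers=HEADERS, timeout=10)
forecast.raise_for_status()
for period in forecast.json()["properties"]["periods"][:3]:
    print(period["name"], "-", period["detailedForecast"])
```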

Read more…

DataMarket charges up with open energy data

Want to build a business on open data? Add value by solving a problem for your users.

Hjalmar Gislason commented earlier this year that open data has been all about apps. In the future, it should be about much more than consumer-facing tools. “Think also about the less sexy cases that can help a few people save us millions of dollars in aggregate, generate new insights and improve decision making on various levels,” he suggested.

Today, the founder and CEO of DataMarket told the audience of the first White House Energy Datapalooza that his company would make energy data more discoverable and usable. In doing so, DataMarket will be tapping into an emerging data economy of businesses using open government data.

“We are honored to have been invited to take part in this fantastic initiative,” said Gislason in a prepared statement. “At DataMarket we focus on doing one thing well: aggregating vast amounts of heterogeneous data to help business users with their planning and decision-making. Our new energy portal applies this know-how to the US government’s energy data, for the first time enabling these valuable resources to be searched, visualized and shared through one gateway and in combination with other domestic and worldwide open data sources.”

Energy.datamarket.com, which won’t officially go live until mid-October, will offer search across 10,000 datasets, 2 million time series and 50 million energy facts. The portal is based upon data from thirteen different providers, including the U.S. Department of Energy’s Energy Information Administration (EIA), Oak Ridge National Laboratory, the Energy Efficiency and Renewable Energy program, the National Renewable Energy Laboratory, the Environmental Protection Agency (EPA), the Bureau of Transportation Statistics, the World Bank and the United Nations.
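
For a rough sense of the kind of machine-readable energy data being aggregated here, the sketch below pulls recent monthly retail electricity prices from the EIA’s open data API. It uses the present-day v2 API, so the route, parameters and the EIA_API_KEY environment variable are assumptions for illustration; DataMarket’s own gateway works differently:

```python
import os

import requests

# EIA's open data API serves time series such as monthly retail
# electricity prices. A free API key is required.
API_KEY = os.environ["EIA_API_KEY"]  # hypothetical setup; register at eia.gov

resp = requests.get(
    "https://api.eia.gov/v2/electricity/retail-sales/data/",
    params={
        "api_key": API_KEY,
        "frequency": "monthly",
        "data[0]": "price",
        "facets[stateid][]": "NY",    # New York...
        "facets[sectorid][]": "RES",  # ...residential sector
        "sort[0][column]": "period",
        "sort[0][direction]": "desc",
        "length": 12,                 # most recent twelve months
    },
    timeout=30,
)
resp.raise_for_status()
for row in resp.json()["response"]["data"]:
    print(row["period"], row["price"], row.get("price-units", ""))
```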

Last week, I interviewed Gislason about his company and why they’re focusing on energy data.

Read more…

Knight winners are putting data to work

The common thread among the Knight Foundation's latest grants: practical application of open data.

Data, on its own, locked up or muddled with errors, does little good. Cleaned up, structured, analyzed and layered into stories, data can enhance our understanding of the most basic questions about our world, helping journalists to explain who, what, where, how and why changes are happening.

Last week, the Knight Foundation announced the winners of its first news challenge on data. These projects are each excellent examples of working on stuff that matters: they’re collective investments in our digital civic infrastructure. In the 20th century, civil society and media published the first websites. In the 21st century, civil society is creating, cleaning and publishing open data.

The grants not only support open data but validate its place in the media ecosystem of 2012. The Knight Foundation is funding data science, accelerating innovation in journalism and media to help inform and engage communities, work that it considers “vital to democracy.”

Why? Consider the projects. Safecast creates networked accountability using sensors, citizen science and open source hardware. LocalData is a mobile method for communities to collect information about themselves and make sense of it. Open Elections will create a free, standardized database of election results. Development Seed will build better tools to contribute to and use OpenStreetMap, the “Wikipedia of maps.” Pop Up Archive will develop an easier way to publish and archive multimedia data on the Internet. And Census.IRE.org will improve the ability of a connected nation and its data editors to access and use the work of the U.S. Census Bureau.

The projects hint at a future of digital open government, journalism and society founded upon the principles that built the Internet and World Wide Web and strengthened by peer networks between data journalists and civil society. A river of open data flows through them all. The elements and code in them — small pieces, loosely joined by APIs, feeds and the social web — will extend the plumbing of digital democracy in the 21st century.
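
The “small pieces, loosely joined” plumbing is easy to picture with a short example. Here is a minimal sketch of pulling state populations from the Census Bureau’s public API, the kind of feed a project like Census.IRE.org builds on; the dataset path and variable name reflect the Bureau’s current API and are assumptions for illustration:

```python
import requests

# The Census Bureau API returns JSON arrays: a header row followed
# by one row per geography. Here: 2010 total population by state.
resp = requests.get(
    "https://api.census.gov/data/2010/dec/sf1",
    params={"get": "NAME,P001001", "for": "state:*"},
    timeout=30,
)
resp.raise_for_status()
header, *rows = resp.json()
for name, population, fips in rows:
    print(f"{name} (FIPS {fips}): {int(population):,} residents")
```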

Read more…

Uncertain prospects for the DATA Act in the Senate

If legislative efforts to standardize federal government spending data founder in the U.S. Senate, it's a missed opportunity.

The old adage that “you can’t manage what you can’t measure” is often applied to organizations in today’s data-drenched world. Given the enormous size of the United States federal government, breaking down the estimated $3.7 trillion in the 2012 budget into its individual allocations, much less drilling down to individual outlays to specific programs and their subsequent performance, is no easy task. There are several sources policy wonks can turn to for applying open data to journalism, but the flagship database of federal government spending, USASpending.gov, simply isn’t anywhere near as accurate as it needs to be to source stories. The issues with USASpending.gov have been extensively chronicled by the Sunlight Foundation in its ClearSpending project, which found that nearly $1.3 trillion of federal spending as reported on the open data website was inaccurate.

If the people are to gain more insight into how their taxes are being spent, Congress will need to send President Obama a bill to sign that improves the quality of federal spending data. In the spring of 2012, the U.S. House passed the DATA Act, a signature piece of legislation from Representative Darrell Issa (R-CA), by unanimous voice vote. H.R. 2146 would require every United States federal government agency to report its spending data in a standardized way and would establish uniform reporting standards for recipients of federal funds.

Read more…

The dark side of data

In a world of big, open data, "privacy by design" will become even more important.

Map of France in Google Earth by Steven La Roux

A few weeks ago, Tom Slee published “Seeing Like a Geek,” a thoughtful article on the dark side of open data. He starts with the story of a Dalit community in India, whose land was transferred to a group of higher-caste Mudaliars through bureaucratic manipulation under the guise of standardizing and digitizing property records. While this sounds like a good idea, it gave a wealthier, more powerful group a chance to erase older, traditional records that hadn’t been properly codified. One effect of passing laws requiring standardized, digital data is to marginalize all data that can’t be standardized or digitized, and to marginalize the people who don’t control the process of standardization.

That’s a serious problem. It’s sad to see oppression and property theft riding in under the guise of transparency and openness. But the issue isn’t open data itself; it’s how the data is used.

Read more…

Predictive data analytics is saving lives and taxpayer dollars in New York City

Michael Flowers explains why applying data science to regulatory data is necessary to make better use of city resources.

A predictive data analytics team in the Mayor's Office of New York City has been quietly using data science to find patterns in regulatory data that can then be applied to law enforcement, public safety, public health and better allocation of taxpayer resources.

mHealth apps are just the beginning of the disruption in healthcare from open health data

Rockstars from music, government and industry convened around healthcare at the 2012 Health Datapalooza.

Two years ago, the potential of government making health information as useful as weather data may well have felt like an abstraction to many observers. In June 2012, real health apps and services are here, holding the potential to massively disrupt healthcare for the better.

US CTO seeks to scale agile thinking and open data across federal government

Todd Park is looking for Presidential Innovation Fellows to help government work better.

In this interview, U.S. chief technology officer Todd Park lays out his ambitious agenda to apply technology in the public interest. Park has introduced new presidential fellowships and programs to scale open data across the federal government, releasing more health information and making digital government citizen-centric.

Profile of the Data Journalist: The Data News Editor

John Keefe learned data journalism from the online community and applied it to public radio.

John Keefe is a senior editor for data news and journalism technology at WNYC public radio, based in New York City. He attracted widespread attention earlier this year when an online map he built using available data beat the Associated Press in reporting Iowa caucus results.

Profile of the Data Journalist: The API Architect

Jacob Harris is building APIs and data into elections coverage at the New York Times.

To learn more about the people who are redefining the practice of computer-assisted reporting, in some cases building the newsroom stack for the 21st century, Radar conducted a series of email interviews with data journalists during the 2012 NICAR Conference.
