The Huffington Post recently released the HuffPost Pollster API, which opens up all of the polling data on the site. Software developers can use the API to access information about the public polls published on The Huffington Post. This should be of particular interest right now: with the presidential race heating up, people are keen to follow political polls and understand the current opinions of the electorate. The initial release of the Pollster API includes data from more than 215,000 responses to questions on a broad range of subjects, drawn from 13,000 polls that HuffPost has organized by subject and geography into more than 200 charts. Here’s what HuffPost has to say about the poll API mechanism:
“Since being able to understand the methodology behind opinion surveys is an important step toward increasing transparency in the opinion polling industry, we’re including information about the methodology for each poll. And to make the data independently verifiable, we’ve included a link to the original source that conducted or reported the poll along with each entry. We currently calculate these by running a locally weighted polynomial regression on every poll for a specific category of question. But we’re continually improving our methodology for combining the information from different opinion polls into a single estimate.”
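The locally weighted polynomial regression HuffPost describes can be sketched in a few lines. The following is a minimal, illustrative local-linear (LOESS-style) fit with a tricube kernel; the bandwidth, the weighting scheme, and the poll numbers here are assumptions for demonstration, not HuffPost's actual parameters or data:

```python
import numpy as np

def loess_estimate(days, support, day0, bandwidth=30.0):
    """Estimate trend support at day0 via locally weighted linear regression.

    days: poll field dates as day offsets; support: reported support (%).
    Polls closer to day0 get more weight, via a tricube kernel.
    """
    d = np.abs(days - day0) / bandwidth
    w = np.where(d < 1.0, (1.0 - d**3) ** 3, 0.0)  # tricube weights
    X = np.column_stack([np.ones_like(days), days])  # intercept + slope
    WX = X * w[:, None]                               # row-weighted design
    # Weighted least squares: solve (X^T W X) beta = X^T W y
    beta = np.linalg.solve(X.T @ WX, WX.T @ support)
    return beta[0] + beta[1] * day0

# Made-up poll readings drifting upward over three weeks
days = np.arange(0.0, 21.0)
support = 44.0 + 0.2 * days
print(round(loess_estimate(days, support, 10.0), 2))  # → 46.0
```

Because each estimate refits only on nearby polls, the resulting trend line tracks shifts in opinion without letting a single outlier poll dominate the chart.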
The release of the Pollster API is ultimately about data transparency: pulling the data together so that anyone can get a better idea of what the public thinks (by asking them). It also means that other writers, researchers, and data-visualization producers can generate graphs and charts from Huffington Post data.
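Pulling chart data out of the API is a simple HTTP request. The sketch below mirrors the charts endpoint HuffPost documented at launch (`elections.huffingtonpost.com/pollster/api`), but the path and parameter names may have changed since, so treat it as illustrative rather than as reference documentation:

```python
from urllib.parse import urlencode

# Assumption: endpoint path and parameter names as documented at launch.
BASE = "http://elections.huffingtonpost.com/pollster/api"

def charts_url(topic=None, state=None):
    """Build a query URL for the charts listing (one chart per question/geography)."""
    params = {k: v for k, v in (("topic", topic), ("state", state)) if v}
    return BASE + "/charts.json" + ("?" + urlencode(params) if params else "")

print(charts_url(topic="obama-job-approval"))
# http://elections.huffingtonpost.com/pollster/api/charts.json?topic=obama-job-approval
```

Fetching that URL would return JSON listing each chart's title, poll count, and trend estimates, which is what makes third-party graphs and visualizations straightforward to build.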
This ties in nicely with Big Data visualization, since the Pollster API will draw on large poll datasets to visualize and display data for public use. Web data visualization is one of the significant benefits of Big Data, powering infographics, graphs, charts, and other data visuals. Since public institutions generate a great deal of data, and generally run on statistics, it is genuinely useful for them to release that data not just to the media but in a form the public can digest directly. One of the best recent examples is the Google Public Data Explorer, Google’s cloud-based web data visualization interface for the everyman. It currently supports 27 data sets and more than 300 metrics, from labor productivity and Internet speed to gender balance in parliaments, government debt levels, and population density by municipality. Google opened the tool up back in February, with the following statement:
“Today, we’re opening the Public Data Explorer to your data. We’re making a new data format, the Dataset Publishing Language (DSPL), openly available, and providing an interface for anyone to upload their datasets. DSPL is an XML-based format designed from the ground up to support rich, interactive visualizations like those in the Public Data Explorer.”
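For context, a DSPL dataset is a bundle centered on an XML metadata file that describes the dataset and points at the CSV tables containing the actual values. A heavily trimmed, illustrative skeleton follows; the top-level element names track the published DSPL schema, but the dataset itself is invented for this article:

```xml
<dspl xmlns="http://schemas.google.com/dspl/2010">
  <info>
    <name><value>Illustrative unemployment dataset</value></name>
    <description><value>A made-up example, not a real dataset.</value></description>
  </info>
  <provider>
    <name><value>Example Provider</value></name>
  </provider>
  <!-- A full file would go on to define concepts (metrics and dimensions),
       slices binding concepts together, and tables mapping slices to the
       CSV files shipped alongside this XML. -->
</dspl>
```

Packaging the metadata this way is what lets the Public Data Explorer render interactive charts from anyone's upload without custom integration work.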
While all of this speaks to the progress of visualization and Big Data, on the flip side we heard about US government big data democratization programs getting their budgets axed by almost 75%. The fund in question supports sites like USASpending.gov and Data.gov. The Sunlight Foundation, a government transparency watchdog organization, also made a plea against the cuts:
“Some of the most important technology programs that keep Washington accountable are in danger of being eliminated. Data.gov, USASpending.gov, the IT Dashboard and other federal data transparency and government accountability programs are facing a massive budget cut, despite only being a tiny fraction of the national budget. Help save the data and make sure that Congress doesn’t leave the American people in the dark.”
This shift away from data transparency on the government’s part will definitely alter the way the public can understand the inner workings of the State. Open data fuels the engine of discovery for citizen journalism and public self-education about government spending and statistics, and it gives ordinary citizens, thinkers, and bloggers access to information that traditional newspapers would spend thousands of dollars to obtain. Without it, services such as Google’s Public Data Explorer visualization tool and IBM’s City Forward tool would be much less useful.
— Isha Suri