Count the Obama administration as the latest high-profile organization to embrace Big Data.
The White House will announce today its “Big Data Research and Development Initiative,” which spans multiple federal agencies and seeks to exploit Big Data Analytics to improve the efficiency of Medicare and Medicaid, better coordinate disaster response services, and improve terrorist threat detection, among other goals.
The White House will reveal more details in a live webcast this afternoon (Thursday, March 28) at 2 pm ET. You can watch it here.
In a blog post this morning, Tom Kalil, Deputy Director for Policy at the White House Office of Science and Technology Policy, wrote:
To launch the initiative, six Federal departments and agencies will announce more than $200 million in new commitments that, together, promise to greatly improve the tools and techniques needed to access, organize, and glean discoveries from huge volumes of digital data.
Among the projects is a Hadoop deployment to support analytic and reporting requirements from Medicare and Medicaid programs. The goal, as the White House writes in its Big Data Fact Sheet, is “to develop a supportable, sustainable, and scalable design that accommodates accumulated data at the Warehouse level.”
The Obama administration should be lauded for including Big Data as a tool in its arsenal to tackle some of the country’s most pressing challenges. As we state in the Big Data Manifesto, Wikibon believes Big Data’s benefits cross all vertical industries, and that includes government services and defense. President Obama’s high-profile embrace of Big Data should also help promote Hadoop and other new approaches to data management and analytics to a wider audience of traditional US enterprises, including SMBs.
As for the federal Big Data initiative specifically, the White House will need help from industry and the open source Big Data community, a point it readily acknowledges. The federal government, like most enterprises, lacks the internal Big Data expertise needed to successfully exploit Hadoop and other Big Data approaches. It must therefore rely on the training, technical, and professional services of outside vendors and organizations to make its Big Data initiative a success.