Services A Must For White House Big Data Initiative to Succeed

Count the Obama administration as the latest high-profile organization to embrace Big Data.

The White House will announce today its “Big Data Research and Development Initiative,” which spans multiple federal agencies and seeks to exploit Big Data Analytics to improve the efficiency of Medicare and Medicaid, better coordinate disaster response services, and improve terrorist threat detection, among other goals.

The White House will reveal more details in a live webcast this afternoon (Thursday, March 28) at 2 pm ET. You can watch it here.

In a blog post this morning, Tom Kalil, Deputy Director for Policy at the White House Office of Science and Technology Policy, wrote:

To launch the initiative, six Federal departments and agencies will announce more than $200 million in new commitments that, together, promise to greatly improve the tools and techniques needed to access, organize, and glean discoveries from huge volumes of digital data.

Among the projects is a Hadoop deployment to support analytic and reporting requirements from Medicare and Medicaid programs. The goal, as the White House writes in its Big Data Fact Sheet, is “to develop a supportable, sustainable, and scalable design that accommodates accumulated data at the Warehouse level.”
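For readers new to Hadoop, the core idea behind this kind of analytic workload is MapReduce: a mapper emits key-value pairs from raw records, and a reducer aggregates values per key after the framework sorts and groups them. As a rough illustration only (the fact sheet gives no schema or code, so the record format and field names below are entirely hypothetical), a Hadoop Streaming-style job summing claim amounts by state might look like this:

```python
#!/usr/bin/env python3
# Minimal MapReduce-style sketch in the Hadoop Streaming idiom.
# Hypothetical input: tab-separated records of "claim_id<TAB>state<TAB>amount".
from itertools import groupby

def mapper(lines):
    """Emit (state, amount) pairs, one per well-formed claim record."""
    for line in lines:
        parts = line.rstrip("\n").split("\t")
        if len(parts) == 3:
            _, state, amount = parts
            yield state, float(amount)

def reducer(pairs):
    """Sum amounts per state. Hadoop's shuffle phase delivers pairs
    sorted by key, which sorted() simulates here."""
    for state, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield state, sum(amount for _, amount in group)

if __name__ == "__main__":
    sample = [
        "c1\tMA\t120.50",
        "c2\tRI\t75.00",
        "c3\tMA\t30.25",
    ]
    for state, total in reducer(mapper(sample)):
        print(f"{state}\t{total:.2f}")
```

In a real deployment the mapper and reducer would run as separate processes across a cluster, reading from stdin and writing to stdout, with Hadoop handling the distribution, sorting, and fault tolerance in between.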


The Obama administration should be lauded for including Big Data as a tool in its arsenal to tackle some of the country’s most pressing challenges. As we state in the Big Data Manifesto, Wikibon believes Big Data’s benefits cross all vertical industries, and that includes government services and defense. President Obama’s high-profile embrace of Big Data should also help promote Hadoop and other new approaches to data management and analytics to a wider audience of traditional US enterprises, including SMBs.

As for the federal Big Data initiative specifically, the White House will need help from the industry and the open source Big Data community, a point it readily acknowledges. The federal government, like most enterprises, lacks the internal Big Data expertise needed to successfully exploit Hadoop and other Big Data approaches, meaning it must rely on the training, technical and professional services of outside vendors and organizations to make its Big Data initiative a success.


Jeffrey Kelly

As Wikibon’s lead Big Data analyst, Jeff Kelly applies a critical eye to trends and developments in the Big Data and business analytics markets, with a strong focus on helping practitioners deliver business value. Jeff’s research includes market analysis, emerging technologies, enterprise Big Data case studies, and more. He also appears frequently on theCUBE to share his insights. Prior to joining Wikibon, Jeff spent seven years as a writer and editor at TechTarget, where he covered a number of business and IT topics including IT services, mobile computing, data management, and business intelligence. He holds a BA from Providence College and an MA from Northeastern University.

