SiliconANGLE: Extracting the signal from the noise.

At 6 billion clicks a month, Bitly has a distributed analytics system DevOps plan Thu, 24 Jul 2014 18:14:13 +0000

Marketing folks love click stats, and tiny URLs spread clicks across numerous spaces in social media. Google, Facebook, Twitter and others already have their own shortening services, which give them even more data and make them even smarter repositories of consumer information. That data is useful for Google’s search algorithm, and as Twitter and Facebook get deeper into paid services, it gives them even more to work with.

Bitly, one of the most popular URL shortening services and the best-known independent URL shortener, has some ideas of its own. The service has added several features that let users save, customize and update their shortened links, both public and private. At the recently concluded Bacon conference, Sean O’Connor, Lead Application Developer at Bitly, shared some insights into Bitly’s distributed system and how the company is using Big Data analytics to make money.

Bitly’s distributed analytics system

When Twitter announced the launch of its own URL shortener in June 2010, many people thought that Bitly would close up shop. Instead, the service has innovated by offering its expertise to companies. To make marketing lemonade from all the data flowing in, the company offers a dashboard that lets businesses track their links across the web.

Bitly’s distributed systems process 6 billion clicks a month, shorten 600 million URLs a month and crawl hundreds of millions of web pages. The platform uses HDFS, S3, Nagios, the real-time distributed messaging system NSQ, HostPool and a few different databases.
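For context, the core mechanic of a URL shortener is mapping a database ID to a compact token, and mapping the token back to the ID when a click comes in. A minimal sketch, assuming a generic base-62 scheme (not necessarily Bitly’s actual encoding):

```python
# Toy illustration of how a shortener can derive short tokens:
# base-62 encoding of a sequential numeric ID. Generic sketch only,
# not Bitly's real scheme.
ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"

def encode(n: int) -> str:
    """Convert a numeric ID into a short base-62 token."""
    if n == 0:
        return ALPHABET[0]
    digits = []
    while n:
        n, rem = divmod(n, 62)
        digits.append(ALPHABET[rem])
    return "".join(reversed(digits))

def decode(token: str) -> int:
    """Recover the numeric ID from a token (used on redirect)."""
    n = 0
    for ch in token:
        n = n * 62 + ALPHABET.index(ch)
    return n

print(encode(1234567890))                     # -> '1ly7vk'
print(decode(encode(1234567890)) == 1234567890)  # -> True
```

Six base-62 characters cover roughly 56 billion IDs, which is why shortened links stay so compact even at Bitly’s volumes.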

Distributed systems can often be challenging to build and operate, but they can offer significant benefits in availability, cost effectiveness and capacity. Specifically, O’Connor speaks to the benefits of service-oriented architecture, asynchronous streams, scaling, dealing with failure, and monitoring.

Many services built on asynchronous communication

Bitly has about 50 employees, roughly 20 of whom are engineers. The distributed system runs on 400 servers, including nearly 30 that handle all incoming traffic from the outside world: shortens, redirects, API requests, the web UI and so on.

O’Connor shared that the URL shortener does not rely on a central application. Rather, the different services work independently and communicate asynchronously with one another. For example, service A does not wait for a response from service B; it assumes the message has been received. If service B has problems, the message is queued and processed at a later time.
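The fire-and-forget pattern O’Connor describes can be sketched in a few lines of Python. This is an illustration of the idea only, using an in-process queue rather than Bitly’s NSQ; the service names and message shape are invented:

```python
import queue
import threading

# Hedged sketch of fire-and-forget messaging between services --
# illustrative only (Bitly uses NSQ for the real thing).
outbox = queue.Queue()
processed = []          # stand-in for service B's side effects

def service_b_handle(message):
    """Stand-in for the downstream service's handler."""
    processed.append(message)

def send_async(message):
    """Service A enqueues and returns immediately; it never blocks on B."""
    outbox.put(message)

def delivery_worker():
    """Drains the queue; a failed delivery is simply re-queued."""
    while True:
        message = outbox.get()
        try:
            service_b_handle(message)
        except ConnectionError:
            outbox.put(message)   # B is down: retry later, don't fail A
        finally:
            outbox.task_done()

threading.Thread(target=delivery_worker, daemon=True).start()
send_async({"event": "decode", "url": "http://bit.ly/abc"})
outbox.join()  # a real service A wouldn't wait; done here only to flush the demo
print(processed)
```

The key property is that `send_async` returns instantly whether or not service B is healthy, which is exactly what keeps one slow component from stalling the rest.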

This approach makes handling errors in any one of the participating systems easier, because the failure of a single service is not a major problem for the rest of the system. The actual URL shortening service, however, operates synchronously, for the sake of speed.

When a Bitly URL is decoded, an HTTP redirect fires and the event is relayed to multiple services: an archive of the message is saved to HDFS and S3, a real-time analytics service receives it, and a longer-term historical analytics platform digests the data as well.

The Bitly metrics system likewise operates asynchronously, since a slight delay in the evaluation of analytics data does not constitute a broken leg, according to O’Connor. Data about each decode is popped onto a queue for the analytics and other downstream systems.

In addition, Bitly relies on its self-developed HostPool software. The tool helps manage the distribution of requests across different hosts, with clients reporting failed requests back to it.
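The idea behind HostPool can be sketched as a toy class: the client asks the pool for a host, and reports failures back so traffic shifts elsewhere for a cool-down period. This is a simplified illustration, not Bitly’s actual implementation:

```python
import random
import time

# Toy host pool in the spirit of Bitly's HostPool: clients report
# failed requests, and the pool benches that host for a cool-down
# period. Simplified illustration, not the real library.
class ToyHostPool:
    def __init__(self, hosts, retry_after=30.0):
        self.hosts = list(hosts)
        self.retry_after = retry_after
        self.dead_until = {}   # host -> time when it may be retried
        self.i = 0

    def get(self):
        """Round-robin over hosts, skipping any that are benched."""
        now = time.monotonic()
        for _ in range(len(self.hosts)):
            host = self.hosts[self.i % len(self.hosts)]
            self.i += 1
            if self.dead_until.get(host, 0.0) <= now:
                return host
        return random.choice(self.hosts)  # everything benched: try anyway

    def mark_failed(self, host):
        """Called by the client when a request to `host` fails."""
        self.dead_until[host] = time.monotonic() + self.retry_after

pool = ToyHostPool(["10.0.0.1", "10.0.0.2", "10.0.0.3"])
h = pool.get()
pool.mark_failed(h)       # e.g. the request to h timed out
print(pool.get() != h)    # -> True: traffic shifts to a healthy host
```

Having clients (rather than a central monitor) report failures means the pool reacts at the speed of real traffic, which is the point of the design.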

Lessons learned

O’Connor says the adoption of a service-oriented architecture (SOA) has reduced errors while processing requests. In addition, individual components are isolated, and different processes can run concurrently thanks to the combination of SOA, queues and asynchronous communication.

This asynchronous approach isolates components, speeds up requests and helps keep services focused. It also prevents disruptions from cascading through the message chain: each service can simply slot messages into a queue rather than time out waiting for a response, and can go about its business even if something downstream is offline.

Sean O’Connor’s talk offers some interesting insights into Bitly’s work with distributed systems. The more you understand your distributed system, the better decisions you can make and the more efficiently you can work.

Doubling down on data governance | #MITIQ Thu, 24 Jul 2014 18:00:17 +0000

Data is valuable. As the challenge of protecting customer data mounts, more and more businesses are embracing data-governance strategies to manage the information that serves as the lifeblood of the company. Without a doubt, data has become the raw material of the information economy, and data governance is a strategic imperative.

Speaking to theCUBE at MITCDOIQ 2014 yesterday, Tina Rosario, Vice President of Global Operations at SAP America, said that data governance is becoming increasingly important as data becomes more distributed. “It’s not just data on premise, or on a laptop,” she said. “Nowadays, it’s data on an iPad, or data on some other device. It’s much more distributed and because of that it requires much more governance.”

What is data governance anyway?


Wikibon analyst Jeff Kelly asked Rosario how she would define data governance, and was told that it can be broken down into four key capabilities: good organization, processes, maintenance and technical solutions.

“The first capability is good organization regarding practices around data governance – rules, standards, policies,” said Rosario. “Second, we look at the right processes for simplifying, creating, updating and maintaining data. We also look at data from an ongoing maintenance point of view, such as the right operations and tools to automate maintenance of data because we know it decays.”

You could be forgiven for thinking data governance at SAP sounds like a pretty laborious task, and probably very confusing for non-data people. But that’s not the case, because Rosario’s team tends to avoid data-speak. “We try to break it down and say, what is required by the business, what are the most critical bits of information that you need to run your business processes?” she explained. “Let’s focus on those fields and let’s focus on that critical set of information, and that’s what we’re going to govern.”

Rosario believes it’s important to consider data governance from the perspective of those who use it. She says that no one ever thinks about the data being bad, but people will notice if the information they need isn’t accessible, or if they find they can’t trust it. “We try and think of it from those points of view, and ask what can we do to enable those business processes to run more efficiently?” she said. “How can we get the data to them faster and with the right level of content?”

Tina Rosario, VP of Global Operations at SAP America


The other side of the coin is that it’s important for data governance teams to enjoy a good working relationship with those who are analyzing the data to drive business goals. As she explained to Jeff Kelly, it’s the data governance team’s job to ensure that the company’s data meets the right level of quality and standards, so that it’s accessible to the analytics people and they don’t have to waste time rationalizing whatever they want to do.

“We get requirements from them,” said Rosario. “They might come and say, ‘We’re about to run this report, we need this level of data; can you help make sure we get it from the right source, that it’s at the right level of quality, and that it’s available to us?’ It’s a very symbiotic relationship between us.”

It’s a relationship that works both ways, too, because part of the data governance team’s job is to analyze the level of data quality, and to do so they need the analytics team’s expertise. “We have a good partnership there; they provide us with access to the tools and analytical capabilities we need,” she said.

Complexity in the cloud


Rosario makes data governance sound easy, but at SAP they do face some difficulties. Asked by Dave Vellante what her team’s main priorities were, she said the biggest challenge was governing data that resides in the cloud.

“SAP’s vision is to become the cloud company, and with that vision comes cloud capabilities internally, and so how do we data govern in the cloud?” she asked.

One of the main difficulties is the distributed nature of SAP’s data. The company sits in a hybrid environment, with some data on-premise, some in the cloud, and more and more moving to the cloud. It’s a challenge for her team to work out how to govern data in this situation.

“It’s becoming much more distributed, and some might say that’s much more complex but I see it as an opportunity for us to have more governance and to spread that across all of these various channels,” said Rosario.

Watch the rest of Rosario’s interview on theCUBE at MITCDOIQ 2014 below.

photo credit: cucchiaio via photopin cc
Smart city adoption on the rise, revenue to increase by 9.4% Thu, 24 Jul 2014 17:18:46 +0000

This week’s Smart City roundup features a new report on the state of smart building adoption, the first smart city course offering, and a new Internet of Everything (IoE) innovation center to open in 2016.


Smart building adoption on the rise despite challenges


A report from market research firm Frost & Sullivan reveals that smart building adoption is on the rise as a result of numerous market drivers, such as population strain on aging infrastructure, greater demand for and improved affordability of building-level sensors, and the evolution toward a connected world.

The report, “The Smart Building Systems Market in North America,” states that in 2013 the smart building market boasted revenue of $2.5 billion, a figure expected to rise at a compound annual rate of 9.4 percent to as much as $4.39 billion by 2018.

In the report, the firm stated that those who engaged early in smart building adoption will benefit greatly as more things in the physical world connect to the Internet. But widespread adoption is still hindered by a lack of understanding among building owners and managers as to how smart buildings can directly benefit them, as well as by the high cost of purchasing building automation systems.

“A concerted educational effort will be vital to effectively communicate the value proposition of smart building solutions and provide prospective customers with the tools to perform a thorough cost-benefit analysis of competing systems,” said Frost & Sullivan Visionary Innovation Consultant Pramod Dibble. “These tools must be developed by an expert third-party organization with demonstrated excellence in economic modelling and exhaustive industry knowledge.”

India will be the first to offer smart city classes


Narendra Modi, the Prime Minister of India, has an ambitious plan to build 100 smart cities throughout the country as it aims to transform its urban areas. CEPT University, one of India’s architecture, planning and design institutes, will start offering a smart city course as an elective for master’s degree students in architecture, planning, technology and management.

“In view of recent government announcement of ’100 Smart Cities’ in the union budget and allocation of Rs 7,600 crore there is a great excitement amongst the states and cities all over the country to take advantage. However, the general understanding of ‘what is a Smart City?’ is not very known, nor there exists any trained manpower or expertise who could guide and handhold such development or at least make a road map,” said Saswat Bandyopadyay, faculty for the smart city course.

The four-month course will start on July 25, and interest has been high: the course seats 40 students, but more than 120 signed up for the smart city class. In the long run, CEPT aims to make the course available to a larger audience, particularly government officials involved in city planning.

Cisco to open IoE innovation center in 2016


Cisco announced that it will open an Internet of Everything (IoE) innovation center in Barcelona, Spain by 2016, which will focus heavily on smart city technologies.

Cisco has partnered with the Barcelona City Council to build the 1,720-square-meter IoE innovation center in Ca l’Alier, a 19th century heritage building at the core of the 22@Barcelona district, as part of the new Smart City Campus. The innovation center will provide a platform for research, technological development and new market opportunities related to smart city technologies.

The company plans to invest $30 million between 2015 and 2020 in the facility’s restoration, an innovation lab and IT equipment, as well as in acquiring engineering, application development and research talent.

Aside from Barcelona, Cisco will also open IoE innovation centers in Brazil, South Korea, Canada and Germany.

photo credit: medically_irrelevant via photopin cc
IBM Social CrowdChat on #BigData #TEDatIBM Thu, 24 Jul 2014 17:05:22 +0000 IBM is hosting a CrowdChat on #TEDatIBM, a crowdsourced conversation about Big Data.


VMware expands European cloud footprint with second UK data center Thu, 24 Jul 2014 16:26:56 +0000

While some competitors opt to announce their cloud expansions 15 data centers at a time, VMware Inc. is taking a much more reserved approach to scaling its infrastructure footprint, sticking to one facility per press release. The frequency of the additions has been steadily accelerating over the last few months, however, against the backdrop of the virtualization giant intensifying its efforts to move up the stack with value-added services.

The latest location where VMware has chosen to set up shop is the London suburb of Chessington, which is situated approximately 19 miles to the west of its UK headquarters in the town of Camberley.  The site just so happens to be a short drive from where rival IBM started constructing its newest cloud data center three weeks ago as part of a $1.2 billion initiative to make the services provided by hosting subsidiary SoftLayer more accessible to international clientele.

In step with previous such announcements, VMware didn’t provide any details about the Chessington facility except that the expansion is meant to support growing demand for its vCloud Hybrid Service, or vCHS for short. The infrastructure-as-a-service platform was introduced last September during the company’s annual customer conference in response to the fast-growing percentage of on-premise applications being moved to public clouds operated by competitors such as Amazon Inc. and Microsoft Corp.

Recognizing that it can’t differentiate on pricing or functionality fast enough to make the necessary investment worthwhile, VMware built vCHS to exploit the one substantial competitive advantage at its disposal: a dominant share of the virtualization market. The platform offers full interoperability with existing deployments of its widely used hypervisor right out of the box, according to the company, thereby eliminating the need for customers to worry about compatibility issues between their on- and off-premise environments. Ensuring a consistent service level across the two is an entirely different matter, but VMware claims to have that covered as well.

Since launch, the EMC subsidiary has taken major steps to further augment the platform, adding a competitively-priced virtual desktop service in March and following that up a month later with a managed disaster recovery offering that targets one of the last remaining holes in Amazon’s cloud portfolio. In fact, Microsoft is the only one of the top providers to have even come close to matching the functionality of the solution, and its response is still very much in the works.

Laying claim to international territories


The announcement of the new Chessington data center comes nine months after vCHS first arrived in Europe with the opening of a VMware facility in Slough, to the northwest. And it follows hot on the heels of VMware expanding to Japan through an alliance with the IT reseller arm of long-time partner SoftBank Telecom Corp.

The purpose of establishing a physical presence in key markets is twofold. One goal is to reduce latency, which the head of Morgan Stanley’s data center operations has named the main barrier to hybrid cloud adoption; the other is regulatory compliance. A sizable portion of the data managed by enterprises and government agencies is not allowed to leave its home jurisdiction by law, after all, which means that an organization in that category won’t bite unless it is given the option to keep information local. VMware is addressing both requirements while aggressively building out its service portfolio in a two-pronged push for hybrid cloud dominance.

Photo credit: Stuck in Customs via photopin cc
4 years later: OpenStack milestones and future challenges Thu, 24 Jul 2014 15:21:04 +0000

It has been four years since OpenStack was officially launched. As the cooperative celebrates, we take a look at the major milestones this open source initiative has accomplished, and what the future holds.


In a blog post, Paul Voccio, Rackspace’s Senior Director of Product Software Engineering, shared some of the highlights OpenStack has had over the past four years.

As of May 2014, OpenStack boasts 16,266 individual members in 139 countries from 355 organizations, with 2,130 total contributors, 466 average monthly contributors and 17,209 patches merged. Compared to last year, OpenStack’s individual membership has almost doubled, while the number of contributors has more than doubled.

Enterprise interest in OpenStack has also grown, as indicated by the program’s Atlanta Summit earlier this year, packed with more than 4,500 attendees from 55 countries.

Voccio stated that one of OpenStack’s goals in the past year was to “close the feedback loop between operators and developers,” as “operators can tell us what works and what works at scale.” To foster this, OpenStack earlier this year launched DefCore, a set of standards and tests that will help the community understand which projects are stable, widely used and key to interoperability.

“I’m as optimistic about OpenStack’s future as I am humbled and inspired by its growth. It’s truly a project that we – the community – have taken from a handful of lines of code to a production-ready cloud operating system that world-beating enterprises use and trust,” Voccio stated.

He expects OpenStack’s fifth year to be a big one, but before that, let’s take a look back at the many developments of the past four years, as well as the expectations that the cooperative should address in the next year and beyond.

OpenStack Milestones


2010

July 19 – OpenStack launched with source code from NASA and Rackspace and support from 25 participating organizations

August 30 – OpenStack launched iPad app based on Rackspace Cloud Pro

October 21 – OpenStack Austin Code was released

October 22 – Microsoft joined OpenStack community and adds Hyper-V support to OpenStack


2011

January 18 – Internap launched Storage Service using OpenStack

February 3 – OpenStack Bexar Code was released; Citrix adds support for VMware hypervisor; Cisco joins OpenStack community

March 8 – Rackspace announced OpenStack services with Rackspace Cloud Builders

March 31 – Rackspace, Dell and Equinix launched the OpenStack Demo Environment

April 7 – Facebook launched Open Compute

April 12 – VMware launched Open Source PaaS Cloud Foundry

April 15 – OpenStack Cactus Code was released

May 10 – Canonical announced support for OpenStack in Ubuntu distribution

May 25 – Citrix announced ‘Project Olympus’ OpenStack distribution

July 14 – SecureStore uses OpenStack for Cloud Storage Service

September 29 – OpenStack Diablo Code was released


2012

April 5 – OpenStack Essex Code was released, adds 150 features including Compute, Object Storage, Dashboard, Identity, and Image Service.

September – OpenStack Foundation was officially launched

September 27 – OpenStack Folsom Code was released, adds 185 new features across compute, storage and networking.


2013

April 4 – OpenStack Grizzly Code was released, adds nearly 230 new features across compute, storage, networking and shared services in the cloud platform.

October 17 – OpenStack Havana Code was released, adds 400 plus new features across compute, storage, networking and cross-platform services.


2014

April 17 – OpenStack Icehouse Code was released, with over 350 new features added, including the OpenStack Database Service, which was incubated during the Havana release cycle

May 12 – OpenStack Marketplace launched with initial categories including public cloud services; distributions and appliances; consulting and system integration; and drivers, with offerings from the likes of HP, IBM, Red Hat, Nebula and Mirantis

June – Rackspace unveiled OnMetal OpenStack-based cloud servers

Obstacles and expectations


OpenStack may be looking at a bright future ahead, but there are still things it needs to address in order to secure such a promising future.

theCUBE, SiliconANGLE’s roving broadcast studio, was present at the OpenStack Summit in Atlanta last May, where hosts John Furrier and Stu Miniman got to chat with OpenStack customers and contributors. The hosts noticed that the event focused on OpenStack’s momentum, such as the increasing number of contributors and vendors getting involved with the open source ecosystem, but no big announcements or launches were made.

Furrier noted in a Day 2 Wrap-up segment that OpenStack is still facing obstacles and  could “go off the rails,” if tugged in too many different directions: “weird agendas, getting forked,” said the host.

“What are these big guys going to do? Chip away at the momentum to pull it back onto their terms?” Furrier added.

  • Does OpenStack need containers to succeed? 

Other questions that surfaced during the event were whether containers are needed for OpenStack to gain traction, and whether Rackspace was trying to put a leash on OpenStack.

During the summit, one CrowdChat thread touched on the importance of OpenStack focusing on making containers first-class citizens of the ecosystem, which got a response from theCUBE alumnus Rich Miller who said, “I would make a strong statement. Without acknowledgment and real leadership in embracing containers, OpenStack will fail to gain traction.”

Diane Mueller, OpenShift Community Manager at Red Hat, expects that the community itself will make containers a first-class citizen.

  • Rackspace control

As mentioned earlier, OpenStack was launched using code from both Rackspace and NASA. Despite the project being open source, some have accused Rackspace of controlling OpenStack too tightly, criticism that some believe explains why Rackspace’s presence in the community has decreased.

Asked in an interview to comment on these accusations, Rackspace CTO John Engates responded that the company is not walking away from OpenStack, adding that “this wouldn’t be a strong community today if we had tried to smother it and hold it for ourselves.”

See the entire Day 2 wrap segment from this year’s OpenStack Summit below:

photo credit: opensourceway via photopin cc
Tucci expected to fight activist EMC investor pushing for VMware spin-off Thu, 24 Jul 2014 14:16:13 +0000

A hedge fund with more than $1 billion invested in EMC plans to convince the company to spin off its subsidiary VMware, the Wall Street Journal reported Monday. Elliott Management Corp. is EMC’s fifth-largest shareholder, owning approximately two percent of the company, and the people running the hedge fund believe EMC’s stock value would increase significantly if it spun off VMware.

EMC’s core business has been, and continues to be, storage, while VMware is known for virtualization. Elliott’s analysts believe EMC can make substantial gains by spinning off VMware and returning its focus to core server and storage products and services.

Companies like Elliott are often called “activist investors,” as they invest primarily to effect major change within a corporation. EMC owns 80 percent of VMware and considers it a major strategic asset, part of a federation strategy that includes its storage business, its virtualization arm (VMware) and its cloud application developer (Pivotal).

Wikibon CEO Dave Vellante commented via Twitter, “Elliott Management move to strong arm EMC to relinquish its control of VMware – Tucci will fight but he’s retiring in 6 mos -new era coming?”

SiliconANGLE CEO John Furrier added, “@dvellante It will be interesting to see how EMC handles this they are not push overs for self serving activists.”

Some analysts believe EMC will attempt to appease Elliott by spinning off some VMware shares and then buying them back. VMware revealed its second-quarter earnings Tuesday, while EMC is scheduled to report Wednesday before markets open.

A veteran’s new best friend: What is IBM’s Watson Engagement Advisor? Thu, 24 Jul 2014 13:22:16 +0000

IBM’s Watson is no longer just the world’s most famous Jeopardy! game-playing computer. IBM announced that it has put Watson to work as the world’s most intelligent digital assistant in the first consumer-facing app for veterans making the transition from military to civilian life.

IBM and the United Services Automobile Association (USAA), a financial services provider for the military community, today announced they have teamed up to offer IBM’s Watson Engagement Advisor in a pilot program to assist USAA members. USAA provides insurance, banking, investments, retirement products and advice to 10.4 million current and former members of the U.S. military and their families.

Named after IBM founder Thomas J. Watson, IBM Watson uses natural language processing and analytics, and can process information in a way similar to how people think. This helps organizations quickly analyze, understand and respond to vast amounts of Big Data. IBM’s Watson Engagement Advisor analyzed USAA’s business data and now understands more than 3,000 documents on topics exclusive to military transitions.

During the initial phase of the pilot program, veterans can log into USAA’s website or use a mobile browser to ask Watson questions specific to leaving the military including “Can I be in the reserve and collect veterans compensation benefits?” and “How do I make the most of the Post-9/11 GI Bill?”

This is the latest in a series of moves that IBM has made to commercialize Watson. Since IBM unveiled the IBM Watson Developers Cloud platform in November 2013, more than 2,500 businesses have applied, seeking to build cognitive apps. In January 2014, IBM invested $1 billion into the new New York City-based Watson Business Group to advance cognitive computing in the marketplace.


Watch this Cube Conversation in which Wikibon analysts Stu Miniman and Jeff Kelly discussed IBM’s new Watson Business Group:

Photo courtesy of IBM and USAA
Video courtesy of theCUBE
Study: People are “more honest” when chatting to a robot Thu, 24 Jul 2014 12:35:44 +0000

A study has shown that chat-bots might be a better alternative to filling in questionnaires when it comes to screening applicants for security clearance.

The research was carried out at the National Center for Credibility Assessment along with military IT contractor ManTech International, and found that people were more honest when chatting with a robot than when writing their replies.

ManTech is a mega-corporation that does everything from providing IT and software to intelligence agencies, to multimillion-dollar maintenance jobs for the US Navy. Meanwhile, the National Center for Credibility Assessment’s reason for existence is to “provide graduate and continuing education courses in psychophysiological detection of deception (PDD)”.

The study saw 120 US Army trainees fill out traditional pen-and-paper questionnaires about their lifestyles before sitting down in front of an automated chat-bot. On the whole, when seated in front of a computer-generated avatar, the trainees were a lot more forthcoming about sensitive topics like drug use, alcohol abuse and psychological problems they might have suffered. The results suggest that chat-bots might be a better alternative to bog-standard questionnaires, though we should point out that ManTech is just the kind of company that would like to build such software.

“Automating this process using a [computer graphics] interview format could save time, and allow agencies to utilize their human interviewers more effectively,” the researchers wrote.

Interestingly, the study, which was published in the scientific journal Computers in Human Behavior, didn’t involve any sophisticated AI. The software relied on a simple scripted speech format that asked follow-up questions based on the answers it received. The avatar’s face was ethnically ambiguous and didn’t display any emotion, though almost a quarter of those interviewed claimed they saw an emotional response.
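A scripted interviewer of that kind needs little more than a lookup table of questions and follow-up branches. The sketch below is illustrative only; the questions and branching are invented, not taken from the study:

```python
# Sketch of a scripted interviewer that branches into follow-up
# questions based on answers, in the spirit of the study's chat-bot.
# Questions and script structure are invented for illustration.
SCRIPT = {
    "start": {
        "question": "Have you used alcohol in the past year?",
        "follow_ups": {"yes": "alcohol_detail"},
        "next": "end",
    },
    "alcohol_detail": {
        "question": "Roughly how often?",
        "follow_ups": {},
        "next": "end",
    },
}

def interview(answers):
    """Walk the script, recording (question, answer) pairs and taking
    a follow-up branch whenever an answer triggers one."""
    transcript = []
    node = "start"
    for answer in answers:
        if node == "end":
            break
        step = SCRIPT[node]
        transcript.append((step["question"], answer))
        node = step["follow_ups"].get(answer.lower(), step["next"])
    return transcript

print(interview(["yes", "twice a week"]))
# A "no" answer skips straight past the follow-up question:
print(interview(["no"]))
```

No language understanding is involved: the program only matches answers against the script’s keys, which is consistent with the researchers’ point that the effect came from the avatar format rather than from AI.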

As well as being more open in front of the chat-bot, most interviewees said they felt more comfortable with the chat-bot than filling out an old-fashioned questionnaire on a sheet of paper.

However, the researchers noted that chat-bots are still some way off being able to replace humans for screening interviews. But if they can replace questionnaires, it might help us save a few trees at least :-)

photo credit: Alisa Perne – via photopin cc
Your guide to international Big Data universities: IBM edition Thu, 24 Jul 2014 12:00:20 +0000

The Big Data market is expected to grow to $28.5 billion by the end of 2014 and to top $50 billion by 2017, according to a recent Wikibon report. With the growing market come millions of new Big Data and analytics jobs being created across the globe. But the number of new jobs far outweighs the number of scientists and engineers who have the education to fill them.

“By 2015, 4.4 million IT jobs globally will be created to support Big Data,” said Peter Sondergaard, Senior Vice President and Global Head of Research at Gartner, in a statement. “But there is a challenge. There is not enough talent in the industry.”

This is good news for those interested in becoming data scientists, as there are a large number of universities that IBM and other industry vendors are now partnering with to develop undergraduate and graduate degree programs designed to prepare students for Big Data and analytics careers. Here, we will take a look at some of the universities partnering with IBM to offer degrees for aspiring data scientists.


Athens University of Economics and Business


To help narrow the data scientist skills gap, IBM is partnering with more than 1,000 universities across the globe to develop curriculum to prepare students for Big Data and analytics careers. “We’re…working with…universities globally to actually put together a curriculum—both in the business school as well as in the technical schools—for certifications and advanced…Masters classes around various data type jobs,” said Inhi Cho Suh, Vice President and General Manager of Big Data, Integration & Governance on theCUBE to co-hosts John Furrier and Jeff Kelly at Hadoop Summit 2014.

IBM’s latest educational initiative in Europe came in April 2014 when IBM announced that it is collaborating with the Athens University of Economics and Business (AUEB) in Greece to create the first national Postgraduate Degree in Business Analytics, expected to launch in September 2014.


University of Piraeus in Greece


IBM also announced in April 2014 that it will partner with the University of Piraeus in Greece on the development and design of hands-on practical sessions in Business Intelligence and Business Analytics modules of MBA programs.


Ohio State University


The worldwide shortage of professionals trained in data analysis and critical thinking is occurring at a pivotal moment in history, according to Christine A. Poon, dean of Ohio State University’s Fisher College of Business. “While leaders in all industries have the data at their fingertips,” she wrote in a blog post, “they lack the highly skilled workforce to connect the dots and advance their businesses and organizations to new heights.”

To help with the shortage problem in the United States, IBM announced a collaboration between Ohio State University and the IBM Client Center for Advanced Analytics in Columbus, Ohio, to develop new curricula at the undergraduate, graduate and executive education levels to help students (and mid-career professionals) gain the latest skills in analytics. “Our strong collaboration with IBM will help our students across a variety of majors gain the latest skills in this burgeoning Big Data discipline and set them on a path to secure the high skilled jobs of the future,” said Poon in a statement.

In February 2014, Ohio State University announced the details of its new undergraduate major in data analytics. The new major is structured in three parts: core subject matter (mathematical, statistical, and computing foundations), discipline-specific specializations (visual analytics and sense-making, system modeling, pattern recognition, and machine learning), and what the school calls “an integrative experiential education” component.


More U.S.-based Big Data programs


In addition to partnering with Ohio State University, IBM announced in May 2014 that it is also now partnering with Boston University, Case Western Reserve University, Johns Hopkins University and the University of Missouri to offer Big Data and Analytics curricula. Boston University’s Metropolitan College is offering a Master of Science degree in Computer Information Systems with a concentration in Database Management & Business Intelligence. Case Western Reserve University is launching a new undergraduate program in data science and analytics in the Fall 2014 semester; the program includes a major and a minor in applied data science and, eventually, a post-baccalaureate certificate program.

The Johns Hopkins University’s DC-based Center for Advanced Governmental Studies is offering a Master of Science in Government Analytics and a Certificate in Government Analytics to “provide students with the needed skills to address contemporary political, policy and governance challenges.” And the University of Missouri is developing an interdisciplinary Master of Science in Data Science and Analytics degree, providing students with access to IBM’s Open Cloud Architecture to “have a comprehensive skill set in building, deploying, and managing cloud resources to analyze big data in journalism, engineering, informatics, and learning analytics.”

Other universities in the United States with which IBM is partnering to develop Big Data courses include: Arizona State University, Babson College, Dakota State University, Illinois Institute of Technology, Illinois State University, Indiana University, Iowa State University, Northwestern University, Rensselaer Polytechnic Institute, San Jose State University, Southern Methodist University, University of Arkansas at Little Rock, University of Arkansas at Fayetteville, University of Denver, University of Colorado at Boulder, University of Maryland in College Park, University of Massachusetts in Boston, University of North Carolina at Charlotte, University of Southern California, University of Texas at Austin, University of Tennessee at Chattanooga, University of Tennessee at Knoxville, University of Virginia and Worcester Polytechnic Institute.


Over 30 universities throughout China


Earlier this month, IBM announced a major collaboration with China’s education ecosystem focused on “addressing the Big Data and Analytics skills opportunity” in China. As part of the collaboration, IBM will initially help launch undergraduate and graduate programs in 30 universities to help prepare students for Data Scientist and Chief Data Officer jobs. “Big Data is big business, but its rapid growth has outpaced colleges’ and universities’ ability to develop and implement new curriculums,” said Li Shu Chong, President of CCID Consulting, in a statement. “IBM’s extensive initiative is poised to help develop new talent in China that will be needed to realize the full potential of Big Data.”

The seven pilot schools that will roll out new Big Data and Analytics programs this Fall include the Beijing Institute of Technology, Fudan University, Guizhou University, Huazhong University of Science and Technology, Peking University, South China University of Technology and Xi’an Jiaotong University.


Watch the Hadoop Summit 2014 interview between theCUBE co-hosts John Furrier and Jeff Kelly and IBM’s Inhi Cho Suh:


Photo credit: Herkie via photopin cc
Photo credit: dcJohn via photopin cc
Photo credit: marsmet547 via photopin cc
Video courtesy of theCUBE