UPDATED 19:07 EDT / SEPTEMBER 13 2011


SAP’s Balancing Act

LAS VEGAS – SAP is at an inflection point not unlike the one it found itself at 25 years ago. Back then, SAP bet everything on R/3. This time SAP is betting the house on an in-memory data processing engine called HANA.

But company executives know they can’t afford to alienate customers running its legacy software and systems, which, after all, bring in the vast majority of SAP’s revenue. It’s quite a balancing act, but SAP has a plan to pull it off.

Speaking at SAP TechEd this week, Vishal Sikka, SAP's executive board member for technology and innovation, put it this way: “We can renew our applications without disrupting them and in parallel build new applications, amazing new applications that were never possible before.”

Underpinning the effort is HANA, SAP’s in-memory database engine. When it debuted last year, HANA stood for High-performance Analytic Appliance, but I’ve seen no mention of that term at TechEd this week. That’s probably because SAP is now promoting HANA as a single platform to support both analytical and transactional workloads. (Let the comparisons with Exadata begin.)

SAP plans to gradually slide all of its legacy applications onto HANA. The benefits for users, according to SAP, will be the ability to load, process and analyze significantly larger volumes of data than is currently possible with SAP systems like Business Warehouse, and to do so in near-real time. That means accounting departments can close their books in hours, not days. Utilities can measure full sets of energy consumption data rather than aggregate samples. And retailers can process point-of-sale data while the consumer is still at the checkout counter.

The trick for SAP is to slide HANA under existing SAP deployments with minimal disruption for customers. This means reconciling the numerous metadata models used by various SAP systems with HANA’s single data model for both analytic and transactional processing. Developers will also need to adjust to pushing the application logic down into the database, a significant shift from SAP’s traditional approach. According to company executives here, SAP is working on mechanisms to make the transition as seamless as possible.
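To make that shift concrete, here is a minimal sketch of what "pushing logic down into the database" means in practice. SQLite's in-memory mode stands in for an in-memory engine like HANA, and the table and figures are made up for illustration; the point is only the contrast between aggregating in application code and letting the database return the answer directly.

```python
import sqlite3

# An in-memory database stands in for an engine like HANA (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (store TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 120.0), ("north", 80.0), ("south", 200.0)])

# Traditional pattern: pull every row into the application and aggregate there.
rows = conn.execute("SELECT store, amount FROM sales").fetchall()
app_side = {}
for store, amount in rows:
    app_side[store] = app_side.get(store, 0.0) + amount

# Pushed-down pattern: the database computes the aggregate and ships back
# only the small result set, not the raw rows.
db_side = dict(conn.execute(
    "SELECT store, SUM(amount) FROM sales GROUP BY store"))

assert app_side == db_side  # same answer, far less data moved
```

With three rows the difference is invisible; with the tens of terabytes Eacrett describes, moving only the aggregated result out of the database is the whole game.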

SAP’s first big test will come in a month and a half, when SAP begins migrating SAP Business Warehouse customers onto the HANA platform. There are currently 16,000 BW deployments worldwide, and SAP hopes to transition all of them to HANA eventually. Transitioning BW deployments to HANA is a relatively easy sell for SAP, Ken Tsai of SAP’s global marketing team told me, as customers have been clamoring for improvements to BW’s “unwieldy” nature for quite a while.

Big Data By Any Other Name

It’s no stretch to say that HANA is the foundation of SAP’s entire product roadmap and, in my view, it’s the company’s answer to the Era of Big Data, though you won’t hear SAP executives use that phrase much.

Following a panel discussion, I spoke with Michael Eacrett, SAP’s Chief Portfolio Architect for Data and Analytic Engines. He said many SAP customers are dealing with Big Data challenges, but not of the well-publicized type confronted by Google, Facebook and LinkedIn. Rather than petabytes of unstructured data from around the web, SAP customers are struggling to harness years’ worth of mostly structured data housed in SAP systems. The data volumes involved are indeed big – tens of terabytes in some cases – but not Google big.

SAP customers want to build applications that improve the speed and efficiency of business processes by accessing complete data sets in real time and incorporating third-party data, from the web or elsewhere, when needed. That’s what HANA allows them to do, Eacrett said.

For Tsai, HANA is not just about improving query speed, but about improving the speed of business processes. Put another way, for SAP, Big Data does not equal Hadoop. SAP is taking a “very practical approach to Big Data,” according to Tsai, and the company has been very deliberate in identifying HANA use cases that will bring the most value to SAP customers in the here-and-now. He thinks Hadoop has significant potential, but at the moment SAP has not been able to identify use cases for Hadoop at Fortune 100 companies that are not giant web or media companies.

Big Data’s Many Parts

SAP is correct that the Era of Big Data presents a number of challenges beyond simply huge data volumes. Organizations want to do real-time analytics and other functions on relatively large data sets that batch-oriented Big Data technologies like Hadoop aren’t optimized for.

But it’s not an either-or question. I believe Hadoop and other MapReduce-style methods of storing and processing data are going to become indispensable as web-based, machine-generated and other semi-structured data types continue to proliferate. This data is ripe territory for data scientists to find new, breakthrough ways of doing business, and organizations (including Fortune 100 companies) that don’t at least begin investigating how Hadoop can impact their business are making a Big (pun intended) mistake.

At the same time, lightning-quick analytic engines like HANA have the potential to significantly empower business users, giving them the ability to ask questions they never bothered to ask before because existing technology simply took too long to provide the answers.

So SAP actually has two balancing acts to perform, in my opinion. It must continue to support and ultimately transition legacy deployments while simultaneously innovating on top of HANA. But it must also help customers incorporate and support other Big Data approaches where they make sense (for which they will need the help of partners like IBM and potentially smaller players in the Hadoop ecosystem). Perhaps a better analogy than two balancing acts is that SAP is juggling three important balls and can’t afford to drop any of them.

Services Angle

Big data and in-memory analytics are changing application development. CIOs have opportunities to increase performance and business value by orders of magnitude in this new world. Understanding the implications of new development approaches and exploiting in-memory capabilities for data analytics is an emerging trend that will be critical for competitive advantage over the next five years.
