UPDATED 13:05 EDT / APRIL 17 2012

NEWS

TIBCO Says In-Memory Platform Supports Both Historical and High Velocity Analytics

Sometimes you want to run analytics on large volumes of historical data – aka Big Data analytics. And other times you need to perform analysis on data as it arrives at your front door – aka real-time analytics. Wouldn’t it be great if there were one platform that allowed you to do both, with each type of analysis supporting and enriching the other?

Well, that’s what TIBCO says it has with the latest iteration of its ActiveSpaces hybrid in-memory computing platform. According to TIBCO CTO Matt Quinn, ActiveSpaces 2.0 “is the glue that holds these two pieces together.”

It does so thanks in part to its architectural design, which Quinn calls “shared nothing persistence.” ActiveSpaces supplements its in-memory storage with disk-based storage, allowing administrators to elastically scale out deployments to hundreds of disk-based nodes if needed. And because it’s a distributed peer-to-peer network, the platform does not require a central server.
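How that plays out can be pictured with a small, illustrative sketch: in a shared-nothing, peer-to-peer grid, any node can compute which peer owns a given key, so no central server is involved and each peer persists only its own partition to local disk. The class and node names below are assumptions for illustration, not TIBCO’s ActiveSpaces API.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Illustrative sketch only: key routing in a shared-nothing, peer-to-peer grid.
// The peer names and hashing scheme are assumptions, not TIBCO's implementation.
public class PeerRouter {
    private final List<String> peers;

    public PeerRouter(List<String> peers) {
        this.peers = new ArrayList<String>(peers);
    }

    // Any node can compute the owner of a key locally, so no central
    // coordinator is needed to route reads and writes.
    public String ownerOf(String key) {
        int bucket = Math.floorMod(key.hashCode(), peers.size());
        return peers.get(bucket);
    }

    public static void main(String[] args) {
        PeerRouter router = new PeerRouter(
                Arrays.asList("node-1", "node-2", "node-3", "node-4"));
        for (String key : Arrays.asList("trade-1001", "trade-1002", "trade-1003")) {
            System.out.println(key + " is owned by " + router.ownerOf(key));
        }
    }
}
```

Each owning peer would then hold its partition in memory and write it through to its own local disk, which is the “shared nothing persistence” idea in miniature.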

[Image: TIBCO ActiveSpaces, courtesy of TIBCO, 2012]

The upshot is that ActiveSpaces can store and process large volumes of transactional data on disk for Big Data analytic applications, while also feeding real-time applications, such as complex event processing apps, with high velocity data held in memory and enriched with historical context.
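A minimal sketch of that hybrid read path, assuming a simple two-tier key-value store (the class and method names below are illustrative, not the ActiveSpaces API): recent, high velocity records are served from memory, older records fall back to a disk-backed tier, and the real-time application sees a single lookup.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch of a hybrid in-memory/disk read path. Hot, high velocity
// records live in memory; older records sit in a disk-backed tier (simulated
// here by a plain map); callers see one unified lookup.
public class HybridStore {
    private final Map<String, String> memoryTier = new ConcurrentHashMap<String, String>();
    private final Map<String, String> diskTier;   // stand-in for a disk-backed node

    public HybridStore(Map<String, String> diskTier) {
        this.diskTier = diskTier;
    }

    public void putHot(String key, String value) {
        memoryTier.put(key, value);
    }

    // Serve from memory when possible; fall back to disk for historical context.
    public String get(String key) {
        String value = memoryTier.get(key);
        return (value != null) ? value : diskTier.get(key);
    }

    public static void main(String[] args) {
        Map<String, String> historical = new HashMap<String, String>();
        historical.put("cust-42", "ten-year purchase history...");

        HybridStore store = new HybridStore(historical);
        store.putHot("cust-42:last-click", "checkout-page");

        // A CEP-style rule can combine the live event with its historical context.
        System.out.println(store.get("cust-42:last-click") + " / " + store.get("cust-42"));
    }
}
```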

ActiveSpaces came about, in part, because of the difficulty TIBCO’s complex event processing products had writing data to traditional databases, according to Quinn. “Just dealing with round-trip issues, retrieving high velocity data quickly was hard,” Quinn said.

In-memory storage means applications no longer have to call out to far-flung databases; instead they operate on data held locally in memory. Practically speaking, this means ActiveSpaces gives TIBCO customers the ability to execute standard Java code next to the data rather than relying on triggers or stored procedures that must be maintained across databases, according to Quinn.
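The sketch below illustrates that “move the code to the data” pattern in plain Java, with a hypothetical DataNode and Task standing in for any real grid API: the caller ships a task to the node holding the entries, the task runs against the local in-memory map, and only the result travels back.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of executing code next to the data instead of using a database
// trigger or stored procedure. DataNode and Task are hypothetical types
// used only to illustrate the collocated-execution pattern.
public class CollocatedExecution {

    interface Task<R> {
        R run(Map<String, Double> localEntries);
    }

    static class DataNode {
        private final Map<String, Double> localEntries = new ConcurrentHashMap<String, Double>();

        void put(String key, double value) {
            localEntries.put(key, value);
        }

        // In a real grid this call would arrive over the network; here it is local.
        <R> R execute(Task<R> task) {
            return task.run(localEntries);
        }
    }

    public static void main(String[] args) {
        DataNode node = new DataNode();
        node.put("order-1", 120.0);
        node.put("order-2", 80.5);
        node.put("order-3", 42.0);

        // The aggregation runs next to the data; only the result crosses the wire.
        double total = node.execute(entries ->
                entries.values().stream().mapToDouble(Double::doubleValue).sum());
        System.out.println("Total order value: " + total);
    }
}
```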

In-memory computing platforms are becoming more popular as the price of memory drops, with SAP in particular driving their emergence in analytic use cases. The company is basing much of its future growth on HANA, a purely in-memory computing appliance that allows users to quickly run “what if” scenarios on large data volumes.

ActiveSpaces’ ability to support both historical analysis and real-time decision making could give it a leg up on competing Big Data approaches, such as Hadoop. Hadoop has proven itself a relatively inexpensive platform for storing and processing large volumes of multi-structured data, but it loads and processes data in batches. That makes it an ideal platform for historical analysis, but unsuitable for streaming analytics and real-time applications.

Still, hybrid in-memory/disk-based platforms like ActiveSpaces have yet to be put to the test in many high volume, high velocity production environments. CIOs should require TIBCO and other vendors pushing such platforms to perform rigorous proofs of concept before settling on the approach as the basis of their Big Data practices. Another option is to complement Hadoop deployments with in-memory engines or Next Generation Data Warehouses for fast loading and real-time capabilities.

