UPDATED 06:07 EDT / SEPTEMBER 30 2015

Got Big Data, but will it blend? Pentaho says it can

Pentaho Corp. has released the first major update to its Big Data platform since its acquisition by Hitachi Data Systems in February, with the goal of helping users blend their data and manage the analytics pipeline more efficiently.

Pentaho kept its brand name in the wake of the acquisition, and now operates as an HDS company while continuing to develop its Big Data products. With version 6.0, the company has introduced the first edition of its enterprise-class Pentaho server, which gives companies the ability to blend their data no matter where it resides in the data pipeline.

The company says this ability to blend data from different sources is necessary because of the emergence of a spectrum of data architectures built for different use cases. Pentaho is talking about traditional data warehouses that host structured data, and data lakes that act as repositories for raw, unstructured data. It’s also referring to data refineries, which sit somewhere between those two architectures and transform raw data, giving users the option of incorporating data sources that are too varied or fast-moving to stage in the data lake.

With Pentaho 6.0, users can access new data services and delivery options to “blend and virtualize” datasets on the fly, giving them faster access and greater flexibility when mashing up their Big Data, the company said. Pentaho has also added governance structures so analytics can be performed at scale. Other new features include inline analytic model editing, which lets users share custom-made measures and metrics for collaboration purposes.
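To make the idea of blending concrete, here is a minimal sketch of joining a structured, warehouse-style table with raw, lake-style JSON records at query time, without staging a combined copy first. The data, names, and `blend` helper are invented for illustration and are not Pentaho’s API; they only show the kind of on-the-fly mashup the feature targets.

```python
import json

# Structured rows, as they might come from a data warehouse.
warehouse_orders = [
    {"order_id": 1, "customer_id": "c1", "total": 120.0},
    {"order_id": 2, "customer_id": "c2", "total": 80.0},
]

# Raw JSON records, as they might sit in a data lake.
lake_clickstream = [
    json.loads('{"customer_id": "c1", "page_views": 14}'),
    json.loads('{"customer_id": "c2", "page_views": 3}'),
]

def blend(orders, clicks, key="customer_id"):
    """Join the two sources on a shared key, yielding one unified view."""
    by_key = {rec[key]: rec for rec in clicks}
    for order in orders:
        merged = dict(order)          # start from the structured row
        merged.update(by_key.get(order[key], {}))  # fold in the raw record
        yield merged

blended = list(blend(warehouse_orders, lake_clickstream))
```

Each blended row now carries both the warehouse fields and the lake fields, even though neither source was copied into the other beforehand.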

Meanwhile, enhancements made to the “push down optimization” feature allow for the automation of data transformations to ensure these are completed using the most efficient resources available. Also, newly introduced “data lineage capabilities” have been put in place to help users understand where their data is coming from.
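The principle behind push-down optimization can be illustrated with a small, hypothetical example: run a transformation where the data already lives (here, inside SQLite) rather than pulling every row into the client first. The table and queries are invented for illustration; Pentaho applies the same idea across whatever engines hold the data.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 100.0), ("east", 50.0), ("west", 75.0)])

# Pushed down: the database does the aggregation, and only the
# small summarized result travels to the client.
pushed_down = dict(conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"))

# Not pushed down: every row is fetched and the client does the work.
rows = conn.execute("SELECT region, amount FROM sales").fetchall()
client_side = {}
for region, amount in rows:
    client_side[region] = client_side.get(region, 0.0) + amount

# Same answer either way; the difference is where the work happens
# and how much data moves.
assert pushed_down == client_side
```

Automating that choice, so each transformation lands on the most efficient available resource, is what the enhanced feature is meant to handle.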

“Companies need software that can efficiently manage the process flow of multiple diverse data sources in a scalable manner to create the unified analytic data sets that lead to insight,” Tony Cosentino, VP and research director at Ventana Research, said in a statement. “This is one of the most effective ways to provide the value needed and expected by management. In this latest release, Pentaho 6.0 addresses not only the need to manage the process flow, but also to help automate the entire analytic data pipeline.”

The company says it will demonstrate the capabilities of Pentaho 6.0 at its PentahoWorld event in Orlando, Florida, on October 14, the same day the platform hits general availability.

Image credit: appelogen.be via flickr.com
