UPDATED 12:30 EDT / APRIL 17 2017

BIG DATA

Unifying the data processing pipeline with Apache Beam

Setting up a data processing pipeline is a juggling act. What applications work with the backend? Can those applications work together? What about fitting it into existing infrastructure?

The best answer to those questions is a system that can reach across many applications and infrastructures, according to Kenneth Knowles (pictured), software engineer at Google Inc.

One such system is Apache Beam, an open-source, unified model for data processing workflows. “The truth is it’s extremely general,” said Knowles, who spoke to George Gilbert (@ggilbert41), co-host of theCUBE, SiliconANGLE’s mobile live streaming studio, at the Flink Forward 2017 event last week in San Francisco, California.

Knowles and Gilbert discussed Beam, Apache Flink and data processing solutions. (*Disclosure below.)

A unification of backends and languages

The genesis of Beam was a combined code donation to Apache that drew on Google Cloud Dataflow, Apache Spark and Apache Flink, Knowles explained. These three efforts toward the same end made the case for a unified model.

There are three main aspects to Beam: portability across backends, unification of streaming and batch processing, and the ability to work across multiple languages, such as Python and Java. Beam is also ready to handle a wide range of use cases, from high-frequency trading to fraud detection. A company can also use it to crawl the web, Knowles stated.
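To make that unification concrete, here is a minimal sketch using the Beam Python SDK (pip install apache-beam). The transforms stay the same no matter which backend runs them; only the runner option changes, and the in-memory source, keys and timestamps are placeholders for illustration rather than anything Knowles described.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms import window

# Swap 'DirectRunner' for 'FlinkRunner', 'SparkRunner' or 'DataflowRunner'
# (plus that runner's own options) without touching the transforms below.
options = PipelineOptions(runner='DirectRunner')

with beam.Pipeline(options=options) as p:
    (p
     # A bounded, in-memory source stands in for a real batch or streaming input.
     | 'Read' >> beam.Create([('checkout', 20), ('checkout', 35), ('refund', 5)])
     # Attach a toy event timestamp so windowing has something to group on.
     | 'Timestamp' >> beam.Map(lambda kv: window.TimestampedValue(kv, 0))
     # The same fixed 60-second windows apply whether the input is bounded or unbounded.
     | 'Window' >> beam.WindowInto(window.FixedWindows(60))
     | 'SumPerKey' >> beam.CombinePerKey(sum)
     | 'Print' >> beam.Map(print))
```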

Beam’s usage profile puts it in direct competition with MapReduce, a component of the Apache Hadoop processing framework. Knowles confirmed that Beam is intended as a replacement for MapReduce. Anyone writing a MapReduce pipeline should benchmark it against a Beam pipeline, he suggested.
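For a sense of how a MapReduce-style job translates, here is a sketch of the classic word count written as a Beam pipeline in Python; the input and output paths are placeholders, and this is an illustrative example rather than a benchmark Knowles cited.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

with beam.Pipeline(options=PipelineOptions()) as p:
    (p
     | 'Read' >> beam.io.ReadFromText('input.txt')
     # The "map" phase: emit one (word, 1) pair per word.
     | 'SplitWords' >> beam.FlatMap(lambda line: line.split())
     | 'PairWithOne' >> beam.Map(lambda word: (word, 1))
     # The shuffle and "reduce" phase: group by word and sum the counts.
     | 'CountWords' >> beam.CombinePerKey(sum)
     | 'Format' >> beam.Map(lambda kv: '%s\t%d' % kv)
     | 'Write' >> beam.io.WriteToText('counts'))
```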

As for working with Flink, Beam still has a way to go. Every backend is missing some pieces of the final model, Knowles acknowledged. Beam is not trying to settle for the intersection of what those backends support, however; the project is working to bring the missing features to every system, he said.

“For myself, the goal is that nobody is going to be locked into a particular backend,” Knowles concluded.

Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s coverage of Flink Forward 2017. (*Disclosure: TheCUBE is a paid media partner at Flink Forward. The conference sponsor, data Artisans, does not have editorial oversight of content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE
