Stream processing: turning the database inside out
When stream processors first appeared, the computing industry treated them as a mere add-on to the batch data stack and the other stacks information technology administrators were already running. But stream processing has come a long way, and it is beginning to change how people collect data and upgrade applications. This evolution is reshaping the principles applied to stream processing as the technology’s limitations are lifted.
“When you start to take this streaming style approach to things … it’s been called turning the database inside out, unbundling the database,” said Stephan Ewen, chief technology officer of data Artisans GmbH. “Your input sequence of events is arguably the ground truth, and what the stream processor computes is a view of the state of the world.”
Ewen spoke with George Gilbert (@ggilbert41), co-host of theCUBE, SiliconANGLE Media’s mobile livestreaming studio, at the Flink Forward event in San Francisco. They discussed how stream processing has evolved and what Apache Flink brings to the table. (* Disclosure below.)
Stream processing for real-time applications
As Ewen explained, early stream processors treated state under a different, weaker consistency model, and their notion of time differed from that of batch processors and databases. With that interpretation, they could not truly complement the rest of the stack: a batch job could not reinterpret a streaming job.
“So once the stream processors adopted a stronger consistency model, a time model that is more compatible with reprocessing … all of these things all of a sudden fit together much better,” Ewen said.
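To make the time-model point concrete, below is a minimal sketch using Apache Flink’s DataStream API in Java. The Click class, its fields and the inline source are hypothetical stand-ins; the idea it illustrates is that keying a stream and windowing it by event time, with watermarks to bound out-of-orderness, lets the same job run over a live stream or replay historical events and arrive at the same result, which is the reprocessing-friendly time model Ewen describes.

```java
// A minimal sketch, assuming Apache Flink's DataStream API for Java.
// "Click", its fields, and the inline source are hypothetical stand-ins.
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.TumblingEventTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;

import java.time.Duration;

public class ClickCounts {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Timestamps come from the events themselves (event time), not from
        // the wall clock of the machine running the job, so replaying old
        // events produces the same windows as processing them live.
        DataStream<Click> clicks = env
                .fromElements(new Click("home", 1_000L), new Click("home", 61_000L))
                .assignTimestampsAndWatermarks(
                        WatermarkStrategy.<Click>forBoundedOutOfOrderness(Duration.ofSeconds(5))
                                .withTimestampAssigner((click, ts) -> click.timestampMillis));

        clicks.keyBy(click -> click.page)
              .window(TumblingEventTimeWindows.of(Time.minutes(1))) // windows defined by event time
              .sum("count")
              .print();

        env.execute("event-time click counts");
    }

    // Simple POJO so Flink can access the "count" field by name.
    public static class Click {
        public String page;
        public long timestampMillis;
        public int count = 1;

        public Click() {}

        public Click(String page, long timestampMillis) {
            this.page = page;
            this.timestampMillis = timestampMillis;
        }
    }
}
```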
As stream processing evolved, platforms such as Apache Flink were able to take the concept of distributed, consistent snapshots and compose many local snapshots into one big, consistent snapshot of the application’s state. With consistent snapshots in place, upgrading an application, which usually involves running two versions of it side by side, becomes much easier: operators can switch from one version to the other as if nothing were shared.
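As an illustration of how those snapshots surface to a developer, here is a minimal, hedged sketch that simply enables Flink’s checkpointing, the mechanism that produces the consistent distributed snapshots discussed above. The interval and mode are arbitrary example values, not recommendations; in practice, a savepoint built on the same mechanism is what lets a new version of the application resume from the state where the old one stopped.

```java
// A minimal sketch, assuming Apache Flink's checkpointing API; the interval
// and mode below are arbitrary example values, not a recommendation.
import org.apache.flink.streaming.api.CheckpointingMode;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class SnapshotConfig {

    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Ask Flink to take a consistent snapshot of all operator state every
        // 10 seconds, with exactly-once guarantees. These snapshots (and the
        // manually triggered savepoints built on the same mechanism) are what
        // a new version of the application can be started from.
        env.enableCheckpointing(10_000, CheckpointingMode.EXACTLY_ONCE);

        // ... define sources, transformations and sinks here, then call
        // env.execute(...) as usual.
    }
}
```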
“Now that is what is … kind of the core idea and what helps Flink generalize from batch processing to … real-time stream processing to event-driven applications,” Ewen concluded.
Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s coverage of the Flink Forward 2018 event. (* Disclosure: TheCUBE is a paid media partner for Flink Forward 2018. Neither data Artisans GmbH, the event sponsor, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)