Many data streams, one data lake: the new design for efficient processing


Data is like water: heavy, expensive to move, and stored in countless ways. Working efficiently with data means moving the processing to where the data lives, but if a company’s information comes from hundreds or thousands of sources, that can be tricky.

One solution is called a data lake, a massive collection of data in one place. Those thousand sources can feed into the lake, leaving the company with one target for its processing work.

To shed some light on the makings and advantages of data lakes, a panel of experts sat down together at the BigData SV 2017 conference in San Jose, CA. The panel included Itamar Ankorion, chief marketing officer at Attunity Inc.; Martin Lidl, director at Deloitte UK; and IT solution architect Chris Murphy.

The panel joined George Gilbert (@ggilbert41), co-host of theCUBE, SiliconANGLE’s mobile live-streaming studio (*Disclosure below).

Effective data storage, efficient data processing

Gilbert opened the discussion by asking the guests to explain why they create data lakes.

According to Murphy, his company built a data lake, first and foremost, as an operational asset to produce value for the company. Beyond that immediate value, once the data was in place and curated, the lake became a research asset.

In a similar way, Lidl explained that his team worked on creating business cases from the start. There were many moving parts and a number of teams working on specialized areas. The result, however, was a flexible architecture that could respond to changes very quickly, allowing them to better serve a customer’s particular needs.

Building a data lake is one thing; getting information into it is another. Ankorion stated that’s where his company comes in. People needed an efficient way to get data into their lake to support real-time processing on fresh data. Attunity gives businesses the ability to capture data as it changes across databases, he said.

“We turn the databases into something like feeds that can stream,” Ankorion explained.
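The idea Ankorion describes is commonly called change data capture (CDC): instead of bulk-copying tables, only the rows that have changed since the last read are emitted as a stream of events the lake can consume. A minimal polling sketch of that pattern is below; the `customers` table, its columns, and the `version` high-water mark are hypothetical illustrations, not Attunity’s actual mechanism.

```python
import sqlite3
import json

def capture_changes(conn, last_version):
    """Return (events, new_version) for rows changed after last_version.

    Each changed row becomes a JSON event, turning the table into a feed.
    """
    cur = conn.execute(
        "SELECT id, name, version FROM customers "
        "WHERE version > ? ORDER BY version",
        (last_version,),
    )
    events, high = [], last_version
    for row_id, name, version in cur:
        events.append(json.dumps({"op": "upsert", "id": row_id, "name": name}))
        high = max(high, version)  # advance the high-water mark
    return events, high

# Usage: a source database table read incrementally, like a stream.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, version INTEGER)"
)
conn.execute("INSERT INTO customers VALUES (1, 'Acme', 1), (2, 'Globex', 2)")
events, version = capture_changes(conn, last_version=0)
```

Each subsequent poll passes the returned `version` back in, so only new changes are shipped to the lake, which is what makes real-time processing on fresh data feasible.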

The effect of a proper data lake was immediate for Murphy. “It’s like we’re moving from candles to electricity; there’s no single use case,” he said.

Implementing the data lake put Murphy’s company in control of its data. “It was like night and day,” he said.

The great benefit of this lake was the ability to get a 360-degree view of the customer, according to Murphy. It gave the company a customer-centric view and improved its ability to understand what it could do for its customers.

Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s coverage of BigData SV 2017. (*Disclosure: Some segments on SiliconANGLE Media’s theCUBE are sponsored. Sponsors have no editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE