UPDATED 14:02 EDT / MARCH 15 2017

BIG DATA

Data automation and scheduling becoming mission-critical to big data

Over the years, companies have explored many avenues for handling and manipulating large amounts of data. But with the growth of cloud computing and big data, the explosion in the quantity of data being analyzed, and a seemingly endless number of data sources, the need for fast, flexible and intelligent workflow automation that can adapt to change is greater than ever.

“It’s no longer about taking one or two use cases around big data and driving success. Data and intelligence is now at the center of everything a company does,” said Basil Faruqui, solutions marketing manager at BMC Software Inc.

Faruqui spoke with John Furrier (@furrier) and George Gilbert (@ggilbert), co-hosts of theCUBE, SiliconANGLE Media’s mobile livestreaming studio, during the BigData SV 2017 conference in San Jose, California. (*Disclosure below.)

The discussion centered on how data flow and management are adapting as the big data environment continues to expand rapidly, as well as how data automation is becoming mission-critical for companies that rely on real-time data intelligence.

Automation is the heart of big data management

The most important factors for success in the big data environment are not new. Batch automation and job scheduling are at the heart of big data management today, just as they have been in the past, according to Faruqui. Automating as much of the workflow as possible is critical to adaptable, effective, intelligent and accurate data management.

“Scheduling and workflow automation is absolutely critical to the successes of big data projects,” said Faruqui. “And this is not something new. Hadoop is only 10 years old, but other technologies that have come before Hadoop have relied on this foundation for driving success.”
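
To make the concept concrete, the sketch below shows what batch workflow scheduling looks like in practice, using the open-source Apache Airflow project purely as an illustration; the interview does not name a specific tool, and the pipeline name, schedule and commands here are hypothetical. A daily extract-transform-load pipeline is declared once, each step waits for the previous one to succeed, and failed steps are retried automatically.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "retries": 2,                          # re-run a failed task automatically
    "retry_delay": timedelta(minutes=5),   # wait between retry attempts
}

with DAG(
    dag_id="nightly_ingest",               # hypothetical pipeline name
    start_date=datetime(2017, 3, 1),
    schedule_interval="@daily",            # the scheduler triggers one run per day
    default_args=default_args,
    catchup=False,
) as dag:
    # Placeholder commands stand in for real extract/transform/load jobs.
    extract = BashOperator(
        task_id="extract",
        bash_command="echo 'pull raw data from source systems'",
    )
    transform = BashOperator(
        task_id="transform",
        bash_command="echo 'clean and join datasets'",
    )
    load = BashOperator(
        task_id="load",
        bash_command="echo 'publish results to the warehouse'",
    )

    # Dependencies: each step runs only after the previous one succeeds.
    extract >> transform >> load

The point holds regardless of tooling: dependencies, schedules and retry behavior are declared up front, and the scheduler handles execution and recovery rather than leaving those steps to manual intervention.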

The situations facing chief information officers and data modelers are complex and virtually limitless. Everything from designing customer engagement models to building new development ecosystems to optimizing back-office operations requires not only a firm grasp of the data and data sources companies currently have access to, but also designing for the data and data sources they may be exposed to in the future, Faruqui explained.

That includes navigating the “data swamp” scenario: the complex management of both enterprise and legacy data pools, which must be seamlessly integrated for any real-time “smart” data application to respond as it should, Faruqui concluded.

Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s coverage of the BigData SV 2017 conference. (*Disclosure: Some segments on SiliconANGLE Media’s theCUBE are sponsored. Sponsors have no editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE
