UPDATED 17:52 EDT / JUNE 01 2016

The DevOps of Data: How Informatica Prepares Developers for the Age of Data 3.0 #INFA16

Last week’s Informatica World 2016 brought out a lot of talk about data quality, real-time data and the automation of ingesting and analyzing data in order to turn it into something businesses can use. Over the past few years, cloud technologies have delivered provisioned, elastic computing resources, expanded data capabilities for developers and even automated much of the work, but with the influx of large amounts of data, DevOps models are still needed to ingest and process it.

Anil Chakravarthy, CEO of Informatica LLC, calls this the Age of Data 3.0: the opportunity to build business models around essential questions about the data a company produces, data that would otherwise remain opaque because actionable information does not come without analysis.

“Data is an essential part of making your digital strategy work,” Chakravarthy said during a SiliconANGLE theCUBE interview at the conference.

This belief intersects with the DevOps paradigm because understanding how an application is functioning requires data flowing from the app running in production. The challenge for very large companies is that any number of sensors can be added at every level of application deployment, generating a great deal of data that must be moved, stored and finally analyzed before operations or development can act on it.

The DevOps Angle from Informatica World 2016

During his interview with theCUBE, Chakravarthy used a practical example of data coming from an aircraft engine in flight to a mechanic crew on the ground. Analysis of the data from sensors on the engine could determine whether it was behaving out of the ordinary, by setting particular ranges (for alerts or further analysis) and comparing readings across multiple sensors.
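To make the idea concrete, here is a minimal sketch of that kind of range check and sensor cross-check, written in plain Python rather than any Informatica tooling; the sensor names and thresholds are hypothetical.

```python
# Minimal sketch of range checks on engine sensor readings.
# Sensor names and thresholds are hypothetical, for illustration only.

EXPECTED_RANGES = {
    "exhaust_gas_temp_c": (300.0, 650.0),
    "oil_pressure_psi": (25.0, 95.0),
    "vibration_mm_s": (0.0, 7.0),
}

def out_of_range(readings):
    """Return alerts for any reading outside its expected range."""
    alerts = []
    for sensor, value in readings.items():
        low, high = EXPECTED_RANGES.get(sensor, (float("-inf"), float("inf")))
        if not low <= value <= high:
            alerts.append(f"{sensor}={value} outside [{low}, {high}]")
    return alerts

def sensors_disagree(a, b, tolerance=0.1):
    """Cross-check two redundant sensors; large divergence suggests a fault."""
    return abs(a - b) > tolerance * max(abs(a), abs(b), 1.0)

readings = {"exhaust_gas_temp_c": 702.5, "oil_pressure_psi": 61.0, "vibration_mm_s": 3.2}
for alert in out_of_range(readings):
    print("ALERT:", alert)
if sensors_disagree(702.5, 590.0):
    print("ALERT: redundant temperature sensors disagree")
```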

For the purposes of DevOps, data streaming in from software sensors in servers, on mobile and on the web provides information on how applications are behaving. DevOps teams face the challenge of not just choosing what to instrument (where to put sensors and what kind of data to pull from applications) but also how to move, store, ingest and finally process that data into actionable information.
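As a rough illustration of what a “software sensor” can look like, the sketch below wraps an application function so that every call emits a small event record, which a real deployment would then ship into whatever pipeline moves, stores and processes it. All names here are illustrative, and the emit step is just a print.

```python
import functools
import json
import time

def emit(event):
    """Stand-in for shipping an event to a collector or message queue."""
    print(json.dumps(event))

def instrumented(fn):
    """Software 'sensor': record latency and outcome for every call."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.time()
        try:
            result = fn(*args, **kwargs)
            emit({"fn": fn.__name__, "ms": round((time.time() - start) * 1000, 2), "ok": True})
            return result
        except Exception as exc:
            emit({"fn": fn.__name__, "ms": round((time.time() - start) * 1000, 2),
                  "ok": False, "error": type(exc).__name__})
            raise
    return wrapper

@instrumented
def checkout(cart_total):
    """Hypothetical application function being observed."""
    return f"charged {cart_total:.2f}"

checkout(42.50)
```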

Data security is not far removed from how DevOps handles data: faults and flaws in software running in the field require action by operations and development, just as security flaws require a response from the security team.

When it comes to how Informatica secures data, Bill Burns, Chief Information Security Officer and interim CIO at Informatica, noted that any cloud data automation solution should allow a company to stick to its core competency and trust the provider. On the DevOps side, agile teams are often small, and the individuals involved do many jobs spanning both operations and development. As a result of automation, a great deal of information and data flows through other parties (for storage, ingestion, transformation and analysis).

Burns noted that small and medium businesses and enterprises pulling data through Informatica’s solutions in search of value need to be sure they can trust the processes and pipes Informatica provides. This is even more important as more data is gathered and many clouds no longer live on premises, making trust in those off-premises solutions key.

Both security and DevOps continue to trend toward letting data analysis make autonomous decisions based on real-time data coming in through the various sensors and instrumentation. Both also need automation that reacts to unexpected outcomes in that data: if an application is behaving unexpectedly, it could be a flaw in the logic (a user accidentally attempting something they shouldn’t) or a security problem (a malicious user attempting something that should not happen in order to break into the system).
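A toy sketch of that distinction, using made-up signals and thresholds, might route an unexpected event to the security queue, the operations queue or development depending on what the surrounding data suggests; nothing below comes from an Informatica product.

```python
def triage(event):
    """Route an unexpected application event to the right responders.

    The heuristics are deliberately simple: repeated auth failures or odd
    payloads lean toward security; errors under heavy load lean toward ops;
    anything else goes to development as a possible logic flaw.
    """
    if event.get("auth_failures", 0) > 5 or event.get("payload_looks_malformed"):
        return "security"      # possible malicious probing
    if event.get("error_rate", 0.0) > 0.05 and event.get("load", 0.0) > 0.8:
        return "operations"    # likely capacity or logic problem under load
    return "development"       # investigate as possible accidental misuse or a bug

print(triage({"auth_failures": 9}))                # -> security
print(triage({"error_rate": 0.12, "load": 0.93}))  # -> operations
```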

Both cases require investigation by someone from the security team or the DevOps team to determine whether there is a problem and how to resolve it. A security problem may demand a real-time response to prevent an attacker from accessing data they should not see; an operations problem might mean adding servers (to lower latency or keep customers from hitting roadblocks); and for development it could mean quickly producing a patch that prevents the issue in the future.

In every instance, the influx of information, capable analysis of that data and action taken on it require a framework that can process it all, and Informatica hopes to deliver both the tools and the trust with its current products.

Aircraft engine maintenance

In the evolving field of rapid deployment and continuous integration, DevOps teams need more information about how software behaves while it’s running. Just like maintaining an aircraft engine, knowing what’s happening “inside” running software and getting ahead of flaws is a data problem. photo credit: Inside the A-10 via photopin (license)


Informatica and Automation: The future of the DevOps of Data

With the explosion of opportunities around Big Data, a revolution in automation is underway for a DevOps of Data.

The DevOps of Data may soon mean automating the deployment of instrumentation, the movement and storage of that data into the cloud and, finally, the analytics and number crunching needed to surface actionable information. Informatica’s cloud and data solutions, such as Informatica PowerCenter, a data orchestration and integration solution, and Informatica Cloud, link up nicely to give operations and developers management and automation of these resources.
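Conceptually, those stages chain together along the lines of the sketch below, which uses plain Python stand-ins rather than PowerCenter or Informatica Cloud: collect records from instrumented apps, land them in storage, then analyze them for anything worth acting on.

```python
# Conceptual sketch of the stages described above; the stage implementations
# are plain Python placeholders, not Informatica APIs.

def collect_metrics():
    """Stage 1: pull records from instrumented applications (stubbed here)."""
    return [{"service": "api", "latency_ms": 180}, {"service": "api", "latency_ms": 950}]

def store(records, bucket):
    """Stage 2: land the raw records in durable storage (a list stands in)."""
    bucket.extend(records)
    return bucket

def analyze(bucket, slow_ms=500):
    """Stage 3: turn stored records into actionable findings."""
    return [r for r in bucket if r["latency_ms"] > slow_ms]

storage = []
store(collect_metrics(), storage)
for finding in analyze(storage):
    print("Action needed:", finding)
```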

What is Informatica’s response to the Age of Data 3.0 for the DevOps of Data? In a guest post on Informatica’s blog, Frank Melchiorre, Digital Marketing Coordinator at Advanced Systems Concepts, Inc., made a brilliant case for what that would look like. The answer lies somewhere between data management and data storage and the extraction, transformation and loading (ETL) of data, using orchestration extensions such as the ActiveBatch Informatica PowerCenter Extension and Informatica Cloud Extension.

On one level, DevOps is an attempt to automate and abstract away the complex processes involved in the configuration, deployment, testing and maintenance of software under development. Typically, this is done by filtering information gathered from operations about the processes needed to deploy and maintain projects.

As for Melchiorre’s case for a unified workload, this means built-in scheduling and automation of workflow operations, event triggers based on expectations (to bring in the DevOps crew when something needs to happen or is going wrong) and automation of the processes that ingest and analyze logs and other data about running applications.
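To illustrate the “event triggers based on expectations” piece, here is a small sketch with made-up job names and thresholds: a check that a scheduler would run on a cadence, paging the DevOps crew only when a workflow misses what was expected of it.

```python
from datetime import datetime

def nightly_ingest_row_count():
    """Stub: report how many rows last night's ingest workflow landed."""
    return 0  # pretend the job silently failed

def notify_devops(message):
    """Stub for a paging or chat notification."""
    print(f"[{datetime.now().isoformat(timespec='seconds')}] PAGE: {message}")

def check_expectations():
    """Event trigger: fire only when reality misses the expectation."""
    expected_min_rows = 10_000  # hypothetical expectation for the workflow
    actual = nightly_ingest_row_count()
    if actual < expected_min_rows:
        notify_devops(f"nightly ingest produced {actual} rows, expected >= {expected_min_rows}")

if __name__ == "__main__":
    # A real deployment would have a scheduler or orchestration tool run this
    # on a cadence; calling it once keeps the sketch self-contained.
    check_expectations()
```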

Informatica already provides native data integration: Informatica PowerCenter delivers enterprise-grade ETL for real-time data analytics, advanced transformation and data management. The framework supplies the range triggers, pattern recognition and analysis Chakravarthy referred to when discussing the analysis of airplane-engine data to predict failing parts. Applications in the wild work very similarly and likewise depend on logic and subroutines that could prove faulty under certain types of load.
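As a generic illustration of the pattern-recognition idea (not PowerCenter code), the sketch below keeps a rolling window of recent readings and flags a new value that breaks sharply from the established pattern, the same shape of check whether the stream is engine telemetry or application latency.

```python
from collections import deque
from statistics import mean, stdev

class DriftDetector:
    """Flag readings that break sharply from the recent pattern.

    Keeps a rolling window of recent values and flags a new value that sits
    more than `threshold` standard deviations from the window mean.
    """

    def __init__(self, window=50, threshold=3.0):
        self.values = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, value):
        drifted = False
        if len(self.values) >= 10:  # wait for enough history to be meaningful
            mu, sigma = mean(self.values), stdev(self.values)
            drifted = sigma > 0 and abs(value - mu) > self.threshold * sigma
        self.values.append(value)
        return drifted

detector = DriftDetector()
for latency in [110, 120, 115, 118, 112, 119, 117, 121, 114, 116, 480]:
    if detector.observe(latency):
        print(f"Pattern break: {latency} ms is far outside recent behavior")
```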

Featured image credit: Wikimedia, https://upload.wikimedia.org/wikipedia/commons/e/e0/A_view_of_the_server_room_at_The_National_Archives.jpg
