GoodData, a venture-backed provider of cloud-based business intelligence (BI) software, is jumping into the unified analytics race with a new service for ingesting and processing different types of data from multiple sources.
“Advanced analytics can be the biggest advantage companies can have in today’s market, but most use just 10 percent of the data available to them,” noted GoodData founding CEO Roman Stanek. “To get a true competitive advantage from analytics, businesses need to use all the data available to them – to move beyond big data and become all data enterprises.”
The firm’s freshly announced Open Analytics Platform promises to help customers deliver on this vision and make analytics a part of their business with a set of APIs that plug into backend data integration, workflow and discovery processes. The interfaces also enable IT organizations to push new functionality by taking advantage of built-in connectors and metadata maps for more than 50 data sources, including traditional on-premise systems and Big Data solutions like Hadoop as well as popular cloud applications.
GoodData is taking on the ambitious task of serving up data sets from variable sources, leveraging APIs and a steady stream of software updates to set itself apart. Democratizing data is a matter of automating user-friendly software that simplifies the specialized skill of analysis. Building a business on such Big Data promises is a pitch I hear often, especially as the buzz dies down and buyers demand comprehensive solutions.
When asked how GoodData gives end users data discovery freedom they otherwise couldn't access, Stanek explained that it's a mix of IT-managed access points and real-time delivery to end users.
“IT gets to control and manage the policies, processes and infrastructure to deliver trusted data to the business people, while business users get to explore the data to uncover insights in an easy-to-use and attractive interface,” says Stanek. “GoodData’s interactive data discovery capabilities give users a constant, real-time view of every part of a business, from sales and marketing to supply chain operations. Visual analytical explorers enhance ad hoc, custom and pre-built dashboards to give users the right insight whenever they want it.”
On the operations side, it provides version control functionality, allows admins to “propagate changes instantly” and offers governance capabilities for ensuring that the right corporate data is available to the right people at the right time. GoodData claims that the service can be set up in days, compared with the typical 18 months it takes to deploy a legacy BI solution from SAP or IBM.
Servicing Big Data in the cloud
GoodData’s looking to gain ground in a maturing market, where data-driven insights are being applied to every department within an organization. Data has become the competitive advantage, and a services market has cropped up around its growing demand. The cloud model is making its way across the software scene, as expensive developer-plus-IT projects are sourced out to third-party services. Scalable, virtualized and contracted, this market is also becoming one where differentiation matters.
So what does an “all data” enterprise look like? For GoodData, it’s one that recognizes the significance of data.
“All data enterprises view the ability to leverage advanced data and analytics as a key competitive advantage,” explains Stanek. “By using analytics as the fuel behind nearly every process and decision of the business, these organizations literally run on data. Examples of companies who have done this extremely well are Netflix, AirBnB, and LinkedIn.”
“But for every LinkedIn, there are 10,000 companies that don’t have that level of technological expertise,” Stanek goes on. “They rely on companies like GoodData to power their analytics. Today, thanks to Hadoop and other big data technologies, companies can view previously disconnected data as a cohesive whole. Employees at every level of a company need to be able to access, analyze, and digest data.
“It has nothing to do with the volume of data, which is where the now-cliché buzzword ‘big data’ originated, but rather the concept that all data — regardless of data type or source — should be accessible in order to uncover vital correlations between, say, customer call center data, website usage and sales.”
Maria Deutscher contributed to this piece.
Kristen Nicole has also contributed to other publications, from TIME Techland to Forbes. Her work has been syndicated across a number of media outlets, including The New York Times, and MSNBC.
Kristen Nicole published her first book, The Twitter Survival Guide, and is currently completing her second book on predictive analytics.