A study conducted by Gartner shows that BI (Business Intelligence) is one of the top three priorities for business development. Many companies are therefore paying close attention to the analytical capabilities of their business applications, as they are a key factor influencing consumer choices.
While many companies are aware of the benefits of harnessing Big Data, most are not yet able to manage and analyze this goldmine of information. The sheer multiplicity of data may be creating a digital divide between companies that can manage and analyze information from multiple sources in real time and those that cannot.
The predictive eye on real-time analytics
A number of emerging trends have proven effective in predictive analytics, giving rise to ways of working that take place in real time. We are now entering the era of VRM (Vendor Relationship Management), in which, thanks to Web 2.0, consumers share information in order to attract a specific profile of brands and products/services. Real-time analysis is a logical trend to study when you consider the processing demands created by the incessant data shared by consumers and end users alike, on the web, through mobile devices, and in log data, to name a few sources.
For example, suppose a big data analytics platform reveals that a person wants to go on holiday to a European country on a particular budget. A travel agency can then offer a matching product in the window closest to the moment that request appears on a platform such as a travel website. As time passes, however, this consumer and application data loses its relevance.
To be effective, real-time analysis must make companies more efficient. For the travel agency's web content and targeted placement, that means sensing each step of the decision-making process in real time: analytics across both real-time and historical data; correlation of past, present, and real-time information; multi-dimensional analysis of continuous real-time feeds; and real-time event capture, filtering, pattern detection, matching, and aggregation.
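The capture-filter-match-aggregate loop above, combined with the time decay of consumer intent from the travel-agency example, can be sketched in a few lines of Python. The event fields, the "travel_intent" event type, and the one-hour half-life are illustrative assumptions, not a real platform's API.

```python
import time
from collections import defaultdict

HALF_LIFE_SECONDS = 3600.0  # assumed: intent loses half its relevance each hour

def relevance(event_ts, now):
    """Exponential time decay: fresher events score closer to 1.0."""
    age = max(0.0, now - event_ts)
    return 0.5 ** (age / HALF_LIFE_SECONDS)

def process(events, now):
    """Capture -> filter -> match -> aggregate, as described in the text."""
    offers = defaultdict(float)
    for ev in events:
        if ev["type"] != "travel_intent":   # filtering
            continue
        if ev["region"] == "Europe":        # pattern matching
            offers[ev["budget_band"]] += relevance(ev["ts"], now)  # aggregation
    return dict(offers)

now = time.time()
events = [
    {"type": "travel_intent", "region": "Europe", "budget_band": "mid", "ts": now - 60},
    {"type": "travel_intent", "region": "Europe", "budget_band": "mid", "ts": now - 7200},
    {"type": "page_view", "region": "Europe", "budget_band": "mid", "ts": now},
]
scores = process(events, now)
# The two-hour-old intent contributes far less than the minute-old one.
```

In a continuous pipeline the same scoring would run over a streaming feed rather than a list, but the decay idea is the same: the offer loses value as the data ages.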
Predictive Analytics: Visualization for Big Data
Real-time data analytics touches almost every part of a business that collects, uses, and analyzes data. Now, more than ever, business leaders, from data managers and enterprise architects to marketing, advertising, and sales managers, are focused on the immediate implications of data.
Tapad, a two-year-old company that provides cross-platform digital advertising technology, delivers real-time analytics and decision-making tools with its big data platform, which processes around 150,000 ad impressions per second to select the right ad for the right user on the right device, all in real time.
Tapad uses technology from Aerospike to deliver ad units in real time to potential buyers. Aerospike offers a flexible NoSQL platform designed to be used as an in-memory database with intelligent data migration, automatic rebalancing, and SSD/flash optimization.
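The per-impression decision described above can be sketched as a lookup against an in-memory profile store followed by a scoring pass over eligible ads. This is only an illustration of the pattern, not Tapad's or Aerospike's actual logic; the profile fields, ad inventory, and scoring rule are all assumptions.

```python
# A plain dict stands in for a key-value store such as Aerospike.
PROFILES = {
    "user-42": {"interests": {"travel", "finance"}},
}

ADS = [
    {"id": "ad-1", "topic": "travel", "devices": {"mobile", "desktop"}},
    {"id": "ad-2", "topic": "gaming", "devices": {"desktop"}},
]

def select_ad(user_id, device):
    """Pick the highest-scoring ad eligible for this user and device."""
    profile = PROFILES.get(user_id, {"interests": set()})
    eligible = [ad for ad in ADS if device in ad["devices"]]
    if not eligible:
        return None
    # Illustrative scoring rule: prefer ads matching a known interest.
    return max(eligible, key=lambda ad: ad["topic"] in profile["interests"])["id"]
```

At 150,000 impressions per second, the whole point is that both the profile lookup and the scoring pass must complete in sub-millisecond time, which is why an in-memory, flash-optimized store sits on the hot path.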
Providing tough competition to industry heavyweights such as Oracle, HP, and IBM, Aerospike's technology is used by companies worldwide to run real-time databases in environments where managing billions of objects and predictably processing anywhere from 200,000 to more than 1 million transactions per second (TPS) is a must, and where even a momentary failure is not an option.
Aerospike provides the only real-time NoSQL database that can combine both structured and unstructured data to support real-time database environments for transactions and other business-critical operations.
Decision analysis, both real-time and historical, is a major challenge for companies looking to increase their productivity, because the complexity of the technology and its implementation can actually get in the way of the benefits of Big Data and complex event processing.
Based on Netezza technology, IBM's PureData System for Analytics is a simple data appliance designed for complex analysis. It simplifies and optimizes the performance of data services for analytical applications, allowing very complex algorithms to run in minutes rather than days, and it is optimized to handle up to a thousand interactive queries per second.
Open source BI vendor Actuate Corp recently teamed up with KXEN, a provider of predictive analytics for business users, to help companies make better business decisions by deploying easy-to-use predictive analytics on petabytes of big data. KXEN’s automated predictive analytics capabilities combine with ActuateOne’s open technology for merging and visualizing various big data sources, supporting customers’ needs for big data infrastructure, from accessing data sources to visualizing to operationalizing data for collaborative decision making.
The Analytics Engine from Vitria, an operational intelligence company, provides advanced, continuous analysis of real-time information and historical data using sophisticated complex event processing (CEP). Vitria's Operational Intelligence platform delivers real-time information to track key performance indicators (KPIs), monitor service-level agreements (SLAs), perform geospatial analytics, measure business performance in real time, and support faster, better business decisions.
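KPI tracking with SLA monitoring of the kind described above usually comes down to continuous aggregation over a sliding time window. The sketch below shows that core idea; the window length, the latency metric, and the 200 ms SLA threshold are illustrative assumptions, not Vitria's implementation.

```python
import time
from collections import deque

class KpiWindow:
    """Sliding-window average of a latency KPI with an SLA check."""

    def __init__(self, window_seconds=60.0, sla_ms=200.0):
        self.window = window_seconds
        self.sla_ms = sla_ms
        self.samples = deque()  # (timestamp, latency_ms)

    def record(self, latency_ms, now=None):
        now = time.time() if now is None else now
        self.samples.append((now, latency_ms))
        # Evict samples that have slid out of the window.
        while self.samples and self.samples[0][0] < now - self.window:
            self.samples.popleft()

    def average_latency(self):
        if not self.samples:
            return 0.0
        return sum(ms for _, ms in self.samples) / len(self.samples)

    def sla_breached(self):
        return self.average_latency() > self.sla_ms

kpi = KpiWindow(window_seconds=60, sla_ms=200)
t0 = 1000.0  # fixed clock for a deterministic example
for i, ms in enumerate([120, 150, 400]):
    kpi.record(ms, now=t0 + i)
# Average latency is (120 + 150 + 400) / 3, just over the 200 ms SLA.
```

A production CEP engine would additionally correlate multiple streams and fire alerts on pattern matches, but the eviction-on-record window shown here is the basic building block.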
Then there’s the Next Generation Information Optimization solution from HP, a product born of its acquisitions of Vertica and Autonomy. The HP Vertica Analytics platform indexes and analyzes both structured and unstructured data in a single platform.