Organizations around the world, cutting across every industry, are learning how to supplement their “gut feeling” with facts and figures, bringing new meaning to their business leaders’ decision-making processes. Technological advancements that can take into account historical data, internal transactions and employee logs have ushered in a new era of data analysis, and it’s empowering companies to operate with the confidence of clarity and self-awareness.
Data’s always been there, but we haven’t always captured it, let alone analyzed it. In a knowledge-driven world, though, data has become a competitive advantage for understanding your market. The past couple of decades enabled a highly structured approach to data capture and analysis, and new technologies in storage and data transfer have since created a much friendlier environment for a wider range of data sources.
The data melting pot
Indeed, data can come from anywhere: browser clicks, log files, mobile data and social networking posts are some of the many forms of data now pushing information analytics into the big leagues. Any company with a digital footprint could easily exceed 10 terabytes (TB) of data for storage and transfer, just to run a single marketing campaign.
The problem is that conventional data warehouses cannot scale up to support terabytes of data alongside advanced analytics. Over the last decade, parallel processing platforms and columnar databases, which store data by column rather than by row, have started a new revolution in data analysis.
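To see why column-oriented storage helps analytics, consider a toy sketch (the field names are made up for illustration): a row store keeps each record together, so an aggregate over one field still touches every full record, while a column store keeps each field's values contiguous.

```python
# Toy comparison of row-oriented vs. column-oriented layouts.
# (Illustrative only; real columnar databases add compression,
# vectorized execution and disk-aware layouts on top of this idea.)

# Row-oriented: one dict per record
rows = [
    {"user": "a", "clicks": 3, "revenue": 10.0},
    {"user": "b", "clicks": 5, "revenue": 12.5},
    {"user": "c", "clicks": 2, "revenue": 4.0},
]

# Column-oriented: one list per field
columns = {
    "user": ["a", "b", "c"],
    "clicks": [3, 5, 2],
    "revenue": [10.0, 12.5, 4.0],
}

# Aggregating clicks from the row store scans every full record...
total_row = sum(r["clicks"] for r in rows)

# ...while the column store reads one contiguous list.
total_col = sum(columns["clicks"])

print(total_row, total_col)  # both 10
```

The answers are identical; the difference is how much data each layout has to touch, which is what makes column stores fast for analytical scans.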
The interest in data analysis is driven primarily by the need to handle large volumes of loosely or inconsistently structured data: alerts and news from social networks, web links, email, documents and other text-centric information. These data types can power applications such as sentiment analysis of your customers, but they cannot be managed effectively in a relational database such as SQL Server, Oracle Database or IBM DB2.
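A minimal sketch makes the point about free text: the sentiment of a social post lives in arbitrary wording, not in fixed columns. The word lists below are hypothetical, and real systems use trained models rather than keyword counts, but even this toy version shows why the input resists a rigid relational schema.

```python
# Toy keyword-based sentiment scorer (hypothetical word lists;
# real sentiment analysis uses trained statistical models).
POSITIVE = {"love", "great", "fast", "happy"}
NEGATIVE = {"hate", "slow", "broken", "bad"}

def sentiment(text: str) -> int:
    """Score free text: +1 per positive word, -1 per negative word."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = [
    "love the new dashboard great and fast",
    "the app is slow and broken",
]
scores = [sentiment(p) for p in posts]
print(scores)  # [3, -2]
```

Each post yields a score only after text processing; there is no stable set of columns to declare up front, which is exactly where row-and-column relational storage struggles.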
Big data may seem to be the realm of big business, but its impact is as broad as its ambitious name suggests. Tableau Software is one vendor helping companies see and understand data, setting out to help anyone quickly analyze, visualize and share information.
Tableau’s desktop-based data visualization tool has been adopted by dozens of large companies and media organizations, and is used to create sophisticated visual representations, business dashboards and fast analytics of big data. Tableau Desktop lets people create interactive reports, dashboards and visualizations using an easy drag-and-drop interface.
The result is company-wide access to knowledge, impacting an organization on every level. It can be a democratizing effect that makes everyone a little bit smarter, a little bit freer, and a lot happier with their contributions at work. But it raises the question: are we dumbing down data for the masses?
Are we dumbing down data?
Tableau doesn’t see it as “dumbing down” anything, but rather as making data more accessible. “When people think of dumbing down, they think of less capabilities,” explains Elissa Fink, CMO at Tableau. “We think it’s about making the tools accessible.” She goes on to describe the three prerequisites Tableau considers for its software:
1. You need to give access to the data
2. The tools have to be easy to use
3. They have to be fast (get the answer when you need it)
As for leveraging data to take the emotion out of a business’s decision-making process? Fink thinks “the first thing is, when you empower people in the organization to have access to the tools they need to do a better job, then it results in more satisfied employees. Take that and think, ‘I need data to do my job better,’ and I’ll make better decisions and improve the company overall. And you’re more satisfied.”
The Tableau way
Tableau Software is currently used by more than 9,000 companies and tens of thousands of people, all analyzing and sharing data. Gartner and IDC ranked the company in 2011 as the world’s fastest-growing business intelligence firm. This year Tableau came in at number 14 on the 100 Fastest-Growing Private Companies list.
One industry particularly interested in consumer data is journalism, where understanding your readers as well as trending topics requires skill and lots of metrics. The Seattle Times uses Tableau software to enhance its digital storytelling. They added a “Show Me” button to featured articles, granting readers an interactive opportunity to see records visualized in different ways, such as laying out housing rates on a map.
“Virtually every news organization out there right now is trying to figure out how to make sure that their readers are engaged in what they’re providing, and having interactive visualizations on our site is critical to doing that. It keeps us ahead of the curve and ahead of the competition with other news sites. We need to be leaders in our field, and Tableau helps us do that,” said Cheryl Phillips, Seattle Times data enterprise editor.
Data journalism lives, breathes data
One specific way in which The Seattle Times is using Tableau’s data analysis tool is to visualize bike accident sites from 2007 to 2012 on a map, using data to enhance the story’s written text. We all consume information in different ways, and with more data analysis tools we can provide the range of media needed to effectively tell a story.
“Readers could dig down into the corners and intersections and the streets where they lived and rode their bikes,” said Justin Mayo, the paper’s computer-assisted reporting specialist, in an interview with Tableau. “It gave them much more detail than we would have ever given them in the paper, with just a static map.”
Baking in data analysis and visualization
Many in the big data community recognize the benefits of Tableau’s simplified approach to data analysis and visualization and have incorporated its tool into their own platforms. Hadapt, a provider of a big data analytics solution, recently released version 2.0 of its Adaptive Analytical Platform for Hadoop, featuring Hadapt Interactive Query with built-in integration for Tableau Software. The combination of Tableau and Hadapt’s Interactive Query delivers access to advanced analytics on Hadoop via SQL at the petabyte scale.
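The significance of SQL access is that a BI tool can issue ordinary aggregate queries against Hadoop-resident data. As a rough sketch of that pattern, the snippet below uses Python’s built-in sqlite3 as a stand-in for the SQL endpoint; the table and column names are made up for illustration, not Hadapt’s actual schema or API.

```python
# Sketch of the kind of roll-up query a BI dashboard issues over SQL.
# sqlite3 stands in for the SQL-on-Hadoop endpoint; the table and
# column names here are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clicks (campaign TEXT, views INTEGER)")
conn.executemany(
    "INSERT INTO clicks VALUES (?, ?)",
    [("spring", 120), ("spring", 80), ("fall", 50)],
)

# Aggregate views per campaign, the shape of query a dashboard requests.
rows = conn.execute(
    "SELECT campaign, SUM(views) FROM clicks "
    "GROUP BY campaign ORDER BY campaign"
).fetchall()
print(rows)  # [('fall', 50), ('spring', 200)]
```

Because the interface is plain SQL, the visualization layer doesn’t need to know whether the data lives in a warehouse or in Hadoop; that separation is what makes integrations like this one possible.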
Powerhouse Factories, a brand-building agency, recently implemented Tableau Software to power the client analytics portal through which it delivers business intelligence to its clients. In this case, Tableau is used to uncover correlations between marketing, merchandising and operational data to strengthen the consumer experience.
“Tableau breaks down the barriers between all of the marketing and point of sale data sources and delivers business intelligence that hasn’t been possible before,” said Michael Cristiani, Visual Analytics Architect at Powerhouse Factories. “Its flexibility in working with various data sources and ability to publish to our Tableau Server based client portal allows us to deliver meaningful insights right to our clients’ desktop, in real time.”
Data democratization: the new BI
Tableau has certainly struck a chord with market demand, and it’s not the only one. In recent months, just about every analytics and business intelligence vendor has been working on incorporating advanced data visualization modules into its platform. Examples include Microsoft’s Power View, Cloudera Impala, IBM Cognos Insight, SAS Visual Analytics, SAP Visual Intelligence, MicroStrategy Visual Insight and the Oracle Exalytics appliance. For more on the growing importance and remaining challenges of data visualization in business, see our in-depth interview with Venkatesh Rangachari, head of QED Velocity Analytics at Thomson Reuters.
Contributors: Saroj Kar
Kristen Nicole has also contributed to other publications, from TIME Techland to Forbes. Her work has been syndicated across a number of media outlets, including The New York Times, and MSNBC.
Kristen Nicole published her first book, The Twitter Survival Guide, and is currently completing her second book on predictive analytics.