UPDATED 08:15 EDT / OCTOBER 30, 2012

NEWS

Tableau Software on Big Data Visualization

Seattle-based Tableau Software is a data visualization company making big waves in the world of Big Data.  Dan Jewett, vice president of product management at Tableau Software, shared the company’s history and Big Data story on theCUBE at Strata + Hadoop World 2012.  The company was founded back in 2003 as a technology spin-off of a Stanford University Ph.D. project funded by the Department of Defense.  Pat Hanrahan, the famed Pixar engineer, had been approached by the DoD to help visualize its enormous data environments.  Today the Tableau product provides insightful visualization based on queries across relational databases, unstructured data, and Big Data platforms.

Tableau’s mission is to help people see and understand data.  New announcements herald the arrival of new connectors and an extended footprint in the world of Hadoop and Big Data.  With a year-long integration with MapR and Cloudera already in its story, that reach is expanding further: support for Cloudera’s Impala adds to a growing set of tested, supported connections that includes Hadapt, Hortonworks, DataStax, and Cassandra.  Tableau maintains that it is not a visualization company but an analytics company, focused on helping individuals understand their data.  One of the biggest advantages in working with Big Data is the emphasis on making sense of the data, not on visualization alone but on the insight it delivers.  Visual paradigms make data stories come to life.
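
To make the connector idea concrete, here is a minimal sketch of what a live query against Cloudera’s Impala looks like from the open-source impyla Python client.  This is an illustration, not Tableau’s implementation, and the host, port, and table name are hypothetical.

    from impala.dbapi import connect  # pip install impyla

    # Connect to an Impala daemon; host and port here are illustrative placeholders.
    conn = connect(host="impala-daemon.example.com", port=21050)
    cur = conn.cursor()

    # A live aggregate query pushed down to the cluster, the kind a BI connector issues.
    cur.execute("SELECT product, COUNT(*) AS hits FROM web_logs GROUP BY product")
    for product, hits in cur.fetchall():
        print(product, hits)

    cur.close()
    conn.close()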

Data is useless if it cannot be understood, and a gap often forms between the data collected and the insight drawn from it; those are lost opportunities to gain a data advantage.  That is where visualization becomes so important, and the use cases are all over the map: a major retailer that has standardized on Tableau, leading-edge medical research, even high-school project work.  The same tool applies in all these cases, only scaled.

Tableau predates Hadoop, and its adaptation to that world started from the premise of connecting to data: running live database queries, interacting with the results, and leveraging the database itself.  In the world of Big Data, latency and scale can mean the experience is no longer interactive, so some years back Tableau introduced an in-memory analytics engine.  The combined ability to run queries against a live data source, snapshot the data into the memory engine, and accelerate analytics alongside those queries creates a seamless transition between pulling meaningful data straight from the raw source and plugging it into memory to supercharge performance.
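
The live-versus-in-memory trade-off can be sketched in a few lines.  Tableau’s engine is proprietary, so the Python below uses sqlite3 purely as a stand-in: “live” mode round-trips every query to the source, while “extract” mode snapshots the source into memory once and answers later queries at memory speed.  The file, table, and column names are hypothetical.

    import sqlite3

    SOURCE = "warehouse.db"  # hypothetical on-disk source, standing in for a remote back end

    def live_query(sql):
        # Live mode: every query goes back to the source -- always current,
        # but each interaction pays the source's full latency.
        with sqlite3.connect(SOURCE) as src:
            return src.execute(sql).fetchall()

    def snapshot_to_memory():
        # Extract mode: copy the source into an in-memory engine once;
        # subsequent queries run against the snapshot at memory speed.
        mem = sqlite3.connect(":memory:")
        with sqlite3.connect(SOURCE) as src:
            src.backup(mem)  # requires Python 3.7+ (sqlite3 online backup API)
        return mem

    # One live round-trip to the source...
    print(live_query("SELECT region, SUM(sales) FROM orders GROUP BY region"))

    # ...versus many fast queries against a one-time snapshot.
    mem = snapshot_to_memory()
    print(mem.execute("SELECT COUNT(*) FROM orders").fetchone())
    mem.close()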

Tableau has a long-term vision and envisions a pervasive trajectory.  To support it, the company has taken its product online by launching Tableau Public, a visualization tool free to the world where data can be uploaded and interacted with immediately.  Tableau continues to lead in this world of Big Data and suggests that the next unanswered question in Big Data requires an iterative discovery process: data will tell you a story, and with the right tools the story and the next questions will present themselves.

