Tableau Conference Like No Other

In part this was because the vast majority of attendees (probably 70 percent or more) were end-users: business, financial, and other professional data analysts, rather than CIOs or others attending primarily to gather information on product upgrades, add-ons, and the vendor's direction. For them this was an educational conference, and the breakout sessions included classes on advanced analytics, visualizations, and similar professional subjects.

At the same time, they believe that they can change the world and that Tableau is the tool they need to do it. This is clearly the era of data, when a political statistical analyst like Tuesday keynote speaker Nate Silver can become a TV celebrity after correctly predicting the 2008 U.S. presidential results, and analysts are finally getting the cachet and tools they need to apply data to problems that once got "best guess," "seat of the pants" answers.

Even so, this was the only conference I have attended where, during the sessions, absolutely no one was on the vendor floor except the people staffing the booths (and even some of those were empty) and the occasional member of the hotel staff. Everyone else was in a session somewhere. The attendees were universally excited, high-energy, and very focused.

Direction Direct from the Users

It was also unusual in that Tableau brought 700 of its employees (virtually its entire development staff, plus its senior management and co-founders) across the country to attend the sessions. One developer told me that these annual conferences are very important to the development team as a major opportunity to talk to users and learn not just what they like about the present version but what they want in the next one. This, he said, is where many of the developers get their project lists for the coming year.

The result is that Tableau's direction is set largely by its users, so rather than users coming to the conference to learn where the company and product are going, Tableau's staff comes to learn where they will be going. That direct link is a huge advantage in the marketplace. While older competitors like Cognos have to guess or rely on focus groups to decide what features to add, Tableau, because it was designed from the beginning to be used by non-IT end-user analysts, has this direct participation by its users.

The Attraction for CIOs

Another very interesting aspect of the conference is that the rest of the attendees were high-level IT executives from user companies. As a journalist who entered the IT industry in 1981, on the wave of the PC revolution, I clearly remember the intense internal political battles in companies where end-users brought in their own first-generation Apple IIs, TRS-80s, and IBM PCs, while IT first denigrated them as toys and then did everything possible to ban them from the company. In contrast, today's CIOs, many of whom are introduced to Tableau by the growing population of end-users in their offices already using the desktop version, often embrace it and move the company to the enterprise version. Today, a Tableau salesperson told me, a growing number of sales are at the enterprise level, and in some cases this is how Tableau enters a company. That is a reversal of Tableau's original "land and expand" strategy: capture a foothold of a few users in an organization and then encourage them to introduce the tool to their co-workers.

So why do CIOs embrace this? First, it keeps the data where it belongs: in the data center, where it is most secure. In fact, because Tableau can import data from spreadsheets as well as from a wide range of larger data sources, it can be a tool for moving important company data back into the data center, where it can be backed up and otherwise secured.

Second, it is an obvious major step forward for users, who can create their visualizations in minutes and then interactively edit, change, and redo them until they have the information they want. This has an immediate, obvious impact on the company's competitive position, making it more agile than competitors who rely on canned reports that often do not answer the relevant question in the right way.

Third, it frees IT staff from the constant demand for programming support to create analyses on the older generation of data analysis systems. Given that most IT shops today are understaffed, those staffers can be reassigned to other pressing needs.

Tableau Partners

CIOs aren't the only part of the IT community adopting Tableau. The conference vendor floor held some surprises, including data analytics vendors Rapid Insight, Lavastorm, and Alteryx, which at first glance would seem to be competitors but which all use Tableau as an end-user front-end for their systems. In a demo, an Alteryx spokesperson explained why this is the right strategy. Putting up a sample chart of business data, he said, "We used to show a chart like this in a product demo and the customer would say, 'That's nice, but can we drill into this section of the data and do this conversion and then create this visualization?' We would have to say, 'No.' Now we just pull up Tableau."

Teradata and NoSQL database vendor MarkLogic, both of which provide Tableau as a front-end analysis tool, were there, showing how you can run Tableau directly against their databases of any size. MarkLogic, by the way, was the database powering the BBC's Olympics website, where it was used to capture results from events in real time and deliver them to end-users. At its height that site carried half the Web traffic across all of Great Britain, and the database did not break. It is also the database the U.S. federal government is using to power the new national Healthcare Insurance Exchange. HP and IBM were also there, talking about how Tableau can run on their large servers, which provide the processing power to handle those large data sets.

One of the most interesting Tableau partners at the conference was Cirro, a startup that extends Tableau to work on very large databases in situ, including Big Data on the Web, following the Big Data pattern of bringing the processing to the data. This can be important because it is impractical to move very large amounts of data across the network. Another approach is to do the initial data sorting in situ and then import only the data that is of interest. For instance, very few companies really want to look at everything on Facebook. Most are interested only in individuals who fit their customer profile. This is the approach DataSift takes: it subsets data from social media to meet the needs of the individual user or analysis, taming the Big Data beast.
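The subsetting pattern described above can be sketched in a few lines of Python. This is only an illustration, not any vendor's actual API: the records, the customer profile, and the function names are all hypothetical, and in a real deployment the filtering would run next to the data source so that only the matching subset ever crosses the network.

```python
# Illustrative sketch of the "filter in situ, import only the subset" pattern.
# All names and data here are hypothetical, not a real vendor API.

def matches_profile(record, profile):
    """Return True if a record matches every field of the customer profile."""
    return all(record.get(key) == value for key, value in profile.items())

def subset_in_situ(source, profile):
    """Simulate remote-side filtering: only matching records 'cross the network'."""
    return [record for record in source if matches_profile(record, profile)]

# A tiny stand-in for a very large remote data set (e.g. social media records).
remote_records = [
    {"user": "a", "country": "US", "interest": "analytics"},
    {"user": "b", "country": "DE", "interest": "gaming"},
    {"user": "c", "country": "US", "interest": "analytics"},
]

# Only the records that fit the profile are imported for local analysis.
local_import = subset_in_situ(remote_records, {"country": "US", "interest": "analytics"})
```

The point of the design is that the full data set never moves; only the small, profile-matched subset is imported into the analysis tool.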

Of course other, more traditional companies, including several ETL vendors that can now use Tableau as a front-end, were also at the conference. Better-known names like Informatica continue to provide real value by combining structured data from multiple internal sources, and Tableau works well on top of them. But they are no longer the leading edge, and by themselves they are not sufficient for handling Big Data, in particular unstructured or semi-structured data from outside the company.

It is a cliché to say that we are moving into a new world in IT. But if data is the heart of the business, then the Tableau Conference provided a snapshot of how that world works and how end-users are driving the next generation of enterprise as well as consumer applications. For better or worse, this is the new world.