Nothing else quite compares to the annual Tableau Conference. This event has grown from 140 attendees in 2007 to over 14,000 data enthusiasts in 2017. The vibrant Tableau community's passion was palpable, as always. This special group launched #data17donates during the event to give back to the local community and help those affected by the recent tragedy in Las Vegas. For those of you who missed the conference, you can still watch keynotes and enjoy top sessions on the Tableau Conference 2017 Live site. In this article, I will share my perspectives on what I saw, heard, craved, and read between the lines during industry analyst sessions.

Still Growing Despite Aggressive Competition

To be open with you, I was anxious about Tableau coming into this event. I know the data visualization market is brutal right now. I wrote early warnings about Microsoft waking up from hibernation way back in January 2016. In 2016 and 2017, I saw top talent from Qlik, Birst (acquired by Infor), MicroStrategy, ThoughtSpot, and several other business intelligence vendors making leaps into other areas of the analytics market. I knew Qlik would struggle back in 2013 and that Qlik Sense would not restrain Tableau in 2014. In fact, Qlik announced a CEO change last week during Tableau’s event. This market is changing.

Tableau shared that they are still growing and still the gold standard in visual analytics. They now have 1,200+ partners, 55,000+ active user group members, 150,000+ online community members, and 61,000 customer accounts. They won 15,000 new customer accounts in the past year, with more than 50 of them investing over $1 million. Sure, those numbers might pale compared with big tech claims of millions or billions, depending on which vanity metric is being shared: active users, downloads, or shelf-ware.

The “who” in Tableau’s numbers was telling. As I walked around the expo hall, I noted big-name Fortune 500 enterprise accounts, financial institutions, healthcare, energy, government, top consulting firms, and so on. I still think Tableau was in the “right place at the right time” back in 2012, when big traditional vendors were bumbling, not listening, and not caring enough about customers. I give Tableau a ton of credit for leading the way and staying the course.

Let’s Revisit 2016

To fully appreciate what was shown in 2017, I recommend briefly skimming through the three-year sea of announcements from last year. They unveiled plans for Tableau on Linux, the new Hyper engine, hybrid cloud connectivity, Project Maestro for data prep, pre-built analytics templates, recommendations for workbooks, data sources, and contextually relevant insights, alerting, governance, collaboration, new APIs, custom visualizations, and more.

Delivered in 2017

Throughout 2017, Tableau released four incremental updates with features from some of those announcements, 18 new connectors, TabPy for Python integration, a preview of Tableau Bridge for hybrid cloud connectivity, and countless smaller updates that I have previously covered. They acquired ClearGraph for natural language query and moved to a subscription pricing model.
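Since TabPy came up, here is a minimal sketch of what the Python integration looks like in practice, based on the TabPy client documentation: you deploy a Python function to a running TabPy server and then call it from a Tableau calculated field with one of the SCRIPT_* functions. The pct_change endpoint and function below are my own illustrative examples, not something Tableau shipped.

```python
# Minimal TabPy sketch. Assumes a TabPy server is running on localhost:9004
# and the tabpy_client package is installed. The 'pct_change' endpoint and
# function are illustrative examples, not part of any Tableau release.
import tabpy_client

def pct_change(values):
    # Percent change from the first to the last value in the list Tableau
    # passes in, repeated so the result length matches the input length.
    first, last = values[0], values[-1]
    change = (last - first) / first * 100.0
    return [change] * len(values)

client = tabpy_client.Client('http://localhost:9004/')
client.deploy('pct_change', pct_change, 'Percent change from first to last value')

# In Tableau, a calculated field could then query the endpoint, for example:
# SCRIPT_REAL("return tabpy.query('pct_change', _arg1)['response']", SUM([Sales]))
```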

Viz in Tooltip

 

Shown on Stage Last Week

Keep in mind that I always have extremely grand expectations for Tableau to lead the way forward, innovate and be creative. I am also incredibly difficult to impress. I slightly disagreed with the opening keynote myth about artificial intelligence and automation. I do believe there is value in expediting analysis with augmented analytics to find and continuously monitor the metrics that matter most.

Artificial intelligence will take on heavy lifting in the future

I agree that analysts will not be replaced – I think they will be more productive and make a bigger impact with better tools. Tableau did sprinkle in a note that “artificial intelligence will take on heavy lifting to free you to use your human intuition in the future”. That statement did not go unnoticed as it aligns with my forecast on where our industry is inevitably heading.

Linux and Hyper

Since Tableau shared a three year roadmap last year, what I saw on stage this year disappointed me until I “read between the lines”. It was only after talking to Tableau’s product team leadership that I fully understood the sheer magnitude of the full port to Linux (not just emulation) and Hyper integration work that has been accomplished. These platform milestone achievements set Tableau up nicely for faster feature releases going forward.

To put the platform updates in perspective: when other BI vendors have had to re-platform, whether from Silverlight or Flash to HTML5 or from one product generation to the next, as with Qlik (QlikView to Qlik Sense) and Power BI (Excel/SharePoint, Silverlight, Office 365, stand-alone), it has usually taken several years of work and forced customers into not-so-easy, inelegant migrations.

Linux

For Tableau to just keep working after a simple in-place upgrade to the new Hyper data engine, or to let customers spin up Tableau Server on a Linux box, is impressive. Hyper is already running at full scale on Tableau Public – workbook authors did not have to republish. Allan Folting alluded to those points on stage, but I’m not sure the visualization-loving audience recognized the true beauty of the back-end engine integration.

Tableau 10.5 Beta


Hyper is ready for testing in the Tableau 10.5 beta that is now available. I can verify that Hyper is dramatically faster than Tableau TDE extracts. I did hands-on performance testing earlier this year, and it was effortless to upgrade my TDEs to the Hyper format. Loading data and rendering Tableau workbooks against larger data volumes, from 100 million to several billion records, was where I noted obvious speed improvements. This huge enhancement will be warmly welcomed by existing customers that use Tableau TDEs right now. On the other hand, database vendors that leaned on “we make Tableau faster” sales pitches will need to update their field sales playbooks and messaging.

Project Maestro

I finally got to see the alpha, so to speak, of Project Maestro. It reminded me a little of Trifacta, Paxata, and the data prep already available within TIBCO Spotfire. I did like the visual user experience and flow of data profiling, cleansing, grouping, and the “SMART” design – SMART meaning Project Maestro intelligently applies the data prep operations in the pipeline. If inefficient data prep steps are created, Project Maestro is designed to check and optimize SQL statements.
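Tableau has not published Maestro's internals, but to make those operations concrete, here is a rough pandas analogy of the profile, clean, and group flow the demo walked through. The file, column names, and cleanup rules are invented for illustration; Maestro itself is designed to push optimized queries to the source rather than run anything like pandas.

```python
# Rough pandas analogy of a profile -> clean -> group data prep flow.
# File, column names, and cleanup rules are invented for illustration only.
import pandas as pd

orders = pd.read_csv("orders.csv")   # hypothetical source extract

# 1) Profile: inspect value distributions to spot dirty or inconsistent data.
print(orders["state"].value_counts(dropna=False).head(20))

# 2) Clean: standardize casing and collapse common variants into one group.
orders["state"] = orders["state"].str.strip().str.upper()
orders["state"] = orders["state"].replace({"WASH.": "WA", "WASHINGTON": "WA"})

# 3) Group: aggregate to the level of detail the downstream workbook needs.
by_state = orders.groupby(["state", "order_year"], as_index=False)["sales"].sum()

# Hand the shaped table off to Tableau (Maestro would output a Tableau data source).
by_state.to_csv("orders_by_state.csv", index=False)
```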

Project Maestro

One question that came up was, “Will Project Maestro take share from Alteryx, Datawatch, and other data prep vendors?” From what I have seen so far, Project Maestro looks like a basic-level offering that only feeds data into Tableau. If you need deep data prep capabilities or output destinations outside of Tableau, you’ll still need another data prep tool.

The key unanswered questions for me: 1) When will Project Maestro be available? (It should be in public beta in early 2018.) 2) Will Project Maestro ever be a stand-alone offering? 3) How much will Project Maestro cost, if an extra license is needed?

New Extensibility API

By far my favorite announcement this year was a new Extensibility API that allows third-party solutions to add read and write capabilities to Tableau. Tableau demonstrated write-back, data catalog integration, NLG results from Automated Insights and powerful prescriptive analytics from a wonderful client of mine, Frontline Systems, to illustrate common use cases.

Sales Trip Planner

In my opinion, the Extensibility API opens up a world of possibilities for augmented analytics, machine learning, statistics, advanced analytics, workflow and other types of apps to integrate directly within Tableau. It allows customers to mix and match the best tools. It also offers choice and flexibility.

Tableau’s core strengths do include choice and flexibility for cross-platform analytics

When it comes to analytics, you should not have to compromise or be limited by “good enough”. Analytics is a strategic investment. You want a unique competitive analytics advantage versus doing the same thing with the same tools everyone else is using.

Mission Critical Analytics Platform

Back in 2014, I dinged Tableau on governance. I was not alone. Apparently, they got the memo. Tableau has made strides in improving version control, administrative utilities, dashboards, auditing, granular controls, certified data sources, smart recommendations, and a myriad of other governance-related features.

Tableau versions

This year we saw a fantastic new capability to save and publish Tableau content to different Tableau versions. That helps enterprises maintain the heterogeneous environments that are unavoidable in the real world. We also saw Tableau Server Manager in the browser, a live scale-up demo for Tableau Server admins, data source certification, lineage, a new data source page that shows where data sources are being used, recommended data sources, and nested projects.

Lots of Little Stuff

The highly anticipated Devs on Stage keynote this year was not at all what I expected. There were five sections jammed with approximately 40 new features. Unlike the shock and awe of big surprises last year, this year I saw a lot of little things. I wanted to see augmented analytics, baked-in advanced analytics, real-time streaming, and so on. However, the Tableau audience seemed pleased with the enhancements shown, including the following:

  • Nested sub-group sorting
  • Viz in Tooltip with an easy insert-sheet option
  • Formatting grid alignment
  • Keyboard nudging
  • Spatial database object support, including joins with intersects
  • Gorgeous new heatmap and density mark type
  • Pre-built content for popular data sources

My surprise this year: I didn’t see anything showcased for Tableau Online!

The Bottom Line

All in all, I am happy that Tableau has retained a strong community and an extensive worldwide enterprise customer base. They demonstrated that they accomplished what they set out to do last year and positioned themselves for future success. Now that other analytics ecosystem vendors can extend their platform, it is going to be fun to see what happens next.

What awesome combinations can finally come together?

I left Tableau Conference 2017 wondering if unveiling the three-year product roadmap last year was the right thing for them to do. Then again, they might have needed to do that to continue to win in the face of extremely aggressive competition while they were working on the back-end platform. Now I am wondering if Tableau is holding back – becoming more secretive again with innovation. I did notice NDA labs during the event. Tableau used to refuse to provide any future roadmap insights. If they do show too much, the company over the bridge and other vendors will copy it quickly.

I’m wondering if Tableau is becoming more secretive again with innovation

What do you think? Did Tableau show just enough to make it look like they are not being secretive, while actually returning to their old secretive ways to survive and thrive?