And so it begins… Tableau enters the Salesforce era. This year, Tableau Conference, the world’s largest gathering of data enthusiasts, continued to grow with over 18,000 attendees joining the party in Las Vegas. Unlike past years, when I got the inside scoop directly from Tableau’s product team in analyst briefings, this year I’ll share my perspective from the vendor sidelines.

Data as a Translation of Life

This year Tableau opened with an emotional keynote that brought a new perspective to the same data culture message they have been evangelizing for a decade. Well-known community leader and former Zen Master, Anya A’Hearn, was magnificent in helping us think of data as a translation of our voice and our life. Her beautiful section was followed by a moving story about the Code Breakers of World War II. If you work with data, I urge you to watch this keynote. Start at 22:22 if you don’t care about Tableau’s growth stats.

Mark Animations = My Favorite New Feature

To open the Devs on Stage keynote, we saw Tableau data artistry elevated to the next level using Mark Animations with different chart types. Pay attention. It is little things like this one that will make all the difference in deciding who wins or loses in commoditized markets. My only wow new feature from Tableau this year was literally Mark Animations.

Mark Animations

I bet that one addition alone will sell a heck of a lot more Tableau! I used to buy third-party controls like Fusion Charts to get animations in important POCs when I was a sales engineer selling PerformancePoint in SharePoint. People are emotional creatures. Sizzle sells.

The related page player and web authoring in Tableau Public were also exciting little things that will make a big difference. Dashboards with animations do get noticed, can win hearts and minds, close deals, influence lawmakers, and so on. I experienced it before with the old Excel Power Map, Power View and SandDance too.

Scaling Analytics to the Enterprise

During the opening keynote, Tableau shared findings from West Monroe Partners research that only 24% of companies say they can readily access their data. Per McKinsey & Company, only 8% of organizations succeed in achieving analytics at scale. For my peers reading this article, we know these stats are unacceptable. Data catalogs and analytics solutions are so simple and low cost to scale today. Why is this still a problem? Déjà vu.

Tableau Blueprint

According to Tableau, the answer is creating a data culture with its Tableau Blueprint methodology. Having spent 15 years in data warehousing and analytics implementation consulting, the Blueprint approach looks familiar. Best practices that cover people, process and technology for rolling out analytics and improving enterprise adoption …there is nothing novel about it. However, Blueprint sounds better than enterprise analytics roadmap or dare I say it… self-service BI governance. (Notably, I hired Melissa Coates to write my most important, “must-have” white paper for Power BI adoption in early 2016 – it covered enterprise deployment governance. I knew organizations needed that guidance to scale analytics successfully.)

What might be new per se for Tableau is how they elegantly unified the process of finding curated data sources to managing and using them in visual analysis in a governed manner. I also genuinely appreciated the Tableau server content migration capabilities for common development, test and production life-cycle management tasks. Tableau also added customizable branding (another big, little thing), and improved licensing management with server sign-in options.

Tableau Migration

Data Catalog and Data Prep

This year, we saw key enhancements to Tableau’s data catalog that included a wonderful, warmly welcomed lineage capability. Notably, Tableau highlighted to the audience that their baked-in data catalog required no manual setup – unlike existing partner data catalog solutions. The tone seemed to change since Tableau Europe in June 2019, where partner catalogs were also featured in the keynote.

Tableau Data Catalog Lineage

I expect Tableau’s messaging to continue to change as they expand further into more areas of data management and advanced analytics that previously were partner-led capabilities. It will be interesting to see if Tableau keeps nurturing best-in-class, niche vendor partnerships as they merge into Salesforce’s ecosystem. Unlike previous Tableau keynotes, we did not see any niche partners highlighted. Only mammoth cloud vendors – Amazon AWS, Microsoft Azure, Google Cloud, and Alibaba Cloud – made it into the keynotes.

Unlike previous Tableau keynotes, we did not see any niche partners highlighted.

After the cool data catalog demos, we saw a much more mature Tableau Prep that added Conductor for sharing, incremental data refresh, write-to-database and rich browser-based authoring functionality. During the Devs on Stage keynote, Tableau walked through improvements to wrangling data using functions such as fixed level of detail lists and rankings. They also showcased fantastic reusable steps. With reusable steps, you can share the time-consuming, tedious data preparation work you have already done so others can reuse it with drag-and-drop ease.
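For readers less familiar with fixed level-of-detail calculations, here is a minimal sketch of the idea in pandas. The column names and data are my own illustrative assumptions, not from Tableau’s demo: a Tableau expression like `{ FIXED [customer] : SUM([amount]) }` pins an aggregation to a chosen level regardless of the view’s row-level detail, which roughly corresponds to a groupby transform.

```python
import pandas as pd

# Hypothetical order-level data; column names are assumptions for illustration.
sales = pd.DataFrame({
    "customer": ["A", "A", "B", "B", "B"],
    "order_id": [1, 2, 3, 4, 5],
    "amount":   [100, 50, 200, 25, 75],
})

# A fixed LOD such as { FIXED [customer] : SUM([amount]) } computes the
# customer-level total but keeps it available on every order row.
# The pandas analogue is a groupby transform:
sales["customer_total"] = sales.groupby("customer")["amount"].transform("sum")

print(sales)
```

Each order row now carries its customer’s total (150 for A, 300 for B), which is exactly what makes fixed LOD results easy to reuse in rankings and filters downstream.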

Tableau Reusable Steps

Ask Data and Explain Data

Moving on, we saw interactivity, calculations and time-based calculations added to Tableau’s Ask Data feature for historical analysis. Then Explain Data, an engine that uses machine learning to automate statistical insights, was questionably described as a “data scientist on demand”. Gotta love marketing.

Explain Data is more like a “statistician on demand” that answers historical why questions.

Don’t get confused and expect predictions from Explain Data. To clarify, Explain Data is more like a “statistician on demand” that answers historical why questions. Most data scientists I know are using machine learning to build models to predict future outcomes and offer prescriptive guidance on what to do next.

In the illustration above, I’ve highlighted where I believe Explain Data fits in the analytics maturity curve. Explain Data helps you understand the past. It is the Empirical Systems acquisition functionality baked into Tableau.

The Explain Data implementation I saw is simple, lovely and provides supportive natural language explanations. With this progression further up the analytics maturity curve, Tableau will teach the masses to start using advanced analytics widely across their organizations. That is a good thing.

Tableau Explain Data

What I didn’t see or hear about was any hint of Salesforce Einstein integration that was mentioned during Tableau Europe 2019’s keynote. However, I do expect that to come – but when? Since Power BI, Qlik, TIBCO Spotfire, Sisense, YellowFin, ThoughtSpot, Tellius, and other visualization vendors already have automated insights and some predictive analytics capabilities, both Ask Data and Explain Data are late to market. In May 2016, I shared research that revealed if you had not budgeted to add predictive and prescriptive analytics into your analytics arsenal, you were already lagging behind.

Better Late than Never

Additional catch-up features this year included the long-desired Dynamic Parameters (took ~5 years), set controls, geospatial buffer calculations, and PDF subscriptions with scheduling and report bursting. We had some of those things all the way back in my dinosaur reporting days in Crystal Reports, Microsoft Reporting Services, PerformancePoint, and other enterprise BI platforms. Nevertheless, the news delighted Tableau fans.

Tableau Dynamic Parameters

In another highly anticipated demo, Tableau unveiled richer data modeling and smart join capabilities playfully referred to as Noodles. Noodles automatically detect the correct granularity of joins and support multiple fact tables and many-to-many relationships. Those key enhancements simplify working with complex enterprise data scenarios. My dimensional modeling peers likely cheered with joy seeing that wish finally evolve into reality. I know I did.
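To see why join granularity matters, consider this small sketch in pandas. The tables and column names are hypothetical, invented only to illustrate the problem Noodles-style relationships address: when two fact tables sit at different granularities, a naive row-level join duplicates measures and inflates totals.

```python
import pandas as pd

# Two hypothetical fact tables at different granularities.
orders = pd.DataFrame({
    "region": ["East", "West"],
    "sales":  [1000, 2000],
})
targets = pd.DataFrame({
    "region": ["East", "East", "West"],   # two target rows for East
    "target": [400, 500, 1800],
})

# A naive physical join repeats the East sales row once per matching
# target row, so SUM(sales) is inflated from 3000 to 4000:
joined = orders.merge(targets, on="region")
naive_total = joined["sales"].sum()

# Relationship-style semantics aggregate each fact table at its own
# granularity before combining, keeping the measure correct:
correct_total = orders["sales"].sum()

print(naive_total, correct_total)
```

Detecting the right granularity automatically, as the demo showed, spares analysts from building these pre-aggregations by hand for every mixed-grain model.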

Tableau Noodles

My Top Takeaways in 2019

This year we heard the same messages that we have always heard from Tableau. Nothing much has changed yet. What will change and when? I don’t know. From earlier conversations with Tableau leaders, I learned this acquisition will be treated similarly to the MuleSoft one.

From what I can see from the vendor sidelines, Tableau’s impeccable user experience design lives on with bright, diverse product team talent leading the way forward. Tableau seems to be focused on maturing offerings in practical ways to better serve enterprise customers. What was shown this year makes sense – Tableau continues to be customer-focused. APIs and partners got neglected. A surprise for me was seeing Tableau embrace animation. I’m not sure if anyone asked them for it but we do like it.

All in all, this year felt like the competitor catch-up marathon that started several years ago will be ongoing. Nothing shown publicly would leapfrog Tableau ahead on the analytics maturity curve. Next year might be another story with the joint force of Salesforce in the mix. Time will tell. Regardless, Tableau’s community growth and love within the analytics market persists.

Additional Conference Resources: