Over 13,000 data enthusiasts gathered this past week at a sold-out Tableau Conference 2016. After facing a much more intensely competitive market this year, Tableau openly shared a vision for the future analytical flow of data. This year they surprised me. As the event wraps up, I have been pondering the announcements, industry analyst briefings, staff, customer, and partner conversations, and the overall experience. In this article, I’ll share the key announcements, interesting tidbits, and my top event takeaways.
If you were not able to attend, you can watch the Tableau Conference keynotes, view sessions, and download content by registering for the Virtual Conference. It is a fabulous free resource!
Before I dive into the announcements, findings, predictions, and overall vibe…
I want to express a heartfelt thank you.
Leaving the Tableau Conference, I felt incredibly grateful and fortunate. I am genuinely touched by the warm reception from Tableau’s product team, former Tableau peers, industry analysts, and awesome blog readers. It is always wonderful to connect with you personally. For those of you whom I could not find in the two split expo halls on Wednesday, let’s try to meet up via a conference call.
Tableau’s actions speak louder than words. Regardless of where I worked, Tableau has always been good to me. They understood when I left abruptly in 2013 due to non-compete pressure and remained supportive even when I re-joined Microsoft in late 2014. Those actions speak volumes about the unique culture of Tableau. Thank you.
Ok, let’s get started. Tableau ventured into unanticipated terrain by showcasing a glimpse of a three-year product roadmap. Historically, you could not get much insight into development planning from them. Even enterprise customers that receive multi-year product roadmap briefings from mega-vendors with “anything can change” disclaimers would not hear much from Tableau. Tableau’s decision to willingly share planning information is a warmly welcomed, positive change.
Tableau loves Linux
A surprise for me was the news of Tableau Server for Linux. Linux is popular with governments, educational institutions, and businesses of every size, and Linux servers in the cloud are much cheaper than Windows servers. Tableau Server for Linux will be available in 2017.
New Hyper Engine
Predictably, the new Hyper-powered engine, acquired in March 2016, was unveiled. Hyper delivers:
- A faster, next-generation data and federation engine
- Improved columnar compression plus handling of different types of data, such as structured, unstructured, and streaming
- High-speed IoT data ingestion for near real-time analysis
- Enhanced data integration, data transformation, and data blending capabilities
- Advanced analytics with features for k-means clustering and window functions
- Unified analytical (OLAP) and transactional systems
Hyper will be available in 2017. Notably, customers will need no changes to Tableau and no new hardware to enjoy this major enhancement.
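To make the advanced analytics bullet concrete: k-means clustering groups rows around k centroids by repeatedly assigning each point to its nearest centroid and then recomputing the centroids. Here is a minimal, dependency-free Python sketch of the idea (illustrative only, not Hyper’s implementation):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means on 2-D points: assign each point to its
    nearest centroid, recompute centroids, repeat."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Nearest centroid by squared Euclidean distance.
            i = min(range(k),
                    key=lambda c: (p[0] - centroids[c][0]) ** 2 +
                                  (p[1] - centroids[c][1]) ** 2)
            clusters[i].append(p)
        # New centroid = mean of its cluster (keep old one if empty).
        centroids = [
            (sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
            if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# Two obvious blobs: points near (0, 0) and near (10, 10).
pts = [(0, 0), (1, 0), (0, 1), (10, 10), (11, 10), (10, 11)]
centroids, clusters = kmeans(pts, k=2)
```

The point of an engine like Hyper is to run this kind of computation close to the data; the sketch just shows the loop an analyst would otherwise hand-roll.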
Live Cloud Hybrid Data Connectivity
A key new capability for Tableau Cloud adoption is a new “live query agent” for on-premises data sources. The live query agent acts as a secure tunnel to on-premises data. Tableau already offers live query capabilities for cloud data sources. In a hybrid data world, the ability to work with data anywhere is becoming more important. A couple of cloud BI competitors already had this feature. The live query agent will be available in 2017.
Data Prep Surprise
I was initially stunned by news of a new stand-alone Tableau data prep product, codenamed Project Maestro. I have to imagine Datawatch, Alteryx, Paxata, Trifacta, Talend, and other niche data prep players that have appreciated a joint go-to-market approach will revisit their longer-term strategies. Speaking with several data prep vendors at their expo booths, I learned that Tableau personally called them prior to this announcement.
Stepping back and thinking more about Project Maestro, it makes sense to me, but I don’t know how many non-technical users will mash up data. We are already seeing a peak in self-service BI adoption growth rates. This past summer, I tested a plethora of niche data prep tools along with my extended subject matter expert team. Non-technical testers failed basic exercises with existing tools; my team found that current offerings best serve a power user or BI professional persona. If Tableau can do for data prep what it has done for data visualization, Project Maestro will be a winner, but they will likely have to migrate existing users to take share.
Why not just buy an existing data prep solution? My speculation is that Tableau wants to reinvent the data prep space, not just play a “me too” game. It can also be challenging to combine different code bases.
Why not just build data prep into Tableau like other data visualization vendors? When you build data prep into a data visualization tool, you limit your users, use cases, and overall market potential. Expanding Tableau’s product portfolio and market potential is a good thing for investors.
Templates and Instant Analytics
Another change of heart for Tableau was the addition of instant analytics. In past Tableau keynotes, we heard automation dismissed. I told you to embrace aspects of analytics automation! I love the potential of intelligent analytics automation pioneered several years ago by IBM Watson Analytics, BeyondCore, TIBCO Spotfire Recommendations, and KXEN/SAP Infinite Insight.
Tableau demonstrated new solution templates that span more than one data source, along with automatic contextual insights. Granted, you could use Tableau workbooks as templates today; sometimes you need to play the game, add a label for marketing, and then take it one step further! This capability will be progressively released throughout 2017 and beyond.
Recommendations and Alerting
One of my favorite announcements was the new smart recommendations. Tableau showed a sneak peek demo of a machine-learning-powered recommendation engine. Algorithms will surface recommendations for workbooks, data sources, and contextually relevant insights tailored to individual workflows. To my knowledge, BeyondCore is the only player offering intelligent recommendations today; if you know of another vendor, please let me know. Recommendations will be available in 2017.
AND…Data-driven alerting was finally added!
I have been nagging Tableau about needing data-driven alerting for three years. It is one of those must-have capabilities in my opinion. Data-driven alerting allows you to define thresholds, proactively manage by exception, and let the system continuously monitor a zillion metrics. The demo included thresholds that can be set on a visual or metric, along with a customized email subject.
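Conceptually, this kind of alerting boils down to evaluating saved thresholds against the latest metric values and firing only on exceptions. A small Python sketch of the pattern (the names and structure here are hypothetical illustrations, not Tableau’s API):

```python
from dataclasses import dataclass

@dataclass
class Alert:
    metric: str       # metric or visual the threshold is set on
    threshold: float  # fire when the latest value exceeds this
    subject: str      # customized email subject, as in the demo

def check_alerts(latest_values, alerts):
    """Manage by exception: return only the alerts whose metric has
    crossed its threshold, instead of eyeballing every dashboard."""
    triggered = []
    for a in alerts:
        value = latest_values.get(a.metric)
        if value is not None and value > a.threshold:
            triggered.append((a.subject, value))
    return triggered

alerts = [
    Alert("daily_error_rate", 0.05, "Error rate above 5%"),
    Alert("open_tickets", 100, "Support backlog over 100"),
]
print(check_alerts({"daily_error_rate": 0.07, "open_tickets": 42}, alerts))
# → [('Error rate above 5%', 0.07)]
```

In a real deployment the system, not the user, runs this check continuously and sends the notification email.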
Natural Language Query
Not surprising to me was the move to add natural language query. Natural language was a popular Power BI v1 feature in 2014, with some customers using only that feature. ThoughtSpot has also entered the market, providing a Google-like natural language query interface for analytics. We have also seen the emergence of speech assistants such as Google Assistant, Amazon Alexa, and Cortana, along with natural language vendors such as Automated Insights, Narrative Science, and Yseop.
One difference in approach that I noticed was that Tableau’s demo incorporated visual properties into these queries. I have not seen that unique query design anywhere else.
Enterprise Governance
They listened and now are surpassing my expectations in this area.
Enterprises will soon get new capabilities in Tableau Server to certify data sources, easily conduct impact analysis on sources and workbooks, promote content, and create workflows with simple drag-and-drop gestures.
These capabilities will be available in 2017 and beyond. The Tableau Server improvements shown surpass what is available in the market today. With these improvements plus new metadata features with optional API integrations with data catalogs, Tableau has gone from getting picked on to leading the pack.
Collaboration
Another unsurprising addition is unified collaboration with related visualizations and email notifications. We have seen collaboration in other offerings for years. It is a valuable improvement for adding context to dashboards. Starting in 2017, users will be able to collaborate and discuss insights directly within an analysis to drive better business outcomes.
API and Custom Visualizations
API enhancements for better embedded analytics and new support for third-party custom visualizations such as D3.js were also shown in the popular Devs on Stage keynote.
Other New, Wild and Exciting Updates
In addition to the highlighted news, many little things were crowd pleasers. Here are a few other notable updates on what is coming soon for Tableau; for the rest, please check my Twitter feed. There are just way too many to itemize here.
- New Adobe PDF data source connector
- New spatial file data source connector
- Visual database unions during data load
- Joins on calculations
- Automatic, dynamic latest date filtering
- New KPI metrics and display cards
- Mapbox support for map backgrounds
- Automatic map drill-down from country to detailed zip code or GIS shape
- Expressive rich text for taking Photoshop with data to the next level
- Styling storytelling features – no more boring boxes
- Distribute evenly to ease dashboard design pains
- Offline mobile with deep linking and improved tool tips
- Cloud “stuff” that network and system admins care about:
  - Backups to Amazon S3
  - AWS CloudWatch for monitoring
  - Sample AWS CloudFormation templates
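On the sample CloudFormation templates: CloudFormation lets admins describe AWS resources in a declarative template, so a Tableau Server deployment can be repeated and version-controlled rather than hand-built. As a rough illustration only (the logical name, instance type, and AMI id below are placeholders, not Tableau’s actual samples), such a template takes this shape:

```yaml
AWSTemplateFormatVersion: "2010-09-09"
Description: Hypothetical minimal stack for a single Tableau Server host
Resources:
  TableauServerHost:            # placeholder logical name
    Type: AWS::EC2::Instance
    Properties:
      InstanceType: m4.4xlarge  # placeholder size; check Tableau's sizing guidance
      ImageId: ami-12345678     # placeholder AMI id
```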
Interesting Tidbits
In talking to peers and feeling the vibe, I observed a bit more humility than previous years. It was refreshing to see advanced analytics, governance, and other enterprise deployment topics highly valued this year.
I am glad to see that Tableau is expanding its portfolio of offerings. I was hoping to see more infographics and a venture into immersive data visualization with augmented reality (AR) or virtual reality (VR). Tableau has the right expertise to do it.
Imagine Tableau + Magic Leap = TOTALLY AWESOME
Mainstream AR/VR viewers are quickly improving. I am still urging Tableau to consider it for the future, though I know it is still a wee bit too early and not a market need today.
This year I did not hear a single peep about the previously hyped Project Elastic, aka Vizable. I also did not see streaming visualizations to accompany Hyper’s stream ingestion, or operational reporting, which honestly is still a market need.
Congratulations Tableau for showing your courage in the face of improving, aggressive competition, ongoing commitment, exceptional support, humility, willingness to embrace change, and always leading the market with regards to creating an inclusive culture. Year after year, your culturally diverse presenters are totally amazing.
Tableau’s culture remains a wonderful asset and is truly unique. The best-in-breed ecosystem partnerships are also outstanding differentiators. As data visualization commoditization continues, culture and other intangible areas will become more important to customers.
When you have many vendor options, start thinking about differentiation including customer experience (CX).
How do vendors treat employees, customers, partners, and competitors? Are there hundreds of lawsuits found in a simple Google search? Do they bully or engage in fair trade practices? Do they historically have reasonable price increases and viable upgrade paths? Do they provide support after a deal is won or refer you to a public forum to figure it out?
Once a company has you as a customer, how will they treat you?
In the southeastern United States, I am seeing Google’s data platform winning in enterprise accounts that were formerly dominated by Teradata, Oracle, and SQL Server. Google has an awesome cloud data warehouse called BigQuery, and it is a contender with Amazon for the biggest cloud database projects in the world. After years of battling top database vendors during enterprise licensing renewal exercises, it appears that huge customers are open-minded and buying cloud solutions from new vendors that get customer experience.
That wraps up my Tableau Conference 2016 update. If you want more insights and predictions, I am happy to share them in a paid consultation. For my financial portfolio followers, don’t miss my webinar this week on updated self-service BI market trends. It is a must-see for understanding the latest market research on where we are right now and where growth is expected to happen.