Much like the cloud wars and the race to zero, the data visualization market is at war. Multiple mega-vendors are driving down prices in the short term for long-term market share gain. For analytics pros working with popular data visualization tools today like Tableau, Qlik, TIBCO Spotfire, Power BI or SAP Lumira, the following hypothetical scenario might sound familiar.

Management is pushing for budget cuts. Finance is questioning your renewal spend with Vendor X. Recently, executives saw a marvelous data visualization demo by Vendor Y. Vendor Y touts widespread adoption, excellent reviews, incredible time to value, massive investment, rapid innovation, and so on. Vendor Y's licensing costs are far less than Vendor X's. Vendor Y is offering incentives to make the move right now. Then your boss orders you to look into migrating from Vendor X to Vendor Y. Where do you start? What should you do?

In the fiercely competitive BI and data visualization market, requests to evaluate vendor migrations are becoming more common. Executives, management and finance gurus are motivated by a "save money now" sales pitch. They have little idea what a BI or data visualization solution migration really entails, or how vast the differences between data visualization offerings are. To them, all the modern dashboard solutions look and sound the same.

Too Good to Be True?

“If it sounds too good to be true, it usually is” was a catchphrase used by the Better Business Bureau to alert the public. If you encounter a vendor that sounds too good to be true, dig deeper before making a move. Avoid the “quickly migrate right now” sales tactic. Reputable vendors will allow you time to review your environment to ensure migration success.

Get management support to perform adequate due diligence before migrating away from a working, modern data visualization solution just to save money. Depending on your current implementation, design patterns, data availability and existing BI solution features, some migrations might be simple, fast and a mega-success, while others might be complex or not entirely possible to rebuild.

If it takes a long time and many resources to rebuild reporting in Vendor Y that was already working in Vendor X, the assumed Vendor Y licensing savings may never be realized. Even worse, you might end up with an inferior reporting solution for vital data-driven decision making. Over the longer term, Vendor Y will likely raise its licensing costs, or Vendor X might drop theirs.

If you are happy with Vendor X today, give them a chance to keep your business tomorrow.

Evaluating Migration Potential

There are many factors to consider before making a move to a new BI or data visualization tool. Here are my top 10 tips on how to properly approach a potential migration.

  1. Get sponsorship for a Migration Assessment Project
    To explore the feasibility of a migration, do a migration assessment. The assessment should be treated as a real project. Assessment time can vary widely depending on the scope of the current BI or data visualization tool implementation. Are there only a few front-end reports or many reports and integration points? Is there an ecosystem of ETL, data models and embedded business logic within those reports to review?
  2. Use internal and neutral resources on the Migration Assessment Project
    Do not rely on the vendor or the vendor's preferred partner to do the migration assessment work for you. They are biased. Perform the migration assessment in-house or with a neutral partner, who can still get vendor support during the process. Most vendors respond eagerly to migration scenario questions to save or win market share.
  3. Involve business and technical resources in the Migration Assessment Project
    Even though business groups often manage modern data visualization tools without IT or technical help, assign a technical resource to help in the process. Technology migration is not a new concept for IT professionals who have been migrating operating systems, email, databases, software and hardware for years. That technical resource may recognize deal breakers that a business user will not, and vice versa. The business user brings subject matter expertise, existing BI or data visualization tool expertise, user experience and non-technical usability requirements.
  4. Understand differences in technical architecture, components and features
    Migration success or failure depends on the availability of comparable or better features for hosting or cloud, managing and securing the environment, backup, disaster recovery, scalability, version control, connecting to data sources, encrypting and storing data, moving your data, proprietary semantic models, hidden or embedded data models, cleansing and transforming data, scheduling data updates and report subscriptions, designing reports, incorporating business logic, publishing reports, and consuming reports on multiple devices, web sites or applications. Within these areas, there are multiple capabilities to examine. Also load test for performance and scalability – especially if you are considering a cloud or hybrid BI offering. The elastic cloud does not always perform acceptably, depending on your data source gravity, connectivity speed and location.
  5. Determine if your data is locked in a proprietary storage model
    Personally, I prefer direct-connect data sources with modern data visualization tools over proprietary tool data engines that only work with one vendor. Quite a few modern data visualization vendors use proprietary in-memory, columnar engines. You can easily get data into them, but you can't legally get that data back out without violating the license agreement – if it is even possible. Be sure to ask vendors how you can export all of your data back out of a proprietary engine if necessary. I can't emphasize enough how important it is to control your data destiny. You should be able to move reporting data and not have it locked into one data visualization tool in a "bring your own reporting tool" world. Data is gold. Would you put your money in a bank that won't let you take it out or move it to a different bank?
  6. Assess ETL and data cleansing
    Modern data visualization tools take many different approaches to connect, union, pivot, split, cleanse and join data sources before a report ever gets built. These steps might happen outside the tool or inside it, and they might use proprietary scripts that require rebuilding or recoding. ETL and data prep is often the most time-consuming part of reporting – approximately 80% of the work effort. Recreating the visualizations is the easy, fast part.
  7. Figure out if there is a proprietary or hidden semantic layer
    Self-service BI users might not realize that their favorite data visualization tool has a hidden or created semantic layer that their reports depend on. Many modern solutions do have semantic layers, but they don't label them that way. Essentially, this is where data source relationships, hierarchies, sets, groups, calculations and other business logic get defined. Figure out if and how that model might migrate. This is another area that can be effort- and time-intensive if you have to rebuild it.
  8. Review report designs 
    Be sure to look at both simple and complex reports. Itemize current report data sources, role-based security, report layouts or templates, formatting, data visualization types, interactivity, drill-down, drill-through, view details, specific click actions, filters, controls, parameters, advanced analytics, custom modules, add-ons, tooltips, annotations, publishing, storytelling, printing, exporting, and collaboration capabilities. Often, in the front end of data visualization reports, business users customize calculation logic in a proprietary language or tool script. Check whether the script functions used exist in the target tool. Compare online help docs between vendors to figure out how to rebuild reports with available features.
  9. Do a Proof of Concept (POC)
    Select a few reports that range in complexity for a proof of concept project. Have a couple of power self-service BI users who build data visualization reports today build the proof-of-concept reports with the tool being evaluated. Don't have the vendor, IT, technical talent or an external group build the POC reports. Self-service data visualization tool users need to understand any differences in user experience, both good and bad, along with trade-offs in functionality, the learning curve, available help resources and technical support responsiveness. You'll also want to include the power self-service BI users in the evaluation for buy-in later if you do choose to migrate to another tool. They might love it or they might hate it. A change in modern data visualization tools might be difficult for them to embrace. Highly sought-after analytics and power self-service BI talent has been known to leave employers just to work with desired data visualization tools.
  10. Talk to customers that have already migrated
    Vendors should have customer references that you can call or meet to discuss their migration experiences and lessons learned. Don't rely solely on vendor-provided customer adoption numbers, videos and slide references – talk to those customers. Be wary of the common adoption numbers games that vendors play, executive-level calls and vendor-preferred-partner migration reference calls. Try to find and talk to the self-service BI report builders and the talent managing the migrated data visualization solution. Those people can share the details that you need to make an informed go/no-go decision and plan a successful migration. Happy customers usually welcome talking with their peers.
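The load testing advice in tip 4 can be sketched in a few lines. Everything below is illustrative: `run_query` is a hypothetical stand-in that you would replace with a real call to the candidate tool's API or database driver.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def run_query():
    """Hypothetical stand-in for a real dashboard query; replace
    with a call to the BI tool or database you are evaluating."""
    time.sleep(0.01)  # simulate 10 ms of query latency
    return True

def load_test(concurrency=8, requests=40):
    """Fire `requests` queries across `concurrency` workers and
    report median and worst-case latency in milliseconds."""
    latencies = []

    def timed():
        start = time.perf_counter()
        run_query()
        latencies.append((time.perf_counter() - start) * 1000)

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        for _ in range(requests):
            pool.submit(timed)
    return {"median_ms": statistics.median(latencies),
            "max_ms": max(latencies)}

print(load_test())
```

Sweeping the concurrency value upward until latency degrades gives a rough sense of how the candidate handles your data source gravity and connectivity constraints.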
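To make tip 5 concrete, here is a minimal round-trip check using SQLite as a stand-in for a vendor's extract engine (real engines vary, and `export_all_tables` is a hypothetical helper). The point is to prove, before committing, that every table of reporting data can come back out as plain CSV.

```python
import csv
import sqlite3

def export_all_tables(db_path, out_dir="."):
    """Dump every table in a SQLite file to CSV. SQLite stands in
    for a proprietary extract engine here; the real test is whether
    your vendor's engine allows an equivalent full export."""
    conn = sqlite3.connect(db_path)
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    for table in tables:
        cur = conn.execute(f'SELECT * FROM "{table}"')
        with open(f"{out_dir}/{table}.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow([col[0] for col in cur.description])
            writer.writerows(cur)
    conn.close()
    return tables
```

If the equivalent export is impossible, or forbidden by the license, treat that as the data lock-in the tip warns about.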
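As a rough illustration of tip 6, the sketch below recreates typical data prep steps (union, cleanse, join) in plain Python. The data and function names are invented; the point is that each step usually corresponds to proprietary script logic that would need recoding in the target tool.

```python
def clean(rows):
    """Trim whitespace and normalize region codes -- the kind of
    cleansing logic often buried in a tool's proprietary script."""
    return [{"region": r["region"].strip().upper(),
             "sales": float(r["sales"])} for r in rows]

def union(*sources):
    """Union rows from several extracts (e.g. one per quarter)."""
    return [row for src in sources for row in src]

def join(facts, lookup):
    """Join cleansed facts to a region-to-manager lookup table."""
    return [{**f, "manager": lookup.get(f["region"], "unknown")}
            for f in facts]

# invented sample extracts and lookup
q1 = [{"region": " east ", "sales": "100"}]
q2 = [{"region": "West", "sales": "250.5"}]
managers = {"EAST": "Kim", "WEST": "Ana"}

prepared = join(clean(union(q1, q2)), managers)
print(prepared)
```

Counting how many such steps exist, and where they live (inside the tool or outside it), is a quick way to size the 80% of the effort this tip describes.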
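For tip 7, one practical step is to inventory whatever semantic model export the current tool offers. The JSON-like structure below is hypothetical, but counting relationships, hierarchies, groups and calculations gives an early read on rebuild effort.

```python
# Hypothetical exported semantic model -- real tools expose these
# objects in different shapes, if they expose them at all.
model = {
    "relationships": [{"from": "Orders.CustomerID", "to": "Customers.ID"}],
    "hierarchies": [{"name": "Geography",
                     "levels": ["Country", "State", "City"]}],
    "groups": [{"name": "KeyAccounts", "members": ["Acme", "Globex"]}],
    "calculations": [{"name": "Margin",
                      "formula": "([Sales]-[Cost])/[Sales]"}],
}

def rebuild_inventory(model):
    """Count the objects that would need recreating in the target tool."""
    return {kind: len(items) for kind, items in model.items()}

print(rebuild_inventory(model))
```

Even a crude count like this turns "there might be a hidden semantic layer" into a concrete line item in the migration assessment.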
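Tip 8's function check can be partly automated. A rough sketch, assuming you can export report calculation scripts as text and have a list of the target tool's supported functions (both sets below are invented examples):

```python
import re

def used_functions(calc_scripts):
    """Extract function names (an identifier followed by '(')
    from report calculation scripts."""
    pattern = re.compile(r"\b([A-Z_][A-Z0-9_]*)\s*\(", re.IGNORECASE)
    found = set()
    for script in calc_scripts:
        found.update(m.group(1).upper() for m in pattern.finditer(script))
    return found

def migration_gaps(calc_scripts, target_functions):
    """Return functions used today that the target tool lacks."""
    supported = {f.upper() for f in target_functions}
    return sorted(used_functions(calc_scripts) - supported)

# invented calculation scripts and supported-function list
scripts = ["IF SUM([Sales]) > 0 THEN RANK_DENSE([Sales]) END",
           "DATEDIFF('day', [Start], [End])"]
target = {"SUM", "IF", "DATEDIFF"}

print(migration_gaps(scripts, target))
```

Each function in the gap list is a calculation that must be rewritten, approximated or dropped, which is exactly the deal-breaker inventory the tip asks for.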