As data continues to grow and exceed current BI and analytics system capabilities, more organizations are adopting big data analytics solutions. Please join me and Wendy Gradek from AtScale in a webinar that will discuss planning and estimating migrations. We will show you specifically what to look for in popular BI, analytics and data visualization tools such as Tableau, Qlik, Power BI, TIBCO Spotfire and MicroStrategy to assess the level of work effort required to make a migration. We will also share a cool utility to expedite the process. Some solutions are simple to migrate. Others might be a bit more involved. You never really know until you review how the dashboards and reports are designed.
Evaluating Migration Potential
There are many factors to consider before making a move. Here are my top 10 tips, shared previously, on how to properly approach a data visualization solution migration. The approach for big data analytics migrations is similar but not identical, since you will be swapping out the back-end data source rather than the front-end visualizations.
- Get sponsorship for a Migration Assessment Project
To explore the feasibility of a migration, do a migration assessment. The assessment should be treated as a real project. Assessment time can vary widely depending on the scope of the current BI or data visualization tool implementation. Are there only a few front-end reports or many reports and integration points? Is there an ecosystem of ETL, data models and embedded business logic within those reports to review?
- Use internal and neutral resources on the Migration Assessment Project
Do not rely on the vendor or the vendor’s preferred partners to do the migration assessment work for you. They are biased. Perform the migration assessment work in-house or with a neutral partner, who can still get vendor support during the process. Most vendors respond eagerly to migration scenario questions to save or win market share.
- Involve business and technical resources in the Migration Assessment Project
Even though business groups often manage modern data visualization tools without requiring IT or technical help, assign a technical resource to help in the process. Technology migration is not a new concept for IT or technical professionals who have been moving operating systems, email, databases, software and hardware for years. That technical resource may recognize deal breakers that a business user will not, and vice versa. The business user brings subject matter expertise, existing BI or data visualization tool expertise, user experience and non-technical usability requirements.
- Understand differences in technical architecture, components and features
Migration success or failure depends on availability of comparable or better features for hosting or cloud, managing and securing the environment, backup, disaster recovery, scalability, version control, connecting to data sources, encrypting and storing data, moving your data, proprietary semantic models, hidden or embedded data models, cleansing and transforming data, scheduling data updates and report subscriptions, designing reports, incorporating business logic, publishing reports, and consuming reports on multiple devices, web sites or applications. Within these areas, there are multiple capabilities to examine. Also load test for performance and scalability – especially if you are considering a cloud BI or hybrid BI offering. The elastic cloud does not always perform acceptably depending on your data source gravity, connectivity speed and location.
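To make the load-testing point concrete, here is a minimal sketch of a concurrent load test. The `run_query` function is a stand-in, not any vendor's API – in a real test you would replace its body with a call to your BI server's query endpoint or a SQL client, then compare latency percentiles across candidate platforms.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def run_query(query_id):
    """Stand-in for a real BI query. Replace the sleep with a call
    to the candidate platform's query API or a SQL client."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulates network + query latency
    return time.perf_counter() - start

def load_test(concurrent_users=20, queries_per_user=5):
    """Fire queries from simulated concurrent users; summarize latency."""
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        latencies = list(pool.map(
            run_query, range(concurrent_users * queries_per_user)))
    latencies.sort()
    return {
        "queries": len(latencies),
        "median_s": statistics.median(latencies),
        "p95_s": latencies[int(len(latencies) * 0.95) - 1],
    }

if __name__ == "__main__":
    print(load_test())
```

Run the same test profile against each candidate (and from each user location, if data gravity is a concern) so the numbers are comparable.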
- Determine if your data is locked in a proprietary storage model
Personally, I prefer direct-connect data sources with modern data visualization tools versus proprietary tool data engines that only work with one vendor. Quite a few modern data visualization vendors use in-memory, columnar engines that are proprietary. You can easily get data in, but you can’t legally get that data back out without violating the license agreement – if it is even possible. Be sure to ask the vendors how you can export or get all of your data back out of a proprietary engine if necessary. I can’t overemphasize how important it is to control your data’s destiny. You should be able to move reporting data and not have it locked into one data visualization tool in a “bring your own reporting tool” world. Data is gold. Would you put your money in a bank that won’t let you take it out or move that money to a different bank?
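As a sketch of what "controlling data destiny" looks like in practice, the function below dumps every table in a database to plain CSV. It uses SQLite purely as a stand-in for any direct-connect source; a proprietary in-memory engine may offer no equivalent, which is exactly the question to put to the vendor.

```python
import csv
import sqlite3

def export_all_tables(db_path, out_dir="."):
    """Dump every table in a SQLite database (a stand-in here for
    any direct-connect source) to portable CSV files."""
    conn = sqlite3.connect(db_path)
    tables = [row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    for table in tables:
        cursor = conn.execute(f'SELECT * FROM "{table}"')
        with open(f"{out_dir}/{table}.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow([col[0] for col in cursor.description])
            writer.writerows(cursor)
    conn.close()
    return tables
```

If a five-line script like this is impossible against a vendor's engine, your data is locked in.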
- Assess ETL and data cleansing
There are many different approaches modern data visualization tools take to connect, union, pivot, split, cleanse and join data sources before a report ever gets built. These steps might happen outside of the tool or inside the tool. They might use proprietary scripts that require rebuilding or recoding. ETL and data prep is often the most time-consuming part of reporting – approximately 80% of the work effort. Recreating the visualizations is the easy, fast part.
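The prep steps above (union, cleanse, join) can be sketched in a few lines of portable code. This is an illustrative example, not any tool's script language – the point is that logic hidden inside a proprietary workbook script usually has to be reconstructed step by step like this during a migration.

```python
import csv
import io

def clean(value):
    """Trim whitespace and normalize empty strings to None."""
    value = value.strip()
    return value or None

def union_and_join(sales_csvs, regions_csv):
    """Union several sales extracts, cleanse fields, then join in a
    region lookup -- the kind of prep a proprietary tool script
    often hides inside a workbook."""
    regions = {row["region_id"]: row["region_name"]
               for row in csv.DictReader(io.StringIO(regions_csv))}
    rows = []
    for text in sales_csvs:                                  # union
        for row in csv.DictReader(io.StringIO(text)):
            row = {k: clean(v) for k, v in row.items()}      # cleanse
            row["region_name"] = regions.get(row["region_id"])  # join
            rows.append(row)
    return rows
```

During the assessment, itemize each such step per report; the count of steps to rebuild is a reasonable proxy for that 80% of the effort.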
- Figure out if there is a proprietary or hidden semantic layer
Self-service BI users might not have any idea if there is a hidden or vendor-created semantic layer in their favorite data visualization tool that is a report dependency. Many of the modern solutions do have semantic layers, but they don’t label them in that manner. Essentially this is where data source relationships, hierarchies, sets, groups, calculations and other business logic might get defined. Figure out whether and how that model might migrate. This is another area that can be effort- and time-intensive if you have to rebuild it.
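One practical way to surface a hidden semantic layer is to inventory the calculations embedded in workbook files. The sketch below assumes an XML workbook format (Tableau's .twb files are XML, for example); the element and attribute names used here are illustrative and will differ between tools and versions.

```python
import xml.etree.ElementTree as ET

def inventory_calculations(workbook_xml):
    """List calculated-field formulas embedded in a workbook.
    Assumes an XML workbook format; the 'column'/'calculation'
    element names are illustrative and vary by tool and version."""
    root = ET.fromstring(workbook_xml)
    found = []
    for column in root.iter("column"):
        calc = column.find("calculation")
        if calc is not None and calc.get("formula"):
            found.append((column.get("caption"), calc.get("formula")))
    return found
```

An inventory like this turns "is there a hidden model?" into a concrete list of hierarchies, groups and formulas that must be rebuilt or mapped in the target tool.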
- Review report designs
Be sure to look at simple and complex reports. Itemize current report data sources used, role-based security, report layout or template, formatting, data visualization types, interactivity, drill-down, drill-through, view details, specific click actions, filters, controls, parameters, advanced analytics, custom modules, add-ons, tooltips, annotations, publishing, storytelling, printing, exporting, and collaboration capabilities. Often in the front end of data visualization reports, business users will customize calculation logic in a proprietary language or tool script. Check whether the script functions used exist in the target tool. Compare online help docs between vendors to figure out how to rebuild reports with available features.
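The function-compatibility check can be partly automated. A minimal sketch, assuming formulas are written in the common `NAME(...)` style: extract the function names used in your report calculations and diff them against the target tool's documented function list (the lists here are hypothetical placeholders, not any vendor's actual catalog).

```python
import re

def unsupported_functions(formulas, target_functions):
    """Extract function names used in report calculations and flag
    any missing from the target tool's documented function list.
    Assumes functions are written NAME(...); adjust the regex for
    the source tool's actual formula syntax."""
    used = set()
    for formula in formulas:
        used.update(m.group(1).upper()
                    for m in re.finditer(r"\b([A-Za-z_]+)\s*\(", formula))
    return sorted(used - {f.upper() for f in target_functions})
```

Every function this flags is a formula that must be rewritten by hand, so the result feeds directly into the work-effort estimate.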
- Do a Proof of Concept (POC)
Select a few reports that range in complexity for a proof of concept project. Have a couple of power self-service BI users who build data visualization tool reports today build the proof of concept reports with the tool being evaluated. Don’t have the vendor, IT, technical talent or an external group build the POC reports. Self-service data visualization tool users need to understand any differences in user experience, both good and bad, trade-offs in functionality, learning curve, available help resources and technical support responsiveness. You’ll also want to include the power self-service BI users in the evaluation for buy-in later if you do choose to migrate to another tool. They might love it or they might hate it. A change in modern data visualization tools might be difficult for them to embrace. Highly sought-after analytics/power self-service BI talent has been known to leave employers just to work with desired data visualization tools.
- Talk to customers that have already migrated
Vendors should have customer references that you can call or meet to discuss their migration experiences and lessons learned. Don’t rely solely on vendor-provided customer adoption numbers, video and slide references – talk to those customers. Be wary of the common customer adoption numbers games that vendors play, executive-level calls, or vendor-preferred-partner migration reference calls. Try to find and talk to the self-service BI report builders and the talent managing the migrated data visualization solution. Those people can share the details that you need to know to make an informed go/no-go decision and plan for a successful migration. Happy customers usually welcome talking with their peers.