Once upon a time, an analytics team lead decided to design a system that would automate the time-consuming, repetitive processes he used to effectively find signals amid vast volumes of pharmaceutical data noise. Using a design thinking approach, Airflow data pipelines, Python libraries, Elasticsearch and graph-based artificial intelligence (AI) techniques, he and his team successfully reinvented the analytics user experience. His vision is now a rising startup called Stories. Here is the Stories story.

Stories Analytics

Stories graph visualization of the strength of data relations in the Stories Graph AI engine

Rarely do I come across a truly innovative vendor in the analytics world that is compelling, practical and winning significant clients. When I do, it awakens my passion for this space. Stories is one of those extraordinary, delightful surprises. It is a next-era, augmented analytics solution that advances the entire analytics lifecycle workflow. A graph AI engine expedites analysis, surfaces trends and ranks millions of signals to isolate the important insights with context and explanations.


Stories automated analytics workflow

According to Stories co-founder, Peter Fedoročko, historically “80% of analyst’s work is to search for meaning in data. Unfortunately, 95% of signals found are pure noise. That is a tremendous waste of time for smart people.” He is right. If I think back to my data analysis projects, the steps were frequently repeated. I would get familiar with the data set, profile it, look for relationships, correlations, significant changes, exceptions and so on over time.

In 2015, Fedoročko developed a Python process that leveraged 20+ analytics libraries, including pandas, scikit-learn and networkx, to automate standard analytics steps. His solution was much faster than manual data analysis and produced better results. Thus, the concept for Stories was born. Since then, Stories has continued to evolve and mature, and it is now used by financial institutions, retailers, insurers, healthcare providers and other organizations across Europe. The company will soon open offices in the US through the Alchemist Accelerator.
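The article does not show Fedoročko's original code, but the idea of automating the routine first steps of analysis with pandas can be sketched roughly like this. The column names and the 0.8 correlation threshold are illustrative assumptions, and the real system also uses scikit-learn and networkx for modeling and graph work:

```python
import pandas as pd

def auto_profile(df: pd.DataFrame, corr_threshold: float = 0.8):
    """Automate routine first steps of analysis: profile each column
    and surface strongly correlated numeric pairs as candidate relations."""
    profile = df.describe(include="all")     # basic per-column profiling
    corr = df.corr(numeric_only=True)        # pairwise correlations
    # Keep each strongly related pair once (upper triangle only)
    edges = [
        (a, b, corr.loc[a, b])
        for i, a in enumerate(corr.columns)
        for b in corr.columns[i + 1:]
        if abs(corr.loc[a, b]) >= corr_threshold
    ]
    return profile, edges

# Illustrative data: revenue moves with units sold, not with day of month
df = pd.DataFrame({
    "units": [10, 20, 30, 40, 50],
    "revenue": [100, 205, 310, 390, 500],
    "day": [3, 17, 9, 28, 1],
})
profile, edges = auto_profile(df)
```

Running the same automated pass over every new data set replaces the manual "get familiar, profile, look for relationships" cycle the paragraph above describes.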

Introducing Stories

The Stories engine automates discovery, ranking and prioritization of business situations by looking at every possible node of input data separately to evaluate causal drivers. A story is an important situation in your business. It has a value, a start date, an owner, a topic and it’s updated as time unfolds.
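The attributes of a story listed above suggest a simple record shape. A minimal sketch, with field names that are my assumptions rather than the vendor's actual schema, might look like:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Story:
    """One 'story': an important business situation tracked over time.
    Field names are illustrative, not Stories' actual schema."""
    topic: str          # e.g. "Monitor Sales Trends"
    value: float        # business impact of the situation
    start_date: date    # when the situation began
    owner: str          # person responsible for acting on it
    updates: list = field(default_factory=list)  # appended as time unfolds

    def update(self, note: str, new_value: float):
        """Record a new observation as the situation evolves."""
        self.updates.append((note, new_value))
        self.value = new_value

story = Story("Monitor Sales Trends", 12000.0, date(2018, 1, 15), "ana")
story.update("trend accelerating", 18500.0)
```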


Stories works like an X-ray of your data, highlighting trends and changes. The relevant findings are summarized in an interactive, drillable list output that looks like an email inbox, removing the need to drag and drop, slice, dice, and drill to look for insights. If you do see a story that you want to explore further, you can delve into it just as you have done in the past. You likely want to know why the story happened. The Stories engine has already done all the drilling and slicing and presents just the most significant explanations. It also facilitates collaboration with peers and the assignment of proactive alerts and tasks to close the insight-to-action loop.
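The engine's internals are not public, but the "inbox" idea of surfacing only the strongest findings can be illustrated with a toy ranking pass. The scoring rule here, change relative to an estimated noise level, is a made-up illustration, not Stories' actual algorithm:

```python
# Toy sketch of inbox-style ranking: score candidate signals and keep
# only the most significant ones. Scoring rule is illustrative only.
def rank_signals(signals, top_n=3):
    # A signal is (description, magnitude_of_change, noise_estimate);
    # score the change relative to noise, a crude signal-to-noise ratio.
    scored = [
        (desc, change / noise if noise else float("inf"))
        for desc, change, noise in signals
    ]
    scored.sort(key=lambda s: s[1], reverse=True)
    return scored[:top_n]

signals = [
    ("EU margin dropped 4pts", 4.0, 0.5),   # strong change, low noise
    ("US sales up 1%", 1.0, 2.0),           # likely noise
    ("New SKU returns doubled", 2.0, 0.4),
    ("Web traffic dipped Friday", 0.5, 1.5),
]
inbox = rank_signals(signals, top_n=2)
```

Only the two highest signal-to-noise findings survive into the "inbox"; the rest is treated as the noise the co-founder's 95% figure refers to.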


Stories interactive analysis over time

How It Works

To get started, you select a data source to analyze. Currently Stories can connect to relational databases, flat files and Excel. OLAP data sources are coming soon. After connecting Stories to a data source, you map business questions to context and assign owners.
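The product documentation does not show what a context mapping looks like; conceptually it ties a business question to the data fields that answer it and to an owner. That could be sketched as a simple configuration, with every name here being hypothetical:

```python
# Hypothetical context mapping: each business question is tied to the
# data columns that answer it and the owner who acts on its stories.
context_map = {
    "Monitor Sales Trends": {
        "measure": "net_sales",
        "dimensions": ["region", "product_line", "channel"],
        "owner": "sales_director",
    },
    "Improve Product Profitability": {
        "measure": "gross_margin",
        "dimensions": ["product_line", "supplier"],
        "owner": "category_manager",
    },
}

def owner_for(question: str) -> str:
    """Look up who is assigned to act on stories for a question."""
    return context_map[question]["owner"]
```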

Stories Context Mapping


The user interface is flexibly designed to allow for customization of story display.


Customizable story widgets

Although the offering works across many industries and use cases, Stories comes with deeper contextual understanding of 50+ business topics across marketing, sales and finance, including but not limited to the following concepts.

  • Monitor Sales Trends
  • Detect Incidents
  • Summarize Data into Top Stories
  • Automate P&L Analytics
  • Margin Mining and Profitability Analysis
  • Improve Product Profitability
  • Track Competitive Positioning
  • Exploit Pockets of Growth
  • Product Launch Excellence
  • Optimize Assortment
  • Exploit Growth Opportunities
  • Reduce Dropouts
  • Benchmark Production Costs
  • Key Business Drivers

After you connect data sources and define topics, Stories analyzes all the combinations or slices of your data via Airflow data pipelines, Python libraries, Elasticsearch and graph-based artificial intelligence (AI) techniques to find and rank insights, trigger alerts, assign tasks and display insights in personalized visual story lists. That process can be scheduled to run on a recurring basis. One of Stories' clients monitors 50 million search requests per day to identify key trends in its sales.
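"Analyzing all the combinations or slices of your data" amounts to aggregating the measure over every subset of dimension values and testing each aggregate for signals. A stdlib-only sketch of that enumeration follows; the column names and data are invented, and in the real system this work is distributed across Airflow pipeline tasks rather than run in one loop:

```python
from itertools import combinations
from collections import defaultdict

rows = [
    {"region": "EU", "channel": "web",   "sales": 120},
    {"region": "EU", "channel": "store", "sales": 80},
    {"region": "US", "channel": "web",   "sales": 200},
    {"region": "US", "channel": "store", "sales": 50},
]
dimensions = ["region", "channel"]

# Aggregate the measure over every combination of dimensions -- each
# aggregate is one "slice" an engine like this would test for signals.
slices = defaultdict(float)
for r in range(1, len(dimensions) + 1):
    for dims in combinations(dimensions, r):
        for row in rows:
            key = (dims, tuple(row[d] for d in dims))
            slices[key] += row["sales"]
```

Even this tiny example yields eight slices from two dimensions; the combinatorial growth with real dimension counts is why a scalable pipeline matters.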

While talking to the Stories team, I asked "why Airflow," since I had not run into it previously. Airflow is a project that supports programmatic authoring, scheduling and monitoring of workflows. It was selected as the Stories data pipeline after the initial AWS Lambda and AWS Data Pipeline options proved unable to scale to massive data analysis workloads and were not the right tools for the nature of the data processing work being performed.


Technical Architecture

Stories can be deployed on-premises or in the cloud since it leverages mainstream Docker container architecture. Due to its resource-intensive calculation peaks, a cloud implementation is optimal. If you'd like to better understand the technical architecture powering Stories, please watch Fedoročko's presentation from PyCon 2017.
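A container-based stack of this kind is typically described in a compose file, which is what makes the same deployment portable between on-premises hardware and the cloud. The sketch below is purely illustrative: the service names and images are my assumptions, not Stories' actual configuration.

```yaml
# Hypothetical compose file for a containerized analytics stack;
# service names and images are assumptions, not Stories' configuration.
version: "3"
services:
  webapp:
    image: example/analytics-ui:latest
    ports:
      - "8080:8080"
  worker:
    image: example/analytics-engine:latest
    depends_on:
      - elasticsearch
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.2.2
    environment:
      - discovery.type=single-node
```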

For More Information

I have barely touched on Stories, an exciting new player in the augmented analytics space. If you'd like to further explore what Stories has created, get a demo or try it hands-on, please check out their website – https://www.stories.bi – and contact them directly. If you are attending the upcoming Gartner Data & Analytics event in Texas, I understand that Stories will be presenting in the "Innovative Analytics in Action: Emerging Trends You Need to Know" session on Sunday afternoon, March 4, 2018, from 3:30 PM to 5:30 PM.