Three years ago, I wrote about trends driving the future of analytics. At that time I cited several technology trends that I believed would significantly influence analytics design, including embedded analytics, cognitive computing (the IBM Watson effect, Siri, natural language query, etc.), analytics automation, hyper-individualized experiences, marketplaces, and cloud.

Fast-forward to 2017, and indeed these trends are reshaping the analytics landscape: simple natural language search queries, automated data warehousing and ETL, insights from bots and smart data discovery, personalization, industry model templates, data markets, algorithm exchanges, and cloud. Gosh, I love saying I told you so! If only I could monetize my industry predictions.

Of all these trends, the most fascinating in my opinion is the emergence of cognitive and affective computing. Back in 2014, I was following a little-known company called nViso. At that time I was intrigued by its emotion analytics capabilities for detecting human non-verbal signals and responses.


Emotion is a universal language. No matter where you go in the world, a smile or a tear is understood. Even our beloved pets recognize emotion. What about a computer? So far we have had to type in our smiles and frowns. What if a computer could automatically detect your delight with a new feature, or your frustration?


nViso's emotion detection innovation was being tested for measuring customer satisfaction, advertising messaging, and a variety of other applications. Although I wanted to be a tester, no one from nViso ever called me up to make faces at cameras for them. My offer still stands if they do want more testers.

What is Affective Computing?

Affective computing is the ability of computers to detect, recognize, interpret, process, and simulate human emotions from visual, textual, and auditory sources, and to respond appropriately. Although affective computing is rather young, it is already being used today in cognitive applications.

In 1995, Rosalind Picard wrote a white paper on affective computing and the ability of machines to simulate empathy. She is one of the most amazing women in tech. She is a Fellow of the IEEE for contributions to image and video analysis and affective computing, and CNN named her one of seven “Tech SuperHeros to Watch.” Picard holds multiple patents for wearable and non-contact sensors, algorithms, and systems for sensing, recognizing, and responding respectfully to human affective information. Her inventions have applications in autism, epilepsy, depression, PTSD, sleep, stress, dementia, autonomic nervous system disorders, human and machine learning, health behavior change, market research, customer service, and human-computer interaction.

For my data loving audience, Picard’s MIT Affective Computing Group has shared Eight-Emotion Sentics Data for research per the stated terms of use. The data sets contain physiological signals and eight affective states (from the Clynes sentograph protocol) of neutral, anger, hate, grief, love, romantic love, joy, and reverence.
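To give a feel for what working with a dataset like this involves, here is a toy sketch in Python that classifies summarized physiological traces into the eight affective states with a nearest-centroid rule. The feature choices and synthetic signals are my own illustration; the sketch assumes nothing about the actual file formats or signal channels in the MIT data.

```python
import statistics

# The eight affective states from the Clynes sentograph protocol.
STATES = ["neutral", "anger", "hate", "grief", "love",
          "romantic love", "joy", "reverence"]

def features(signal):
    """Summarize a raw physiological trace (a list of samples,
    e.g., skin conductance readings) as simple statistics."""
    return (statistics.mean(signal), statistics.stdev(signal))

def nearest_centroid(train, sample):
    """Classify a feature tuple by its closest per-state centroid.

    train maps each state name to a list of feature tuples;
    sample is one feature tuple to label."""
    centroids = {
        state: tuple(statistics.mean(dim) for dim in zip(*feats))
        for state, feats in train.items()
    }
    def sq_dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(centroid, sample))
    return min(centroids, key=lambda s: sq_dist(centroids[s]))
```

A real analysis would use richer features (slopes, spectral content) and a proper classifier, but the shape of the problem, labeled multivariate signals in eight classes, stays the same.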

In affective computing, deep learning algorithms interpret the emotional state of humans and adapt behavior to respond according to perceived emotions, including sarcasm. Don’t you wish we had this already for calls to your bank, insurance company, or technical support? Affective computing advances should vastly improve the quality of customer experiences and our future interactions with robots. Yes, the robots are coming.


Affective computing humanizes digital interactions by building artificial emotional intelligence. As natural language interactions with technology continue to evolve, starting with search, bots, and personal assistants like Alexa, emotion detection is already emerging to improve advertising, marketing, entertainment, travel, customer service, and healthcare.

Measuring Emotion

Another player in this space, Affectiva, measures emotion from facial expressions via an SDK. Affectiva’s Affdex metrics include seven emotion metrics, 20 facial expression metrics, 13 emojis, and four appearance metrics. The SDK also allows for measuring valence and engagement as alternative emotional experience metrics.


Engagement is a measure of facial muscle activation that illustrates the subject’s expressiveness. The range of values is from 0 to 100. Valence is a measure of the positive or negative nature of the recorded person’s experience. The range of values is from -100 to 100. Emotion predictors use the observed facial expressions as input to calculate the likelihood of an emotion.


Emotion, expression, and emoji metric scores indicate when users show a specific emotion or expression (e.g., a smile), along with a degree of confidence. The composite emotional metric called valence gives feedback on the overall experience. Sounds like fun to integrate into apps and test, right?
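To make those metric ranges concrete, here is a toy sketch in Python of how an app might aggregate Affdex-style per-frame scores into one overall experience label. The `FrameMetrics` class, its field names, and the thresholds are my own illustration, not Affectiva’s actual SDK API; only the ranges (engagement 0 to 100, valence -100 to 100) come from the description above.

```python
from dataclasses import dataclass

@dataclass
class FrameMetrics:
    engagement: float  # facial muscle activation, 0 to 100
    valence: float     # negative-to-positive experience, -100 to 100

    def __post_init__(self):
        if not 0 <= self.engagement <= 100:
            raise ValueError("engagement must be in [0, 100]")
        if not -100 <= self.valence <= 100:
            raise ValueError("valence must be in [-100, 100]")

def summarize(frames):
    """Average per-frame scores into an overall experience label."""
    n = len(frames)
    avg_valence = sum(f.valence for f in frames) / n
    avg_engagement = sum(f.engagement for f in frames) / n
    if avg_engagement < 20:      # barely expressive: call it neutral
        label = "neutral"
    elif avg_valence > 10:
        label = "positive"
    elif avg_valence < -10:
        label = "negative"
    else:
        label = "mixed"
    return label, avg_valence, avg_engagement
```

In a real integration the SDK would emit these scores frame by frame from camera input; the aggregation step, deciding what counts as a delighted versus frustrated session, is where the app developer’s judgment comes in.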


In addition to nViso and Affectiva, other vendors in this space include Realeyes, Beyond Verbal, Sension, CrowdEmotion, Eyeris, Kairos, and Intel RealSense.

Becoming an Emotionally Aware Society

The inevitable shift towards becoming a more emotionally aware society will certainly be interesting. I have to imagine there will also be challenges adapting algorithms for individuals and cultures that show and express emotion differently. How will that be detected? You can download the Moodies app to see emotion analytics in action!


Like it or not, affective computing might awaken oblivious individuals to the emotions they incite in those around them. Employers might be able to more easily distinguish healthy work environments from poor ones. Candidates interviewing for jobs or suspects in crime interrogations will likely get emotionally profiled. Facebook and other groups already collecting our personal data will also start tracking our emotions over time.

As a result, will the next generation become more emotionally aware? Could they become less genuine in their interactions inside and outside of the office? Affective computing will undoubtedly bring many amazing new possibilities, but it also raises questions about the future of the human emotional experience.