Using AI to analyze what your users feel

Computer vision and artificial intelligence in UX research


Emotions are abstract and the meaning behind them can differ from person to person, making emotional reactions difficult to measure and use in research.

But what if they weren’t?

Imagine if you could visualize a feeling and use it in usability testing.

To explore this idea, we developed an artificial intelligence program that takes a video of your face, figures out what you feel, and shows your experience in one handy graph.


Image 1: Emotion detection AI (Photo of eyes by Samer Daboul on Pexels.)


At UXTesting we help enterprises conduct remote usability testing with our software solution.

A while ago we created a new AI-based feature that scans testers' faces for facial features, analyzes those features to understand their expressions, and presents all the emotions they show in one handy graph.

With this feature, you can see how someone reacts during your software test and discover details that lead to deeper insights in your research.


Image 2: Computer vision in Snapchat Filters (Photo of computer vision from PetaPixel.)​​​​​​


How it works


The emotion detection AI works similarly to Snapchat's filters: it uses computer vision to analyze a face for the shape and movement of its facial features, such as your nose, your lips, and your eyebrows. Once it sees your face, it compares the shape of your facial features to what it thinks different emotions look like and to its database of other test videos. Then the emotion detection AI makes an educated guess about what you feel, once per second. You can read more about how computer vision recognizes your face in Snapchat filters in the references below (2018).

Our AI divides your facial expressions into 7 emotions: anger, contempt, disgust, fear, happiness, sadness, and surprise.
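To make that flow concrete, here is a minimal sketch of the once-per-second loop in Python. Note that `detect_landmarks` and `classify_emotion` are hypothetical placeholders for the computer-vision model (which is not public), and the video source is simplified to a plain sequence of frames:

```python
EMOTIONS = ["anger", "contempt", "disgust", "fear",
            "happiness", "sadness", "surprise"]

def analyze(frames, fps, detect_landmarks, classify_emotion):
    """Sample one frame per second and score each of the 7 emotions.

    frames            -- sequence of video frames
    fps               -- frames per second of the video
    detect_landmarks  -- hypothetical: frame -> facial landmarks
    classify_emotion  -- hypothetical: landmarks -> {emotion: raw score}
    """
    scores = {e: [] for e in EMOTIONS}
    for i, frame in enumerate(frames):
        if i % fps:                          # keep one frame per second
            continue
        landmarks = detect_landmarks(frame)  # nose, lips, eyebrows...
        guess = classify_emotion(landmarks)  # one educated guess
        for e in EMOTIONS:
            scores[e].append(guess.get(e, 0.0))
    return scores
```

In practice the two callbacks would wrap a trained model; the sketch only shows how a per-second series of scores for each emotion is built up from the raw video.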

Each emotion is displayed separately as a percentage: for each emotion, your most expressive moment in the video is scaled to 100% and your least expressive moment to 0%.
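This per-emotion scaling is, in effect, min-max normalization. A minimal sketch in Python (the raw scores here are hypothetical; the actual scoring model is not public):

```python
def normalize_emotion(scores):
    """Scale per-second raw scores for one emotion so the strongest
    moment becomes 100% and the weakest becomes 0% (min-max scaling)."""
    lo, hi = min(scores), max(scores)
    if hi == lo:                       # flat signal: nothing to scale
        return [0.0 for _ in scores]
    return [100.0 * (s - lo) / (hi - lo) for s in scores]

# Hypothetical raw "happiness" scores, one per second of video
happiness = [0.10, 0.12, 0.45, 0.90, 0.30]
print(normalize_emotion(happiness))    # the 0.90 moment maps to 100.0
```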


What it shows


What kind of insights can this AI actually show you?


Anger


Image 3: Cumulated anger (Photo of man screaming in phone by Icons8 on Unsplash.)


The emotion detection AI can show you the basic expressions as they happen, alongside a video of what caused that reaction.

Image 4: Cumulated anger in the graph (Emotion detection of anger on UXTesting.)


You can see spikes in negative emotions such as anger, sadness, contempt, or just utter disgust.

Clear spikes of these emotions mark the graph whenever someone feels annoyed on your website. 

But you can even see these emotions build up and affect each other in a test to the point where your users want nothing more than to leave your website and never come back.

Together with a video of your tester and a recording of your screen, the emotion detection provides the context to measure the emotional reaction of your testers and to uncover how different people react to an annoyance, giving you the insights to work toward solutions.
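A simple way to surface such spikes automatically is to flag sudden jumps in the per-second emotion series. A minimal sketch, with a hypothetical threshold and data:

```python
def find_spikes(series, threshold=30.0):
    """Return the indices (seconds) where an emotion jumps by more than
    `threshold` percentage points compared to the previous second."""
    return [t for t in range(1, len(series))
            if series[t] - series[t - 1] > threshold]

# Hypothetical per-second anger percentages from a test video
anger = [5, 8, 10, 55, 60, 20, 15, 70, 30]
print(find_spikes(anger))  # → [3, 7]: the seconds worth reviewing on video
```

Jumping from each flagged second straight to the matching moment in the screen recording is what turns a spike in the graph into an actionable finding.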


First reactions to the ordeal


Image 5: A first look at the ordeal (Photo of angry bearded man by Matthew Henry on Burst.)


Because this AI analyzes your face constantly, it can notice reactions most people would miss.

Image 6: A first look at the ordeal graph (Emotion detection of fear on UXTesting.)


Imagine opening a signup form and taking a first look at the long, complicated form you will need to fill in. You glance at it in fear or surprise as you see how far down the page it extends, or you feel sad or disgusted as what you hoped would be a short task turns into a three-minute ordeal.

We have all been there and felt the same, but we only show that reaction for a split second before returning to staring emotionlessly at our screen.

Most people would miss that reaction entirely, and you would probably miss it too unless you were actively looking for it.

Yet the AI saw it and marked your first impression of that beautiful form with a quick peak of fear and disgust. 

If that is not the first reaction you hope to get with your webpage, consider removing some non-crucial questions to shorten the form, or dividing the form over several pages with fewer questions each to soften that initial reaction. Then test again and see if the changes had any effect.


A first look at your website


Image 7: Bounce rate visualized (Photo of woman at laptop by Sarah Pflug on Burst.)



It takes only 50 milliseconds (that is just 0.05 seconds) for people to decide if they like or hate your website (SWEOR, 2019).

A user's first impression of your website is therefore incredibly important, but the brief moment in which they show it is far too short to analyze yourself.

With the emotion detection AI, you can watch people look up your website, catch their reaction as they first lay eyes on it, and see that reaction change over their first few seconds on the site.

Image 8: Bounce rate visualized in a graph (Emotion detection of happiness and surprise on UXTesting.)



But then what emotions do you want to see? 

If you see a spike in any negative emotions, then people will likely leave right away. Nothing surprising about that.

But no clear expression is just as bad. If people open your website with an emotionless stare, they feel bored and uninspired. When people see nothing surprising on your website, they leave just as fast as when they see something negative: they did not find what they were looking for or get caught up in anything particularly interesting.

A peak in surprise and happiness is likely the best first reaction to see. Your users enter your website, immediately discover something they were looking for and are happy to stay on your site longer to complete their goal.

While watching people spend their first few seconds on your website, you can discover what surprises or entertains your users and what does not.
Then you can work out how to keep users on your site, and test again to measure whether your adaptation worked.
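Screening first impressions like this could even be automated with a few thresholds on the early seconds of each emotion series. The labels and cutoffs below are hypothetical illustrations, not part of the UXTesting product:

```python
NEGATIVE = ("anger", "contempt", "disgust", "fear", "sadness")
POSITIVE = ("happiness", "surprise")

def peak(emotions, names, window):
    """Highest percentage any of `names` reaches in the first `window` seconds."""
    vals = [v for n in names if n in emotions for v in emotions[n][:window]]
    return max(vals) if vals else 0.0

def first_impression(emotions, window=5, flat=10.0, spike=40.0):
    """Label the first-seconds reaction from per-second percentage series.

    'negative' -- likely to bounce right away
    'positive' -- something caught their eye
    'flat'     -- emotionless stare: bored and uninspired
    'mixed'    -- some expression, but no clear spike either way
    """
    if peak(emotions, NEGATIVE, window) >= spike:
        return "negative"
    if peak(emotions, POSITIVE, window) >= spike:
        return "positive"
    if peak(emotions, NEGATIVE + POSITIVE, window) < flat:
        return "flat"
    return "mixed"
```

With labels like these you could sort a batch of test videos by first impression and review the negative and flat ones first.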


Readability


Image 9: Reading the unreadable (Photo of man with hands on temple by Bruce Mars on Pexels.)


Do you know that moment when the text on a site is so small that you end up staring with your face just inches from the screen, just to make out one word?

Image 10: Reading the unreadable in a graph (Emotion detection focusing to read on UXTesting.)



While you squint at your screen, the AI marks this moment with flat hills of sadness and contempt in the graph as you wonder whether your eyes are really that bad.

Consider adjusting the font size or the contrast between the text and the background. Then run your tests again with the new site and compare the reactions across the two tests to see if it worked.


Giving up


Image 11: Giving up and moving on (Photo of businessman leaning back by Energepic.com on Pexels.)


Have you ever tried to use a website where every second was awful, until you decided it was just not worth the effort? Do you remember the feeling of happiness or pride as you shrugged off whatever you were trying to do and started closing the tabs of that website?

Image 12: Giving up and moving on (Emotion detection of giving up on UXTesting.)


The AI displays this feeling wonderfully with a sudden spike in surprise or happiness at the end of a video. The moment of “I give up” as you decide that whatever task you had to do is fine like this and you move on with your life.

A moment like this can be when your users leave your website and go straight to your competitor's site. So it is important to know when people want to leave and what happened earlier to bring them to that point. Once you know what makes users want to leave, you can find ways to make them stay.


Conclusion


All in all, these are just some of the cases we came across where emotion detection technology can help you discover more in usability testing.
It is fascinating to see how surprisingly accurate the AI is in understanding your reactions, regardless of culture and ethnicity, as it can pick up the subtle displays of annoyance that differ between cultures.

Even if a reaction does not match one of the AI's supported emotions, the software will still mark the moment as an interesting reaction so you can consult the video and review the reaction yourself.

But we would love to hear what insights you can discover with our AI. 

Try it out for yourself and see what insights your feelings can show.


References:

  1. Snapchat’s Filters: How computer vision recognizes your face
    https://medium.com/cracking-the-data-science-interview/snapchats-filters-how-computer-visin-recognizes-your-face-9907d6904b91
     
  2. Snapchat Filters: How Do They Work?
    https://dev.to/aubreyarcangel/snapchat-filters-how-do-theywork-4c83
     
  3. 27 Eye-Opening Website Statistics. SWEOR, 2019.
    https://www.sweor.com/firstimpressions


Authors

Ivo Bout
Business Development at UXTesting