Timbre: designing with AI part 2

The technological possibilities and social implications for designing AI through human personas are vast.

30.06.16 | Research

By Mike Shorter


Timbre is a radio that plays content depending on your emotion. If you're happy it plays happy music; if you're sad, sad music; and so on. Pretty simple. Timbre attends to your every emotional need, creating a perfect, seamless soundtrack to your life.

This prototype came to fruition after a week-long design sprint exploring artificial intelligence (AI). At Uniform we had yet to dip our toes into the world of AI, despite it being something we find fascinating, not only through a technological lens but also a philosophical one. One idea in particular that captivated us was around agency and deception: trying to fool AI, as well as AI trying to fool you.


As you walk up to Timbre, the metronome-style feature starts moving, letting you know that it can see you. As you stand in front of Timbre, your eyes appear on the screen as two circles. The circles track your eyes; as you move around, so do they. When the radio recognises your emotion, the eyes start to spin round in a circle, suggesting something is playing. A track begins that reflects your emotion. As you walk away, the eyes fade and the metronome feature stops moving.
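For illustration only, that interaction can be read as a small state machine. The sketch below uses invented state names and an invented observation shape; it is a hedged approximation of the behaviour described above, not code from the prototype.

```typescript
// Illustrative sketch of Timbre's interaction states; names are invented,
// not taken from the actual prototype.
type TimbreState = "idle" | "approaching" | "tracking" | "playing" | "fading";

interface Observation {
  faceVisible: boolean;       // someone is in front of the radio
  eyesLocated: boolean;       // the face model has found the eyes
  emotionRecognised: boolean; // an emotion value is available
}

function nextState(current: TimbreState, obs: Observation): TimbreState {
  if (!obs.faceVisible) {
    // Walking away: the eyes fade out and the metronome stops.
    return current === "idle" ? "idle" : "fading";
  }
  if (!obs.eyesLocated) return "approaching";    // metronome starts moving
  if (!obs.emotionRecognised) return "tracking"; // circles follow your eyes
  return "playing";                              // eyes spin, a track begins
}
```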

Timbre uses CLMTrackr (a JavaScript library for fitting facial models to faces in videos or images) to determine your emotion. This emotion is converted into a value between 0 and 1 (0 = sad, 1 = happy). Spotify already generates a valence value for each song in its database: a measure from 0.0 to 1.0 describing the musical positiveness conveyed by a track. Tracks with high valence sound more positive (happy, cheerful, euphoric), while tracks with low valence sound more negative (sad, depressed, angry). The prototype then talks to the Spotify API to play a (hopefully) relevant track from a playlist, dependent on your mood.
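As a rough illustration of that matching step (not the prototype's actual code), the sketch below assumes we already have an emotion score from the face tracker and a playlist whose tracks have been annotated with Spotify's valence values; it simply picks the track whose valence sits closest to the detected mood.

```typescript
// Illustrative only: assumes an emotion score in [0, 1] from the face tracker
// and a playlist annotated with Spotify valence values (0.0-1.0).
interface Track {
  uri: string;     // Spotify track URI
  valence: number; // 0.0 = negative/sad, 1.0 = positive/happy
}

// Pick the playlist track whose valence best matches the detected emotion.
function pickTrack(emotion: number, playlist: Track[]): Track | undefined {
  return playlist.reduce<Track | undefined>((best, track) => {
    if (!best) return track;
    return Math.abs(track.valence - emotion) < Math.abs(best.valence - emotion)
      ? track
      : best;
  }, undefined);
}

// e.g. pickTrack(0.2, playlist) would return a low-valence (sadder) track.
```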


During our initial exploration we extrapolated three characteristics of AI: the Buddy, the Butler and the Police. Timbre could have embodied any one of these. If it embodied the Buddy characteristic, it would give you options, but ultimately let you make the decision. The options would be things that it thinks you want to hear - friendly recommendations, like a mix CD from a friend with great taste. If the music player embodied the Butler characteristic, it would give you what it thinks you want automatically. If you are sad it would play sad music, allowing you to wallow in your misery (as we naturally tend to do). If the radio embodied the Police characteristic, it would try to do things for the greater good, such as turning the volume down at a certain time of day, or refusing to play explicit tracks before the watershed.
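To make the distinction concrete, the three personas can be read as three decision policies sitting in front of the same player. The sketch below is purely illustrative - the names, shapes and thresholds are invented for this post, not taken from the Timbre build.

```typescript
// Purely illustrative: three personas as policies over the same request.
// All names, shapes and thresholds here are invented for this sketch.
interface Request {
  emotion: number;   // 0 = sad, 1 = happy (from the face tracker)
  explicit: boolean; // whether the best-matching track is flagged explicit
  hour: number;      // local hour of day, 0-23
}

interface Decision {
  play: boolean;      // play automatically?
  askFirst: boolean;  // offer options and let the listener choose?
  volumeCap?: number; // optional cap on volume, 0.0-1.0
}

// The Buddy suggests, but ultimately leaves the choice to you.
const buddy = (_req: Request): Decision => ({ play: false, askFirst: true });

// The Butler gives you what it thinks you want, automatically.
const butler = (_req: Request): Decision => ({ play: true, askFirst: false });

// The Police weighs the greater good: nothing explicit before the watershed,
// and a volume cap late at night.
const police = (req: Request): Decision => ({
  play: !(req.explicit && req.hour < 21),
  askFirst: false,
  volumeCap: req.hour >= 22 ? 0.3 : undefined,
});
```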

We chose the Butler as the AI persona to explore with Timbre. It raises the question of obedience: do we want what we think we want? Future iterations of Timbre may see us explore the other two personas. Like many things we make here at Uniform, Timbre is a question rather than an answer. What characteristics of AI should products take on? How do those design decisions shape our relationship with the eventual AI? Each persona has its challenges and benefits. We go into greater detail on the thinking behind our exploration in our latest blog post, Designing with AI.

One challenge with Timbre was navigating a visual language still in its nascency. As with many new technologies, we tend to take our cues from sci-fi literature and films. How do we know when an object is scanning our faces for some sort of data, whether emotion, age, sex or even name? Does there have to be a screen to make this explicit, or can this be a more subtly designed interface? We found that when the radio was in debugging mode, users were more inquisitive. They could see it was watching them, and were drawn in to explore what was happening.

Another complexity is one that will likely improve as AI algorithms grow in sophistication: the issue of generic features. Why does Timbre always think I'm angry? I thought I was quite a happy person. Turns out my monobrow fools the AI emotional robot! And why is Leo always so sad in AI land? It's his beard. It fools the system and makes him look downcast. AI has a lot more to learn before it can navigate our physical individuality. Or, on the other hand, we can all start grooming ourselves to look the same, and train our smiles and frowns into the perfect shapes for AI to recognise…

Another benefit of designing through the lens of personas is that human characteristics leave room for forgiveness. A Butler can drop a drink; a seamless experience cannot. You don't expect a robot to make mistakes, but you do expect them from your buddy, butler or police. We found that giving AI this constructed human identity fostered a healthier relationship with the technology.

The technological possibilities and social implications for designing AI through these human personas already seem endless. Watch this space for future forays into Artificial Intelligence.

 
