Affective Computing Age

Sherry Courington


One of humanity's most distinctive characteristics is the ability to read others' expressions and infer the emotions they're experiencing. This is a very early trait we acquired before language and writing. Together with our ability to question our origin and long-term future, it is part of what makes us a unique species, though a simpler form is present in other vertebrates, whose young signal states like hunger to their parents.

Emotion gives us a depth of understanding and experience we wouldn't have otherwise, first physically and later cognitively. The absence of consciousness in computers has always led us to conclude that they are like zombies, absent emotions and motivation. It's easy to dismiss computers as dumb machines. But consider the changes our information age has brought. Google almost any question and Google's spiders, having crawled an immense web of pages, surface an answer from a vast amount of data amalgamated from an even larger number of user inputs.

This sets the stage for establishing a Theory of Mind (ToM) in machines: programming them to represent and exhibit emotions as numbers (rankings).

“The greater the difference between the two entities, the greater the need for a well-designed interface.” – Brenda Laurel.

Imagine that your smartphone is able to interpret your tone of voice and expressions and infer what you must be feeling. Affective Computing (AC) is a term coined by Rosalind Picard at MIT. It is achieved by cameras and voice recognizers reading your facial expressions and vocal nuances.
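To make the idea of "emotions as numbers" concrete, here is a minimal, purely illustrative sketch in Python. The input features (smile intensity from a camera, pitch variance from a microphone) and the weights are invented assumptions for this example, not taken from Picard's work or any real affective-computing system.

```python
# Hypothetical sketch: turning raw sensor features into ranked emotion scores.
# Feature names and weights are invented for illustration only.
from dataclasses import dataclass

@dataclass
class AffectReading:
    smile_intensity: float       # 0.0 (neutral face) to 1.0 (broad smile), from a camera
    voice_pitch_variance: float  # 0.0 (flat voice) to 1.0 (highly varied), from a microphone

def score_emotions(reading: AffectReading) -> dict[str, float]:
    """Map sensor features to numeric emotion scores, then rank them."""
    scores = {
        "happiness": 0.7 * reading.smile_intensity + 0.3 * reading.voice_pitch_variance,
        "boredom":   0.8 * (1.0 - reading.voice_pitch_variance),
        "surprise":  0.6 * reading.voice_pitch_variance,
    }
    # Sort strongest to weakest so the result reads like a ranking.
    return dict(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))

if __name__ == "__main__":
    reading = AffectReading(smile_intensity=0.8, voice_pitch_variance=0.4)
    print(score_emotions(reading))
    # {'happiness': 0.68, 'boredom': 0.48, 'surprise': 0.24}
```

Real systems replace hand-picked weights with trained models, but the output is the same kind of thing: a ranked list of numbers standing in for feelings.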

Technological innovation unfolds much more quickly than biological evolution, exponentially instead of linearly, and because we live and think linearly, exponential change usually takes us by surprise. One consequence has been an increasing need for ever more sophisticated Graphical User Interfaces (GUIs) providing audio and video (YouTube, Google Voice Search, Google Maps) for our convenience and entertainment.

MIT is already pursuing this line of research: the digitization of affect, or mood, and it may make our PCs our best friends. Digitization of affect is the term psychologists use for the emulated display of emotion in machines and robots. Japan already has hotels where a human is not required for check-in; robots greet guests, take payment, and carry luggage to rooms.

The absence of consciousness in machines has always allowed us to regard them as nothing more than tools, programmed by us to extend our own abilities. And yes, consciousness is a vague term. But let's call it the ability to reason from our somatic (bodily) sensations: the hairs standing up on the back of your neck tell you you're frightened.

Suppose that future computers can learn, via sensors placed on our bodies, what our sensations and expressions are, and collect this data over a long period of time. I realize it sounds far-fetched, but consider that you probably never expected a computer to be capable of navigating and driving your car. Today, this is being done.

Suppose the following: you wear a smartwatch during your nightly sleep. It tracks your deep sleep, light sleep, and movement.

I wear one (a VivoSmart HR+) that does that very thing. My phone displays the results each morning, advising me of the quality of my rest.

Now, imagine the phone on your nightstand takes in that collected data while you sleep. When you wake, it says in a benevolent tone:


“Your deep sleep looks peaceful. You should feel very productive during your 8:00 am meeting with your boss today. I suggest you wear your Navy Herringbone suit for your quarterly presentation. The last time you wore it at work, you slept much better the following night. Your paperwork is by the front door. I suggest you hit the gym to relieve your hump day stress. Have a fabulous Wednesday!”
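As a rough sketch of how a phone might assemble a morning summary like the one above from a night of tracker data, consider the following; the field names, thresholds, and wording are made up for this example and do not come from any actual device or API.

```python
# Hypothetical sketch: turning tracked sleep phases into a short morning summary.
# Field names and thresholds are invented for illustration only.

def summarize_sleep(deep_minutes: int, light_minutes: int, movement_events: int) -> str:
    total = deep_minutes + light_minutes
    restful = deep_minutes >= 90 and movement_events < 20  # arbitrary example thresholds

    parts = [f"You slept {total // 60}h {total % 60}m, with {deep_minutes} minutes of deep sleep."]
    if restful:
        parts.append("Your deep sleep looks peaceful; you should feel productive today.")
    else:
        parts.append("You were restless last night; consider an earlier bedtime tonight.")
    return " ".join(parts)

if __name__ == "__main__":
    print(summarize_sleep(deep_minutes=110, light_minutes=290, movement_events=12))
```

The suit recommendation and meeting reminder would require joining this with calendar and wardrobe data, which is exactly the kind of long-term personal data collection imagined above.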


Your trusted smartphone may know your unconscious life and motivations better than you do.

Science fiction is usually written around such imagined creations, but the fact is that this technology is already helping autistic children who can't read the meaning of other people's expressions. It's being used to educate them and to adjust to their differences. And as with all technology, where there is a purpose, or a marked increase in productivity, further applications and development inevitably follow.

Imagine what Madison Avenue could accomplish by reading our emotions and having chatbots email us coupons that appeal to our window-shopping euphoria, catering to our lustful desires with discounts. Mon dieu!

What will be the post-information-age outcome of our becoming the somatic sensory interfaces for computers? The need for ever more sophisticated GUIs may bring ToM into widespread existence, with computers that emulate and understand our emotions and their origins. This is the digitization of affect (mood), and it may make our PCs and smartphones our best friends.

Elon Musk, Stephen Hawking, and Nick Bostrom have spoken of what may happen as we move closer to computers able to think and reason as well as, or better than, humans. Most believe consciousness is extremely unlikely to arise out of brittle software, either spontaneously or by design.

What are your thoughts on Affective Computing (AC)? How might it change our futures?


