Artificial Intelligence and the Detection of Human Emotion

By Courtney Stone

April 2022

Luke Stark is an Assistant Professor at Western University and a science fiction fan who once spoke Klingon on Canadian national television.

When you hear the term “artificial intelligence,” do you picture robots taking over the world? Well, don’t worry about that apocalypse quite yet.

Luke Stark, an Assistant Professor at Western University, is taking a deep dive into AI and its ability (or lack thereof) to read and respond to human emotions. He is currently completing a book tentatively titled Ordering Emotion: Histories of Computing and Human Feelings from Cybernetics to AI, to be published by the MIT Press. The book examines the ways human emotion has been understood, ignored, quantified and handled by computer scientists from the 1940s to the present.

An understanding of human emotions allows developers of artificial intelligence systems to attempt to manipulate and control social interactions and relations between people online.

Key Discovery

The digital data related to human emotions collected for analysis by AI systems is not reliable. This unreliability poses a problem when AI systems are used to hire people for certain jobs, to assess whether a student is a good fit for a university, or to make other important decisions. These systems should not be used in high-stakes circumstances.

Your Google search results and your TikTok “For You” page are just two examples of online platforms that use artificial intelligence to filter the content you see. These systems learn about your interests based on what you click and search online. Then, they spit out the content you’ll be most likely to interact with to keep you locked onto your screen.

And that’s why before you know it, you’ve been scrolling on TikTok for two hours.

Stark says that developers use AI analysis to gain insights into human activities and behaviour with the goal of creating more accurate systems to filter and target people with content that will successfully sway their opinion or actions.

“That makes emotion really critical for a society as our social lives are totally mediated through these systems.”

The question is: can AI technology be trusted to work accurately and without bias? With corporations, governments, justice systems and health care providers all using this technology to make decisions that affect real people, how well does it perform?

Black and white photo of a man in a suit with a clipboard sitting next to a machine on a couch.

One major difference between technology and humans is that humans experience life and emotions subjectively. Different people respond differently to the same situation. Machines, by contrast, can only be programmed to produce standardized results. Because of this, Stark believes that AI systems will never fully succeed at tracking human emotions, and that your understanding of yourself differs from how technology understands you.

“Emotion also really highlights the fact that this desire to know people through these digital systems is always going to be incomplete because our experience of emotion is always subjective.”

Stark’s interest in artificial intelligence and human emotion started in the early 2000s when emojis were first used on platforms like MSN Messenger to convey additional emotional context to the message receiver in the absence of usual human cues.

“I remember thinking how weird it was that something we would usually convey through tone of voice, gesture, expression, often not giving as much of a thought, we had to then explicitly think about in the crafting of digital messages.”

“I remember thinking at the time, [emoji] were a sort of silly thing to research. I thought 'well, somebody could write a paper about that someday, but why would they?' And lo and behold, like 10 years later, I wrote one of the first scholarly papers on emoji.”

Stark’s early research on emojis helped him understand the nuances of communicating human emotion through technology. He keeps an eye on the constantly evolving ways to communicate human emotion online, from emojis to GIFs to stickers.

Over the course of his research, Stark says he was surprised to discover the significant influence older social movements have had on the computing industry today.

“Social theories from 100 years ago, like eugenics, that don't at first blush seem to have much to do with the history of computing, actually really shape the ideas about emotion, gender, race and society that were present at the beginning of the history of computing and have continued to influence and shape computing for the last 75 years.”

Take content filtering algorithms for example. Stark notes these systems still tend to prioritize wealthy, white voices over people of colour and those with lower incomes due to systemic racial and societal biases. AI systems skew the content we are exposed to online in ways we might not even notice.

His ultimate goal is to generate research findings that will affect policy decisions made by governments down the line and hopefully help to guard against the implementation of harmful AI technologies.

Stark’s interest in technology and artificial intelligence predates his professional work. As a lifelong science fiction fan, he’s always been fascinated by the role and potential of technology in human lives and society.

“Star Trek was one of the things that I really immersed myself in as a kid, to the extent that I—it’s embarrassing to admit this—but I once spoke in Klingon [an alien language spoken in the Star Trek franchise] on Canadian national TV.”

While Stark has yet to have a cameo in the Star Trek universe, his research is sure to make a dent here on planet Earth.

----

This profile is part of a series written by graduate students in MMJC 9604 Professional Writing, during the Winter 2022 term. Profiles have been edited by FIMS Communications staff for clarity.

Black and white image above: Advertisement for Digital Logistics Synthesizer, 1977 (Source: Archives Center, National Museum of American History, Smithsonian Institution).