Synthetic emotions: HAL on the desktop

Twelve scientists indirectly paid tribute to HAL, the computer from 2001: A Space Odyssey, when they met in Austria to discuss the topic of emotions in humans and artifacts.

Remember HAL from 2001? HAL knew how far you could get with humans by employing emotions, even as a mere computer. Scientific research published during the past few years points in the same direction, spurring further research under tags like synthetic emotions, affective computing and affectware.

The main aim of the research is to minimise the frustration humans often feel when trying to work with computers. Continuing efforts to make software user-friendly have not eliminated that frustration; some users even become more frustrated, for example by some applications' automated "know-it-all" correction features, according to researchers.

Synthetic emotions, a hot topic because of its broad appeal, has mostly attracted the attention of computer scientists from the areas of HCI (human/computer interaction) and AI (artificial intelligence). However, the Vienna meeting is a step toward bridging the gap between people working with emotions in computers and people working with real emotions.

The participants in Vienna included a philosopher, a neuroscientist, a psychologist, people from the interactive gaming industry and several computer scientists. Among the computer scientists were Gene Ball, a senior researcher at Microsoft, and Rosalind Picard, an associate professor at the Media Lab at the Massachusetts Institute of Technology (MIT), who made headlines in the US after publishing the book Affective Computing.

"My hope is that we can learn from each other," said Professor Robert Trappl, who invited the scientists to the Vienna meeting. Trappl heads the Department of Medical Cybernetics and Artificial Intelligence at the University of Vienna.

"We, who work on artificial intelligence, made the mistake of ignoring the subject of emotion. It was considered as just a disturbing factor. However, emotions are what help humans to survive and develop."

Microsoft's Ball was intrigued to hear a variety of views on emotions. "It [synthetic emotions] certainly takes a combination of knowledge to develop," he said. Ball is working on a computer-based architecture for modeling emotions and personalities. Such models will keep an "eye" and an "ear" on the user to detect whether he or she is happy or sad, excited or calm. Some researchers even look for signs that the user is disappointed with the computer.

Users' choice of words, the pace and rhythm of their speech, their facial expression, posture and gestures are among the signs that can be monitored and analysed. Based on such observations, a computer could then adjust itself to the mood of the user, sending out appropriate responses.
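
As a rough illustration of the idea, such observations could be condensed into a simple mood estimate. The sketch below is not any particular research system; the signal names, weights and thresholds are all invented for the example:

    # A toy mood estimator: condenses a few observed signals into a
    # rough valence/arousal estimate. Signal names and weights are
    # invented for illustration, not taken from any research system.

    def estimate_mood(words_per_minute, negative_word_ratio, smiling):
        """Return (valence, arousal), each clamped to the range -1..1."""
        clamp = lambda x: max(-1.0, min(1.0, x))
        # Unusually fast speech suggests agitation (high arousal).
        arousal = clamp((words_per_minute - 120) / 120)
        # Smiling raises valence; negative wording lowers it.
        valence = clamp((0.5 if smiling else -0.2) - negative_word_ratio)
        return valence, arousal

    print(estimate_mood(words_per_minute=190,
                        negative_word_ratio=0.3,
                        smiling=False))        # roughly (-0.5, 0.58)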

A user who is about to blow their top because of technical problems could have that frustration acknowledged by the computer and vent their feelings, as well as receive some extra help in solving the problem. A user who appears to be deep in concentration on a particular task would be "disturbed" by the computer as little as possible, whereas a less involved user might receive some helpful hints from the computer, along with a joke.
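
A crude response policy along those lines might look as follows; again a sketch, with the categories and thresholds invented for the example:

    # A toy response policy for the scenarios above: given an estimated
    # mood and the user's apparent involvement in a task, decide how the
    # interface should behave. All categories and thresholds are invented.

    def choose_response(valence, arousal, deeply_focused):
        if valence < -0.3 and arousal > 0.3:
            return "acknowledge frustration and offer extra help"
        if deeply_focused:
            return "stay quiet and hold non-critical notifications"
        return "offer a helpful hint, perhaps with a joke"

    print(choose_response(valence=-0.5, arousal=0.6, deeply_focused=False))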

The software for recognising user emotions is seen as a stepping stone to having a so-called "spoken conversation" take place between user and computer. But it can be used, to a certain degree, when the conversation takes place via keyboard and screen, Ball said. He expects synthetic emotions to be used first by the computer game industry.

Ball doesn't believe that synthetic emotions will result in major changes. "The advantage will be marginal for the user. A little more ease of use," he said. "I don't know any situation where it will make or break the success of a system."

But Ball worries about the subject attracting too much attention, "controversial as it is". Although some people may get excited about the technology, there may be a backlash against it from others who fear its uses, Ball said. He also is worried about the possibility that synthetic emotion technology will be misused, for example, by marketers.

"It is not ethical for synthetic emotions to be used to coerce consumers, and we in the industry have to be vigilant, so when marketers try to do it, somebody must speak out," Ball said.

Furthermore, he is concerned about the "spoken conversation" between user and computer. "There are a lot of conventions in spoken conversation, socially and emotionally, and we have to be very sensitive to that. Unless a computer can speak according to the conventions, it [the conversation] will seem very flat," Ball said.

But today's computers have another problem - their own perceived personalities. "Even if it is very mechanical, it [the computer] still transmits a somewhat aggressive, annoying personality," Ball said, referring to software that instructs users to do tasks in a fixed, predetermined way.

People react negatively to this "passive-aggressive" personality. But research also shows that people react positively when participating in experiments in which computers use praise and humour.

Humans are even afraid to hurt computers' "feelings", and they perceive a computer with a male voice as being a more effective teacher on all subjects than a computer using a female voice, except when it comes to the subject of relationships, researchers have found.

Ball works closely with two social scientists at Stanford University who were the first to publicly unveil research on how people and computers relate. Professor Byron Reeves and Associate Professor Clifford Nass, both in Stanford's Department of Communication, wrote the book The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places, published in September 1996.

Microsoft used the research carried out by Nass and Reeves to develop a new, more conversational way for users to communicate with their PCs. Dubbed "Bob", the user interface featured a dog character that "guided" users through tasks. It was released in 1995 and never caught on. However, experience from the project has since been applied to Microsoft's continuing work on so-called "conversational interfaces", such as the paperclip and Genie characters in Microsoft's Office applications suite.

The latest research by Nass and Reeves shows that most users prefer computers with personalities mirroring their own, whether extroverts or introverts. A person who would tell the computer to "do this" most likely expects a more direct dialogue than a person who would say "Could you please do this?"
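
In code, such mirroring could start from something as crude as the phrasing itself. The sketch below (the marker list is invented) answers polite requests politely and direct commands directly:

    # A toy mirroring heuristic: classify the user's phrasing as polite
    # or direct, and answer in kind. The marker list is invented.

    POLITE_MARKERS = ("please", "could you", "would you mind")

    def mirrored_reply(request, result):
        polite = any(marker in request.lower() for marker in POLITE_MARKERS)
        return "Certainly. " + result if polite else result

    print(mirrored_reply("Could you please sort this list?", "List sorted."))
    print(mirrored_reply("Sort this list.", "List sorted."))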

The researchers suggest that software designers should consider a user's emotional state as a factor when developing the interactive technology, even in text-only interactions.

Professor Picard believes emotion is a key ingredient of human intelligence and intelligent interaction. Therefore, program designers are making a mistake by completely avoiding emotions, she says. However, synthetic emotions should be used with caution, she warns.

If a person, for example, hates it when the Microsoft Office paperclip winks at them, then it is the system's responsibility to change that software behaviour. In other words, the paperclip should turn off its own winking, according to Picard. "The user should feel respected and in charge, not manipulated, and not belittled," Picard said.
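
A minimal sketch of that principle, with the class, threshold and reaction log all invented for the example, could look like this:

    # A toy version of Picard's principle: if a decorative behaviour
    # keeps drawing negative reactions, the software disables it by
    # itself. The class, threshold and reaction log are all invented.

    class AnimatedAssistant:
        def __init__(self, annoyance_limit=3):
            self.winking_enabled = True
            self.negative_reactions = 0
            self.annoyance_limit = annoyance_limit

        def record_reaction(self, negative):
            if negative:
                self.negative_reactions += 1
            if self.negative_reactions >= self.annoyance_limit:
                self.winking_enabled = False   # back off without being asked

    assistant = AnimatedAssistant()
    for negative in (True, True, True):
        assistant.record_reaction(negative)
    print(assistant.winking_enabled)   # False: the winking turned itself off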

However, within the HCI field there are some who are sceptical about synthetic emotions, and Lucy Suchman, principal scientist at Xerox's Palo Alto Research Center (PARC) in the US, is one of them.

Suchman has worked at PARC since 1979, doing research concerning the relation of everyday working practices to computer systems design.

The main aim of her research is to make designers aware of how computer users work well with their systems, pointing out practices that should be supported by the software and not, as often happens, changed by it.

Her group also looks at how designers can encourage working practices that improve the way users interact with computers.

For Suchman the most important aspect of the research is to make the system as easy to use as possible, thus reducing the potential for user frustration. As she sees it, the primary drivers behind the new HCI trend toward emotions have less to do with usability than with the fascinations and fantasies of computing communities, on the one hand, and the interests of marketers, on the other.

With regard to using synthetic emotions, Ball and Suchman were asked about a specific e-mail example: should the system caution a user who is observed to be hot-tempered against sending a potentially inflammatory e-mail, with a suggestion like "If it is not urgent, you might consider leaving it in the draft box till tomorrow"?

Microsoft's Ball liked the idea, comparing it to what many secretaries have done when transcribing notes for their bosses. Suchman didn't.

"Based on what I have seen to date, I'm sceptical. My inclination is to think that the criteria used to instruct the system to utter such warnings would be subject to so much ambiguity that the warnings wouldn't be very reliable," she said.

Suchman also fears that systems designed to be affective will reproduce stereotypical and trivialised conceptions of humans.

"Furthermore they mystify computing, when we should be working to make it more understandable," she said.

The double-edged sword of science

The co-author of an algorithm that serves as the basis for research on how computers can interpret human interactions for better interactivity is himself rather indifferent about where that research is heading. He is concerned, however, about other uses of his algorithm that may end up having computers make decisions for humans.

"Computers ought to be more courteous, and if my algorithm can be used for that purpose, it is fine with me," said Finn Verner Jensen, a professor at Aalborg University in Denmark.

Gene Ball, a senior researcher at Microsoft, is using Jensen's algorithm to develop a computer-based model of emotions and personalities (so-called synthetic emotions) for use in technology intended to improve the interaction between humans and computers, including affective computing, in which a computer soothes a frustrated user.

Jensen himself does not think there is a need for affective computing, and he doesn't see much chance of the technology leading to people perceiving computers as more human-like. "I doubt whether today's children will perceive the computer as a person when they grow up," he said.

Jensen's algorithm is also currently considered the most powerful for probability calculations in Bayesian networks, which are used for automated reasoning in areas where there is uncertainty because knowledge is incomplete.

Although Judea Pearl, a professor at the University of California, Los Angeles, is considered the father of Bayesian networks, they are named after the Reverend Thomas Bayes. Bayes, a London-born clergyman who died in 1761, first established a mathematical basis for probability inference, a means of calculating the probability that an event will occur in future trials based on the number of times it has occurred in past trials.
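
His classic result, often called the rule of succession, fits in a few lines: after an event has occurred s times in n independent trials, its estimated probability of occurring in the next trial is (s + 1) / (n + 2). A minimal sketch, with an invented print-job example:

    # Bayes' rule of succession: after s occurrences in n trials, the
    # estimated probability of occurrence in the next trial is
    # (s + 1) / (n + 2). The print-job example is invented.

    def rule_of_succession(occurrences, trials):
        return (occurrences + 1) / (trials + 2)

    # A print job has failed 3 times in 10 attempts:
    print(rule_of_succession(3, 10))   # 0.333...: chance it fails next time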

Jensen's algorithm is used in several other research areas at Microsoft, and Hewlett-Packard has recently established a research lab in Aalborg focused on predicting technical problems by using Bayesian networks.

Other uses of the technology, though, have Jensen worried. The military uses Bayesian networks as so-called "scene interpreters", for example in making decisions about taking aggressive action in particular situations. The technology is used to guide officers by asking questions like: is this object a tank? If yes, are there signs of hostile intent? If yes, should we fire at the tank as a precaution?
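
That chain of questions maps naturally onto conditional probabilities. A minimal sketch, with all numbers invented, shows how such a "scene interpreter" might combine the answers:

    # A toy "scene interpreter" chain with invented probabilities:
    # P(recommend fire) = P(tank) * P(hostile | tank) * P(fire | hostile).

    p_tank = 0.7                 # confidence the object is a tank
    p_hostile_given_tank = 0.4   # signs of hostile intent, given a tank
    p_fire_given_hostile = 0.9   # doctrine: fire on hostile tanks

    p_recommend_fire = p_tank * p_hostile_given_tank * p_fire_given_hostile
    print(round(p_recommend_fire, 2))   # 0.25: weak grounds to fire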

The technology is used in space vehicles, tanks and submarines that do not have humans on board, according to Jensen.

"In the long run, such unmanned crafts, provided with computer-based scene interpreters, will be sent to war in order for us to avoid loss of lives," Jensen said. "That I find scary."

More information on the research done by Gene Ball, senior researcher at Microsoft Research, can be found at www.research.microsoft.com/~geneb/pubs/um99.doc. More information on Professor Rosalind Picard and her research can be found at vismod.www.media.mit.edu/people/picard/.

