Australians have never really been an emotional bunch. But if some of our researchers get their way, we might just be showing our feelings at work with a little help from some innovative IT.
Aiming to incorporate emotional characteristics into our interactions with computers, Rhajiv Khosla, associate professor of information systems at La Trobe University, has devised what he calls emotionally intelligent information and communication technology (ICT).
"We are trying to make computers more human-like in their design and how they kind of personalise information and assist in decision making," Khosla said. "When the PC came it started isolating people in terms of the environment in which they work. That's where my work started in the mid-1990s - I wanted to work on a human-centred approach to the design of systems."
Working in conjunction with NEC and universities in India, Japan and Singapore, Khosla's system measures emotional state changes through a video and image processing system that captures facial expressions and body language via a tiny camera built into the monitor. The camera is sensitive enough to capture fluttering eyelids. The captured images are then processed, compiled into an emotional profile and compared against a cognitive profile generated when users answer a set of behavioural questions. All of this happens in real time.
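The article gives only a high-level picture of that pipeline, but the flow — per-frame emotion scores compiled into an emotional profile, then compared against a cognitive profile built from behavioural questions — can be illustrated with a toy sketch. Everything below (the function names, the -1..1 valence scale, the moving-average smoothing, the Likert rescaling) is an illustrative assumption, not Khosla's actual implementation.

```python
# Hypothetical sketch of the described pipeline: smooth raw per-frame
# valence scores into an "emotional profile" and compare each reading
# against a "cognitive profile" baseline from behavioural questions.
# Scoring scheme and names are illustrative assumptions only.
from statistics import mean

def emotional_profile(frame_scores, window=3):
    """Smooth raw per-frame valence scores (-1..1) with a moving average."""
    return [mean(frame_scores[max(0, i - window + 1):i + 1])
            for i in range(len(frame_scores))]

def cognitive_profile(answers):
    """Reduce behavioural-question answers (1-5 Likert scale) to a -1..1 baseline."""
    return mean(answers) / 5 * 2 - 1

def compare(frame_scores, answers):
    """Label each smoothed reading positive or negative relative to the baseline."""
    baseline = cognitive_profile(answers)
    return ["positive" if s >= baseline else "negative"
            for s in emotional_profile(frame_scores)]

print(compare([0.2, 0.5, -0.4, -0.6, 0.1], [4, 3, 4, 5]))
```

A real system would of course derive the per-frame scores from video via facial-expression and body-language analysis; this sketch starts from the scores to keep the comparison step visible.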
For movie buffs it sounds remarkably like the eye-analysis machine Harrison Ford's character, Deckard, uses on suspected replicants in the cult classic Blade Runner.
"We don't focus on any particular emotion, whether the person is happy, sad or angry or whatever," Khosla said. "What we have done is focus on the changes in emotional states as the users are interacting with these devices. What we are more concerned about is whether they have a positive emotional state or a negative emotional state."
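Khosla's distinction — tracking shifts between positive and negative states rather than labelling discrete emotions — can be sketched as a simple delta detector. The threshold value and the per-frame valence inputs here are assumptions for illustration, not details from the actual system.

```python
# Hypothetical sketch: report only the direction of change in emotional
# state, not which emotion is present, per Khosla's description.
def state_changes(valence_scores, threshold=0.1):
    """Classify each consecutive change in valence as positive, negative or stable."""
    changes = []
    for prev, cur in zip(valence_scores, valence_scores[1:]):
        delta = cur - prev
        if delta > threshold:
            changes.append("positive")
        elif delta < -threshold:
            changes.append("negative")
        else:
            changes.append("stable")
    return changes

print(state_changes([0.0, 0.3, 0.25, -0.2]))
```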
The professor identified human resource management, along with healthcare, law enforcement (interrogations), driving instruction, tourism and emergency services, as possible users of the technology.
In particular, Khosla pointed to recruitment as one industry the university has already targeted. In one example, a recruitment firm or HR department interviews a sales candidate, evaluating them on different areas of the job through direct, industry-specific questions.
"As they are providing cognitive responses to these questions, and there are 76 of them, there is a small dot in the upper half of the monitor, maybe 2mm in diameter, which is also taking the video stream," Khosla said.
"Even though people know there is a camera looking at them, as they start to answer and get involved in the interaction they just forget about it."
The system then compares the candidate's emotional responses, read from body language and facial expressions, against their cognitive responses and provides feedback to recruiters.
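One plausible form that recruiter feedback could take is flagging questions where the stated answer and the observed emotional state diverge. This is purely an illustrative sketch under assumed -1..1 scores; the article does not describe the actual feedback format.

```python
# Illustrative sketch (not the actual system): flag interview questions
# where a candidate's cognitive answer score and observed emotional score
# diverge by more than a gap, and summarise overall consistency.
def recruiter_feedback(cognitive, emotional, gap=0.5):
    """Return 1-based indices of divergent questions plus a 0..1 consistency score."""
    flagged = [i + 1 for i, (c, e) in enumerate(zip(cognitive, emotional))
               if abs(c - e) > gap]
    consistency = 1 - sum(abs(c - e) for c, e in zip(cognitive, emotional)) / len(cognitive) / 2
    return {"questions_flagged": flagged, "overall_consistency": consistency}

print(recruiter_feedback([0.8, 0.6, 0.9], [0.7, -0.2, 0.8]))
```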
"You need to factor in this very important non-verbal information. People are made up of emotions and values," Khosla said.
The system could also be integrated with a regular desktop to gauge your mood and state of mind when you start work, Khosla noted. That reading could then be monitored by management, or by the machine itself, to potentially automate responses. But that's not all.
"It can watch your weight in an emotionally intelligent manner and give you advice," Khosla offered.