Taking cues from humans

Chris Zhang wants to build machines that can understand human emotion to help people make better decisions.

By Michael Robin

"People have emotions, they react to them and base decisions upon them," explained Zhang, a professor of mechanical and biomedical engineering. "If machines cannot understand human emotions, communications are compromised."

Zhang and his team are working to incorporate emotional cues into how people interact with machines. An example is the warning light that comes on when a car is getting low on gas; the light will prompt some drivers to pull in at the next gas station, while others will decide to put it off until later. There is no emotional cue to tell the driver how urgent the warning is.

"If you could give it an interface that is emotional, it could help make the human take action at the proper time," Zhang said.

But to deliver the right response, machines need more and better inputs from the human side of the conversation.

To get these inputs, Zhang's team used cameras and sensors to capture information on blood pressure, heart rate, skin conductivity (think sweaty palms) and eye movement. For example, rolling the eyes could signify fatigue or exasperation, while a wandering gaze might indicate boredom.

These data are analyzed and interpreted by custom-written computer software to predict human emotions. Zhang said the system can accurately predict a person's emotional state about 90 per cent of the time.
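The article does not describe how the software maps sensor readings to emotions, so the following is only an illustrative sketch: a toy rule-based classifier over hypothetical normalized features (heart rate, skin conductance, gaze fixation). The feature names, thresholds and labels are assumptions for illustration, not details of Zhang's system.

```python
def classify_emotion(heart_rate, skin_conductance, gaze_fixation):
    """Return a coarse emotional label from sensor readings normalized to 0-1.

    Hypothetical rules: a combined 'arousal' proxy from heart rate and
    skin conductance, plus gaze fixation as a focus signal.
    """
    arousal = 0.5 * heart_rate + 0.5 * skin_conductance
    if arousal > 0.7 and gaze_fixation < 0.3:
        return "frustrated"   # high arousal with an unfocused, wandering gaze
    if arousal < 0.3 and gaze_fixation < 0.3:
        return "bored"        # low arousal and a wandering gaze
    if arousal < 0.3:
        return "fatigued"     # low arousal but gaze still fixed on the task
    return "engaged"          # moderate arousal, attention on the task

# Example: elevated heart rate and sweaty palms, gaze drifting off-screen
print(classify_emotion(0.9, 0.8, 0.2))
```

A real system would replace these hand-tuned thresholds with a classifier trained on labeled recordings, which is presumably how a reported accuracy like 90 per cent would be measured.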

The work has many potential applications, including physical rehabilitation. One machine in Zhang's lab has the patient hold the end of a mechanical arm attached to a computer. The person manipulates the mechanical arm to move objects on a computer screen, mimicking a wrist rehabilitation exercise.

Sensors track the patient's performance and software infers when they are getting frustrated or fatigued.

"If we can understand the emotional state of the patient, we can know this state may significantly disturb the functional performance," Zhang said.

Now that the researchers have a system that can read a patient's emotional state, the next step is to create passive and active feedback systems. Zhang uses the analogy of gym equipment: an exercise bike is passive in that the user must decide to pedal, while a treadmill is active, forcing the user to keep moving. The team wants to take this one step further.

"We want both physical and mind," Zhang said. "This is the novel aspect of our approach."

For example, as the wrist rehabilitation system monitored emotions, it could cue messages to encourage the patient. The system could also prevent patients from overdoing it if it sensed they were pushing themselves too hard.

Zhang explained that one of the challenges is keeping the sensors unobtrusive and easy to use. "The key to having machines understand human emotion is to have sensors that can non-intrusively and non-obstructively get the signals from the human."

At home, the system would become a virtual partner. That is, the computer would learn from the patient and help them direct their own rehabilitation.

"My plan is not only management of patient function and performance, but also that emotions become active in rehabilitation. We would have on screen an advisor – like a friend."

"This whole project is based on the concept of home-based rehabilitation," Zhang said. "That is very important for Saskatchewan, where many people live far away from cities and major hospital facilities and they prefer to stay at home."