When robots feel your pain

Computers will become prized assistants to psychiatrists

Science and technology

Isaac Asimov, Arthur C. Clarke, Gene Roddenberry and their futurist kin all expected robots one day to play a pivotal role in medicine. It is safe to say that systems as complex as the heart surgeon in Asimov’s “Segregationist” and the Emergency Medical Hologram from Roddenberry’s “Star Trek: Voyager” are not going to become reality in 2017. But artificial intelligence is now in a position to transform psychiatric hospitals for the better in the year ahead.

This breakthrough has been a long time coming. More than a decade ago researchers started feeding videos of people engaged in social interactions to computers, so that the machines could learn the meanings of specific facial expressions. An early project of this sort, led by Louis-Philippe Morency and Jonathan Gratch at the University of Southern California’s Institute for Creative Technologies (ICT), was designed to help computers spot the difference between a nod that meant “I understand what you are saying” and one that meant “I agree with you”. Their initial intention was for this technology to help make screen avatars and robots more personable to interact with, but that is not where it ultimately proved most useful.

Dr Morency, now at Carnegie Mellon University, Dr Gratch and their colleague Stefan Scherer, also at ICT, have mastered the art of teaching computer systems how to analyse the facial expressions and voices of people suffering from psychiatric diseases so that the machines can help detect those diseases. In recent years they have worked extensively with America’s Department of Defence to design systems that can analyse the faces and voices of soldiers returning from tours of duty and note whether they are suffering from post-traumatic stress disorder.

Over the past year Dr Morency has been collaborating with a team led by Justin Baker, a psychiatrist at the McLean Hospital in Massachusetts, to teach a computer, connected to a camera and microphone in a single room, to spot signs of bipolar disorder and schizophrenia. Dr Scherer has been working with a similar system at the San Francisco Veterans Affairs Medical Centre. So far these computers have been doing the lion’s share of the learning, but soon it will be psychiatrists’ turn.

Nuances in patients’ behaviour help clinicians diagnose and monitor mental diseases, but such reliance on expert human judgment is expensive to maintain, difficult to monitor for efficacy and often results in uneven care. The need for objective measurements is huge—and this is where the computers of Dr Morency and Dr Scherer are poised to make a big impact. By tuning their systems to analyse and record traits like gaze and speech prosody (rhythm, tone and volume, among other aspects), which are difficult for human eyes and ears to quantify, they are providing a set of measurements that psychiatrists can refer to—in much the same way that general practitioners make use of blood pressure, pulse and respiratory rate.

Robots will not replace psychiatrists as they did heart surgeons in Asimov’s work. But in the year ahead they will shed their prototype status and become valuable allies in the battle against some of the most debilitating mental diseases.

You are reading a small selection of content from The World in 2018.