Researchers face up to liars: expressions speak louder than words

How good are you at detecting a lie? Liars often give themselves away by facial expressions or changes in vocal pitch. But most people do no better than 50/50 at detecting lies; that is, they are right only about half the time.

Investigators working at Massachusetts General Hospital in Boston have found an exception to that rule. It is not Secret Service agents, or psychiatrists, or even moms, but aphasics: people who, because of stroke or other forms of brain damage, have been robbed of their ability to understand language.

“We compared aphasics to healthy people, to those with other types of brain damage, and even to students at the Massachusetts Institute of Technology (MIT),” says Nancy Etcoff, a clinical instructor in psychology at Harvard Medical School. “Aphasics are clearly superior at perception of deception.”

That skill had long been suspected, but the new study is the first to prove it. In his book The Man Who Mistook His Wife for a Hat, neurologist and author Oliver Sacks describes a group of aphasics watching Ronald Reagan speaking on television and laughing at what they perceived as deceptive statements by him.

Etcoff tested this idea scientifically with the help of Paul Ekman of the University of California at San Francisco, Mark Frank of Rutgers University, and John Magee of a private company in Cambridge, Mass.

The team showed videotapes of 10 women telling the truth and lying to 10 aphasics, 10 people with other types of brain damage, 10 healthy subjects, and 48 MIT students. The aphasics had suffered damage to language centers on the left side of their brains. Ten others had damage to the right side of their brains but no loss in the ability to understand or speak. All subjects except the students were matched for age, gender, and IQ.

The women on the videotapes spoke of feeling happy and relaxed as they watched pleasant nature scenes, which were not visible to the subjects watching their faces. Half the time, they told the truth. The other half, they were actually watching gruesome medical scenes showing amputations and burn victims.

Before viewing the tapes, the smiling women, all nurses, had been told that their supervisors would be watching and judging them on their ability to appear calm and serene.

Non-aphasics did no better than flipping a coin in detecting the lies; they made wrong calls about half the time. Aphasics were right 60 percent of the time when they relied on both facial and verbal cues. Their scores jumped to 73 percent, or about three-quarters of the time, when they judged facial expressions alone.

Compensation or camouflage?

Why should people who cannot fully comprehend language be so good at spotting lies when they don’t even know what a person is saying?

Etcoff sees two possibilities. “Perhaps strokes, or other types of damage to brain areas controlling language, stimulate the development of compensatory skills in recognizing nonverbal communication,” she speculates.

On the other hand, all people may have these skills but fail to use them because language is so predominant. “Language may camouflage other communication skills we all possess,” Etcoff says. “We usually listen to what people are saying without paying close attention to their facial expressions or changes in voice.”

To conduct a quick test of the latter idea, participants in the study viewed the videotapes of the nurses’ expressions with the sound turned off. As it turned out, the non-aphasics didn’t do any better. “That might change if normal people received more training in reading expressions or concentrating on voice changes,” Etcoff admits.

She and her colleagues plan additional tests to find out more about the intriguing possibilities of this situation. One question to answer is whether aphasics perceive other complex emotions, those not tied to lying, better than those without this handicap.

Scientists already know that aphasics do no better than people without brain damage at perceiving simple emotions, such as fear, happiness, or sadness. Why, then, are they better at picking up subtle changes in facial expression and voice that reveal a disconnection between what people are saying and what they are feeling?

For further investigations, the team will use a split-screen technique they found useful in past tests. It enables researchers to see at the same time what a person is looking at and the expression it produces on his or her face. “It’s fun to watch,” Etcoff notes. “You can actually see the ‘aha’ moment when an aphasic detects a lie. You don’t usually see that with normal people.”

Etcoff and her team described their experiments on facial expressions in the May 11 issue of the British scientific journal Nature.