Old-school lie-detection technology has served its purpose in many criminal cases over the years, but it’s still not terribly reliable. Polygraph tests measure several physiological signals, including blood pressure, pulse, and perspiration, all of which tend to change when a person is telling a lie, yet it’s still possible to fool the machine completely. False positives aren’t all that uncommon either, and that’s not good for anyone. Now, new emotion-detection systems are being developed that could call you out if you’re lying, or even tell you if you’re at risk of a number of different diseases.
In a recent TED talk, Poppy Crum of Dolby Laboratories explained how existing technologies can combine to paint a very accurate picture of what is going on inside the mind and body of a person. Crum declared it “the end of the poker face,” but the implications are far greater than a few lost poker chips.
“We broadcast our emotions,” Crum said. “We will know more about each other than we ever have.”
Crum explains how combining various types of sensors allows researchers to build systems that reveal far more information than we realize we’re giving away. From the amount of carbon dioxide we exhale to the cadence of our speech, these subtle cues reveal our emotions in ways that algorithms can decode. Stilted speech patterns could hint that someone is at risk of developing dementia, while body heat and the chemicals in our breath can reveal that we’re excited, anxious, or even about to become physically violent.
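Crum’s talk stays at the conceptual level, but the basic idea of fusing several weak cues into one estimate can be sketched in a few lines. The Python snippet below is purely illustrative: the sensor names, ranges, and weights are assumptions made for the sake of the example, not values from the talk, and a real system would learn this mapping from data rather than hard-coding it.

```python
# Hypothetical sketch of sensor fusion for an "arousal" estimate.
# All feature names, ranges, and weights are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class SensorReading:
    co2_ppm: float          # exhaled CO2 concentration near the subject
    speech_rate_wpm: float  # cadence of speech, in words per minute
    skin_temp_c: float      # surface body temperature


def normalize(value: float, low: float, high: float) -> float:
    """Map a raw reading onto a 0-1 scale, clamped at the extremes."""
    return min(max((value - low) / (high - low), 0.0), 1.0)


def arousal_score(r: SensorReading) -> float:
    """Weighted combination of normalized cues; higher means more agitated."""
    features = [
        (normalize(r.co2_ppm, 400, 1200), 0.4),        # heavier breathing
        (normalize(r.speech_rate_wpm, 80, 220), 0.3),  # faster speech
        (normalize(r.skin_temp_c, 33, 38), 0.3),       # elevated body heat
    ]
    return sum(value * weight for value, weight in features)


if __name__ == "__main__":
    calm = SensorReading(co2_ppm=500, speech_rate_wpm=110, skin_temp_c=34.0)
    agitated = SensorReading(co2_ppm=1000, speech_rate_wpm=190, skin_temp_c=37.0)
    for label, reading in [("calm", calm), ("agitated", agitated)]:
        print(f"{label}: arousal = {arousal_score(reading):.2f}")
```

Even a toy like this hints at why the combination matters: any one of these signals is ambiguous on its own, but stacked together they start to tell a consistent story about a person’s internal state.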
“It is really scary on one level, but on another level it is really powerful,” Crum noted.
The scientist explained, in a pretty general sense, how these systems work, but also touched on the implications they could have for privacy. Do you want everyone to be able to read your emotions? If technology that can almost read your mind becomes so streamlined that it could fit into a smartphone, is that something we want to unleash on the general public?
“I realize a lot of people are having a hard time with people sharing our data, or knowing something we didn’t want to share,” Crum explained. “I am not looking to create a world where our inner lives are ripped open, but I am looking to create a world where we can care about each other more effectively.”
The relentless march of technology would suggest that, given systems like this already exist in some form, it’s likely not long before they reach the mainstream. “We will be able to know more about each other than we ever have. Let’s use that for the right reasons rather than the wrong ones,” Crum said. That might be easier said than done.