Can police officers and detectives really use artificial intelligence (AI) to distinguish psychopaths from other, less dangerous suspects? Well, not exactly, but technology is definitely headed in that direction thanks to a group of scientists at the University of New Mexico (UNM) – and it all comes down to head movement! Don’t worry – we’ll bring you up to speed.
UNM Psychologists Create AI Psychopath Tool
Over the course of 20 years, Professor Kent Kiehl – a psychologist at UNM – interviewed more than 500 incarcerated men, with each interview lasting between one and four hours. Kiehl and his team used this data to test a theory – that psychopaths tend to keep their heads still when talking to others.
The machine learning program – which utilized head pose estimation and tracking algorithms to quantify head dynamics – analyzed more than 36,000 frames of footage and tracked six facial reference points. The scientists also used the Hare Psychopathy Checklist-Revised to quantify each inmate’s level of psychopathy.
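The study’s actual pipeline isn’t public, but the core idea – turning tracked head positions into a stationarity measure – can be sketched in a few lines. The function below is a hypothetical stand-in, assuming we already have per-frame (x, y) coordinates for one tracked facial point (say, the nose tip); it reports the fraction of frames in which the head barely moves, a rough analogue of the study’s “dwell time”:

```python
import math

def dwell_fraction(positions, threshold=2.0):
    """Fraction of frame-to-frame steps where the tracked head point
    moves less than `threshold` pixels -- a simplified stand-in for
    the study's 'dwell time' stationarity measure."""
    if len(positions) < 2:
        return 0.0
    still = 0
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        if math.hypot(x1 - x0, y1 - y0) < threshold:
            still += 1
    return still / (len(positions) - 1)

# Synthetic example: a nearly stationary head vs. a constantly moving one.
stationary = [(100 + 0.1 * i, 200) for i in range(100)]
moving = [(100 + 5 * i, 200 + 3 * i) for i in range(100)]
print(dwell_fraction(stationary))  # 1.0 -- almost never moves
print(dwell_fraction(moving))      # 0.0 -- moves every frame
```

A real system would first run a face-landmark or head-pose estimator over the video to produce those coordinates, and would likely combine several reference points rather than one.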
What Is The Hare Psychopathy Checklist-Revised?
The Hare Psychopathy Checklist-Revised is a psychological assessment tool measuring a person’s interpersonal, emotional, lifestyle, and antisocial traits. There are 20 such traits listed, with each trait being scored on a three-point scale – 0, 1, or 2. It was developed by Robert D. Hare in the 1970s.
The average person scores well below 10 (usually around 5 or 6). In the United States, a score of 30 or above supports a diagnosis of psychopathy; in the United Kingdom, the cutoff is 25. Kiehl and his team used the test to categorize the test subjects before comparing the data to their head movements.
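The scoring arithmetic described above is simple enough to show directly. This is a minimal sketch, not a clinical tool – the function names are invented here, and real PCL-R scoring must be done by a trained rater from an interview and file review:

```python
def pclr_total(item_ratings):
    """Total PCL-R score: 20 items, each rated 0, 1, or 2 (maximum 40)."""
    if len(item_ratings) != 20:
        raise ValueError("the PCL-R has exactly 20 items")
    if any(r not in (0, 1, 2) for r in item_ratings):
        raise ValueError("each item is rated 0, 1, or 2")
    return sum(item_ratings)

def meets_cutoff(score, region="US"):
    """Commonly cited research cutoffs: 30 in the US, 25 in the UK."""
    return score >= (30 if region == "US" else 25)

typical = [0] * 14 + [1] * 6           # total 6, near the population average
high = [2] * 13 + [1] * 4 + [0] * 3    # total 30
print(pclr_total(typical), meets_cutoff(pclr_total(typical)))  # 6 False
print(pclr_total(high), meets_cutoff(pclr_total(high)))        # 30 True
```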
Scientists Find Common Trait Among Psychopaths
Kiehl and his team published their findings in the Journal of Research in Personality in 2021, and the data produced an eye-opening conclusion – though it’s a conclusion they expected. It turns out psychopaths do tend to keep their heads still while talking to other people.
“As predicted, dwell times indicate that those with higher levels of psychopathic traits are characterized by more stationary head positions, focused directly towards the camera/interviewer, than were individuals low in psychopathic traits,” the team of UNM scientists concluded in the study.
Were There Any Inconsistencies With The UNM Study?
While the study produced a clear conclusion, Kiehl and his team are aware of several issues, limitations, and inconsistencies with the study. For example, the study only looked at interviews with incarcerated men – not women or adolescents. And if the camera couldn’t make out one of the six reference points, the frame was discarded.
The researchers view this data as a stepping stone – and hope more data comes in the future. They’re already discussing the possibility of analyzing eye movement and other nonverbal, subconscious behaviors, such as hand movements and gestural patterns. It’s a work in progress – but the progress is going well!
How Can This New AI Tool Be Used In The Future?
While we don’t have it yet, an AI tool that can detect psychopathic tendencies and behaviors in certain individuals would be beneficial in a lot of ways – but primarily useful to law enforcement and health professionals. It wouldn’t replace them, but it can certainly be used as a supplement to their knowledge and expertise.
Law enforcement officers – whether police officers or detectives – encounter psychopaths on a daily basis. Identifying these behaviors while talking to a suspect would make their work easier – and their assessments more accurate. The same goes for mental health professionals who are trying to diagnose psychopathy in a patient.
New Theory: Psychopaths Have A Defect In The Para-Limbic System
Of course, with new data comes new questions. Now that researchers are a little more confident in their head movement theory, many are starting to ask – why the lack of head movement? They’re still trying to figure that out, but scientists have their theories – as they always do!
The theory is that psychopaths have a defect in the para-limbic system – a group of structures in the brain that play a role in processing emotions, setting goals, motivation, memory, and self-control (among other functions). Researchers have found three primary areas of concern – the amygdala, the orbitofrontal cortex, and the brain tissue connecting them.
Problem #1: The Amygdala
The amygdala is an almond-shaped structure located deep within the temporal lobe of the brain. It was named after the Greek word for ‘almond’ – which is ‘amygdale.’ It plays a major role in detecting and responding to fear, but is also responsible for behavior, emotional control, and learning.
According to the researchers, amygdala dysfunction and damage is a ‘hallmark neurobiological feature of psychopathy.’ They believe it could be the reason psychopaths often struggle with emotional processing, reinforcement learning, and interpersonal interactions – but they need more data and research to back it up.
Problem #2: The Orbitofrontal Cortex
The orbitofrontal cortex is the area of the brain that sits directly above the eye sockets – at the very front of the brain. Researchers are still trying to figure out what this area of the brain does, but they believe it plays a role in impulse control and response inhibition – as well as rational thought, reasoning, and personality expression.
In Kiehl’s study, he found that psychopaths demonstrated low levels of activity in the amygdala and orbitofrontal cortex in situations where they should be high. “Not only were they not picking up on the emotionally charged content, but their brains didn’t seem to be equipped to attach meaning to it either,” he said of the inmates.
Problem #3: Brain Tissue
A separate study – led by Dr. Michael Craig and his colleagues at the Institute of Psychiatry at King’s College London in the UK – looked deeper into the amygdala and orbitofrontal cortex. Using diffusion tensor tractography, his team analyzed the brain tissue connecting these two areas of the brain.
What they found was that ‘the integrity of the tissue connecting the two regions was reduced in people with psychopathy compared to control subjects.’ They likened it to a scratched CD – the poor tissue made it difficult for clear signals to move from one part of the brain to the next.
MIT Scientists Stress The Importance Of Data
Of course, this isn’t the first time AI has been used to study psychosis and psychopathy – in fact, scientists have been doing it for years. For example, the MIT Media Lab has created four similar algorithms, and has come to a conclusion about AI – we must be careful about the data we provide it.
Between 2016 and 2018, they tested this theory by training different AI programs to exhibit psychopathic tendencies – which they successfully did by feeding it biased data (such as negative or violent data). Let’s take a closer look at these four programs and what they found.
2016: Nightmare Machine (AI Horror Imagery)
In 2016, a group of MIT scientists created the Nightmare Machine – an AI program that could generate scary imagery based on an existing photo. They tested it on photos of human faces and famous landmarks, and even had the AI create different ‘horror’ styles – such as ‘haunted house’ or ‘slaughter house.’
The primary goal of this study was to see if AI could learn how to scare us – especially since ‘creating a visceral emotion such as fear remains one of the cornerstones of human creativity.’ The AI-generated images that the Nightmare Machine created can be viewed on MIT’s official website.
2017: Shelley (AI Horror Stories)
MIT’s next investment into AI-generated horror was Shelley – the world’s first AI-generated horror writer. They fed Shelley a plethora of scary stories (more than 14,000!) from a group on Reddit called ‘r/nosleep,’ giving her all the tools she needed to write her own horror stories – which she did!
In fact, Shelley wrote hundreds of scary stories, and people were able to contribute and collaborate with her through Twitter. Shelley would start by releasing the opening of a story. Users would join in and keep the story going, with Shelley hopping in from time to time.
2017: Deep Empathy (AI-Powered Empathy)
In 2017, the MIT Media Lab moved on to its next project – Deep Empathy. It was an AI-powered empathy program that took a different approach to psychopathy. Instead of trying to teach AI to see the dark side of everything, they tried to teach it to empathize with others.
This time, they fed AI pictures of Syrian neighborhoods that were destroyed during the brutal war that began in 2011. They then fed Deep Empathy photos of popular cities around the world and asked it to recreate the image to resemble what it would look like if the Syrian war happened in that particular city. The results were depressing.
2018: Norman (AI Psychopath)
Their final project was Norman – the world’s first AI psychopath. They fed Norman data from some of the darkest corners of Reddit to see if it would start to think like a psychopath during a Rorschach test. What they found is that feeding AI biased data produces biased results.
As MIT put it, Norman ‘represents a case study on the dangers of Artificial Intelligence gone wrong when biased data is used in machine learning algorithms.’ For example, Norman’s response for the first inkblot was ‘a man is electrocuted and catches to death’ – whereas normal AI saw ‘a group of birds sitting on top of a tree branch.’