Can brain scans spot insider threats?

In the 1980s, when I was up for a job at The San Jose Mercury News, I was asked to take a personality test.  Several of the questions had been taped over so as not to be asked, but I could still read what was underneath.  One of them was, "Does the sight of dirty, ragged fingernails repulse you?"

I wondered who had come up with the questions and what they were supposed to reveal about me.  And, of course, I imagined what impact my answers would have on my job prospects.

Today, researchers at Iowa State University are taking personality screening to a new level.  In a recent study, they found that they could identify people prone to being cybersecurity risks by reading their neural activity.

The study, directed by Qing Hu, professor of information systems, measured the brain activity and response times of subjects presented with a series of security scenarios. Hu's team found that people with higher self-control posed less security risk than people with lower self-control.

According to Hu, questionnaires measuring individuals' levels of self-control were developed by criminologists more than 20 years ago.  Those questionnaires were used to screen 350 students.  The researchers then selected the 20 students at each end of the spectrum to test their neural responses to security scenarios. 
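To make that extreme-groups design concrete, here is a minimal sketch in Python, assuming a simple summed questionnaire score.  The participant IDs, score range, and group size of 20 follow the article's description, but the scoring details are illustrative, not taken from the study itself.

```python
import random

def select_extreme_groups(participants, group_size=20):
    """Pick the highest- and lowest-scoring participants.

    `participants` is a list of (participant_id, self_control_score)
    tuples, e.g. summed Likert responses from a self-control
    questionnaire of the kind criminologists developed decades ago.
    """
    ranked = sorted(participants, key=lambda p: p[1])
    low_group = ranked[:group_size]       # lowest self-control scores
    high_group = ranked[-group_size:]     # highest self-control scores
    return low_group, high_group

# Example: a screened cohort of 350 students, as in the study.
random.seed(0)
cohort = [(f"S{i:03d}", random.randint(24, 120)) for i in range(350)]
low, high = select_extreme_groups(cohort)
print(len(low), len(high))  # -> 20 20
```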

Researchers found two effects in the electroencephalograms (EEGs), said Robert West, a psychologist who worked with Hu on the study.  "One of them is that individuals with high self-control will have more neural activity when they are considering major violations.  That is attenuated in those with low self-control."

In short, the location and degree of neural activity can be used to predict who is likely to be a security risk.

According to Hu, those individuals who display greater activity in the prefrontal cortex when faced with a security scenario demonstrate higher levels of self-control and are less likely to present a threat.  "Some people have developed an ability to use more executive control," said Hu.  "Others rely more on their primitive reflexive evolutionary capability to make decisions."  The latter group, he said, tend to be greater security risks.

"We're not saying that people with low self-control are bad people," Hu was quick to add.  "People with low self-control simply may not be good candidates for a job that has access to sensitive, confidential data because they are easily induced or enticed by external factors." 

At the same time, Hu acknowledged that individuals with high self-control can also be security risks.  Of Edward Snowden, the National Security Agency contractor who released huge amounts of classified data, Hu said that, "he appears by all indications to have very active high self-control.  People like Snowden will defeat the system that we are talking about."

But Hu said he believes that Snowden is an exception.  There are plenty of people who have compromised security because of one small external incentive or stimulus, he said.  "They don't think about whether they might be caught or go to jail."

Hu and West acknowledge that more work is required.  "Is the technology ready for companies to use in making decisions? I think we're pretty far away from that," said West.  "There is a lot of nice laboratory work.  Scaling that out to real-world applications has been a little bit trickier.  This is really the first study to use EEG in information security."

What's more, it's impractical for employers to hook up applicants to EEGs.

"We don't want companies to use expensive equipment to do screening," said Hu.  "Once we establish the standard values using the sophisticated equipment, then a company can screen as part of your job interview using 20 or 30 questions.  That's all we need."

As interesting as the research is, what companies may ultimately do with it is creepy.  By some estimates, as many as a third of companies already use personality tests as a factor in making hiring decisions.  If the new screens are considered more accurate, it's likely that more companies and, potentially, government agencies may adopt them.  That prospect will understandably upset privacy advocates and job applicants alike.

Editor's note: This article was changed May 20 to correct the university affiliation of Qing Hu. He works at Iowa State University, not the University of Iowa as originally reported.

Posted by Patrick Marshall on May 19, 2015 at 5:57 AM

