Who is It?

NIST electronic engineer P. Jonathon Phillips and his colleagues recently conducted the largest-scale face-off of facial-recognition systems to date. The results lead him to conclude that this biometric technique may have matured enough that it is ready for real-world use.

Photo: Olivier Douliery

The accuracy of facial-recognition systems degrades over time: images of the same person taken 18 months apart are matched with only 65 percent accuracy.

NIST faces up to a big test

The National Institute of Standards and Technology this month will release large-scale test results for 14 vendors' facial-recognition systems.

P. Jonathon Phillips, an electronic engineer in NIST's Visual Image Processing Group, said he has been evaluating such products since 1992, when the Defense Department's Counterdrug Technology Development Program Office set up its Face Recognition Technology program.

More recently, NIST has been conducting the Face Recognition Vendor Test 2002 under a mandate in the USA Patriot Act of 2001. Phillips and his colleagues completed a test protocol, specifications and data sets in April. By the end of May, 14 vendors had signed up to be evaluated.

'They had to do a dry run to ensure that their programs could read our test images and that we could read and score their output,' Phillips said.

In July and August, the NIST team administered the tests at the Naval Surface Warfare Center in Dahlgren, Va. 'The tests are complete, and we are scoring the results and writing the report,' Phillips said.

The tests evaluated three aspects of facial-recognition performance. The first test presented a facial image and asked the system to display the 10, 20 or 25 most similar known faces.

The second test determined whether a system could verify identities from the images.

The third test determined how well a system works under different lighting conditions and how long it took to process a known image.
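The vendors' matching algorithms are proprietary and not described in the article, but the first task above, returning the most similar known faces for a probe image, can be sketched as a nearest-neighbor ranking over feature vectors. The cosine-similarity metric and the vector representation here are illustrative assumptions, not NIST's or any vendor's actual method:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (illustrative metric)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

def top_n_matches(probe, gallery, n=10):
    """Return the identities of the n gallery entries most similar to the probe.

    gallery maps identity -> feature vector; probe is a feature vector.
    """
    scored = [(cosine_similarity(probe, vec), identity)
              for identity, vec in gallery.items()]
    scored.sort(reverse=True)
    return [identity for score, identity in scored[:n]]
```

In an evaluation like FRVT 2002, the gallery would hold the known identities and each test image would be scored against all of them, with the system returning its 10, 20 or 25 best candidates.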

'We will not be creating a buyer's guide, but we will generate performance specifications for people who want to look more closely at these products,' Phillips said. There will be an overview for managers as well as a more detailed technical report.

15 billion comparisons

In the first test, the products had to compare test faces against 121,000 images of visa applicants from Mexico. 'That leads to 15 billion comparisons' in a matrix of known identities and unknown images, Phillips said.

The second part of the test used mug shot-style images from NIST.

The vendors did not have to disclose their facial-recognition technologies, which generally are proprietary.

The algorithms for scoring the tests were not too complex, Phillips said, but developing a trustworthy algorithm meant incorporating a statistical technique called normalization.
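The article does not say which normalization scheme the scoring used. A common choice in biometric evaluation, shown here purely as an assumed example, is z-score normalization, which rescales each probe's raw similarity scores to zero mean and unit variance so that scores from different probes or systems can be compared on a common scale:

```python
import statistics

def z_normalize(scores):
    """Z-score normalization of one probe's similarity scores.

    Rescales the scores to zero mean and unit variance; an assumed
    example of normalization, not necessarily the scheme NIST used.
    """
    mean = statistics.fmean(scores)
    stdev = statistics.pstdev(scores)
    if stdev == 0:
        return [0.0] * len(scores)
    return [(s - mean) / stdev for s in scores]
```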

In addition, the NIST team developed style sheets and document type definitions so the test images could be processed with Extensible Markup Language.

'The XML structure allows systematic evaluation of biometric systems,' Phillips said. That was a significant challenge for the testing team, he said.
Until now, there had been no test of facial-recognition performance on a scale as large as 100,000 images. The database was 60GB, he said, and 'we had to design methods for storing and formatting the data.'
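NIST's actual style sheets and document type definitions are not reproduced in the article. As a rough illustration only, with hypothetical element and attribute names, a vendor's similarity output in XML might look like the record below and be processed with Python's standard library:

```python
import xml.etree.ElementTree as ET

# Hypothetical similarity-score record; the real FRVT 2002 schema was
# defined by NIST's DTDs and is not shown in the article.
record = """
<similarity-file>
  <comparison probe="p0001" gallery="g0042" score="0.91"/>
  <comparison probe="p0001" gallery="g0107" score="0.35"/>
</similarity-file>
"""

root = ET.fromstring(record)
# Collect scores keyed by (probe, gallery) pair for systematic scoring.
scores = {(c.get("probe"), c.get("gallery")): float(c.get("score"))
          for c in root.iter("comparison")}
```

Structuring vendor output this way is what lets a testing team validate and score every system's results with the same tooling, regardless of the proprietary matcher behind them.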

Another test requirement was that the facial-recognition products had to be stable enough to stay up and running for two weeks.

Phillips said he believes the results show that facial recognition has matured enough for real-world use.

Two years ago, he said, the best systems achieved only about 98 percent accuracy in comparing virtually identical images captured within minutes of each other. 'If you went to images taken 18 months apart, you got 65 percent accuracy,' he said. Developers are unsure why accuracy erodes over time.

NIST is also evaluating fingerprint biometric systems as mandated by the Patriot Act.

Besides Phillips, the testers were Duane Blackburn, Mike Bone, Patrick Grother and Ross Micheals.

NIST had 16 sponsors, including agencies involved with law enforcement, defense and homeland security as well as government units from Australia, Canada and Great Britain.

