DARPA wants better detection of doctored images
- By Mark Pomerleau
- Sep 22, 2015
With image manipulation technology readily available to the general public, the number of “enhanced” images has dramatically increased. But before government analysts can evaluate the content of images and video uploaded to social media, they must be able to identify those that have been altered.
To address this need, the Defense Advanced Research Projects Agency announced a proposers day to discuss the objectives of the anticipated release of the Media Forensics (MediFor) broad agency announcement.
In one of the most famous examples of photo manipulation, Iran produced a photograph in 2008 that purported to show a “provocative” missile test launch. Although the photo was later determined to be a Photoshopped hoax, it ran on the front pages of American newspapers before that determination was made.
Because there are few tools for image manipulation detection available in the commercial sector, media forensics analysts rely heavily on their own background and experience, making the process “more art than science,” DARPA said.
MediFor seeks to “level the playing field” that currently favors the image manipulator by developing technologies for automated assessment of the integrity of images and videos. MediFor will integrate these technologies into a forensics platform that will “detect manipulations, provide analysts and decision makers with detailed information about the types of manipulations performed, how they were performed… in order to facilitate decisions regarding the intelligence value of the image/video.” Additionally, MediFor aims to discover image associations across visual media collections that will help confirm veracity.
The proposers day will take place on Oct. 2, 2015, and will detail DARPA’s interest in media forensics, as well as the proposal requirements for the anticipated MediFor BAA.
Mark Pomerleau is a former editorial fellow with GCN and Defense Systems.