
The dangers of 'deep fakes'

False and doctored media can be used for misinformation campaigns, and advanced technologies like artificial intelligence and machine learning will only make them easier to create and more difficult to detect.

Deep fakes combine and superimpose different audio and visual sources to create an entirely new (and fake) image or video that can fool even digital forensic and image analysis experts. Such a video needs to appear credible for only a short window of time to impact an election, Sen. Marco Rubio (R-Fla.) warned at a recent Atlantic Council event.

"One thing the Russians have done in other countries in the past is, they've put out incomplete information, altered information and or fake information, and if it's done strategically, it could impact the outcome of an [election]," Rubio said. "Imagine producing a video that has me or Sen. [Mark] Warner [D-Va., who also spoke at the event] saying something we never said on the eve of an election. By the time I prove that video is fake -- even though it looks real -- it's too late."

Rubio, who has warned about deep-fake technology in the past, is part of a growing group of policymakers and experts who worry about the impact false or doctored videos could have on electoral politics. Earlier this year, comedian Jordan Peele and BuzzFeed released a now-viral video that used deep-fake technology to depict former President Barack Obama (voiced by Peele) uttering a number of controversial statements before warning viewers about the inherent dangers such tools pose.

The technology is far from flawless, and in many cases a careful observer can still spot evidence of inconsistencies or manipulation. But as Chris Meserole and Alina Polyakova noted in a May 2018 article for the Brookings Institution, "bigger data, better algorithms and custom hardware" will soon make such false videos appear frighteningly real.

"Although computers have long allowed for the manipulation of digital content, in the past that manipulation has almost always been detectable: A fake image would fail to account for subtle shifts in lighting, or a doctored speech would fail to adequately capture cadence and tone," Meserole and Polyakova wrote. "However, deep learning and generative adversarial networks have made it possible to doctor images and video so well that it's difficult to distinguish manipulated files from authentic ones."

As the authors and others have pointed out, the algorithmic tools regularly used to detect such fake or altered videos can also be turned around and used to craft even more convincing fakes. Earlier this year, researchers in Germany developed an algorithm to spot face swaps in videos. However, they found that "the same deep-learning technique that can spot face-swap videos can also be used to improve the quality of face swaps in the first place -- and that could make them harder to detect."
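
That finding reflects a broader pattern: once a detector is trained, its own gradient signal can steer a forgery toward whatever the detector considers authentic. The sketch below is a hypothetical illustration of that evasion loop in PyTorch, not the German team's actual method; the "detector" model and "frame" tensor are assumed placeholders.

    import torch

    def evade_detector(detector, frame, steps=50, step_size=0.01):
        """Nudge a forged frame until the detector's 'fake' score drops.

        'detector' is any differentiable model returning a fake-likelihood
        score; 'frame' is a pixel tensor in [0, 1]. Both are placeholders.
        """
        frame = frame.clone().requires_grad_(True)
        for _ in range(steps):
            fake_score = detector(frame).sum()   # higher = more likely flagged
            fake_score.backward()
            with torch.no_grad():
                # Step against the gradient of the fake score, then keep
                # pixel values in a valid range.
                frame -= step_size * frame.grad.sign()
                frame.clamp_(0.0, 1.0)
            frame.grad.zero_()
        return frame.detach()

Any fixed, automated detector hands an attacker this kind of training signal, which is why forensics researchers describe the problem as a cat-and-mouse game.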

Researchers at the National Institute of Standards and Technology and the Defense Advanced Research Projects Agency have been working to develop technology that can detect deep fakes.

In its Media Forensics Challenge, NIST aims to advance image and video forensics technologies so it's easier to determine whether an image or video was modified, which section was altered and where the "donor" parts of the image came from.

DARPA's five-year Media Forensics (MediFor) program, launched in September 2015, aims "to level the digital imagery playing field, which currently favors the manipulator, by developing technologies for the automated assessment of the integrity of an image or video and integrating these in an end-to-end media forensics platform."

"We're now in the early days of figuring out how to scale [the system] so we can do things quickly and accurately to stop the spread of viral content that is fake or has been manipulated," Hany Farid, a Dartmouth College digital forensics expert who is participating in the MediaFor program, said in a recent article in Communications of the ACM.  "The stakes can be very, very high, and that's something we have to worry a great deal about."

This article used portions of a story that was first posted on FCW, a sibling site to GCN.

About the Authors

Derek B. Johnson is a former senior staff writer at FCW.

Susan Miller is executive editor at GCN.
