Researchers at the Defense Advanced Research Projects Agency in Arlington, Va., are working to combat "deep fakes." (Courtesy of Techcrunch.com)

It’s as frightening as anything imaginable.

Fake pornography videos are a troubling new trend, one in which celebrities and everyday people alike are potential targets.

So-called “deep fake” creators reportedly are making disturbingly realistic, computer-generated videos from photos taken off the web, turning them into nightmares for their victims.

In one case reported in Sydney, Australia, a video showed a woman in a pink off-the-shoulder top, sitting on a bed, smiling a convincing smile.

It was her face. But, as the Sydney Morning Herald reported, it had been seamlessly grafted, without her knowledge or consent, onto someone else’s body — a young pornography actress just beginning to disrobe for a graphic sex scene.

As in countless similar incidents, a crowd of unknown users had been passing it around online.

The woman said she felt nauseous and mortified: What if her co-workers saw it? Her family, her friends? Would it change how they thought of her? Would they believe it was a fake?

“I feel violated — this icky kind of violation,” said the woman, who is in her 40s and spoke on the condition of anonymity because she worried that the video could hurt her marriage or career.

“It’s this weird feeling, like you want to tear everything off the internet. But you know you can’t,” she told the newspaper.

As the daily newspaper noted, airbrushing and Photoshop long ago opened photos to easy manipulation.

Now, videos are becoming just as vulnerable to fakes that look deceptively real. Supercharged by powerful and widely available artificial-intelligence software developed by Google, these lifelike “deep fake” videos have quickly multiplied across the internet, blurring the line between truth and lie.

In the District, strict laws are in place to deter such practices, which count as felonies that carry a fine and up to three years in prison.

“MPD has not received any reports regarding these specific activities,” said MPD spokeswoman Alaina Gertz, adding that residents who might be victimized in such a manner should immediately file a police report.

Of course, victims of deep-fake pornography include some of Hollywood’s biggest names.

In a published interview this week, actress Scarlett Johansson talked of being a victim, noting that there isn’t much she and her team can do about it.

“I think it’s a useless pursuit, legally, mostly because the internet is a vast wormhole of darkness that eats itself,” said Johansson, Hollywood’s highest-paid actress.

In 2012, a hacker received a 10-year prison sentence after leaking nude photos of Johansson and other celebrities.

“Every country has their own legalese regarding the right to your own image, so while you may be able to take down sites in the U.S. that are using your face, the same rules might not apply in Germany,” she said. “Even if you copyright pictures with your image that belong to you, the same copyright laws don’t apply overseas. I have sadly been down this road many, many times.”

From California to D.C., researchers are hard at work trying to figure out ways to detect such fakery. That, according to the San Jose Mercury News, includes SRI International in Menlo Park.

Manipulating videos used to be the stuff of movies, done by Disney, Pixar and others, said Bob Bolles, program director of the perception program at SRI’s Artificial Intelligence Center.

“Now face-swap apps are getting better and better,” Bolles told the Mercury News.

Recently, he showed the newspaper how he and his team are trying to detect when video has been tampered with. The team looks for inconsistencies between video and the audio track — for example, watching whether a person’s lip movements match the sounds of the video.
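To make that idea concrete, here is a minimal sketch of such an audio-visual consistency check in Python. It is an illustration, not SRI's actual system, and it assumes two per-frame signals have already been extracted by other tools: a mouth-opening measurement from face landmarks, and the loudness (RMS energy) of the matching slice of audio.

```python
# A minimal sketch of an audio-visual lip-sync check -- not SRI's system.
# Assumes per-frame mouth-opening values (from face landmarks) and the
# matching per-frame audio energy have already been extracted elsewhere.
import numpy as np

def lip_sync_score(mouth_opening: np.ndarray, audio_energy: np.ndarray) -> float:
    """Return the Pearson correlation between mouth movement and loudness."""
    # Normalize both signals so the correlation is scale-invariant.
    m = (mouth_opening - mouth_opening.mean()) / (mouth_opening.std() + 1e-9)
    a = (audio_energy - audio_energy.mean()) / (audio_energy.std() + 1e-9)
    return float(np.mean(m * a))

def looks_tampered(mouth_opening, audio_energy, threshold: float = 0.3) -> bool:
    # Low correlation is a red flag for tampering, not proof of it.
    return lip_sync_score(np.asarray(mouth_opening),
                          np.asarray(audio_energy)) < threshold
```

In genuine footage the two signals tend to rise and fall together; a grafted face or a swapped audio track weakens that coupling, which is why a low score raises suspicion.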

The team also tries to detect sounds that may not jibe with the background, said Aaron Lawson, assistant director of the speech lab at SRI. For example, the video may show a desert scene but reverberations can indicate the sound was recorded indoors.
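The reverberation cue can be illustrated the same way. The toy heuristic below, which is an assumption-laden sketch rather than SRI's method, measures how long sound energy takes to die away after a sharp peak: a dry outdoor recording decays quickly, while an indoor room keeps ringing.

```python
# A toy reverberation heuristic -- not SRI's method. After a loud frame,
# energy in a dry outdoor recording dies away quickly; a reverberant
# indoor room sustains it much longer.
import numpy as np

def decay_time(audio: np.ndarray, sample_rate: int, frame_ms: int = 20) -> float:
    """Estimate seconds for energy to fall 20 dB below the loudest frame."""
    frame = int(sample_rate * frame_ms / 1000)
    n = len(audio) // frame
    energy = np.array([np.sum(audio[i * frame:(i + 1) * frame] ** 2)
                       for i in range(n)])
    peak = int(np.argmax(energy))
    floor = energy[peak] / 100.0  # -20 dB relative to the peak frame
    for i in range(peak, n):
        if energy[i] < floor:
            return (i - peak) * frame_ms / 1000.0
    return (n - peak) * frame_ms / 1000.0

def scene_sound_mismatch(audio, sample_rate: int, scene_is_outdoors: bool) -> bool:
    # A long decay in a supposedly open outdoor scene suggests the audio
    # was recorded somewhere else -- for example, indoors -- and spliced in.
    return scene_is_outdoors and decay_time(np.asarray(audio), sample_rate) > 0.3
```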

“We need a suite of tests” to keep up with hackers, who are bound to keep finding new ways to fool people, Bolles said.

Matt Turek, program director for MediFor at the Defense Advanced Research Projects Agency in Arlington, Va., told the Mercury News that 14 prime contractors are working on a project to keep up with hackers. The project began nearly three years ago.

“There’s been a significant change in the last year or so as automated manipulations have become more convincing,” Turek said.

Artificial intelligence and machine learning have helped make tools to create more realistic-looking deep fakes, and researchers are using the same methods to fight back.
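As a rough illustration of fighting back with the same tools, the sketch below trains a tiny classifier to separate real clips from fakes using consistency cues like the two above as features. The feature values and labels here are made up for the example; real detection systems train deep networks on large corpora of genuine and manipulated video.

```python
# A hedged sketch of machine-learning-based detection, with hypothetical
# training data. Each row holds two cues from the earlier sketches:
# [lip-sync correlation, audio decay time in seconds]; label 1 = fake.
import numpy as np
from sklearn.linear_model import LogisticRegression

X_train = np.array([[0.8, 0.05], [0.7, 0.10], [0.1, 0.40], [0.2, 0.35]])
y_train = np.array([0, 0, 1, 1])

detector = LogisticRegression().fit(X_train, y_train)

# Score a new clip: estimated probability that it has been manipulated.
new_clip = np.array([[0.15, 0.30]])
print(f"P(fake) = {detector.predict_proba(new_clip)[0, 1]:.2f}")
```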

“The challenge is to create algorithms to keep up and stay ahead of the technology that’s out there,” Turek said.

Stacy M. Brown is a senior writer for The Washington Informer and the senior national correspondent for the Black Press of America. Stacy has more than 25 years of journalism experience and has authored...
