Image that was faked about the Lithuanian incident and posted on social media

The U.S. military is gearing up to fight “fake news,” but this “fake news” has nothing to do with the politically charged rhetoric, mostly aimed at President Trump, that floods establishment media.

It’s a variation on the old “honey pot” strategy, in which Americans are set up by foreign agents to be photographed or recorded in compromising situations.

In this case, however, the compromising situation never existed: It’s created by digital manipulation.

Kyle Rempfer at the Military Times warns: “The emerging technology could be used to generate Kompromat – short for compromising material in Russian – that portrays an individual in deeply embarrassing situations, making them ripe for blackmail by a foreign intelligence service. Or, just as likely, deep fake technology could be used to generate falsified recordings from meetings that actually did take place, but where the content discussed is manipulated.”

One recent example, the Times reported, unfolded on a road between Kaunas and Prienai, Lithuania.

During a training exercise, four U.S. Army Stryker vehicles were driving in formation when the lead vehicle suddenly braked for an obstacle. A second vehicle hit the first.

Shortly afterward, on social media, “a doctored image was posted showing unconcerned soldiers near a crushed bicycle and child’s corpse,” the Times reported.

“This is a very typical example of the hostile information, and proves we are already being watched,” commented Raimundas Karoblis, the Lithuanian defense minister.

He told a meeting of NATO officials, “We have no doubt that this was a deliberate and coordinated attempt aiming to raise general society’s condemnation to our allies, as well as discredit the exercises and our joint efforts on defense strengthening.”

The online attack was characterized as an attempt to drive a wedge between NATO allies.

While that image was debunked, today’s digital technology can be used to produce fakes that look like reality.

Rempfer noted a hypothetical from Bobby Chesney of the University of Texas School of Law, who studies the technology.

Audio from a closed-door meeting could be doctored to make it sound as if a senior U.S. official told his Russian counterpart “don’t worry about the Baltics, we won’t lift a finger to defend them,” said Chesney.

He explained in the Times report that such claims often involve killing or harming civilians.

“And yeah, you can have actors play the role and impersonate, but how much the better if you can use the technology of deep fakes to make more credible instances of supposed atrocities?” he said.

Such attacks, though rudimentary in nature, already have been documented. The State Department has records of Russian intelligence releasing a 2009 video that purported to show a U.S. diplomat “purchasing a prostitute.”

But the video actually showed the American making phone calls, spliced with fake footage.

Faked videos are getting much easier to produce using “puppeteering systems,” Google’s Chris Bregler said in the report.

“That means you take lots of video of somebody and then use machine-learning to change the lips or some other parts of the face, and it looks like someone said something entirely different,” he explained.

The technology is not hard to acquire, since the code was posted last year on Reddit.

“If you have some software engineering skills, you can download that code, turn it into an application, collect a bunch of examples of faces of a person who is there in a video and faces of the person you want to replace, and then you buy a graphics card that costs less than $1,000,” Bregler said. “You let your system run on your home computer or laptop for sometimes several days, or one day, and then it creates a deep fake.”
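The workflow Bregler describes matches the shared-encoder, dual-decoder design of the face-swap code that circulated on Reddit. The sketch below illustrates the general idea only; it assumes PyTorch, and the directory names (“faces_a,” “faces_b”), layer sizes and training schedule are placeholders, not the actual posted code.

```python
# A minimal sketch of the shared-encoder / dual-decoder face-swap
# technique described above. Illustrative only: paths, layer sizes
# and the schedule are assumptions, not the code posted on Reddit.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Face crops of each person, resized to 64x64 (ImageFolder expects one
# subfolder of images per directory, e.g. faces_a/person/*.jpg).
tfm = transforms.Compose([transforms.Resize((64, 64)), transforms.ToTensor()])
loader_a = DataLoader(datasets.ImageFolder("faces_a", tfm), batch_size=32, shuffle=True)
loader_b = DataLoader(datasets.ImageFolder("faces_b", tfm), batch_size=32, shuffle=True)

def down(cin, cout):  # halve spatial size
    return nn.Sequential(nn.Conv2d(cin, cout, 4, 2, 1), nn.LeakyReLU(0.1))

def up(cin, cout):    # double spatial size
    return nn.Sequential(nn.ConvTranspose2d(cin, cout, 4, 2, 1), nn.ReLU())

# One encoder shared by both identities learns pose and expression;
# each identity gets its own decoder, which learns that face's appearance.
encoder = nn.Sequential(down(3, 32), down(32, 64), down(64, 128))

def make_decoder():
    return nn.Sequential(up(128, 64), up(64, 32),
                         nn.ConvTranspose2d(32, 3, 4, 2, 1), nn.Sigmoid())

decoder_a, decoder_b = make_decoder(), make_decoder()

params = (list(encoder.parameters()) + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
opt = torch.optim.Adam(params, lr=5e-5)
loss_fn = nn.L1Loss()

for epoch in range(100):  # "sometimes several days, or one day" on one GPU
    for (a, _), (b, _) in zip(loader_a, loader_b):
        # Each decoder learns only to reconstruct its own person.
        loss = loss_fn(decoder_a(encoder(a)), a) + loss_fn(decoder_b(encoder(b)), b)
        opt.zero_grad()
        loss.backward()
        opt.step()

# The swap: encode person A's face (pose, lips) but decode it with
# person B's decoder -- B now appears to do whatever A did.
with torch.no_grad():
    a, _ = next(iter(loader_a))
    fake_b = decoder_b(encoder(a))
```

A finished fake would then blend the swapped faces back into the original video frame by frame; that compositing step is omitted here.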
