Seeing is believing, but you can’t always trust your eyes, especially now that we are living in the digital age of photography. For better or worse, programs like Photoshop make it all too easy to manipulate an image. It’s great for photographers who want to clean up their images, but can also be used for nefarious purposes. Can you ever really be sure that what you’re looking at is the real deal? Probably not, though to help with that very task, Adobe is leveraging artificial intelligence to detect when a photo has been doctored.
On the surface, that might not seem like a big deal. However, we are living in an era where the bulk of information about any given topic is shared online. The ease with which a user can fake a photograph has scary implications. When it comes to elections, a Photoshopped image could damage a candidate's chance of being elected. To give another nefarious example, a fake photo could seemingly incriminate a celebrity and go viral.
For the past two years, Vlad Morariu, senior research scientist at Adobe, and his team have been working on technologies to detect image manipulation as part of the DARPA Media Forensics program.
“We focused on three common tampering techniques—splicing, where parts of two different images are combined; copy-move, where objects in a photograph are moved or cloned from one place to another; and removal, where an object is removed from a photograph, and filled-in,” Morariu explains.
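To make the copy-move case concrete, here is a deliberately simple, classical (non-neural) sketch of the idea: if pixels are cloned from one part of an image to another, identical blocks will appear in two places. This is an illustration only, not Adobe's method; real detectors must cope with compression, rotation, and scaling, which this toy exact-match approach cannot.

```python
import numpy as np

def find_cloned_blocks(img, block=8):
    """Toy copy-move check: flag pixel blocks that appear twice.

    Hashes each non-overlapping block of a grayscale image and reports
    pairs of locations whose contents match exactly.
    """
    h, w = img.shape
    seen = {}
    matches = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            key = img[y:y + block, x:x + block].tobytes()
            if key in seen:
                matches.append((seen[key], (y, x)))
            else:
                seen[key] = (y, x)
    return matches

# Simulate a copy-move forgery on a synthetic grayscale image.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
img[32:40, 32:40] = img[0:8, 0:8]   # clone one 8x8 block elsewhere
print(find_cloned_blocks(img))      # flags the duplicated pair of locations
```

Exact matching like this breaks as soon as the forger re-saves the image as a JPEG, which is precisely why learned features are attractive for this problem.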
Whenever an image is manipulated, it leaves behind clues, such as artifacts, strong contrast edges, different noise patterns, and so forth. The problem is these clues are often difficult, if not impossible, to detect with the naked eye. That is where machine learning comes into play.
Source: Adobe Research (PDF)
“Using tens of thousands of examples of known, manipulated images, we successfully trained a deep learning neural network to recognize image manipulation, fusing two distinct techniques together in one network to benefit from their complementary detection capabilities,” Morariu added.
One of the techniques Morariu's team developed looks for changes in an RGB stream (changes to the red, green, and blue color values of pixels) to detect tampering. The AI system also uses a noise stream filter: many photographs and cameras have unique noise patterns, making it possible to detect noise inconsistencies between authentic and tampered parts of a photo.
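The noise-stream intuition can be shown with a crude high-pass residual: subtract each pixel's local mean, then compare the residual's strength in different regions. This is a hand-rolled stand-in for the learned filters in the actual system, using a made-up scenario where a pasted patch carries much stronger sensor noise than its host image.

```python
import numpy as np

def noise_residual(img):
    """High-pass residual: pixel minus its 3x3 local mean.

    A crude, illustrative stand-in for the learned noise-stream
    filters; it isolates fine-grained noise from image content.
    """
    img = img.astype(np.float64)
    p = np.pad(img, 1, mode="edge")
    # 3x3 box mean built from shifted views (pure NumPy, no SciPy).
    local_mean = sum(p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                     for dy in range(3) for dx in range(3)) / 9.0
    return img - local_mean

# Hypothetical host image with mild sensor noise (sigma = 2); the
# "spliced-in" patch carries much stronger noise (sigma = 20).
rng = np.random.default_rng(1)
img = np.full((64, 64), 128.0) + rng.normal(0, 2, (64, 64))
img[16:32, 16:32] += rng.normal(0, 20, (16, 16))

res = noise_residual(img)
# The residual is far noisier inside the tampered region than outside.
print(res[16:32, 16:32].std() > 3 * res[48:, 48:].std())  # True
```

A real two-stream detector fuses features like these with RGB evidence inside one neural network, rather than applying a fixed filter and a threshold.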
It’s not a perfect system, but it’s nice to see companies like Adobe working on ways to separate fact from fiction in photography.
Top Image Source: YouTube via Adobe