Adobe Photoshop’s Content-Aware Fill is the current industry standard for removing unwanted artifacts and distracting objects, but that may not always be the case. While Adobe works on an advanced deep learning-based "Deep Fill" feature of its own, NVIDIA has just demonstrated its own AI-powered spot healing tool, and the results are pretty incredible.

As you can see from the two-minute demonstration above, the prototype tool can handle both basic tasks, like removing a wire from a scene, and more complicated ones, such as reconstructing books and shelves inside an intricate library scene.

The secret is the "state-of-the-art deep learning method" the tool is built on. Rather than simply borrowing pixels from elsewhere in the image to reconstruct an area, it analyzes the scene and figures out what the finished result should look like. This produces a much more accurate and realistic result, even when the original image is an absolute disaster.

The best examples of this can be seen in a paper NVIDIA team members published titled ‘Image Inpainting for Irregular Holes Using Partial Convolutions.’ As seen in the comparison images below, NVIDIA’s tool blows Photoshop out of the water when reconstructing portraits where much or most of the face is removed.

From left to right: the corrupted image, Adobe's Content-Aware results, NVIDIA's results and the actual image.
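For the curious, the "partial convolutions" in the paper's title are convolutions that only consider valid (non-hole) pixels, re-weighting each window by the fraction of it that is valid; any output pixel whose window touched at least one valid pixel is then marked valid, so the hole shrinks layer by layer. Here is a minimal single-channel sketch of that idea (the function name and the simple re-weighting are illustrative, not NVIDIA's actual implementation):

```python
import numpy as np

def partial_conv2d(image, mask, kernel, bias=0.0):
    """Simplified single-channel partial convolution.

    image:  2D float array; values where mask == 0 are holes and ignored
    mask:   2D binary array, 1 = valid pixel, 0 = hole
    kernel: 2D float array with odd side lengths (e.g. 3x3)
    Returns (output, updated_mask); the updated mask marks a pixel valid
    wherever the kernel window saw at least one valid input pixel.
    """
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    # Zero out holes before convolving, and pad image and mask alike
    img_p = np.pad(image * mask, ((ph, ph), (pw, pw)))
    msk_p = np.pad(mask, ((ph, ph), (pw, pw)))

    out = np.zeros_like(image, dtype=float)
    new_mask = np.zeros_like(mask, dtype=float)
    window_size = kh * kw

    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            win = img_p[i:i + kh, j:j + kw]
            mwin = msk_p[i:i + kh, j:j + kw]
            valid = mwin.sum()
            if valid > 0:
                # Re-weight by the fraction of the window that was valid
                out[i, j] = (kernel * win).sum() * (window_size / valid) + bias
                new_mask[i, j] = 1.0
    return out, new_mask
```

With a 3x3 averaging kernel, a one-pixel hole in a flat image is filled from its valid neighbors in a single pass, and the updated mask reports no remaining holes; in the full network this happens repeatedly, with learned kernels, until the mask is all ones.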

In the discussion section (section 5.1) of the aforementioned paper, NVIDIA says its "model can robustly handle holes of any shape, size, location, or distance from the image borders. Further, our performance does not deteriorate catastrophically as holes increase in size."

NVIDIA does note, however, that "one limitation of our method is that it fails for some sparsely structured images such as the bars on the door," as seen in the image comparison below.

From left to right: the corrupted image, NVIDIA's results and the original image.

Current shortcomings aside, this particular tool—prototype or otherwise—appears to be leaps and bounds ahead of anything else currently available. Unsurprisingly, there’s no word on when, or if, we’ll ever see it released, let alone reach consumers, but we'll keep our fingers and toes crossed.