Google's new image search tools can help you spot AI-generated fakes

An example provided by Google: uploading a photo of a staged Moon landing shows how the image has been used in stories debunking it (Image: Google)
Pranav Dixit
  • May 11, 2023
  • Updated May 11, 2023, 8:44 AM IST

The proliferation of photorealistic images altered with AI editing tools or created by generative AI bots like Midjourney or Stable Diffusion has made it increasingly difficult to discern whether a picture is real or fake. This is particularly problematic in an age where the spread of misinformation is a significant concern. Fortunately, Google is rolling out a new tool this summer called "About this image" that should help. It will initially be available for English-language searches in the US, with other countries hopefully following later.

The feature in Google image searches is akin to the "about this" dropdown that shows up on links within standard search results. When you perform a "reverse image search" by uploading an image of unknown origin, you'll now see a menu option that allows you to find out when that picture and others like it were first indexed by Google. You can also find out where on the web it first appeared and which sites it has appeared on since.

This type of tool will be particularly useful in situations where it's unclear whether a picture is real or fake. For example, if a picture of a breaking news event first appeared on a reputable news site like Getty, Reuters, or CNN, it is more likely to be legitimate. But a picture that first appeared on a random comedy subreddit while bearing a news organisation's watermark is more likely to be a fake, regardless of how convincing it looks.

In one example provided by Google, uploading a photograph of a staged Moon landing prompted the tool to show how the image has been used in stories debunking it. However, the tool will be useful in a wide variety of circumstances beyond just debunking fake news.

In addition to the "About this image" tool, Google also announced that its own generative AI tools will include metadata with each picture to indicate that it's an AI-created image, not a photo. This labelling will stay with the image wherever it appears, whether on a Google platform or not. Other creators and publishers will be able to label their images using the same technology; Midjourney, Shutterstock, and others will roll out the markup in the coming months, according to Google's blog post.

Meanwhile, here are some other ways to help you identify AI-generated fakes:

Look for inconsistencies in the image's lighting, texture, or composition. AI-generated images often contain subtle flaws that real photographs lack: shadows may fall in different directions, text or fine details such as hands may be distorted, or surfaces may look unnaturally smooth.

Check the image's metadata. Metadata is information stored alongside an image that describes how it was created, such as the date and time it was taken, the camera used, and the location. If an image's metadata is missing or does not match the image itself, the picture may have been generated or manipulated.
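As a concrete illustration, here is a minimal sketch, assuming the Python Pillow library is installed and using a hypothetical file name, that prints whatever EXIF metadata an image carries. Photos straight out of a camera usually include a capture date and camera model, while AI-generated or stripped images often carry nothing at all.

```python
# Minimal EXIF inspection sketch using Pillow (pip install pillow).
from PIL import Image
from PIL.ExifTags import TAGS

def print_exif(path: str) -> None:
    with Image.open(path) as img:
        exif = img.getexif()
        if not exif:
            # Common for AI-generated images or photos with stripped metadata.
            print("No EXIF metadata found.")
            return
        for tag_id, value in exif.items():
            tag = TAGS.get(tag_id, tag_id)  # map numeric tag IDs to readable names
            print(f"{tag}: {value}")

print_exif("suspect_photo.jpg")  # hypothetical file name
```

Absent or implausible metadata is not proof of fakery on its own, since many sites strip EXIF data on upload, but it is one more signal to weigh.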

Use a reverse image search. A reverse image search lets you upload an image to a search engine and find other websites where it has appeared. This can help you determine whether an image is original or has been copied from another source.
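If the image you want to check is already hosted online, a reverse image search can also be started from a short script. The sketch below assumes the Google Lens upload-by-URL endpoint and uses a hypothetical image address; it simply opens the search in your default browser, and the endpoint is not an official API, so it may change.

```python
# Open a Google Lens reverse image search for a publicly hosted image.
import webbrowser
from urllib.parse import quote

def reverse_image_search(image_url: str) -> None:
    # URL-encode the image address and hand it to Google Lens.
    search_url = "https://lens.google.com/uploadbyurl?url=" + quote(image_url, safe="")
    webbrowser.open(search_url)

reverse_image_search("https://example.com/suspect_photo.jpg")  # hypothetical URL
```

For a picture saved on your device, uploading it manually through Google Images achieves the same result.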
