Google simplifies removal of explicit deepfakes with latest update

New update will make it easier for victims to take down deepfake pictures and videos


Google announced on Wednesday, July 31, that its latest update makes it easier to remove explicit deepfakes from Search.

According to Microsoft Start, the company said the new system will allow victims to take down explicit deepfake pictures and videos of themselves.

Product manager Emma Higham wrote in a blog post, “With every new technological advancement, there are new opportunities to help people—but also new forms of abuse that we need to combat. As generative imagery technology has continued to improve in recent years, there has been a concerning increase in generated images and videos that portray people in sexually explicit contexts, distributed on the web without their consent.”

Moreover, Google emphasised that it is making these efforts to give people peace of mind, “especially if they are concerned about similar content about them popping up in the future.”

The search engine will also implement ranking changes designed to prevent explicit deepfake content from surfacing in search results in the first place.

Google added that it is also working on improvements that will allow Search to automatically detect and remove fake explicit content.