Google’s newest update cracks down on deepfake nudes in Search


Summary

  • Google is taking proactive measures to curb access to explicit deepfakes on Search.
  • New reporting system allows affected users to remove offending images faster.
  • Google is automatically filtering explicit deepfakes out of search results to limit their spread.




Over the past couple of years, we’ve seen some incredible advancements thanks to artificial intelligence. Big names like OpenAI, Microsoft, and Google have continued to push things forward, debuting new and useful tools across a variety of products. At the same time, these companies have exercised caution as they navigate largely uncharted territory. For all the good that has come from the adoption of AI, the technology has also enabled some nefarious and downright scandalous behavior.


Of course, Google understands this, and has for some time built tools and policies to protect its users. Now the company is sharing how it will further curb access to explicit deepfakes on Search, introducing new measures that make it easier to remove this type of media from the platform. It is also implementing ranking changes that should prevent these images from surfacing in search queries in the first place.


New changes that could have huge impacts

There are two changes. First, Google is bolstering its existing reporting system for explicit deepfakes, allowing affected users to get offending images taken down much faster. The reporting process has been simplified, giving victims a way to remove offending content at a larger scale.


When a report is filed against such media, Google will automatically work to filter that content out of search results. It will also attempt to locate duplicates of the images and remove those as well. As you can imagine, once deepfakes hit the internet, they can spread like wildfire, and this new approach makes it far less tedious to keep such images off Search.

Of course, it should be noted that this doesn’t remove the images from the internet; it only removes them from Google’s search results, making them harder for the public to find. In addition, Google will monitor this kind of content to ensure it doesn’t climb the rankings in Search. Websites hosting such content will automatically be ranked lower, and searches will instead surface standard, non-explicit results such as images and news articles.


Google has apparently already begun implementing this and claims it has seen a 70% reduction in exposure to these results for such queries. The company acknowledges that legitimate explicit content exists online; its goal here is to remove and suppress material that isn’t genuine. This is a significant step for Google, and more changes will no doubt follow. But it will be a constant cat-and-mouse game, so it will be interesting to see how things play out.


