The Fight Against “Cyberflashing”: Bumble Releases Open-Source AI To Stop Unsolicited Nudes

Bumble users have had enough of unsolicited nude pictures, so the company is releasing an open-source AI tool to combat “cyberflashing”.

The company launched a tool called Private Detector back in 2019.

As the name suggests, the tool detects nude photos and flags them before the recipient can open them. Anyone who has used a dating app has run into unsolicited nudes, but according to Bumble, such incidents are actually rare in the grand scheme of things.
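
As a rough illustration of that flow, the sketch below gates an incoming photo behind a classifier score before anything is rendered. The threshold, the `lewd_probability` stub and the message strings are all hypothetical; they stand in for whatever model and interface the app actually uses.

```python
from dataclasses import dataclass

LEWD_THRESHOLD = 0.5  # assumed cut-off; Bumble's actual threshold is not public


@dataclass
class IncomingImage:
    sender: str
    data: bytes


def lewd_probability(image: IncomingImage) -> float:
    """Hypothetical stand-in for a nude-image classifier such as Private Detector."""
    return 0.0  # a real implementation would run model inference on image.data


def render_incoming_photo(image: IncomingImage) -> str:
    """Decide how the photo is presented before the recipient opens it."""
    if lewd_probability(image) >= LEWD_THRESHOLD:
        # Flagged: show a blurred placeholder and let the recipient choose
        # to view, block, or report instead of displaying the photo outright.
        return f"Blurred photo from {image.sender}: flagged as potentially lewd (view / block / report)"
    return f"Photo from {image.sender}"


print(render_incoming_photo(IncomingImage(sender="match_42", data=b"")))
```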

“Even though the number of users sending lewd images on our apps is luckily a negligible minority — just 0.1% — our scale allows us to collect a best-in-the-industry dataset of both lewd and non-lewd images, tailored to achieve the best possible performances on the task,” the company said in a press release.

However, even that 0.1% could be pushed lower to keep users safe from unwanted imagery.

Now, with the tool available for free on GitHub, other developers can use it to cut down on unsolicited nudes. The press release also includes a whitepaper with further guidance.
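
For developers who want to try it, the hedged sketch below shows roughly what inference could look like, assuming the published checkpoint is a TensorFlow SavedModel that maps a preprocessed image batch to a lewd-probability score. The local path, input size and preprocessing are illustrative guesses rather than the repository's exact API; the GitHub README and the whitepaper are the authoritative reference.

```python
import tensorflow as tf

# Hypothetical local path to a checkpoint downloaded from Bumble's GitHub release.
MODEL_DIR = "private_detector_savedmodel"


def load_image(path: str, size: int = 480) -> tf.Tensor:
    """Read an image file and resize/normalise it into a 1-image batch."""
    raw = tf.io.read_file(path)
    img = tf.image.decode_image(raw, channels=3, expand_animations=False)
    img = tf.image.resize(img, (size, size))
    img = tf.cast(img, tf.float32) / 255.0   # scale pixel values to [0, 1]
    return tf.expand_dims(img, axis=0)       # add the batch dimension


model = tf.saved_model.load(MODEL_DIR)
# The exact call convention depends on how the checkpoint was exported; here we
# assume the loaded object maps an image batch to per-image lewd probabilities.
score = float(model(load_image("incoming_photo.jpg"))[0])
print(f"Probability the image is lewd: {score:.2%}")
```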

Also read: This Dating App Can Report You for Body Shaming
