The Fight Against "Cyberflashing": Bumble Releases Open-Source AI To Stop Unsolicited Nudes

Bumble users have had enough of unsolicited nude pictures, so the company is releasing an open-source AI tool to combat “cyberflashing”.

The company launched a tool called Private Detector in 2019.

As the name suggests, the tool detects nude photos and flags them before the recipient opens them. Most dating-app users have encountered unsolicited nudes at some point but, according to Bumble, such incidents are rare in the grand scheme of things.

“Even though the number of users sending lewd images on our apps is luckily a negligible minority — just 0.1% — our scale allows us to collect a best-in-the-industry dataset of both lewd and non-lewd images, tailored to achieve the best possible performances on the task,” the company said in a press release.

However, even that 0.1% could be brought lower still, keeping users safe from unwanted imagery.

Now that the tool is freely available on GitHub, other developers can use it to cut down on unsolicited nudes. The press release also links to a whitepaper for further guidance.
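Bumble's press release doesn't spell out an integration recipe, but the pattern the article describes, scoring an incoming image and flagging it before the recipient sees it, can be sketched roughly as follows. The `score_nudity` function here is a hypothetical stand-in for whatever classifier a developer plugs in (such as Private Detector), and the threshold value is an illustrative assumption, not Bumble's actual setting:

```python
from dataclasses import dataclass

# Illustrative threshold above which an image is blurred instead of shown.
# This value is an assumption for the sketch, not Bumble's real setting.
LEWD_THRESHOLD = 0.8

@dataclass
class InboundImage:
    sender: str
    data: bytes

def score_nudity(image: InboundImage) -> float:
    """Hypothetical stand-in for a real classifier such as Private Detector.

    A real implementation would run the image bytes through the model
    and return a probability that the image contains lewd content.
    """
    return 0.0  # placeholder: a real model call would go here

def gate_image(image: InboundImage, score_fn=score_nudity) -> dict:
    """Decide how an inbound image is presented to the recipient."""
    score = score_fn(image)
    if score >= LEWD_THRESHOLD:
        # Blur the preview and let the user choose to view, block, or report.
        return {"action": "blur_and_warn", "score": score}
    return {"action": "show", "score": score}

# Example: a mock classifier that flags this particular image as lewd.
result = gate_image(InboundImage("alice", b"..."), score_fn=lambda img: 0.97)
```

The key design point the article implies is that the check runs before display, so the recipient sees a blurred preview and a choice, never the raw image.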

Also read: This Dating App Can Report You for Body Shaming

Subscribe to our website to stay up to date with the latest news in technology.
