“The front page of the internet” website Reddit has been sued by a woman who claims that “over the course of the last decade” the popular discussion website “has knowingly benefited financially from videos and images posted to its website(s) featuring victims who had not yet reached the age of majority.”
The Jane Doe behind the suit has accused the platform of not only repeatedly allowing an abusive ex-boyfriend to upload both photos and videos featuring her as a minor, but also of taking “virtually no action” to “address this horrifying and pervasive trend.”
Immoral content and financial benefits
The content, taken without her knowledge or consent, consists of sexual photos and videos of her as a 16-year-old and has been uploaded countless times to a website that in 2019 alone had at least 430 million active users.
The class-action suit alleges that from 2019 to the present, the plaintiff’s now ex-boyfriend has distributed indecent photos and videos of her across no fewer than 36 different subreddits, with Reddit moderators sometimes taking “several days” to remove the content after she first alerted them.
“Because Reddit refused to help, it fell to Jane Doe to monitor no less than 36 subreddits — that she knows of — which Reddit allowed her ex-boyfriend to repeatedly use to repeatedly post child pornography,” the complaint reads. “Reddit’s refusal to act has meant that for the past several years Jane Doe has been forced to log on to Reddit and spend hours looking through some of its darkest and most disturbing subreddits so that she can locate the posts of her underage self and then fight with Reddit to have them removed.”
Even worse, despite repeated infractions, Reddit administrators limited themselves to banning only the user’s original account, allowing the man to create a new one and continue posting the stream of revenge porn, “often to the exact same subreddit.”
The class-action suit represents anyone in a similar situation
The suit covers anyone who had similar content posted on Reddit while under the age of 18, and it invokes controversial measures instituted in 2018 under the Allow States and Victims to Fight Online Sex Trafficking Act and the Stop Enabling Sex Traffickers Act, a pair of laws known together as FOSTA-SESTA. The laws amend a part of the 1996 Communications Decency Act called Section 230, which shields web publishers from legal liability for what users post on their websites, and instead make knowingly assisting, facilitating, or supporting sex trafficking illegal.
Reddit: Child sexual abuse content has no place on the Reddit platform
However, a Reddit spokesperson told Gizmodo that any and all content involving child sexual abuse “has no place on the Reddit platform” and that the company “actively maintain[s] policies and procedures that don’t just follow the law, but go above and beyond it.”
A similar statement to The Verge also assures that Reddit deploys “both automated tools and human intelligence to proactively detect and prevent the dissemination of CSAM material. When we find such material, we purge it and permanently ban the user from accessing Reddit. We also take the steps required under law to report the relevant user(s) and preserve any necessary user data.”
Even so, Section 230 will most likely protect Reddit from civil liability if the court ultimately decides that image sharing does not constitute trafficking.