The European Commission is considering imposing heavy fines on websites that allow extremist content to linger on their platforms for more than an hour. The regulation would affect Twitter, Facebook and YouTube, among others.
So far, the companies have self-regulated, but the EU wants to put more explicit rules in place, especially in the wake of the wave of terror attacks that has swept across Europe in the past few years.
A study released last month by the Counter Extremism Project found that, between March and June, ISIS members and supporters uploaded no fewer than 1,348 videos to YouTube, garnering 163,391 views on average. The content remained on YouTube for over two hours before it was found and deleted, more than enough time for the videos to be downloaded, copied and redistributed across other social media platforms such as Facebook and Twitter.
Julian King, the EU commissioner for security, told the Financial Times that Brussels had not seen enough progress from the tech companies on removing extremist material, and that it would "take stronger action in order to better protect our citizens".
If the regulation is eventually approved, it will mark the first time the European Commission has directly and explicitly targeted the way tech firms handle illegal content.