A recent report claims that the world’s largest and most influential social media platform could have prevented a little over 10 billion estimated impressions “for top-performing pages that repeatedly shared misinformation”. The analysis concerns Facebook and was produced by Avaaz, a nonprofit organization that promotes activism on a wide range of issues, including climate change, human rights, animal rights, corruption, poverty, and conflict.
Impressions represent the number of times a piece of content is shown. Whether or not it is clicked, an impression means the content was delivered to someone’s feed; the user doesn’t have to engage with a post for it to count, and a single piece of content can generate multiple impressions for the same user.
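To make the distinction concrete, here is a minimal sketch in Python of how a feed-metrics counter might tally impressions separately from engagements. The event format and function name are illustrative assumptions for this example, not Facebook’s actual pipeline.

```python
from collections import Counter

def tally_feed_events(events):
    """Tally impressions and engagements per post.

    Hypothetical event format: (post_id, event_type), where
    event_type is "shown" (delivered to a feed) or "click".
    Every delivery counts as an impression, clicked or not,
    and the same post can rack up several impressions.
    """
    impressions = Counter()
    engagements = Counter()
    for post_id, event_type in events:
        if event_type == "shown":
            impressions[post_id] += 1   # counted on delivery alone
        elif event_type == "click":
            engagements[post_id] += 1   # counted only on interaction
    return impressions, engagements

# The same post delivered three times but clicked only once
# still yields 3 impressions and just 1 engagement.
events = [("post_1", "shown"), ("post_1", "shown"),
          ("post_1", "click"), ("post_1", "shown")]
impressions, engagements = tally_feed_events(events)
print(impressions["post_1"], engagements["post_1"])  # 3 1
```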
How deep does the rabbit hole go?
Avaaz’s report states that Facebook failed to act proactively during 2020 and thus failed to stop an estimated 10.1 billion views of content posted by top-performing pages known to repeatedly share misinformation. The problem grew especially acute in the eight months leading up to the US election.
In fact, Avaaz identified no fewer than 267 pages and groups shown to endorse and glorify violence during and after the 2020 election, with a combined following of 32 million users, not a small number by any means. Even worse, 118 of those 267 pages and groups are still active on Facebook and, more worrying still, retain a following of almost 27 million. That means almost half of them can continue to post misinformation and content bordering on incitement to violence despite clearly violating Facebook’s policy.
What we have here… is a failure to communicate
“Failure to downgrade the reach of these pages and to limit their ability to advertise in the year before the election meant Facebook allowed them to almost triple their monthly interactions, from 97 million interactions in October 2019 to 277.9 million interactions in October 2020 – catching up with the top 100 US media pages (ex. CNN, MSNBC, Fox News) on Facebook,” the report concludes.
While Facebook did modify its algorithm to reduce the visibility of misinformation and false or hateful content, it only did so in October 2020, about a month before Election Day. Since then, most of those emergency policies have been rolled back to their pre-election state, a change that notably benefits far-right conspiracy movements such as QAnon.