New Report Shows How Facebook’s Algorithms Boosted Hate and Misinformation at Inflection Points
Today, Accountable Tech released a new report examining how Facebook’s algorithms operated during the initial months of the COVID-19 pandemic and in the wake of George Floyd’s killing – and found they played a toxic role at each of these inflection points.
On both fronts, Facebook’s algorithms pushed users to join Groups, like Pages, and engage with content that promoted dangerous conspiracy theories, encouraged violence, and actively spread hate speech.
A few of the many alarming examples include:
Facebook repeatedly promoted Groups touting false coronavirus cures, spreading dangerous misinformation about testing, and calling for violence against BLM protesters – Groups like “Hydroxychloroquine,” “Fauci & Gates to prison worldwide Resistance,” and “BLM is a Terrorist Organization,” among others.
After a user ‘liked’ several ostensibly innocuous Pages, Facebook’s algorithm steered them toward a myriad of Pages pushing racist and anti-science conspiracy theories. Those Pages included “The Truth About Cancer” and “GreenMedInfo” – both of which were identified as top health misinformation superspreaders in a recent Avaaz report – as well as ‘Lives Matter’, ‘Educating Libs’, and others that feature racist lies about BLM protesters.
The most outrageous content consistently won out. For example, an Accountable Tech analysis of CrowdTangle data revealed that the single highest-performing post on the platform about George Floyd was an 18-minute-long video of Candace Owens smearing him, which has now been viewed more than 94 million times.
# # #