Facebook's Infodemic on the Pandemic

I am a long-time technology enthusiast, analyst and futurist, and I love to discuss and write about the positive impact emerging technologies offer humanity. That said, we can’t be blind to the negative impacts as well. Facebook’s algorithms are increasingly identified as the source and amplifier of false and misleading posts that are widely distributed and promoted, with serious consequences for us all. The sheer volume of false information being distributed today by Facebook’s algorithms is overwhelming the truth.

A recent research report by the organization Avaaz, titled “Facebook's Algorithm: A Major Threat to Public Health,” found that health misinformation networks on Facebook appear to have outpaced authoritative health websites, despite the platform’s declared aggressive efforts to moderate and downgrade health misinformation and boost authoritative sources. This finding suggests that “Facebook’s moderation tactics are not keeping up with the amplification Facebook’s own algorithm provides to health misinformation content and those spreading it.” In other words, Facebook’s amplification robots are battling, and beating, the moderation robots that are trying to protect the truth. This is an important finding, because when false information overwhelms the truth on Facebook, people’s lives are at risk.

In April 2020 alone, just 82 websites flagged by NewsGuard for repeatedly sharing false information received an estimated 460 million views on Facebook. That is a massive amount of influence from a relatively small number of websites already identified as regular sources of false information.

Over the past 12 months, the study estimates that false health information on Facebook was viewed 3.8 billion times across the five countries examined: the United States, the UK, France, Germany, and Italy. When false health-related information is viewed that many times, a lot of people are going to believe it, with serious consequences.

I don’t believe Facebook is purposely trying to destroy the world; after all, it needs living, breathing humans as users and customers. I just think it has created a Frankenstein monster it no longer fully controls. Here is what Facebook says: “False news is bad for people and bad for Facebook. We’re making significant investments to stop it from spreading and to promote high-quality journalism and news literacy....our adversaries are going to keep trying to get around us. We need to stay ahead of them, and we can’t do this alone.” I just hope they realize that, in many documented cases, their adversaries are their own internal amplification robots and algorithms.

************************************************************************
Kevin Benedict
Partner | Futurist | Leadership Strategies at TCS
View my profile on LinkedIn
Follow me on Twitter @krbenedict
Join the Linkedin Group Digital Intelligence

***Full Disclosure: These are my personal opinions. No company is silly enough to claim them. I work with and have worked with many of the companies mentioned in my articles.
