As a futurist, I write often about the advantages of digital transformation for organizations and how early adopters gain extra advantages that aren’t available to laggards. One of the best demonstrations of this point was when Brad Parscale, the digital director of Mr. Trump's 2016 campaign, shared that using Facebook was an important factor in their win. In his words, "Facebook moved the needle for us." He understood how Facebook's computer algorithms worked before others did.
Let’s pause a moment to define what computer algorithms are. An algorithm is a set of step-by-step instructions; in software, those instructions are code written by people - in this case Facebook employees. Algorithms consist of rules and logic that enable software to perform automated reasoning. How does Facebook use them? Kevin Roose, a technology columnist for The Times, describes it this way: "The platform [Facebook and its algorithms] are designed to amplify emotionally resonant posts, and people and organizations that are skilled at turning passionate grievances into powerful algorithm fodder win.”
Facebook’s algorithms are programmed to amplify content based on simple rules: controversy wins, and negative content beats positive content. Facebook’s algorithms love arguments, debates and agitation. Parscale understood this before his opponents did. A recent Forbes article supports this view: "The recommendation algorithms on social media might be complex and somewhat mysterious, but they generally favor engagement; thus, controversy." If you want to attract a mass audience on Facebook or many other social media sites, be controversial - that's how the algorithms are programmed. It's not truth or virtue that wins; it's whatever drives audience engagement (read: high blood pressure). Higher audience engagement, not surprisingly, equates to higher ad revenue for Facebook.
Here is how Facebook described its 2018 "meaningful interactions" algorithm update: "This update will prioritize posts that spark conversations and meaningful interactions between people. To do this, we will predict which posts you might want to interact with your friends about, and show these posts higher in feed. These are posts that inspire back-and-forth discussion in the comments and posts that you might want to share and react to." What inspires more back-and-forth comments and shares than controversial topics? Could it be that your friends, family and neighbors are not naturally as angry as they appear on social media? Could it be that Facebook's algorithms are artificially organizing and manipulating content to agitate us? The answer, apparently, is yes.
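To make the idea concrete, here is a purely illustrative toy sketch of engagement-based ranking. The signals and weights below are hypothetical - Facebook's real system is vastly more complex and not public - but the toy version shows the mechanic: when predicted comments and shares are weighted most heavily, a divisive post outranks a pleasant one.

```python
# Toy model of engagement-based feed ranking.
# All signals and weights are hypothetical illustrations,
# not Facebook's actual code or parameters.

def engagement_score(post):
    """Score a post by predicted engagement signals.

    Back-and-forth comments are weighted most heavily, mirroring the
    stated goal of prioritizing posts that "spark conversations."
    """
    return (3.0 * post["predicted_comments"]
            + 2.0 * post["predicted_shares"]
            + 1.0 * post["predicted_reactions"])

def rank_feed(posts):
    """Order a feed so the highest-engagement posts appear first."""
    return sorted(posts, key=engagement_score, reverse=True)

feed = [
    {"id": "vacation_photo", "predicted_comments": 2,
     "predicted_shares": 1, "predicted_reactions": 30},
    {"id": "divisive_hot_take", "predicted_comments": 40,
     "predicted_shares": 25, "predicted_reactions": 15},
]

for post in rank_feed(feed):
    print(post["id"], engagement_score(post))
```

Notice that nothing in the scoring function asks whether a post is true or virtuous - only whether it will provoke a reaction. That is the whole point of the critique above.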
Software algorithms aren’t magic. Groups of men and women working in a conference room design the logic and reasoning on a whiteboard. They choose what they want the software to do. Programmers then sit down in front of a computer screen and write the code. They are very intentional about it - although the full impact of their intentions may well be unintended.
Here is more from The Times’ columnist Kevin Roose: “We now know Facebook’s product decisions can make or break political movements, move fringe ideas into the mainstream, or amplify partisan polarization.” Facebook's algorithms now influence hundreds of millions of users daily, including children, seniors and those less familiar with how social media platforms work.
Humans, as social creatures, seek community. Facebook fills that need, but at a price. Users must cede control over what they see and read on Facebook to computer algorithms. These algorithms are sponsored: organizations pay Facebook for the right to influence our thinking through them. Facebook maximizes the influence it can have on our minds, then sells access to its customers. Here is how Andrew Marantz of The New Yorker describes it: "Facebook is a company whose business model depends on monitoring its users, modifying and manipulating their behavior, and selling their attention to the highest bidder."
Facebook's algorithms are super effective at getting reactions. My nearly 80-year-old mother knows just about every conspiracy theory, falsehood and story designed to manipulate our thinking and opinions as soon as it is shared on Facebook. Her network is highly efficient at reading, reacting to and sharing false stories, rumors and unproven claims. Although she believes she knows which ones are false, our conversations are filled with her repeating unproven and unverified stories and rumors.
The concern for all of us is that we may be influenced in ways we don’t understand, recognize or want. Computer algorithms, artificial intelligence and machine learning working 24 hours a day continuously analyze our Facebook feeds, photos, comments and friends so they can best understand how to influence our thinking. Algorithms then alter what we see and read based on what sponsoring organizations want us to think. As a result of Facebook’s business model, we may be fed false or unprincipled information that shapes our thinking and opinions without us fully recognizing it.
Critics' concerns about the content being presented to Facebook users are not just hypothetical. More from Marantz: "Facebook has built the world’s biggest microphone and then they choose to rent that microphone to liars, authoritarians, professional propagandists, or anyone else who can afford to pay market rate." When Facebook’s founder Mark Zuckerberg was questioned about hosting, sharing and promoting false, misleading and potentially harmful information, he answered, “I believe strongly that Facebook shouldn’t be the arbiter of truth for everything that people say online.” His position has so far enabled Facebook to continue profiting from the sharing and promotion of concerning information.
Facebook discovered how to attract and entertain our socially-oriented minds, and then wrote code to influence how we think. They are now selling access to our minds to the highest bidder with little concern for the truth or virtue of the content they share and promote. Are we OK with that?
As a futurist, I don't so much predict the future as consider various future scenarios; predicting in a complex world is a fool's errand. But in a scenario where we all participate in social media networks that feed us endless controversies to keep us arguing and fighting - because that is how they profit - I don't foresee a united, peaceful and reasoned outcome. I hope we can create a different scenario for ourselves.
Read more on the Future of Information here:
- In 2021, Facebook Decides What "We the People" Think
- Twenty-One People Who Control the World
- The Utility of Truth
- Selling Beans During Boycotts, Buy-cotts and Disinformation
- Mixing Business and Politics Requires a Strategy
- Swarming and the Requirement for a Chief Values Officer
- We Can be Silent No More - Influencer Strategies and Responsibilities
- Our Minds on Facebook Algorithms
- Secrets, Brands and Global Swarming
- Facebook's Infodemic on the Pandemic
Kevin Benedict
Partner | Futurist | Leadership Strategies at TCS
View my profile on LinkedIn
Follow me on Twitter @krbenedict
Join the Linkedin Group Digital Intelligence
***Full Disclosure: These are my personal opinions. No company is silly enough to claim them. I work with and have worked with many of the companies mentioned in my articles.