Social engineering has proven it is possible to know the societal or "systemic" determinants of human behavior in a way that permits them to be manipulated and controlled from afar.
Our minds are vulnerable. The weaknesses in our thinking and decision-making processes are well documented. When these vulnerabilities and weaknesses are exposed to professionals with nefarious intent who are trained in social engineering techniques, bad things happen. Social media and messaging platforms enable both scaled access to and profit from these vulnerabilities. They expose the brains of billions (Facebook has over 2.7 billion users) to these techniques by selling and promoting access to our innermost feelings and emotions.
It is critical that we as humans, neighbors and communities understand how social engineering techniques work on our brains and our societies. We must recognize these methodologies so we can defend against them. We need to identify them and call them out. We must warn others. We must legislate against these techniques and hold social media and messaging platforms accountable.
Social media apps and platforms are used by our children, the elderly and everyone in between. These platforms give direct access to our brains. They support mind manipulation at scale. What follows is an extensive, although incomplete, list of the strategies, tools and techniques of social engineering I have gleaned from literally hundreds of articles, academic papers and reports. I have organized them into five categories: Amplify and Promote, Constraining, Emotions and Motivations, Personal Information, and Strategy and Tactics.
Before jumping into the list of techniques below, I want to first define the word "professional" that I use numerous times. In this context I use it to describe people who make money practicing the dark art and science of social engineering to manipulate our thinking.
Amplify and Promote:
- Deception spreads faster than the truth.
- False political news gets distributed significantly farther, faster, deeper, and more broadly than the truth on Twitter, and this is because humans (rather than bots) spread the false news.
- It is easy to mislead certain groups of citizens, especially older people and those prone to conspiracy theories.
- Those polarized toward conspiracy theories are the most inclined to spread unverified rumors.
- Professionals know how to use social engineering to amplify existing resentments and anxieties, raise the emotional stakes of particular issues, stir distrust among potential coalition partners, and subtly influence decisions about political behaviors.
Constraining:
- Professionals use social engineering to promote, control and disrupt a given narrative.
- Social engineers know that altering people's exposure to information in ways that constrain their choices and behavior influences their emotions and elicits certain behaviors.
- Studies show conspiracies online are concentrated within the communities who already agree with them.
- “Filter Bubbles” limit exposure to outside sources of views and opinions. This lack of engagement with different ideas makes it difficult to offer factual correctives to disinformation circulating within filter bubbles.
- Dissenting information is mostly ignored or may even serve to increase a group’s polarization.
- Framing or creating false and misleading context for an argument helps the audience resist counter arguments.
- Deterring voters with falsehoods and fear seems to be easier than motivating them with facts and hope.
- When faced with new information, humans often fall back on habits, tradition or prevailing group-think, rather than systematic information-processing strategies and decision-making.
- Nudges are strategies used to limit choices for an audience. They reinforce habits and traditional ways of thinking, which are resistant to new information and change. Precluding reflection and deliberation, and limiting people's choices in order to change their behavior, is coercive.
- Professionals understand how to find the right news to put in front of the right micro-targeted audiences. They know what works and what doesn't.
- Professionals present selected information to audiences to influence their emotions, motives, objective reasoning, and ultimately their behaviors.
- Social media and messaging platforms are used to harass and discourage specific individuals from taking part in public debate or taking specific actions.
- Professionals draw attention to minor issues or actions in order to distract from the real, key issue. Such activities tend to focus on the information environment, seeking to dilute, flood or poison it with alternative messages.
- Professionals engage in “voter suppression operations,” which are intended to influence voters leaning in favor of their opponents to not vote at all.
- In the past, professionals have used Facebook quizzes, given to thousands of users, to collect a cache of 87 million Facebook profiles, which were then used to motivate some voters and to suppress others.
- Professionals cherry-pick facts to advance their own narrative.
- Professionals hijack narratives and national conversations with provocative falsehoods designed to redirect conversations and distract.
Emotions and Motivations:
- Professionals appeal to non-rational motivations, emotional triggers and unconscious biases.
- Professionals use psychographic profiles, neuromarketing tools and data to determine the emotional impact of campaigns, and to learn how to tailor persuasive political messages to appeal to an audience’s psychological needs.
- The goals of professionals are to modify attitudes and shape the target audience’s psychological processes, motivations and ideas.
- Professionals pushed extreme rumors of child-trafficking cabals and falsehoods about voter fraud and mail-in ballots to erode the confidence and will of already hard-to-motivate voters.
- Professionals are trained to manipulate social processes, including relationship development and group dynamics.
- Professionals utilize behavior-change campaigns, which involve profiling individuals and automatically tailoring the adverts presented to them based on personality.
- Professionals often play both sides of a conflict or debate, praising one side while degrading the other. This creates division in the community, turning one group against another, which is precisely their goal.
- Research has shown that less informed voters are not only less likely to vote but more likely to believe falsehoods.
Personal Information:
- Algorithms based on an individual's "likes" of public Facebook pages can accurately and automatically (using artificial intelligence) predict that individual's personality traits.
- Algorithms can also predict highly sensitive personal attributes including political and religious views, sexual orientation, ethnicity, intelligence, happiness, use of addictive substances, parental separation, age, and gender.
- Cambridge Analytica was reported to have had between 2,000 and 5,000 individual pieces of personal information on every person in the USA over the age of 18 years.
- After analyzing digital footprints, artificial intelligence has been found to be a better judge of personality than even close friends, family, spouses and colleagues.
- Computer-based models are significantly more accurate than humans in personality judgments.
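To make the "likes"-based prediction concrete, here is a minimal sketch in Python using entirely hypothetical data. It illustrates the general idea only: each user is represented as a binary vector over a small page vocabulary, per-page weights are learned from a few labeled examples (like-rate among users who have a trait minus like-rate among those who don't), and a new user's trait score is a weighted sum of their likes. Real systems use far larger data sets and more sophisticated models.

```python
# Hypothetical illustration: predicting a personality trait from which
# public pages a user has "liked". All pages, users and labels are invented.

PAGES = ["poetry", "skydiving", "philosophy", "MMA", "meditation"]

# Tiny made-up training set: (binary like-vector, trait label 0/1).
train = [
    ([1, 0, 1, 0, 1], 1),
    ([1, 0, 1, 0, 0], 1),
    ([0, 1, 0, 1, 0], 0),
    ([0, 1, 0, 1, 1], 0),
]

def learn_weights(rows):
    """Per-page weight = like-rate among positives minus like-rate among negatives."""
    pos = [v for v, y in rows if y == 1]
    neg = [v for v, y in rows if y == 0]
    weights = []
    for i in range(len(PAGES)):
        p = sum(v[i] for v in pos) / len(pos)
        n = sum(v[i] for v in neg) / len(neg)
        weights.append(p - n)
    return weights

def trait_score(likes, weights):
    """Higher score means the model predicts the trait is more likely present."""
    return sum(l * w for l, w in zip(likes, weights))

weights = learn_weights(train)
new_user = [1, 0, 1, 0, 0]  # hypothetical user who likes poetry and philosophy
print(trait_score(new_user, weights))  # a positive score here suggests the trait
```

The design choice here (difference of like-rates) is just the simplest possible scoring rule; the research referenced above used regression models trained on millions of like-profiles, but the pipeline shape is the same: likes in, trait estimate out.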
Strategy and Tactics:
- Adjusting news feeds and/or search results has been shown to influence voting behavior. This is recognized as coercive behavior.
- Social engineering campaigns often follow a pattern: Analyze audience. Associate data with individual people. Initiate micro-targeting. Start digital outreach with personalized messages. Start a program for voter engagement. Start fundraising.
- Political campaigns now combine public voter files with commercial data from data brokers to develop highly granular and comprehensive voter profiles.
- Using Facebook's advertising platform one can automatically expand the number of people a political campaign can target by identifying voters who have “common qualities” that “look like” known politician X supporters on Facebook, and others identified as “persuadables” and swing voters.
- Behavioral data is used by politicians to target voters with tailored messages that align with their daily activities, such as hearing a radio advert about education when dropping off one's child at school.
- Professionals use behavioral microtargeting to understand individuals' complex personalities; then use psychologists to determine what motivates these individuals to act; and then they engage a creative team to tailor specific messages to those personality types.
- Professionals get unsuspecting users to answer personality questions that can be used to train algorithms, and to generate personality scores for the app users and their Facebook connections. These personality scores are then matched with US voter records, which enable voter profiling and targeted advertising services.
- Professionals use microtargeted ads, follow-up surveys, and large data sets to win over low-education, less-informed voters in critical areas in order to win elections.
- Professionals use social engineering to influence decisions, perceptions and behaviors of political leaders, populations and targeted groups.
- Professionals use social engineering methodologies to exploit existing societal and individual vulnerabilities in opinion formation.
- Professionals use psychographic targeting to create personalized content designed to influence and/or manipulate individuals or groups selected on the basis of their psychographic profiles.
- Social engineering is now widely used because the ease, speed and virality of information dissemination, together with the increasing reach, scale, penetration, precision and personalization of information targeting, have made these techniques both possible and affordable.
- Professionals use social engineering techniques to establish a coherent narrative and to establish a common perspective among their targets/audiences.
- Professionals recruit and promote an ideology using highly individualized, targeted political propaganda based on their target’s interest and preferences.
- Professionals implement disruptive social engineering campaigns to disrupt or destroy an emerging or existing narrative by generating polarization to foment distrust, and spread disinformation among key policy-makers in order to disrupt their decision-making and opinion-forming processes.
- Professionals deploy “digital influence systems” consisting of technologies for surveillance, targeting, testing, and automated decision-making designed to influence a target audience.
- Professionals use social engineering techniques to affect an audience’s choices, ideas, opinions, emotions or motivations, and interfere with their decision-making processes.
- Professionals use social engineering methodologies to exploit their targets’ opinion-formation and decision-making processes through various techniques such as trolling, social bots, memes and dark ads (a type of online advertising visible only to the advert's publisher and the intended target group).
- Political professionals use social engineering campaigns to target audiences when they are most vulnerable to manipulation through three main strategies: mobilizing supporters; dividing an opponent's coalition; and leveraging influence techniques informed by behavioral science.
- Professionals combine psychological research and data-driven targeting to identify an audience's vulnerabilities.
- Professionals use big data, Facebook and personalized messages to do three specific things - persuade voters, find more of them, and raise big money.
- Political professionals send follow-up surveys to users who have seen their ads, so they can learn which content, narratives, and messengers are most effective at impacting approval ratings, voter enthusiasm, and vote choice across a range of issues.
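The "lookalike" expansion mentioned above, where a campaign finds voters with "common qualities" who "look like" known supporters, can be sketched in a few lines. This is a hedged illustration with hypothetical users and feature vectors, not Facebook's actual algorithm: it simply ranks unknown prospects by cosine similarity between their feature vector and the average vector of known supporters.

```python
# Hypothetical sketch of "lookalike" audience expansion: rank prospects by
# how similar their feature vector is to the centroid of known supporters.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors (0.0 if either is zero)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def centroid(vectors):
    """Element-wise mean of a list of vectors."""
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]

# Invented feature vectors (e.g. interests, demographics, engagement signals).
known_supporters = {
    "alice": [1, 0, 1, 1],
    "bob":   [1, 0, 1, 0],
}
prospects = {
    "carol": [1, 0, 1, 1],  # shares most qualities with the supporters
    "dave":  [0, 1, 0, 0],  # shares none
}

seed = centroid(list(known_supporters.values()))
ranked = sorted(prospects, key=lambda u: cosine(prospects[u], seed), reverse=True)
print(ranked)  # most "lookalike" prospects first
```

A campaign would then target advertising at the top of this ranking; production systems use far richer features and learned models, but the underlying similarity-to-seed-audience logic is the same.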
Read more on the Future of Information, Truth and Influence here:
- Echo Chambers, Old and New Media Influence and Symbiotic Business Models
- The Vulnerable Targets of Social Engineering and Mind Manipulation
- Disinformation is Both Expensive and Deadly
- Social Engineering Escapes the War Zone
- Fooled by Psychographic Profiles and Social Engineering
- Social Engineering - Mind Manipulation at Scale
- Conspiracy Theories and Their Impact on Employment Opportunities
- Ideas as Competitive Advantages
- Facebook Decides What People Think
- Twenty-One People Who Control the World
- The Utility of Truth
- Human Thinking as Friction
- Selling Beans During Boycotts, Buy-cotts and Disinformation
- Mixing Business and Politics Requires a Strategy
- Swarming and the Requirement for a Chief Values Officer
- We Can be Silent No More - Influencer Strategies and Responsibilities
- Our Minds on Facebook Algorithms
- Secrets, Brands and Global Swarming
- Facebook's Infodemic on the Pandemic
- Reality is Required
- Covid-19 and the Value of Ideas
************************************************************************
Kevin Benedict
Partner | Futurist | Leadership Strategies at TCS
View my profile on LinkedIn
Follow me on Twitter @krbenedict
Join the Linkedin Group Digital Intelligence
***Full Disclosure: These are my personal opinions. No company is silly enough to claim them. I work with and have worked with many of the companies mentioned in my articles.