The past few months have seen a proliferation of think pieces about ‘fake news’, much of it overstated or wrong-headed, and much of it ignoring the fact that fake news has been around as long as news itself. But there is certainly some truth at the centre of it all, and it may be less that fake news is new and more that the rise of social networks has enabled a new type of ‘personalised’ manipulation via fake, hyperbolic and/or emotive stories – and it is that which we are noticing.
For those wishing to better inform themselves – in order to protect themselves against this manipulation – I heartily recommend two articles in particular. The first is a DefenseOne article titled “Weaponized Narrative Is the New Battlespace”:
Weaponized narrative seeks to undermine an opponent’s civilization, identity, and will by generating complexity, confusion, and political and social schisms. It can be used tactically, as part of explicit military or geopolitical conflict; or strategically, as a way to reduce, neutralize, and defeat a civilization, state, or organization. Done well, it limits or even eliminates the need for armed force to achieve political and military aims.
The article hits the nail on the head, I think, by pointing out that the ‘information overload’ we are now experiencing makes us vulnerable to oversimplified, emotive narratives (a key component also in the rise of ‘populist’ movements):
Cultures, institutions, and individuals are, among many other things, information-processing mechanisms. As they become overwhelmed with information complexity, the tendency to retreat into simpler narratives becomes stronger.
Under this stress, cultures fragment. Institutions are stretched until they become ineffective or even dysfunctional. Individuals who define their identity primarily through the state – such as Americans, Russians, Chinese, or Europeans – retreat to a mythic Golden Age nationalism, while those who prioritize cultural and religious bonds retreat to fundamentalism.
…By offering cheap passage through a complex world, weaponized narrative furnishes emotional certainty at the cost of rational understanding. The emotionally satisfying decision to accept a weaponized narrative — to believe, to have faith — inoculates cultures, institutions, and individuals against counterarguments and inconvenient facts. This departure from rationality opens such ring-fenced belief communities to manipulation and their societies to attack.
While the observations in the DefenseOne article are mostly about a new type of battleground between nation states, the second article I recommend takes this one step further, showing how any rich and powerful individual can push their own political views by manipulating us via weaponized narratives that use our own social data against us. The article, “Robert Mercer: the big data billionaire waging war on mainstream media”, starts off rather blandly, discussing one of the biggest funders of Donald Trump’s presidential campaign, Robert Mercer – “a billionaire who is, as billionaires are wont, trying to reshape the world according to his personal beliefs”. It is the later part of the article, where it discusses how Mercer is doing this, that we should all be paying close attention to:
there was another reason why I recognised Robert Mercer’s name: because of his connection to Cambridge Analytica, a small data analytics company. He is reported to have a $10m stake in the company, which was spun out of a bigger British company called SCL Group. It specialises in “election management strategies” and “messaging and information operations”, refined over 25 years in places like Afghanistan and Pakistan. In military circles this is known as “psyops” – psychological operations. (Mass propaganda that works by acting on people’s emotions.)
On its website, Cambridge Analytica makes the astonishing boast that it has psychological profiles based on 5,000 separate pieces of data on 220 million American voters – its USP is to use this data to understand people’s deepest emotions and then target them accordingly. The system, according to Albright, amounted to a “propaganda machine”.
…[According to the communications director of the Leave.EU (Brexit) campaign], Cambridge Analytica had worked for them…it had taught them how to build profiles, how to target people and how to scoop up masses of data from people’s Facebook profiles.
Facebook was the key to the entire campaign. A Facebook ‘like’, he said, was their most “potent weapon”. “Because using artificial intelligence, as we did, tells you all sorts of things about that individual and how to convince them with what sort of advert.”
Facebook profiles – especially people’s “likes” – could be correlated across millions of others to produce uncannily accurate results…with knowledge of 150 likes, their model could predict someone’s personality better than their spouse. With 300, it understood you better than yourself.
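The statistical idea behind the excerpt above – that ‘likes’, correlated across millions of users, can predict personality – is surprisingly simple at its core. The sketch below is a toy illustration with invented data and like names, not Cambridge Analytica’s actual system: each like is scored by the average trait value of the users who share it, and a new user is scored by averaging their likes.

```python
# Toy sketch of trait prediction from binary "like" data.
# All names and numbers are invented for illustration only.

def fit_like_scores(users, traits):
    """For each like, compute the average trait score of users who have it."""
    totals, counts = {}, {}
    for likes, t in zip(users, traits):
        for like in likes:
            totals[like] = totals.get(like, 0.0) + t
            counts[like] = counts.get(like, 0) + 1
    return {like: totals[like] / counts[like] for like in totals}

def predict(like_scores, likes, default=0.5):
    """Predict a user's trait as the mean score of their known likes."""
    known = [like_scores[l] for l in likes if l in like_scores]
    return sum(known) / len(known) if known else default

# Tiny synthetic "training" set: each user is a set of page likes plus
# an extraversion score in [0, 1].
users = [
    ({"parties", "festivals"}, 0.9),
    ({"parties", "team_sports"}, 0.8),
    ({"chess", "libraries"}, 0.2),
    ({"libraries", "poetry"}, 0.1),
]
like_scores = fit_like_scores([u for u, _ in users], [t for _, t in users])

# A new user who likes "parties" and "poetry" lands in between.
print(predict(like_scores, {"parties", "poetry"}))
```

The research typically cited for the ‘150 likes’ figure used regularised regression over enormous sparse like matrices rather than this naive averaging, but the underlying principle – that behavioural traces correlate strongly with psychological traits once you have enough of them – is the same, which is precisely what makes the data so valuable to a campaign.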
According to an expert in the field, Professor Jonathan Rust:
The danger of not having regulation around the sort of data you can get from Facebook and elsewhere is clear. With this, a computer can actually do psychology, it can predict and potentially control human behaviour. It’s what the scientologists try to do but much more powerful. It’s how you brainwash someone. It’s incredibly dangerous.
It’s no exaggeration to say that minds can be changed. Behaviour can be predicted and controlled. I find it incredibly scary. I really do. Because nobody has really followed through on the possible consequences of all this. People don’t know it’s happening to them. Their attitudes are being changed behind their backs.
Quoting short sections doesn’t really do either of the articles justice – I strongly recommend reading them both in their entirety to understand how vulnerable we all are to manipulation in the 21st century. But how do we combat these strategies? Your suggestions are more than welcome in the comments section!