“Once every village had an idiot. It took the internet to bring them all together”, are the memorable words of U.S. Army colonel turned historian Robert Bateman, quoted by the authors of LikeWar: The Weaponization of Social Media to illustrate the unreality that typifies the times in which we live. Disparaging as these words may seem, they suit the problem: the masses occupy on-line echo chambers, like to be right, hate being proven wrong and dig their heels in when someone explains the error of their ways. As LikeWar sets out in meticulous detail, such a climate is so thoroughly poisoned that we need help to escape it.
The authors are P. W. Singer and Emerson T. Brooking. Singer describes himself as a strategist who advises the U.S. military; Brooking as a research fellow and an expert on conflict and social media. The book is equipped with over a hundred pages of small-print end notes that anchor and triangulate the views they express in nine densely-argued chapters. It is not a difficult read, is arguably more fact than opinion, and may cause you to go slack-jawed and, at worst, think that we are doomed.
The central argument is that we are ill-equipped to handle the instantaneity and immensity of the information that defines the social media age. March 2018 saw — every minute — an average of 500,000 new comments, 293,000 new statuses and 450,000 new photos on Facebook; the uploading of 400 hours of video to YouTube; and the posting of 300,000 new tweets to Twitter. That excludes the torrent of metadata recording who tagged or shared what. This outpouring is neither regulated nor moderated. Only the merest fraction of it can be fact-checked for accuracy — and, if it is, by whose standards does that happen?
The nature of this problem changes when automated bot-posting is added to the mix. The authors cite a Republican strategist named “MicroChip” who, when his systems were at full tilt, could spew out 30,000 re-tweets in a single day. Using a barrage of fake accounts he was able, by his own admission, to “Make whatever claims I want to make”. “That’s how this game works”, he said. Now give that some industrial heft, as the Russian Internet Research Agency in St. Petersburg did: in the final six months of the 2016 U.S. election campaign, Twitter reported that Russian-generated propaganda had been delivered to users 454.7 million times.
The Statistic Brain Research Institute tells us that the average attention span of an internet user was 12 seconds in 2000. Five years later, that had fallen to 8 seconds. Apparently that is no more than the average attention span of a goldfish. (If you made it this far down this post, congratulate yourself.) Maybe this explains why only 10% of the professional media coverage of the 2016 U.S. election was devoted to the policies of the candidates. The fire-hose of attempted persuasion operated at full pressure during that election, with 400,000 bot accounts uncovered on Twitter alone (two-thirds of which were backing Trump). On this side of the pond, during our contentious 2016 referendum, pro-Brexit Twitter bots outnumbered anti-Brexit Twitter bots by five to one.
As we see with the present incumbent of the White House, the authors put it thus:
“Serious reflection on the past is hijacked by the urgency of the current moment; serious planning for the future is derailed by never-ending distraction.”
Research keeps confirming a cardinal rule about how information spreads across the internet: what matters is not the information’s accuracy or even its content, but the number of friends who share that content first. An MIT research project unearthed the fact that rumour cascades on Twitter spread six times faster if they involved fake news:
“Falsehood diffused significantly farther, faster, deeper, and more broadly than the truth in all categories of information … Lies created by anyone, from anywhere, can spread everywhere.”
Textual analysis of tweets and their diffusion shows that anger propels a tweet further than any other emotion. Put some angry spin on a falsehood and it will take hold and spread across the internet. The authors cite plenty of examples, many of which have been shown to fuel common conspiracy theories.
Where LikeWar moves into the fast lane is when the authors’ attention turns to mass-processing and neural networks. Did you know that network-trained chatbots have no script, just the speech patterns deciphered by studying millions of conversations? The authors explain how the marriage of that technology with neural-network processing, which Microsoft attempted in 2016 with their Tay project, resulted in the fevered artificial brain of what appeared to be a teenage girl being over-run by external (human) trolls, to the point that Tay tweeted “RACE WAR NOW” before being unplugged.
This loss of control is set to be compounded — massively — by the creation of what are known as deep fakes. A Montreal-based company, Lyrebird, promises to “create the most realistic artificial voices in the world”. In 2017 they released a wholly fake conversation between Barack Obama, Hillary Clinton and Donald Trump. Adobe, too, has previewed a software prototype for editing and generating audio: Adobe VoCo (demonstrated but not released to the public) has been described as Photoshop for audio. Such tools — and many more like them — put flesh on the Orwellian memory hole, threatening to rewrite history and falsify the present. To disappearance, this new technology adds invention.
The authors close with a consideration of what they call generative networks, systems that can study images and video clips and then create their own versions. These AI systems will, the authors predict, be battling to “obfuscate, evade, disorient and mislead”. “Good” AI systems will be policing this outpouring. We will be caught in the middle.
Singer and Brooking end on an optimistic note by quoting a Wired article by internet sociologist Zeynep Tufekci:
“Facebook is only 13 years old, Twitter 11, and even Google is but 19. At this moment in the evolution of the auto industry, there were still no seat belts, airbags, emission controls, or mandatory crumple zones …”
Let’s hope the internet can develop the necessary equivalents before it’s too late.
I feared this might prove a glib and superficial read, but I unhesitatingly recommend this book for its detail and authority.