Arguing in the wrong agora
“Do you think that social media has made acts of extreme violence more prevalent?” ABC interviewer George Stephanopoulos asked Facebook CEO Mark Zuckerberg on 3 April 2019. “It’s hard to say,” Zuckerberg replied. “I think that’s something that’s going to be studied for a long time. I certainly haven’t seen data that would suggest that it has.”
What an extraordinary statement from the head of a company which has admitted to facilitating the Burmese military’s genocidal campaign against Myanmar’s Rohingya Muslim minority, a campaign which led to at least 10,000 Rohingya deaths in 2017. Around 400 Rohingya villages were destroyed, and gang rapes and other forms of sexual violence against Rohingya Muslim women and girls were widespread. Some 700,000 Rohingya fled Myanmar for refugee camps in Bangladesh, where most of them remain. The use of the Facebook platform (and the affordances for bad actors it not only allows but condones) in facilitating this crime against humanity has been well documented by real media.
What does Zuckerberg consider to be ‘data’, if not this? He can’t claim ignorance: his company not only ‘fessed up to its role in facilitating this terrible example of state terrorism, but he even wrote a personal apology to Burmese human rights activists after they called him out for not acknowledging that it was the volunteers in their NGOs, not well-paid Facebook staff, who had done most of the work of identifying hateful content. The response of the six NGOs Zuckerberg wrote to was: “This is grossly insufficient and only reinforces our belief that Facebook is not doing anywhere near as much as they should and could do to prevent the spread of hatred in Myanmar.”
That was April 2018. Fast forward to April 2019 and Zuckerberg can say to Stephanopoulos with a straight (indeed – earnest, sincere and smooth) face that: “… the hope is that by giving everyone a voice, you’re creating a broader diversity of views that people can have out there and that even if sometimes that surfaces some ugly views, I think that the democratic tradition that we have is that you want to get those issues on the table, so that way, you can deal with them.”
Before I read The Age of Surveillance Capitalism by Shoshana Zuboff I found it hard to know exactly what to make of this statement, let alone Zuckerberg’s pushback in the same interview against putting a delay on live-streaming, which he considered would “break what live-streaming is for most people”. He was being interviewed less than three weeks after more innocent Muslims had been slaughtered – this time while at prayer in their mosques in Christchurch – and the white supremacist who gunned them down live-streamed the massacre on Facebook as he did it. Not only had Zuckerberg seemingly learned or done nothing in a year of revelations of the cruelties and mass manipulations perpetrated via his platform, he could even say perkily to Stephanopoulos: “But one of the things this [the mosque shootings] flagged for me was the extent to which bad actors are going to try and get around our systems.” Before I read Zuboff I might (just) have given him the benefit of the doubt, of being truly unaware of the gross harms that his platform does. But having read what she has to say, I am confirmed in my suspicion that he was lying through his teeth, and doing a very professional job of covering up the fact that he knows exactly why and how his platform does such harm – because that harm is its business model.
That business model has recently been exposed and criticised in books and articles by insiders as close to the action and as knowledgeable as Roger McNamee, Jaron Lanier, Douglas Rushkoff, Eli Pariser and Jonathan Taplin. Other fine books about the phenomenon by academics and other writers include those by Robert Lustig, Zeynep Tufekci, Franklin Foer, Virginia Eubanks, Marcus Gilroy-Ware, Cathy O’Neil, Siva Vaidhyanathan and Tim Wu. This work is very recent (all less than ten years old, and mostly less than five), but where things were headed and how bad they could get was well signposted by Gene Rochlin in Trapped in the Net (1997), and before him by Theodore Roszak in his seminal work The Cult of Information (1986 and 1994). Shoshana Zuboff’s book is the latest and most comprehensive of all these critiques, and it gets to the heart of the matter.
The problem is not the technology itself, or the privacy settings, or monopoly – although these are indeed problems. The problem at the base of these problems is capitalism, with its relentless drive for accumulating wealth. On top of this basic setting the high tech capitalists have been busy realising a capitalist’s wet dream – total social control in the service of wealth accumulation. In the hands of capitalists like the billionaire founders and owners of Google, Facebook and Amazon, data is a resource like gold, oil or human labour. It is to be exploited relentlessly, at every opportunity, with or without free consent. More often without, for who would consent to such exploitation if given a real choice, and a better alternative?
Rushkoff and Lanier quit Facebook and other social media when they could no longer keep pretending that the values and ethics of these digital platforms are morally acceptable. Further, they no longer wished to be manipulated and exploited by what Lanier calls the BUMMER business model (Behaviors of Users Modified, and Made into an Empire for Rent). They are not expecting thousands (let alone millions) of users to join them any time soon. McNamee asks why this would be, and answers himself: “They crave convenience and utility. They struggle to imagine that they could ever be victims of manipulation, data security breaches or election interference, much less what those things would mean to them. They don’t want to believe that the screens they give their children might be causing permanent psychological harm.” As for users involved in politics, McNamee goes on: “Elected officials like the campaign contributions and technology they get from Silicon Valley. They like that tech is popular with voters. Confident that they would never need it, policy makers have not developed the expertise necessary to regulate technology.” (McNamee, 2019, p 206)
Perhaps Silicon Valley also likes the money it gets from Washington, and this is behind Facebook’s refusal to correct blatant lies told in political advertisements. Julia Carrie Wong reported on 11 October that Trump’s Facebook page had launched 5,883 different ads since news of the Ukraine-call whistleblower broke on 18 September, and had spent between $1.3m and $3.8m promoting the ads mentioning impeachment, which had been viewed between 26.5m and 43.9m times in three weeks.
Wong believes that Facebook’s bad behaviour has less to do with economic self-interest and “…more to do with Zuckerberg’s fundamental political programming, which defaults to protecting incumbent power above all else.” She says that “Over and over again, Zuckerberg and Facebook have failed to take action to prevent real and lasting harm, when that action involved challenging the status quo or state power.” She cites the Myanmar example given above as a typical Zuckerberg default position and concludes that “This predilection for the powerful is how Zuckerberg went from calling out Trump on building walls to an Oval Office grip and grin.”
Are there alternatives to surveillance capitalism?
Are there alternatives to being ruled by the tech barons and the liars they facilitate? In Arguing on the wrong axis I made the case that in this era of climate crisis it is necessary to re-think our political positions and move from the Left/Right Local/Global axis to the Modern/Terrestrial axis. That is the time dimension of politics – the times we see ourselves being in and the future we envisage as a result of those times, and (maybe) our efforts as public actors. Equally important is the space dimension. In Team Human Douglas Rushkoff addresses both the virtual and physical aspects of the new media, saying: “The internet platforms most of us use are not merely products but entire environments inhabited by millions of people, businesses and bots. They constitute a new public square, Main Street and fourth estate.” (Rushkoff, 2019, p 82, emphasis added.) In referring to the public square, Main Street and the fourth estate, Rushkoff is checking off the three traditional pillars of healthy democratic societies – open, face-to-face democratic meeting and debating among citizens (as in the original agora or public square of Athens), independent local commerce contributing to the local economy and society, and independent media holding political representatives, commerce and other actors with power to account. McNamee also makes frequent references to the agora or public square, as in: “To date, the platforms have not demonstrated any understanding of the responsibilities that come with control of the public square.” (McNamee, 2019, p 233)
To Zuboff, this is not at all surprising – they are not the least bit interested in democracy or responsibilities. On the contrary, the sooner such messy human stuff can be replaced by machine-based behavioural control the better. Part III of her book – ‘Instrumentarian Power for a Third Modernity’ – has six chapters on where and how this theory and practice originated, and how it is being applied right now. It is chilling reading – is this what is in store for all of us? Total manipulation and domination, as depicted in not-so-fictional or futuristic form by Dave Eggers in his 2013 novel The Circle.
Eggers now promotes digital rights – which include the right not to be digital. He notes that humans online become “less polite, less considerate, less rational” and “every time we move another aspect of our lives into the digital realm, that part of life gets weirder”. Further, “everything that goes online becomes far more likely to suffer abuse.” As an example of how weird and abusive things can get, Eggers describes the Like button as “the digital equivalent of an eight-year-old’s sticker book, circa 1981.”
So is the human race now doomed to total domination by digital capitalists – billionaires whose will to power is no less than that of the totalitarian dictators of the twentieth century, but whose methods of behavioural modification do not require torture and execution, because they have no interest in possessing and controlling your mind and spirit, only your ‘behavioural surplus’, which can be readily monetised? Maybe; maybe not. The pushback has begun, with Rushkoff, Eggers and many others working hard to promote pro-social alternatives, while the ‘individual choice’ alternatives promoted by Cal Newport in Digital Minimalism and David Sax in The Revenge of Analog also encourage positive ‘unmodified’ social behaviours in those who adopt them.
Coming together in a real agora
Then there are activists like Greta Thunberg, who has inspired literally millions of young people to get off their phones, out of their classrooms and into the agora to call for action on climate change. Six months after Zeynep Tufekci published Twitter and Tear Gas, which covers the initial promise and ultimate limitations of digital methods of political mobilising and organising, I was visited by Emma, a very digital-savvy woman who had been involved in organising and taking part in the occupation of a literal public square (the Octagon, Dunedin) at the height of the world-wide Occupy movement. She had stopped using Facebook as her main tool for organising and communicating with fellow Occupy activists because it was not fit for purpose, and used a much better open source tool instead. However, people who opposed the occupation were very active in attacking it in words via Facebook, and in trying to organise a physical attack the same way. Came the day when – according to the Facebook promises and Likes – hundreds of them were supposed to descend on the Octagon and put paid to the Occupiers.
The Occupiers needn’t have worried. The ringleader and a dozen or so others showed up, and hung out on the other side of the square looking daggers at the Occupiers. Emma bravely went over and invited them to come over and talk with her and the other Occupiers. They did so, and real communication ensued. In the course of it Emma commiserated with the ringleader about the number of followers he had on Facebook who said they were coming – and never showed up. He admitted that this was so, and a problem, and Emma responded that it was a problem she also had, with so-called support for Occupy on Facebook proving equally fickle.
In the real agora, you can do human-to-human things like that. It may be possible to do the same in the digital agora, but the current dominant systems don’t allow for it – indeed, they facilitate the opposite. Machines use humans, when it should be the other way round.
Getting out of this mess is going to be really difficult, but the first step is to stop denying that it is a mess, and correctly identify what kind of a mess it is. This has been done from a variety of angles by the authors I have cited and listed below, and some of them have good ideas on what can and should be done to clean up the mess, and create pro-human alternatives. As Douglas Rushkoff says – let’s hear it for Team Human – to which I would add – meeting in a real agora, not the fake one of Fakebook.
In the week I started writing the above I received a poster from a local peace organisation, advertising the “Peace Train Interfaith Memorial Ride A 9 km memorial bike ride visiting a variety of places of worship in and around the centre of Christchurch.” The ride started at the Al Noor Mosque – the one made world famous by live-streaming on Facebook of the massacre there on March 15. The poster announced that one could “Find event details and updates on [you guessed it] Facebook.”
Beyond ironic: stop press
On May 24 the Guardian reported that “Facebook refuses to delete fake Pelosi video spread by Trump supporters”, saying: “Despite the apparently malicious intent of the video’s creator, Facebook has said it will only downgrade its visibility in users’ newsfeeds and attach a link to a third-party factchecking site pointing out that the clip is misleading. As a result, although it is less likely to be seen by accident, the doctored video will continue to rack up views. … A Facebook spokesperson said: ‘There’s a tension here: we work hard to find the right balance between encouraging free expression and promoting a safe and authentic community, and we believe that reducing the distribution of inauthentic content strikes that balance. But just because something is allowed to be on Facebook doesn’t mean it should get distribution. In other words, we allow people to post it as a form of expression, but we’re not going to show it at the top of News Feed.’”
Eubanks, Virginia (2017) Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor, St Martin’s Press
Foer, Franklin (2017) World Without Mind: The Existential Threat of Big Tech, Penguin Books
Gilroy-Ware, Marcus (2017) Filling the Void: Emotion, Capitalism & Social Media, Repeater Books www.fillingthevoid.wtf
Lanier, Jaron (2018) Ten Arguments for Deleting Your Social Media Accounts Right Now, The Bodley Head
Lustig, Robert (2017) The Hacking of the American Mind, Avery
McNamee, Roger (2019) Zucked: Waking Up to the Facebook Catastrophe, HarperCollins
O’Neil, Cathy (2016) Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, Allen Lane
Pariser, Eli (2011) The Filter Bubble: What the Internet Is Hiding from You, Viking
Rochlin, Gene I. (1997) Trapped in the Net: The Unanticipated Consequences of Computerization, Princeton University Press
Roszak, Theodore (1986; 1994) The Cult of Information: A Neo-Luddite Treatise on High Tech, Artificial Intelligence and the True Art of Thinking, University of California Press
Rushkoff, Douglas (2019) Team Human, W. W. Norton
Taplin, Jonathan (2017) Move Fast and Break Things: How Facebook, Google and Amazon Have Cornered Culture and What It Means for All of Us, Macmillan
Tufekci, Zeynep (2017) Twitter and Tear Gas: The Power and Fragility of Networked Protest, Yale University Press https://www.twitterandteargas.org/
Vaidhyanathan, Siva (2018) Antisocial Media: How Facebook Disconnects Us and Undermines Democracy, Oxford University Press
Wu, Tim (2016) The Attention Merchants, Atlantic Books
Zuboff, Shoshana (2019) The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, PublicAffairs