In our lead essay, we argued that Democracy’s Dilemma is that the information flows that democracy relies on can also be weaponized against it. This means that we need to start to understand democracy as an information system to be defended, analyzing its attack surfaces and modeling the threats against it.
We are enormously grateful to everyone for their sharp, insightful responses. We recognize that we’re at the beginning of a long intellectual process, and we do not believe that everything we wrote will turn out to be right. Our goal is to provide a framework to think about the issues of influence operations and their effects on democracy, and—more importantly—to stir debate and disagreement. Other people, with different knowledge and perspectives than ours, are an important part of this. It is heartening that such a smart and varied bunch of people have come up with so many interesting things to say.
Some commentaries start from a way of thinking similar to ours but emphasize different tradeoffs. Riana Pfefferkorn highlights the benefits of anonymity and pushes back against our suggestion that some kinds of identity authentication might help address the abuse of public comment systems. We agree that anonymity plays an important part in democratic speech and that many problems, such as fake news, propaganda, and hate speech, do not magically go away once people are identified. What we would say (and we suspect she would very likely agree) is that the key first step is to recognize that there are tradeoffs. We are only beginning to think about how flooding attacks—torrents of speech that drown out other voices—can overwhelm democratic systems, and figuring out the appropriate responses will require practical experimentation rather than the appeals to abstract principle that have often dominated debate.
Joe Nye draws a helpful distinction between “soft power” (a concept he has spent decades developing) and “sharp power.” He usefully highlights the costs of defenses against information attacks that lurch in the direction of increased authoritarianism. “We had to destroy democracy in order to save it” is not a winning strategy, and democracies should not employ propagandistic attacks. That said, we don’t think we should focus on whether a particular information operation involves soft or sharp power. We think time is better spent on understanding the nature of the political information systems it is attacking. The relative brittleness of different information systems vis-à-vis different kinds of attacks is an important empirical question, as are the differences between autocracies and democracies. This is why we emphasize the different common knowledge requirements of autocratic and democratic systems, which help explain why media outlets such as Russia Today can play a politically stabilizing role in autocracies such as Russia (as Peter Pomerantsev and others have argued), while at the same time destabilizing democracies.
Allison Berke is chiefly worried about foreign attacks, which she sees as fundamentally different from domestic information campaigns. Our way of thinking about the problem leads us to disagree. Just as you need to worry about insider attacks in traditional information systems, where authorized users can easily do far more harm than outsiders, democratic systems too can be undermined by their own politicians and citizens. That said, changes made to secure democracy against inside attacks may risk doing more harm than good by, for example, curtailing democratic openness. This, too, needs to be part of the discussion.
Berke’s disquiet speaks to a broader question: is it appropriate for outside actors to have any influence within a democracy? We think that it is. Since U.S. actions have consequences for billions of human beings who happen to have been born without U.S. citizenship, it seems wrong to us that these people should have no voice. Here, Nye’s distinctions may be useful. There are wide-ranging debates in political theory about cosmopolitanism. Philosophers such as John Dewey argued that as problems become more global, democratic institutions need to shift to accommodate the people whom the problems afflict.
We owe a particular intellectual debt to Anna Grzymala-Busse—critical parts of our argument come from conversations with her—and she identifies an important disagreement. Grzymala-Busse says we need to be more respectful of norms, as the “crucial underpinning” of democracy, taking Steven Levitsky and Daniel Ziblatt’s side in our friendly disagreement with them. Levitsky and Ziblatt argue that the current problems of democracy are in large part a result of the decay of stabilizing norms, with the implication that we need to return to them. This is an important debate, and possibly the most important theoretical debate about democracy right now. We don’t have space to provide the full response that her argument deserves, but we can at least sketch it out.
Our approach, which stresses the importance of technological change and democratic disagreement, is clearly incomplete—but so too is that of Levitsky and Ziblatt. It is hard to take arguments that stress the stabilizing force of institutions or norms and combine them with arguments that explain how these institutions or norms may themselves change in a dynamic process. This is one reason why people who were unhappy with the state of democratic politics even before the recent rise of Donald Trump, Viktor Orbán, and others tend to be more skeptical of norm-based accounts; they think that we need to destabilize some norms to strengthen other aspects of democracy or to respond to genuinely existential threats, such as global warming.
Yet whatever your position in this broader dispute, there is a more pragmatic question: are the problems we are discussing the result of decaying norms, or of technological changes that make certain actions easier and cheaper? When we look at how politicians game elections or lobbyists rig public commenting systems, we suspect that the major problem isn’t that they have stopped obeying political norms. We suspect that they were never really bound by norms so much as by technological limits. The technological changes that have made online commenting easier have also made flooding attacks, which are nothing new, easier and more effective: lobbyists tried to flood commenting processes back in the era of fax machines. Similarly, what is new in election shenanigans is not people’s willingness to manipulate the process, but their access to far more effective means of manipulation. This is why we focus on the relationship between technology and disagreement in these cases. Norms can surely help people who disagree radically to live together in a democratic society, but technology-enhanced spirals both reshape norms and act as an independent cause of changes in behavior.
Jason Healey’s comments about the mismatch between the rapid acceleration in the speed of change and the far slower ability of politics to respond are very well taken. The question he asks is how best to slow things down. One possible response is the one that he raises: regulation like the European Union’s GDPR. More broadly, the European Union has often adopted a version of the “precautionary principle.” Under this principle, rather than letting things happen and trying to deal with problems afterwards, regulators should try to anticipate problems in advance and regulate to prevent them from occurring. This approach is widely disliked by Silicon Valley firms that have adopted “move fast and break things” as an existential credo. When the things that they are breaking potentially include democracy and society, the scales necessarily tip towards precaution.
Precaution may not be nearly enough. There is a lot of common ground between Astra Taylor’s and danah boyd’s essays. Both of them politely but emphatically suggest that our framework isn’t nearly ambitious enough. Securing democracy against attack will require nothing less than what Taylor describes as a remaking of the “underlying political economy of the Internet” and reforms that in boyd’s words would “prevent financialized interests from controlling our information ecosystem.” The problem that both of them identify is that our information system is driven by commercial interests and shaped by gross inequalities of power, rather than by and for democratic needs.
We think that boyd and Taylor are right. What we have now is not the “freely available information” that democracy needs, but corporate-curated information designed to maximize engagement and profit. This helps explain why it was easy for Russia to weaponize social media: it exploited an ecosystem designed to maximize engagement over everything else and to connect would-be influencers, commercial or otherwise, with an audience. This shouldn’t lead one to overestimate the efficacy of these attacks; most advertising is ineffective, whether it is corporate marketing or propaganda. However, it does highlight how our current information ecosystem is simply not well suited to democracy. And, as Berke notes, this is only likely to get worse as new generations of commercial and political influencers start to employ generative adversarial networks to skew debate further.
In any case, fixing the political economy of social media wouldn’t provide a complete solution to Democracy’s Dilemma. We would still be faced with internal and external challenges to democracy’s dynamic stability and problem-solving ability. If one examines the attack surface of most democracies, the current social media architecture creates a multitude of vulnerabilities. As always, it is far harder to identify the solutions than the problems, and we are certainly going to make mistakes.
We will come up with better solutions and make fewer mistakes if specialists from a variety of perspectives come together to think, to argue with each other, to provide practical suggestions, and to engage with a public that has its own understanding of the issues. We think that the kind of frank, useful, and goodhearted debate that has happened over the last couple of weeks is an excellent start, and—again—we are extraordinarily grateful to the participants for engaging in it.