The ground war in Afghanistan has skidded to its predictable, spectacularly unsuccessful conclusion, but the War on Terror, launched twenty years ago this week, lives on. To what end?
One measure of a revolution’s legacy is what becomes normal in its wake. To explain the legacy of September 11 is thus partly to ask what is taken for granted in the post-9/11 world. How did the attacks alter the questions we ask—and the answers we give—about the challenges we face as a nation? What distribution of wealth and power now seems so normal that it largely escapes critical reflection? And in service of this new normal, what institutions and practices did we create and embed in U.S. life?
There are four key legacies of 9/11. First, it radically changed our understanding of war. Second, it launched the surveillance state and narrowed our conception of privacy. Third, it elevated border security to a matter of national survival. And, fourth, by creating the impression that the stakes were not merely consequential but existential, the attacks of September 11 normalized previously unimaginable cruelty.
As a civil rights lawyer, I have spent much of the past twenty years challenging various aspects of the post-9/11 state, a journey that so far has included three Supreme Court cases on behalf of detainees. I have seen these changes, and the damage they inflict, firsthand.
A New Kind of War
The most important legacy of 9/11 is the change it produced in our understanding of war. Though today it is hard to imagine, prior to 9/11, transnational terror was understood to be a crime, not an act of war. That framing collapsed before the sun set on September 11. “This is obviously an act of war that has been committed on the United States,” Senator John McCain said that day. “Everybody said it all day,” Peter Jennings of ABC News correctly observed. It was, Jennings continued, “a declaration of war, an act of war against the United States. Any number of politicians and commentators, us included, who were reminded that the last time there was an attack like this on the United States was Pearl Harbor.”
President George W. Bush adopted this framing as well, beginning on September 12. After meeting with his national security team, he told reporters the attacks “were more than acts of terror. They were acts of war.” The next day, after a morning call with New York City mayor Rudolph Giuliani and New York State governor George Pataki, the president told reporters, “an act of war was declared on the United States of America.” On September 15, he elaborated: “We’re at war. There has been an act of war declared upon America by terrorists, and we will respond accordingly.” He encouraged people to “go about their business . . . but with a heightened sense of awareness that a group of barbarians have declared war on the American people.”
But from the beginning, it was understood that the War on Terror would be different from the many that preceded it. The Holy Grail of our new war was not territory, but intelligence. The attacks were immediately constructed as an intelligence failure. ABC News called 9/11 “a desperate failure of intelligence in both the human and technical area.” The Washington Post described it as “a massive intelligence breakdown.” Leon Panetta, former chief of staff for President Bill Clinton and later director of the CIA and secretary of defense for President Barack Obama, said it “was clearly a colossal failure of our intelligence community.” Vice President Dick Cheney’s infamous warning on September 16—that the War on Terror would take the United States to “the dark side,” working “in the shadows in the intelligence world”—offered a glimpse into the foundational objective of the war: what the NSA called “Information Dominance.”
Achieving this mastery promised to be no small challenge. “The enemy is in many places,” Secretary of State Colin Powell warned. “The enemy is not looking to be found. The enemy is hidden. The enemy is very often right here within our own country.” Such sentiments were endlessly repeated. Our adversary was, Powell explained, “shadowy and elusive,” “capable of continually reinventing itself,” shape-shifters “who can run and hide almost everywhere,” “turning up in Hamburg, Amsterdam, Delray Beach, and Jersey City,” and “creating battle zones potentially anywhere.” Terrorists were supposedly segregated into highly disciplined cells that were nearly impossible to penetrate. Disrupting one cell would do nothing to disturb the others because each operated independently—according to Powell, “blending in locally, earning money at simple jobs,” planning and preparing until they were “activated” by an unknown, unseen mastermind.
As difficult as the challenge was thought to be, the need and urgency were believed to be even greater. Beginning the afternoon of September 11 and continuing without pause for months, virtually every organ that could claim a role in shaping U.S. thought reported endlessly on what might come next. Within a week of the attacks, normally skeptical media outlets such as the Boston Globe were breathlessly repeating a 1998 report by the Arabic news magazine Al Watan that Osama bin Laden had reached an agreement with Chechen gangsters to pay $30 million and two tons of opium in exchange for twenty nuclear warheads, which he planned to convert into “suitcase nukes.” In late September, the New York Times quoted Jerome M. Hauer warning that the nation was “woefully unprepared to deal with bioterrorism.” The likelihood is growing, the Times predicted, that “some rogue state or terrorist group will successfully deploy germ weapons.”
In this superheated milieu, elites in the United States framed the attacks as an intelligence failure that enabled “a group of barbarians” to drag the country into war. This has produced a war like none other in U.S. history. First, it is religio-ideological, since the conflict is ostensibly with a particular rendering of Islam rather than a state. Second, it is global, and the United States has maintained military operations in the War on Terror in more than eighty countries. As the 9/11 Commission explained, “the American homeland is the planet.” Third, it is permanent, since trying to defeat a weaponized religious fundamentalism with arms is like trying to stop the wind with a musket. Fourth, it is dystopian, since it depends on the state learning everything about anyone who might conceivably be, or become, a threat. And, finally, it is potentially apocalyptic.
Though the War on Terror has morphed over time, nearly all of its ugliness can be traced to this initial framing. We launched wars in Afghanistan and Iraq at unfathomable cost to human life, national treasure, and global stability. (Unfathomable but not incalculable; the Costs of War Project at Brown University estimates that, through 2020, the War on Terror has cost over $8 trillion and more than 929,000 lives, including more than 300,000 civilians.) “We tortured some folks” in the name of intelligence, as President Obama acknowledged in 2014. We continue to detain people indefinitely and without meaningful legal process in offshore prisons. We maintain a drone war that has already claimed thousands of civilian lives throughout the Muslim world, including ten more in Kabul last week. And while the Biden security team is tinkering with Trump-era rules about the use of drone strikes outside battlefields, it still intends to conduct them and recently launched drone strikes in Somalia—not to support U.S. troops, but in defense of Somali government forces.
Perhaps the most distressing aspect of this framing is not that it came into being, but that it endures. In June of this year, the House of Representatives voted overwhelmingly to repeal the 2002 Authorization for the Use of Military Force (AUMF) in Iraq. The Senate is expected to do likewise. Yet there is not even a remote prospect that Congress will repeal the earlier and more significant 2001 AUMF, which the legislature passed in the frenzied days after 9/11 and which provides the ostensible legal justification for the larger War on Terror. As Brian Finucane wrote in Just Security earlier this summer, though Biden has made welcome “rhetorical gestures” toward ending the War on Terror, his actions “show an intent to continue—not to end—current conflicts.” In his attempt to excuse the debacle in Afghanistan, Biden recently reported—correctly and completely without irony—that, after twenty years of a global War on Terror, the terrorist threat “has metastasized well beyond Afghanistan”:
. . . al Shabaab in Somalia, al Qaeda in the Arabian Peninsula, al-Nusra in Syria, ISIS attempting to create a caliphate in Syria and Iraq and establishing affiliates in multiple countries in Africa and Asia. These threats warrant our attention and our resources. We conduct effective counterterrorism missions against terrorist groups in multiple countries where we don’t have a permanent military presence. If necessary, we’ll do the same in Afghanistan. We’ve developed counterterrorism over-the-horizon capability that will allow us to keep our eyes firmly fixed on any direct threats to the United States in the region, and act quickly and decisively if needed.
And so it continues.
The Modern Surveillance State
Befitting its founding objective—all-knowing intelligence—the defining feature of the War on Terror is not boots and bombs but surveillance: 9/11 launched the creation of the modern surveillance state. The NSA has spearheaded this transformation, embracing its former director Keith Alexander’s ambition to “get it all.” But the elusive quest for more and better intelligence is not confined to the NSA; it is the shared ambition of the eighteen federal organizations that claim a role in national security, including the CIA and FBI, and which collectively make up the Intelligence Community.
Of course, intelligence is only useful if it is acquired covertly. In the post-9/11 world, the complement of surveillance is secrecy: the aspiration to collect everything is twinned with a determination to share nothing. The United States aims to achieve information asymmetries; it wants to know everything about “them” while ensuring they know nothing about “us.” This has produced a bipartisan commitment to prevent and punish transparency. Those who disclose bits and pieces of the surveillance mosaic risk almost certain prosecution. The Obama administration brought more Espionage Act charges against suspected leakers than all other presidencies combined, and the executive branch has repeatedly invoked the “state secrets” doctrine to shut down litigation that it claims would disclose aspects of its sprawling information-gathering machinery. (I am counsel in a case in the Supreme Court in which the government says information about the torture of my client at a now-shuttered CIA prison in Poland is a state secret.)
By design, this makes it impossible to describe accurately the contours of the U.S. surveillance state, though disclosures over the years have given us some insight into the scope of government spying at home and abroad. The New York Times disclosed in 2005 that within months of 9/11, the Bush administration began conducting warrantless surveillance of the international email messages and telephone calls of foreign nationals in the United States who were thought to have some connection to al-Qaeda. The next year, we learned that select telecommunication companies had allowed the NSA, again without a warrant, to harvest metadata—per the Project on Government Oversight, “the data on who you call and receive calls from, when, and how long they last”—on nearly everyone in the country who used a telephone.
In 2013 Edward Snowden leaked, and the Guardian published, an order by the Foreign Intelligence Surveillance Court compelling Verizon to give the NSA, on an “ongoing, daily basis,” the metadata on all telephone calls in its system. Comparable orders were given to other providers, meaning the NSA was vacuuming up the metadata on practically every telephone call in the country. In response to Snowden’s disclosures, Congress tried to rein in the bulk collection program in 2015, with limited success. In the last seven months of 2018, the NSA collected metadata on more than 19 million distinct phone numbers. Congress allowed the statutory authorization for this bulk collection of phone records to lapse at the end of 2019. We don’t know what, if anything, has taken its place. Other programs allow the government to capture not just metadata but content, again without a warrant. The NSA maintains it does not deliberately access the content of calls involving a U.S. citizen, but the extent to which this is true is impossible to confirm.
The NSA also harvests the metadata, and in some cases the content, of electronic communications, including email, Internet searches, videos, photos, and live chats. This was the heart of Snowden’s disclosures in 2013, which revealed the existence of PRISM, a program that enabled the NSA to gain access to major Internet companies, including Google, Facebook, Yahoo, Apple, Skype, and YouTube, all without a warrant. Again, the NSA maintains it does not deliberately access the content of any electronic communication of a U.S. citizen without a warrant, though we’ll have to take its word on that.
As with the new conception of war, the most surprising aspect of the surveillance state is not that it exists, but that it is so widely accepted. Universal monitoring of private behavior has become normalized. No doubt this is abetted by the concomitant rise of what Harvard business professor Shoshana Zuboff calls “surveillance capitalism.” In her 2019 book The Age of Surveillance Capitalism, Zuboff describes Google’s surveillance technology, which allows it to monitor the online behavior of billions of users in real time. Google monetizes this constantly expanding cache of information by converting the data into predictions about customer behavior and then selling this information to advertisers. This ubiquitous state and private surveillance has fundamentally altered our conception of privacy. The right to privacy (“That’s none of your business!”) has been supplanted by the expectation of anonymity (“I’m not important; they would never bother with me.”).
Meanwhile, the moderate-liberal left—the left of the Democratic establishment and the Washington Post—now invokes surveillance not as a threat to liberty but as liberty’s best hope. In any police-involved shooting, the first question asked is, “Where’s the body cam?” as though it were axiomatic that officers should be monitored wherever they go. And it gets worse. The House Select Committee investigating the January 6 raid on the Capitol recently demanded—not requested—records from fifteen social media companies, including Twitter, TikTok, Google, Snapchat, and Facebook. The committee wants “records, including data, reports, analyses, and communications stretching back to spring of 2020” “related to the spread of misinformation, efforts to overturn the 2020 election or prevent the certification of the results, domestic violent extremism, and foreign influence in the 2020 election.” The breadth of this demand is stunning, as is the expectation that social media companies can be entrusted to decide which personal “communications” “relate to the spread of misinformation” or constitute “domestic violent extremism.”
Thus, a meaningful fraction of the center left, no less than the right, aspires to round-the-clock surveillance of those it mistrusts, and like the right, it relies upon surveillance in the mistaken belief that it will provide an accurate, impartial rendering of contested events. And as on the right, this attachment to surveillance is stubbornly impervious to whistle-in-the-wind warnings about Big Brother. The expectation of surveillance has thus become an accepted—in fact, welcome—part of the cultural landscape.
Border Security as National Survival
In the decade prior to 9/11, it became commonplace for intellectuals to predict the end of the nation-state and the rise of a “borderless world.” Globalization, and particularly the accelerating trend toward transnational economic integration, would make the state increasingly irrelevant, we were told. A growing cadre of academics, business prophets, and politicians confidently proclaimed the demise of geography, and foresaw not just the free movement of capital but also of labor. Borders would fade into irrelevance; the boundary between the United States and Mexico would have no more significance than the Mason–Dixon Line. On July 2, 2001, the Wall Street Journal ran an editorial calling, as it had for decades, for a constitutional amendment that would guarantee “open borders for not only goods and investment but also people.”
9/11 dashed this vision, instantly making talk of a borderless world heretical. As journalist Daniel Denvir put it in his 2020 book All-American Nativism, “as the U.S. military rendered an expansive ‘Muslim world’ into a battlefield, the country became a home base, the borders of which required maximal fortification.” Within hours of the attacks, anti-immigration organizations had retooled their message to take advantage of the new reality. Dan Stein, executive director (now president) of the Federation for American Immigration Reform, quickly linked national security to hardened borders. “The nation’s defense against terrorism,” he said, “has been seriously eroded by the efforts of open-borders advocates, and the innocent victims of today’s terrorist attacks have paid the price.” Steven Camarota of the Center for Immigration Studies ridiculed open-border advocates as “ideological kooks.” In a stark and telling conflation of immigrants with terrorists, the California-based Voices of Citizens Together quipped, “Give us your tired, your poor, your terrorists.”
In service of the new focus on border control, Congress passed the Homeland Security Act in 2002, which created the Department of Homeland Security and enacted the largest reorganization of the federal government since the creation of the Department of Defense in 1947. As part of the reorganization, the many entities previously charged with border security—including Customs, the Immigration and Naturalization Service, and the Border Patrol—were restructured as part of Homeland Security, many of their responsibilities reallocated to the newly minted agencies of ICE and Customs and Border Protection. Since then, funding and staffing to secure the border have skyrocketed, ballooning from $5.9 billion and 10,700 agents in 2003 to a staggering $17.7 billion and 19,700 agents in 2020, all of which goes to show that French philosopher Alexis de Tocqueville had it right: “All men of military genius are fond of centralization, which increases their strength; and all men of centralizing genius are fond of war, which compels nations to combine all their powers in the hands of the government.”
The full effect of 9/11 on the border would be difficult to overstate. Framing terrorism as an existential threat elevated border security to a national imperative. What had always been an aspiration for many—“stronger” borders—suddenly became a matter of survival. And since complete control of the border is, and will always be, a physical impossibility, the border came to function as an enduring symbol of national weakness. Like perfect intelligence, secure borders became the thing we must have but can never attain. Any site of such great and unresolvable public anxiety is available for political manipulation: the target can shape-shift from Muslim terrorist to Mexican rapist and back again, but the alleged threat to national life remains the same. The border has never been so patrolled, fenced, and militarized, yet the idea of the border has never been so menacing, so terrifying. This border-terror is a legacy of 9/11, and what gave Trump’s tweet its potency: “A NATION WITHOUT BORDERS IS NOT A NATION AT ALL.”
The Normalization of Cruelty
Anyone glancingly familiar with U.S. history knows that Americans have always had a refined capacity for cruelty. But 9/11 unleashed a particularly virulent strain of the impulse to believe that some among us are less than human.
Americans did not immediately embrace torture. In late October 2001, the Washington Post ran an article about a number of suspects in FBI custody who refused to talk. Agents were growing frustrated. “We are known for humanitarian treatment . . . [but] it could get to that spot where we could go to pressure . . . and we are probably getting there.” The article electrified the torture debate, but the right and left rejected the possibility out of hand. After the Post article appeared, Fox News anchor Jon Scott interviewed Eric Haney, a founding member of Delta Force, the Army’s elite special operations unit. Scott asked, “Maybe a little strong-arm tactic might be useful to get some info we need?” “It doesn’t work,” Haney answered. “Doesn’t produce the information that you need . . . and it’s always counterproductive.”
We didn’t know it at the time, but as commentators were opining, the government had already begun to consider the use of torture. In late March 2002, U.S. agents arrested Abu Zubaydah in a raid in Pakistan. At the time, they believed he was a high-ranking member of al-Qaeda with valuable information about past and future attacks against the United States. To ensure his isolation, the CIA transferred him to the first CIA black site of the War on Terror, a secret prison in Thailand. Initially, he was interrogated by experienced FBI agents who spoke Arabic and had long familiarity with al-Qaeda. They used conventional, noncoercive interrogation techniques, and Abu Zubaydah cooperated. What he told the agents, however, did not match what the CIA believed he knew. Convinced he was holding back, the CIA passed responsibility for his interrogation to a CIA contractor, psychologist James Mitchell, who was later joined by his colleague and fellow psychologist Bruce Jessen.
In the summer of 2002, Mitchell and Jessen, who had no expertise in al-Qaeda or Islamic fundamentalism, had never conducted an interrogation, had no training in law enforcement or terrorism, and did not speak Arabic, sold the CIA on the “enhanced interrogation program.” Abu Zubaydah was the guinea pig. For twenty consecutive days in August 2002, they tortured him. Eighty-three times, they strapped him to a board with his head lower than his feet while they poured water up his nose and down his throat. Just when he thought he would drown, they raised the board, allowing him a moment to vomit and gasp before they repeated the torture. During one session, Abu Zubaydah became “unresponsive, with bubbles rising through his open, full mouth.”
On other occasions, they slapped him and slammed him into walls, forced him into a tall, narrow box the size of a coffin, and crammed him into another box that would nearly fit under a chair, where he was left for hours. At least once, he was subjected to “rectal rehydration.” The objective was to “induce complete helplessness” and “reach the stage where we have broken any will or ability of subject to resist,” so the CIA could “confidently assess” that he was not holding back information.
They succeeded. By the sixth day of his torture, Abu Zubaydah was sobbing, whimpering, twitching, and hyperventilating. He was so broken that he complied with orders at the snap of a finger. At that point, Mitchell and Jessen believed Abu Zubaydah had no more information to give and recommended that the torture stop, but the CIA disagreed. The torture therefore continued for another two weeks, “on a near 24-hour-per-day basis,” until the CIA concluded that Abu Zubaydah had been telling the truth all along “and that he did not possess any new terrorist threat information.” The United States no longer maintains that Abu Zubaydah was a member of al-Qaeda or that he had anything to do with the attacks of September 11.
I confess I am not neutral here. I have represented Abu Zubaydah for fifteen years. I also represented Mamdouh Habib, an Australian national who was rendered by the United States from Pakistan to Egypt, where he was subjected to an assortment of ingenious tortures that variously involved the threat of electrocution, a small room that gradually filled with water, and a German Shepherd. Habib was released in 2005; Abu Zubaydah remains at Guantánamo. He has never been charged.
At least 39 people, of 119 detained, had their interrogations “enhanced” by the CIA. That doesn’t count those who were rendered to third countries with a well-deserved reputation for torture, nor does it include the prisoners who were tortured while in Department of Defense custody. The torture scandal was far more than an exercise in ritualized brutality. It was the first time that torture had been openly sanctioned and encouraged at the highest levels of U.S. government.
The link between the torture scandal and some of our more recent paroxysms of brutality cannot be traced with certainty, and I do not remotely suggest anything as direct as a causal link between then and now. But I know this: a nation that does not recoil from the thought of repeatedly bringing a man within sight of his own death will likewise nod approvingly at the thought of stripping innocent children from their parents and locking them in cages at the border. We accept state-sanctioned barbarity at the peril of our soul.
History always has the last word, and we do not know what 9/11 will mean in the fullness of time. In the end, I suspect the legacy of 9/11 will be the millions of lives we destroyed in its wake. This destruction follows inexorably from the toxic premise that some among us are beyond the circle of human concern, and that we get to decide who they are. At the end of that road lies a misery that borders cannot contain, surveillance will not prevent, and war will never excuse.