Yemen has been gripped by war since 2015, but it was not until last month that the United Nations determined that the country was experiencing famine.
That determination was slow and contested, and we do not yet know how many scores of thousands of Yemenis have died of hunger and related causes over the last three and a half years. But splitting hairs over this or that technical definition of “famine” is a distraction. The suffering there is sufficiently grave. Indeed, the tragic reality is that even without official statistics or labels, Yemen’s is the defining famine of this generation.
Famines fall on a spectrum, from those due to natural calamity to those resulting from genocide or extermination. We hear a lot about famines attributed to drought and, supposedly, overpopulation, but the former are rare, and the latter simply do not exist. Almost all famines are principally caused by war and political repression carried out with disregard for human life. And so it is in Yemen.
The war in Yemen has seen fighting between two sides: Houthi rebels who control the capital, and an assemblage of forces sponsored by Saudi Arabia and the United Arab Emirates (UAE), including an internationally recognized but politically subordinate government. Fighting is intermittently fierce, but the front lines are mostly stalemated. The key battleground for the Saudi-Emirati coalition is the economy: it has blockaded Houthi areas, closing the land and sea borders and the airports, displacing people en masse and depriving them of food, clean water, and health care. Though distress mounted, with reports of hunger and disease, the Houthis did not withdraw. The Saudi-led coalition then escalated month by month: it bombed medical centers, water wells, and drilling rigs, and its puppet government closed the central bank, stopping payments to civil servants and pensioners. There is no shortage of food today; a large section of the population simply doesn’t have the money to buy it from local markets.
Immediate and unhindered humanitarian aid is needed in Yemen. A ceasefire in war and expedited peace talks are essential. But we can be more precise. We can go beyond the platitude that this is a “man-made famine,” and we can name the men who made it: the Crown Prince of Saudi Arabia, Mohammed bin Salman, and the Crown Prince of Abu Dhabi, Mohamed bin Zayed al-Nahyan. We can then explore legal remedies, including prosecution and reparations.
It would be transformative, and long overdue, to recognize mass starvation as a crime. Seventy-five years ago, starvation should have become a fixture in our understanding of atrocity. In his book, Survival in Auschwitz (1947), Primo Levi writes:
They crowd my memory with their faceless presences, and if I could enclose all the evil of our time in one image, I would choose this image which is familiar to me: an emaciated man, with head dropped and shoulders curved, on whose face and in whose eyes not a trace of a thought is to be seen.
Instead of Levi’s image of the starving man, gas chambers and extermination squads came to dominate our conception of evil. As the Governor General of Poland, Hans Frank, presciently wrote in his diary, “That we sentence 1.2 million Jews to die of hunger should be noted only marginally.” Yet as many people died from starvation and related causes during World War II as died from more direct acts of violence. Had the Nazis’ Hungerplan—the program of starving to death 20–30 million inhabitants of eastern Europe and the Soviet Union—reached its projected quota, it would have been by far the greatest cause of death during the war.
For years starvation has been a stagnant backwater of international criminal law. That may finally be changing. The legal understanding of starvation crimes is evolving fast, and over the last two years a consensus has formed that the legal prohibitions on starvation are in fact much stronger than had been realized. Consider UN Security Council Resolution 2417 on armed conflict and hunger, passed in May 2018. It contains no new law; indeed, several Security Council members specified that they would support the resolution only on the condition that it introduced none. But it clearly states that starvation of civilians in wartime may be a war crime.
Starvation has indeed long been a tool of war. The Lieber Code, the code of conduct for Union soldiers during the U.S. Civil War, expressly permitted starvation as a method of combat used to expedite the surrender of the enemy. After World War II, the victorious Allies were not eager to press starvation charges because they themselves had used it as a weapon during both world wars. In fact, the U.S. blockade of Japan in 1945, which involved dropping mines into harbors to prevent imports of food and fuel, was code-named “Operation Starvation.” The 1949 Geneva Conventions did little to prohibit starvation, and although the 1977 Additional Protocols did, they came with many caveats.
It was not until the post–Cold War establishment of special tribunals that prosecuting starvation crimes really became possible. The failure to do so reflects less a shortcoming in the law than a lack of prosecutorial ambition, public pressure, and activist attention. At the International Criminal Tribunal for the former Yugoslavia, for example, prosecutors did not press starvation charges against General Stanislav Galić, responsible for the Siege of Sarajevo, on the grounds that it would not have been possible to prove that individuals had died of starvation specifically because of his acts—a mistaken interpretation of what is required to prove the crime. The Extraordinary Chambers in the Courts of Cambodia were an opportunity to prosecute the Khmer Rouge leadership for starvation crimes, but the prosecutors were under pressure to obtain quick convictions, and it was far more straightforward to press charges based on better-established crimes. The same calculus held for the Special Prosecutor in Ethiopia after 1991.
While the sharp end of criminalization is prosecution, it need not be the end goal. There is emancipatory power in the simple but profound step of recognizing the harm and naming the crime. Mass starvation is an act for which its political and military perpetrators should be ashamed, not its victims. Yet without naming it a crime as such, this is not what happens. Levi called the world of the extermination camp the “gray zone.” It was a world in which victims of a greater crime turned into perpetrators of many lesser but immediate ones, becoming accomplices in their own degradation.
Famine’s “gray zone,” Breandán Mac Suibhne wrote in The End of Outrage: Post-Famine Adjustment in Rural Ireland (2017), is “populated by obscene and pathetic figures, where sometimes, but not always, judgment is impossible.” Drawing on painstaking research into the Donegal community of Beagh, Mac Suibhne continues:
The grey zone of the Great Famine is the demimonde of soupers and grabbers, moneylenders and meal-mongers, and those who took the biscuit from the weak. It is where one finds the mother who denied one child food and fed another, a boy who slit the throats of two youths for a bag of meal, and, indeed, rumoured and reported cases of cannibalism.
To make these observations . . . is to acknowledge some issues raised by the condition to which ragged humanity was, in places, reduced in Ireland in the late 1840s, and to bring into view situations in which moral judgment may not then have been impossible but scarcely can have been easy, and today seems utterly inappropriate. And yet the task remains for the historian to explain the reduction of humanity to that condition, and to assign responsibility for it.
Ireland’s starvation was, ultimately, the responsibility of Her Majesty’s Government in London. But it took a century and a half for a British prime minister (Tony Blair) to acknowledge the “deep scars” of the famine and say that “those who governed in London at the time failed their people.” It took equally long for the descendants of the survivors to begin to raise their eyes from those humiliations, degradations, and feelings of worthlessness, to memorialize the victims of famine with appropriate dignity, and to recognize that the intimate cruelties of the gray zone, those hard-hearted actions and inactions, were the result of a social condition forced upon the poor and vulnerable by those in power.
The famine tore apart the social fabric, not just once but month by month over prolonged years, including the aftermath of emigration, dispossession, and silence. For several generations the survivors of the famine blamed themselves, or saw the hunger as the inevitable workings of Providence or of modernity. Malthusians blamed the Irish peasants for their fecklessness in having been born in the first place. The deep sense of failure felt by parents burying their children became the self-reproach of a society that could not recognize the true nature of the harm inflicted upon them. Their community’s official histories were written, as a review of Mac Suibhne’s book put it, by the “hard-faced men who did well out of the Famine.” At best the famine dead were second-class victims, not properly counted, and certainly not ranked alongside those who died a violent martyr’s death.
To heal a harm, we must first give it its right name. The Irish have long been discomfited by the word “famine” because of its historic connotations of natural scarcity or imbalance between population and resources; they prefer “hunger.” But that doesn’t go far enough. What the Irish call an Gorta Mór (“the Great Hunger”) could appropriately be renamed “the Great English Famine in Ireland.”
Ireland’s hunger is one of the few cases in history where the victims of famine are commemorated in public memorials. The memorial on Cambridge Common in Massachusetts shows two adults, each holding a child, parting from one another. The plinth reads, “Never again should a people starve in a world of plenty.” That indefinite article—“a people”—is the barest hint of starvation as an act, rather than hunger as a generic phenomenon. Yet there are no memorials in the places where the starvation was designed. An appropriate place would be opposite the British Treasury or the Foreign and Commonwealth Office in Whitehall, London.
Such a memorial might have made Britain’s members of parliament literate in the history of their country’s starvation crimes. In turn, perhaps Conservative MP Priti Patel might have hesitated before threatening Ireland with a de facto blockade of food imports to compel it to agree to a Brexit plan involving border controls. Having reviewed a government paper that foresaw a food crisis in Ireland in the event of a “no deal” Brexit, she asked, “Why hasn’t this point been pressed home during negotiations?”
The fog of the gray zone still obscures the nature of the harm in contemporary famine. In South Sudan, among the Dinka people, there is a custom that those who die of starvation are not buried. To bury them is to inter hunger itself within the land, with the result that it will return. During the famines of 1988 and 1998, when their children perished from hunger, Dinka parents cast the bodies into the bush to be eaten by vultures and wild animals. The same thing is happening now, as the current famine in South Sudan, perpetrated by President Salva Kiir Mayardit, his lieutenants, and by the rebel commanders fighting them, has killed an estimated 190,000 people since 2014.
The physical pain of starvation is matched by the psychological pain of blaming oneself, and one’s family, for failure. Famine gnaws away at those bonds of affection, respect, and trust. For the famished people of South Sudan, properly recognizing starvation as a wrong inflicted, not just a suffering experienced, would emancipate the victims from the shame of hunger. It would allow them, at least, to bury their dead with dignity. This is the starting point for the concept of starvation crimes and necessary to reshaping our thinking about the political causes of famine.
“Starvation crime” is a new concept. It does not refer to a legal category as such, but draws together a range of crimes under different provisions of international criminal law in order to give them political salience. In this regard it draws inspiration from legal scholar and former diplomat David Scheffer’s coinage of the term “atrocity crimes” to refer to a cluster of particularly atrocious crimes, prohibited under different elements of law, without becoming side-tracked by a fruitless debate over whether such acts constitute “genocide.”
Acts of starvation, defined in the Additional Protocols to the Geneva Conventions as attacks on “objects indispensable to the survival of the civilian population,” are prohibited in international humanitarian law. Similar prohibitions are found in the Rome Statute of the International Criminal Court as well as the UN’s Genocide Convention. In general a starvation crime must meet a threshold of scale (a significant number of people must be affected) and severity (deaths or severe suffering), and it must be perpetrated in an orchestrated manner.
Starvation crimes, perpetrated on a sufficient scale and over a sufficient length of time, can cause a famine. However, we need not wait for a definitive diagnosis of famine to identify a starvation crime. This is important because existing diagnostic criteria for determining famine are contentious or may be applicable only after the fact. This is illustrated by current controversies over the metrics used by the Integrated Food Security Phase Classification (IPC) scale, especially in Yemen.
The IPC metrics focus on the severity of malnutrition, mortality, and food crisis within a specific geographical locale. They set a very high bar for declaring “famine,” using the words “emergency” and “crisis” for lower levels. Yet the IPC designation might be taken to imply that those other levels are somehow tolerable or do not involve hunger and death. Not so: a large number of people in “emergency” or “crisis” can suffer starvation deaths. Indeed, hundreds of thousands of people can die of hunger and related causes without the threshold of “famine” being officially breached in any one locality. Moreover, the criteria used for determining the magnitude of famines are based on total numbers of excess deaths, which can be established only after the fact, and are therefore of little use when a famine is in prospect or in progress.
Therefore, just as Scheffer’s concept of “atrocity crimes” allows us to sidestep the problematic question of identifying genocide, so the concept of “starvation crimes” allows us to duck the question of a definitive diagnosis of famine, and indeed the bigger question of whether we need metrics of suffering in order to identify such acts. The category “starvation crimes” encompasses a range of criminal acts that include three central elements: (1) an outcome which includes deprivation of food and associated suffering (not necessarily mass starvation unto death, though such outcomes are relevant to demonstrating the gravity of the crime); (2) the act of depriving persons of food, including the destruction of other items, or prevention of activities, indispensable for survival (actus reus); and (3) criminal intent (mens rea, which need not be the intent to inflict starvation as such, only awareness that the act will have such a consequence).
Yemen’s famine, for example, isn’t the outcome of any one specific act, but is instead the product of many different actions and policies—many in themselves not intrinsically criminal, such as closing the central bank or bombing bridges—which cumulatively add up to a vast starvation crime. The Crown Princes of Saudi Arabia and Abu Dhabi have deliberately, over an extended period, committed multiple acts that have deprived millions of Yemenis of the means necessary to survive. These attacks were mounted again and again, many of them clearly targeting civilian infrastructure, all while the two Crown Princes also oversaw the obstruction and interruption of humanitarian aid.
The fact that the Houthis have stolen, taxed, and diverted food for their own political ends, and, on at least one occasion, also used starvation tactically to military ends (in the siege of Ta’izz), does not detract from the criminality of the Saudi and Emirati campaign. Yemen was already a poor and food-insecure country, with long-standing water scarcity. This would have been well known to those who planned and administered the starvation of Yemen. If they knew that Yemenis were vulnerable to starvation, was it not particularly reprehensible for them to fight a war of starvation in that country?
Would prosecution be in the interests of victims? Aid agencies will likely put forward strong arguments for attending to immediate needs and not asking difficult questions. Perpetrators might see aid workers as prosecution witnesses, after all, and want them out. Or they might insist on silence as a precondition for humanitarian staff. Prudence might counsel treading softly around issues of criminal culpability.
But the justice agenda ranges wider. Starvation is a material deprivation. Its victims lose their assets and livelihoods as they suffer hunger and their children die. The approximately $1.5 billion that Saudi Arabia and the UAE have contributed to international aid operations in Yemen should be seen less as a mitigating factor and more as a small down payment on the tens of billions of dollars for which they should be liable in reparations.
Despite the obvious logic, however, there are many reasons why material reparations may not be the best approach. We should consider them on a case-by-case basis. In Yemen, the countries that perpetrated starvation are sufficiently wealthy that they should be considered liable both for material compensation for the victims and for funding national development in the country that they tried so hard to destroy. But in other cases, it may not be possible to distinguish between suffering and death specifically caused by starvation crimes and those caused by other adversities and misfortunes. Moreover, in aid-dependent countries, the bill for reconstruction inevitably falls on international donors who would not consider themselves liable for paying reparations, as they were not culpable in the commission of the famine.
A more fundamental problem with material reparations is that restitution and compensation paid to the victims of crime are primarily intended to restore the status quo ante. In the case of a population that was vulnerable to famine, this is clearly not the desired outcome. What is needed is to sufficiently improve their economic situation and remedy their political marginalization, so that they are no longer exposed to such deprivation.
Fundamental to the transitional justice agenda is ensuring that the crime is not repeated. The goal here is political accountability. Those responsible for mass starvation are often political leaders; if they are called to account by their constituents or shamed by their peers, they are rendered incapable of perpetrating famine again.
A closely related idea can be found in the work of the philosopher Amartya Sen, who has advanced the hypothesis that “democracy prevents famine.” There is indeed strong evidence that public scrutiny, mobilization, and accountability help to ensure that governments do not perpetrate famine. Yet formal democracy is not enough (in this context, as in many others): the issue of famine must be visible, politicized, and there must be workable mechanisms for famine prevention and relief in place.
Indeed, democratic sensibilities and institutions tend to break down or be overruled in cases of political emergency, such as civil war or fear of terrorism. The true test of the “democracy prevents famine” hypothesis is whether a government maintains a commitment to humanitarian principle under these more difficult conditions. International norms and institutions offer a second layer of protection against such domestic failures, and that is why it is important to advance a starvation crimes agenda on the international stage. The leaders of South Sudan, Saudi Arabia, and the UAE value their standing in international forums, such as the G20, the United Nations, and the African Union. Being shunned by, or even prohibited from attending, these institutions matters to them. Sovereign legitimacy is only as good as what it does for them: if their calling cards are no longer valued, then shaming has done its work.
Criminalizing starvation should also influence donor policy. For ten months in 2010–11, for example, U.S. counterterror legislation prevented emergency relief to Somalia out of fear that some of that aid might end up in the hands of the militant group Al Shabaab, a designated terrorist organization. The humanitarians in the U.S. administration argued long and hard for a way to work around this prohibition, and it would likely have been useful to them had legal counsel made the case that the United States was in danger of being complicit in a starvation crime. As it was, the delay in delivering aid probably cost the lives of over 200,000 Somali children. In Yemen today, the risk of liability by association with starvation crimes might be sufficient for Washington, London, and Paris to reconsider their supply of weaponry and diplomatic cover to Saudi Arabia and the UAE.
The history of twenty-five years of international criminal tribunals suggests that few culprits of starvation crimes would be indicted and fewer still tried and convicted. Even a successful prosecution would be mostly symbolic, as most perpetrators would escape. But this should not discourage us. Criminalizing starvation has many ramifications. It allows us to shift the shame of starvation from the victim to the perpetrator, to explore restorative justice including reparations, and to develop guarantees of non-recurrence.
The ultimate objective isn’t putting a villain in jail, but making the infliction of starvation so morally toxic that it is unthinkable.
This is an adapted version of the Seventh Annual Overseas Development Institute lecture, given in London on December 6, 2018.
Alex de Waal is Executive Director of the World Peace Foundation at the Fletcher School at Tufts University. He is the author of Mass Starvation: The History and Future of Famine, The Real Politics of the Horn of Africa: Money, War and the Business of Power, and editor of Advocacy in Conflict: Critical Perspectives on Transnational Activism.