
Wednesday, September 8, 2021

RSN: Dexter Filkins | Did Making the Rules of War Better Make the World Worse?

 

 

Reader Supported News
07 September 21

Live on the homepage now!
Reader Supported News

WHY THE URGENCY ON FUNDRAISING? We have an opportunity here to move this project forward. Five years of hard work are finally starting to pay off. This project wants to grow, but we are held back by the constant struggle to secure what is effectively a modest budget. We have to address that. We sincerely appreciate everyone who supports this project in whatever way they can. We absolutely must find a way to convince those who can afford to help to do so. We provide a free service to tens of thousands of readers daily. This is a valuable community service. What we are asking for is an occasional contribution. It's not exactly a huge financial burden. In solidarity.
Marc Ash • Founder, Reader Supported News

Sure, I'll make a donation!

 

A Predator drone. (photo: General Atomics)
Dexter Filkins | Did Making the Rules of War Better Make the World Worse?
Dexter Filkins, The New Yorker
Filkins writes: "The bombing ignited a firestorm that sent smoke miles into the sky; the glow was visible for a hundred and fifty miles. In six hours, as many as a hundred thousand civilians were killed, and a million others were left without homes."

Why efforts to curb the cruelty of military force may have backfired.

On the evening of March 9, 1945, the United States sent an armada of B-29 Superfortresses toward Japan, which for months had resisted surrender, even as a naval blockade brought much of the population to the brink of starvation. The B-29s were headed for Tokyo, and carried napalm, chosen for the mission because so many of the city’s inhabitants lived in houses made of wood. The bombing ignited a firestorm that sent smoke miles into the sky; the glow was visible for a hundred and fifty miles. In six hours, as many as a hundred thousand civilians were killed, and a million others were left without homes. In the words of the raid’s architect, Major General Curtis LeMay, the Japanese were “scorched and boiled and baked to death.” Five months later, the United States bombed Hiroshima and Nagasaki, and Japan surrendered.

If the U.S. undertook such a campaign these days, worldwide revulsion would be intense and long-lasting. In the past half century, war waged by states has become more humane. Shifting international standards, codified in treaties like the Geneva Conventions, have mirrored a trend among military commanders to choose targets carefully, and to spare civilians whenever possible. Improvements in bomb accuracy have made it easier to focus on military targets.

Most people would consider this a positive development. Samuel Moyn, a professor of history and of jurisprudence at Yale, believes that we have less to celebrate than we might imagine. In his book “Humane: How the United States Abandoned Peace and Reinvented War” (Farrar, Straus & Giroux), he suggests that this new form of warfare is so civilized that it has reduced our incentive to stop fighting. “The American way of war is more and more defined by a near complete immunity from harm for one side and unprecedented care when it comes to killing people on the other,” he writes. “America’s military operations have become more expansive in scope and perpetual in time by virtue of these very facts.” Ours is an era of endless conflict, whose ideal symbol is the armed drone—occasionally firing a missile, which may kill the wrong people, but too far removed from everyday American life to rouse public objections.

The dilemma posed by Moyn belongs to the modern age. Killing is what armies do, and, in the usual course of things, the more they kill the sooner their wars end. In the first two Punic Wars, Rome and Carthage fought in battlegrounds outside their population centers; in the third, the Romans contrived an excuse to lay siege to Carthage, and slaughtered its inhabitants. There wasn’t a fourth. For Clausewitz, the Prussian military theorist, the whole point of fighting was not just to repel the enemy but to destroy it; theoretically, at least, war knows no limits.

In the United States, generals took a page from Clausewitz, applying maximum force to secure military objectives. During the Civil War, General William Tecumseh Sherman, who set fire to Atlanta, believed he was entitled to do anything in pursuit of victory, because he was fighting against an enemy that had begun an unjust war. He vowed to “make Georgia howl.” In the Second World War, Allied and Axis commanders deliberately attacked civilians, in the hope that they could be terrorized into demanding peace. The Allies’ aerial campaign against German cities like Hamburg and Dresden killed as many as a half million civilians. (It was no oversight that mass bombing was not included among the indictments of Nazi leaders at the Nuremberg trials.)

“Total war is the demand of the hour,” Goebbels declared in 1943, speaking in a stadium below a vast banner that read “Totaler Krieg—Kürzester Krieg” (“Total War—Shortest War”). Even twenty-first-century armies have taken this to heart. In the late two-thousands, the Sri Lankan military, after fighting Tamil separatists at a low pitch for a quarter century, attacked rebel strongholds in full force and killed as many as forty thousand civilians, burning the bodies or burying them in mass graves. The war never resumed. The campaign, or what it represented, became known as “the Sri Lanka Solution.”

As Moyn points out, the idea that war should be unrestrained has drawn support not just from battle-hardened officers but even from self-proclaimed pacifists. Foremost among them was Leo Tolstoy, who had served in the Russian Army during the Crimean War and in the Caucasus. Tolstoy disdained the Red Cross, and believed that making war more humane could make war more likely. In “War and Peace,” the vessel for Tolstoy’s views was Prince Andrei, who had been wounded while fighting Napoleon’s Army at Austerlitz: “They talk to us of the rules of war, of chivalry, of flags of truce, of mercy to the unfortunate and so on. It’s all rubbish! . . . If there was none of this magnanimity in war, we should go to war only when it was worthwhile going to certain death.” If Moyn doesn’t quite endorse this view, he’s gripped by its modern implications. In particular, he believes that the American way of war, as it has evolved in our century, has become precisely what Tolstoy feared: so prettified as to be wageable everywhere, all the time.

Moyn was an intern in the White House in 1999, when NATO, without the legal sanction of the United Nations, launched a bombing campaign in Kosovo to stop what appeared to be an almost certain large-scale massacre. At the time, he supported the intervention. “Only later did it seem the early stages of something altogether unexpected,” Moyn says. “It has come to be called America’s ‘endless war,’ especially as the campaigns against global terror after September 11, 2001, started off and ground on.”

But the real origins of our predicament, Moyn says, date to the outrages of the Vietnam War, including the My Lai massacre and the devastating bombing campaigns in Vietnam and Cambodia, where napalm was routinely deployed. These horrors, broadcast on TV, made the U.S. military rethink its unrestrained approach to waging war. And they helped lead to the updating of the Geneva Conventions in 1977. The earlier Conventions had covered the treatment of prisoners and the wounded or sick, and had sought to limit such practices as using civilians as human shields. The additional protocols banned indiscriminate attacks on civilians, the targeting of civilian infrastructure, and harm to civilians that was disproportionate to the military objective.

For Moyn, these updates heralded a new era of war. “Before the humbly titled ‘Additional Protocols’ to the Geneva Conventions, one could say with only a bit of exaggeration that there were no laws of war,” he writes. In fact, norms of restraint in war date back to ancient Greece and Rome, even if the norms were not always observed. Bans on torture and wanton destruction have been in place for the U.S. Army since the eighteen-sixties. Violations, such as those committed by Lieutenant William Calley at My Lai, were prosecuted as crimes. What’s more, the U.S. never ratified all of the additional Geneva protocols; American restraint in war, such as it is, had other origins.

Still, the “humanizing” of military action that Moyn describes is a real phenomenon, and does mark a break with the past. These days, when U.S. military leaders are contemplating an action, military lawyers decide whether it comports with humanitarian law. Sometimes the restraint is extreme; in 2010, the rules for air strikes in Afghanistan, tightened by General Stanley McChrystal, were so restrictive that troops complained that they were being put at risk. Moyn bemoans legal standards such as these for another reason: he thinks that they have dampened the sort of public outcry that might induce politicians to end a conflict. “Humane war was a consolation prize for the failure to constrain the resort to force in the first place,” he writes.

Yet Moyn’s argument goes beyond the expected humanitarian critique—the Tolstoyan concern that mannerly military action could promote further suffering. “Americans are proving that war’s evil is less and less a matter of illicit killing or even suffering,” Moyn maintains. Rather, the “worst thing about war” is the assertion of American dominance in the world, which has foreclosed the possibility offered by the end of the Cold War: a “world of free and equal peoples.”

Moyn’s focus on the evils of American power is not exactly new; he belongs to what the historian Daniel Immerwahr has jokingly described as the “menacing eagle” school of American history—so named because books by its adherents often feature, on their covers, an eagle assailing the globe. (“Humane” does not have an eagle on it, but it does have a blurb from Immerwahr.) Yet Moyn’s objective of challenging the legitimacy of American power leads to some unusual choices of villains: the modern-day targets of his book are not the warmongers but the lawyers and the humanitarians who have opposed the violation of civil and human rights.

During the Iraq War, the Bush Administration’s policy of torturing detainees, laid bare by the Abu Ghraib photographs, was met with widespread revulsion. But Moyn argues that these kinds of protests actually had a perverse effect: the “war was cleansed of stigma.” He criticizes Jack Goldsmith, a Harvard law professor who served in the Justice Department under Bush and who tried to impose some legal order on the Administration’s detainee policy. Moyn also chides my colleague Jane Mayer for casting in a good light those in the government who agitated against the use of torture: hand-wringing over abuses and atrocities was all a distraction from the “immorality of the entire enterprise of the war on terror.”

Must one choose between being against torture and being against war? Moyn suggests that opposing war crimes blinds us to the crime of war. If this is an empirical claim, it’s contradicted by the facts. The invasion of Iraq did inspire demonstrations around the world—the public outcry that, in Moyn’s account, could have stopped the war. To judge by survey results, it was only after the revelations of Abu Ghraib that a majority of Americans came to think the war was a mistake.

Moyn’s position might lead us to oppose striking enemy targets with smaller, more accurate bombs because they don’t inspire sufficient public outrage; he is evidently convinced that an effective protest campaign requires a steady and highly visible supply of victims. That logic would favor incinerating entire cities, Tokyo style, if the resulting spectacles of agony lead more people to oppose American power. The difficulty with his “heighten the contradictions” approach is that contradictions can stay heightened indefinitely. Despite Moyn’s chiliastic views, if we plump for greater suffering in the hopes of having less war we may find ourselves with more of both.

Moyn’s analysis is further hampered by a preoccupation with legalism; he largely neglects the fact that much military restraint is attributable less to law than to technology. Allied commanders firebombed cities in Japan and Germany (and Americans did so later in North Korea and Vietnam) in part because they believed that more precise attacks wouldn’t work or couldn’t be safely attempted. Efforts to pinpoint military targets mostly failed; in Germany, despite daily and nightly bombing raids, industrial production rose every year until 1945.

Today, bombing accuracy has dramatically improved. We’ve all seen the slick Pentagon videos showing an aerial bomb picking out one building among many and all but knocking on the front door before exploding. Collateral damage has receded—though only by so much. When civilians are killed, their deaths are often caused by human error. In 2011, in the Yemeni port city of Aden, I examined the mangled limbs of Yemeni children, whose village had been hit by American cruise missiles. An American official with knowledge of the attack told me that the U.S. had struck an Al Qaeda training camp in the village—that he’d seen the evidence himself. That objective doesn’t mean the bombing served American national interests, and it doesn’t excuse the killing of innocents. But the contemporary norms of force deployment do make a difference: had General LeMay been confronted with a similar enemy camp, he would have flattened Yemeni villages for miles around. Moyn’s maximalism makes these distinctions irrelevant: if war can’t be abolished, he suggests, any attempt to make it more humane is meaningless or worse. In his desire for a better world, one liberated from American global power, he comes close to licensing carnage.

A more grounded discussion of the American way of war is set forth by William M. Arkin, in “The Generals Have No Clothes” (Simon & Schuster). Arkin, a former intelligence officer and a journalist for NBC News, lays out the situation we find ourselves in twenty years after the attacks of September 11, 2001. The wars in Iraq and Afghanistan are both lost. The war on terror has spread across the Middle East and South Asia, with the United States in tow. The U.S. military has conducted raids in countries all over the world, killing hundreds of terrorists, but new recruits step forward every day. We now field soldiers in the war on terror who were not alive when it began.

Like Moyn, Arkin focusses on these endless conflicts—what Arkin calls “perpetual war”—but his explanation centers on a different culprit. Combat persists, Arkin tells us, because the apparatus of people and ships and bases and satellites and planes and drones and analysts and contractors has grown so vast that it can no longer be understood, much less controlled, by any single person; it has become “a gigantic physical superstructure” that “sustains endless warfare.” The perpetual war, Arkin contends, is “a physical machine, and a larger truth, more powerful than whoever is president,” and the result has been “hidden and unintended consequences, provoking the other side, creating crisis, constraining change.”

An organizational logic, more than an ideological one, holds sway, Arkin suggests. Secrecy is central to the contemporary military; few people, even members of Congress who are charged with overseeing the Pentagon, seem to know all the places where Americans are fighting. The military operates bases in more than seventy countries and territories; Special Operations Forces are routinely present in more than ninety. Four years ago, when American servicemen were killed in Niger, several members of Congress expressed surprise that the U.S. military was even there. When President Trump started questioning the U.S. war effort, Arkin writes, the Pentagon decided to stop publicly reporting how many troops were situated in individual Middle Eastern countries—and began keeping details of air strikes secret. In 2017, when Trump ordered the Pentagon to withdraw the spouses and children of military personnel from the Korean peninsula, Defense Secretary Jim Mattis ignored him. (Mattis says that this is not accurate.) Trump’s order was ill-informed and, as a provocation, potentially dangerous, but ignoring the Commander-in-Chief amounts to a flagrant disregard for the Constitution.

The Pentagon’s skepticism of its civilian leaders is not limited to Trump; it spans the modern Presidency, Arkin tells us. Obama was elected in 2008 on the promise of getting out of Iraq, but his closest advisers, including Defense Secretary Leon Panetta, resisted; Obama’s skepticism about escalating the war in Afghanistan led to a showdown with the generals that the generals are widely seen to have won. With his counterterrorism adviser John Brennan at his side, Obama presided over a huge expansion of the drone program. Both Panetta and Brennan were marquee players in the national-security establishment—a cadre of several thousand people who circulate in and out of government and who, Arkin argues, keep the perpetual machine running no matter who’s in charge.

That machine hums along despite a record of failure. In “The Other Face of Battle” (Oxford), the military historians Wayne E. Lee, Anthony E. Carlson, David L. Preston, and David Silbey examine the Battle of Makuan, in Afghanistan, in 2010, providing a vivid encapsulation of how ill-adapted the U.S. military was to that country, even after fighting there for nine years. The soldiers in Makuan, overloaded with expensive equipment, moved across a gruelling landscape like a group of plodding space aliens, as the enemy quietly faded away; displaced civilians returned to find their village levelled. The fact that the operation was regarded as a victory over the Taliban was another measure of the generals’ delusion.

America’s sprawling intelligence apparatus, too, has a dismaying record of incompetence; it failed to anticipate the 9/11 attacks, the Arab Spring and the civil wars that followed, the rise of ISIS, or the succession of power after the death of Kim Jong Il. Arkin quotes Panetta, who said that, after taking office as C.I.A. director, he was “staggered” to learn how many people the agency had working on Al Qaeda, while neglecting issues that an Obama Administration official said “were just as much influencing our future—climate, governance, food, health.” Sometimes, in the war zones, the intelligence services and the military have pursued entirely opposite goals; in Afghanistan, in 2009, as American military officers led a campaign to root out corruption in the Afghan government, C.I.A. operatives were keeping the government’s most corrupt politician, Ahmed Wali Karzai, on the agency’s payroll.

Even though the U.S. military has not won a major war since the Second World War, it remains the most respected institution in American life. It is popular despite (or because of) the fact that, without a draft, only a tiny percentage of Americans will ever be part of it; the ones who do join are disproportionately from working-class families. In recent years, the number of private contractors killed in American wars has begun to exceed the number of those killed in uniform—another factor that helps relegate the wars to the far reaches of the newspaper. As the military comes to rely on computer networks and high technology, even fewer recruits will be required. Arkin writes that the American way is to “make war invisible, not just because counter-terrorism demands secrecy, but also because the military assumes the American public doesn’t want to know because it isn’t prepared to sacrifice.”

Where Moyn is driven by a photonegative of American exceptionalism—a sense that American power is a singular force of malignity in the world—Arkin is concerned that this perpetual-war machine is at odds with America’s strategic interests. He sees the spread of Al Qaeda and like-minded groups across Asia and Africa as a direct consequence of our attempts to destroy them. Every errant drone strike that kills an innocent invites a fresh wave of recruits. The process resembles what happened in the early days of the Iraq War, when the military’s heavy-handed tactics, employed in villages across the Sunni Arab heartland, transformed a tiny insurgency into a huge one.

Arkin is less persuasive when he argues for the creation of a “global security index,” which would serve as “the security equivalent of a Dow Jones Industrial Average.” Judgments about protecting the country are inevitably human—and inevitably political—and can hardly be relegated to an algorithm. A further complication is that war between states has become exceedingly rare; it has been replaced by states fighting insurgents, or states fighting terrorists, or civil conflicts (with states backing their preferred faction). Of course these wars last longer: it’s difficult to bomb your enemy’s government into surrendering when your enemy has no government at all. The fact that insurgencies often operate in ungoverned areas further complicates military operations.

At the same time, Arkin overstates the case that the military has become immune to external control. The reluctance of the military to pull out of Afghanistan and Iraq had less to do with some deep desire to keep the machine running than with an inability to build a functioning state in either of these countries that could outlast its presence. When Obama did try to leave Iraq, in 2011, his generals warned him that things would fall apart; Obama withdrew anyway, and they fell apart. Three years later, with Iraq continuing to disintegrate, he sent the troops back in. They’re still there. You can decry the folly of a neocolonial occupation or fault the military for its failure to build a state in Iraq, but the dilemma that Obama faced was genuine—and, besides, America’s war in Iraq was begun not by the generals but by civilian politicians, backed by overwhelming public support. In 2021, Joe Biden faced a similar conundrum in Afghanistan; his decision to withdraw all American troops before the United States had evacuated its citizens and Afghan helpers led to a calamity that is still unfolding.

In Arkin’s view, the COVID-19 pandemic brought the 9/11 era to an end: two decades of misdirected resources bookended by displays of official incompetence. Arkin argues that the time is overdue to pull back—to close some of our overseas bases and bring home many of the troops. Biden’s decision on Afghanistan can be seen as an attempt to temper some of America’s commitments. What lies ahead, as the chaos engulfing Afghanistan suggests, may not be that peaceful era of political freedom and pluralism which Moyn thinks our militarism blocked, and indeed Moyn’s singular focus on American power may come to seem strikingly insular. We’ve spent decades fighting asymmetrical wars, but now there’s a symmetrical one looming. The United States has never faced an adversary of China’s power: China’s G.D.P. is, by some measures, greater than ours, its active-duty military is larger than ours, and its weapon systems are rapidly expanding. China appears determined to challenge the status quo, not just the territorial one but the scaffolding of international laws that govern much of the world’s diplomatic and economic relations. If two forever wars are finally coming to an end, a new Cold War may await.


READ MORE


Bernie Sanders. (photo: Antonella Crescimbeni)

Back on the Trail, Bernie Sanders Campaigns for the $3.5 Trillion Budget Plan
Emily Cochrane, The New York Times
Cochrane writes: "With a khaki-clad leg propped up on a bench, hand on his hip, Senator Bernie Sanders was regaling the post-church Sunday brunch crowd outside a bar with enticing details about Democrats' emerging $3.5 trillion budget bill."

Senator Bernie Sanders is barnstorming the country again, but not for the presidency. Instead, he’s making the case for a $3.5 trillion bill that would be a once-in-a-generation achievement.


With a khaki-clad leg propped up on a bench, hand on his hip, Senator Bernie Sanders was regaling the post-church Sunday brunch crowd outside a bar with enticing details about Democrats’ emerging $3.5 trillion budget bill.

As Meat Loaf’s “Paradise by the Dashboard Light” blared in the background, Mr. Sanders, an independent from Vermont, fielded questions from curious diners about plans to provide two years of free community college education and reduce prescription drug prices, interjecting an occasional apology for letting the food grow cold as he gathered feedback about the package.

Before sitting down with his family to finish eating, one man wondered aloud about something else entirely: Less than a year after the end of the 2020 presidential campaign season and with the midterm elections looming, what was Mr. Sanders doing in Iowa?

READ MORE


A rally to protect abortion rights in Texas. (photo: ABC)

The Texans Fighting to Keep Abortion Accessible
Mary Wilson, Slate
Wilson writes: "Her organization doesn't provide abortions, but it helps people deal with the costs and hurdles around getting an abortion - because in Texas, even before this latest ban was on the books, there was a long list of obstacles to abortion care."
READ MORE


A patient hospitalized with COVID-19. (photo: BioSpace)


America Needs to Decide How Much COVID-19 Risk It Will Tolerate
German Lopez, Vox
Lopez writes: "More than a year and a half into the Covid-19 pandemic, America still doesn't agree on what it's trying to accomplish."

A realistic Covid-19 endgame requires accepting some risk. The question is how much.


More than a year and a half into the Covid-19 pandemic, America still doesn’t agree on what it’s trying to accomplish.

Is the goal to completely eradicate Covid-19? Is it to prevent hospitals from getting overwhelmed? Is it hitting a certain vaccine threshold that mitigates the worst Covid-19 outcomes but doesn’t prevent all infections? Or is it something else entirely?

At the root of this confusion is a big question the US, including policymakers, experts, and the general public, has never been able to answer: How many Covid-19 deaths are too many?

The lack of a clear end goal has hindered America’s anti-pandemic efforts from the start. At first, the goal of restrictions was to “flatten the curve”: to keep the number of cases low enough that hospitals could treat those that did arise. But that consensus crumbled against the reality of the coronavirus — leaving the country with patchwork restrictions and no clear idea of what it meant to “beat” Covid-19, let alone a strategy to achieve a victory.

The vaccines were supposed to be a way out. But between breakthrough infections, the risks of long Covid, and new variants, it’s becoming clear the vaccines didn’t get rid of the need to answer the underlying question of what the Covid-19 endgame is.

America is now stuck between two extremes: The country wants to reduce the risk of Covid-19, but it also wants to shed the remnants of social distancing and other Covid-related restrictions on day-to-day life.

“We’re not trying to go for zero Covid,” Ashish Jha, dean of the Brown University School of Public Health, told me. “The question becomes: When do, in most communities, people feel comfortable going about their daily business and not worrying, excessively, about doing things that are important and meaningful to them?”

Will Americans accept the deaths of tens of thousands of people, as they do with the flu, if it means life returning to normal? Can the public tolerate an even higher death toll — akin to the drug overdose crisis, which killed an estimated 94,000 people in 2020 — if that’s what it takes to truly end social distancing and other precautions?

Does it make a difference if the vast majority of deaths are among those who are willingly unvaccinated, who, in effect, accepted a greater risk from the coronavirus? Are further reductions in deaths worth postponing a return to “normal” — or changing what “normal” means — if continued precautions are mild, like prolonged masking or widespread testing?

There are no easy answers here. Even among the experts I’ve spoken to over the past few weeks, there’s wide disagreement on how much risk is tolerable, when milder precautions like masking are warranted, and at what point harsher measures, like lockdowns and school closures, are needed. There’s not even agreement on what the endgame is; some say that, from a policy standpoint, the goal should be to keep caseloads manageable for hospitals, while others call for doing much more to drive down Covid-19.

One big problem identified by experts: “I don’t think we’re having those conversations enough,” Saskia Popescu, an infectious disease epidemiologist at George Mason University, told me. Instead of the public and officials openly discussing how much risk is acceptable, the public dialogue often feels like two extremes — the very risk-averse and those downplaying any risk of the coronavirus whatsoever — talking past each other.

But the path to an endgame should begin with a frank discussion about just how much risk is tolerable as the coronavirus goes from pandemic to endemic.

We’re looking for a balancing act, not a total end to Covid-19

If there is one point of agreement among most experts, it’s that Covid-19 is here to stay. “Until very recently, I was hopeful that there was a possibility of getting to a point where we had no more Covid,” Eleanor Murray, an epidemiologist at Boston University, told me. Now she believes that “it is infeasible, in the short term, to aim for an eradication goal.”

Particularly with the rise of the delta variant, a consensus has formed that the coronavirus likely can’t be eliminated. Like the flu, a rapidly shapeshifting coronavirus will continue to stick around in some version for years to come, with new variants leading to new spikes in infections. Especially as it becomes unlikely that 100 percent of the population will get vaccinated, and as it becomes clear that the vaccines provide great but not perfect protection, the virus is probably always going to be with us in some form, both in America and abroad.

That doesn’t mean the US has to accept hundreds of thousands of deaths annually in the coming years. While the vaccines have struggled at least somewhat in preventing any kind of infection (including asymptomatic infection), they have held up in preventing severe illness, hospitalization, and death — reducing the risk of each by roughly 90 percent, compared to no vaccine. Research has also found stricter restrictions reduce Covid-19 spread and death, and that masks work.

But it’s also become clear most Americans aren’t willing to tolerate drastic deviations from the pre-pandemic normal — lockdowns, staying at home, and broadly avoiding interactions with other people — for long. While social distancing staved off the virus in the pre-vaccine pandemic days, it also wrought economic, educational, and social devastation around the world. It’s the intervention that, above all, most people want to avoid going forward.

“That’s the goal, in my mind: to eliminate or reduce social distancing,” Jha said.

What policymakers can aim for is not a total end to Covid-19 but a balancing act. On one side of that scale is containing Covid-19 with restrictions and precautions. On the other is resuming normal, pre-pandemic life. Vaccines have changed the balance by giving us the ability to contain Covid-19’s worst outcomes — hospitalization and death — with less weight on the side of restrictions. But vaccines alone can’t drive hospitalizations and deaths to zero if all the weight on the restriction side is removed.

That suggests a choice: Either Americans accept some level of Covid-19 risk, including hospitalization and death, or they accept some level of restrictions and precautions in the long term.

Depending on how that choice is made, the US could be looking at very different futures. Americans could decide some milder precautions, like masking, are fine. Or they could conclude that even masking is too much to ask, even if that means a greater death toll. It hinges on how much weight on the restrictions side remains acceptable for the bulk of the population — how high the threshold is for embracing continued deviations from what day-to-day life was like before.

Regardless, experts say the balance, as the coronavirus becomes endemic, will require accepting some level of Covid-19 risk — both to individuals and to society. America already does that with the flu: In some years, a flu season kills as many as 60,000 people in the US, most of whom are elderly and/or people with preexisting health conditions, but also some kids and previously healthy individuals. As a cause of death, the flu can surpass gun violence or car crashes, but it’s a tolerated cost to continuing life as normal.

“You want to get Covid to a place where it’s more comparable in terms of disease burden and in terms of economic impact to the flu,” Céline Gounder, an epidemiologist at New York University, told me.

With about half the country vaccinated, the Covid-19 death rate is still much higher than that of the flu — the more than 120,000 deaths over the past six months are still more than double the number of people even the worst flu seasons have recently killed. But as more people get vaccinated and others develop natural immunity after an infection, the death rate will likely come down.

A glimpse of what this could look like in the future came from a study in Provincetown, Massachusetts. The study was at first widely reported as evidence that the virus can still spread among the vaccinated because the outbreak happened in a highly vaccinated population, and three-fourths of those who were infected had gotten their shots.

But experts now argue for another interpretation of the study: It’s what a post-pandemic world could look like. Yes, the coronavirus still circulated among vaccinated people. But in an outbreak that eventually infected more than 1,000, only seven hospitalizations and zero deaths were recorded. If this had been 2020, given overall hospitalization and death rates, the outbreak would likely have produced around 100 hospitalizations and 10 deaths.

“We should cheer,” Amesh Adalja, senior scholar at the Johns Hopkins Center for Health Security, told me. “The Provincetown outbreak, contrary to what the press reported, was evidence not of the vaccines’ failure but of their smashing success.”

That doesn’t mean the vaccine is perfect. A 90 percent reduction in death, relative to the unvaccinated, is not 100 percent. But it is a much lower risk. If this holds up despite future variants and potentially waning vaccine efficacy, it’s great news.

But that isn’t how the Provincetown study has been widely interpreted, especially after the Centers for Disease Control and Prevention cited it to reinstitute masking recommendations for the vaccinated in public indoor spaces in areas with substantial or high caseloads.

And the national Covid-19 disease burden may never resemble Provincetown’s anyway, since the town sits in the second-most-vaccinated state. In that context, Americans may have to come to accept even higher levels of sickness and death if the goal is to return to normal and vaccination rates don’t go up quickly enough.

That leaves the country with a blunt question: How many deaths are Americans willing to tolerate?

We don’t yet know how much Covid-19 risk we’ll accept

The problem is there’s no agreement, including among experts, on Covid-19 risk. Some have accepted merely reducing the coronavirus’s strain on hospitals as the major policy goal. There’s next to no confidence that anything like “Covid zero” can be achieved now, but other experts still prefer harsher restrictions if it means preventing more deaths. And many people fall in between.

It’s this debate, between “flatten the curve” and “Covid zero,” that’s long divided the US’s Covid-19 response. Red states hewed at least for a while to “flatten the curve,” moving to lift Covid-related restrictions and reopen their economies as soon as hospitals stabilized. Blue states never truly pushed for “Covid zero,” but they were generally much less willing to tolerate high levels of cases and deaths — and, as a result, shut down more quickly in response to even hints of major surges. (Although there were some outliers on both sides.)

Even with the vaccines, this division, among both policymakers and the public they serve, has kept America in limbo.

Part of the divide is on a philosophical question about the role of government. But it’s about individuals’ decisions, too: Are they willing to forgo social activities, government mandate or not, to reduce deaths? Are they willing to keep wearing masks? Submit to continued testing in all sorts of settings?

Are 30,000 to 40,000 deaths a year too many? That’s generally what the country sees with gun violence and car crashes — and American policymakers, at least, haven’t been driven to major actions on these fronts.

Are as many as 60,000 deaths a year too many? That’s what Americans have tolerated for the flu.

Are 90,000 deaths a year too many? That’s the death toll of the ongoing drug overdose crisis — and while policymakers have taken some steps to combat that, experts argue the actions so far have fallen short, and the issue doesn’t draw that much national attention.

Is the current death toll — of more than 1,500 a day, or equivalent to more than 500,000 deaths a year — too much? Many people would say, of course, it is. But in the middle of a delta variant surge, Americans may be revealing their preferences as restaurant reservations are now around the pre-pandemic normal — a sign the country is moving on. “The loudest voices on social media and in public are way more cautious than the average American,” Jha said.

Part of the calculus may be influenced by who is getting infected and dying. Once everyone (including children) is eligible for the vaccines, is a high death toll among those who remain unvaccinated simply part of the risk they decided to take by not getting the shot?

This is not something most experts I spoke to are comfortable saying, but it’s a sentiment I’ve repeatedly heard from vaccinated people and even some who are unvaccinated — a very dire version of “actions have consequences.”

Another consideration is whether some Covid-related precautions become permanent. Social distancing in any of its forms doesn’t seem like a candidate. But what about masking in indoor spaces? More frequent testing? Vastly improving indoor ventilation? Doing more things outdoors? Depending on whether Americans embrace these other interventions, the level of Covid-19 risk people have to tolerate may end up being lower — but what “normal” looks like would also be redefined to some degree.

Other countries are talking about these trade-offs more explicitly. Australian leaders, for example, have said that they will shift from a long-heralded “Covid zero” strategy once vaccination rates hit certain thresholds — even though this means continued cases and deaths, particularly among the unvaccinated. In the US, the end goal has never been so clear.

Experts argue that these kinds of questions need to be out in the open, so Americans and their leaders can openly discuss them and decide on a plan forward.

Those conversations “were important to have in the beginning,” Murray said. “But they’re even more important now, as we move into this control phase rather than a phase where elimination or eradication [of Covid-19] seems possible.”

The country may just continue muddling along. Vaccination rates and natural immunity will slowly increase. Deaths and hospitalizations will similarly decline. Eventually, the virus will hit a level that most Americans find tolerable (if that hasn’t happened already). Politicians and the media will talk less about the coronavirus. And, perhaps before we know it, the pandemic will be a thing of the past in the US.

That’s what was happening in June — before the delta surge. But over the past 18 months, we’ve seen that, with no agreement on the endgame, it’s often impossible to say if the end is really near.

READ MORE


U.S. soldiers in Afghanistan. (photo: WSJ)


Tom Engelhardt | The Decline and Fall of the Roman ... Whoops! ... American Empire
Tom Engelhardt, TomDispatch
Engelhardt writes: "They weren't kidding when they called Afghanistan the 'graveyard of empires.' Indeed, that cemetery has just taken another imperial body. And it wasn't pretty, was it?"

[TD is back and, as always, I’m urging you, its faithful readers, to take a moment to visit our donation page. Whatever you can contribute will help keep this website heading into a future that it’s been all too sadly accurate about. Very simply, you’re what keeps us going in good times and bad and we’re in a country where TomDispatch-style coverage of our world and its perils is needed now more than ever. Let me also thank those of you who have contributed in these last weeks. I always see your individual donations and feel deeply appreciative (even if also regretful that I don’t have the time to thank you each individually). And now, back to that world of ours. Tom]

-Tom Engelhardt, TomDispatch


The Decline and Fall of the Roman… Whoops!… American Empire
What Really Matters in the U.S. of A.


They weren’t kidding when they called Afghanistan the “graveyard of empires.” Indeed, that cemetery has just taken another imperial body. And it wasn’t pretty, was it? Not that anyone should be surprised. Even after 20 years of preparation, a burial never is.

In fact, the shock and awe(fulness) in Kabul and Washington over these last weeks shouldn’t have been surprising, given our history. After all, we were the ones who prepared the ground and dug the grave for the previous interment in that very cemetery.

That, of course, took place between 1979 and 1989 when Washington had no hesitation about using the most extreme Islamists — arming, funding, training, and advising them — to ensure that one more imperial carcass, that of the Soviet Union, would be buried there. When, on February 15, 1989, the Red Army finally left Afghanistan, crossing the Friendship Bridge into Uzbekistan, Soviet commander General Boris Gromov, the last man out, said, “That’s it. Not one Soviet soldier or officer is behind my back.” It was his way of saying so long, farewell, good riddance to the endless war that the leader of the Soviet Union had by then taken to calling “the bleeding wound.” Yet, in its own strange fashion, that “graveyard” would come home with them. After all, they returned to a bankrupt land, sucked dry by that failed war against those American- and Saudi-backed Islamist extremists.

Two years later, the Soviet Union would implode, leaving just one truly great power on Planet Earth — along with, of course, those very extremists Washington had built into a USSR-destroying force. Only a decade later, in response to an “air force” manned by 19 mostly Saudi hijackers dispatched by Osama bin Laden, a rich Saudi who had been part of our anti-Soviet effort in Afghanistan, the world’s “sole superpower” would head directly for that graveyard (as bin Laden desired).

Despite the American experience in Vietnam during the previous century — the Afghan effort of the 1980s was meant to give the USSR its own “Vietnam” — key Bush administration officials were so sure of themselves that, as the New York Times recently reported, they wouldn’t even consider letting the leaders of the Taliban negotiate a surrender once our invasion began. On September 11, 2001, in the ruins of the Pentagon, Secretary of Defense Donald Rumsfeld had already given an aide these instructions, referring not just to bin Laden but also to Iraqi ruler Saddam Hussein: “Go massive. Sweep it up, all up. Things related and not.” Now, he insisted, “The United States is not inclined to negotiate surrenders.” (Of course, had you read war reporter Anand Gopal’s 2014 book, No Good Men Among the Living, you would have long known just how fruitlessly Taliban leaders tried to surrender to a power intent on war and nothing but war.)

Allow a surrender and have everything grind to a disappointing halt? Not a chance, not when the Afghan War was the beginning of what was to be an American triumph of global proportions. After all, the future invasion of Iraq and the domination of the oil-rich Greater Middle East by the one and only power on the planet were already on the agenda. How could the leaders of such a confident land with a military funded at levels the next most powerful countries combined couldn’t match have imagined its own 2021 version of surrender?

And yet, once again, 20 years later, Afghanistan has quite visibly and horrifyingly become a graveyard of empire (as well, of course, as a graveyard for Afghans). Perhaps it’s only fitting that the secretary of defense who refused the surrender of the enemy in 2001 was recently buried in Arlington National Cemetery with full honors. In fact, the present secretary of defense and the head of the joint chiefs of staff both reportedly “knelt before Mr. Rumsfeld’s widow, Joyce, who was in a wheelchair, and presented her with the flag from her husband’s coffin.”

Meanwhile, Joe Biden was the third president since George W. Bush and crew launched this country’s forever wars to find himself floundering haplessly in that same graveyard of empires. If the Soviet example didn’t come to mind, it should have as Democrats and Republicans, President Biden and former President Trump flailed at each other over their supposedly deep feelings for the poor Afghans being left behind, while this country withdrew its troops from Kabul airport in a land where “rest in peace” has long had no meaning.

America’s True Infrastructure Spending

Here’s the thing, though: don’t assume that Afghanistan is the only imperial graveyard around or that the U.S. can simply withdraw, however ineptly, chaotically, and bloodily, leaving that country to history — and the Taliban. Put another way, even though events in Kabul and its surroundings took over the mainstream news recently, the Soviet example should remind us that, when it comes to empires, imperial graveyards are hardly restricted to Afghanistan.

In fact, it might be worth taking a step back to look at the big picture. For decades, the U.S. has been involved in a global project that’s come to be called “nation building,” even if, from Vietnam, Laos, and Cambodia to Afghanistan and Iraq, it often seemed an endless exercise in nation (un)building. An imperial power of the first order, the United States long ago largely rejected the idea of straightforward colonies. In the years of the Cold War and then of the war on terror, its leaders were instead remarkably focused on setting up an unparalleled empire of military bases and garrisons on a global scale. This and the wars that went with it have been the unsettling American imperial project since World War II.

And that unsettling should be taken quite literally. Even before recent events in Afghanistan, Brown University’s invaluable Costs of War Project estimated that this country’s conflicts of the last two decades across the Greater Middle East and Africa had displaced at least 38 million people, which should be considered nation (un)building of the first order.

Since the Cold War began, Washington has engaged in an endless series of interventions around the planet, from Iran to the Congo, Chile to Guatemala, as well as in conflicts, large and small. Now, with Joe Biden having withdrawn from America’s disastrous Afghan War, you might wonder whether it’s all finally coming to an end, even if the U.S. still insists on maintaining 750 sizeable military bases globally.

Count on this, though: the politicians of the great power that hasn’t won a significant war since 1945 will agree on one thing — that the Pentagon and the military-industrial complex deserve yet more funding (no matter what else doesn’t). In truth, those institutions have been the major recipients of actual infrastructure spending over much of what might still be thought of as the American century. They’ve been the true winners in this society, along with the billionaires who, even in the midst of a grotesque pandemic, raked in profits in a historic fashion. In the process, those tycoons created possibly the largest inequality gap on the planet, one that could destabilize a democracy even if nothing else were going on. The losers? Don’t even get me started.

Or think of it this way: yes, in August 2021, it was Kabul, not Washington, D.C., that fell to the enemy, but the nation (un)building project in which this country has been involved over these last decades hasn’t remained thousands of miles away. Only half-noticed here, it’s been coming home, big time. Donald Trump’s rise to the presidency, amid election promises to end America’s “endless wars,” should really be seen as part of that war-induced (un)building project at home. In his own strange fashion, The Donald was Kabul before its time and his rise to power unimaginable without those distant conflicts and the spending that went with them, all of which, however unnoticed, unsettled significant parts of this society.

Climate War in a Graveyard of Empires?

You can tell a lot about a country if you know where its politicians unanimously agree to invest taxpayer dollars.

At this very moment, the U.S. is in a series of crises, none worse than the heat, fire, and flood “season” that’s hit not just the megadrought-ridden West, or inundated Tennessee, or hurricane-whacked Louisiana, or the tropical-storm-tossed Northeast, but the whole country. Unbearable warmth, humidity, fires, smoke, storms, and power outages, that’s us. Fortunately, as always, Congress stands in remarkable unanimity when it comes to investing money where it truly matters.

And no, you knew perfectly well that I wasn’t referring to the creation of a green-energy economy. In fact, Republicans wouldn’t hear of it and the Biden administration, while officially backing the idea, has already issued more than 2,000 permits to fossil-fuel companies for new drilling and fracking on federal lands. In August, the president even called on OPEC — the Saudis, in particular — to produce significantly more oil to halt a further rise in gas prices at the pump.

As America’s eternally losing generals come home from Kabul, what I actually had in mind was the one thing just about everyone in Washington seems to agree on: funding the military-industrial complex beyond their wildest dreams. Congress has recently spent months trying to pass a bill that would, over a number of years, invest an extra $550 billion in this country’s badly tattered infrastructure, but never needs time like that to pass Pentagon and other national security budgets that, for years now, have added up to well over a trillion dollars annually.

In another world, with the Afghan War ending and U.S. forces (at least theoretically) coming home, it might seem logical to radically cut back on the money invested in the military-industrial complex and its ever more expensive weaponry. In another American world on an increasingly endangered planet, significantly scaling back American forces in every way and investing our tax dollars in a very different kind of “defense” would seem logical indeed. And yet, as of this moment, as Greg Jaffe writes at the Washington Post, the Pentagon continues to suck up “a larger share of discretionary spending than any other government agency.”

Fortunately for those who want to keep funding the U.S. military in the usual fashion, there’s a new enemy out there with which to replace the Taliban, one that the Biden foreign-policy team and a “pivoting” military is already remarkably eager to confront: China.

At least when the latest infrastructure money is spent, if that compromise bill ever really makes it through a Congress that can’t tie its own shoelaces, something will be accomplished. Bridges and roads will be repaired, new electric-vehicle-charging stations set up, and so on. When, however, the Pentagon spends the money just about everyone in Washington agrees it should have, we’re guaranteed yet more weaponry this country doesn’t need, poorly produced for thoroughly exorbitant sums, if not more failed wars as well.

I mean, just think about what the American taxpayer “invested” in the losing wars of this century. According to Brown University’s Costs of War Project, $2.313 trillion went into that disastrous Afghan War alone and at least $6.4 trillion by 2020 into the full-scale war on terror. And that doesn’t even include the estimated future costs of caring for American veterans of those conflicts. In the end, the total may prove to be in the $8 trillion range. Hey, at least $88 billion just went into supplying and training the Afghan military, most of which didn’t even exist by August 2021 and the rest of which melted away when the Taliban advanced.

Just imagine for a minute where we might really be today if Congress had spent close to $8 trillion rebuilding this society, rather than (un)building and wrecking distant ones.

Rest assured, this is not the country that ended World War II in triumph or even the one that outlasted the Soviet Union and whose politicians then declared it the most exceptional, indispensable nation ever. This is a land that’s crumbling before our eyes, being (un)built month by month, year by year. Its political system is on the verge of dissolving into who knows what amid a raft of voter suppression laws, wild claims about the most recent presidential election, an assault on the Capitol itself, and conspiracy theories galore. Its political parties seem ever more hostile, disturbed, and disparate. Its economy is a gem of inequality, its infrastructure crumbling, its society seemingly coming apart at the seams.

And on a planet that could be turning into a genuine graveyard of empires (and of so much else), keep in mind that, if you’re losing your war with climate change, you can’t withdraw from it. You can’t declare defeat and go home. You’re already home in the increasingly dysfunctional, increasingly (un)built U.S. of A.

Featured image: afghanistan by The U.S. Army is licensed under CC BY 2.0.



Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Books, John Feffer’s new dystopian novel, Songlands (the final one in his Splinterlands series), Beverly Gologorsky’s novel Every Body Has a Story, and Tom Engelhardt’s A Nation Unmade by War, as well as Alfred McCoy’s In the Shadows of the American Century: The Rise and Decline of U.S. Global Power and John Dower’s The Violent American Century: War and Terror Since World War II.

READ MORE


Palestinian women wait to cross through the Qalandia checkpoint near the West Bank city of Ramallah. (photo: Abbas Momani/AFP/Getty Images)

Palestine: An Open-Air Museum of Colonialism
Omar Khalifah, Al Jazeera
Khalifah writes: "Palestine has been turned, brutally, into a permanent museum of colonialism whose doors should have closed long ago."

On a recent visit to Palestine (I belong to a category of Jordanian Palestinians who can visit Palestine using an Israeli-issued ID card), a Palestinian friend of mine in Ramallah invited me to drive with him to Bethlehem. Thirty minutes into the trip, we stopped at an Israeli checkpoint, pulling into a huge queue of cars. The place was engulfed by an apathetic silence, perhaps indicative of how normal the situation was for those experiencing it. I, however, felt increasingly impatient, and I asked my friend if it would take too long before we were allowed to move. My friend responded, rather sarcastically, “This is Palestine. You can never predict when to move or to stop. People have lost any sense of what a meeting time means. You arrive when you arrive.”

Welcome to Palestine – an open-air museum of colonialism.

For most people nowadays, colonialism is part of a bygone era. The majority of the world’s population has no first-hand experience of it, and many cannot imagine what it means to live under total foreign control. Today we have museums of colonialism, where people can go to learn about how this form of rule affected natives’ freedoms to live, to move, to speak, to work, and even to die peacefully. We live (supposedly) in a postcolonial world, and museums of colonialism serve to transport visitors back to a cruel era, granting them a glimpse of the damage this type of governance wrought on native communities.

What if, however, there were an actual place in our world today where colonialism and post-colonialism co-existed? Herein lies the sad, almost incomprehensible Palestinian contribution to the museum industry. If museums of colonialism reimagine the past in a modern setting, Palestine is both past and present – a colonial and postcolonial reality. In Palestine, there is no need to create a museum of colonialism: the whole country functions as such.

At any museum, you can expect to be able to explore different sections on different themes. The same holds true in Palestine – it has various sections, each displaying a different layer of colonialism. There is the West Bank, where you can see illegal Israeli settlements, expropriated land, a separation wall, and a physically controlled population. Then there is Gaza, where open-air museum meets open-air prison, as two million Palestinians have been living under an Israeli blockade for more than 15 years. And if you are more into surveying a surreal case of colonialism, then head to Israel proper and find out how Palestinians who stayed in historic Palestine after the foundation of Israel live. There, you will learn about stolen houses, demolished villages, second-class citizens, and institutionalised racism.

Open-air museums seek to give visitors a direct experience of what it was like to live in the past. When I tell foreign friends that settler-only roads surround my tiny village, Burin, located a few kilometres southwest of Nablus in the West Bank, they respond with a disbelieving gasp. For many, colonial-era conditions are inconceivable in our time, and yet they have been the status quo in Palestine for decades. People who would like to learn about colonialism need look no further than Palestine. It is colonialism incarnate.

Recognising 21st century Palestine as an open-air museum of colonialism casts the longstanding Palestinian-Israeli conflict in a different light. During the latest war in Gaza, some supporters of Israel legitimised its use of force by noting that any sovereign state would have reacted similarly to defend itself had it been under rocket fire from another state. Hamas launched rockets into Israeli territory, so this logic goes, and so Israel has a right to fight back.

This repeated argument ignores one crucial reality of the situation: Gaza is not a state. The West Bank is not a state either. In fact, there is no Palestinian state. The conflict between Israelis and Palestinians is not one between two sovereign states. Rather, it is a conflict between a colonised people and their coloniser.

Framing Palestine as a colonial question is essential to understanding the peculiarity of the Palestinian condition. For many people around the world, Palestine is an enigma. How is it that for so long Palestinians have been stuck in a situation that is seemingly so unchangeable, fixed, intractable? Statelessness, uprooting, refugeehood, and resistance have practically become permanent descriptors of Palestinians. The conflict between Palestinians and Israelis has evolved into a cornerstone of our modern soundscape – something always happens there, except what happens never brings about any serious change to the status quo.

If Palestine is often viewed as a persistent dilemma whose resolution is long overdue, it is because Palestine is more of an anomaly than an enigma. Palestinians have not experienced the kind of history that most colonised peoples have. In most cases, the story of former colonies followed a linear path: colonialism, anti-colonial struggle, and then independence – a new nation-state. This pattern was so forceful and the defeat of colonialism so successful that the last few decades have witnessed the emergence of a powerful new field of intellectual inquiry aptly named “postcolonial studies”. Ironically, one of the grand masters of this field was Palestinian – the late Edward Said.

Not so for Palestinians. Unlike other would-be nations in the Middle East, such as Jordan, Iraq, and Syria, Palestine did not witness an end to a British or French Mandate that would lead to the formation of an independent nation-state. Rather, the termination of the British Mandate of Palestine in 1948 led to what Palestinians view as another form of colonialism.

The Zionist movement, which would form Israel and result in the destruction of Palestinian society and the ethnic cleansing of Palestine (a series of events known in Palestinian historiography as the Nakba, or Catastrophe), has managed to halt the linear progression of Palestinians’ path to self-determination. Both before and after 1948, Palestinians have been struggling to resist first British and then Zionist colonialism; to realise their dream of a free, independent state; and to cast off their own specific, multilayered experiences of imperialism.

Put bluntly, Palestinians have yet to enter the postcolonial world order. As individuals, they live in the 21st century, but as a stateless nation, they are still captive to the pre-1948 colonial moment. This is the anomaly of Palestinian time: as Columbia University professor Joseph Massad characterises it, Palestine can be understood as a “postcolonial colony”, a region where two periods, two world views, two eras, fiercely collide. This is why it functions as an open-air museum of colonialism – it is at once past and present, with the exploitative policies and practices of colonialism on perpetual display.

It is dangerous to view Palestine as solely a human rights issue – it is drastically more. Palestinians are a living demonstration of what colonialism looks like. They simultaneously belong and do not belong to the postcolonial order. For them, 1948 is not just a memory – it is an ongoing reality, a moment in time that has been stretched to define who they are, and who they are not. Palestine has been turned, brutally, into a permanent museum of colonialism whose doors should have closed long ago.




Raising livestock contributes significantly to carbon emissions. (photo: CWA)


20 Meat and Dairy Companies Emit More Greenhouse Gas Than Germany, Britain or France
Sophie Kevany, Guardian UK
Kevany writes: "Twenty livestock companies are responsible for more greenhouse gas emissions than either Germany, Britain or France – and are receiving billions of dollars in financial backing to do so, according to a new report by environmental campaigners."

Livestock companies with large emissions receive billions of dollars in funding, campaigners say

Twenty livestock companies are responsible for more greenhouse gas emissions than either Germany, Britain or France – and are receiving billions of dollars in financial backing to do so, according to a new report by environmental campaigners.

Raising livestock contributes significantly to carbon emissions, with animal agriculture accounting for 14.5% of the world’s greenhouse gas emissions. Scientific reports have found that rich countries need huge reductions in meat and dairy consumption to tackle the climate emergency.

Between 2015 and 2020, global meat and dairy companies received more than US$478bn in backing from 2,500 investment firms, banks, and pension funds, most of them based in North America or Europe, according to the Meat Atlas, which was compiled by Friends of the Earth and the European political foundation, Heinrich Böll Stiftung.

With that level of financial support, the report estimates that meat production could increase by a further 40m tonnes by 2029, to hit 366m tonnes of meat a year.

Although the vast majority of growth is likely to take place in the global south, the biggest producers will continue to be China, Brazil, the USA and the members of the European Union. By 2029, these countries may still produce 60% of worldwide meat output.

Across the world, the report says, three-quarters of all agricultural land is used to raise animals or the crops to feed them. “In Brazil alone, 175m hectares is dedicated to raising cattle,” an area of land that is about equal to the “entire agricultural area of the European Union”.

The report also points to ongoing consolidation in the meat and dairy sector, with the biggest companies buying smaller ones and reducing competition. The effect risks squeezing out more sustainable food production models.

“To keep up with this [level of animal protein production] industrial animal farming is on the rise and keeps pushing sustainable models out of the market,” the report says.

The recent interest shown by animal protein companies in meat alternatives and substitutes was not yet a solution, campaigners said.

“This is all for profit and is not really addressing the fundamental issues we see in the current animal protein-centred food system that is having a devastating impact on climate, biodiversity and is actually harming people around the globe,” said Stanka Becheva, a food and agriculture campaigner working with Friends of the Earth.

The bottom line, said Becheva, is that “we need to begin reducing the number of food animals on the planet and incentivise different consumption models.”

More meat industry regulation is needed too, she said, “to make sure companies are paying for the harms they have created throughout the supply chain and to minimise further damage”.

On the investment side, Becheva said private banks and investors, as well as development banks such as the World Bank and the European Bank for Reconstruction and Development, needed to stop financing large-scale, intensive animal protein production projects.

Responding to the report, Paolo Patruno, deputy secretary general of the European Association for the Meat Processing Industry (CLITRAVI), said: “We don’t believe that any food sector is more or less sustainable than another. But there are more or less sustainable ways to produce plant or animal foods and we are committed to making animal protein production more sustainable.

“We also know that average GHG [greenhouse gas] emissions in the EU from livestock is half that of the global average. The global average is about 14% and the EU average is 7%,” he added.

In England and Wales, the National Farmers’ Union has set a target of reaching net zero greenhouse gas emissions in agriculture by 2040.



 







