On August 27, 1928, in Paris, with due pomp and circumstance, representatives of fifteen nations signed an agreement outlawing war. The agreement was the unanticipated fruit of an attempt by the French Foreign Minister, Aristide Briand, to negotiate a bilateral treaty with the United States in which each nation would renounce the use of war as an instrument of policy toward the other. The American Secretary of State, Frank Kellogg, had been unenthusiastic about Briand’s idea. He saw no prospect of going to war with France and therefore no point in promising not to, and he suspected that the proposal was a gimmick designed to commit the United States to intervening on France’s behalf if Germany attacked it (as Germany did in 1914). After some delay and in response to public pressure, Kellogg told Briand that his idea sounded great. Who wouldn’t want to renounce war? But why not make the treaty multilateral, and have it signed by “all the principal powers of the world”? Everyone would renounce the use of war as an instrument of policy.

Kellogg figured that he had Briand outfoxed. France had mutual defense treaties with many European states, and it could hardly honor those treaties if it agreed to renounce war altogether. But the agreement was eventually worded in a way that left sufficient interpretive latitude for Briand and other statesmen to see their way clear to signing it, and the result was the General Treaty for the Renunciation of War, also known as the Paris Peace Pact or the Kellogg-Briand Pact. By 1934, sixty-three countries had joined the Pact—virtually every established nation on earth at the time.

The Treaty of Versailles, signed in 1919, gets bad press. It imposed punitive conditions on Germany after the First World War and is often blamed for the rise of Hitler. The Kellogg-Briand Pact does not get bad press. It gets no press. That’s because the treaty went into effect on July 24, 1929, after which the following occurred: Japan invaded Manchuria (1931); Italy invaded Ethiopia (1935); Japan invaded China (1937); Germany invaded Poland (1939); the Soviet Union invaded Finland (1939); Germany invaded Denmark, Norway, Belgium, the Netherlands, Luxembourg, and France and attacked Great Britain (1940); and Japan attacked the United States (1941), culminating in a global war that produced the atomic bomb and more than sixty million deaths. A piece of paper signed in Paris does not seem to have presented an obstacle to citizens of one country engaging in the organized slaughter of the citizens of other countries.

In modern political history, therefore, the Paris Peace Pact, if it is mentioned at all, usually gets a condescending tip of the hat or is dutifully registered in a footnote. Even in books on the law of war, little is made of it. There is not a single reference to it in the political philosopher Michael Walzer’s “Just and Unjust Wars,” a classic work published in 1977. The summary on the U.S. State Department’s Web site is typical: “In the end, the Kellogg-Briand Pact did little to prevent World War II or any of the conflicts that followed. Its legacy remains as a statement of the idealism expressed by advocates for peace in the interwar period.”

The key term in that sentence is “idealism.” In international relations, an idealist is someone who believes that foreign policy should be based on universal principles, and that nations will agree to things like the outlawry of war because they perceive themselves as sharing a harmony of interests. War is bad for every nation; therefore, it is in the interests of all nations to renounce it.

An alternative theory is (no surprise) realism. A realist thinks that a nation’s foreign policy should be guided by a cold consideration of its own interests. To a realist, the essential condition of international politics is anarchy. There is no supreme law governing relations among sovereign states. When Germany invades France, France cannot take Germany to court. There are just a lot of nations out there, each trying to secure and, if possible, extend its own power. We don’t need to judge the morality of other nations’ behavior. We only need to ask whether the interests of our nation are affected by it. We should be concerned not with some platonic harmony of interests but with the very real balance of power.

A standard way to write the history of twentieth-century international relations is to cast as idealists figures like Woodrow Wilson, who, in 1917, took the United States into a European war to make the world “safe for democracy,” and the other liberal internationalists who came up with the League of Nations and the Kellogg-Briand Pact. The Second World War proved these people spectacularly wrong about how nations behave, and they were superseded by the realists.

To the realists, such Wilsonian ideas as world government and the outlawry of war were quixotic. Nations should recognize that conflict is endemic to the international arena, and they should not expend blood and treasure in the name of an abstraction. Containment, the American Cold War policy of preventing the Soviet Union from expanding without otherwise intervening in its affairs, was a realist policy. Communists could run their own territories however they liked as long as they stayed inside their boxes. If our system was better, theirs would eventually implode; if theirs was better, ours would. The author of that policy, the diplomat George Kennan, called the Kellogg-Briand Pact “childish, just childish.”

And yet since 1945 nations have gone to war against other nations very few times. When they have, most of the rest of the world has regarded the war as illegitimate and, frequently, has organized to sanction or otherwise punish the aggressor. In only a handful of mostly minor cases since 1945—the Russian seizure of Crimea in 2014 being a flagrant exception—has a nation been able to hold on to territory it acquired by conquest.

Historians have suggested several reasons for this drop in the incidence of interstate war. The twenty years after the Second World War were a Pax Americana. By virtue of the tremendous damage suffered in the war by all the other powers, the United States became a global hegemon. America kept the peace (on American terms, of course) because no other country had the military or economic capacity to challenge it. This is the “great” America that some seventy-five million American voters in the last Presidential election were born in, and that many of them have been convinced can be resurrected by shutting the rest of the world out—which would be a complete reversal of the policy mind-set that made the United States a dominant power back when those voters were children.

By the nineteen-seventies, the rest of the world had caught up, and students of international affairs began to predict that, in the absence of a credible global policeman, there would be a surge in the number of armed conflicts around the world. When this didn’t happen, various explanations were ventured. One was that the existence of nuclear weapons had changed the calculus that nations used to judge their chances in a war. Nuclear weapons now operated as a general deterrent to aggression.

Other scholars proposed that the spread of democracy—including, after 1989, the revolutions in Eastern Europe and the breakup of the Soviet Union—made the world a more peaceable place. Historically, democracies have not gone to war with other democracies. It was also argued that globalization, the interconnectedness of international trade, had rendered war less attractive. When goods are the end products of a worldwide chain of manufacture and distribution, a nation that goes to war risks cutting itself off from vital resources.
