The Perils of Hegemony

Washington learns that democracy is not made for export.

A distinguished analyst of international politics, Martin Wight, once laid it down as a fundamental truth that “Great Power status is lost, as it is won, by violence. A Great Power does not die in its bed.” But 12 years ago, the Soviet Union, a state not exactly averse to violence, confounded all expectations by doing just that. It sickened and quietly expired, without war or bloodshed.

When the communist superpower ceased to exist, it did more than bring the Cold War to an end. It also altered fundamentally the structure of the international political system. For the first time in its history, that system became unipolar. The United States became a global hegemon. While there have often been local or regional hegemonies—the Soviet Union in Eastern Europe, for example, or the United States in the Caribbean, and later in the Atlantic Alliance—there has never before been one that dominated the whole system.

How fundamental a change this is can be gauged from the fact that one of the main themes in the history of the state system has been the repeated and determined efforts of alliances of states to prevent any of their number from achieving systemic hegemony, even at the cost of long and bloody wars. Philip II of Spain in the 16th century, Louis XIV in the 17th and early 18th centuries, Napoleon at the beginning of the 19th century, and the Emperor Wilhelm II of Germany and Hitler in the 20th century each tried for domination; all were eventually thwarted. And millions were killed in the process.

Britain played a prominent part in forming coalitions to balance and oppose the would-be dominant power, changing its allies as the challengers changed. Then, in the 19th century, Britain itself became very powerful. It dominated the world industrially, commercially, and financially. Its navy ruled the seas. It had a vast empire and established a Pax Britannica in large areas of the world. All this has led some to claim that in the middle of the 19th century Britain had indeed achieved global hegemony. But it is not a convincing claim. For Britain never achieved or sought to achieve dominance in continental Europe, which was the heart of the state system, where things were finally decided. It never acquired the formidable land army that would have been necessary to exert such dominance. Indeed, the German chancellor Bismarck used to say derisively that if the British army were to land on the North German coast, he would send a policeman to arrest it.

During the time of their greatest power, the British followed a prudent policy of “Splendid Isolation,” keeping their distance from matters that did not affect them seriously and not taking too assertive a role in European affairs. They played the role of offshore balancer, aiming not at achieving hegemony but at preventing any other states from doing so, while Britain itself dominated much of the rest of the world. So, no, Britain in the Victorian era was not a true global hegemon.

Stronger states have typically joined together against the prospective hegemon—as England, Austria, Holland, and Russia allied against the France of Louis XIV, or as France, England, and Russia joined together to balance a very powerful and assertive Germany before 1914. On the other hand, weaker and more vulnerable states, or those that for some reason—ethnic, cultural, or ideological affinity; a history of past friendly association—have hopes that they may receive favorable treatment at the hands of the ambitious state, may opt to become its associates or accomplices. Balancing or bandwagoning is basically the choice for all those caught in the scope of the hegemon’s ambition.

But how can they know in advance the scope of that ambition? The answer is that they cannot know, but as a matter of prudence they must assume. That is, they must assume that in a system of independent states coexisting in a state of anarchy, without any superior authority to restrain them or common loyalty to bind them, those who have the capacity to do so will dominate others who are weaker. As a wit summed it up: When there is no agreement as to which suit is trumps, clubs are always trumps.

This view may seem unduly cynical, an example of the kind of self-fulfilling fear that characterizes Realpolitik and the Hobbesian view of international politics. And it may be that. But it is an interpretation of the motives and behaviors of states that has a long pedigree. It is to be found, for example, in the first great work on interstate politics, Thucydides’s History of the Peloponnesian War, written some two and a half thousand years before Henry Kissinger put pen to paper.

When Thucydides comes to discuss the causes of that war, he says that he will begin by giving an account of the specific complaints and disagreements that Athens and Sparta had with each other. But he advises that these in themselves will provide an inadequate and misleading explanation of the conflict. In an often quoted sentence, he gives what he considers the real, the fundamental, cause: “What made war inevitable was the growth of Athenian power and the fear which this caused in the Spartans.”

Why the fear? Because, as he puts it, “the strong do what they can and the weak suffer what they must.” What they can, note, not what they might originally have intended to do. For unchecked power creates its own motives and sets its own agenda. As Alexander Hamilton put it in another classic political text, the Federalist Papers:

    To presume a want of motives for such contests, as an argument against their existence, would be to forget that men are ambitious, vindictive and rapacious. To look for a continuation of harmony between a number of independent, unconnected sovereignties situated in the same neighborhood would be to disregard the uniform course of human events, and set at defiance the accumulated experience of ages.

Indeed, the Founding Fathers of the United States had such a fear of uncontained power, even in the hands of their elected fellow countrymen, that they made the separation and balancing of powers the outstanding feature of their constitution.

The policy conclusion that follows from such an analysis was most succinctly put by another Greek historian, Polybius, in the form of a maxim: “It is never right to help a power to acquire a predominance that will render it irresistible”—never right, that is, if a state values its own independence. If it should value order or peace above all else, there might be a case for submitting to the prospective hegemon. But that would be at the cost of one’s independence and freedom of action.

* * *

America’s emergence as a hegemonic power came not by deliberate effort, but inadvertently, by the default of the Soviet Union. One moment the United States was part of a bipolar balance, the next it was left as the one superpower in a unipolar world. It had not changed its policies or mode of behavior to bring this about. The speed with which things changed meant that American hegemony was an accomplished fact before anyone had time to react to it or attempt to prevent it.

And the process drew little attention to itself: most eyes were fixed on Moscow. For these reasons, the usual historical process of determined opposition to an aspiring hegemon did not take place.

Indeed, it took America herself some time to realize what had happened and how dominant she now was. When the Soviet system collapsed, the American people, far from enjoying an unalloyed sense of triumph, were experiencing their own crisis of confidence. In the late 1980s, it was widely believed, especially by American opinion leaders and intellectuals, that America was in decline and suffering from what historian Paul Kennedy had recently labeled “imperial overstretch.” The American economy was experiencing a long bad spell. Japan and Germany were coming up fast, and it was widely believed that the former would soon displace the United States as the number one economic power in the world.

Apart from all that, the country was suffering from serious social ills, and opinion polls were making it clear that the American people were tired of the burdens of foreign policy and wanted a re-ordering of priorities. Jeane Kirkpatrick, who had herself been a dedicated cold warrior, was expressing a widely held view when she wrote in 1990:

    The United States performed heroically in a time when heroism was required; altruistically during the long years when freedom was endangered. The time when Americans should bear such unusual burdens is past. With a return to ‘normal’ times, we can again become a normal nation—and take care of pressing problems of education, family, industry and technology. It is time to give up the dubious benefits of superpower status and become again an open American republic.

It was widely believed, both in the United States and elsewhere, that this was a unipolar moment, not a unipolar era. For the general assumption was that the end of the Cold War signaled a return to normality, and in international politics normality had always meant multipolarity. As late as 1994, Henry Kissinger was predicting the gradual military decline of the United States and the emergence of “at least six major powers.”

All these factors combined to obscure and disguise what should have been obvious both to Americans and to the rest of the world: that the United States now had hegemonic power. Whatever problems the U.S. economy had, it still accounted for well over a quarter of the world’s gross domestic product. And soon it was to recover and enjoy a long boom fueled by the so-called New Economy of information technology. In the 1990s, the United States economy was to grow nearly twice as fast as the European Union’s and three times as fast as Japan’s.

The United States also dominated what we have now been instructed to think of as “soft power,” the cultural and intellectual influence represented by everything from Harvard to Hollywood, CNN to McDonald’s, popular music to computer software to jeans. Joseph Nye of Harvard, who coined the term, argues that these mold the tastes and thoughts of others, making them want what Americans want—and thus, without any co-ordinated intent, constitute a kind of cultural hegemony. I have my doubts as to whether all this constitutes “power” in any real sense. After all, many Americans, far from approving of many aspects of their popular culture, are appalled that it represents America in the minds of millions of foreigners. And far from desiring all aspects of American culture, many foreigners see its manifestations as symbolizing all that they reject in America and resent in their own countries.

Last, but certainly not least, the United States possessed in unprecedented measure a form of power about which there is no ambiguity: military power. Until the Cold War, Americans had always been suspicious of professional armies. After all, the country had come into existence in the 18th century as the result of the exertions of citizen-soldiers against a British professional army. In his Farewell Address to the nation in 1796, George Washington, himself the country’s greatest soldier, urged future generations to “avoid the necessity of those overgrown military establishments, which under any form of government are inauspicious to liberty.”

His advice was followed. For nearly two centuries, the country never maintained a large peacetime army. Whenever a crisis occurred, it quickly raised forces of citizen-soldiers to meet it. Once the crisis was over—after the Civil War, after World Wars I and II—these forces were promptly disbanded. Soldiering was a low-prestige occupation, the army marginal to the life of the country. Until the beginning of the Cold War in the late 1940s, the United States did not have a Defense Department. It did not have a National Security Council. It did not have a Central Intelligence Agency. All these were only created in 1947, just as the Cold War was getting underway.

Four and a half decades later, the condition of the U.S. military, and its significance in American life, had experienced a monumental transformation. The Pentagon had become the most powerful department in American government. It sustained a huge defense industry of vital importance to the U.S. economy. Its officers were no longer languishing in the boondocks, but were an influential part of the Washington scene. A network of institutions, colleges, think-tanks, and journals sustained a sophisticated military culture. Given all this, it is not surprising that at the end of the Cold War there was not an immediate demobilization and drastic scaling down of the military establishment. It had become too powerful, too deeply embedded, for that to happen.

In politics, the relationship between ends and means is not all one-way. The capacity to do something contributes—sometimes substantially—to the attractiveness of doing it. Given that the United States had far and away the most powerful military machine in the post-Cold War world, it is not really surprising then that before long Madeleine Albright was asking an astonished Colin Powell: What is the use of having such a powerful military force if you are not prepared to use it?

* * *

According to Charles Krauthammer, from the end of the Cold War until the terrorist attack of Sept. 11, 2001, the United States took a ten-year “holiday from history.” On the face of it, this seems a strange way to characterize American behavior during the decade. United States military forces were more active during these years than at any time since the Vietnam War—in the Gulf and Iraq, in Somalia, in Haiti, in Bosnia, in Afghanistan, in Sudan, in Colombia, and in Kosovo. The American economy enjoyed a sustained six-year boom. On Washington’s initiative, NATO expanded eastward towards the Russian border. The North American Free Trade Agreement was negotiated and the World Trade Organization established.

Far from thinking that the United States was on vacation during these years, other countries were increasingly aware of its dominant presence. During the Clinton administration, German Chancellor Gerhard Schroeder expressed the view, “That there is danger of unilateralism, not by just anybody but by the United States, is undeniable.” The French Foreign Minister, Hubert Vedrine, reflected, “American globalism dominates everything. Not in a harsh, repressive, military form, but in people’s heads.”

Closer to home, by the mid-1990s as sound and patriotic a judge as James Schlesinger was detecting a “growing hubris” in the conduct of American foreign policy, and “a naïve belief that assertiveness is now cost-free and does not entail serious consequences.” In what sense, then, could it be thought that the United States was taking a “holiday from history”?

What Krauthammer meant, I believe, was that during these years, the United States, having become the sole remaining superpower and an authentic global hegemon, had failed to articulate a grand purpose commensurate with that status. Most countries might not feel the need for such a thing. But Americans do. They have a great taste for doctrines that set out the objectives that are to determine policy, as in the Monroe Doctrine, the Truman Doctrine, and the Reagan Doctrine.

No such thing was evident during the last decade of the 20th century. George Bush senior confessed that he wasn’t very good at “the vision thing,” and his concept of a “new world order” was stillborn. His successor, William Clinton, had little taste for doctrines or vision. A connoisseur of opinion polls and focus groups, he knew that Americans consistently gave foreign policy a very low priority, and he acted accordingly, taking only a limited interest in foreign affairs and improvising when he did engage with them.

In January 2001, George W. Bush succeeded Clinton. How the Bush administration’s foreign policy would have developed in the absence of the Sept. 11 attack, we shall never know. But in an instant the terrorists gave the country the clear purpose that it had previously lacked. That organizing principle came under the name of “a war on terrorism.” It was adopted not as a result of cool calculation or choice, but out of necessity and in a mood of understandable outrage at the unprecedented violation that had been visited on the United States.

Now the concept of a War on Terror is general enough to support more than one meaning. It can be interpreted precisely, in terms of destroying the organizations and instruments of terror and protecting the homeland against their efforts. But it can also be defined broadly to encompass changing the conditions that give rise to terrorism, and the creation of an international order that would be inimical to its existence—not only “draining the swamp,” as the phrase goes, but creating a fertile liberal and democratic pasture in its place.

Initially, the stress was on the former. But there were many in Washington’s foreign-policy establishment who saw things in much more sweeping terms, and Sept. 11 shifted the balance in their favor—away from prudence and moderation toward conceptual boldness and an ambitious use of American power. Within a year, the War on Terror had metastasized into something much grander and more radical; something that would give full expression to one of the strongest strands in the history of the American people: the profound belief that they and their country are destined to reshape the world. America’s “cause is the cause of all mankind,” said Benjamin Franklin; “We have it in our power to begin the world over again,” insisted Tom Paine; “God has predestined, mankind expects, great things from our race …. We are pioneers of the world,” said Herman Melville. Abraham Lincoln declared America to be “the world’s last best hope.” And so on and on.

Many in and around the Bush administration shared this sense of America’s destiny and saw in 9/11 not merely a disaster to be avenged but an opportunity to reawaken and redirect America to its true historic mission.

This is what Robert Kagan means when he insists that “America did not change on Sept. 11. It only became more itself.” As he explains, the national ideology has always insisted that “The proof of the transcendent importance of the American experiment would be found not only in the continual perfection of American institutions at home but also in the spread of American influence in the world …. That is why it was always so easy for so many Americans to believe, as so many still believe today, that by advancing their own interests, they advance the interests of humanity.”

In the aftermath of Sept. 11 those who thought in these terms came into their own. The result became fully evident a year after the terrorist attack with the publication of a 31-page statement by the president titled “The National Security Strategy of the United States of America.”

For a document concerned with strategy, it puts an extremely heavy emphasis on ideology in defining America’s purpose. In its first three pages alone, it uses the words “liberty” and “freedom,” or some variation of them, 25 times, while the word “interest” occurs only twice. The document declares that the national strategy will be based on “a distinctly American internationalism.” It will “use this moment of opportunity to extend the benefits of freedom across the globe … will actively work to bring the hope of democracy, development, free markets, and free trade to every corner of the world.” To that end, the United States will seek “to create a balance of power that favors human freedom: conditions in which all nations and all societies can choose for themselves the rewards and challenges of political and economic liberty.” Note that the assumption is that, given a free choice, these are the values that all people will choose.

As well as reordering the internal conditions of countries in this way, the United States will reorder relations among states, for, as the document asserts, “the international community has the best chance since the rise of the nation-state in the seventeenth century to build a world where great powers compete in peace instead of continually prepare for war.”

The president ends his introduction by declaring, “The United States has responsibility to lead this great mission.” It is made unambiguously clear that the United States military will be an indispensable instrument for the creation of a new order and that the United States intends to maintain indefinitely the enormous military superiority it now enjoys. It is time, the president says,

    [T]o reaffirm the essential role of American military strength. We must build and maintain our defenses beyond challenge …. Our forces will be strong enough to dissuade potential adversaries from pursuing a military build-up in hopes of surpassing, or equaling, the power of the United States.

The military will be used actively and assertively, deployed even more widely than it was during the Cold War as a kind of global gendarmerie maintaining order.

And this strategy intends to maintain, if not increase, America’s military power as it discourages others from building up theirs. Thus, two pages before it declares the essential role of American military strength, it advises the Chinese that:

    In pursuing advanced military capabilities that can threaten its neighbors in the Asia-Pacific region, China is following an outdated path that, in the end, will hamper its own pursuit of national greatness.

This might seem a clear example of double standards. The defenders of the new doctrine do not deny this but justify it in terms of the special responsibilities of the United States for world order. As Robert Kagan puts it, because of those responsibilities, America “must refuse to abide by certain international conventions that may constrain its ability to fight effectively …. It must support arms control, but not always for itself. It must live by a double standard.” Which, of course, raises the important question of whether other countries will ever be willing to accept that double standard. The whole history of international politics suggests that they will not.

Also key to the new doctrine is its abandonment of deterrence, which was effective in dealing with a rational and cautious adversary like the Soviet Union, but is less so in dealing with risk-taking rogue states. Instead, a greatly extended policy of pre-emptive action must now be adopted:

    The greater the threat, the greater is the risk of inaction—and the more compelling the case for taking anticipatory action to defend ourselves, even if uncertainty remains as to the time and place of the enemy’s attack. To forestall or prevent such hostile acts by our adversaries, the United States will, if necessary, act preemptively.

Another feature of the Bush doctrine is its unilateralism. (“We will not hesitate to act alone, if necessary.”) In a sense, the very genesis of the document testifies to this, since its intention to alter the international system fundamentally was announced with little or no consultation with other states.

What can one say about this strategic doctrine? The first thing to be emphasized is its breathtaking scope and ambition: it proposes to do no less than effect a transformation of the political universe—according to some of its language, to stamp out evil and war between states and to create a benign world. Students of the realist school tend to see such goals as beyond the reach of even a country with the enormous power of the United States. While America has enough strength to defeat all adversaries and rivals, it remains to be seen whether she can conquer Utopia.

In insisting upon the dominant role of the United States and the assertive use of American power, the doctrine makes very questionable assumptions about what the other states will accept. They are asked to take good intentions on trust, but states have never been prepared to do this with other would-be hegemons.

Will the United States be the exception? Does the fact that it is a democratic and liberal state make a decisive difference? Will other states accept the concept of a benign hegemon or regard it as a contradiction in terms? Bearing in mind the distrust of unbalanced and concentrated power that is manifest in the United States’ own constitution, Americans should not be surprised if others are skeptical.

The thrust and tone of the doctrine reject the advice given by most pundits on the best way to play a hegemonic role: to be restrained and prudent in the use of power, to disguise it, to strive to act as far as possible by persuasion and consensus. In the 1940s, when the United States was already the dominant power within the Western Alliance, it acted on this advice. It went out of its way to act multilaterally, to create a network of rule-making institutions—the UN, IMF, World Bank, and GATT—that allowed it to act co-operatively with others, as primus inter pares—the first among equals. There is little of this to be found in the current doctrine. The prevailing view in Washington, as famously enunciated by Secretary of Defense Rumsfeld, has been, “the worst thing you can do is allow a coalition to determine what your mission is.”

The Bush doctrine should be taken seriously and not dismissed as rhetoric. It has already been put into effect in Iraq: the use of American military force as the main instrument; pre-emptive action; a clear indication that the United States was prepared to act without a Great Power consensus, and unilaterally if necessary; and the avowed intention to replace a tyrannical regime with a liberal representative government. That is why the Iraq commitment has an importance that goes way beyond the fate of Iraq itself. If, in the end, it turns out successfully, it is likely that the mishaps that have occurred since the end of the heavy fighting will be seen as part of a learning experience, a breaking-in period for a new, revolutionary, strategic doctrine. If, on the other hand, it fails at the first hurdle—if, that is, the United States finds that bringing about a decent political order is beyond its capacity—then not only will there have to be a reconsideration of the whole global strategy, but the limits of the United States’ capacity will have been made evident, and the inclination to resist it greatly strengthened.

* * *

To the critics, the belief that democratic institutions, behavior, and ways of thought can be exported and transplanted to societies that have no experience of them is profoundly mistaken. While the United States can provide an example to emulate, democracy is not a commodity that can be exported, or a gift that can be bestowed. To be viable, political institutions and political cultures require a long, organic, indigenous growth, and to attempt from without a sudden dislocation of what exists is more likely to produce unintended consequences than intended ones.

Supporters of the policy tend to regard all this as defeatist, an elaborate rationalization for doing nothing. Liberty, they assert, is a universal value; every society and culture desires it. To work for its realization through democratic institutions is not to impose anything, but merely to remove impediments and to render assistance in a learning process.

In terms of achievability, the trump cards in the hands of those who favor the policy, and usually the first cards played, are the examples of post-World War II Germany and Japan. But neither is particularly valid or relevant. First, the German and Japanese peoples were utterly defeated and crushed at the end of that conflict, and there were no surviving institutions or centers of opposition. In Iraq today the population is considered liberated, not defeated and deprived of rights. Second, Germany and Japan in 1945 were genuine nation-states with homogeneous populations and a strong sense of identity. This is true of few of the possible candidates for democratization today. Most of the states of the Middle East are artificial creations, arbitrarily carved out by Western powers. Third, and most important, before falling into the hands of extremist regimes, both Germany and Japan had considerable experience of the rule of law and civil society, as well as some significant experience of democratic practice. They had well-educated populations and substantial middle classes. Again, none of this is true of most of the targeted states today.

Another American experience seems much more relevant. Long before the United States became a global hegemon, it was a regional hegemon in the Caribbean. From the end of the 19th century it dominated the region and intervened as it saw fit. It occupied Haiti for 19 years, Nicaragua for even longer. Yet to this day the region has not produced one genuine, stable democracy. Nor was the United States to lay the foundations for a viable democracy during the three decades that it ruled the Philippines.

Some social scientists believe that the most reliable indicator of a country’s chances of achieving a viable democratic system is its economic performance. More precisely, a mean per capita income of around $6,000 makes the chance of a successful democratic transition very high. There are exceptions: the correlation does not apply to states with high incomes derived not from effort but solely from the luck of sitting on vast reserves of oil. But the correlation is a strong one, and the reasons are fairly evident. A developed economy requires, among other things, a reasonable education system, a developed middle class, significant access to information, and a legal system that enforces rules of commerce in a way that foreign investors and traders find acceptable.

What implications does this have for the policy of promoting democracy? First, in many cases the most efficient way of proceeding initially may not be the direct one of focusing on political reform but the indirect one of developing strong economic institutions. Second, the greater part of the effort should be directed at those countries that are approaching the transition stage with incomes not derived from oil or mineral wealth.

Even if the goal of promoting democracy is achievable, is it desirable? This may seem a strange question, for we are all in favor of “democracy,” aren’t we? Yes, we are, but when we speak of democracy, we almost invariably mean liberal democracy: a combination of democracy as a way of selecting government by competitive election and liberalism as a set of values and institutions, including the rule of law, an independent judiciary, an honest and impartial civil service, and a strong respect for human rights and private property. While we are accustomed to the two being linked together, it is worth considering whether there is a necessary connection between them.

Liberalism has in the past thrived in countries that were not democratic, as it did, for example, in Britain in the late 18th and early 19th centuries. Democracy can be, and often is, installed in countries that are not liberal. Democratic governments can assume intrusive and oppressive power—regarding property, for example, or religious practice, or the starting of businesses—while still observing the basic democratic requirements. In his recent book The Future of Freedom, Fareed Zakaria argues that illiberal democracy is an increasingly prevalent phenomenon. It would, he believes, be a healthier state of affairs if an evolution toward an orderly rule of law and a liberal civil society, presided over by some kind of enlightened elite, were to precede the installation of a democratic order, as was the case with most stable Western democracies. In considering the interrelationship between liberalism and democracy, he argues, we should recognize that the former is the precondition of a successful implementation of the latter, rather than vice versa. This may be particularly sound advice in dealing with the Middle East, for many of those who know the region well believe that if democracy were to be introduced under the prevailing conditions, the immediate result would be the installation of governments even more militantly Islamist, repressive, and anti-Western than those that now exist.

There is one other important respect in which democracy figures in the discussion of international relations in the post-Cold War era. It is claimed that the increasing spread of democracy across the globe will greatly reduce the incidence of warfare and create an extended zone of peace. For, it is maintained, the historical evidence shows that democracies rarely, if ever, go to war with each other.

One can, of course, argue about definitions and particular cases, but it is generally true that liberal democracies have managed to get along with each other without war. Britain and the United States have not fought a war since 1812. Britain and France, bitter and violent rivals in their pre-democratic days, have been at peace since 1815. America and Canada can live with a common border that is thousands of miles long, without any fortifications on it. Why is this? One answer was given by Immanuel Kant. He maintained that in a republican state, as opposed to an authoritarian one, there would be a presumption against war. Citizens “would be very cautious in … decreeing for themselves all the calamities of war.” For it would be they who would have to fight the wars, pay for them, and repair the devastation that would result.

In an important sense Kant claims too much, for the reasons he gives for liberal democracies not fighting each other should equally keep them from fighting non-democratic states. But in fact, democracies have gone to war with non-democratic states and peoples very frequently, and for a variety of reasons: imperial expansion, ideology, territorial rivalry, to secure markets, to punish, to restore or impose order. More often than not, these wars have been enthusiastically supported by citizens.

Whatever the explanation, it is certainly true that war between liberal democracies in today’s world seems utterly improbable. Whether the same would hold true for other democracies—between, say, an increasingly illiberal India and a Pakistan in which a corrupt and venal version of democracy had been restored—is an open question.

Some years ago, as the end of the Cold War approached and enthusiasm for exporting democracy was building up in Washington, I wrote:

    Americans of all political persuasions believe profoundly that it is their right and duty—indeed their destiny—to promote freedom and democracy in the world. It is a noble and powerful impulse, one not casually to be ridiculed or dismissed. But acting on it—if one is concerned to be effective and not merely to feel virtuous—is a complicated and delicate business, and the dangers are many. Success requires that this impulse be balanced against, and where necessary, circumscribed by, other interests that the United States must necessarily pursue, more mundane ones like security, order and prosperity. For these represent not merely legitimate competing claims but the preconditions for a lasting extension of democracy.

    Success requires, too, an awareness of the intractability of a world that does not exist merely in order to satisfy American expectations—a world that, for the most part, cannot satisfy those expectations in the foreseeable future. While determination and purposefulness are important ingredients in any effective policy, the attempt to force history in the direction of democracy by an exercise of will is likely to produce more unintended than intended consequences. The successful promotion of democracy calls for restraint and patience, a sense of limits and an appreciation of the wisdom of indirection, a profound understanding of the particularity of circumstances. As Thomas Carlyle once put it, ‘I don’t pretend to understand the Universe—it’s a great deal bigger than I am … People ought to be modester.’

This still reflects my views on the subject.

___________________________________________________

Owen Harries is a senior fellow at the Center for Independent Studies in Sydney, Australia. This article is taken from his 2003 Boyer lectures prepared for the Australian Broadcasting Corporation.
