So there we were, the world going to hell in a handbasket under a regime dominated not just by conservatives but by stupid conservatives, embroiling us in a war in Iraq that their own ignorance and lack of contact with reality made unwinnable. And I publish a book on rum.
It probably confirms the worst fears of the dour Leninoids who think that Slobodan Milosevic, Kim Jong Il, Fidel Castro and Saddam Hussein are the last of the great socialists and regard people like me as class traitors for thinking that human rights have anything to do with progress.
In some small defense, I did take time out last year to explore George W. Bush's military career. It could have been a very short book, but Deserter: George Bush's War on Military Families, Veterans and His Own Past went beyond his all-too-short military record into how that ignominious episode was obscured and inflated until far too many Americans think that the drunken wimp is a bemedalled war hero.
It also touches on Bush’s substance abuse problems, although I have no direct evidence to suggest that rum was ever snorted by the guy. In any case, it is clear that he neither drank nor abstained in moderation.
Incidentally, Castro himself has inveighed against the perils of rum drinking in Cuba of all places, so maybe he and W should start a joint world-wide temperance movement.
So back to rum. It reveals a lot about the history of the modern world: how this country was established and how the global economy became global.
And more to the point, it is fun stuff to drink. In the strange cultural wars of modern America, the right rallies social conservatives against the hedonistic hippy liberals whom it confuses with the far left. Why can't we rally the six-pack brigade against the dry evangelists and the guy in the White House who goes white-knuckled at the sight of a Bud Light? It's a massive constituency.
So as a snifter, here is my latest piece from the Nation site. You will note my ambivalence. Havana Club is better rum than Bacardi – but Fidel doesn’t drink!
http://www.thenation.com/doc/20051205/secret_history_of_rum
Politics, books, history, foreign affairs, Caribbean, Middle East, Palestine, Israel, Iraq, China, Britain, United Nations, Oil For Food, Bush the Deserter, sex and rum and 1776 and tequila and lots of fun things from someone who has more columns than the Parthenon.
Wednesday, November 30, 2005
Tuesday, November 29, 2005
Deadline Pundit returns from the Cybercrypt
As the Iraq War loomed, Danny Schechter and Rory O'Connor of Globalvision commissioned me to produce a daily column of news analysis. It was called the Deadline Pundit, since my mandate was to produce a thousand words of deathless but pithy prose by eight thirty every morning, and I continued for some months even after the money ran out. Reading the columns recently, I was pleased by how well they had stood the test of time, on issues like the New York Times's strange indulgence of Judith Miller's consequentially imaginative reporting compared with its intolerance for Jayson Blair's inconsequentially creative contributions.
With no crystal ball, we also foresaw the impending disaster of the Iraqi occupation at the hands of the malicious incompetents in the administration.
Over the years, many colleagues have asked me, why not do a blog? Well, I was very busy, and somewhat prejudiced. I thought: writers who can write, publish; those who can't, blog. That prejudice was reinforced by the anonymous attacks of conservative bloggers, like the eighth-grade schoolteacher from California whose grey and undistinguished career is spent tracking my writings and attacking the journalistic integrity of myself and my colleagues from behind the shelter of anonymity. It made me think that those who can, do, and those who can't, teach, while those who are too arrogant for the honorable teaching profession blog anonymously.
But then I saw distinguished colleagues who do indeed put their names on their blogs, and I decided to take the plunge.
So since I am busy writing for money today, and I am an ecological journalist, I will begin by recycling a piece from the Washington Spectator.
For the benefit of obsessive conservative would-be media critics, I should point out that once upon a time, when I worked for British Rail (RIP), I was the conductor on the Royal Train, but the experience did not influence my writing on this subject at all.
Monday, November 28, 2005
A TALE OF TWO POLITIES - Why George W. Bush Is Really Our King
http://www.washingtonspectator.com/articles/20051115kinggeorge_1.cfm
Washington Spectator November 15, 2005
A TALE OF TWO POLITIES
Why George W. Bush Is Really Our King
No one could blame President Bush for wanting to get out of town after the end of October. He'd just experienced what non-partisan political observer Charles Cook dubbed "the worst week of the worst month of the worst year of the Bush presidency." The president's approval ratings sagged to an all-time low of less than 40 percent; he suffered the humiliation of having his Supreme Court nominee torpedoed by opposition from his own party; the number of American soldiers killed in Iraq passed 2,000; he was lambasted for yet another slow response to a hurricane disaster, this time in Florida; and influential White House aide I. Lewis "Scooter" Libby resigned after being indicted for perjury and obstruction of justice.
Unfortunately for Bush, his early-November travel plans took him to Argentina for a two-day hemispheric trade summit of 34 nations, where despite his coaxing, no agreement was reached on resuming stalled negotiations on establishing the Free Trade Area of the Americas. The lesson from that weekend seemed to be that, as bad as things are at home, Bush is even less popular in Latin America. Strikes and mass demonstrations by anti-Bush protesters exploded outside the fortified gates of the hotel where the summit was held.
These days, Bush is bashed from Argentina to Australia, but it is rare to encounter a reasoned critique of the man and his administration presented as a means of enlightening the American public. Our colleague Ian Williams has the talent to do just that. Williams is a busy freelance writer, born in Liverpool but since 1989 based in New York. He serves as The Nation's U.N. correspondent and has been a regular contributor to many of Britain's major newspapers. His latest book, Rum: A Social and Sociable History of the Real Spirit of 1776, was published this summer.
Not long ago we criticized the Electoral College method of electing a president as an anti-democratic anachronism. In this issue Williams makes a more far-reaching argument: He sees the presidency itself—embodying the roles of both chief executive and head of state—as an unfortunate relic that the Founding Fathers would have done better to reconsider. Of course, such a critique has no hope of resulting in transformation of any sort. But taking the opportunity occasionally to see things through the eyes of a brilliant foreign correspondent can give us a fresh perspective on the state of our democracy—where we are today and how we got here.
The stately arrival of Prince Charles and his most recent spouse at the White House in early November, shortly after the unstately departure of Vice President Cheney's aide Lewis Libby from the same place and, one hopes, shortly before presidential adviser Karl Rove gets the bum's rush as well, was a thought-provoking event. Americans tend to assume that they have the finest democracy in the world—just as they assume that they have the best health care. It often takes an outside perspective to show up the eminently falsifiable nature of these suppositions, but it is always an uphill struggle.
To celebrate the royal visit, I was invited onto the Fox News channel to tut-tut on TV about the anachronistic nature of England's Windsor line. But alas, since Fox thinks that irony is what they used to make in Pittsburgh, my tongue-in-cheek defense of constitutional monarchy fell somewhat flat. I had forgotten that the untitled Rupert Murdoch, who owns Fox, is a republican as well as a Republican. But I notice that he did not exactly exclude his male heirs from the management of News Corp.
When the people at Fox asked me if the monarchy represented privilege, of course I said I could agree in principle, but I pointed out that in the constitutional monarchies of Scandinavia, the Low Countries and Britain, poor people have far more access to health care and education than in the current Georgian America. In fact, in every measurable way these societies are more egalitarian than the United States.
For all his eccentricities, Charles is a convinced environmentalist who supports the Kyoto Protocol, while George thinks global warming, like evolution (and indeed probably gravity as well), is just a theory, despite the hurricanes that batter hardest at the states that gave him the presidency.
With that in mind, I told Fox that the hereditary principle is indeed a dubious way to fill jobs, but that even if the prince were eccentric or barking mad, the world would be safe when he becomes Charles III, even if he only makes it because he's his mother's son. However, I cautioned, it made one hell of a difference to the world that George W., with more than a few psychological question marks of his own, had become George II just because he was the fruit of his father's loins. After all, no rational person would believe that the spoiled legacy brat who deserted from the Air National Guard and sank business after business would ever have succeeded in politics without strong dynastic backing.
AN 18TH-CENTURY ANTIQUE—In fact, when the putative Charles III shook hands with George II of the Bush dynasty, he was meeting someone who has pretty much all the powers of Charles's ancestor, the Hanoverian George III. An equestrian statue of King George was erected in 1770 by the colonists of New York, grateful for the repeal of the Stamp Act, and was toppled in ingratitude by the same people after a public reading of the newly written Declaration of Independence, just six years later.
Essentially unchanged since then, the American political system has escaped the reforms of the British and other democracies. While the powers of the European monarchs have become more and more diluted with each passing year until the kings and queens have all the significance of a team mascot for their nations, the presidential office has retained all those quasi-monarchical powers of centuries past.
As a Hanoverian monarch subject to election every four years, the American president appoints civil servants, ambassadors, and the whole Cabinet, on the same basis as the patronage system of eighteenth-century England. The Cabinet members he chooses need not have any independent political standing whatsoever. Indeed, as we saw with the heads of Homeland Security and FEMA, not much in the way of professional standing is required either.
Having such an intensely political personage as the head of state confuses issues. The American media and even the political classes show far more deference to the president of the United States than their British counterparts do to the queen of England and her numerous offspring. In fact, most people in the UK tend to ignore the monarchy except as a continuing royal reality show. I have heard Americans say, "I must support my president," but never heard anyone in Britain say, "I must support my prime minister."
When the U.S. separated from Britain, the institution of prime minister was in its infancy, and so it was not too surprising that the rebellious colonists overlooked the office in their Constitution, not least since they saw the prime minister of their day, Lord North, as a tool of the king.
Indeed, the title of prime minister itself was not formally adopted until 1905, even in Britain. However, as the office of prime minister has developed in Britain and other places, it has become clear that it is no bad thing for the chief executive to come from the ranks of legislators—and to be accountable to them. In such systems, the roles of head of state and chief executive are separate. But with its political system frozen in 1789, the United States missed out on this idea.
It is not only a question of much-needed political experience. We have to ask, how far would George W. Bush's political career have advanced if he had had to stand up for a Capitol Hill version of "Prime Minister's Question Time" and actually explain and defend his policies on the hoof against unscripted questions? On the other hand, looking at the docility of so many U.S. legislators, one may wonder whether they could come up with any killer questions on the spur of the moment without a team of aides whispering in their ears.
IMPORTANCE OF OPPOSITION—The offenses for which Libby was indicted suggest that in one major respect, the American political system is not only not reforming, but is actually devolving. To score petty domestic political points against an individual who had crossed them, high-ranking officials in the White House were quite prepared to compromise secret agents and national security, putting possibly scores of lives at risk. For the Bush team, opposition is always disloyal, and the law is no protection for that opposition.
If a democracy is to function and survive, the major protagonists within it must, in the end, believe in the concept of a "loyal opposition." It does not take too much examination of the world's politics to see that in many countries this is a complete oxymoron, and of course, there were times in American history, from the Federalist period onwards, when it did not operate too smoothly as a concept. The current White House has clearly abandoned the quaint idea entirely.
This is only the latest manifestation of that abandonment. Many conservatives, for example, never accepted that Clinton was really president. The mere accident of election did not persuade them that someone with his views could legitimately hold the office. Similarly, when it came to George W. Bush's assumption of office, the technical detail that he may not have actually won the election was for them no conceptual barrier at all to his taking the oath.
In their own idiosyncratic way, many Democratic legislators have also shown signs of abandoning the concept of a loyal opposition. They have emphasized the loyalty at the expense of the opposition. Being excluded from power does not make you an opposition: opposing the incumbents does. Though Harry Reid's marshaling of a serious look at the road to the Iraq War was a heartening sign, and the resistance to John Bolton's nomination as U.N. Ambassador was as well, these examples stand out because of their rarity.
THE PRIMARY PROBLEM—Their lack of feistiness is not the only problem. Democratic legislators must contend with one of the few innovations in the American political system since 1789: the electoral primaries. The original idea behind primaries was to take politics out of the smoke-filled rooms of the party bosses, where as Tammany Hall's Boss Tweed once said, "I don't care who does the electin', so long as I do the nominatin'." Apart from anti-smoking laws, all that has happened since is that check writers have taken over for ward heelers.
The primaries are now responsible for much of the evil in modern American politics, from apathy and lackluster political platforms to the power of money. We now take it for granted, almost as constitutional, in fact, that the race is much more likely to go to the richest than the worthiest. To gain access to party funding, a candidate has to first win a primary, and to do so needs to raise money as an individual. As we can see, this not only gives a head start to the Mike Bloombergs of this world, it also means that candidates begin their political life in hock to business interests.
Europeans are never sure whether to be amused or horrified at the role campaign contributions play in buying legislation in the U.S. In most other countries this would be considered criminal corruption and outright bribery, but the American convention is to assume that as long as the bribes are spent on political expenses rather than going into the candidates' pockets, all is well.
Primaries are flawed in principle as well as in effect, but Americans are so used to them that even the most radical tend to overlook just how bizarre and essentially undemocratic they are. In few other democracies are a party's candidates chosen by non-party members. In a sense, it makes a mockery of the secret ballot for voters to declare their party allegiances on the electoral registers, and in many countries it would be regarded as a shocking intrusion to have citizens' political opinions recorded publicly in this way.
While they are anomalous enough in the states where voters at least have to declare which party they support in order to participate, primaries reach the level of outright insanity in states with "open primaries," where supporters of one party can actually choose another's candidates. We saw the results of that recently when Cynthia McKinney was defeated in an open primary in Georgia by a combination of cross voting from Republicans and out-of-state money. When she was able to present herself in a later, general election, she won handsomely, demonstrating presumably how ineffective the primaries are at representing the intentions of the electorate as a whole.
In other democratic countries, the candidates are picked by party members who have paid dues and declared support for the party's principles. Of course, the association of party and principle seems a contradiction in terms to many disgruntled Americans, but maybe the primaries have had something to do with that as well.
Another direct consequence of this is that as far as the public is concerned, the Democrats will be leaderless until the primaries. There is no leader of the opposition, loyal or otherwise, in the American political system. In more developed parliamentary systems, the scores are settled right after an election. The losing party decides whether the leader of that party is worth another try, or whether to pick someone else quickly to lead the opposition back to power.
But in the U.S., the Democrats will be rudderless for most of the presidential term. Then, at the end, for a long and tedious year, the contending candidates will exhaust their wealth and the patience of potential supporters in trashing each other, so that the one with the most money and the least mire sticking to him emerges as the winning candidate, to be adopted at the content-free circus that passes for a party convention. If half the energy that went into opposing each other in the primaries went into the task of opposing the incumbent over his term of office, it would be a big step forward.
FACING THE FACTS—Americans often take some convincing that there is much wrong with their system, apart from the wrong people being elected. While the European monarchies were evolving, the American Republic became fossilized in its eighteenth-century form. The United States could benefit from a constitutional monarchy that no one cares very much about, and an established church that no one believes in; but sadly the Bush dynasty, beginning pre-Katrina, has shown many signs of developing into an unconstitutional de facto monarchy, with the White House controlling the legislators and the judges and the military every bit as firmly as George III ever did. And the U.S., for all the talk of separation of church and state, is increasingly intolerant in its religion. However, while you could live with an attenuated monarchy inherited and adapted, no rational person save Karl Rove would try to implement one from a standing start.
So, is there an easy way to bring the American political system into the twenty-first century? Sadly, probably not. Even the primaries, enshrined as they are in so many state legislatures, would take a long time to disentangle. However, the Plamegate affair does offer an unrivaled opportunity for the Democrats to stake out a position for the loyal opposition, and to establish the question of to what, or whom, loyalty is due. All too often, the Democrats have acted as if in their hearts they secretly believed that the Republicans were indeed the natural governing party of the United States in some metaphysical way.
Loyalty to the nation and its people now demands an exposure of the disloyalty of the governing party: its preparedness to lie and invent facts in order to procure a war that it has yet to explain adequately; its willingness to compromise national security to protect its lies; its confusion of loyalty to the Bush family and its cronies with loyalty to the country; all capped with a willingness to retaliate at once against any liberals who speak out.
In fact, it demands the application of European standards of political conduct, which, even if they are more often honored in the breach than the observance, would pay dividends for a revived American democracy that currently shows signs of ignoring decent standards altogether.
Tuesday, November 22, 2005
The Secret History of Rum
http://www.thenation.com/doc/20051205/secret_history_of_rum
Rum has always tended to favor and flavor rebellion, from the pirates and buccaneers of the seventeenth century to the American Revolution and onward. In addition, sugar and rum pretty much introduced globalization to a waiting world, tying together Europe, the Americas, Africa and the Caribbean in a complex alcoholic web of trade and credit. Not until oil was any single commodity so important for world trade. So it is not surprising that the Bacardi Corporation became one of the world's first transnationals.
Even before Fidel Castro took power, the Bacardi family moved its headquarters from its Cuban home to the Bahamas, allowing it to get British imperial trade preferences, while opening a large distillery in Puerto Rico to allow penetration of the American market. Now its management is mostly living in exile in Florida, monopolizing the local markets across the Caribbean and the world with its bland, branded spirit. Fifty years of marketing have made Bacardi almost synonymous with rum in much of North America, and as Thierry Gardère, maker of the acclaimed Haitian rum Barbancourt, pointed out with a pained expression to me once, "They always advertise it as mixed with something else."
In Prohibition-era America, lots of thirsty Americans went to Cuba, and what they drank there, in keeping with the ambience, was rum, usually in cocktails and often in bars favored by Fidel's onetime fishing partner, Ernest Hemingway. He made a clear distinction: "My mojito in La Bodeguita, my daiquiri in El Floridita."
Cuba made great rums and had some of the world's most renowned bars. Bacardi had really risen to prominence after the American occupation, or "liberation" (sound familiar?), of Cuba at the turn of the twentieth century, when the island became the playground for its northern neighbor. Bacardi built its market position during Prohibition, edging out the old New England rum. When the Eighteenth Amendment took force, Bacardi USA sold 60,000 shares, closed down the company and distributed its assets, coincidentally 60,000 cases of Bacardi rum, to the stockholders.
During the dry years the company's order books would suggest that there were unquenchable thirsts in Shanghai, the Bahamas and tiny islands like the French enclave of St. Pierre and Miquelon, off Newfoundland. But of course, shiploads of Bacardi went to rendezvous with the rum-runners just outside American territorial waters. As soon as repeal was in sight, Bacardi litigated all the way up to the Supreme Court to open its business in Puerto Rico, where it was eager to combine Caribbean costs with American nationality. Its rivals in Puerto Rico used the same style of targeted retrospective legislation that Bacardi later did against Castro's Cuba in an attempt to keep Bacardi out. In the first year after Prohibition, Bacardi sold almost a million bottles to the United States. But soon it was no longer selling from Cuba. Despite the family's overt and noisy Cuban patriotism, the company pioneered outsourcing and supplied the United States from Puerto Rico. Cuba's share of American rum imports dropped from 52 percent in 1935 to 7.3 percent in 1940.
In 1955 Bacardi moved its trademark to the Bahamas, perhaps in gratitude for the islands' help in keeping the product moving during Prohibition, and also because that made it eligible for British Commonwealth preferences. Its offshoring from Cuba proved very prescient when Castro nationalized the Cuban operations in 1960, which nevertheless came as a shock to Bacardi. The Bacardi building had greeted the arrival of Fidel, Che and the compañeros with a banner saying simply "Gracias, Fidel!" In common with some other rum producers, the family had supported the rebels financially. In 1959, Castro's trade delegation to the United States had included Juan Pépin Bosch and Daniel Bacardi, two of the family's heads. Neither side dwells on these happy days any more. The company is still held by 600 descendants of the founder, so it does not have to file financial statements or submit to valuations as if it were listed on a stock exchange, and in any case, with sales in 200 countries adding up to 200 million bottles, no one could be sure which stock exchange it would list on.
As its record shows, Bacardi is the original multinational. Its trademark is now held in Liechtenstein, one of the most secret and secure banking centers in the world, which contrives to be "offshore" in the middle of the Alps. However, while attending to business, the Bacardi family has never missed a chance to get its own back on Castro. Bacardi clan chief Juan Pépin Bosch brought a touch of the old connection between buccaneering and rum back to life in 1961 by buying a surplus US Air Force B-26 bomber in order to bomb a Cuban oil refinery. Later he was the money behind a plot to assassinate Castro. For many years Bosch was a major financier for the Cuban American lobby and a major litigator who brought the United States to the verge of trade wars with the rest of the world. The technique has been to lobby legislators to exercise their anti-Cuban prejudices, regardless of general principles of international or indeed domestic law, and then to pay lawyers to implement the resulting legislation.
Bacardi was spurred into action when Castro's government went into partnership with the French liquor giant, Pernod Ricard, to market the renowned Havana Club internationally. Even though excluded from the US market by the embargo, Pernod was able to sell 38 million bottles of Havana Club in the first few years. In anticipation of an end to the Cuban embargo, it was gearing up for big sales in the United States. This was a challenge both political and commercial to Bacardi, which set to firing retaliatory legal broadsides and to the rediscovery of its Cuban roots.
Bacardi, wherever it is made, had for some decades tried to bury its Cuban origins, but in the 1990s it went into reverse. Its labels began to mention prominently that the company was founded in Santiago de Cuba in 1862 while eliding any mention of where the rum is actually made now. In 1998, "rum and Coke" or "Bacardi and Coke" suddenly became known as a Cuba Libre again. To match the myth, various stories were circulated claiming that the Cuba Libre had been invented by an American in 1898 to celebrate the American victory over the Spanish in Cuba.
The original makers of Havana Club, the Arechabala family, had fled the country after the Revolution, leaving the distillery and the brand behind. The family did not renew its trademark, which lapsed in 1973, and in 1976, the Cuban state export company registered the century-old brand with the US Patent and Trademark Office. Twenty years later, Bacardi sought out the Arechabala family members and bought out whatever suing rights they may have had. Reportedly, Bacardi paid them $1.25 million after the family had spurned offers from Pernod Ricard, which was attempting to cover its back. Bacardi, happy to tweak Fidel's beard, began selling a rum with the Havana Club label (made in the Bahamas) in the United States in 1995, and Pernod sued. The case was going in Pernod's favor, as the Manhattan judge initially made her rulings based on existing law. Then the Bacardi family cut the Gordian knot. Using political clout in Florida, it got the law changed by persuading lawmakers to smuggle a clause into a large spending bill specifically to exempt trademarks nationalized by the Cubans from the usual international protections unless the original owner had agreed to hand them over. And of course, the Arechabalas had not.
In the end, the judge broke new legal ground by accepting this retrospective and clearly privileged legislation as binding, since Pernod wanted an injunction against future use of its trademark. Judge Shira Scheindlin decided: "At this point, because plaintiffs can sell no product in this country and may not be so able for a significant length of time, they suffer no impairment of their ability to compete as a result of defendants' actions. Any competitive injury plaintiffs will suffer based upon their intent to enter the U.S. market once the embargo is lifted is simply too remote and uncertain to provide them with standing."
It was yet another case of the United States flouting treaties and international law, and the judgment is not recognized anywhere else in the world--a point emphasized by the World Trade Organization shortly afterward.
Even so, the US patent office threw out Bacardi's attempt to register other names containing Havana, because the company was claiming a spurious connection to Havana, which could have confused drinkers who thought they were buying rum from Cuba.
When Pernod pushed the European Union into filing a dispute with the WTO, Bacardi complained, in a manner that almost defines the term "disingenuous" coming from a family that had just secured private legislation: "Pernod Ricard has pressured the EU into filing a claim with the WTO in an attempt to politicize a purely civil dispute. Bacardi views this as a private civil matter and one that is not connected in any way to world trade laws or the WTO." Others begged to differ, not least when Castro announced that Cuba could abrogate US trademarks, such as Coca-Cola, in retaliation. The WTO itself found in 2001 that the American law violated free-trade agreements, and the US trademark office has refused to revoke Pernod's registration despite even more litigation and lobbying by Bacardi, helped by alleged illegal campaign contributions to Congressman Tom DeLay, yet another politician who might be laid low by the demon rum.
Perhaps the ultimate weapon was used when Castro threatened in 2001 to start producing a rum in Cuba called Bacardi. The US State Department, not good at seeing itself as others see it, promptly declared this to be a provocation. In the meantime, the European Union has effectively been bullied into taking no action to enforce the case it has won at the WTO. Castro himself has an occasional talent for expediency. One of the first winds of change that he got from the Soviet Union was when Mikhail Gorbachev cut back imports of Cuban rum as part of his anti-booze campaign. In 1999 the Cuban leader, who had already given up the trademark cigars that regularly put him on the cover of Cigar Aficionado magazine, went one step further; he urged Cubans to give up rum as well and warned that anyone who wanted rum over the New Year "will pay dearly for it." He asked an assembly of medical students, "How much damage has rum caused in any society?" He even lamented that there were "supporters of the revolution who like to toss down a few once in a while." Cynics assumed that the supplies for the growing export market for Cuban rum were threatened by domestic demand.
While Fidelistas may berate Bacardi for its feud with Havana Club, rum aficionados almost universally deplore the company for the effect it has had on rum. Gresham's law observes that bad money drives out good; Bacardi has achieved the same with rum. Its bland ubiquity has been driving the distinctive rums of the world from the mass consumer market. It is the equivalent of American cheddar driving out the 300 cheeses of France. Its monopoly power has been used to keep much better, genuinely local Caribbean brands from reaching takeoff. The islands cannot compete with subsidized and tariff-protected high-fructose corn syrup and Floridian sugar grown by former Cuban barons, so their one chance to market a value-added branded commodity is frustrated by the transglobal black bat.
Republicans used to inveigh against the Democrats as the party of "Rum, Romanism and Rebellion," but now that Bacardi has the GOP in its pocket, it symbolizes the complete turnaround of political positions.