Tuesday, May 31, 2005


I doubt that any cliché has been used more frequently to undermine ethical reflection than this one: "Who’s to say what’s right or wrong?" No discussion of morality can proceed very far before someone empties his philosophical quiver and unleashes this sophomoric rhetorical challenge.

The unstated implication behind the question is that ethical principles are nothing more than personal tastes. Given this assumption, the declaration that one type of behavior is better than another becomes equivalent to announcing that steak is good and broccoli bad. All ethical judgments are thus reduced to arbitrary acts of imposition. A thing is labeled ‘bad’ only because some high-handed dude has declared that it is.

It is worth noting that no one employs this shopworn challenge in the arena of science--because within that discipline it is assumed that some answers are closer to the truth than others. Even in history and sociology, most folks would dismiss "Who’s to say" challenges with a sneer. The obvious response is that "evidence" and "coherence" are the standards for deciding whether a proposition is true or false--not the declaration of some disciplinary czar vested with absolute authority to impose his will on less-influential practitioners.

When it comes to ethics, however, all these considerations are ignored--as if everyone knows that no such thing as ethical evidence exists. Such an idea would come as a shock to Plato, Aristotle, and Aquinas--as well as to more recent thinkers like Kant or John Stuart Mill. Heck, even Karl Marx spent decades cooking his economic books in order to convince readers that Communism was both morally superior and historically inevitable.

In addition to ignoring the possibility that ethical evidence exists, those who fling the "Who’s to say" wrench into the cogs of ethical reflection also ignore something else--namely, that their rhetorical challenge presupposes the existence of a moral standard that it simultaneously denies.

The "Who’s to say" argument is only effective because it makes any possible answer to the question morally indefensible--since nothing is true simply because so and so says it is. But if nothing can be declared immoral, then there is no basis for rejecting the arbitrary imposition of one person’s values on another. Put succinctly, the "Who’s to say" argument presupposes two contradictory ideas: 1) All moral statements are arbitrary impositions. 2) Arbitrary impositions are immoral.

The clichéd interrogative requires the very morality it denies. And then it employs that sliver of moral indignation to undercut serious moral reflection. The fact that "immoralists" are reduced to blatant self-contradiction in their most popular logical offensive is a bit of evidence worth pondering. The fact that most people don’t recognize this logical con-game is a tribute to the intellectual and moral sloth of a culture that would rather let sleeping consciences lie.

Sunday, May 15, 2005


"European morality" is a phrase that should rival "must-see TV" and "government budget" in the Oxymoron Hall of Fame. Yet among American elites there seems to be a consensus that Continental mores are significantly more advanced than those practiced in the United States. This novel assumption raises several questions:

Have circumstances changed so drastically from the time when America’s Founding Fathers looked askance at European decadence and saw in their new nation a city that would serve as a beacon to the rest of the world? More to the point, has Europe changed that much in the last five decades? After all, the continent held in such high esteem by many of today’s politicians and mainstream media types didn’t exactly have a century to write home about.

First there was World War I--a colossal stupidity whose causes are shrouded in national aspiration, colonial competition, and a prolonged history of diplomatic gamesmanship. Only U.S. intervention succeeded in dragging the Euros out of their self-destructive trenches.

Then there was the fascist era--with Mussolini in Italy, Franco in Spain, and Adolf in Deutschland. It must have been a completely different breed of European that cheered il Duce’s invasion of Ethiopia (visions of imperial Rome dancing in their heads) or who dutifully carried out the Führer’s solution to the continent’s "Jewish Problem."

During that period the other side of the political fence was largely defined by two words: "Munich" and "Czechoslovakia." Again, it was American intervention, along with British tenacity, that combined to save the continent from domination by Hitler or his double-crossed Soviet pal, Joe Stalin.

Today’s European apples must have fallen far from that ancestral tree to merit the admiration they are accorded in the minds of America’s literati. But the closer I look, the more I see the same emptiness in the states of "Old Europe."

Germany, for example, legalized prostitution a couple of years ago--a sign of decadence reminiscent of the Weimar Republic. Elsewhere on the forensic front, a German court recently sentenced a man who killed and ate a voluntary victim to eight-and-a-half years behind bars. (With good behavior this Hannibal Lecterisch connoisseur could be hawking a rare book of Rhineland recipes by the end of the decade.)

Economically, Deutschland’s unemployment rate has grown to 12% amid a population decline projected to reach 10% by 2050. Both these trends spell disaster for generously-funded welfare programs and suggest less-than-vigorous health, culturally speaking.

Next door in the Netherlands, legislators are falling over themselves to draft more inclusive standards for killing sick, old, and depressed folk--with or without the patient’s permission. The protocols of the University Hospital in Groningen represent the cutting edge of progressive thought on this topic--ideas so reminiscent of Nazi justifications for euthanasia, they are verboten in Germany.

Meanwhile, in Sweden (bastion of neutrality in the fight against Hitler) illegitimacy has become the norm. Presently, over half of that country’s babies have parents who find the institution of marriage anachronistic.

Given these considerations, it seems likely that what appeals to Europhiles isn’t the region’s outstanding ethical record but rather its long-cultivated lack of moral conviction. It’s not that the leopard has changed its spots, but rather that American elites now identify with what most Continental intellectuals stood for throughout the last century--accommodation, acquiescence, and moral lassitude.

Those on the other side of this "sophistication" argument can point to refined aesthetic sensibilities and copious cultural treasures that grace the land mass north of the Mediterranean. They can even tout a collective monetary unit that has appreciated in value over the last few years. The central question, however, is whether these facts represent more than a cluster of attractive specimens amid a moribund species.

My belief is that no European ethical epiphany took place in the last half of the twentieth century--an assessment whose value should become clearer as religiously-motivated immigrants increasingly replace a dwindling population of secularists whose utilitarian ideals inspire neither courage nor admiration.

Monday, May 09, 2005


"You can’t give respect unless you get it first from others." That’s the tough-guy philosophy articulated by a young man struggling to square some standard of decency with the rap lyrics that he and his peers invest with such authority.

I asked the adolescent standing in the school hallway to envision a room populated with individuals who embraced the sentiments he had expressed--and then to describe the verbal exchanges occurring there. When no answer was forthcoming, I supplied him with my own scenario: "You would have a room full of people all demanding that the other guy respect him first--‘You respect me.’ ‘No, you respect me!’"

The slogan seemed plausible enough at first blush, but when push came to shove, the dictum turned out to be little more than an unqualified demand for submission--a means for distinguishing top-dogs from their home-boy underlings. The latter give respect without getting it first. The former get respect by demanding it as a non-negotiable precondition. The reciprocity that forms the core of polite social intercourse is here illusory. In its place stands a hierarchy of intimidation.

This gangsta philosophy turns the traditional understanding of respect on its head. According to that discarded rule, one doesn’t merit respect without first extending it. Respect for oneself is earned based on admirable behavior. Respect for others, by contrast, is considered the default position for interpersonal relations--not an option contingent on prior recognition. By following these rules one creates a room populated by folks whose actions are the polar opposite of those presumed in the prior experiment--a room long on courtesy and short on attitude.

The new rapper rules for respect are popular because they provide a ready-made excuse for in-your-face behavior. If old fogies don’t express appreciation for anti-social slogans, low-rider jeans, and grotesque brow-piercings, "play’rs" are automatically given grounds for dissing these disapproving adults. Only those who kowtow to countercultural inclinations merit placement in the "respectable" category. Those consigned to the opposite bin have only themselves to blame.

"Me-firsters" have no obligation to defer to society--and presumably nothing to learn from it. Instead, society has an obligation to do obeisance to their dubious insights--or suffer the consequences! This "attitude" that transforms the general obligation to act courteously into a jealously guarded prerogative isn't a formula designed to restrain youngsters from swinging bats at the heads of immature taunters.

Thanks to an industry whose corruption seems bottomless, the hard and desperate language of the street now springs glibly from the lips of impressionable youngsters--and passes for wisdom. It is a mindset promoted by entertainers and athletes whose bravado is inversely proportionate to the moral standing they have earned--a philosophy rooted in, and designed to perpetuate, failure.

Monday, May 02, 2005


There’s been no honeymoon for a pope correctly perceived as too Catholic for the taste of secular critics. Instead, a wave of preemptive vilification has greeted the "archconservative" pontiff.

"God’s Rottweiler" is a phrase that expresses the antipathy felt toward Cardinal Ratzinger by those who gleefully point to his stint in the Hitler Youth. Such journalists fail to note that membership in the Nazi organization was mandatory and that, at the ripe age of eighteen, the future "Papa-Ratzi" deserted from the German army into which he had been drafted.

In addition to these youthful "offenses," Ratzinger’s stint as prefect of the Congregation for the Doctrine of the Faith earned him the moniker "God’s Inquisitor." (Ironically, this stigma comes from a class supremely skilled at destroying political opponents. Were Benedict XVI as bad as his press, one might think a degree of professional courtesy would be extended by these practitioners of high-tech lynching.)

What makes Benedict even more odious to critics is the address he gave to his peers before ascending to the papacy. In that talk the German cardinal took aim at a dogma to which his secular opponents pledge fervent allegiance--the "dictatorship of relativism."

This oxymoronic label aims at relativism’s hidden duplicity--a duplicity suggested by the insults directed at a man who, by all first-hand accounts I have read, is a model of patience and self-effacement. Though tolerance appears to be relativism’s logical corollary, the linkage is illusory.

Behind relativism’s easy-going facade stands a rabid rejection of moral absolutes. Truth, relativists say, is relative--but relative to what? The honest response in most cases, "relative to me," explains the hostility directed by elites toward influential expressions of orthodoxy. It also explains why the statement "there is no wrong answer" has become so popular in recent years--especially in discourse about abortion, child-rearing, or "alternative lifestyles."

Thomas Hobbes gave serious thought to the shape of a world where egos rule and concluded that a political Leviathan was required to bring order to an otherwise brutish and short existence. Relativists aren’t far behind the English philosopher--except when it comes to candor.

A world where egos rule is, more accurately, a world where some egos rule over others. And the guiding principle of this governance must be the "gut instincts" to which those power brokers give deference. In a world devoid of absolutes, these instincts increasingly coincide with narrow self-interest. (Relativists, it should be recalled, need not and do not defer to majority opinion when the latter fails to echo their own "nuanced" visions of "truth".)

St. Augustine said that in his youth he yearned to become a law unto himself--to reject any authority outside his own will. This "I-did-it-my-way" cultural complement of relativism smacks more of the "will to power" than of tolerance. The same egocentric impulse generates venom toward Pope Benedict--a man capable of defending the notion that moral truth is more than putty placed in the hands of gurus who pontificate on God-knows-what from their mansions in Malibu.

A Pope who used the spiritual force of his office to bring down political totalitarianism in Eastern Europe has been succeeded by another who targets a different form of despotism. The "Rottweilers of Relativism" despise Benedict XVI in much the same way KGB operatives feared John Paul II. Dictators, whether political or cultural, hate answering to anyone except themselves.