When did "borrr-ring" become the universal term of disdain for all that young people find not easily attainable? The answer is when these same young people began to be fed a steady diet of "entertainment" so that their relationship to the world was defined as that of a critic to a theatrical performance. For recent generations deprived of economic hardship, the world has become a stage upon which a series of actors strut their stuff in order to obtain the audience's approval.
I can't ever remember employing the word "boring" as a youngster, and I don't think my recollection is the product of nostalgic fantasy. Sure, certain tasks were unpleasant and tedious, but the idea that tedium was avoidable, or even particularly bad, was not part of the world within which I was raised. "It has to be done" was a common statement concerning such tasks--sometimes even, "It's good for you."
With the prodigious expansion of an adolescent market in the sixties and seventies, a huge cohort of commercial marketeers began pandering to adolescent and pre-pubescent desires. The unreasonableness of these desires did not matter to this new generation of merchants who embraced the profitable idea that "ethics are up to the individual"--an idea that conveniently absolved them of moral responsibility. What mattered was how their target audience felt and that these immature and fanciful adolescent feelings could be manipulated for monetary ends. Advertisers thus said whatever adolescents wanted them to say in order to please these fledgling consumers and to sell products.
Consequently, just as the target audience wished, advertised solutions to problems became "quick and easy." Responsibilities and desires conveniently melted together while discipline and sacrifice were relegated to the status of poor market motivators. "Let me entertain you" became the constant refrain of record companies, TV shows, and innumerable products that did not march under the other commercially feasible banner: "You need this to get what you want."
Increasingly, kids (and even adults) bombarded with the emotional nonsense emanating from their own spleens began to take these messages to heart--among them the equation of importance with fun. What didn't excite was, so the advertisers said, presumptively unworthy. The fault was not in oneself--perish such an unmarketable thought--but in the activity or its manner of presentation. After all, the consumer and his feelings were always right. As a consequence of such messages, it became necessary to avoid or to misrepresent projects that failed to get the emotional juices stirring quickly: reading Tolstoy, practicing the piano, or visiting Aunt Harriet in the hospital.
The irony of this concession to fantasy was that it effectively alienated consumers from life's most rewarding activities--activities that almost invariably arrive in packages labeled "difficult" and "boring." The double irony was that it succeeded in creating a huge crop of couch potatoes caught in the banality of consumption and enslaved to the winds of fashion--persons whose identity was reduced to the biological process of assimilation. What could be more boring!
Thus, thanks to the continuous pandering of mass marketeers, an unavoidable biological function took on the status of an honored class. Laws and policies began to be judged largely in terms of how they impacted this elite but all-inclusive group. Any proposal which harmed "consumers" was deemed oppressive. Moreover, in order to protect the rights of this special set of citizens, television stations began to employ "consumer advocates" who possessed, by virtue of their title, the moral status of Robin Hood. Throughout the nation publications sprang up to guide consumers into the paths of mercantile happiness. Meanwhile, a "Consumer Price Index" kept members of this prestigious group abreast of the consumptive activity they could reasonably expect to indulge in.
Nowadays, "Born to Shop" has become a common declaration shamelessly attached to license plates. Similarly, "You, the consumer" has become the favorite salutation of media-priests who breathlessly anticipate the size of that annual consumptive orgy celebrating the birth of the mendicant from Galilee--an orgy upon which so much of our nation's economic welfare rests. Not citizens, neighbors, or homo sapiens are we--but "consumers."
The country's singular focus on the production and consumption of goods has seldom been captured more poignantly than in the following passage penned, ironically, by Harvard economist John Kenneth Galbraith:
"In the autumn of 1954, during the Congressional elections of that year, the Republicans replied to Democratic attacks on their stewardship by arguing that this was the second best year in history. It was not, in all respects, a happy defense. Many promptly said that second best was not good enough--certainly not for Americans. But no person in either party showed the slightest disposition to challenge the standard by which it was decided that one year was better than another. Nor was it felt that any explanation was required. No one would be so eccentric as to suppose that second best meant second best in the progress of the arts and the sciences. No one would assume that it referred to health, education, or the battle against juvenile delinquency. There was no suggestion that a better or poorer year was one in which the chances for survival amidst the radioactive furniture of the world had increased or diminished. . . . [N]o one was moved to suppose that the year in question was the second best as measured by the number of people who had found enduring spiritual solace. Second best could mean only one thing--that the production of goods and services was the second highest in history."
Our language, our bumper stickers, and our most cherished indices all indicate that consumption is now considered the golden road to happiness. The more elaborate and stylish our cars, homes, clothes, meals, and amusements--the happier we believe we will be. This obsession reflects the apotheosis of the alimentary--making a god of our gullets.
Titles show us what we think of ourselves, and the term "consumer" conjures up no image so clearly as an obese epicure stuffing his mouth with food--a human disposal system that devours everything within reach of its black esophageal hole. But are we, in truth, homo consumerus? Will we fulfill our natures and achieve bliss by sating our appetites for material goods and amusements? I don't think so.
Consumers are conspicuous by their absence from the list of those we have enshrined in our collective memory for future generations to honor and emulate. Our hearts and history testify, even if advertisers proclaim the opposite, that a life devoted to consumption is base--that the pursuit of things befits humans about as well as a cage is suitable for a golden eagle. The disgust we feel after weeks of self-indulgence (or two consecutive hours of televised drivel) is a feeble, bourgeois echo of the profound ennui that enervated the decadent nobility of Imperial Rome.
Birds are creatures of flight, dolphins ply the seas, and humans are such by virtue of their capacity to think, to act, to love, and to express in symbols the nature of their experience. Humans are human because they can choose to act in accord with their deepest intuitions of what is good, beautiful, and true, or they can choose to remain in a yawning, consumptive darkness.
Thursday, December 01, 2005
SPADER v MADISON: A TALE OF TWO JAMESES
"If individuals be not influenced by moral principles; it is in vain to look for public virtue." --James Madison
James Madison never met James Spader. More precisely, he never met Spader’s metrosexual character, Alan Shore—the silver-tongued lawyer from ABC’s “Boston Legal” who almost makes Phil Donahue look macho. Shore is the political mouthpiece of writer-producer David Kelley. As such, he performs the dramatic task of bashing the Bush Administration and corporate villains via eccentric and riveting courtroom soliloquies. One week the target is farm-spawned salmon, the next week the war in Iraq, and regularly the apocalyptic effects of global warming.
The utter rectitude of Shore’s public causes is never in doubt. Any fool can see where the truth lies and can see that opposing counsel represents the forces of greed and stupidity. This clarity, of course, arises from the fact that a single advocate creates all the arguments.
Accordingly, Shore comes off as Mr. Smith, Atticus Finch, and Oliver Stone’s Jim Garrison all rolled into one. Indeed, in Kelley’s managed courtroom, if Shore ever got in a legal skirmish over Mother Teresa’s legacy, the deceased nun would come off looking like a money-hustling tart whose atavistic faith was the bane of Calcutta. (Perhaps religiophobe Christopher Hitchens could be a script consultant for that episode.)
But what of Alan Shore the individual? What kind of person is he?
The answer is that Spader’s character is a skirt-chasing boor whose hyperactive legal briefs are constantly getting him in Dutch with associates—a man whose disdain for the proprieties of everyday life matches his inability to establish enduring relationships. This emotional transience was illustrated in one episode by Shore’s taking up “permanent” residence in a hotel. Put bluntly, Alan Shore is Bill Clinton without the political veneer.
This utter disjuncture between the public and private spheres of a single life would puzzle Madison, who saw public virtue as an outgrowth of personal moral rectitude—the former being impossible without the latter. Yet the separation of individual character and public policy stands at the heart of David Kelley’s dramatic agitprop.
Far from being ignored, personal morality is ridiculed in Kelley’s legal fantasyland, where ministers come off as hypocritical lechers and their congregations consist of busybodies, homicidal prudes, or deacons who have sexual relations with cows. This is a convenient strategy, since it makes the foibles of Kelley’s legal hero seem trivial by comparison. Alan Shore’s decadence is at least sincere and aboveboard.
It should come as no surprise that this schizophrenic portrait mirrors the way Tinseltown itself touts political dogma over personal rectitude. As is the case with their ideological stand-ins at the law firm of Crane, Poole, and Schmidt, political correctness provides a convenient form of absolution for self-indulgence. Instead of engaging in the rigors of penance and parenthood, all it takes to boost one’s moral standing is the perpetual public recitation of a PC rosary.
It’s a pleasant script, but it doesn’t comport with the reality Madison described. Salvation doesn’t come from above—from Washington. Nor will enlightened social policies emerge from debauched individuals who ridicule the very notion of personal virtue. A staged world where public welfare is divorced from private rectitude bears as much likeness to the actual world as Boston Legal does to the true practice of law.
Sunday, November 13, 2005
THE VAST LEFT WING CONSPIRACY by Byron York
One wonders if it would be possible for a journalist on the left wing of the political spectrum to write a book describing the role of prominent conservatives in the 2004 Presidential election using the same tone employed by National Review’s White House correspondent, Byron York. Judging from the rhetoric calmly catalogued in The Vast Left Wing Conspiracy, many a leftist author would have difficulty avoiding lurid references to Adolf Hitler or the Spanish Inquisition when portraying prominent players on the Right. There’s no telling what epithets might fly if that same correspondent were dealing with a Republican financier who had just spent 27 million dollars to buy a Presidential election--an unprecedented figure that, even in inflation-adjusted terms, dwarfs W. Clement Stone's 2 million dollar contribution to Richard Nixon in 1972.
Fortunately for our imaginary scrivener, George Soros was on his side--a partisan fact that seems to erase concerns about big money’s corrupting political influence. Soros is, indeed, the name that pervades York’s level-headed account of the movers and shakers who established a clutch of organizations intent on defeating George W. Bush in 2004. But the Hungarian-born billionaire wasn’t alone in funding this loosely coordinated infrastructure that answered Slate magazine’s prayer for a left-wing version of Hillary’s hypothetical cabal from the right. As York notes, money from only four sources totaled more than the 75 million dollars in federal funds given to each of the two major campaigns. Besides Soros, this quartet included Soros’s friend Peter Lewis, chairman of Progressive Insurance ($24,000,000); Hollywood magnate Stephen Bing ($14,000,000); and Mr. and Mrs. Herbert Sandler, founders of Golden West Financial Corporation, who chipped in an additional $13,000,000. Not surprisingly, Herb and Marion were also friends of George S.
According to York, a total of 230 million dollars was donated to supposedly non-partisan 527 organizations on the Left--almost two and a half times the sum raised by Republican counterparts. The most lavishly funded of these 527s, America Coming Together, took in $200 million in 2003 and 2004--one-tenth of it supplied by Soros. In analyzing this cash cow, York notes the absurdity of ACT’s election-year declaration that 98% of its funds were being spent on activities other than partisan Presidential politics--an accounting fantasy rooted in ACT’s dual role as a Political Action Committee, to which strict donation limits apply. Similar duplicity characterized John Podesta’s think tank, the Center for American Progress--an institution whose “scholarly” endeavors amounted to a series of political hit pieces directed at the President.
Besides focusing on dollar figures and campaign finance laws, York offers insights into the political and intellectual milieu of these emergent power brokers--starting with Joan Blades and Wes Boyd, the couple who founded MoveOn.org when their anger over Bill Clinton’s possible impeachment was echoed by fellow diners at a Chinese restaurant near Berkeley. The tendency to confuse the views of like-minded associates with those of mainstream America frequently plagued these aspiring operatives. This myopia doubtless contributed to Blades’s June 2004 assertion that Kerry would not only win the upcoming election--but win in a landslide.
More intriguing is York’s tantalizing portrait of George Soros--a man whose wealth and influence far exceed his capacity to articulate a theory that links market bubbles to the rise and fall of empires. No one, it seems, finds Soros’s paradigm comprehensible--much less persuasive. As with Blades and Boyd, the key to the multi-billionaire’s response to 9/11 is greater sensitivity: “...the Bush administration must realize we have to be concerned about the reactions of others.” In sum, York’s sketch suggests a man who, being the object of countless solicitations, vastly overestimates his philosophical assets.
York’s analysis of another “conspirator,” Michael Moore, is largely devoted to the filmmaker’s bogus claim of Red-state success. According to York, the following facts about Fahrenheit 9/11 went unreported in 2004: The movie underperformed significantly in Red-state markets; only eight cities accounted for 44% of box-office receipts; of those eight cities, only one lay outside of Blue-state America--Toronto. Indeed, York notes that the movie was popular throughout Canada--a region which accounted for considerably more of the film’s take than it did of John Kerry’s electoral vote total. Amid discussions of Moore’s audience, York employs a few pages to debunk misrepresentations in the “documentary” itself.
Al Franken is the best known of the lesser media lights York discusses in a chapter devoted to Air America. This radio arm of the elect-Kerry cabal initially aimed at a centrist audience--but, like its hypersensitive star, didn’t follow through on the blueprint. A final chapter inspects the fever swamp inhabited by NYU Professor Mark Crispin Miller. While Miller’s theocratic fantasy, A Patriot Act, was, at best, a fringe cultural event, its over-the-top terminology has become common in post-election Democrat rhetoric. Even Tim Russert now feels free to use the T-word in his Meet the Press interrogations.
York’s conclusions about The Vast Left Wing Conspiracy are a mixed bag. Obviously, the (tongue-partly-in-cheek) conspiracy failed in its primary mission, to defeat the President--but it failed by only 3 million votes. Moreover, most of these new organizations were founded shortly after the disastrous (for the Left) 2002 Congressional elections. Thus, future efforts are likely to become more efficient and less inclined to mistake choir-preaching for effective outreach. Of course, a huge amount (pun intended) depends on the political whims of George Soros and company--without whose funds many of these organizations would never have seen the light of day.
In brief, The Vast Left Wing Conspiracy is a hugely informative work that puts in perspective the meetings, the minds, and, above all, the money involved in a massive effort to defeat George W. Bush in 2004. The enduring institutional legacies of that effort betoken future campaigns whose dynamics and results might not be described with such admirable equipoise by Mr. York.
Monday, November 07, 2005
OUTLINE FOR A GOOD SOCIETY
What kind of society, put positively, would you like to see? It’s a thought-provoking question posed by a commenter on my blog who observed that my criticism outweighed my constructive suggestions. So, taking up the challenge, allow me to outline my vision of that shining city set on a hill.
First and foremost, I yearn for a country where the mavens of mass communication take their social responsibility seriously—where music makers, television bigwigs, and movie moguls treat their product as they would if they lived among the families that consume their offerings. The depravity that characterizes productions distributed to anonymous consumers could not withstand the shame that would accompany daily encounters with the mothers, fathers, and kids who live across the street.
I dream of a nation where integrity is honored more than celebrity—a land where commencement addresses are given overwhelmingly by individuals of exceptional character, not by entertainers who lack significant moral or intellectual credentials.
I wish for a society where scandal and notoriety are shameful burdens, not precursors to a profitable book deal. I seek a society where kindness and duty are habitually emphasized—where respect is earned through exemplary behavior, not extracted by cocky demands for deference backed by physical intimidation.
I long for a community that exhibits profound gratitude for its blessings and that realizes its vast bounty rests on an inherited foundation of discipline and religious conviction—a country that knows its greatest challenges arise from frenetic acquisitiveness and a lust for power, not the much-proclaimed scarcity of resources.
I wish for an America where thankfulness is more deeply ingrained than a sense of entitlement—a land where “thank you” replaces “where’s mine?” as the baseline of popular sentiment. And I wish for governments whose size and priorities mirror this attitude of self-reliance and gratitude.
I long for a country that cherishes visual and aural beauty—that embraces silence and meditation as food for the spirit. I long for an environment that nourishes the better angels of our nature rather than constantly feeding the monsters of rebellion and instant gratification.
I seek a society where sex isn’t a recreational sport—where the union of two people means the union of spirits under a canopy of sacredness. I yearn for the day when children aren’t reduced to inconvenient burdens—a day when “the best interests of the child” is the parental norm, not a vacuous legal phrase.
I wish for a nation in which taking responsibility is the presumptive attitude—a nation where blaming parents and society for failure is a proposition greeted with suspicion. And I wish for leaders who embrace the idea that diligence will be rewarded and who reject the enervating idea that malevolent political forces make personal virtue pointless.
I long for a country where the mantra “Graduate, work, marry” replaces the clichéd and self-defeating chants of victim groups.
I wish for news professionals who treat their task as a public trust—who prize perspective over sensationalism. I seek journalists who present information with dignity—not like disaster-hawking carnival barkers.
I look for schools where teachers can enforce meaningful standards of dress and deportment without legal hectoring—standards that mirror those promulgated at home.
I wish for a public square that honors the religious parts of our heritage alongside ideas that originate from other sources.
I long for a country where childhood innocence is protected and vile language is viewed as a sign of degeneracy. I seek a nation where gangsta rap has become a risible musical genre.
I want a video lineup where Jack Paar is the late-night norm, not Leno or Letterman.
I want a land where parents and media moguls are largely on the same page—both emphasizing the importance of courage, moral integrity, and temperance.
I long for a society where “We stand on the shoulders of giants” is again a common aphorism—a society where character counts for more than fashion and where wisdom is honored more than pushing the envelope.
I want to live in a community where free expression doesn’t mean being vulgar with impunity and where free speech actually contributes to the state of the union.
I wish to live in a country where benevolence trumps cynicism, where family devotion minimizes alienation, and where inner peace outweighs the restless pursuit of fame and fortune.
Note that my terms are not absolutist. I do not look for a utopia—a world where poverty, isolation, pain, and obscenity are abolished. I seek, instead, a society that does more good than harm—a community that, on the whole, lifts us up instead of dragging us down.
Unfortunately, I do not see among the people or their leaders the will or the insight that would permit much progress toward these goals. Instead, inertia born of the mindless pursuit of power, sensual pleasure, and wealth continues to push us toward the precipice. If I had to bet between renewal and decline, I’d put my chips on the latter.
Monday, October 31, 2005
"THE GREAT RAID" AND HEROISM
“The Great Raid” portrays an America that no longer exists. More precisely, the cinematic re-creation of the operation that rescued over 500 POWs from their Japanese captors in the Philippines presents a nation where military service and heroism were sincerely honored--a country where flag-waving patriotism was a sentiment that didn’t require a litany of lawyerly qualifications to fend off accusations of ethnocentricity. That attitude now characterizes only a subset of the population.
Triumphs of courage and military planning such as are celebrated in “The Great Raid” have doubtless occurred in Iraq and Afghanistan over the last three years. But they are largely ignored by the mainstream media--or buried beneath an avalanche of criticism focused on strategic and diplomatic blunders. Faux “tributes to the fallen” replace reports that pay homage to heroism within a specific mission. Such commemorative moments rip acts of sacrifice from their military context in order to utilize that blood for political ends that most of the fallen would despise.
The unstated subtext of these honor-segments is that no military mission is really worth dying for. Why else would these “tributes” omit significant reference to the objectives for which these soldiers gave the last full measure of devotion? Why else would gripping stories of individual and unit heroism be shunned? Why else would like-minded groups protest government offers to engrave on cemetery markers the name of the fallen warrior’s military operation? The implication is clear: Promising lives were wasted in battles where, in Matthew Arnold’s words, “ignorant armies clash by night.”
Well-coiffed journalists, actors, and scribblers are, of course, passionately devoted to worthy causes--but their causes are devoid of serious risk. Indeed, almost all translate into career enhancements. Only actors who praise martial achievements are likely to suffer reprisals within an industry whose grandees regularly belittle acts of heroism as moronic or delusional. Friends of the Earth, by contrast, need not fear incoming fire from petroleum interests who favor drilling in ANWR. Nor are anti-tobacco militants endangered by the foes who produce Philip Morris’s nicotine delivery systems. And animal rights activists stand in far less danger of physical harm than their adversaries who trade in furs or employ critters in medical tests.
By contrast, enemies in “The Great Raid” are real, powerful, and brutal. Engaging in combat with these foes required more than a savvy PR agent or a simplistic slogan. It required courage, training, and a willingness to risk everything for the sake of comrades and country. Moreover, the country in which these dedicated soldiers lived acknowledged these facts and didn’t transform monstrous enemy acts into occasions for sympathetic psychoanalysis.
Today’s armchair generals are unwilling to come to grips with this basic fact: Heroism is necessary for the survival of a democratic and just society. Consequently, they also fail to recognize that our foes are frequently implacable and powerful. They cherish, instead, the illusion that withdrawal, subsidies, apologies, and diplomacy can make dangerous people go away--that political correctness can substitute for courage. They have no understanding of what the commander of the Cabanatuan raid said to his troops before setting out on their mission--words about deeds that would define their understanding of themselves for the rest of their lives.
Perhaps these critics sense within themselves the absence of the stuff it takes to face death while accomplishing a great task. That deficiency is no crime. The crime is failing to honor what distinguishes the courageous few from the rest of us and pretending that heroic sacrifice is a needless waste--pretending, in fact, that we who live soft lives devoid of danger are the true guardians of freedom. The crime is in not acknowledging greatness of spirit when it stares us in the face.
Thursday, October 13, 2005
FREAKONOMICS--HIPSTERS DO ECON
John Kenneth Galbraith once said that economics had never quite been able to shed its “dismal science” label because it was never quite undeserved. Despite a devil-may-care approach, Freakonomics does little to improve the discipline’s tarnished image. Indeed, the book’s macabre, cost-benefit abortion ratios vie with Thomas Malthus’s starvation-population equation for primacy in the economics Hall of Shame.
Authored by University of Chicago pop-economist Steven Levitt and writer Stephen Dubner, Freakonomics revels in eccentricity and seldom misses a chance to surprise readers with brash comparisons that, on reflection, are less impressive than advertised. Its first chapter juxtaposes sumo wrestlers and schoolteachers but produces the hardly stunning conclusion that both groups contain persons willing to cheat when the stakes are high. Elsewhere, Levitt and Dubner link the Ku Klux Klan with real estate agents by noting that both have access to inside information. The operative principle in these “Who’d a thunk it” pairings is that any two things are remarkably alike--provided the analyst disregards the ways they aren’t alike.
The most explosive correlation in this scattershot compendium is the one that links recent reductions in the crime rate to the nationwide legalization of abortion in 1973. By itself, this proposition isn’t stunning news, since abortions occur disproportionately among demographic groups that produce more than their share of criminals. Thus, when those groups have fewer offspring, fewer criminals result. Put succinctly: more abortions, less crime. Freakonomics provides a cursory summary of statistical evidence to buttress the idea that some unspecified part of the recent drop in crime is due to the million-plus abortions performed annually in the U.S. since 1975. Booming incarceration rates and increased police numbers are two other factors the authors credit when explaining the recent crime bust.
Freakonomics stops short of putting a seal of approval on abortion as an effective crime-fighting technique--but just barely. The book repeatedly portrays women’s motives for receiving abortions in glowing terms. In an effusive eulogy to Roe v. Wade, Levitt and Dubner assert that “The Supreme Court gave voice to what mothers in Romania and Scandinavia ... had long known: when a woman does not want to have a child, she usually has good reason.” Later the authors expand this broad endorsement of motives and imply that most abortions have beneficial social consequences.
Not coincidentally, Nicolae Ceausescu’s Romania is presented as the prime counterexample to America’s post-Roe, abortion-on-demand regime. The dictator’s abortion ban, it appears, produced a bumper crop of children more likely to engage in crime than their predecessors. Leaving aside the fact that, as usual, no details of this study are provided, one can’t help but wonder whether Communist Romania is the place where “other things being equal” comparisons are most prudently invoked.
Statements about abortion demographics provide a glaring example of scholarly lassitude. “One study,” the authors inform us, shows that these potential children would have been “50% more likely than average to live in poverty.” Another study says they would have been 60% more likely than average to grow up with one parent. Since these two factors, taken together, comprise the strongest predictors that children will have a criminal future, the authors offer the following conclusion to these widely-acknowledged premises: “...the very factors that drove millions of American women to have an abortion also seemed to predict that their children, had they been born, would have led unhappy and possibly criminal lives.”
Confusion arises, first of all, because the preceding statement blurs the distinction between all women who receive abortions and the much smaller subgroup of poor, single women who undergo the procedure. More blatant is the distortion created by substituting a sentence of universal despair (“their children...would have led unhappy and possibly criminal lives”) for the possible criminality of some unstated fraction of these fetuses--certainly less than 20% of the high-risk cohort. Only blindered ideology, or a craven desire to mirror elite opinion, can explain this oversight, which is repeated (at least in spirit) at chapter’s end: “When the government gives a woman the opportunity to make a decision about abortion, she generally does a good job of figuring out if she is in a good position to raise the baby well.”
Even if one concedes that the crime-abortion link constitutes prima facie evidence of good decision-making, the authors present no evidence (and certainly no moral argument) to suggest that the overwhelming majority of women whose children would not become criminals did a “good job.” Perhaps this is the reason the elastic word “unhappy” was inserted into a statement whose precedents only concerned criminality. In any case, both summaries transform philosophically-grounded assessments of decisions made by a small fraction of women into empirically-buttressed validations of abortions had by all women.
The only thing positive about these acts of linguistic-statistical legerdemain is that they spare readers from another utilitarian calculation--one that would pit the potential happiness of the 80 to 90% of aborted fetuses against the potential havoc wrought by the other 10 to 20%. This thought experiment would mirror one actually provided by the authors, in which a 100:1 fetus-to-human-being ratio is employed to show abortion’s inutility as a serious murder-reduction strategy. (In this case a popular methodological observation is turned on its head: insane means are used to attain a morally reasonable end.)
Amazingly, this allusive mixture of statistical data and careless rhetoric about the link between abortion and reduced crime rates takes place in scarcely more space than is devoted to the topic in this review. Curious readers are left to themselves to sift through a list of reference works that appear only at the back of the book and without extended annotation.
Ironically, Freakonomics says a good deal more about parenting than it does about abortion and crime. Not surprisingly, what it says is neither consistent nor compelling. At one point we are told that good parenting (what we do as opposed to what we are) doesn’t matter much (p.175). Elsewhere the authors assert that bad parenting “Clearly...matters a great deal” (p.153). Single-parent homes are said to be irrelevant when it comes to school performance (p.174), yet the same condition is dubbed a prime indicator of criminal behavior (p.138). On a related topic, data drawn from Chicago’s public schools are used to show that school choice, in itself, matters little (p.158). But a few pages later we are informed that the poor performance of blacks and their white classmates is caused by the abysmal quality of the schools they attend (p.165).
This lack of intellectual rigor is previewed in an introductory chapter where Levitt and Dubner assert that morality concerns the way people “would like the world to work”--as if wishes and moral obligations were the same thing. The authors then claim that economics, by contrast, deals with the way things “actually do work”--an assertion rooted either in disciplinary amnesia or, more likely, philosophical naiveté.
A dearth of moral gravitas is also communicated by observations such as the one that compares foot-soldiers in the crack cocaine “business” with “a McDonald’s burger flipper or a Wal-Mart shelfstocker.” Here and elsewhere an elitist disposition is evident--an attitude more interested in projecting an aura of trendy insouciance than in acknowledging the gulf that separates honest work from an occupation where the four-year chance of being killed is 25%. Of a piece with this cavalier attitude are the sleazy rumors that are casually dropped about CIA drug trafficking and More Guns, Less Crime author John Lott.
A final chapter devoted to children’s names provides more evidence that this book doesn’t take itself seriously. After the authors conclude that names, in themselves, don’t affect a child’s future, a dozen more pages follow cataloguing the names that have been popular with upper- and lower-class parents--not, of course, that it really matters.
What Freakonomics highlights, more than anything, is an adolescent mentality that enjoys iconoclasm for its own sake. Investigations conducted in this frame of mind are relished more for their shock value than for any insight they provide into the human condition. Such an attitude breeds the careless rhetoric and cursory treatment that the authors exhibit when discussing the most sensitive issues. What Levitt and Dubner obviously don't appreciate is the price society pays for adopting a cost-benefit perspective that happily views forty million aborted fetuses as a successful crime-fighting effort.
Authored by University of Chicago pop-economist Steven Levitt and writer Stephen Dubner, Freakonomics revels in eccentricity and seldom misses a chance to surprise readers with brash comparisons that, on reflection, are less impressive than advertised. Its first chapter juxtaposes sumo wrestlers and school teachers but produces the hardly stunning conclusion that both groups contain persons willing to cheat when stakes are high. Elsewhere, Levitt and Dubner link the Ku Klux Klan with real estate agents by noting that both have access to inside information. The operative principle in these “Who’d a thunk it” pairings is that any two things are remarkably alike--provided the analyst disregards the ways they aren’t alike.
The most explosive correlation in this scatter-shot compendium is the one that links recent reductions in the crime-rate to the nationwide legalization of abortion in 1973. By itself, this proposition isn’t stunning news since abortions occur disproportionately among demographic groups that produce more than their share of criminals. Thus, when those groups have fewer offspring, fewer criminals are the result. Put succinctly: more abortions, less crime. Freakonomics provides a cursory summary of statistical evidence to buttress the idea that some unspecified part of the recent drop in crime is due to the million-plus abortions performed annually in the U.S. since 1975. Booming incarceration rates and increased police numbers are two other factors the authors credit when explaining the recent crime bust.
Freakonomics stops short of putting a seal of approval on abortion as an effective crime-fighting technique--but just barely. The book repeatedly portrays women’s motives for receiving abortions in glowing terms. In an effusive eulogy to Roe v. Wade, Levitt and Dubner assert that “The Supreme Court gave voice to what mothers in Romania and Scandinavia ... had long known: when a woman does not want to have a child, she usually has good reason.” Later the authors expand this broad endorsement of motives and imply that most abortions have beneficial social consequences.
Not coincidentally, Nicolae Ceausescu’s Romania is presented as the prime counter- example to America’s post-Roe, abortion-on-demand regime. The dictator’s abortion ban, it appears, produced a bumper crop of children more likely to engage in crime than their predecessors. Leaving aside the fact that, as usual, no details of this study are provided, one can’t help but wonder whether Communist Romania is the place where “other things being equal” comparisons are most prudently invoked.
Statements about abortion demographics provide a glaring example of scholarly lassitude. “One study,” the authors inform us, shows that these potential children would have been “50% more likely than average to live in poverty.” Another study says they would have been 60% more likely than average to grow up with one parent. Since these two factors, taken together, comprise the strongest predictors that children will have a criminal future, the authors offer the following conclusion to these widely-acknowledged premises: “...the very factors that drove millions of American women to have an abortion also seemed to predict that their children, had they been born, would have led unhappy and possibly criminal lives.”
Confusion arises, first of all, because the preceding statement blurs the distinction between all women who receive abortions and the much smaller subgroup of poor, single women who undergo the procedure. More blatant is the distortion created by substituting a sentence of universal despair (“their children...would have led unhappy and possibly criminal lives.”) for the possible criminality of some unstated fraction of these fetuses--certainly less than 20% of the high-risk cohort. Only blindered ideology, or a craven desire to mirror elite opinion, can explain this oversight that is repeated a second time (at least in spirit) at chapter’s end: “When the government gives a woman the opportunity to make a decision about abortion, she generally does a good job of figuring out if she is in a good position to raise the baby well.”
Even if one concedes that the crime-abortion link constitutes prima facie evidence of good decision-making, the authors present no evidence (and certainly no moral argument) to suggest that the overwhelming majority of women whose children would not become criminals did a “good job.” Perhaps this is the reason the elastic word “unhappy” was inserted into a statement whose precedents only concerned criminality. In any case, both summaries transform philosophically-grounded assessments of decisions made by a small fraction of women into empirically-buttressed validations of abortions had by all women.
The only thing positive about these acts of linguistic-statistical legerdemain is that they spare readers from being subjected to another utilitarian calculation that pits the potential happiness of 80 to 90% of aborted fetuses against the potential havoc wrought by the other 10 to 20%. This thought experiment would mirror one actually provided by the authors where a 100:1 fetus to human being ratio is employed to show abortion’s inutility as a serious murder-reduction strategy. (In this case a popular methodological observation is turned on its head. Insane means are used to attain a morally reasonable end.)
Amazingly, this allusive mixture of statistical data and careless rhetoric about the link between abortion and reduced crime rates takes place in scarcely more space than is devoted to the topic in this review. Curious readers are left to themselves to sift through a list of reference works that appear only at the back of the book and without extended annotation.
Ironically, Freakonomics says a good deal more about parenting than it does about abortion and crime. Not surprisingly, what it says is neither consistent nor compelling. At one point we are told that good parenting (what we do as opposed to what we are) doesn’t matter much (p.175). Elsewhere the authors assert that bad parenting “Clearly...matters a great deal” (p.153). Single-parent homes are said to be irrelevant when it comes to school performance (p.174), yet the same condition is dubbed a prime indicator of criminal behavior (p.138). On a related topic, data drawn from Chicago’s public schools are used to show that school choice, in itself, matters little (p.158). But a few pages later we are informed that the poor performance of blacks and their white classmates is caused by the abysmal quality of the schools they attend (p.165).
This lack of intellectual rigor is previewed in an introductory chapter where Levitt and Dubner assert that morality concerns the way people “would like the world to work”--as if wishes and moral obligations were the same thing. The authors then claim that economics, by contrast, deals with the way things “actually do work”--an assertion rooted either in disciplinary amnesia or, more likely, philosophical naiveté.
A dearth of moral gravitas is also communicated by observations such as the one that compares foot-soldiers in the crack cocaine “business” with “a McDonald’s burger flipper or a Wal-Mart shelfstocker.” Here and elsewhere an elitist disposition is evident--an attitude more interested in projecting an aura of trendy insouciance than in acknowledging the gulf that separates honest work from an occupation where the four-year chance of being killed is 25%. Of a piece with this cavalier attitude are the sleazy rumors that are casually dropped about CIA drug trafficking and More Guns, Less Crime author John Lott.
A final chapter devoted to children’s names provides more evidence that this book doesn’t take itself seriously. After the authors conclude that names, in themselves, don’t impact a child’s future, a dozen more pages follow containing comments about names that have been popular with upper and lower class parents--not, of course, that it really matters.
What Freakonomics highlights, more than anything, is an adolescent mentality that enjoys iconoclasm for its own sake. Investigations conducted in this frame of mind are relished more for their shock value than for any insight they provide into the human condition. Such an attitude breeds the careless rhetoric and cursory treatment that the authors exhibit when discussing the most sensitive issues. What Levitt and Dubner obviously don't appreciate is the price society pays for adopting a cost-benefit perspective that happily views forty million aborted fetuses as a successful crime-fighting effort.
Thursday, October 06, 2005
100 PEOPLE WHO ARE SCREWING UP AMERICA
Bernard Goldberg’s 100 PEOPLE WHO ARE SCREWING UP AMERICA provides a wealth of anecdotal evidence to bolster the belief that the country is going to hell in a hand-basket. From rappers to reporters, from shock jocks to political blowhards, the former CBS journalist offers a series of sketches about persons whose words and actions deserve special recognition in a cultural Hall of Shame.
Goldberg divides the landscape of contemporary destructiveness into a few large subdivisions--politicians, journalists, entertainers, businessmen, lawyers, and academics. Under the journalistic category, names of Goldberg’s former colleagues pop up frequently. Bill Moyers (34) and Dan Rather (12) are skewered for ideological bias, whereas Barbara Walters (46) and Diane Sawyer (56) are called on the carpet for their role in erasing the line between serious journalism and entertainment.
On the executive side, ABC’s David Westin (55) and NBC’s Neal Shapiro (54) are faulted for presiding over this merging of news and entertainment. Meanwhile, CBS News president Andrew Heyward (13) is excoriated for failing to take personal responsibility for Mary Mapes’s (14) “60 Minutes” hit-piece based on fraudulent National Guard documents that were obtained from an unstable Bush-hater. The most prominently vilified executive is publisher Arthur Sulzberger (2), whose legacy is that he turned the New York Times into a paper that now prints all the news that fits his bent.
Lesser journalistic lights like Ted Rall (15) and Jeff Danziger (35) also make Goldberg’s list of cultural malefactors--the former for his despicable post-mortem editorial cartoon lampooning NFL player turned soldier, Pat Tillman, and the latter for a racist caricature of Condoleezza Rice.
On the political front Goldberg focuses most of his fire on leftist ideologues--but a smattering of right-wingers like David Duke (66) are also included in the mix. By my count, eight of the bottom twenty on Goldberg’s list are either Democrat pols like Howard Dean (20) and Al Gore (18) or operatives like People for the American Way lobbyist, Ralph Neas (10). That count, by the way, excludes the party’s chief financial backer in 2004, George Soros (19)--a man whose prominence arises not from deep policy insights but rather from very deep pockets. Another political sub-group includes “Racial Enforcers” like Al Sharpton (17) and Jesse Jackson (4).
When it comes to debased entertainment, Goldberg rounds up the usual suspects: Howard Stern (62), Jerry Springer (32), Maury Povich (31), and Eminem (58). Besides providing revoltingly specific examples of this bottomless vulgarity, the author also reveals the identity of a little-known figure who has provided much of the financial backing for the rap industry, Interscope’s Ted Field (57). Far from being a product of the ghetto, Mr. Field is a very white child of the sixties and heir to his father’s retail fortune. The younger Field has now made his own mark as the premier corrupter of American youngsters.
In addition to sleazemeisters, Goldberg notes the negative contributions that Hollywood types are making to political discourse. Because so many Tinseltown egos deserve recognition, the author lumps a number of “stars” like Janeane Garofalo and Alec Baldwin into three catch-all categories: The Dumb Celebrity (85), The Vicious Celebrity (84), and The Dumb and Vicious Celebrity (83). Of course some of the beautiful people spout inanities so frequently that they merit numbers of their own--like Barbra Streisand (91) and comedian turned talk-jock Al Franken (37).
Lawyers and corporate executives form other subgroups described as “American Jackals” and “White-Collar Thugs.” The legal grifter-in-chief is former tort lawyer, Senator, and Vice-presidential candidate John Edwards (16). Edwards’ pathetic courtroom channeling of a dead child is persuasive evidence of the way lawyers are fleecing the public rather than pursuing justice. The handsome North Carolinian’s pitch to a gullible jury secured a huge settlement apparently based on bogus notions that linked the mother’s C-section with her child’s cerebral palsy. On the other side of the bench, Massachusetts Supreme Court Chief Justice Margaret Marshall (7) is given recognition for her redefinition of marriage by judicial fiat.
Enron chairman Ken Lay (45) and Tyco CEO Dennis Kozlowski (44) are prominent business executives on Goldberg’s top-100 list. The entry describing Kozlowski’s birthday bash in Sardinia is one that economist Thorstein Veblen would have loved to employ as an example of conspicuous consumption. Other individuals singled out for their contributions to American decadence include Paul Eibeler (43), President of the software corporation that gave us “Grand Theft Auto,” and Todd Goldman (97), an entrepreneur who has made good money selling shirts that insult boys.
Academia is another rich source of cultural pollutants. Indeed, ivory-tower dwellers constitute ten percent of Goldberg’s list. Ward Churchill (72), the faux Native American who used the term “little Eichmanns” to describe persons murdered in the Twin Towers, is the poster child for today’s America-bashers--folks whose most venerable icon is linguist-turned-naysayer, Noam Chomsky (11). Former University of Pennsylvania President, Sheldon Hackney (87), is among the spineless, politically correct administrators that Goldberg singles out for opprobrium. And in a more theoretical vein Princeton ethics professor Peter Singer (39) is cited for his avant-garde advocacy of infanticide.
Many of Goldberg’s names don’t fall clearly into any of the aforementioned categories but do serve to illustrate the rot that is consuming American culture. My personal “favorite” is Amy Richards (63), a thirty-four-year-old woman with academic and literary connections. She had been taking birth control pills but went off them because they made her "moody." In 2003 she became pregnant by her boyfriend of three years. Having a child out of wedlock wasn't a problem for Richards. The problem was that there wasn't just one baby-to-be in her womb. There were three--and three was just too many. The fearful scenario she envisioned for herself involved moving from Manhattan to Staten Island, shopping at Costco, and buying large jars of mayonnaise. Moreover, the multiple pregnancy would force her to forgo her spring lecture income. Because she found this domestic script distasteful, she asked her obstetrician if she could "get rid of" one or two of the three fetuses in her womb.
It turns out that such a procedure is possible, and like all such acts it comes with a wonderfully clinical name--“selective reduction.” So selective reduction was the “choice” Amy Richards made. Since two of her three babies-to-be were twins and three days younger than the free-standing fetus, those two were the ones “selected” to receive shots of potassium chloride to the heart. The “successful” operation meant that Ms. Richards could raise her child by herself, stay in Manhattan, and lecture during the profitable spring months.
As a bonus, after the birth of her son, Ms. Richards put her writing talents to good use by composing an essay about her recent experience. The piece was called, "When One Is Enough," and was published in the Sunday New York Times Magazine on July 18, 2004. The article will doubtless be "interesting" reading for Richards' "non-selected" son. One can imagine the questions his mother's moral reasoning will one day prompt: "Why them and not me? What's all this talk about mayonnaise and Costco? Why tell the world about it?" To which questions the single answer is "Amy Richards."
Such is the country in which, by Goldberg’s calculus, Michael Moore ranks number one on the list of those who are leading us into a cultural abyss.
Monday, September 19, 2005
Thomas Sowell on "The Real History of Slavery"
It’s not like Alex Haley pictured it. At best the "Roots" portrait is incomplete. At worst the book and TV mini-series are mendacious.
I’m talking about the not-so-peculiar institution of slavery. The rest of the story (and the more accurate story) can be found in the extended essay, "The Real History of Slavery," in Thomas Sowell’s provocatively-titled book, BLACK REDNECKS AND WHITE LIBERALS.
That fuller perspective may not come as a total surprise. Most of us have memories of ancient Greek and Roman society in the recesses of our minds--civilizations suffused with human bondage. Others may recall the Barbary pirates. What is surprising, however, is how most of our front-burner thoughts on the topic are at odds with this larger perspective.
Ideas that seldom inform top-of-the-head ruminations include the following: Slavery existed worldwide and throughout the history of civilization. Slavery wasn’t founded on racial differences but on disparities of power. Throughout most of history whites enslaved whites, blacks enslaved blacks, and other groups enslaved vulnerable neighbors. (The very word "slave" derives from "Slav"--an ethnic group subject to raids from various directions.) The British, more than any other national group, were responsible for outlawing slavery around the world in the 19th and early 20th centuries. Finally, those English imperial efforts were met with great resistance in the Ottoman Empire and in parts of Africa.
These ideas are not the ones that typically spring to mind when the word "slavery" is mentioned. Instead, the Roots picture of white slavers going ashore in Africa and rounding up vulnerable natives is typical. This image is itself implausible. In most cases slavers acquired their cargo from African tribes that, like peoples around the world, captured and sold neighbors who were not powerful enough to fend off adversaries. Indeed, Sowell notes that slavery flourished in the West African area from which Kunta Kinte was presumably abducted.
The "problem" with such facts is that they don’t conform to the "black and white" Hollywood paradigm that is required if one wants to use history as a political weapon. Having Africans enslave other Africans--only some of whom were sold to merchants docked at the shore--isn’t a stirring myth. Race-based indignation is further undermined by accounts that depict black slaveowners who supported the Confederacy or managed plantations in the Caribbean.
Also unhelpful for contemporary purposes is the fact that the British expended decades of effort and approximately 5% of their GNP in an effort to eradicate a practice that most nations accepted without question. (Even Thomas More’s Utopia included slavery in its vision of perfection.) Most disconcerting of all, at least for some contemporary groups, is the fact that Quakers and evangelical Anglicans like William Wilberforce provided the spiritual impetus that made his island nation willing to pay a steep price in lives and treasure for a cause whose reward was largely intangible.
Sowell’s discussion of the black family delivers another blow to conventional wisdom. The author points out that between 1890 and 1940 African-Americans actually had slightly higher marriage rates than whites--a distinction that, since 1960, has more than disappeared. These facts suggest the implausibility of using the "legacy of slavery" to explain social pathologies that only emerged a century after the fact.
History is what it is--warts and all. If major kudos go to the British for eliminating slavery, so be it--no matter how grating that fact might be for those who equate Western civilization with racism. More importantly, as Sowell notes, the shame and guilt that often poison race relations are actually mitigated by looking the facts about slavery square in the face. The truth, he believes, will set us free.
Tuesday, September 06, 2005
BLACK REDNECKS AND WHITE LIBERALS
A Cal State sociology professor recently convinced the education pooh-bahs in San Bernardino of the need to incorporate "Ebonics" into their curriculum--at least on a limited basis. By giving official linguistic recognition and respect to what most people consider street-talk, these officials hope to pique the interest of marginal students and to discourage them from dropping out of school.
Unfortunately, none of these lofty objectives were achieved by Oakland’s experiment in substandard English. And now Professor Thomas Sowell has written an extended essay--contained in the book, Black Rednecks and White Liberals--that helps explain why.
Ebonics, Sowell argues, is not a distinctive black language. Instead, it is a dialect with an undistinguished pedigree that includes Southerners of both races and extends back to the borderlands between England and Scotland. Terms like "yawl," "dis," and "dat" as well as verbal constructs like "I be" and "she ain’t" were common in the more primitive regions of those countries--areas which provided a substantial number of emigrants to the American South. Thus, what some academics have labeled "Ebonics" is really an Anglo-Saxon export associated with a pre-Enlightenment "cracker" culture.
Characteristics of that culture in America have been described by observers ranging from Alexis de Tocqueville to W.E.B. DuBois: less interest in education, anemic entrepreneurial activity, undisciplined work habits, emphasis on immediate gratification, intoxication, ostentatious display, an exaggerated sense of personal pride, frequent duels, and lax sexual mores. These accounts focus, by and large, on white "redneck" culture. But the same traits naturally defined most of the non-white inhabitants of the South who had no contact with the African cultures from which they were separated by thousands of miles and several generations.
Significantly, not all black Americans (and not all white Southerners) were part of or remained tied to this culture that originated in foreign territories where lawlessness made an emphasis on long-range planning futile. Sowell points to blacks in the North whose manners and speech patterns were reflective of those found in the population among which they lived--at least prior to the unwelcome northern migration of Southern blacks in the first decades of the twentieth century.
Sowell also notes the impact of schools established in the South after the Civil War by religiously-motivated New Englanders. These "New England enclaves" not only produced a disproportionate share of black leaders, they also inculcated in their students an ethic that replaced the dysfunctional ethos in which they had been immersed from birth. DuBois himself praised these efforts as "the finest thing in American history."
West Indian migrants constitute a third group whose stunning accomplishments have been inversely related to their participation in "redneck" culture. These individuals, also exposed to the "legacy of slavery" in the Caribbean, became entrepreneurs in Harlem, comprised a huge percentage of black graduates at Ivy League schools, and in the 1930’s were actually less likely than whites to spend time in New York’s Sing Sing Prison.
The upshot of Sowell's analysis is this: Ebonics is more closely associated with a dysfunctional redneck culture than with any distinctive black identity. As the Swedish scholar Gunnar Myrdal observed, "the so-called 'Negro dialect' is simply a variation on the ordinary Southern accent."
The utter irony of the Ebonics fad is that while most whites and many blacks have escaped the clutches of this once-prevalent culture, some intellectuals seem hell-bent on perpetuating and even expanding its scope. By identifying the language of redneck culture with racial pride, academics succeed in linking racial identity with a world of violence, ignorance, and hair-trigger resentment--with values nowadays commercially disseminated via gangsta rap. No wonder Harvard economist Roland Fryer Jr. recently lamented that black youths commonly view educational accomplishment as selling out to "da man."
The KKK would be ecstatic with these results--not having the wit to see that they exist as cousins in the same cultural backwater. Misery loves company.
Thursday, August 25, 2005
THE "DEEPLY RELIGIOUS" LITMUS TEST
Are a judicial nominee’s "deeply held religious beliefs" a legitimate area of exploration for legislators charged with the advice and consent function? During recent confirmation hearings New York Senator Charles Schumer began pursuing this novel line of inquiry --apparently based on the belief that the first amendment not only creates a wall of separation between church and state (words not found in the Constitution) but also establishes special hurdles for individuals who take their faith seriously.
Naive observers might think that the primary qualification for judicial nominees would be their ability and willingness to interpret the law fairly. But as courts increasingly take upon themselves legislative prerogatives, a candidate’s philosophical views loom ever larger to politicians charged with the responsibility of deciding who is and who isn’t "qualified" to sit on a federal bench.
Under Schumer’s view of jurisprudence, constitutional temperaments seem to be compatible with deeply held views of all types--just not with deeply held religious views. Thus, even though more than 90% of Americans claim to believe in God, as far as judicial reflection is concerned, only god-free logic is clearly acceptable.
Imagine, for example, a judge with deeply held positions that are at odds with the vast majority of Americans when it comes to the age of sexual consent. Imagine that the judge’s name is Ginsburg. Would the philosophical roots of her views make any difference to Schumer and associates? Apparently not. All that would matter is that she came to her "progressive" views without relying on any religious convictions.
Now consider a judge with deeply held religious views that are roughly congruent with ideas embraced by at least half the country. Imagine the judge’s name is Roberts. Does the religious basis of his moral convictions become a major issue? Apparently so--since his religious beliefs trigger concerns arising from an expansive rendering of the First Amendment establishment clause.
In sum, secular nominees are OK, theists are questionable, and true believers are out of court. Ironically, this rule makes the following premise judicially illegitimate: "We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty, and the pursuit of Happiness."
According to Jefferson, governments are founded to secure the aforementioned rights. But according to Schumeristas, judges in that government fall under a cloud of suspicion if they employ this traditional concept of natural rights when pondering the resolution of legal questions. (Over a decade ago Justice Thomas, in his confirmation hearing, was given grief by Senator Biden for subscribing to these same "self-evident" truths that undergird the Declaration of Independence.)
Judges are entitled, it seems, to reason based on penumbral emanations plucked from thin air. They are also entitled to patch together decisions under the philosophical banner "the greatest good for the greatest number"--a benign utilitarian phrase that Princeton’s Peter Singer employs to justify infanticide. But they are not entitled, by Schumer’s lights, to reason based on the axiom that life derives from God--at least not if the conclusion to that train of thought violates some "deeply held" abortion plank in his party’s platform.
This new approach to jurisprudence depicts people of faith as irrational, eccentric, and narrowly sectarian--persons whose aberrant ideas must be quarantined to prevent public contamination. Meanwhile, non-religious minds (or intellects that have been scrubbed clean of religious pollutants) are portrayed as legitimate jurists.
This view would come as a surprise to George Washington, whose Farewell Address contains these anti-Schumeristic observations: "Of all the dispositions and habits that lead to political prosperity, Religion and Morality are indispensable supports.... And let us with caution indulge the supposition that morality can be maintained without religion."
Removing religious symbols and rituals from the public square was a giant first step on this slippery suppositional slope. That policy made it possible to stigmatize as theocratic all public officials with strong religious views--especially judges. If both these strategies prove successful, the only persons deemed qualified to rule in our still religious nation will be those who are indifferent or hostile to the beliefs held dear by a substantial majority of citizens. The secular sieve that cleansed the public square of vital religious expression will have accomplished the same feat in the halls of government.
Tuesday, August 09, 2005
EBONICS: THE EMPIRE STRIKES BACK
Perhaps in response to Bill Cosby’s humorous critique of "Who you are? What you be?" linguistic deformations, a Cal State sociologist has convinced the education pooh-bahs in San Bernardino of the need to incorporate substandard English into the curriculum of that school district--at least on a limited basis.
Dr. Cosby, you may recall, had the temerity, the effrontery, da u-da-man moxie to declare that kids who speak like untutored street urchins are as unlikely to impress potential employers as baggy-pant adolescents who show up for interviews flashing four inches of underwear and sporting a backward-facing baseball cap. For this shameless violation of political correctness Cosby was excoriated by sensitivity monitors in the press and tarred with convenient accusations of moral impropriety--concerns that are never raised by MSM talking heads when Jesse Jackson pontificates about matters of race and culture.
Now the empire is striking back. The "we no Tomz" academic establishment--whose record of negative accomplishment rivals that of Democrat consultant and John Kerry campaign advisor Bob Shrum--is pushing not only the risible notion that Ebonics is a distinct language but also the claim that the issue is a foregone scholarly conclusion.
Consequently, ideas that worked poorly with an honest-to-goodness language (Spanish) are now being shoehorned into a curriculum on behalf of speech patterns that reign supreme ("be top dog") in the world of rap. The rationale for this educational absurdity is that speaking in Ebonics will keep kids who are currently dropping out in large numbers more interested in school.
More likely what will happen is that marginal students will think they’re OK just as they are--since the desperate language of pop-culture will be accorded special recognition. Indeed, if Ebonics is a real language, why should members of this distinctive linguistic community learn to talk like "whitey"? As Harvard economist Roland Fryer Jr. has observed, the idea that educational accomplishment means selling out to "da man" already permeates black culture. Ebonics-based education only strengthens that notion by linking identity politics to poor speech. In this way anger, sloth, and racial pride are combined in a witches’ brew almost guaranteed to produce failure.
Needless to say, none of the sociologists or school board members got to where they are by using the language they seek to legitimize. Nor would these persons think twice about exiting a doctor’s office where the fellow with the stethoscope spoke in the cockneyesque dialect of the street (’aws ye bum, guv’nr!). Of course, no one has to make that decision because professional accomplishment requires serious dedication--a quality not exhibited by those of whatever race who cling to sloppy grammar and poor diction like a security blanket.
In many ways Ebonics advocates resemble jar-of-urine-as-work-of-art exhibitionists. Both fancy themselves part of a special group that is more creative, sensitive, and thoughtful than ordinary folks. Unfortunately, kids victimized by this avant garde educational project will lose out in a much bigger way than suckers who cede their aesthetic sensibilities to a few con-artists.
As Thomas Sowell’s book, Black Rednecks and White Liberals, observes, these black pupils will be encouraged to identify more completely with a violent "cracker" culture that has nothing to do with racial identity and that undercuts their own success. They’ll certainly be worse off than Spanish-speaking students whose native tongue wasn’t invented by an academic in search of a tenured post.
Put succinctly, these children will be kept on the plantation--a plantation that benefits various shades of elites who constantly churn out bad ideas for which they themselves never pay a price: whole language, self-esteem curricula, culturally sensitive math, and now Ebonics.
Sunday, July 31, 2005
LIVE 8: POSTURING VERSUS PRACTICE
"One swallow does not a summer make." The epigram reflects the heart of Aristotle’s ethical philosophy. Character isn’t the product of random acts of kindness or of good intentions as evanescent as the head on a mug of cold beer. Instead, practice and habit are keys to transforming good ideas into virtuous reality.
Live 8, the international rock concert christened by Coldplay’s Chris Martin "the greatest thing that’s ever been organized probably in the history of the world," flies under a different aphoristic banner--this one offered by the event’s producer, Bob Geldof: "Something must be done, even if it doesn't work."
Only the term "probably" reflects well on Mr. Martin’s sense of historical proportion. Not much more can be said of Geldof’s urgent plea that endorses opening windows in a house on fire under the theory that it’s better than doing nothing.
It isn’t surprising that Aristotle’s party-pooping point-of-view doesn’t generate much enthusiasm among the Pepsi X-Generation. After all, what could be more attractive than an eleemosynary approach that distracts from a group’s own moral failings, makes a show of concern for individuals living far away, and features puerile music that fans would pay to hear in any case--a pathological trifecta.
Such one-shot exhibitions of emotional solidarity amount to little more than sops to consciences sated with excess. They resemble Andie MacDowell’s character in the opening scene of Sex, Lies, and Videotape where she ponders for her psychiatrist the fate of a trash-filled barge cruising the high seas in search of a friendly port-of-call. That mental voyage steered her thoughts away from a troubled marriage.
Aristotle, by contrast, asserts that there is no quick fix--not for individuals and not for societies. It takes more than "doing something" to produce an admirable result. What it takes is doing the right thing, in the right way, at the right time, over and over again. It’s not a formula that sits well with narcissistic entertainers or undisciplined fans.
As far as Africa is concerned, there are, indeed, "simple" answers--but they involve long-term practices unfamiliar to revelers who mindlessly parrot the musical mantra that self-indulgence is the highest form of integrity. Asian nations like South Korea and Taiwan, for example, have moved from poverty to prosperity because of policies that provide a stable legal framework, foster private enterprise, and encourage education. Absent such basic reforms, money sent to corrupt regimes will only subsidize failure.
In the age of advertising, silver-bullet solutions sell well. People are drawn to Hollywood endings where problems vanish after a single dramatic intervention. For these folks social liposuction is always the answer.
One concert, however, will not a continent remake. Indeed, based on the ideas articulated by Live 8 promoters, it’s likely to make things worse.
Thursday, July 21, 2005
THE CLASSLESS SOCIETY
Bespectacled, gaunt, he stood erect,
then cocked his head to interject
and sprayed his syllables like bullets at the crowd.
"I am a Leftist," he avowed
--his dull and tainted teeth, projections
gravely cast against a bearded disposition.
"Power to the people," he implored
--praising Marx with jargon weighty,
lauding Castro, Che, and Warren Beatty.
"Revolution!" cried the man whose tenured post
was funded most
by porcine forces he deplored.
"Redistribution!" shouted harder,
he who sought with eyeshade ardor
to reduce to naught assessments owed.
"Comrades, comrades," droned
the frothy intellect
whose peers he held beneath contempt.
"Workers of the world, unite!
Discard your gilded chains!
What’s his is yours by right!"
Declared the well-fed orator
who never worker one employed
but lived, sans sweat, off others’ gains.
Venom-spitting lover of mankind,
Egalitaire--elite in his own mind,
Parasitic prole du jour,
This cogitating amateur,
Content to fill his acolytes with bilious rage
(and pants with cash)
Till hist’ry be as classless as its shabby sage.
Monday, June 27, 2005
NUANCE: THE WHITE LEFT'S BURDEN
If there is one mantra that leftists love to repeat, it is that their ideas are more "nuanced" than those of conservatives. Consequently, so goes the refrain, they face more daunting communication challenges than folks whose thoughts are so simplistic they can be placed on a bumper sticker--with room to spare.
This communication gap--call it the White Left’s Burden--explains why smarter-than-thous find it so hard to garner a majority of votes in this unsophisticated country. Most folks, it appears, lack the gray matter required to grasp their finely tuned arguments. This gap also explains why the national Democrat Party can only enact its legislative agenda via the least democratic branch of the government--the federal judiciary.
Take as an example that famous declaration, "I actually did vote for the 87 billion before I voted against it." It takes a New England preppy who graduated in the bottom third of his college class to fathom the complexity of that assertion. Dimwitted NASCAR-types would simply chalk up an about-face of this sort to the common political desire to have your cake and eat it too.
It also takes more acumen than middle America can muster to understand Howard Dean’s recent comments about "evil" Republicans--"white Christian(s)" who don’t care if children go to bed hungry and who’ve "never worked an honest day in their lives." A professional deconstructionist might be needed to explain to the hoi polloi whether the two senators from Massachusetts fall inside or outside the last of those artfully articulated demographic categories. This tutor might also explain how it is really the Republicans who are poisoning political discourse. (I suspect the words "dialectical," "post-modern," and "grinch" would be employed frequently during this pedagogical session.)
More recently the party of nuance has taken to employing terms like "theocracy" to describe the efforts of religiously-motivated Americans to resist aggressive secularization of the public square. Typical of such "crusades" (or are they "jihads"?) are unsuccessful campaigns to allow public high schools to engage in traditions of minimal piety that have been in place for a century or more.
One can sympathize with cognitive hair-splitters who are now burdened with the job of explaining to simpletons why a minuscule cross on the seal of Los Angeles County constitutes an "establishment of religion"--a task further complicated by the "living document" judicial philosophy that makes all appeals to the Constitution’s text either superfluous or mendacious. Fortunately for these brainiacs, most Americans already believe that the phrase "separation of church and state" is actually in the Constitution--probably in the section where, simultaneously, senate filibusters are made inviolate and the Constitution infinitely elastic.
The pièce de résistance of nuanced discourse has to be Senator Dick Durbin’s recent use of the terms "Nazis," "Soviet gulags," and "Pol Pot" in conjunction with his condemnation of abuses at the Guantanamo detention facility. Only a mind working at warp speed can appreciate the refinement reflected in this three-pronged allusion. An untrained ear might think Senator Durbin was comparing the penal practices of Hitler, Stalin, and the Cambodian tyrant to those employed at GTMO.
Au contraire! The senior senator from Illinois was in no way linking the actions of American military personnel at Guantanamo with the aforementioned regimes. Unfortunately, I’m unequipped to explain this faux comparison that juxtaposes genocidal regimes with the alleged discomforts endured by terrorists at Guantanamo. (Curiously, the word "alleged" seems to be missing from Senator Durbin’s vocabulary.) All I can say is that hinterland Neanderthals tend to hear his statement as an imprudent assertion that will give aid and comfort to linguistically-challenged enemies.
Maybe Ted Turner can explain Durbin’s comparison--or that guy whose definition of the word "is" varies to suit the exigencies of his legal situation. In the meantime we simple-minded, uncaring, luxury-sated, theocratic fascists will have to do the best we can, thesaurus in hand, to master the complexities of sophisticated political discourse.
Friday, June 17, 2005
CALIFORNIA LEGISLATIVE FOLLIES: SHORTEN THAT TEXTBOOK
When is less, more? Answer: When it’s a California textbook. Just ask Assembly member Jackie Goldberg--sponsor of AB 756, the bill to ban the state from purchasing texts longer than 200 pages.
Honestly, folks, this is no joke. And not only does the former Compton teacher want new textbooks for California schools to be limited to 200 pages, her assembly colleagues actually agree with her.
By a vote of 42-26 the lower house in Sacramento recently went on record supporting the novel idea that fewer pages equal greater achievement. Never mind that no study establishing a "small is beautiful" correlation can be produced by supporters of the legislation or that an arbitrary numerical quota seems plain stupid. The important thing is that Goldberg and company have an idea about which they feel warm and fuzzy. And that, apparently, is all it takes for wacky educational policy to pass muster in the Assembly.
One can hardly imagine the amount of media scorn that would befall a Republican legislator who proposed a minimalist limit to textbook pagination. "Dumb and Dumber" jokes would flow from late night comics like vintage wine:
*"Did you hear that Assemblyman Yahoo wants all California texts to be shorter than 200 pages? --And to require no more than four crayons per page."
*"When asked what part of U.S. History should be deleted to keep within the 200-page limit, Rep. Yahoo replied, ‘How about the Constitution?’"
*"Yahoo’s other ideas for education include miniaturized desks, shorter teachers, 20 x 20 classrooms, and lower IQs."
Batta-boom-batta-bing!
Goldberg’s defense of her proposal is almost as funny. The world, she observes, has changed significantly in the last few decades, yet we’re still using big, bulky books to teach our kids. She also says that today’s workplace demands more than the ability to read page 435 of some manual and that a more "dynamic...learning process" is required.
Can anyone in class, I wonder, define the term "non sequitur"? Or was that Latinism on page 201? More specifically, one might reply to Goldberg’s "weighty" arguments as follows: 1) Some people continue to wear pants that cover their rumps despite the fact that the world has changed. 2) The ability to read page 435 of a manual is not a sufficient skill for success, but it is necessary.
Excessive jokes aside, the philosophical basis for Goldberg’s legislation is the ubiquitous educational emphasis on "process." According to this school of thought, information itself isn’t nearly as important as learning how to learn. Thus, for Goldberg, essential data can be crammed into 200 pages--no matter what the subject. Internet references in limit-exempt appendices will then provide the dynamism that compensates for this textual downsizing.
In practice, process-education has tended to produce students too uninformed to know what puzzles they should be unraveling on the information superhighway. And it has had this effect because "learning skills" or the "love of learning" are virtually impossible to nurture in students who don’t know much about anything in particular. Superficiality begets couch-potatoes, not dedicated researchers.
A more constructive approach to California’s educational woes would involve loosening the death-grip of interest groups like the CTA on education policy, providing more flexibility to charter schools, and even revisiting the hated concept of vouchers. In a competitive educational environment, pedagogical idiocies like the above would not be met with silence by state employees whose very jobs rest on their effectiveness.
Saturday, June 04, 2005
NOISE--AND THE DEATH OF DECORUM
Big brother has nothing on the cultural demigods that permeate American life. The fact that many businesses now afflict patrons with loud, grating, hippity-rap music is a testament to the power wielded by media mavens. At popular restaurants these piped-in projections are often so intrusive that conversation becomes impossible. Indeed, the din feeds on itself as hearing-impaired diners who passively accept this aural assault shout across the table like hikers communicating over a vast chasm.
Almost everywhere--airports, auto repair centers, high school basketball games, barber shops--electronic devices broadcast sound waves crafted by virtual conductors who, for the last four decades, have decimated civility and murdered vocal restraint. (Does anyone whisper anymore? And if the technique were employed, could listeners decipher the words?)
Not even the nationwide bookseller near me is exempt from this pervasive tribute to the deities of distraction. Occasionally, refined selections--like sheltered eddies by a raging river--interrupt the persistent percussive pounding. But even this "background" music is too loud. And the respite is brief. The dissonance commences again--punctuated by extended cell-phone conversations involving parties whose anvils and stirrups have doubtless been damaged by prior abuse.
If one has the misfortune of sitting within fifty feet of this store’s juvenile section, the high-pitched screams of unrestrained brats will be added to the mix. A menagerie of wild beasts doesn’t create the cacophony generated by these homo non-sapiens romping through an establishment once devoted to intellectual reflection. When even bookstores become purveyors of mind-numbing clamor, you know you’re in trouble.
What once would have raised howls of protest is now meekly endured by milquetoasts too cowed to glare in unison at clueless parents or to inform the manager that his "music" is irritating and offensive. Instead, customers dutifully genuflect toward the Huns who shroud our sensibilities with emotional smog. These SNL, MTV, Leno-Letterman barbarians no longer pound at the gate. Rather, they own the portal key--the means of communication.
Incessant noise, crude humor, and shallow sensuality fill our public space--flooding into the street from boxes that blare similar messages in private homes. It is a fitting backdrop for lives devoid of depth--for nose-pierced pop-cultural clones, mall-rat rebels, and 9-to-5 commuters engrossed in the permutations of celebrity justice.
Amid this raucous decadence, tolerance is a one-way street. Principled individuals are asked to defer to the sensibilities of those who find the word "Christmas" offensive. On the other hand, when crude, insulting, and noxious material is broadcast in public, these same individuals are expected to "be open" to the expression of "alternate perspectives". In such a society, being a good sport is synonymous with moral cowardice. After all, only those with a conscience (or those with hearing intact) are obliged to leave their "hangups" at home.
Mel Brooks, as the 2000-year-old man, once made this reply to Carl Reiner’s question about the secret to planetary peace: "If everyone in the world ... would play ... a violin, we would be bigger and better than Mantovani." The effect of filling our ears (and souls) with virulent electronic emissions has had the opposite result--producing a culture where reflection, tranquility, and considerateness are increasingly rare.
This debilitating trend will only be reversed when Americans summon the courage to demand decorum in public. Each intervention creates momentum toward a decent society and makes it more likely that frustrated fence-sitters will also take up the challenge of calling brutishness exactly what it is.
The project starts with you.
Tuesday, May 31, 2005
THE "WHO'S TO SAY" ETHICS CON
I doubt that any cliché has been used more frequently to undermine ethical reflection than this one: "Who’s to say what’s right or wrong?" No discussion of morality can proceed very far before someone empties his philosophical quiver and unleashes this sophomoric rhetorical challenge.
The unstated implication behind the question is that ethical principles are nothing more than personal tastes. Given this assumption, the declaration that one type of behavior is better than another becomes equivalent to announcing that steak is good and broccoli bad. All ethical judgments are thus reduced to arbitrary acts of imposition. A thing is labeled ‘bad’ only because some high-handed dude has declared that it is.
It is worth noting that no one employs this shopworn challenge in the arena of science--because within that discipline it is assumed that some answers are closer to the truth than others. Even in history and sociology, most folks would dismiss "Who’s to say" challenges with a sneer. The obvious response is that "evidence" and "coherence" are the standards for deciding whether a proposition is true or false--not the declaration of some disciplinary czar vested with absolute authority to impose his will on less-influential practitioners.
When it comes to ethics, however, all these considerations are ignored--as if everyone knows that no such thing as ethical evidence exists. Such an idea would come as a shock to Plato, Aristotle, and Aquinas--as well as to more recent thinkers like Kant or John Stuart Mill. Heck, even Karl Marx spent decades cooking his economic books in order to convince readers that Communism was both morally superior and historically inevitable.
In addition to ignoring the possibility that ethical evidence exists, those who fling the "Who’s to say" ratchet into the cogs of ethical reflection also ignore something else--namely, that their rhetorical challenge presupposes the existence of a moral standard that it simultaneously denies.
The "Who’s to say" argument is only effective because it makes any possible answer to the question morally indefensible--since nothing is true simply because so and so says it is. But if nothing can be declared immoral, then there is no basis for rejecting the arbitrary imposition of one person’s values on another. Put succinctly, the "Who’s to say" argument presupposes two contradictory ideas: 1) All moral statements are arbitrary impositions. 2) Arbitrary impositions are immoral.
The clichéd interrogative requires the very morality it denies. And then it employs that sliver of moral indignation to undercut serious moral reflection. The fact that "immoralists" are reduced to blatant self-contradiction in their most popular logical offensive is a bit of evidence worth pondering. The fact that most people don’t recognize this logical con-game is a tribute to the intellectual and moral sloth of a culture that would rather let sleeping consciences lie.
Sunday, May 15, 2005
ASSESSING EUROPE'S MORAL SUPERIORITY
"European morality" is a phrase that should rival "must-see TV" and "government budget" in the Oxymoron Hall of Fame. Yet among American elites there seems to be a consensus that Continental mores are significantly more advanced than those practiced in the United States. This novel assumption raises several questions:
Have circumstances changed so drastically from the time when America’s Founding Fathers looked askance at European decadence and saw in their new nation a city that would serve as a beacon to the rest of the world? More to the point, has Europe changed that much in the last five decades? After all, the continent held in such high esteem by many of today’s politicians and mainstream media types didn’t exactly have a century to write home about.
First there was World War I--a colossal stupidity whose causes are shrouded in national aspiration, colonial competition, and a prolonged history of diplomatic gamesmanship. Only U.S. intervention succeeded in dragging the Euros out of their self-destructive trenches.
Then there was the fascist era--with Mussolini in Italy, Franco in Spain, and Adolf in Deutschland. It must have been a completely different breed of European that cheered il Duce’s invasion of Ethiopia (visions of imperial Rome dancing in their heads) or who dutifully carried out the Fuhrer’s solution to the continent’s "Jewish Problem."
During that period the other side of the political fence was largely defined by two words: "Munich" and "Czechoslovakia." Again, it was American intervention, along with British tenacity, that combined to save the continent from domination by Hitler or his double-crossed Soviet pal, Joe Stalin.
Today’s European apples must have fallen far from that ancestral tree to merit such admiration as they are accorded in the minds of America’s literati. But the closer I look, the more I see the same emptiness in the states of "Old Europe."
Germany, for example, legalized prostitution a couple of years ago--a sign of decadence reminiscent of the Weimar Republic. Elsewhere on the forensic front, a German court recently sentenced a man who killed and ate a voluntary victim to eight-and-a-half years behind bars. (With good behavior this Hannibal Lecterisch connoisseur could be hawking a rare book of Rhineland recipes by the end of the decade.)
Economically, Deutschland’s unemployment rate has grown to 12% amid a population decline projected to reach 10% by 2050. Both these trends spell disaster for generously-funded welfare programs and suggest less-than-vigorous health, culturally speaking.
Next door in the Netherlands, legislators are falling over themselves to draft more inclusive standards for killing sick, old, and depressed folk--with or without the patient’s permission. The protocols of the University Hospital in Groningen represent the cutting edge of progressive thought on this topic--ideas so reminiscent of Nazi justifications for euthanasia, they are verboten in Germany.
Meanwhile, in Sweden (bastion of neutrality in the fight against Hitler) illegitimacy has become the norm. Presently, over half of that country’s babies have parents who find the institution of marriage anachronistic.
Given these considerations, it seems likely that what appeals to Europhiles isn’t the region’s outstanding ethical record but rather its long-cultivated lack of moral conviction. It’s not that the leopard has changed its spots, but rather that American elites now identify with what most Continental intellectuals stood for throughout the last century--accommodation, acquiescence, and moral lassitude.
Those on the other side of this "sophistication" argument can point to refined aesthetic sensibilities and copious cultural treasures that grace the land mass north of the Mediterranean. They can even tout a collective monetary unit that has appreciated in value over the last few years. The central question, however, is whether these facts represent more than a cluster of attractive specimens amid a moribund species.
My belief is that no European ethical epiphany took place in the last half of the twentieth century--an assessment whose value should become clearer as religiously-motivated immigrants increasingly replace a dwindling population of secularists whose utilitarian ideals inspire neither courage nor admiration.
Monday, May 09, 2005
RESPECT ME--OR ELSE!
"You can’t give respect unless you get it first from others." That’s the tough-guy philosophy that was articulated by a young man struggling to square some standard of decency with the rap lyrics that he and his peers invest with such authority.
I asked the adolescent standing in the school hallway to envision a room populated with individuals who embraced the sentiments he had expressed--and then to describe the verbal exchanges occurring there. When no answer was forthcoming, I supplied him with my own scenario: "You would have a room full of people all demanding that the other guy respect him first--‘You respect me.’ ‘No, you respect me!’"
The slogan seemed plausible enough at first blush, but when push came to shove, the dictum turned out to be little more than an unqualified demand for submission--a means for distinguishing top-dogs from their home-boy underlings. The latter give respect without getting it first. The former get respect by demanding it as a non-negotiable precondition. The reciprocity that forms the core of polite social intercourse is here illusory. In its place stands a hierarchy of intimidation.
This gangsta philosophy turns the traditional understanding of respect on its head. According to that discarded rule, one doesn’t merit respect without first extending it. Respect for oneself is earned based on admirable behavior. Respect for others, by contrast, is considered the default position for interpersonal relations--not an option contingent on prior recognition. By following these rules one creates a room populated by folks whose actions are the polar opposite of those presumed in the prior experiment--a room long on courtesy and short on attitude.
The new rapper rules for respect are popular because they provide a ready-made excuse for in-your-face behavior. If old fogies don’t express appreciation for anti-social slogans, low-rider jeans, and grotesque brow-piercings, "play’rs" are automatically given grounds for dissing these disapproving adults. Only those who kowtow to countercultural inclinations merit placement in the "respectable" category. Those consigned to the opposite bin have only themselves to blame.
"Me-firsters" have no obligation to defer to society--and presumably nothing to learn from it. Instead, society has an obligation to do obeisance to their dubious insights--or suffer the consequences! This "attitude" that transfoms the general obligation to act courteously into a jealously guarded prerogative isn't a formula designed to restrain youngsters from swinging bats at the heads of immature taunters.
Thanks to an industry whose corruption seems bottomless, the hard and desperate language of the street now springs glibly from the lips of impressionable youngsters--and passes for wisdom. It is a mindset promoted by entertainers and athletes whose bravado is inversely proportionate to the moral standing they have earned--a philosophy rooted in, and designed to perpetuate, failure.
Monday, May 02, 2005
THE ROTTWEILERS OF RELATIVISM
There’s been no honeymoon for a pope correctly perceived as too Catholic for the taste of secular critics. Instead, a wave of preemptive vilification has greeted the "archconservative" pontiff.
"God’s Rottweiler" is a phrase that expresses the antipathy felt toward Cardinal Ratzinger by those who gleefully point to his stint in the Hitler Youth. Such journalists fail to note that membership in the Nazi organization was mandatory and that, at the ripe age of eighteen, the future "Papa-Ratzi" deserted from the German army into which he had been drafted.
In addition to these youthful "offenses," Ratzinger’s stint as prefect of the Church’s office on Doctrine earned him the moniker "God’s Inquisitor." (Ironically, this stigma comes from a class supremely skilled at destroying political opponents. Were Benedict XVI as bad as his press, one might think a degree of professional courtesy would be extended by these practitioners of high-tech lynching.)
What makes Benedict even more odious to critics is the address he gave to his peers before ascending to the papacy. In that talk the German cardinal took aim at a dogma to which his secular opponents pledge fervent allegiance--the "dictatorship of relativism."
This oxymoronic label aims at relativism’s hidden duplicity--a duplicity suggested by the insults directed at a man who, by all first-hand accounts I have read, is a model of patience and self-effacement. Though tolerance appears to be relativism’s logical corollary, the linkage is illusory.
Behind relativism’s easy-going facade stands a rabid rejection of moral absolutes. Truth, they say, is relative--but relative to what? The honest response in most cases, "relative to me," explains the hostility directed by elites toward influential expressions of orthodoxy. It also explains why the statement "there is no wrong answer" has become so popular in recent years--especially in discourse about abortion, child-rearing, or "alternative lifestyles".
Thomas Hobbes gave serious thought to the shape of a world where egos rule and concluded that a political Leviathan was required to bring order to an otherwise brutish and short existence. Relativists aren’t far behind the English philosopher--except when it comes to candor.
A world where egos rule is, more accurately, a world where some egos rule over others. And the guiding principle of this governance must be the "gut instincts" to which those power brokers give deference. In a world devoid of absolutes, these instincts increasingly coincide with narrow self-interest. (Relativists, it should be recalled, need not and do not defer to majority opinion when the latter fails to echo their own "nuanced" visions of "truth".)
St. Augustine said that in his youth he yearned to become a law unto himself--to reject any authority outside his own will. This "I-did-it-my-way" cultural complement of relativism smacks more of the "will to power" than of tolerance. The same egocentric impulse generates venom toward Pope Benedict--a man capable of defending the notion that moral truth is more than putty placed in the hands of gurus who pontificate on God-knows-what from their mansions in Malibu.
A Pope who used the spiritual force of his office to bring down political totalitarianism in Eastern Europe has been succeeded by another who targets a different form of despotism. The "Rottweilers of Relativism" despise Benedict XVI in much the same way KGB operatives feared John Paul II. Dictators, whether political or cultural, hate answering to anyone except themselves.
"God’s Rottweiler" is a phrase that expresses the antipathy felt toward Cardinal Ratzinger by those who gleefully point to his stint in the Hitler Youth. Such journalists fail to note that membership in the Nazi organization was mandatory and that, at the ripe age of eighteen, the future "Papa-Ratzi" deserted from the German army into which he had been drafted.
In addition to these youthful "offenses" Ratzinger’s stint as prefect of the Church’s office on Doctrine earned him the moniker "God’s Inquisitor." (Ironically, this stigma comes from a class supremely skilled at destroying political opponents. Were Benedict XVI as bad as his press, one might think a degree of professional courtesy would be extended by these practitioners of high-tech lynching.)
What makes Benedict even more odious to critics is the address he gave to his peers before ascending to the papacy. In that talk the German cardinal took aim at a dogma to which his secular opponents pledge fervent allegiance--the "dictatorship of relativism."
This oxymoronic label aims at relativism’s hidden duplicity--a duplicity suggested by the insults directed at a man who, by all first-hand accounts I have read, is a model of patience and self-effacement. Though tolerance appears to be relativism’s logical corollary, the linkage is illusory.
Behind relativism’s easy-going facade stands a rabid rejection of moral absolutes. Truth, they say, is relative--but relative to what? The honest response in most cases, "relative to me," explains the hostility directed by elites toward influential expressions of orthodoxy. It also explains why the statement "there is no wrong answer" has become so popular in recent years--especially in discourse about abortion, child-rearing, or "alternative lifestyles".
Thomas Hobbes gave serious thought to the shape of a world where egos rule and concluded that a political Leviathan was required to bring order to an otherwise brutish and short existence. Relativists aren’t far behind the English philosopher--except when it comes to candor.
A world where egos rule is, more accurately, a world where some egos rule over others. And the guiding principle of this governance must be the "gut instincts" to which those power brokers give deference. In a world devoid of absolutes, these instincts increasingly coincide with narrow self-interest. (Relativists, it should be recalled, need not and do not defer to majority opinion when the latter fails to echo their own "nuanced" visions of "truth".)
St. Augustine said that in his youth he yearned to become a law unto himself--to reject any authority outside his own will. This "I-did-it-my-way" cultural complement of relativism smacks more of the "will to power" than of tolerance. The same egocentric impulse generates venom toward Pope Benedict--a man capable of defending the notion that moral truth is more than putty placed in the hands of gurus who pontificate on God-knows-what from their mansions in Malibu.
A Pope who used the spiritual force of his office to bring down political totalitarianism in Eastern Europe has been succeeded by another who targets a different form of despotism. The "Rottweilers of Relativism" despise Benedict XVI in much the same way KGB operatives feared John Paul II. Dictators, whether political or cultural, hate answering to anyone except themselves.
Thursday, April 28, 2005
PUSHING TV'S ENVELOPE TO THE MORGUE
"I see dead people." It’s not just a line proffered by a young boy with an exceptional range of vision, it’s now the case for all those working stiffs in search of emotional consolation who plump themselves down in front of the boob tube during prime time.
What the latter viewers are likely to see on network television has changed considerably over the years. "I Love Lucy" comedies and "Bonanza" Westerns morphed into "Laugh In" irreverence and Norman Lear commentary wrapped in domestic dysfunction. More recently viewers were treated to "Married With Children" cynicism--sit-com sleaze with a smirk. The question that then arose for TV moguls was this: Where do we go from here?
Having sated audiences with visions of firm flesh--and having convinced them that casual copulation after puberty is infinitely less important than inhaling second-hand cigarette smoke--the folks who regularly substitute shock-value for dramatic depth were perplexed. It wasn’t just a matter of where to push the envelope. It was a question of what envelope to push. As his "Crime Scene Investigation" programs indicate, producer Jerry Bruckheimer topped the long-standing industry obsession with sex with a clutch of shows featuring vivid images of death.
"Corpses R Us" might be an alternate title for these forensic dramas that exchange a fleshless Freddy Krueger for grotesque shots of bodies lying in various states of disrepair--either at the crime scene or atop an autopsy table. Close-ups of no-longer-vital organs being probed for admissible evidence prompts visceral reactions from viewers no longer aroused by gratuitous shots of animate mammary glands. Highlighted tissue isn’t distinguished by its beauty, but rather by its vulnerability to decay and displacement. Over against this vision of decomposition, the lyrical assertion: "All we are is dust in the wind," seems positively romantic.
On top of these haunting visual displays, CSI Las Vegas patrons are often treated to philosophical disquisitions that equate human life with the remains accessible during a post-mortem exam. "That’s all we really are," observed the bearded protagonist whose eyes gleam when discussing roller-coaster thrills but (like almost all the dramatic cast) has no room in his life for unnecessary chemical baggage--i.e. a wife and kids.
Donald Bellisario’s NCIS (Naval Criminal Investigative Service) joins Bruckheimer’s small-screen trio in exploring the commercial possibilities of pathology. Besides mortuarial moments that match anything displayed by CSI, NCIS adds to the scientific mix a brilliant, dark-haired woman who "just happens to be" a fan of BDSM. Her spiked necklace and Gothish appearance don’t make it clear whether Abby inclines more to the BD or the SM side of the erotic ledger. But her expertise and perkiness make one thing perfectly obvious--what once was considered perverse is now just another day at the office.
It isn’t surprising that folks in the entertainment business--bereft of intellectual depth or moral insight--should grasp at any straw they can lay their hands on to get an emotional rise out of audiences. In lieu of dramatic intensity, they settle for envelope-pushing. For creativity they substitute titillation. Instead of works that give to virtue a local habitation and a name, they peddle spiritual pornography. Anything for an audience.
As snuff-films and "bug-chasing" erotic parties grimly testify, the last rung on the cultural ladder that leads to oblivion is a preoccupation with death--a desperate fascination born of the belief that only a random biological fluke distinguishes a living soul from a corpse.
What the latter viewers are likely to see on network television has changed considerably over the years. "I Love Lucy" comedies and "Bonanza" Westerns morphed into "Laugh In" irreverence and Norman Lear commentary wrapped in domestic dysfunction. More recently viewers were treated to "Married With Children" cynicism--sit-com sleaze with a smirk. The question that then arose for TV moguls was this: Where do we go from here?
Having sated audiences with visions of firm flesh--and having convinced them that casual copulation after puberty is infinitely less important than inhaling second-hand cigarette smoke--the folks who regularly substitute shock-value for dramatic depth were perplexed. It wasn’t just a matter of where to push the envelope. It was a question of what envelope to push. As his "Crime Scene Investigation" programs indicate, producer Jerry Bruckheimer topped the long-standing industry obsession with sex with a clutch of shows featuring vivid images of death.
"Corpses R Us" might be an alternate title for these forensic dramas that exchange a fleshless Freddy Krueger for grotesque shots of bodies lying in various states of disrepair--either at the crime scene or atop an autopsy table. Close-ups of no-longer-vital organs being probed for admissible evidence prompts visceral reactions from viewers no longer aroused by gratuitous shots of animate mammary glands. Highlighted tissue isn’t distinguished by its beauty, but rather by its vulnerability to decay and displacement. Over against this vision of decomposition, the lyrical assertion: "All we are is dust in the wind," seems positively romantic.
On top of these haunting visual displays, CSI Las Vegas patrons are often treated to philosophical disquisitions that equate human life with the remains accessible during a post-mortem exam. "That’s all we really are," observed the bearded protagonist, whose eyes gleam when he discusses roller-coaster thrills but who (like almost all the dramatic cast) has no room in his life for unnecessary chemical baggage--i.e., a wife and kids.
Donald Bellisario’s NCIS (Naval Criminal Investigative Service) joins Bruckheimer’s small-screen trio in exploring the commercial possibilities of pathology. Besides mortuarial moments that match anything displayed by CSI, NCIS adds to the scientific mix a brilliant, dark-haired woman who "just happens to be" a fan of BDSM. Her spiked necklace and Gothish appearance don’t make it clear whether Abby inclines more to the BD or the SM side of the erotic ledger. But her expertise and perkiness make one thing perfectly obvious--what once was considered perverse is now just another day at the office.
It isn’t surprising that folks in the entertainment business--bereft of intellectual depth or moral insight--should grasp at any straw they can lay their hands on to get an emotional rise out of audiences. In lieu of dramatic intensity, they settle for envelope-pushing. For creativity they substitute titillation. Instead of works that give to virtue a local habitation and a name, they peddle spiritual pornography. Anything for an audience.
As snuff-films and "bug-chasing" erotic parties grimly testify, the last rung on the cultural ladder that leads to oblivion is a preoccupation with death--a desperate fascination born of the belief that only a random biological fluke distinguishes a living soul from a corpse.
Monday, April 18, 2005
A LESSON IN HOW TO DIE
The Terri Schiavo case has led some observers to conclude that people of faith have a morbid interest in prolonging life at any cost. If true, this position would be highly ironic--given the beliefs that most pious individuals hold about an afterlife. The passing of Pope John Paul II provides, by contrast, a classic example of death in the Christian tradition.
Overwhelmingly, what is crucial to believers isn’t "life at any price" but rather the much-reviled notion that life isn’t ours to do with as we please. Instead, it is seen as a gift whose proper uses are divinely circumscribed. John Locke articulated this view as follows: "Since all men are the creation of one omnipotent and infinitely wise Maker... they are his property, made to live for his, not one another’s, pleasure."
This grant of life doesn’t require a person to employ any means necessary to prolong it. Indeed, in one’s final days decisions are often made not to engage in heroic measures to postpone what seems inevitable. Comfort is offered, last rites may be administered, but no surgery or resuscitation efforts are contemplated. In the language of ethics this scenario is called "allowing to die"--and it is perfectly compatible with all the religious traditions with which I am familiar. "The Lord giveth and the Lord taketh away. Blessed be the name of the Lord," is the passage from the book of Job often cited on such occasions.
A more complex end-of-life scenario involves the removal of life-sustaining mechanisms from individuals already utilizing these devices. When patients themselves request that "extraordinary" care be withdrawn, there is, by and large, no religious problem. These persons are simply choosing to halt treatment that, within the "allowing to die" paradigm, would never have been begun in the first place. When, however, others take this decision upon themselves, questions about substituted judgment and "playing God" arise--especially if doubt exists about the patient’s medical condition or the trustworthiness of a surrogate’s judgment.
Significantly, when life-support systems are discontinued, patients don’t inevitably die. Karen Ann Quinlan, for example, surprised doctors by continuing to live almost ten years after her breathing machine was removed. Such exceptions never occur, of course, when persons are denied not extraordinary care, but basic nutrition. Withholding food and water amounts to a death sentence whether one is sick or not. And when this act of deprivation is done without a patient’s consent, the result, morally speaking, becomes indistinguishable from homicide.
A final scenario occurs when someone actively brings about another’s demise by means such as lethal injection. In "mercy killing" the "angel of charity" becomes the immediate and active purveyor of death. Clint Eastwood’s character in "Million Dollar Baby" engaged in this illegal act for which, in real life, Jack Kevorkian now sits in a Michigan prison.
The religious objection to this option (as also to withholding nutrition) is that it clearly places ultimate responsibility for life in the hands of someone other than the One who created it. Both suicide and killing in the name of kindness are seen as acts of hubris by which individuals cross a bright line--a trespass that is even more flagrant when the coup de grace is unsolicited.
In a culture enamored with the notion of autonomy, religious concepts that highlight human boundaries seem outmoded. NBC’s Brian Williams, commenting on the Pope’s passing, observed that the pontiff faced death "on his own terms." It was a singularly inept description--a self-inflating compliment employed to describe the final act of a supremely selfless life.
"My body is my own. I’ll do with it as I please," was a slogan common in the 60’s and 70’s. As those same baby boomers reach the age when bodies and intentions increasingly march to different drummers, the shortsightedness of that boast becomes more apparent. John Paul’s last years were an object lesson in bearing with grace the cross of physical infirmity.
For persons of faith, the crucial issue isn’t "quality of life" but rather one’s willingness to acknowledge, up to the point of death, that we are in God’s hands. Practically speaking, this acknowledgment means maintaining an attitude of humility vis-à-vis the ultimate outcome. "Thy will be done."
As it was when we entered life, so it is when we leave. The life we "control" is vanishingly small. Our spirit springs from a source beyond ourselves and flourishes in a matrix of interdependence. For those whose beliefs continue to derive from the religious wellsprings of Western culture, death confronts us with truths that modern culture yearns to deny: Life is a gift, not just a right--and the measure of that life is not mere self-assertion, but gratitude and service.