Thursday, May 25, 2006

“Everyone’s selfish.” That’s the commercial wisdom that’s been pounded into morally flaccid skulls over the last few decades. The recently departed Libertarian presidential candidate, Harry Browne, even enshrined this popular idea in “The Unselfishness Trap,” an essay that graces a widely used ethics anthology.
According to this revisionist logic, people who do what they want to do are selfish. And since everyone—Pope Benedict, Donald Trump, and Richard “the shoebomber” Reid—does what he wants, everyone is selfish.
It’s a clever rhetorical sleight of hand that manages to overturn, without further ado, two thousand years of moral reasoning that portrayed selfishness as a grave vice. Now, armed with this cheaply attained insight, conspicuous consumers and media narcissists can dismiss the trait as an empty label that moralists attach to actions of which they disapprove. At least that’s the conclusion Harry and company came to after five minutes of not-so-strenuous philosophical lifting.
Getting folks to think seriously about morality is, indeed, a daunting task—especially in a culture where individuals are inundated with messages designed to reinforce their worst impulses. “No rules, just right.” “Obey your thirst.” “Just do it.” “You’ve got to make yourself happy first.”
Anyone who can’t spot the flaw in the aforementioned selfishness argument would do well to begin a rigorous moral exercise program—say, ten pages of Aristotle, Augustine, or Aquinas a day. Heck, you can throw in Confucius or the Tao Te Ching for diversity’s sake.
The mistake that modern rationalizers make is to define selfishness as “doing what you want to do.” This is not, and never has been, the definition of selfishness. (Indeed, for Aristotle the very definition of a virtuous individual is a person who both does and wants to do what is right.) Instead of “doing what you want,” selfishness is properly defined as the quality exhibited by those who focus inordinately on their own interests—and act accordingly. Selfish people, in other words, regularly fail to look at things through the eyes of others.
By introducing a bogus definition of the word “selfish,” revisionists succeed in undermining the concept itself. For if every single action is selfish, then the phrase “acting selfishly” is transformed into a useless redundancy.
An embarrassing question that arises from this short analysis is this: Which persons and groups are most anxious to do away with the concept of selfishness? It doesn’t take a media genius to figure that one out.
A final question takes this form: Why do contemporary Americans so easily confuse rational self-interest, willing acts of sacrificial love, and craven egotism? The answer, I think, is that people who are brazenly superficial and stunningly narcissistic prefer to employ moral mirrors that are thoroughly muddled.
Friday, May 19, 2006
ORLANDO PATTERSON AND THE "COOL-POSE" BLACK CULTURE
Orlando Patterson recently dropped an ideological stink bomb on his chums at the New York Times. In an article published March 26th, the Harvard sociologist notes not only the “disconnect” that exists between “millions of black youths” and “the American mainstream,” but also “the failure of social scientists to adequately explain the problem.”
This failure, according to Patterson, has been caused by “a deep-seated dogma that has prevailed in social science and policy circles since the mid-1960’s.” That dogma rejected “any explanation that invokes a group’s cultural attributes” and focused instead on “structural factors like low incomes, joblessness, poor schools and bad housing.”
Translating socio-speak into plain English, Patterson is saying that his professional colleagues have been unwilling to admit that a corrupt subculture has had a devastating impact on millions of young black males. Instead, academics have put all their analytical eggs in economic baskets that exclude the domain of morality.
To the chagrin of ivory tower Marxists, Patterson observes that “countless studies… have found that poor schools, per se, do not explain why after 10 years of education a young man remains illiterate.” Nor do they explain why young black females do so much better than their male counterparts.
What does explain this phenomenon is the group approval given to young males who assert their manhood by shunning literacy, assuming a “cool-pose” persona, and engaging in acts of two-bit bravado. In Patterson’s words, “For these young men, it was almost like a drug, hanging out on the street after school, shopping and dressing sharply, sexual conquests, party drugs, hip-hop music and culture...”
Despite these insights, Patterson isn’t yet prepared to jump professional ship and devotes much of his time to taking back with the left hand what the right hand has given. Thomas Sowell, a scholar not subject to the blacklisting pressure that permeates most campuses, more forthrightly fingers the “redneck” culture that is ruining the lives of many young blacks—a culture whose roots run back to the decidedly non-black borderland between England and Scotland.
If Patterson and his cohorts had the courage of a Bill Cosby, their analyses wouldn’t tiptoe around an issue that is obvious to anyone with a modicum of sense and the honesty to report it: Subcultures that glorify violence, indolence, and promiscuity affect youngsters every bit as much as vicious racist stereotypes or economic hardship.
The problem with most sociologists isn’t a paucity of data but a lack of backbone. They are loath to criticize the pimps in the music and entertainment industry lest they offend powerful political and cultural players whose power grows in tandem with black degradation.
Senator Daniel Moynihan once remarked, regretfully, that there’s good money to be made in bad schools. The same sociological insight applies to a corrupt subculture.
Tuesday, May 09, 2006
UNITED 93: STANDING UP FOR WHAT YOU BELIEVE IN?
According to one voice on a radio promo for United 93, the film is about “standing up for what you believe in.” I can hardly imagine a more inane—even if thoroughly predictable—statement.
I know that moral illiteracy pervades this country far more than the substantial illiteracy of letters. Still, I would have thought that some measure of care would have been given to a statement that describes the import of what the passengers did on United 93. Instead, a vacuous cliché is employed to tout acts of courage performed in the service of life and decency.
Does it not occur to people who employ this ubiquitous phrase as a badge of moral distinction that the terrorists responsible for killing almost three thousand innocent people on 9/11 were also “standing up for what they believed”? Or is thirty seconds of ethical reflection too much to ask of Americans eager to equate passion with admirable behavior?
As long as we are handing out kudos for simply acting on one’s beliefs, why not give lifetime achievement awards to Joseph Stalin, Mao, Adolf Hitler, and Pol Pot? In the American division, credit could go to George Wallace for standing in the doorway of the University of Alabama in defiance of a federal court order giving blacks the right to attend the institution. Bull Connor might also get a “standing O” for his more forceful interventions on behalf of Jim Crow laws.
Depending on the meaning of the elastic phrase “standing up for,” one might place Timothy McVeigh in the distinguished company of persons who persist in their beliefs—no matter what.
In other words, it seems to occur to no one in the editorial business, or elsewhere, that “standing up for what you believe in” is a phrase perfectly consistent with pig-headed racism, ideological intransigence, and even mass murder.
Given these semantic shortcomings, one might ask why this phrase has become so popular with Americans. The answer, I think, is precisely because the phrase possesses no moral content whatsoever. Consequently, it can be employed to sanction whatever course of action anyone happens to pursue—provided the person “embraces” that act.
Thus, immoral scumbags like Howard Stern are now praised for “honesty” while the once-loathsome word “shameless” is consigned to the dustbin of cultural history. Gone is the now-incomprehensible aphorism: “Hypocrisy is the tribute that vice pays to virtue.”
Vice and virtue—there’s the rub. What many Americans want is a way of talking that dispenses with those inconvenient, judgmental terms. And making terrorists the moral equals of those who oppose them is a small price to pay for the “freedom” that is bestowed on individuals who enthusiastically embrace such language.
Tuesday, May 02, 2006
SCHOOL SHOOTINGS ARE HO-HUM
“She would have been a good woman,” the Misfit said, “if it had been somebody there to shoot her every minute of her life.” Those were the words an escaped convict spoke over the lifeless body of a self-absorbed grandmother.
The statement in Flannery O’Connor’s short story, “A Good Man Is Hard To Find,” suggests that goodness only comes to the fore in certain people when death is staring them in the face. Were O’Connor alive today, she might have to revise that thought.
April 20th was the seventh anniversary of the day in 1999 when Dylan Klebold and Eric Harris killed thirteen innocent people at Columbine High School in Littleton, Colorado. This year five teenagers in rural Kansas picked that date to attempt a similar massacre. A few days later a handful of seventh-graders—yes, seventh-graders—in a small Alaska community were rounded up for hatching a similar plan.
Almost as depressing as these foiled attempts at homicide were the comments that some adults made about them. One school official gave thanks that “a student felt they could talk to an adult.” The PC grammar of this benediction blends seamlessly with its presumption about kids who are reluctant to betray their peers—even those contemplating mass murder. What could be worse, after all, than being known as a snitch?
Even more discouraging was this remark made by an Alaskan police chief: “People are people. Something like this can happen anywhere…”
Unfortunately, events like this have happened throughout the United States. Last November in Campbell County, Tennessee, a fifteen-year-old shot and killed an assistant principal and wounded two others. And in March of 2005, Minnesota’s Red Lake Indian Reservation became the scene of a slaughter that left five students and two adults dead—all killed by a 16-year-old.
Other post- and pre-Columbine locations include Santee, California; Springfield, Oregon; Pearl, Mississippi; and Jonesboro, Arkansas. That partial list doesn’t include, of course, “unsuccessful” plots like those in Kansas and Alaska.
The truly breathtaking part of the officer’s statement was his glib observation that “people are people”—as if kids randomly shooting their classmates is part of the human condition.
Are adults really so mindless as to believe this nonsense? Children in the United States weren’t turning schools into mortuaries two generations ago. Nor can one rattle off a string of cities in Japan or Italy where barely pubescent males have planned or carried out mass killings at school.
Yet faced with horrors that occur with astounding regularity, many Americans seem to have accepted barbarism as a fact of life. “People are people!” Such folks could be shot every day of their lives and still remain oblivious to the depths of the cultural depravity around them. Even mass juvenile murder is preferable to the truth.