Monday, October 27, 2025

An Enemy Within: America’s Pop Culture

What’s contemporary Top 40 music like?  It isn’t a question I would have been inclined to ask had my cable company not forced me to pass through that Sirius channel before tuning to soft rock, country, or classical offerings.  What caught my ear during those brief exposures, however, was the sameness of the bland sounds emitted before I switched to more familiar styles.  Eventually I decided to endure several consecutive Top 40 Sirius songs to see if my random encounters were representative, and lo and behold, they were!

Typically the artist’s voice is only slightly more prominent than the background synthesizer music.  To employ a visual analogy, that background sound would be captured by a palette of undifferentiated yellowish browns.  Rarely does one hear the distinctive musical instruments that were integral components of most offerings just a few decades ago.  Occasionally uninspiring piano and drum accompaniments are discernible, but there’s nothing like Mick Fleetwood’s drums or the brassy excellence of Chicago.

The musical scores in Top 40 are also hugely repetitive and generally possess all the range and complexity of bad elevator music—characteristics shared by the vocalists who thankfully rush through their lyrics à la Kendrick Lamar’s Super Bowl halftime fiasco.  This defect is occasionally remedied for teenage fans and maturity-stunted adults by videos that supply the often inaudible wording.  Failing that patch, lyrics are always accessible on the net.

Even the songs of Taylor Swift are, at best, incremental improvements on the above descriptions.  A columnist for the UK Times, Rod Liddle, had the courage to label Swift a “caterwauling blonde moppet” who produces “banal and life-sapping sub-Kardashian electropop drivel,” and for his literary trouble received a restraining order from the popstress’s lawyers.  Readers can judge for themselves the accuracy of Liddle’s review based on this hit romantically titled “Slut.”    

The moral tenor of contemporary Top 40 offerings, not surprisingly, coincides with the songs’ lack of acoustical virtuosity.  A piece that accurately reflects that congruence is innocently titled “Diet Pepsi” by a singer named Addison Rae.  The video consists of a series of bra-and-panties erotic poses in the front and back seats of a ’65 Mustang driven by a James Dean type.  A somewhat rappish singer named Lola Young, who climbed to number one on several Euro-charts, takes care of the narcissistic, f-bomb, parent-trashing spectrum with “Messy.”

The moral vacuity, of course, isn’t limited to hetero-normatives as this headline explains: “Lesbian viral sensation Gigi Perez has admitted that her queer anthem ‘Sailor Song’ changed her life.”  The full lyrics are linked here and include these lines addressed to her same-sex object of desire:  “I don’t believe in God, but I believe that you’re my savior” plus “And when we’re getting dirty, I forget all that is wrong.”     

I’m well aware that pop and rap songs aimed largely at kids from 9 to 19 aren’t a space to consult for uplifting moral sentiment, but nowadays the degrading, self-centered undertow has become a lyrical and visual squalor that’s reflected even in a poverty of musical excellence.  Whereas post-WWI Dadaism was a self-conscious mocking of established standards, today’s pop music represents, I think, a largely unconscious absorption of the id-generated desires promulgated preeminently in Hollywood.  Although these themes have been present in pop music for decades (most revoltingly in Madonna’s sacrilegious video “Like a Prayer”), today’s offerings, with their MTV-spawned visuals, are saturated with the basest lyrics that seldom reach beyond the singer’s primal desires--no more tying ribbons around old oak trees or occasionally teaching the world to sing in perfect harmony.

Does all this attention to Top 40 music really matter?  I’m afraid it does.  Recently Douglas Murray, at an Alliance for Responsible Citizenship meeting, mentioned a politician who proudly noted how the extraordinary events of October 7 brought forth extraordinary young people ready to meet that crisis.  Murray observed that her statement needed an important adjustment: it isn’t just extraordinary events that throw up extraordinary youths, instead “it’s extraordinary events when they come up against people who have been extraordinarily well cultured” (minute 7).

Is it possible that a society that countenances, praises, or ignores Top 40 garbage like the aforementioned can produce a sufficient number of extraordinary youths to meet future crises?  Combine those toxic sounds and images with their cinematic cousins and Americans face a clean-up task worthy of Hercules in the Augean stables.  It is encouraging that military recruitment spiked significantly following President Trump’s election, but the power of the “resistance” baked deep in our culture is akin to a cancer slowly metastasizing into a body’s vital organs.  It’s a disease that won’t be eliminated by four years of political therapy.  A more radical intervention will be required, the prerequisite to which is acknowledging the mortal danger itself.

Richard Kirk is a freelance writer living in Southern California whose book Moral Illiteracy: "Who's to Say?" is also available on Kindle.

 

Charlie Kirk's Legacy

In The Mission, Robert Bolt’s Jesuit priest (Jeremy Irons) is shot dead carrying a cross in a pacific procession while protecting his 18th-century Indian community from Portuguese troops whose masters seek to enslave them.  Two decades earlier Bolt dramatized a pious Sir Thomas More in A Man For All Seasons.  More’s papal-grounded opposition to Henry VIII’s divorce from Catherine of Aragon and marriage to Anne Boleyn is rewarded with imprisonment and, after the perjured testimony of a one-time admirer, beheading.  More recently, Clint Eastwood’s irascible character in Gran Torino, Walt, dies to save his young foreign friend by exposing those who threatened him.  Walt lies stretched out in cruciform after being pummeled with bullets by a neighborhood gang in plain sight of onlookers.

Those examples are cinematic creations that pay tribute in various ways to their spiritual predecessor.  Charlie Kirk, however, was a Christ figure in real life.  Far from being a “spreader of hate,” as asserted by politicians and commentators who specialize in that activity, Charlie practiced respectful dialogue and viewed even rude interlocutors as fellow humans created in the image of God.  Anyone taking the time to sample the many hours of dialogue available will never hear name-calling or personal insults proceed from his mouth.  Instead one witnesses an impressive willingness to listen to opposing comments.  Those who disagree are encouraged to “come to the front of the line,” and their often ill-tempered challenges are met with thoughtful, sometimes forceful, responses.  Frequently Charlie seeks points of agreement from which further dialogue becomes possible.

In encounters with exponents of the narcissistic relativism that permeates American campuses, Charlie sought to expose the vacuity of arguments that reject God or Nature’s God as the fundamental ground for morality.  “My heart” and “my truth” provide no rational counter-argument to “Mao’s truth” or “Stalin’s truth.”  Absent a more fundamental foundation, even “the greatest good for the greatest number” melts into a hodge-podge of undefined terms.  What is meant by “greatest good” or by “greatest number,” and how do we define the word “good”?  And why must everyone follow this formula if nothing undergirds it?  Such are the infuriating questions posed in Socratic fashion to students who have been told (not taught) that religion is foolish superstition, that the Western tradition represents the prejudices of old white men, and that morality is nothing more than a person’s own “values.”

It was galling to detractors that this vibrant, intellectually intimidating young man gained his insights by consulting the Western literary tradition and contemporary scholars like Thomas Sowell—especially taking to heart the wisdom contained in the Old and New Testaments. “Can anything good come out of Nazareth?” was a skeptical observation from two millennia past.  A contemporary echo of that comment would be “Charlie didn’t go to college and quotes the Bible”--an observation that condemns most of higher education rather than the intellectual stature of the autodidact who rejected schooling’s institutional corruption.  

As Dennis Prager regularly observed, “You have to go to college to believe some things.”  Men can become women and vice-versa.  A man who has undergone hormonal treatment to appear more feminine can compete on an equal playing field with women.  You are the sex (“gender”) you want to be.  A man who wants to be a woman is a woman and thus entitled to undress in the women’s locker room and to invade a women’s spa.  A mountain of evidence and common sense was available for Charlie to counter those absurdities--millennia of tradition plus obvious physiological, psychological, and chromosomal differences.  Similar arguments were presented to abortion advocates who equate “my body” with the nascent life having its own genetic structure within that body.

To religious or “spiritual” challenges that stress “Jesus gets us” tolerance, Charlie presented the Jesus of Scripture (John 8:1-11) who not only saved the woman accused of adultery from being stoned but also approached her and said, “Go, and sin no more.”  Yes, mercy is fundamental to Christianity, but so is the truth about sin.  The latter value, Kirk noted, has been forgotten or even condemned as “judgmental” not only by popular culture but also by a vast number of Christians.

Perhaps most of all Charlie was hated as the image of a personal goodness that exposes, like sunlight, the moral and spiritual decay of our culture and of individuals consumed by that decadence.  He was a faithful husband with two beautiful children, to whose welfare he was passionately devoted.  He followed a disciplined, spiritually informed health regimen: no drugs, regular exercise, healthy food and drink.  He was remarkably judicious in the use of his time, a trait inimical to a culture awash in slothful self-indulgence.  Such things arouse self-loathing when one looks in the mirror and sees the polar opposite.  Now his assassination has exposed the evil that brims with revulsion at an utterly decent husband and father who was skilled at respectful and forceful dialogue—a Christian devoted to God, family, and country.

 “Truly, truly, I say to you, unless a grain of wheat falls into the earth and dies, it remains alone; but if it dies, it bears much fruit” (John 12:24).  So may it be with the death of Charlie Kirk.  

Richard Kirk is a freelance writer living in Southern California whose book Moral Illiteracy: "Who's to Say?" is also available on Kindle.

 

                       

Thursday, October 23, 2025

On Democracies and Death Cults -- October 7 and the Gaza War

On Democracies and Death Cults by Douglas Murray

If any book could open the eyes of clueless pro-Hamas student protesters, Douglas Murray’s On Democracies and Death Cults would likely be the one.  Packed with eyewitness accounts of the horrific October 7 massacre and Israel’s subsequent response, the book leaves only ideological intransigents able to ignore the massive moral gulf separating Hamas from its Jewish enemies.

Far from offering a philosophical analysis, Murray addresses the Democracy-Death Cult clash by relating what he saw, what he gathered from interviews, and, incredibly, what was available via phone messages, social media posts, and filming of the atrocities—much of it done by the terrorists themselves.  “Using GoPro cameras and mobile phones the terrorists broadcast their acts of violence with pride.  By late in the day on October 7, it was already clear that these acts included burning people alive, shooting innocent people, cutting off people’s heads, and raping men and women.  Sometimes before killing them.  Sometimes after.”

The account Murray provides of the attack is vivid and personal.  Parents get messages of their children’s last desperate minutes, while Hamas fighters, unlike the Nazis, publicize the grisly torture they inflict on Jews.  Relevant detours into the history of Israel’s struggle for survival come as a relief, as do paragraphs devoted to the burgeoning populations of Gaza and of non-Jewish Israelis—figures that conclusively rebut the popular “genocide” accusations against Israel.  “Apartheid state” calumnies are likewise countered with facts about Arab participation in Israel’s government, even at the highest levels.

One Hamas terrorist, Yahya Sinwar, serves throughout Murray’s work as a singular representative of the Death Cult.  Sinwar was a leader of the October 7 attack but had been recruited into Hamas years earlier.  In 1988 he was imprisoned for the murder of four Palestinians he suspected were informers—crimes to which he proudly admitted.  One of the few Israelis who had regular contact with Sinwar in prison was a dentist, Dr. Yuval Bitton.  In 2004 Bitton noticed something was wrong with Sinwar and arranged for him to be sent to a medical center, where he was operated on for a brain tumor.  Bitton visited Sinwar in the hospital, and the latter thanked him for saving his life.

The story doesn’t end there.  In 2011 Sinwar was the highest-level prisoner released in a 1,027-to-1 swap for a young Israeli soldier who’d been imprisoned in Gaza for five years.  (Israel withdrew from Gaza in 2005.)  Upon release Sinwar immediately resumed his position in Hamas and advocated taking more Israeli hostages to free other Palestinians in Israeli jails--a tactic expanded on October 7 to include even dead Israelis.  On that same horrendous day a farmer, Tamir Adar, and his family were apparently killed in the Hamas attack.  None were ever heard from again.  Tamir was the nephew of Dr. Yuval Bitton.

Near the end of his book Murray recounts the killing of Sinwar a year after the initial massacre.  “Sinwar had been killed in Rafah, in the south of Gaza, in the place where Vice President Kamala Harris and many other international observers had insisted the IDF should not go.”  The comment about Harris and “other international observers” reiterates a point often made in the book, namely, the hand-tying, “proportional response” demands regularly imposed on Israel by world leaders who dismiss the devastating impact of Hamas and Hezbollah missiles on community life. “Why was the whole country so littered with bomb shelters that on the 7th people ran into them across the south and were promptly massacred inside them by Hamas?  How was this a way to live?  And who else would live like this?”  Even so, Murray notes that civilian casualties in Gaza have been exceptionally low by historical standards—a fact that didn’t prevent the International Criminal Court from designating Prime Minister Benjamin Netanyahu a war criminal.

Chapters three and four are largely devoted to Western responses to the October 7 Hamas atrocities.  That very evening “a great crowd of anti-Israel protesters had gathered outside the Israeli embassy in London, among other places, to celebrate the massacres of the day.  They waved flags and lit flares while shouting the same war cry and victory cry as the terrorists, ‘Allahu Akbar!’”  A Times Square protest against Israel occurred the next day “while Hamas terrorists were still murdering their way through the south of Israel.”  The general frivolity of student protests on American campuses, where chants of “intifada” went up alongside demands for more accommodating toilet facilities and “alternative milk,” contrasted blatantly with Hamas’s atrocities and with the courageous response to those acts occurring in Gaza and Israel, sometimes from women the same age as the privileged protesters.

Throughout the Western world these anti-Jewish protests proliferated, egged on by professors whose words would have gotten them fired if directed against gays or blacks.  One example of many: “Cornell University history professor Russell Rickford was filmed at an anti-Israel rally praising Hamas’s massacre and telling the crowd, ‘It was exhilarating, it was energizing.’”  What does it mean, Murray asks, that “on the streets of every major Western city, people who must have known what had been done on the 7th publicly took the side of the aggressors?”

A psychological explanation was previously given by Soviet novelist Vasily Grossman:  “Anti-Semitism . . . is a mirror for the failings of individuals, social structures and State systems.  Tell me what you accuse the Jews of—I’ll tell you what you’re guilty of.”  Murray expands this dictum to apply to the student protesters whose view of Western culture has been warped by radical leftists:  “Tell me what you accuse the Jews of—I’ll tell you what you believe you are guilty of.”  For Gazans and persons throughout the Arab world an historical explanation largely suffices, starting with the still-celebrated pact between Hitler and the Mufti of Jerusalem—a collaboration that continues to make Mein Kampf a bestseller.

These “explanations” comprise only a small fraction of Murray’s book, which is devoted overwhelmingly to describing what happened on October 7, how individual Jews responded, and how the Western world responded.  It is those journalistic details that make On Democracies and Death Cults a work that might even turn the heads of students more interested in performative protest than in the truth about good and evil, life and death.

Richard Kirk is a freelance writer living in Southern California whose book Moral Illiteracy: "Who's to Say?" is also available on Kindle.

In Praise of Hypocrisy

“Hypocrisy is the tribute that vice gives to virtue.”  That aphorism of La Rochefoucauld has fallen into desuetude, along with the word “desuetude.”  The saying was still in use in the mid-twentieth century but became virtually meaningless in popular culture after the sixties.  That was when the moral imperatives now popular came into fashion:  “Stand up for what you believe in” and “Be true to your values.”  Instead of being encouraged to be virtuous, the general public was told to affirm and exhibit for the world whatever their beliefs happened to be--hedonistic, nihilistic, Marxist, Christian, et cetera.

Simultaneous with this dubious moral revisionism, hypocrisy was promoted to number one on the scale of bad things, standing as it does in direct opposition to the aforementioned imperatives.  A hypocrite doesn’t outwardly embrace what he really believes in or values.  The question thus arises, how is hypocrisy in any sense a tribute to virtue?   

To answer that question one must explore the dramatic origin of the word “hypocrisy,” literally “an actor under a mask.”  As thus understood, the idea of “pretense” is a necessary component of the term—an element now often ignored.  And what the moral “actor” pretends to be is virtuous, or at least more virtuous than he really is. Given this meaning of the word, simply failing to be true to one’s values would not make one a hypocrite since neither pretense nor virtue need be part of that behavior.  The ubiquitous “I’m only human” excuse would suffice to provide secular absolution for any disconnect between values and performance.

It’s only when an individual pretends to be more virtuous than he actually is that hypocrisy in the aphoristic sense comes into play.  The reason for pretending to be virtuous is that virtue is, or at least was, generally recognized as superior to vice.  This recognition of virtue’s superiority (even if only pretended for public consumption) is the “tribute” vice gives to its opposite number.

We can thank the famous French philosopher of the 1960s, Jean-Paul Sartre, for the aforementioned moral revisionism that replaced objective moral standards with self-defined mores and substituted “authenticity” for virtue.  Being “authentic” involved embracing one’s own actions and standards of conduct.  Consequently, “hypocrisy” was transformed into the vilification of “inauthentic” persons who failed to embrace their own actions or standards of conduct.  Nowadays “hypocrite” has become the only judgmental epithet many persons are willing, and eager, to employ.

The most pernicious use of this redefined term is to vilify persons who don’t live up to the high standards they espouse, thus making it equivalent to the word “sinner” or, in more pedestrian terms, “imperfect.”  It’s true that a hypocrite in the traditional sense “pretends” to be something he is not, but it is not the case that someone who fails to live up to exalted moral standards is a hypocrite.  A person who fails to clear a traditional moral bar set at seven feet isn’t a hypocrite unless he pretends otherwise.  

Yet thanks to today’s linguistic legerdemain all morally serious persons, people whose ideal of virtue exceeds their grasp, have become hypocrites.  Moral zeroes, by contrast, are deemed “honest” or “true to themselves” if they set their moral bars flat on the ground and step triumphantly over them.  No one accused Howard Stern (at least not back in the day) of hypocrisy.  Instead his shamelessness, formerly at or near the bottom on the scale of vices, was embraced by the cultural avant-garde.  Stern openly and profitably disparaged traditional standards of virtue.  

Thus, in this topsy-turvy world of setting one’s own moral standards, the ethical playing field is hopelessly slanted in favor of shamelessness.  The rules of the game encourage everyone to place the moral bar as low as possible and to prize being non-judgmental above all else.  Anyone who dares raise the bar of virtue high will be pummeled with charges of hypocrisy for failing to be perfect (cf. William Bennett, The Book of Virtues).     

Being a hypocrite in the traditional sense isn’t a good thing, but it’s better than shamelessness.  The latter doesn’t pay tribute to virtue at all, whereas the former exists in a world where virtue is an objective good honestly pursued by imperfect people and sometimes indirectly honored even by those corrupted by vice.  

 

Taking Religion Seriously -- Review of Charles Murray's Book

 

Taking Religion Seriously isn’t a book you would expect from a political scientist best known for Losing Ground and The Bell Curve.  It’s not surprising, however, that a man eighty-two years of age should ponder the topics addressed in this brief work, which can be read in a few hours.  Broadly speaking those topics are God, morality, and Christianity.

Though Murray claims no special expertise on those matters, it’s obvious he’s devoted considerable time to exploring the subject matter--a largely intellectual journey that began three decades earlier with his wife’s pursuit of a religious community congruent with her profound experience of motherly love.  The latter search found a suitable destination with the Quakers.  Charles’ more intellectual investigations are summarized in this book which offers tentative conclusions plus a plethora of books suggested for further investigation. 

Part One of the book, “Taking God Seriously,” begins with the aforementioned spiritual awakening experienced by his wife, Catherine, whose “love for her [newborn] daughter surpassed anything she had ever known.”  It was, in her words, “far more than evolution required.”  Murray then focuses on his own spiritual limitations, discussing youthful Peace Corps experiences in Thailand and his largely unsuccessful attempts at meditation.  Those experiences, however, led him to see that people have “perceptual deficits” as well as talents that facilitate the ability to appreciate music, art, and spirituality.  This lack of spiritual perceptiveness, he notes, is reinforced by “Western modernity,” which shelters most of us from tragic aspects of life, like the death of children, that until recently plagued all people.

Murray’s “secular catechism” in chapter three provides a succinct summary of the beliefs one is likely to inherit via cultural osmosis or higher education.  Those materialistic assumptions dismiss religion and reduce humans to highly evolved animals living on a “nondescript planet on the edge of a nondescript galaxy in a universe with a billion galaxies.”  Murray points out how unreflective that creed is, ignoring fundamental mysteries like the amazing relationship of mathematics to the physical world and even failing to ask seriously why the universe itself exists.

Those observations lead to thoughts about the Big Bang and its relevance for the idea of God, observations whose detailed mathematical elements can be skimmed over by non-physicists and reduced to one conclusion: the odds of there being a universe at all are vanishingly slim. This analysis is essentially the cosmological argument for the existence of God employing unimaginable exponents like 10 to the 10th power raised to the 123rd power—a number that has more zeroes “than there are elementary particles in the entire universe.”  The jury is out on the precision of that number, as it is on an alternative theory Murray acknowledges but can’t embrace--the existence of “multiverses” that account for such long odds.   

Part One ends with unexpected data offered to challenge the prevailing materialistic assumption that mind and consciousness are reducible to the brain.  That evidence includes near-death and paranormal experiences.  Murray is well aware of the unreliability of many of these reports but points out that scientific analysis of some of these incidents is beginning to occur.  He’s also unwilling to dismiss out of hand evidence not amenable to rigorous scientific methods.

In Part Two Murray turns his attention to Christianity and begins by discussing its essential contribution to the cultural efflorescence of Europe from the 15th to the 19th centuries.  A major component of that development rested on Western science’s implicit faith in the universe’s rationality.  That faith, as the mathematician and philosopher Alfred North Whitehead observed, was itself rooted in the medieval “insistence on the rationality of God.”  Murray further notes that the modern decline of Western art and literature coincides with the decline of Christianity in the culture.  Quoting from his own book, Human Accomplishment: “Is it not implausible that those individuals who accomplished things so beyond the rest of us just happened to be uniformly stupid about the great questions?”  Bach, Murray argues, “made a prima facie case that his way of looking at the universe needs to be taken seriously.”  The same argument goes for other artists who held truth, beauty, and the good as their primary motivation rather than the expression of their own feelings.

C. S. Lewis’ Mere Christianity and morality are the focus of subsequent chapters.  Murray proposes, with Lewis, that at least certain moral concepts appear in all cultures and that this fact suggests a transcendent source beyond evolutionary explanations.  He then discusses what for me was the most interesting material in the book, namely observations about the dating and historical reliability of the gospels.  (That’s saying a lot, since I graduated in 1975 from Emory University’s Candler School of Theology.)  Murray does a good job of questioning the generally accepted revisionist views on those questions and provides cogent reasons for believing earlier dates are plausible and for thinking the gospel accounts of Jesus are largely accurate. Those reasons include, among others, the tradition of oral transmission in Judaism as well as detailed analyses of Jewish names and geographical references. 

 Murray’s foray into Christian belief ends with surprising observations about the resurrection, ideas that are seldom put forward as cogently by anyone with his intellectual pedigree. The final topic of that analysis concerns somewhat tangential but intriguing facts about the Shroud of Turin. Currently those facts seem to debunk a prior medieval dating and to support an unexplained origin of the image in an area and time consistent with the crucifixion of Jesus.  Obviously that preliminary evidence doesn’t prove anything about a resurrection, but it does show my casual dismissal of “just another medieval relic hoax” was premature.

In sum, Taking Religion Seriously provides significant food for thought as well as numerous sources to further investigate the truly fundamental issues it raises:  God, morality, and Christianity.   

Richard Kirk is a freelance writer living in Southern California whose book Moral Illiteracy: "Who's to Say?" is also available on Kindle.

 

 

MCMA: Making California Mexico Again

 Why do Californians continue to vote for destructive Democrats like Karen Bass, Gavin Newsom, and their political clones despite their incompetence at basic duties like fighting fires and facilitating post-disaster reconstruction?  As Victor Davis Hanson notes in devastating commentaries, the state has a third of the nation’s welfare recipients, half its homeless, crumbling infrastructure, staggering home and rent prices, massive social service expenditures for illegals, shocking electricity rates, aging power lines that spark wildfires, and (until a voter initiative last November reversed it) legalized looting.  That’s a short list.   

Recently the overwhelmingly Democrat state legislature passed a bill that enabled a sixty-five-cent increase in energy prices based on the climate hysteria that infects the state’s coastal elites and journalists.  That tax will doubtless fall primarily on gasoline and other utilities.  Predictably, a group called California Environmental Voters Education Fund began running a commercial for the state’s voters, who sport an illiteracy rate near 25% among persons 15 and over.  It blamed California’s high gas and electricity prices on “dirty energy greed.”

It’s a slick 30-second spot that begins with a printed warning, “Dirty energy companies are costing Californians,” then shows a lady sweating an electricity bill of $277.  Next comes a gas pump reading that spikes from $83 to $101.  A closing scene features oil company reps clapping happily at Wall Street’s closing bell as superimposed profit figures rise from $166 billion to over $200 billion.  A final plea reads, “Don’t make Californians pay for dirty energy’s greed,” followed by the website: FightTheGreed.org.

It’s a message designed to appeal to semi-literates, stoners, Hollywood groupies, and coastal elites who may know better but also know which party butters their bread by providing a massive influx of illegal and H-1B labor.  The spot isn’t meant for anyone sharp enough to discuss California’s highest-in-the-nation 61-cent-per-gallon gas tax or to ask why oil companies aren’t so greedy in other states, much less for anyone who ponders the price to be paid for progressively eliminating California’s oil production and refinery capacity.

 You can be sure the Silicon Valley gang will fund similar propaganda pieces for any legislation that threatens their stranglehold on the 60/19 (House), 30/10 (Senate) Democrat majorities in Sacramento.  But you can also count on local news programs to function as lapdogs for Democrat politicians.  A Karen Bass press secretary could hardly formulate more sympathetic and uncritical coverage for the mayor who partied in Africa while her city burned, fired her DEI fire chief when the latter criticized department budget cuts, and pretends that cleaning up rubble after seven months constitutes lightning progress. 

The same local news almost never identifies criminal “migrants” as illegal and, needless to say, employs a consistent anti-ICE narrative.  ACLU lawyers are featured denouncing the racist “targeting” and “kidnapping” of anyone who’s “brown.”  And despite unavoidable coverage of the real or attempted gunfire during ICE’s marijuana-field raid last month, the news slant was clear: “protestors” are defenders of “the community” and ICE is getting what it deserves.  Little or no mention is made of the fact that the farm used hundreds of illegal aliens as laborers, including, according to government sources, fourteen minors.

When I moved to the San Diego area in 1984, Ronald Reagan was about to win the state’s presidential vote overwhelmingly (57%) and for the second time.  Those POTUS elections followed two terms served by the Gipper as governor (’67-’74).  For sixteen straight years, from 1983 to 1998, California had GOP governors (Deukmejian and Wilson).  Today no Republican has occupied any statewide office since 2011 and, as noted above, Democrats possess supermajorities in both state houses.  So what accounts for this radical shift?  Primarily, demographics.

In 2012 Ann Coulter noted that California’s non-Hispanic white population over the prior half-century fell from 80% to half that number.  By 2022 the percentage was pegged at 33.7%.  Meanwhile the Latino population grew from less than 10% to over 40%, a number that includes 2.6 million “undocumented immigrants” according to a 2022 estimate by Mayorkas’s Department of Homeland Security—a figure that doubtless grew over the final two years of Autopen’s open border.

So it wasn’t necessary to change the minds of Californians.  Just change the population residing in the state and keep a large percentage of them poor and uneducated—much like the pyramidal population of Mexico.  The Democrat plan was to do the same with Texas.  So far that plan has backfired, as traditional Latinos along the Rio Grande rebelled against the invasion that upended their lives via crime, drugs, and the sheer number of strangers who ignored their property rights just as they ignored the American border.  Barring a catastrophe that makes the L.A. fires look like a weenie roast or some dramatic and unexpected political about-face by the state’s Latinos, it’s likely the poor, ill-informed, 27% foreign-born population of California will continue voting to Make California Mexico Again.

Richard Kirk is a freelance writer living in Southern California whose book Moral Illiteracy: "Who's to Say?" is also available on Kindle.

 

LEFTISTS SEXUALIZING CHILDREN

 

According to sociologist Peter Berger, his discipline is “an intrinsically debunking discipline that should be congenial to nihilists, cynics, and other fit subjects for police surveillance” and the popular suspicion of sociology is grounded in “a sound instinct for survival.” That quotation is found in historian Page Smith’s fine book, Killing the Spirit (1990). An example supporting Berger’s warning is a recent paper published in the American Sociological Association’s journal Sex and Sexualities titled “Childhood Sexualities: On Pleasure and Meaning from the Margins.”  

Not surprisingly, the paper’s primary author, Deevia Bhana, holds the South African Chair in Gender and Childhood Sexuality at the University of KwaZulu-Natal.  The relatively short article (c. 3,000 words) is written in the opaque intellectual jargon that has dominated academic discourse for the last half-century.  Here’s a typical paragraph:

Re-centering pleasure at the margins therefore confronts both colonial and heteropatriarchal logics, insisting that children’s own accounts of what feels good, exciting, or frightening are legitimate sources of knowledge. Thus, letting children “do” sexual pleasure in their own way is vital for their sense of their own agency. Yet, only by tracing the circuits in which race, class, gender, and age secure or foreclose pleasure can we theorize children’s sexual worlds.

After wading through Bhana’s “scholarly” verbiage, the Manhattan Institute’s Dr. Colin Wright observed that the peer-reviewed paper views “childhood innocence” as a “colonial fiction” and that it urges us to see children as sexual beings.  Wright concludes, “It is hard to read this [paper] as anything other than laying the intellectual groundwork for dismantling age-of-consent protections.”  Beyond Bhana’s tributes to age, race, class, gender, and other power-based distinctions, it’s clear she minimizes the danger of sexualizing children and embraces “diverse” expressions of childhood pleasure.  

Unfortunately, the sexualization of children has been going on in plain sight since well before the abstruse intellectualizing that puts a sociological seal of approval on that corruption.  Bhana only seeks to add “intersectional” categories to the “educational” process of confusing youngsters about who they are by destroying “colonial” and “heteronormative” notions about childhood sexuality.  For her, drag queen story hour would merely be part one of an indoctrination program touting victimhood and pre-teen sexual “agency,” a code word for making your own rules and doing your own thing.

The latter task is something at which Hollywood and the film community have excelled for some time, the case of Roman Polanski being exhibit one.  Polanski pled guilty to having intercourse with a 13-year-old girl in 1977 but still received a slew of honors subsequently, including an Academy Award for Best Director in 2003.  Doubtless the overtly sexual content of music and music videos beginning in the MTV era moved the sexual needle from just teenagers toward the 9-12 “tweener” category.  More recently the “trans” movement with its companion “Drag Queen Story Hour” has exposed even younger kids to sexual expressions formerly confined to adults at Bourbon Street night clubs.

Not wishing to be branded as “judgmental” or “conservative,” ABC and Disney’s Good Morning America in 2018 featured an 11-year-old “Drag kid” (stage name: “Desmond is Amazing”) whom host Michael Strahan and the studio audience cheered enthusiastically during his sexually suggestive performance.  Subsequently, as Matt Walsh notes, “Desmond” graduated to dancing at gay nightclubs where patrons threw money at him.  Walsh also highlights several other “Drag kid” and “Drag Queen” atrocities that a few decades ago would have warranted arrests for indecent exposure and parental child abuse.  Just recently Elon Musk called on customers to cancel their Netflix subscriptions based on transgender themes in, among other offerings, the company’s animated show “Dead End: Paranormal Park.”

On the legislative front, California, as usual, leads the nation: a law signed by Governor Newsom in 2020 gives judges discretion about whether to list someone as a sex offender for having “voluntary” oral or anal sex with a minor.  The bill was promoted as bringing fairness to LGBTQ defendants.  In addition, a bill passed in 2022 (SB 107) made California a sanctuary state for minors seeking sex change treatments and surgeries proscribed in other states.  In a similar vein, in 2024 California legally prohibited schools from requiring parental notification when students change their gender designation at school.

Elsewhere in the world, the United Kingdom’s government became brazenly deferential toward non-Western cultural standards vis-à-vis sex with and assaults against minors.  Recent convictions of gang members engaged in the grooming and rape of girls as young as ten may signal, however, that Britons are finally ready to return to more traditional or “colonial” (cf. Bhana) anti-rape mores. 

Given the academic and media-fueled frenzy in favor of deconstructing centuries of legal and cultural prohibitions against sexualizing minors (a deconstruction personified in the life and writing of its Bhana-cited intellectual avatar, Michel Foucault), one could view Jeffrey Epstein’s notorious crimes as just another degenerate by-product of our era, perhaps slightly ahead of his time.

Richard Kirk is a freelance writer living in Southern California whose book Moral Illiteracy: "Who's to Say?" is also available on Kindle.

Monday, April 28, 2025

"Thank You, Dr. Fauci" Film Should Doom the Doc, But Likely Won’t

If a large number of the right people view Jenner Furst’s documentary, Thank You, Dr. Fauci, it should at the very least invert Anthony Fauci’s public reputation from that of the courageous voice of science to that of a mendacious bureaucrat whose reckless obsession with genetically altered vaccines paved the way for a pandemic that cost at least twenty million lives.  The likelihood of that reversal of fortune is, at the moment, slim, since the same tech, pharma, and political powers that successfully quashed the truth about the Covid lab leak and the Covid vaccines continue to exert outsized control over the means of communication throughout the U.S. government and the media world.

That last sentence summarizes Furst’s depressing conclusion, based on a mountain of evidence from documents and expert testimony.  The most prominent of those experts is Dr. Robert Redfield, the director of the CDC at the time of the pandemic’s outbreak in 2019 and 2020.  Redfield occupied the highest public health position in the Trump administration but was curiously “iced out” of the small behind-the-scenes meetings conducted by the head of the National Institute of Allergy and Infectious Diseases (NIAID), Dr. Tony Fauci.  Redfield explains this odd omission by noting that he possessed a trait not shared by Fauci and his collegial entourage: “I’m honest.”  Combine honesty with Redfield’s belief (echoed by his Chinese CDC counterpart, George Gao) that the 2019 coronavirus outbreak in Wuhan was the result of a lab leak, and Redfield became in Fauci’s eyes as dangerous as the genetically altered virus itself.

As Furst shows, convincing evidence exists that Fauci was funding Wuhan’s dangerous gain-of-function research, research so hazardous it was officially banned in the U.S. from October 2014 through 2017.  But this technique was an integral part of Fauci’s lifelong medical goal -- namely, the creation of a vaccine through genetic manipulation to treat potentially devastating viruses (like HIV and Ebola) that might emerge from nature.  To achieve this Nobel-winning breakthrough one must first create in the lab the dangerous virus itself.  It might seem absurd to create a virulent virus that can be easily transmitted between humans based on the fear that a similar virus might emerge from nature.  But according to Dr. Fauci and his colleagues, Drs. Peter Daszak and Jeremy Farrar, that’s a risk worth taking -- an assertion made even after one such virus (Ebola) likely escaped in 2014 from a Sierra Leone lab in West Africa and after the H1N1 virus, according to Dr. Michael Osterholm, admittedly escaped from a lab in 1977.

It’s also amazing that Redfield’s point of view has been as effectively iced out by the media as it was from the Fauci-constituted group that early in 2020 began dumping articles around the science world confidently asserting the natural origin, via bats, of the Wuhan virus.  Concerning one influential publication that appeared in The Lancet and denounced other views as “conspiracies,” Redfield observed, “. . . it’s not even a scientific paper.  I call it a wedding announcement.”  As for the March 17 “Proximal Origin” paper that went viral, Redfield judged it “very sloppy science.”  Coincidentally, two members of Fauci’s inner circle who were associated with the 2014 Sierra Leone lab (Kristian Andersen and Robert Garry) had a grant proposal worth $8.9 million on Fauci’s desk as they began pushing the natural-origin coronavirus explanation.  Furst also notes that “bat” explanations were put forward by Peter Daszak et al. to explain the Ebola outbreak in 2014.

Beyond Redfield’s comments, Furst’s film provides a clear explanation of the gene sequence within the Wuhan virus -- a biological fingerprint that provides clear evidence of genetic manipulation.  Not only does the virus possess a so-called “furin cleavage site” that promotes transmission to humans (a characteristic one prominent scientist said “stands out like a sore thumb”), it also contains, incredibly, genetic elements related to the HIV virus.  A scientific paper announcing these findings (Pradhan et al.) appeared early in 2020 but was quickly withdrawn from publication “under intense pressure,” according to Furst.  That study was also deleted from the internet and soon buried under the aforementioned Fauci-instigated “propaganda” blizzard.

Among other points of interest exposed in Furst’s film is that the Wuhan virus began its spread in August or September of 2019 and that the city’s October military athletic games (with participants from Italy, France, Iran, and the U.S.) served as the original “super-spreader” event -- an opinion again shared by the effectively muted Dr. Redfield.  Importantly, though more indirectly, the documentary probes links between Dick Cheney, U.S. Intelligence, bio-terrorism, the anthrax “hoax,” and Tony Fauci.  The bottom line of that inquiry implicates Fauci as a powerful insider with “billions of dollars to fund the riskiest research in the world with no oversight.”  This bio-weapon “defense” research was an enterprise in which Peter Daszak, a Fauci associate with major Wuhan (and likely intelligence) connections, was also involved.  Indeed, Furst’s documentary notes that Daszak’s DEFUSE proposal for the DOD (which was officially rejected) sought to create a virus just like Covid a year before the pandemic’s outbreak.  Put succinctly, and inverting a Vietnam oxymoron, Fauci, Daszak et al. hoped to create deadly bio-weapons in order to destroy them.

Furst’s documentary also reveals that Fauci’s net worth of seven million dollars at the beginning of the pandemic ballooned to twelve million a few years later, a Pelosi-level windfall not entirely explained by the career bureaucrat’s $434,000 annual salary.  Beyond whatever advance Fauci received for his self-congratulatory memoir, On Call, it’s possible that the $690 million in royalties pharmaceutical companies like Moderna sent to Fauci’s NIAID provided some financial uptick.  Coincidentally, the founder of Openthebooks.com, Adam Andrzejewski, an otherwise healthy 55-year-old man and marathon runner who uncovered information about those huge payments, died in his sleep of a “sudden illness” soon afterward.

If Dr. Redfield’s observation that the COVID virus was “an act of scientific arrogance” seems harsh, it pales in comparison with the comments of other prominent individuals.  Dr. Richard Ebright: “We had willful, deliberate misfeasance that likely caused a pandemic, killed 20 million, and cost 25 trillion dollars.  Against that context it is unsurprising that individuals would prefer a lie.”  Prof. Jeffrey Sachs: “Fauci knew and lied repeatedly.”  Dr. Marty Makary (President Trump’s nominee to head the Food and Drug Administration): “Dr. Fauci claims that they’re creating more dangerous viruses in the lab to predict a pandemic or to prevent one.  It has never happened.  It’s a fantasy.”  Dr. Andrew Huff (a former associate of Peter Daszak): “I think the only way forward is a military tribunal to hold the long list of bad actors accountable.”

I should note in closing that Furst is a lifelong leftist, thus his film’s short references to Republicans (outside of Senator Rand Paul) are uniformly negative (Reagan vis-à-vis AIDS, Trump on Covid, and, perhaps more deservedly, Dick Cheney).  He gives no credit to conservative media that seriously questioned the mainstream media’s Fauci idolatry.  However, Furst’s research for the film opened his eyes, and he now says, in an interview with Tucker Carlson, that he’s become non-political.  I would liken his ideological awakening to that of Christopher Hitchens, who compared losing his devotion to Soviet-styled leftism to losing a limb.  But anyone who, like Furst, can seriously assert that both Rachel Maddow and Joy Reid are intelligent and well-meaning is clearly still under sedation for removal of that limb.