Friday, March 13, 2026

Talarico, With His Left Hand on the Bible

Calling Texas’s Democrat senatorial candidate James Talarico, even derisively, a “bible-banger” is a disservice to bible-bangers.  At least the latter take the book seriously, whereas Talarico uses it as a prop to further his far-left Democrat agenda.  His Wikipedia page even lists “opposition to Christian Nationalism” first among his political positions -- before the legalization of cannabis, expansion of the Supreme Court, and a two-state Palestinian solution that vilifies Israel for “atrocities” and “war crimes” in Gaza.

These priorities were foreshadowed by his education, which began with a government major at the University of Texas, followed by a Master of Education at Harvard.  Later, while serving in the Texas House of Representatives, he earned a Master of Divinity degree from Austin Presbyterian Theological Seminary, an institution associated with the Presbyterian Church USA.  This “mainline” Protestant group has seen its membership drop from over four million in 1965 to likely under one million this year.  It has achieved this “hospice” status, as the author of a recent First Things article notes, “by doing exactly what Talarico now proposes:  subordinating the claims of Scripture to the moral intuitions of secular progressivism and calling the result ‘the gospel.’”  Following Gavin Newsom’s lead vis-à-vis California, Talarico proposes to do for Texas and America what deconstructive theology has done for the PCUSA.

Talarico’s forays into theology are prime examples of Nietzsche’s atheistic critique, “The text has disappeared under the interpretation.”  It takes breathtaking arrogance or ignorance to cite the annunciation to Mary found in Luke 1:26-38 as a scriptural basis for abortion.  According to Reverend Talarico, the fact that the angel Gabriel asks Mary’s “consent” prior to her conception somehow constitutes tacit approval of abortion since “creation has to be done with consent.”  Talarico prefaces this fatuous interpretation with the observation that Mary was “an oppressed teenage peasant girl living in poverty under an oppressive empire,” an unsupported assertion that shows where his ideological druthers lie -- much closer to Marx than Matthew or Mark.

In truth (a word to which deconstructionists are allergic) even if Talarico’s notion of “consent” were accurate, the annunciation scene would only proscribe rape, not Democrat worship at the altar of abortion.  But in fact, prior to Mary’s “Behold, I am the handmaid of the Lord; let it be to me according to your word,” the angel had already announced that “you will conceive in your womb and bear a son, and you shall call his name Jesus.”  Yet miraculously, with his left hand on the most recent printout of Democrat talking points, Talarico manages to turn a “handmaid of the Lord” into an abortion advocate worthy of membership in Planned Parenthood.  Also ignored by the Lone Star divine is the history of early Christians saving exposed and unwanted babies in the Roman Empire and the all but universal Christian condemnation of abortion through the ages.  But apparently for Talarico, Henry Ford, and deconstructionists generally, history is, more or less, bunk.

As for Talarico’s God being “non-binary” based on the fact that the masculine Hebrew word for God in Genesis 1 (Elohim) is followed by a generally feminine word for “His” breath, wind, or spirit (ruach), this not-so-odd pairing stems ironically from binary linguistic principles in languages like Hebrew, Greek, and Spanish, where nouns are usually designated either masculine or feminine.  More importantly, while languages may have non-sexual male-female gender designations, the Hebrew and Christian scriptures make it perfectly clear that God transcends human sexual categories.  And those human categories Genesis limits to two: “male and female He created them.”  Unlike the philandering Greek gods, the biblical God has no consort, creates by the word, and must not be imaged.  In short, God transcends any of the six human sexes Talarico absurdly asserts “science” has discovered.  One can only imagine what article of leftist faith the recent seminary graduate might deduce from the fact that Elohim has a plural masculine ending (im) regularly used with singular pronouns and verbs.

Talarico clearly criticizes Christians (smeared as Christian Nationalists) more than any other group.  The state rep even vilified this largely conservative cohort for using the bible (plus, I would add, common sense and reliable science) to oppose sex changes for minors.  Those who did so in the state capitol, he thundered, not only harmed children but also dishonored scripture for the sake of a “hateful amendment.”  Stated without leftist distortion, Talarico believes Christians harm children if they oppose their mutilation.  Instead, he embraces the faux science of propagandists who inundated our culture with the bizarre notion that even pre-teens are capable of evaluating the lifelong medical and psychological consequences of transition decisions.  In short, Talarico inverts biblical teaching and slanders Christians who seek to protect “little ones” from the ravages of a mass delusion.

On the other hand, Talarico expresses nothing but appreciation for other “beautiful faith traditions” that are “circling the same truth” as Christianity.  Even conceding that the major religions Talarico explicitly mentions are attempts to express in symbolism and ethical demands the nature of the cosmic “mystery,” it takes a sophomoric novice (or political shyster) to focus only on this grand generalization and to ignore the particulars of each religious tradition, including their historical impacts -- both of which are far from “the same.”  But Slick Jimmy would surely find a way to rationalize the Hindu caste system or the enduring violence and subordination of women that has characterized Islam from its inception -- likely by pointing to a verse in the New Testament that admonishes wives to obey their husbands.  No need to complete the sermon, “Husbands, love your wives, even as Christ loved the church and gave himself for it” (Ephesians 5:25).

In short, Talarico comes off as an Elmer Gantry on political steroids, using scripture for the sake of a leftist creed that also includes DEI, abortions for transgenders, a “welcome mat” on the border, white skin racism, and green hysteria.  Indeed, he felt so confident about his Marxist-inspired version of reality that he used a middle school teaching position to indoctrinate young students on the dangers of climate change.  In his own words he “freaked them out” with the same existential fears long stoked by AOC.  So not only does Talarico support surgeons who mutilate young bodies, he’s also eager to manipulate their minds and emotions.  Would that Talarico were as respectful of their humanity (and of biblical texts) as he is of the wildly differing attempts to grasp the cosmic mystery he glibly embraces for political effect.

Richard Kirk graduated from Emory University’s school of theology in 1975, taught religion classes professionally from a literary-historical perspective, and is now a freelance writer living in Southern California.  His book Moral Illiteracy: "Who's to Say?" is also available on Kindle, as is his book Poetry with a Moral Edge.

"Trained Emotions" and the Abolition of Temperance

Allan Bloom, in his devastating commentary on modern rock music, made this observation:  “It may well be that a society’s greatest madness seems normal to itself.”  Rather than using this intriguing idea to critique Bad Bunny’s repulsive Super Bowl performance (an act that resonated with FOX’s King of Late Night, Greg Gutfeld), I wish to move beyond rotten trees to the more expansive infected forest in which we currently wander.

A short but penetrating book often employed in my philosophy classes was C. S. Lewis’s The Abolition of Man, in which he said, “Without the aid of trained emotions the intellect is powerless against the animal organism.”  I was immediately convinced that those two words, “trained emotions,” would seem absurd to most students since that verbal combination runs counter to almost every relevant message they’ve received in their lives.

As if to confirm the accuracy of Bloom’s madness-normalcy statement, Google highlights this incorrect “Christian” interpretation of Lewis: 

“Training emotions?” That's ridiculous, you can't train your emotions!  It's just how you feel, nothing can change that. Lewis doesn't deny that, but his point is that not all emotions are right. 

In fairness, the author of the above sentences in her further analysis does attempt, unsuccessfully, to rescue the faulty assertion that Lewis believes “you can’t train your emotions.”  In truth, Lewis means exactly what he and other philosophers have believed for millennia, namely, that emotions can be trained by the human capacity termed the spirit in Plato and “the chest” in Lewis’s treatise.  Indeed, Lewis decries a modern approach to education that has produced “men without chests” by failing to develop the human faculty that’s the source of courage, self-control, and the proper appreciation of things.  As Lewis says, echoing Plato, “The little human animal will not at first have the right responses.  It must be trained to feel pleasure, liking, disgust, and hatred at those things which really are pleasant, likeable, disgusting, and hateful.”

Instead of embracing Lewis’s and Plato’s doctrine of objective value (the idea that things merit corresponding emotional responses) we now unthinkingly embrace the idea that emotions are uncontrollable responses to stimuli and that beauty is solely in the eye (or spleen) of the beholder.  Unfortunately, it’s a short hop from this aesthetic assumption to the philosophical assertion that good and bad, morally speaking, are no more than our personal emotional responses to things.  Indeed, this belief was perhaps the most common assumption held by my students who insisted that the slaughter of innocent humans (as in the Holocaust) was only “wrong in my own opinion” rather than objectively wrong.

A virtue directly connected to the “spirit” (Lewis’s “chest”) and almost never touted nowadays is temperance.  Today that term is largely associated only with abstinence from alcohol, but historically the virtue concerned controlling and properly directing appetites and emotions.  Aristotle emphasized habit as the primary means of developing temperance -- of feeling pleasure at doing good things and displeasure at doing bad things.  But today habits are disparaged as much as temperance is ignored, while emotional impact is the premier currency employed in commercials, Hollywood productions, pop music, internet posts, and many graduation addresses.  Institutionally, only in military boot camp is Aristotle’s insight taken seriously so recruits will be able to control their emotions under fire.  

A half century ago Marshall McLuhan distinguished the dying culture of print from the new electronic culture.  The former, he noted, was formed largely by the abstract communication of printed words which facilitated unified linguistic standards, rational discourse, and the ability to act without reacting -- a quality preeminently exemplified by  jet pilots and astronauts who keep cool under pressure.  By contrast, the new electronic culture of the sixties featured emotional immediacy conveyed by music and visual images, all now exponentially increased by instantaneously composed, conveyed, and received computer messages.  McLuhan likened this new informational world to the “tribal drum.”  

If one assumes that emotions can’t be trained, then people are, as Lewis asserted, “powerless against the animal organism” -- that is, against our instincts and raw emotions, which are being constantly manipulated and stimulated by the “tribal drums” of electronic images.  Consider the internet wasteland that provided the violent propaganda motivating the Canadian transgender youth who recently killed his mother, stepbrother, and several others at school in a small British Columbia town.  This young mass murderer not only put immense stock in his own emotions; those feelings were also given inordinate weight by “professionals” who couldn’t summon the wit or courage to provide a diagnosis that opposed the wrong-sex prescription repeated endlessly by media to a mentally disturbed teenager.

Far from “training” emotions, our culture regularly praises “groundbreaking” actions that destroy limits on expression and belief.  Implicitly and often explicitly, obscenity is taken as a sign of cultural sophistication, as are performances featuring egregious displays of sexual perversion to which the label “mature” is absurdly attached.  In short, sensationalism and sheer novelty are often associated with progress, whereas self-control is disparaged as “self-censorship,” the assumption being that whatever one feels ought to be expressed, and likely expressed in public.  No joke is too crude for the King of Late Night, and no expression of political venom is too over the top for his competitors.

It is possible, as Plato and Lewis assert, to train one’s emotions, but our “tribal drum” culture militates against and actively discourages it with almost every image and message that emanates from its corrupt media.  Temperance in our time need not uncritically reflect Victorian or 1950s mores, but absent development of the human capacity to train emotions, a culture drowning in moral and emotional chaos is inevitable.

Richard Kirk is a freelance writer living in Southern California whose book Moral Illiteracy: "Who's to Say?" is also available on Kindle, as is his book Poetry with a Moral Edge.

Does "Jesus Gets Us" Get Jesus?

 They look in the mirror of their souls and see Jesus.  That’s the best explanation of the “Jesus Gets Us” campaign.  This biographical process isn’t a new phenomenon as even a cursory study of “life of Jesus” literature makes clear. 

The most famous review of previous “lives of Jesus” was written by the scholarly humanitarian Albert Schweitzer:  Von Reimarus zu Wrede: eine Geschichte der Leben-Jesu-Forschung, a title often translated as The Quest of the Historical Jesus.  Here’s Schweitzer’s prefatory summary:  “Thus each successive epoch of theology found its own thoughts in Jesus”—a rational Messiah, a romantic Jesus, a social gospel reformer, et cetera.  Ironically, Schweitzer’s own portrait pointed to an inscrutable but spiritually powerful figure focused on the end of times about whom nothing much could be confidently known.

To be fair, most scholarly portraits resulted from attempts to utilize what was considered reliable biblical evidence.  As far as I can tell, the “Jesus Gets Us” folk compose their caricature based on a single act performed by Jesus on his disciples and a ’60s Beatles song, “All You Need Is Love.”  Missing is any serious consideration of the plethora of data points that provide a more realistic portrait of the first-century Jew hailed as “the Christ” (i.e. the Messiah) by his followers.  Since “Jesus Gets Us” ads evidence no concern for “historical critical” issues and take the biblical narrative at face value, I shall do the same and see how that narrative comports with their foot-washing Jesus.

As noted earlier, there is only one gospel account of Jesus washing feet (John 13:1-15), and that was performed on his apostles at the Last Supper.  It’s unclear how “Jesus gets us” folk would incorporate the perfumed anointing of Jesus’ own feet by a woman described as “a sinner” (Luke 7:48ff.) into their ads or, more to the point, the judgmental command to “sin no more” given to another woman caught in adultery whose stoning was nixed by Jesus’ suggestion that the first stone be cast by someone without sin (John 8:11).   

Furthermore, I can’t imagine the “Jesus Gets Us” Messiah issuing this dictum: “I say to you that everyone who divorces his wife, except on the ground of unchastity, makes her an adulteress; and whoever marries a divorced woman commits adultery” (Matthew 5:32).  It’s hard to see that same Jesus washing the feet of a woman emerging from an abortion mill like Planned Parenthood.  Then there’s the warning to “Beware of false prophets, who come to you in sheep’s clothing but inwardly are ravenous wolves,” an admonition that could arguably be applied to obsessive foot-washers.  Indeed, it’s hard to envision those folks taking seriously half or more of the Sermon on the Mount (Matthew 5-7), especially this rather harsh command:  “Do not give dogs what is holy; and do not throw your pearls before swine” (Matthew 7:6).  Finally, the Jesus who overturns the tables of money-changers in the Temple (Mark 11:15-17) is clearly filled with more righteous indignation than the “no questions asked” foot-washer who effectively inverts the biblical Master’s command to “be wise as serpents and harmless as doves” (Matthew 10:16).

This incompatibility exercise could be extended ad nauseam, but it’s sufficient to show that the “Jesus Gets Us” portrait is, to be generous, incomplete.  There are, to be sure, a significant number of sayings and stories that comport well with the foot-washing image:  association with outcasts (e.g. Samaritans, women, and sinners of various stripes), the command to love one’s enemies, and a readiness to forgive transgressions.  But left out of the “Jesus Gets Us” portrait is a clear moral, spiritual voice that is diminished and distorted by a silent Messiah on his knees tacitly overlooking moral outrages that litter twenty-first-century America and for which, like “family planning,” Leftists have a soft spot in their hearts.

This Jesus who keeps his mouth shut and does what the Jesus of the gospels never does (i.e. wash the feet of prostitutes, political protesters, and haters of scriptural tradition) is precisely the Jesus desired by folks who wish to consign religion to an irrelevant closet.  No moral demands, no condemnations, no judgments come from this Jesus -- only an action that implies passive acquiescence.  Apparently, for the “Jesus Gets Us” crew the ambiguous “Judge not” admonition in Matthew 7:1 constitutes the only verbal command they take to heart.  The following verses (2-5), however, clearly imply judgments, but judgments based on self-reflection and humility.  An “absolutist” interpretation would mean that nothing will be expected of those who pass no judgments at all (cf. v. 2) and thus would contradict the plethora of judgments made by Jesus himself (cf. Matthew 23) and also expected of his followers (e.g. Mark 6:7-12).

It is the “no-judgment, foot-washing Jesus” that seems to inhabit the souls of those who, ironically, don’t wish to arouse the kind of hatred from the powers that be that brought about the crucifixion of the real Jesus.  This “no judgment” mentality is also, not coincidentally, the default position within our largely libertine pop culture, a rule that is invariably broken to judge the “judgmental” -- i.e. individuals and institutions that give voice to traditional or biblical moral standards.

The “Jesus Gets Us” Jesus doesn’t “get” the One who spoke a lot more about exalted moral and spiritual truths than he foot-washed.  The construct does provide, however, an acceptable religious image for a permissive, rudderless culture scared to death of being judged by its rotten fruit.  For that culture a non-suffering, non-speaking, non-confrontational foot-washer works quite well.   

Richard Kirk is a freelance writer living in Southern California whose book Moral Illiteracy: "Who's to Say?" is also available on Kindle.


The "frigitity" of collectivism and real sources of warmth

Zohran Mamdani, now the mayor of New York City, pledged to replace “the frigidity of rugged individualism with the warmth of collectivism.”  It was a bit surprising that “the Zohran” employed the term “collectivism” instead of the word effusively and loosely employed by political and ministerial do-gooders, “community.”  Perhaps he thought the latter sounded too much like “communism.”

His political soulmate, Hillary Rodham Clinton, was savvy enough to cloak her government-centered paradise in the image of a “village” -- a word that in most minds conjures up a largely rural setting with a few hundred homes, families gathered around the warmth of a fireplace, neighbors who voluntarily assist one another, three or four houses of worship, local stores, and a banker acquainted with the whole village.

Of course that’s not what HRC really had in mind, her preference being ecologically-dictated, city-centered, government-subsidized apartments with as few “nuclear families” as demographically possible.  Recall the Obama utopia provided in his (and doubtless Hillary’s) “Life of Julia” cartoon.  This “villager” was an independent yuppie who focused first on her career as a web designer, had a single child at 31 (with no husband or father mentioned), later started her own business, and then retired at 67.  Finally, supported by Social Security and Obamacare, she began volunteering at a community garden, presumably seeking a little human warmth or perhaps doing penance.

Mamdani’s tendentious alternatives, of course, mischaracterize both options.  “Rugged individualism” was a characteristic of a few pioneers who acted as Giants in the Earth, but even most of these heroes depended on families for warmth and succor.  What Mamdani disparages, in truth, are folks, largely in families, who are self-sufficient.  Even a “village” that takes care of its own affairs is unwelcome in “the Zohran’s” eyes.  Indeed, it was the prevalence of voluntary associations, not “rugged individualism” that impressed the French aristocrat Alexis de Tocqueville as he traversed America in the age of Jackson. “Americans of all ages, all conditions, all minds constantly unite....  Americans use associations to give fetes, to found seminaries, to build inns, to raise churches, to distribute books, to send missionaries to the antipodes; in this manner they create hospitals, prisons, schools.”  This communal self-sufficiency is Mamdani’s nightmare.  What he aspires to see is dependence on government and absolute power in his own manicured Red hands.

Unfortunately for America, its habit of communal self-sufficiency is no longer a defining characteristic, as the desire for Washington D.C. to pay for or provide goods and services for lone individuals and favored groups has grown exponentially -- the “democratic despotism” Tocqueville warned was lurking in the nation’s future thanks to a dissolution of its religiously-based moral fiber.  Instead of exhibiting any brand of “rugged individualism,” most Americans are painfully aware that they now exist within a largely corporate techno-culture enervated by mass immigration and slouching toward globalism.  Such a social order, as the politics of Minnesota, California, and New York illustrate, is ripe for the deceptive “warmth” of collectivist rhetoric.  As for the frigid individualism of Mamdani’s fantasies, it now takes the form of Bowling Alone.

Ayn Rand, who experienced the expropriating “warmth” of Lenin’s collectivist society before moving to America in 1926, observed that “[s]ocialism is the doctrine that man has no right to exist for his own sake, that his life and his work do not belong to him, but belong to society, that the only justification of his existence is his service to society, and that society may dispose of him in any way it pleases for the sake of whatever it deems to be its own tribal, collective good.”  “Frigidity” is a term that accurately attaches to that dehumanizing doctrine whose warmth is often obtained only by emaciated bodies huddled together in a Siberian concentration camp.  Less dramatically, one might picture the “frigidity” of an impersonal crowd seated uncomfortably in a large, drab one-room facility waiting for their assigned numbers to be broadcast by a score of bureaucrats assigned duties behind individual windows at the Department of Motor Vehicles.  Without a return to the warmth of personal moral responsibility within the confines of families and local communities, the continued expansion of that DMV experience to our broader “collective” society will likely continue apace while only privileged comrades in Gracie Mansions around the country enjoy the non-collective perk of bidets in their washrooms.  

An Inconvenient Study


Even persons familiar with RFK Jr.’s damning statistics about American health might be surprised by the information contained in Del Bigtree’s new documentary An Inconvenient Study, a film that also depicts with gut-wrenching impact the human misery represented by those numbers -- misery owed to the cowardice, greed, arrogance, and power of the medical and pharmaceutical industries.

The film’s Emmy Award-winning producer has been a medical journalist for two decades, first at CBS and later as host of “The HighWire” Internet show.  Bigtree’s prior documentary, Vaxxed (2016), brought him to national attention when the film, which had been touted by Robert De Niro, was pulled from the actor’s own Tribeca Film Festival for being anti-vaccine and containing junk science.  This linked news segment about the incident provides an example of how major media serves as a megaphone for powerful corporate interests -- a role effectively noted in An Inconvenient Study.

The aforementioned RFK Jr. statistics can be summarized using this single data point:  54 percent of American kids now have a chronic disease of some sort, whereas in the 1980s that number was 12.8 percent.  Coincident with that precipitous rise has been a dramatic increase in the number of vaccines administered to kids from birth to age 18--from around 20 in the 1980s to 72 or more now. The question raised by this correlation is obvious:  Is there a causal link between vaccinations and chronic disease and especially between vaccinations and the huge increase in autism?    

The specific study that Bigtree’s film revolves around was done by the Henry Ford Health System and was directed by Ford’s Head of Infectious Diseases, Dr. Marcus Zervos.  It was designed as a retrospective study comparing health outcomes of vaccinated and unvaccinated patients in Ford’s system.  Such a study is the only option available since medical authorities consider double-blind placebo trials of vaccines unethical.  A study of that sort, it’s argued, would be using kids as guinea pigs.  Consequently, vaccines are often tested against similar vaccines, trials that show whether one vaccine is more dangerous than its cousin but not whether either has unintended consequences.  Additionally, some tests to secure the “safe and effective” label reiterated ad nauseam by politicians, media, and the pharmaceutical industry are absurdly small and of short duration, as was true of the Hepatitis B recombinant vaccine often administered to infants on the first day of life.  Its “safe and effective” label was secured by testing 147 infants for five days.

Three prior retrospective studies are presented in the film.  Each showed significantly negative results for the vaccinated.  Dr. Peter Aaby focused only on the DTP vaccine he administered to half the children of the African country of Guinea-Bissau.  Thirty years later he found that while the vaccine was effective at preventing diphtheria, tetanus, and whooping cough, those infants who took the vaccine died from other diseases at five times the rate of the unvaccinated.  A more limited cohort of patients was broadly analyzed by a Portland pediatrician who discovered negative outcomes for the vaccinated in terms of allergies, autoimmune diseases, ADHD, and infections of all kinds.  For his trouble the doc had his published paper retracted and his medical license revoked.  Another physician was “raked over the coals” for a study analyzing 600 kids.  It showed significantly negative results for vaccinated patients in terms of allergies, neurological disorders, and especially autism.

As with the movie Vaxxed, these studies were panned for being too small or otherwise flawed in ways that voided their unwanted findings.  Consequently, Bigtree pushed hard for Dr. Zervos, a respected pro-vaccine doctor at a major medical center, to use Ford’s extensive database to perform a comprehensive and rigorous vaccinated vs. unvaccinated analysis.  In 2020 Zervos agreed to oversee such a project and to publish it no matter the results.  In pressing his case, Bigtree noted that such a study could finally shut the mouths of vaccine critics like himself.  Two years later the analysis had been completed, but the Ford Health group wouldn’t publish it.  Its results, however, were made available to Bigtree.

The climax of the documentary is Bigtree’s hidden-camera discussion with Zervos (June 5, 2022), during which the producer makes a final plea that Zervos make good on his prior pledge to publish the study no matter the results.  Zervos acknowledges that the study, with its largely negative findings for the modern vaccine regimen, was, in his opinion, a good study that should be published.  He insists, however, that in the current political and media environment its publication will only result in his firing and a trashing of the study’s quality.  In Zervos’s own words, “Publishing something like that, I might as well retire.  I’d be finished.”  Zervos knew whereof he spoke since previously he had been torn apart for publishing a heretical article on the effectiveness of hydroxychloroquine for treating COVID.  He wasn’t willing, he told Bigtree, to go through that again.  “I can’t handle it.”

The Ford Study was based on a population of 18,468 subjects, 1,957 of whom were fully unvaccinated.  The results were, in Bigtree’s words, “A bombshell.”  Among the devastating findings were the following:  the vaccinated group had a six times higher risk for autoimmune diseases, a four times higher risk for asthma diagnoses, a 5.6 times higher risk for neurodevelopmental disorders, and a 4.47 times higher risk for speech disorders.  Several comparisons were statistically impossible to calculate because no cases of the disorder were found among the unvaccinated, conditions that included brain dysfunction, diabetes, behavioral problems, learning disabilities, and ADHD (the latter of which had 262 diagnoses among the vaccinated).  A similar mathematical issue arose with autism where the single case among the unvaccinated could not be usefully compared to 23 within the vaccinated cohort.

A summary statistic found that after ten years there was an 83 percent likelihood that the unvaccinated would avoid any chronic disease, whereas in the vaccinated group there was only a 43 percent chance of being free of a chronic disease.  This result tracks perfectly with the aforementioned fact that over half of American kids now suffer from some chronic disease.  The devastating human toll for medical, pharmaceutical, political, and media complicity in silencing studies like those mentioned in An Inconvenient Study is presented during the first twenty minutes of the film.  It can be viewed for free on the Internet via aninconvenientstudy.com as long as Bigtree resists Ford Health’s cease-and-desist order.

Richard Kirk is a freelance writer living in Southern California whose book Moral Illiteracy: "Who's to Say?" is also available on Kindle.

Monday, October 27, 2025

An Enemy Within: America’s Pop Culture

What’s contemporary Top 40 music like?  It isn’t a question I would have been inclined to ask were I not forced by my cable company to go initially to that Sirius channel before tuning to soft rock, country, or classical offerings.  What caught my ear, however, during those brief exposure episodes was the sameness of the bland sounds emitted before I switched to more familiar styles.  Eventually I decided to endure several consecutive Top 40 Sirius songs to see if my random encounters were representative, and lo and behold, they were!

Typically the artist’s voice is only slightly more prominent than the background synthesizer music.  To employ a visual analogy, that background sound would be captured by a palette of undifferentiated yellowish browns.  Rarely does one hear the distinctive musical instruments that were integral components of most musical offerings just a few decades ago.  Occasionally uninspiring piano and drum accompaniments are discernible, but there’s nothing like Mick Fleetwood’s drums or the brassy excellence of Chicago.

The musical scores in Top 40 are also hugely repetitive and generally possess all the range and complexity of bad elevator music—characteristics shared by the vocalists who thankfully rush through their lyrics à la Kendrick Lamar’s Super Bowl halftime fiasco.  This defect is occasionally remedied for teenage fans and maturity-stunted adults by videos that supply the often inaudible wording.  Failing that musical patch, lyrics are always accessible on the net.

Even the songs of Taylor Swift are, at best, incremental improvements on the above descriptions.  A columnist for the UK Times, Rod Liddle, had the courage to label Swift a “caterwauling blonde moppet” who produces “banal and life-sapping sub-Kardashian electropop drivel,” and for his literary trouble received a restraining order from the popstress’s lawyers.  Readers can judge for themselves the accuracy of Liddle’s review based on this hit romantically titled “Slut.”    

The moral tenor of contemporary Top 40 offerings, not surprisingly, coincides with the songs’ lack of acoustical virtuosity.  A piece that accurately reflects that congruence is innocently titled “Diet Pepsi,” by a singer named Addison Rae.  The video consists of a series of bra-and-panties erotic poses in the front and back seat of a ’65 Mustang driven by a James Dean type.  A somewhat rappish singer named Lola Young, who climbed to number one on several Euro-charts, takes care of the narcissistic, f-bomb, parent-trashing spectrum with “Messy.”

The moral vacuity, of course, isn’t limited to hetero-normatives as this headline explains: “Lesbian viral sensation Gigi Perez has admitted that her queer anthem ‘Sailor Song’ changed her life.”  The full lyrics are linked here and include these lines addressed to her same-sex object of desire:  “I don’t believe in God, but I believe that you’re my savior” plus “And when we’re getting dirty, I forget all that is wrong.”     

I’m well aware that pop and rap songs aimed largely at kids from 9 to 19 aren’t a space to consult for uplifting moral sentiment, but nowadays the degrading, self-centered undertow has produced a lyrical and visual squalor that’s reflected even in a poverty of musical excellence.  Whereas post-WWI Dadaism was a self-conscious mocking of established standards, today’s pop music represents, I think, a largely unconscious assumption of the id-generated desires promulgated preeminently in Hollywood.  Although these themes have been present in pop music for decades (most revoltingly in Madonna’s sacrilegious video “Like a Prayer”), today’s pop music, with its MTV-spawned visuals, is saturated with the basest lyrics, which seldom reach beyond the singer’s primal desires--no more tying ribbons around old oak trees or occasionally teaching the world to sing in perfect harmony.

Does all this attention to Top 40 music really matter?  I’m afraid it does.  Recently, at an Alliance for Responsible Citizenship meeting, Douglas Murray mentioned a politician who proudly noted how the extraordinary events of October 7 brought forth extraordinary young people ready to meet that crisis.  Murray observed that her statement needed an important adjustment, namely that it isn’t just extraordinary events that throw up extraordinary youths; instead, “it’s extraordinary events when they come up against people who have been extraordinarily well cultured” (minute 7).

Is it possible that a society that countenances, praises, or ignores Top 40 garbage like the aforementioned can produce a sufficient number of extraordinary youths to meet future crises?  Combine those toxic sounds and images with their cinematic cousins and Americans face a Herculean stable clean-up task.  It is encouraging that military recruitment spiked significantly following President Trump’s election, but the power of the “resistance” baked deep in our culture is akin to a cancer slowly spreading its tentacles into a body’s vital organs.  It’s a disease that won’t be eliminated by four years of political therapy.  A more radical intervention will be required, the prerequisite to which is acknowledging the mortal danger itself.    


Charlie Kirk's Legacy

In The Mission, Robert Bolt’s Jesuit priest (Jeremy Irons) is shot dead while carrying a cross in a pacific procession, protecting his 18th-century Indian community from Portuguese troops whose masters seek to enslave them.  Two decades earlier Bolt dramatized a pious Sir Thomas More in A Man For All Seasons.  More’s papal-grounded opposition to Henry VIII’s divorce from Catherine of Aragon and marriage to Anne Boleyn is rewarded with imprisonment and, after the perjured testimony of a one-time admirer, beheading.  More recently, in Gran Torino, Clint Eastwood’s irascible character, Walt, dies to save his young foreign friend by exposing those who threatened him, lying stretched out in cruciform after having been pummeled with bullets by a neighborhood gang in plain sight of onlookers.

Those examples are cinematic creations that pay tribute in various ways to their spiritual predecessor.  Charlie Kirk, however, was a Christ figure in real life.  Far from a “spreader of hate” as asserted by politicians and commentators who specialize in that activity, Charlie practiced respectful dialogue and viewed even rude interlocutors as fellow humans created in the image of God.  Anyone taking the time to sample the many hours of dialogue available will never see any name-calling or personal insults proceed from his mouth.  Instead one witnesses an impressive willingness to listen to opposing comments.  Those who disagree are encouraged to “come to the front of the line,” and their often ill-tempered challenges are met with thoughtful, sometimes forceful, responses.  Frequently Charlie seeks points of agreement from which further dialogue becomes possible. 

In encounters with exponents of the narcissistic relativism that permeates American campuses, Charlie sought to expose the vacuity of arguments that reject God or Nature’s God as the fundamental ground for morality.  “My heart” and “my truth” provide no rational counter-argument to “Mao’s truth” or “Stalin’s truth.”  Absent a deeper foundation, even “the greatest good for the greatest number” melts into a hodge-podge of undefined terms.  What is meant by the “greatest good” or the “greatest number,” and who decides what counts as “good”?  And why must everyone follow this formula if nothing undergirds it?  Such are the infuriating questions posed in Socratic fashion to students who have been told (not taught) that religion is foolish superstition, that the Western tradition represents the prejudices of old white men, and that morality is nothing more than a person’s own “values.”

It was galling to detractors that this vibrant, intellectually intimidating young man gained his insights by consulting the Western literary tradition and contemporary scholars like Thomas Sowell—especially taking to heart the wisdom contained in the Old and New Testaments. “Can anything good come out of Nazareth?” was a skeptical observation from two millennia past.  A contemporary echo of that comment would be “Charlie didn’t go to college and quotes the Bible”--an observation that condemns most of higher education rather than the intellectual stature of the autodidact who rejected schooling’s institutional corruption.  

As Dennis Prager regularly observed, “You have to go to college to believe some things.”  Men can become women and vice-versa.  A man who has undergone hormonal treatment to appear more feminine can compete on an equal playing field with women.  You are the sex (“gender”) you want to be.  A man who wants to be a woman is a woman and thus entitled to undress in the women’s locker room and to invade a women’s spa.  A mountain of evidence and common sense was available for Charlie to counter those absurdities--millennia of tradition plus obvious physiological, psychological, and chromosomal differences.  Similar arguments were presented to abortion advocates who equate with “my body” the nascent life that possesses its own genetic structure within that body.

To religious or “spiritual” challenges that stress “Jesus gets us” tolerance, Charlie presented the Jesus of Scripture (John 8:1-11) who not only saved the woman accused of adultery from being stoned but also approached her and said, “Go, and sin no more.”  Yes, mercy is fundamental to Christianity, but so is the truth about sin.  The latter value, Kirk noted, has been forgotten or even condemned as “judgmental” not only by popular culture but also by a vast number of Christians.

Perhaps most of all Charlie was hated as the image of a personal goodness that exposes, like sunlight, the moral and spiritual decay of our culture and of individuals consumed by that decadence.  He was a faithful husband with two beautiful children, to whose welfare he was passionately devoted.  He engaged in a disciplined, spiritually informed health regimen--no drugs, regular exercise, healthy food and drink.  He was remarkably judicious in the use of his time, a trait inimical to a culture awash in slothful self-indulgence.  Such things arouse self-loathing when one looks in the mirror and sees the polar opposite.  Now his assassination has exposed the evil that brims with revulsion at an utterly decent husband and father who was skilled at respectful and forceful dialogue--a Christian devoted to God, family, and country.

 “Truly, truly, I say to you, unless a grain of wheat falls into the earth and dies, it remains alone; but if it dies, it bears much fruit” (John 12:24).  So may it be with the death of Charlie Kirk.  


Thursday, October 23, 2025

On Democracies and Death Cults -- October 7 and the Gaza War

On Democracies and Death Cults by Douglas Murray

If any book could open the eyes of clueless pro-Hamas student protesters, Douglas Murray’s On Democracies and Death Cults would likely be the one.  Packed with eyewitness accounts of the horrific October 7 massacre and Israel’s subsequent response, it exposes a massive moral gulf separating Hamas from its Jewish enemies that only ideological intransigents would instinctively ignore.

Far from offering a philosophical analysis, Murray addresses the Democracy-Death Cult clash by relating what he saw, what he gathered from interviews, and, incredibly, what was available via phone messages, social media posts, and filming of the atrocities—much of it done by the terrorists themselves.  “Using GoPro cameras and mobile phones the terrorists broadcast their acts of violence with pride.  By late in the day on October 7, it was already clear that these acts included burning people alive, shooting innocent people, cutting off people’s heads, and raping men and women.  Sometimes before killing them.  Sometimes after.”

The account Murray provides of the attack is vivid and personal.  Parents get messages of their children’s last desperate minutes while Hamas fighters, unlike the Nazis, publicize the grisly torture they inflict on Jews.  Relevant detours into the history of Israel’s struggle for survival come as a relief, as do paragraphs devoted to the burgeoning population of Gaza and non-Jewish Israel—figures that conclusively rebut the popular “genocide” accusations against Israel.  “Apartheid state” calumnies are likewise countered with facts about Arab participation in Israel’s government even at the highest levels.

One Hamas terrorist, Yahya Sinwar, serves throughout Murray’s work as a singular representative of the Death Cult.  Sinwar was a leader of the October 7 attack but had been recruited into Hamas years earlier.  In 1988, Sinwar was imprisoned for the murder of four Palestinians he suspected were informers—crimes to which he proudly confessed.  One of the few Israelis who had regular contact with Sinwar in prison was a dentist, Dr. Yuval Bitton.  In 2004 Bitton noticed something was wrong with Sinwar and arranged for him to be sent to a medical center where he was operated on for a brain tumor.  Bitton visited Sinwar in the hospital, and the latter thanked him for saving his life.

The story doesn’t end there.  In 2011 Sinwar was the highest-level prisoner released in a 1,027-to-1 swap for a young Israeli soldier who’d been imprisoned in Gaza for five years.  (Israel withdrew from Gaza in 2005.)  Upon release Sinwar immediately resumed his position in Hamas and advocated taking more Israeli hostages to free other Palestinians in Israeli jails--a tactic expanded on October 7 to include even dead Israelis.  On that same horrendous day a farmer, Tamir Adar, and his family were apparently killed in the Hamas attack.  None were ever heard from again.  Tamir was the nephew of Dr. Yuval Bitton.

Near the end of his book Murray recounts the killing of Sinwar a year after the initial massacre.  “Sinwar had been killed in Rafah, in the south of Gaza, in the place where Vice President Kamala Harris and many other international observers had insisted the IDF should not go.”  The comment about Harris and “other international observers” reiterates a point often made in the book, namely, the hand-tying, “proportional response” demands regularly imposed on Israel by world leaders who dismiss the devastating impact of Hamas and Hezbollah missiles on community life. “Why was the whole country so littered with bomb shelters that on the 7th people ran into them across the south and were promptly massacred inside them by Hamas?  How was this a way to live?  And who else would live like this?”  Even so, Murray notes that civilian casualties in Gaza have been exceptionally low by historical standards—a fact that didn’t prevent the International Criminal Court from designating Prime Minister Benjamin Netanyahu a war criminal.

Chapters three and four are largely devoted to Western responses to the October 7 Hamas atrocities.  That very evening “a great crowd of anti-Israel protesters had gathered outside the Israeli embassy in London, among other places, to celebrate the massacres of the day.  They waved flags and lit flares while shouting the same war cry and victory cry as the terrorists, ‘Allahu Akbar!’”  A Times Square protest against Israel occurred the next day “while Hamas terrorists were still murdering their way through the south of Israel.”  The general frivolity of student protests on American campuses, where chants of “intifada” went up alongside demands for more accommodating toilet facilities and “alternative milk,” contrasted blatantly with Hamas’s atrocities and with the courageous response to those acts then under way in Gaza and Israel, sometimes from females the same age as the privileged protesters.

Throughout the Western world these anti-Jewish protests proliferated, egged on by professors whose words would have gotten them fired had they been directed against gays or blacks.  One example of many: “Cornell University history professor Russell Rickford was filmed at an anti-Israel rally praising Hamas’s massacre and telling the crowd, ‘It was exhilarating, it was energizing.’”  What does it mean, Murray asks, that “on the streets of every major Western city, people who must have known what had been done on the 7th publicly took the side of the aggressors?”

A psychological explanation was previously given by Soviet novelist Vasily Grossman:  “Anti-Semitism . . . is a mirror for the failings of individuals, social structures and State systems.  Tell me what you accuse the Jews of—I’ll tell you what you’re guilty of.”  Murray expands this dictum to apply to the student protestors whose view of Western culture has been warped by radical leftists:  “Tell me what you accuse the Jews of—I’ll tell you what you believe you are guilty of.”  For Gazans and persons throughout the Arab world an historical explanation largely suffices, starting with the still celebrated pact between Hitler and the Mufti of Jerusalem—a collaboration which continues to make Mein Kampf a best seller.   

These “explanations” comprise only a small fraction of Murray’s book, which is devoted overwhelmingly to describing what happened on October 7, how individual Jews responded, and how the Western world responded.  It is those journalistic details that make On Democracies and Death Cults a work that might even turn the heads of students more interested in performative protest than in the truth about good and evil, life and death.


In Praise of Hypocrisy

 “Hypocrisy is the tribute that vice gives to virtue.”  That’s an aphorism that has fallen into desuetude, along with the word “desuetude.”  The saying was still in use in the mid-twentieth century but became virtually meaningless in popular culture after the sixties.  That was when the moral imperatives now popular came into fashion:  “Stand up for what you believe in” and “Be true to your values.”  Instead of being encouraged to be virtuous, the general public was told to affirm and exhibit for the world whatever their beliefs happened to be--hedonistic, nihilistic, Marxist, Christian, et cetera.

Simultaneous with this dubious moral revisionism, hypocrisy was promoted to number one on the scale of bad things, standing as it does in direct opposition to the aforementioned imperatives.  A hypocrite doesn’t outwardly embrace what he really believes in or values.  The question thus arises, how is hypocrisy in any sense a tribute to virtue?   

To answer that question one must explore the dramatic origin of the word “hypocrisy,” literally “an actor under a mask.”  As thus understood, the idea of “pretense” is a necessary component of the term—an element now often ignored.  And what the moral “actor” pretends to be is virtuous, or at least more virtuous than he really is. Given this meaning of the word, simply failing to be true to one’s values would not make one a hypocrite since neither pretense nor virtue need be part of that behavior.  The ubiquitous “I’m only human” excuse would suffice to provide secular absolution for any disconnect between values and performance.

It’s only when an individual pretends to be more virtuous than he actually is that hypocrisy in the aphoristic sense comes into play.  The reason for pretending to be virtuous is that virtue is, or at least was, generally recognized as superior to vice.  This recognition of virtue’s superiority (even if only pretended for public consumption) is the “tribute” vice gives to its opposite number.

We can thank the famous French philosopher of the 1960s, Jean-Paul Sartre, for the aforementioned moral revisionism that replaced objective moral standards with self-defined mores and substituted “authenticity” for virtue.  Being “authentic” involved embracing one’s own actions and standards of conduct.  Consequently, “hypocrisy” was transformed into the vilification of “inauthentic” persons who failed to embrace their own actions or standards of conduct.  Nowadays “hypocrite” has become the only judgmental epithet many persons are willing, and eager, to employ.

The most pernicious use of this redefined term is to vilify persons who don’t live up to the high standards they espouse, thus making it equivalent to the word “sinner” or, in more pedestrian terms, “imperfect.”  It’s true that a hypocrite in the traditional sense “pretends” to be something he is not, but it is not the case that someone who fails to live up to exalted moral standards is a hypocrite.  A person who fails to clear a traditional moral bar set at seven feet isn’t a hypocrite unless he pretends otherwise.  

Yet thanks to today’s linguistic legerdemain all morally serious persons, people whose ideal of virtue exceeds their grasp, have become hypocrites.  Moral zeroes, by contrast, are deemed “honest” or “true to themselves” if they set their moral bars flat on the ground and step triumphantly over them.  No one accused Howard Stern (at least not back in the day) of hypocrisy.  Instead his shamelessness, formerly among the most despised of vices, was embraced by the cultural avant-garde.  Stern openly and profitably disparaged traditional standards of virtue.

Thus, in this topsy-turvy world of setting one’s own moral standards, the ethical playing field is hopelessly slanted in favor of shamelessness.  The rules of the game encourage everyone to place the moral bar as low as possible and to prize being non-judgmental above all else.  Anyone who dares raise the bar of virtue high will be pummeled with charges of hypocrisy for failing to be perfect (cf. William Bennett, The Book of Virtues).     

Being a hypocrite in the traditional sense isn’t a good thing, but it’s better than shamelessness.  The latter doesn’t pay tribute to virtue at all, whereas the former exists in a world where virtue is an objective good honestly pursued by imperfect people and sometimes indirectly honored even by those corrupted by vice.  

 

Taking Religion Seriously -- Review of Charles Murray's Book

 

Taking Religion Seriously isn’t a book you would expect from a political scientist best known for Losing Ground and The Bell Curve.  It’s not surprising, however, that a man eighty-two years of age should ponder the topics addressed in this brief work, which can be read in a few hours.  Broadly speaking those topics are God, morality, and Christianity.

Though Murray claims no special expertise on those matters, it’s obvious he’s devoted considerable time to exploring them--a largely intellectual journey that began three decades earlier with his wife’s pursuit of a religious community congruent with her profound experience of motherly love.  That search found a suitable destination with the Quakers.  Charles’ more intellectual investigations are summarized in this book, which offers tentative conclusions plus a plethora of titles suggested for further investigation.

Part One of the book, “Taking God Seriously,” begins with the aforementioned spiritual awakening experienced by his wife, Catherine, whose “love for her [newborn] daughter surpassed anything she had ever known.”  It was, in her words, “far more than evolution required.”  Murray then focuses on his own spiritual limitations, discussing youthful Peace Corps experiences in Thailand and his largely unsuccessful attempts at meditation.  Those experiences, however, led him to see that people have “perceptual deficits” as well as talents that facilitate the appreciation of music, art, and spirituality.  This lack of spiritual perceptiveness, he notes, is reinforced by “Western modernity,” which shelters most of us from tragic aspects of life, like the death of children, that until recently plagued all people.

Murray’s “secular catechism” in chapter three provides a succinct summary of the beliefs one is likely to inherit via cultural osmosis or higher education.  Those materialistic assumptions dismiss religion and reduce humans to highly evolved animals living on a “nondescript planet on the edge of a nondescript galaxy in a universe with a billion galaxies.”  Murray points out how unreflective that creed is, ignoring fundamental mysteries like the amazing relationship of mathematics to the physical world and even failing to seriously ask why the universe itself exists.

Those observations lead to thoughts about the Big Bang and its relevance for the idea of God, observations whose detailed mathematical elements can be skimmed over by non-physicists and reduced to one conclusion: the odds of there being a universe at all are vanishingly slim.  This analysis is essentially the cosmological argument for the existence of God, employing an unimaginably large number, 10 raised to the power of 10 to the 123rd power—a number that has more zeroes “than there are elementary particles in the entire universe.”  The jury is out on the precision of that number, as it is on an alternative theory Murray acknowledges but can’t embrace--the existence of “multiverses” that account for such long odds.

Part One ends with unexpected data offered to challenge the prevailing materialistic assumption that the mind and consciousness are essentially related to the brain.  That evidence includes near-death and paranormal experiences.  Murray is well aware of the unreliability of many of these reports but points out that scientific analysis of some of these incidents is beginning to occur.  He’s also unwilling to dismiss out of hand evidence not amenable to rigorous scientific methods.

In Part Two Murray turns his attention to Christianity and begins by discussing its essential contribution to the cultural efflorescence of Europe from the 15th to the 19th centuries.  A major component of that development rested on Western science’s implicit faith in the universe’s rationality.  That faith, as the mathematician and philosopher Alfred North Whitehead observed, was itself rooted in the medieval “insistence on the rationality of God.”  Murray further notes that the modern decline of Western art and literature coincides with the decline of Christianity in the culture.  Quoting from his own book, Human Accomplishment: “Is it not implausible that those individuals who accomplished things so beyond the rest of us just happened to be uniformly stupid about the great questions?”  Bach, Murray argues, “made a prima facie case that his way of looking at the universe needs to be taken seriously.”  The same argument goes for other artists who held truth, beauty, and the good as their primary motivation rather than the expression of their own feelings.

C. S. Lewis’ Mere Christianity and morality are the focus of subsequent chapters.  Murray proposes, with Lewis, that at least certain moral concepts appear in all cultures and that this fact suggests a transcendent source beyond evolutionary explanations.  He then discusses what for me was the most interesting material in the book, namely observations about the dating and historical reliability of the gospels.  (That’s saying a lot, since I graduated in 1975 from Emory University’s Candler School of Theology.)  Murray does a good job of questioning the generally accepted revisionist views on those questions and provides cogent reasons for believing earlier dates are plausible and for thinking the gospel accounts of Jesus are largely accurate. Those reasons include, among others, the tradition of oral transmission in Judaism as well as detailed analyses of Jewish names and geographical references. 

 Murray’s foray into Christian belief ends with surprising observations about the resurrection, ideas that are seldom put forward as cogently by anyone with his intellectual pedigree. The final topic of that analysis concerns somewhat tangential but intriguing facts about the Shroud of Turin. Currently those facts seem to debunk a prior medieval dating and to support an unexplained origin of the image in an area and time consistent with the crucifixion of Jesus.  Obviously that preliminary evidence doesn’t prove anything about a resurrection, but it does show my casual dismissal of “just another medieval relic hoax” was premature.

In sum, Taking Religion Seriously provides significant food for thought as well as numerous sources to further investigate the truly fundamental issues it raises:  God, morality, and Christianity.   
