376 BAD THERAPY  

Asking what’s gone wrong with our kids is an ancient endeavor, but these days we must deal with what seems to be an unusually troubled younger generation awash in a “youth mental health crisis.”  Abigail Shrier, in Bad Therapy:  Why Kids Aren’t Growing Up (New York:  Penguin Publishing Company, c. 2024; Kindle Edition) offers a thoughtful analysis that merits attention.  While acknowledging some youngsters need serious psychological treatment, she’s concerned about “the worriers; the fearful; the lonely, lost, and sad.  College coeds who can’t apply for a job without three or ten calls to Mom.”  They’re not mentally ill but they’re doing poorly and look for “diagnoses to explain the way they feel.”  Rarely does this help, but:  “We shower these kids with meds, therapy, mental health and ‘wellness’ resources, even prophylactically.  We rush to remedy a misdiagnosed condition with the wrong sort of cure” (p. xii).

Shrier remembers how she was reared.  Parents spanked when necessary and rarely worried about their kids’ feelings.  Kids were told where to go, how to dress and behave.  Probing their kids’ psyches for some “repressed identity” never occurred to them.  “But as millions of women and men my age entered adulthood,” she says, “we commenced therapy.  We explored our childhoods and learned to see our parents as emotionally stunted.  Emotionally stunted parents expected too much, listened too little, and failed to discover their kids’ hidden pain.  Emotionally stunted parents inflicted emotional injury” (p. xv).  Resolving to do better, her generation determined to rear “happy” kids.  “We resolved to listen better, inquire more, monitor our kids’ moods, accommodate their opinions when making a family decision, and, whenever possible, anticipate our kids’ distress.  We would cherish our relationship with our kids.  Tear down the barrier of authority past generations had erected between parent and child and instead see our children as teammates, mentees, buddies” (p. xvi).  And to do so they trusted a bevy of “wellness experts.”

Such experts were anxious and willing to help!  Therapy would solve all problems.  To provide more help than professional therapists could give, school administrators jumped into the “crisis” and urged teachers to counsel and coddle their students, becoming “partners” with their parents in providing emotional comfort.  Shrier and millions more “bought in, believing they would cultivate the happiest, most well-adjusted kids.  Instead, with unprecedented help from mental health experts, we have raised the loneliest, most anxious, depressed, pessimistic, helpless, and fearful generation on record.  Why?  How did the first generation to raise kids without spanking produce the first generation to declare they never wanted kids of their own?  How did kids raised so gently come to believe that they had experienced debilitating childhood trauma?  How did kids who received far more psychotherapy than any previous generation plunge into a bottomless well of despair?” (p. xvii).  How indeed?

The answer was therapy and more therapy.  Tragically, Shrier thinks, this resulted in an epidemic of iatrogenesis, illustrating how “healers” often make things worse.  “Well-meaning therapists often act as though talking through your problems with a professional is good for everyone.  That isn’t so” (p. 8).  Talking doesn’t always help, as careful studies dealing with policemen, burn victims, breast cancer patients, and grief-ridden mourners show.  Folks who say they don’t want to talk about their problems are often much wiser than those who insist they do!  Unfortunately we’ve been persuaded that lots of us are sick.  Most Gen Zers think they have mental health issues and almost “40 percent of the rising generation has received treatment from a mental health professional—compared with 26 percent of Gen Xers” (p. 17).  To meet their needs “wellness centers” have sprouted on most college campuses and professors are routinely advised to make allowances for all sorts of mental health problems.  Yet the problems proliferate with no indication that treatments succeed.

   This suggests, to Shrier, that we’ve been overwhelmed by “bad therapy.”  She thinks this, in part, because of scholars like Camilo Ortiz, a “tenured professor and leading child and adolescent psychologist.”  His research shows that “individual therapy has almost no proven benefit for kids.”  If anyone needs help it’s the parents,  who too often “unwittingly transmit their own anxiety to their kids.”  However,  “numberless psychotherapists not only offer individual therapy to young kids, they practice techniques like ‘play therapy’ that have shown scant evidence of benefiting kids. In fact, there’s very little evidence that individual (one-on-one) psychotherapy helps young kids at all” (p. 40).  

Consulting Ortiz and other scholars, Shrier lists 10 Bad Therapy steps:  (1)  Teach Kids to Pay Close Attention to their Feelings—in fact they should learn to distrust their emotions and often repress them;  (2)  Induce Rumination—in fact rehashing often harms a person; (3)  Make “Happiness” a Goal but Reward Emotional Suffering—in fact happiness comes as a result of doing things well; (4) Affirm and Accommodate Kids’ Worries—in fact they need to confront and deal with them; (5) Monitor, Monitor, Monitor—in fact they need to be supervised less and left alone much more than they are; (6)  Dispense Diagnoses Liberally—in fact we need to stop labeling kids’ disorders (e.g. ADHD) and treat them as ordinary aspects of growing up; (7) Drug ’Em—in fact drugs such as Ritalin should be administered only as a last resort; (8) Encourage Kids to Share Their “Trauma”—in fact, the less they share the better they fare; (9) Encourage Young Adults to Break Contact with “Toxic” Family—in fact few families are truly “toxic” and severing oneself from those who love them best rarely helps anyone; (10) Create Treatment Dependency—in fact interminable therapy sessions enrich counselors while harming kids.  In sum:  “Bad therapy encourages hyperfocus on one’s emotional states, which in turn makes symptoms worse” (p. 64).

Compounding the bad therapy kids may get in counselors’ offices, the nation’s schools are redesigning themselves as therapeutic care centers, flying the flag of  SEL (social-emotional learning).  Add to SEL the “restorative justice” President Barack Obama urged in his 2014 “Dear Colleague Letter” threatening schools with loss of funding if they continued to suspend and expel a disproportionate number of minority kids.  This presented schools with a quandary:  How do you maintain order without punishment?  The Dear Colleague Letter spelled out the solution:  “‘restorative practices, counseling, and structured systems of positive interventions.’  Violent kids were rebranded as kids in pain.  Schools stopped suspending or expelling them.  And a newly invigorated era of mental health in public schools was born. ‘Restorative justice’ is the official name for schools’ therapeutic approach that reimagines all bad behavior as a cry for help” (p. 94). 

Consequently, says Shrier:  “For more than a decade, teachers, counselors, and school psychologists have all been playing shrink, introducing the iatrogenic risks of therapy to schoolkids, a vast and captive population” (p. 71).  Teachers—even in math classes—may very well begin the day by asking their students how they’re feeling and even engaging in forms of group therapy urging them to confess their deepest anxieties.  “Sometimes described by enthusiasts as ‘a way of life,’ social-emotional learning is the curricular juggernaut that devours billions in education spending each year and upward of 8 percent of teacher time” (p. 77).  In SEL sessions kids’ parents are often blamed for a variety of problems and referred to as “caregivers” or “service providers” who fail to treat their clients well.  Some “experts” even dismiss parents as “morons” incapable of parenting rightly.  Meanwhile, under their guidance kids behave worse and schools appear increasingly anarchical.

And their parents, determined to be “gentle” with their children, cooperate with the teachers and therapists who declare they know how kids should be reared.  Earlier generations, however, “had a more masculine style of parenting.”  Dads usually did the disciplining but moms certainly did their fair share.  “This is the style I’ve called ‘knock it off, shake it off’ parenting.  The sort that met kids’ interpersonal conflict with ‘Work it out yourselves,’ and greeted kids’ mishaps with ‘You’ll live.’  A loving but stolid insistence that young children get back on the horse and carry on” (p. 168).  The “Battered Mommy Syndrome”—kids punching and kicking their parents—was unheard of.  Three-year-olds weren’t asked for advice, nor did children dictate the dinner menu.  Youngsters were rarely considered particularly “sensitive” or “brittle” since they usually proved quite resilient in even difficult situations.

Indeed they can be resilient if only they’re left alone to grow up as kids have done for centuries.  They don’t need drugs or counselors or constant monitoring.  Shrier has determined to relax and let her kids take risks, fall down and get up without dramatics, make friends on their own, etc.  She’s persuaded that many alleged childhood “disorders” can be corrected by assigning chores and demanding respect for elders.  Above all she urges readers to discern and flee bad therapy.  On a broader scale, perhaps all of us should free ourselves from the bad therapeutic culture advancing upon us in virtually all our institutions.

                                                          * * * * * * * * * * * * * * * * * * * * *

Much that Abigail Shrier describes in Bad Therapy was discerned by one of the finest thinkers of the past century, Philip Rieff, in his The Triumph of the Therapeutic: Uses of Faith After Freud (Chicago: University of Chicago Press, c. 1966), wherein he warned educational and religious leaders to beware of psychological nostrums.  We’ve witnessed his prophetic insights as what Christian Smith called “moralistic, therapeutic deism” has secured a dominant position in the cultural landscape.  In our world, Rieff said, “hospital and theater,” fitness centers and films, are replacing family and nation.  “Religious man was born to be saved,” he noted, but “psychological man is born to be pleased.  The difference was established long ago, when ‘I believe,’ the cry of the ascetic, lost precedence to ‘one feels,’ the caveat of the therapeutic” (p. 25).  Within the Christian tradition, this trend surfaced as early as 1857, when “Archbishop Temple put into clear English what had been muddled in German ever since the time of Schleiermacher:  ‘Our theology has been cast in a scholastic mould, all based on logic.  We are in need of, and are actually being forced into, a theology based on psychology’” (p. 42).

Ancient and Medieval civilization developed through the subjugation of our sensual desires, choosing to follow moral standards rather than pleasures.  The highest kind of knowledge is attained through faith—knowing and obeying God.  The good life lies in conforming to creation and the Creator.  This tradition of self-discipline and responsibility—labeled by Rieff a “dialectic of perfection, based on the deprivational mode” celebrating martyrs and saints—“is being succeeded by a dialectic of fulfillment, based on the appetitive mode” (p. 50).  Rather than restraining himself, psychological man seeks to “be kind” to himself.  An egoistic ethic of self-esteem and tolerance replaces the ethic of repentance and sanctity.  The “ideal man,” from Plato to Tocqueville, was understood to be a “good citizen,” sacrificing his own interests for the welfare of others.  In the emergent therapeutic culture, however, the “ideal man”—as is evident everywhere from the Oval Office to the box office—knows how to amuse himself.

Beginning with Francis Bacon, however, a new approach, progressively shaped by psychoanalytic theory, has exerted control.  According to this theory, we must learn how to change what is, to “create our own” reality, to craft whatever suits our desires.  Marx wielded philosophy as a hammer and sickle for social change.  Freud proffered clients insights whereby they could choose whatever seemed desirable.  Jung and Adler and hosts of lesser folks followed suit, and our world is largely ruled by folks who want to rule!

This led Rieff into extensive discussions of Jung, Reich, and D.H. Lawrence—thorough, penetrating, illuminating analyses.  He showed, persuasively, how the “sexual revolution” has its roots in the likes of such intellectuals, and he made clear how effectively they have subverted Western civilization.
As a result of adopting this therapeutic approach, many folks in our society lack the arresting sense of sin which typified the classical culture.  Indeed, it is “incomprehensible to him inasmuch as the moral demand system no longer generates powerful inclinations toward obedience or faith, nor feelings of guilt when those inclinations are over-ridden by others for which sin is the ancient name” (p. 245).  No longer haunted by sin, modern man feels no need for salvation, no desire for a Savior.  So churches emphasize “religious” experiences and advertise “spiritual” therapies designed to help vaguely distressed people feel better.  Many, indeed, have “become, avowedly, therapists, administrating a therapeutic institution—under the justificatory mandate that Jesus himself was the first therapeutic” (p. 251).  Such, Rieff insists, is quite wrong-headed and needs to be rejected, but it’s what’s happened under the reign of modernity.

                                                          * * * * * * * * * * * * * * * * * * * * *

In My Life among the Deathworks: Illustrations of the Aesthetics of Authority (Charlottesville: University of Virginia Press, c. 2006), the first volume of a trilogy entitled Sacred Order/Social Order, Philip Rieff explored the fact that “cultures give readings of sacred order and ourselves somewhere in it.”  Throughout human history, James Davison Hunter explains, all cultures have been “constituted by a system of moral demands that are underwritten by an authority that is vertical in its structure.  . . .  These are not merely rules or norms or values, but rather doxa: truths acknowledged and experienced as commanding in character” (p. xix).  First World (pagan) and Second World (Judeo-Christian) cultures—to use Rieff’s categories—humbly aligned themselves with a higher, invisible Reality:  the Sacred.

The modern (what Rieff labels “Third World”) culture shapers, working out the position espoused in Nietzsche’s Gay Science declaring that “God is dead,” have negated that ancient sacred order.  Turning away from, indeed assailing, any transcendent realm, they have rigidly restricted themselves to things horizontal—material phenomena and human perspectives.  Rather than reading Reality, they actively encourage illiteracy regarding it—e.g. idiosyncratic “reader responses” to “texts,” the venting of personal opinions, and the construction of virtual realities.  Their relentless attacks upon the sacred are what Rieff calls “deathworks” that are both surreptitious and ubiquitous, shaping the arts and education, dominating movies and TV, journalism and fiction, law schools and courtrooms.  As he says:  “There are now armies of third world teachers, artists, therapists, etc., teaching the higher illiteracy” (p. 92).

Throughout his treatise, Rieff weighed the import of the raging culture war.  This Kulturkampf “is between those who assert that there are no truths, only readings, that is, fictions (which assume the very ephemeral status of truth for negational purposes) and what is left of the second culture elites in the priesthood, rabbinate, and other teaching/directive elites dedicated to the proposition that the truths have been revealed and require constant rereading and application in the light of the particular historical circumstance in which we live.  And that those commanding truths in their range are authoritative and remains so” (p. 17).  He especially emphasizes that:  “The guiding elites of our third world are virtuosi of de-creation, of fictions where once commanding truths were” (p. 4).  By denying all religious and moral truths, they have established an effectually godless “anti-culture.”

Rieff’s analyses of influential artistic works (many of them reproduced in the text) are particularly insightful and persuasive.  What was evident a century ago in only a few artists (James Joyce and Pablo Picasso) and psychoanalysts (Sigmund Freud and Carl Jung) now dominates the mass media and university classrooms, where postmodern gurus Michel Foucault and Jacques Derrida are routinely invoked.  One thing these elites will not acknowledge:  any transcendent “divine creator and his promised redemptive acts before whom and beside which there is nothing that means anything” (p. 58).  Nietzsche fully understood this, propounding “a rationalism so radical that it empties itself, as God the Father was once thought to have emptied himself to become very man in the Son” (p. 70).

Rieff’s grandfather, a survivor of the Nazi death camps, “was appalled to discover not only in the remnant of his family in Chicago but in the Jewish community of the family’s Conservative synagogue . . . that the Jewish sense of commanding truth was all but destroyed.  Those old traditions were treated as obsolete, replaced by the phrase that horrified my grandfather most:  everyone is entitled to their own opinion” (p. 82).  The nihilism of the Nazis flourished in Chicago!  To Rieff, Auschwitz signifies “the first full and brutally clear arrival of our third world” (p. 83).  But the death camps, both Nazi and Bolshevik, were simply the logical culmination of Hamlet’s ancient view that “there is nothing good or bad in any world except thinking makes it so” (p. 83).  What was manifest in Auschwitz, Rieff says, is equally evident in the world’s abortion mills!  In one of Freud’s letters, we read a “death sentence, casually uttered, upon sacred self:  ‘Similarly birth, miscarriage, and menstruation are all connected with the lavatory via the word Abort (Abortus).’  How many things,” Rieff muses, “turn before my eyes into images of our flush-away third world” (p. 104).  Rejecting “pro-choice” rhetoric, he insists:  “The abortionist movement does bear comparison to the Shoah [the Jewish Holocaust].  In these historic cases both Jews and ‘fetuses’ are what they represent, symbols of our second world God.  It is as godterms that they are being sacrificed” (p. 105).

My Life among the Deathworks, says Hunter, “is stunning in its originality, breathtaking in its erudition and intellectual range, and astonishing in the brilliance of its insights into our historical moment” (p. xv).  It is, however, “difficult, intentionally so,” because “Rieff wants the reader to work for the insight he has to offer; to read and then reread” (p. xvi).  The book rather resembles Pascal’s Pensées—a collage of aphorisms and illustrations (many of them paintings) rather than a systematic development of a thesis.  It does, however, richly reward the reader’s persistence.

* * * * * * * * * * * * * * * * * * * * *

Philosopher Mark Goldblatt would consider bad therapy a result of philosophical developments leading to the declaration “I feel, therefore I am,” which he analyzes in I Feel, Therefore I Am: The Triumph of Woke Subjectivism (New York:  Bombardier Books, c. 2022; Kindle Edition).  He cites G.K. Chesterton’s words from a century ago:  “We shall soon be in a world in which a man may be howled down for saying that two and two make four, in which furious party cries will be raised against anybody who says that cows have horns, in which people will persecute the heresy of calling a triangle a three-sided figure, and hang a man for maddening a mob with the news that grass is green.”  Reason is drowning in a sea of emotion wherein everyone decides what is true or false, right or wrong, on the basis of how it feels.  This is not exactly a unique moment, however, for Pilate asked Jesus “What is truth?”  He seemed to be tossing aside the possibility of Truth’s existence in any transcendent sense.  To him truth was simply instrumental, finding out what works to one’s own advantage.  He had the power to kill Jesus and did so, washing his hands in the process.  Today’s subjectivism is amply evident and clearly rooted in the perspectivism of Friedrich Nietzsche, who called Pilate the “solitary figure worthy of honor” in the New Testament.  In Goldblatt’s view, we are now “having a Pontius Pilate moment.  What is truth?  Whatever you will.  Whatever you can.  Whatever you dare” (p. 7).  Many of the major issues confronting us are, most deeply, questions of truth.  Is truth a clear seeing and accepting of what is—a “correspondence” between what I think and Reality—or is it merely what I imagine things are or ought to be?  Is truth objective or subjective?  “Objective truth is revealed by a careful examination of evidence and the application of logic to that evidence.
Objective truth is true regardless of our subjective feelings about it because it is anchored in the object of the belief or proposition; it is a relationship between out-there and in-here, an alignment between the two” (p. 15).  Philosophically it’s a form of realism.  Subjectivism, however, is a branch of idealism that “foregrounds not reality but perception.”  Following George Berkeley, “To be is to be perceived. Esse est percipi.”   As Goldblatt shows, many modern movements—Black Lives Matter, Transgenderism, et al—share this subjectivism.  MY truth is THE truth!  And feelings are triumphant!

375 An Apostolic Agenda

Taking seriously Pope Francis’ recent words to the Roman Curia—“Brothers and sisters, Christendom no longer exists!”—some scholars at the University of Mary (a college in Bismarck, North Dakota, that was established in 1987 and now enrolls some 5000 students) worked with James P. Shea to publish From Christendom to Apostolic Mission: Pastoral Strategies for an Apostolic Age (Bismarck, N.D.:  University of Mary Press, c. 2020; Kindle Edition).  This is hardly a novel concern, for in 1974 Archbishop Fulton Sheen said:  “We are at the end of Christendom.  Not of Christianity, not of the Church, but of Christendom.  Now what is meant by Christendom?  Christendom is economic, political, social life as inspired by Christian principles.”  It created a wonderful culture—gothic cathedrals, universities, hospitals, theologians and saints—for which we should give thanks.  But, Sheen insisted:  “That is ending — we’ve seen it die.”  Nevertheless:  “These are great and wonderful days in which to be alive. . . .  It is not a gloomy picture — it is a picture of the Church in the midst of increasing opposition from the world.  And therefore live your lives in the full consciousness of this hour of testing, and rally close to the heart of Christ.”  Shea and his friends want to do precisely that by recovering an apostolic mindset.

Our formerly Christian Western culture has been slowly but surely disintegrating.  Dealing with it brings challenges not faced by early missionaries proclaiming the Good News to a pagan world.  C.S. Lewis said it is the difference between a man wooing a young maiden and a man winning a cynical divorcée back to her previous marriage.  More disquieting:  many non-Christians actually call themselves Christian!  Many things have contributed to this development, including the massive changes wrought by technology.  But the “key battles our culture faces are intellectual ones” (p. 11).  Until we learn to think in truly Christian ways we’ll never evangelize our world.  These challenges will not likely be met by academics, for our institutions of higher learning “are often so decayed in purpose (apart from technical training) that not much wisdom or light is to be hoped from them; for various reasons, they can tend to deform rather than enlighten the minds of those who come under their influence.  Rather, what is needed is the sort of intellectual life that was characteristic of the Church in her early centuries, a life possessed to some degree by every Christian.  It is not simply or primarily a matter of college degrees but of the conversion of the mind to a Christian vision of reality and of readiness to live out the ramifications of that vision.  A compelling Christian narrative is called for, one that provides a counter to the secular vision, that helps Christians understand and fend off false gospels” (p. 12).

We must deal with the “spirit of the age” by casting a fresh vision, rooted in a new way of discerning truth, that offers people something more than this world affords.  As is true of any worldview, it will need to include philosophy, art, science, religion, et al. in setting forth a narrative describing the “cosmic battle for souls between God and the devil” and declaring the way to join the winning side.  In its beginnings, the Christian Church worked in accord with “an apostolic mode, by which is meant that she was making her way against the current of the wider society and needed to articulate and maintain a distinct and contrasting vision” (p. 19).  Different strategies were employed when addressing Jews or Gentiles—early evangelists dealt with the audience at hand, and we must also open our hearts to Christ and follow His way in our world.  We need to recover an apostolic zeal with strategies shaped to reach our generation, with “new movements and religious communities being born or rediscovering their vitality; institutions being founded or reformed; a deepening life of prayer and communal witness being expressed” (p. 38).  As ever, this will come about not by orchestrating mass movements but by heeding creative minorities who deeply believe in and proclaim the “Good News”:  “that God in his mercy has come among us to set us free from our sins and from slavery to the devil, and for those who turn to their true allegiance, the nightmare of life apart from God can be transformed into a dawn of eternal hope.  They need to know, from their own experience, that obedience to the Gospel is perfect freedom, that holiness leads to happiness, that a world without God is a desolate wasteland, and that new life in Christ transforms darkness into light” (p. 43).

  The Good News is good for all peoples at all times.  It’s embraced by individuals, one at a time, who find it both true and efficacious.  It’s generally more effectively proclaimed by witnesses than scholars, by missionaries than moralists.   People need to turn around, learn to think differently, to be truly converted, and:  “Every conversion is a marvel of grace, an astonishing work of God.  Saint Augustine once said that it was a greater miracle for God to save one sinner than to have created the whole world. Augustine’s comment points to the attitude appropriate to an apostolic age” (p. 45).  Embracing an Apostolic Agenda, modern missionaries “should assume that the majority of their hearers are unconverted or half-converted in mind and imagination and have embraced to some degree the dominant non-Christian vision” (p. 71).  So the Good News must be set forth “in such a way that the minds of its hearers can be given the opportunity to be transformed, converted from one way of looking at the world to a different way” (p. 70).

Importantly, getting converted means replacing a mechanistic with a sacramental worldview.  There are invisible as well as visible realities in our world, “and the invisible world is incomparably more real, more lasting, more beautiful, and larger than the visible.  Our blindness to that world represents much of our predicament.  We are caught by the illusion of the merely seen and need to have our blindness cured.  This drama involves us not only with the awful and marvelous and incomprehensible being of God, who created us with a decisive purpose in mind, but also with a cosmic struggle among creatures of spirit more powerful than we are, who influence human life for both good and evil.  We have been born into a battle, and we are given the fearful and dignifying burden of choice: we need to take a side.”  We are designed and destined for eternity.  Our lives make a difference because we’ve been created for a reason.  “Not only are we meant to know good things, happiness, strength, length of existence, but we have been created to experience the unthinkable:  to share in the very nature of God, to become — in the language so beloved by Eastern Christians — ‘divinized.’  Created from the passing stuff of the material world fused with an invisible and immortal soul, we are each of us meant to be what we would be tempted to call gods: creatures of dazzling light and strength, beauty and goodness, sharing in and reflecting the power and beauty of the Infinite God” (p. 76).  Now that’s Good News!

Each of us has an assignment if we’re to live as apostles.  We have only one life to live and need to live it well.  We must both take the world lightly and earnestly work to make it better.  If we’re honest we realize we’ll not much matter, as the world calculates things.  But Christians need to remember “that in dealing with even the smallest details of life, they are working out an eternal destiny.  They fight the darkness within themselves and embrace the life of love laid out for them by Christ, delighting in conforming their wills to his, knowing that obedience to him does not limit them or impede their self-development but rather brings them to their true selves, to freedom and fulfillment.  They live as exiles, in hope and hard fighting, waiting for the final triumph of God, full of gratitude for what they have been given, full of hope for all they have been promised, full of love originating in Christ toward others who need to hear the good news of a merciful and forgiving and gift-giving God” (p. 79).

This is a book written by Catholics for Catholics.  But it provides an analysis and agenda for all believers.  Given the collapse of Christendom, we need not fear, for God is with us.  And empowered by His Spirit we can do what the apostles did long ago:  proclaim and live out the Good News.

                              * * * * * * * * * * * * * * * * * * * * * * * * * * * *

In a sequel to From Christendom to Apostolic Mission: Pastoral Strategies for an Apostolic Age, Jonathan Reyes and some professors from the University of Mary have set forth The Religion of the Day (Bismarck, N.D.:  University of Mary Press, c. 2023; Kindle Edition), attempting, as the title specifies, to analyze today’s dominant religion and propose ways for Christians to counter it—searching for clues in the first century when “God in Christ came among us to wage a spiritual battle and, in every age since the time of its founding by Christ, the Church has been engaged in a kind of three-front war.  On one front, Christians fight an external battle against the unbelief of a fallen world; a second front is an internal battle against disloyalty and corruption among Church members; and most importantly, the third front is a fight against the darkness and unbelief of one particular member of the Church:  namely, ourselves.  Much of the nature of that battle is the same in every age: Jesus Christ is the same yesterday, today, and forever (Heb 13:8), and human nature, despite what many current philosophies want to suggest, is fundamentally constant” (p. 10).

Inasmuch as man is by nature incurably religious, the basic question in every era is what form religion takes.  The book’s authors think the religion of our age is Progressivism, which is basically Neo-Gnosticism (more definitively described as a “Modern Neo-Gnostic Progressive Utopian Revolutionary Religion”)—a revival of perhaps the most persistent heresy in Church history.  It was St Thomas Aquinas’s main adversary, and it has been clearly propounded by a series of thinkers since the Enlightenment.  Thus John Dewey, the American philosopher still influencing this nation’s educational and political classes (who helped draft the “Humanist Manifesto”), called for a “humanistic religion” focused on Man rather than God.  Progressives like Dewey think we can save ourselves, following a variety of self-help schemes, because all the evils in the world result from flawed material and social arrangements.  Remaking the world in accord with our needs and desires will enable us to become (as Eleanor Roosevelt famously said) “better and better” persons living with one another in perfect accord.  It is, as the authors perceptively declare, essentially “an expression of human pride” (p. 21).

The memorable lyrics of John Lennon embody the progressive religion:  “Imagine there’s no heaven; It’s easy if you try. / No hell below us; Above us, only sky. / You may say I’m a dreamer; But I’m not the only one. / I hope someday you’ll join us, And the world will be as one.”  Unpacking this more prosaically, the authors provide a helpful analysis of Twelve Aspects of Modern Progressive Religion, beginning with the typical Gnostic sense of alienation from the world, a feeling that something’s deeply wrong with the world as it is.  It’s not us—there’s no original sin in the Gnostic mind—but a world that’s deeply flawed.  “‘Not my fault!’ is the universal Progressive religious mantra” (p. 28).  Without remorse, without repentance, modern religionists must imagine or dream of something better attainable through esoteric knowledge of some sort, or remaking what is into what ought to be, even destroying the existing creation to make way for a better one.  If there is a Creator, He failed to make things as they ought to be.  The first great Christian critic of Gnosticism, St. Irenaeus of Lyons, “wrote that the essence of all Gnostic sects was blasphemy against the Creator,” a trait still evident in Neo-Gnosticism. 

“Voltaire’s famous cry against the Church, ‘Écrasez l’infâme!’ (‘Crush the loathsome thing!’) is the battle cry of Progressive believers against the order of the current world and against the God who is perceived as somehow standing behind it, as they insist on the utter annihilation of the structure of an oppressive system as the necessary prerequisite for the new age of freedom to come” (p. 31).  Man must master his world, technologically transforming what is into what he wants it to be.  Man’s knowledge provides the key.  Not “Jesus saves” but “we will save” sets the agenda.  Drawing upon the Hegelian/Marxist dialectic, progressives seek to ever be on “the right side of history,” and “morally up-to-date,” making the world a better place.  As was true of the French leftists in 1789, today’s progressives champion revolution:  “Revolution – the annihilation of the structures of oppression – is the privileged means by which Progressive belief will bring about the new age of freedom.  . . . .  This is clear in Karl Marx’s famous revolutionary dictum, ‘The philosophers have hitherto only interpreted the world in various ways. The point, however, is to change it’” (p. 45).  

We see the revolutionary ethos in today’s youthful protesters who cheerfully embrace violence.  To gain their goals they promote the “cancel culture” so evident in American universities.  No dissent is allowed, no gradualism will suffice.  All must be uprooted and replaced.  Oppressive systems must be destroyed.  Consequently, a “program of willful systematic amnesia begins with artifacts – statues, texts, uses of language – but if the requisite power is gained it always moves on to eliminating living humans.  The logic of tearing down statues and erasing words is the same as the impetus behind the French Revolutionary Reign of Terror, the Soviet gulag, the Chinese cultural revolution, and the Nazi death camps. The sources and expressions of evil must be hunted down and eliminated so that the pure society can properly arrive” (p. 47).  As gnostic movements have risen and fallen in the past, so too its modern expression will ultimately fail.  But in our day it’s powerful and virulent.  Its power stands exposed in the 2005 treatise by sociologists Christian Smith and Melinda Lundquist Denton—Soul Searching: The Religious and Spiritual Lives of American Teenagers—describing what young Americans believe.  “They famously coined the term ‘Moralistic Therapeutic Deism’ or ‘MTD’ to describe what those teenagers, including Christian ones, most commonly believed” (p. 65).  They think there’s a God out there somewhere who created the world and who mainly wants us to “be good, nice, and fair to each other, as taught in the Bible and by most world religions.  The central goal of life is to be happy and to feel good about oneself.  God does not need to be particularly involved in one’s life except when God is needed to resolve a problem.  Good people go to heaven when they die” (p. 66).  

This is of course anything but orthodox, traditional Christianity!  To address it we need not fear the darkness but learn to light candles illuminating it with Christ’s Light, to work with Him in rescuing the perishing.  We must begin by stressing the astounding fact of His Incarnation.  God really did become man.  The Maker of all that is actually lived among us—Immanuel, God with us.  “As C. S. Lewis once observed with the claim of the Incarnation in mind, ‘One must keep on pointing out that Christianity is a statement which, if false, is of no importance, and if true, is of infinite importance.  The one thing it cannot be is moderately important’” (p. 72).  Of all places, we must begin the battle for truth and righteousness within the Church!  The Progressive religion has poisoned too many professing Christians “who have abandoned key doctrines of the faith and have embraced some form of the neo-Gnostic gospel of personal self-creation” (p. 99).  They imagine themselves to be “Christians” but have never “encountered the risen Christ as their Lord and Savior” (p. 100).  They think everyone is basically good rather than sinful and needs not so much a redeemer as a cheerleader.  

Simultaneously we must do battle within our own souls.  “God’s kingdom is established on earth mainly by personal conversion and holiness:  the saints are the true movers and shakers of history” (p. 98).  Rather than agitating for social justice we need to focus on being justified, made right, by God.  We need less to march in the streets than stand patiently with Christ.  We fight for the Faith with spiritual weapons, resisting the devil and boldly declaring the Word of the Lord.  It’s better to be a martyr than an emperor.  Facing an increasingly non-Christian world we must nevertheless believe God providentially brought us into it.  Now is our time.  “We will neither be lost in nostalgia for a distant time in the past, nor will we fall into the trap of thinking that Christianity is now ‘outmoded’ and needing a fundamental change of belief or morality. Instead, we will seek wisdom to understand how Christ is responding to our times, as the Gospel of the One who is ‘ever ancient and ever new’ continues to reach out to save the lost” (p. 131).

                                      * * * * * * * * * * * * * * * * * * * * * * * * * *

Eric Metaxas, the author of the highly-acclaimed Bonhoeffer:  Pastor, Martyr, Prophet, Spy, recently published Letter to the American Church (Washington, D.C.:  Salem Publishing, c. 2022; Kindle Edition), applying insights gained from writing it.  He wrote “this book because I am convinced the American Church is at an impossibly—and almost unbearably—important inflection point.  The parallels to where the German Church was in the 1930s are unavoidable and grim” (p. ix).  We may fail to realize that we are as immersed in evil as were the Germans under Hitler’s control, but we are in fact facing anti-God ideologies such as “Critical Race Theory,” LGBTQ+ rationales,  and pro-abortion rhetoric.  Rather than identify and oppose them, too many churchmen have set aside the Gospel in order to please cultural elites championing such perversions.  Few German pastors in 1932 understood that one must act when there’s still time to do so and that small steps determine the course of one’s future.  

Too often the Church fears to appear judgmental, to condemn evil, to oppose persons and organizations promoting it.  But Metaxas wonders:  “Where did we get the idea that we shouldn’t be at the forefront in criticizing the great evil of Communist countries like China that brutally persecute religious minorities in ways that bring to mind the Nazis themselves?” (p. 5).  What we should learn from Bonhoeffer is the importance of resisting evil, discerning its presence and speaking out at its manifestations.  Pastors and theologians are especially responsible for doing so.  Unfortunately, in 1954 Senator Lyndon Johnson orchestrated legislation that forbade churches from endorsing political figures, threatening the churches’ tax-exempt status!  Inasmuch as they remained silent at this move to quiet them, “they behaved rather like many of the submissive pastors in Germany two decades earlier” (p. 8).  Still fearful of the taxman, all too many American churchmen still refuse to publicly hold politicians responsible for their behavior.  They’ve lost the courage to enter the public square and fight for justice.  Though few of us know much about the “Johnson Amendment” and the government’s capacity to quash religious freedom, churches saw that freedom vanish during the recent COVID-19 shutdowns.  Churches were actually deemed “non-essential” and ordered to close their doors.  Virtually all of them did!  Marijuana dispensaries and strip clubs stayed open but churches closed and pastors said nothing.   “When questionable medical procedures were being forced on their parishioners . . .  they meekly adopted the stance that it was the ‘Christ-like’ thing to submit and not to fight, nor even to mention such tremendously serious issues.  This was a deeply disgraceful moment for the American Church” (p. 12).

That moment came for Bonhoeffer when, on Reformation Sunday in 1932, he preached a message in an historic Berlin church.   “Rather than stroke the egos of those German elites slumbering in the pews, Bonhoeffer’s sermon was calculated to wake them up, if they were still able to be awakened” (p. 25).  Midway through his message, the authorities shut down the broadcast.  “To put it in our own modern parlance,” Metaxas says, he “had just been ‘cancelled.’”  Thenceforth he sought ways to resist the Nazis, helping lead the “Confessing Church” in opposition to the pro-Nazi, state-subsidized “Deutsche Christen” (“German Christians”).  In time, only 3,000 of Germany’s 18,000 pastors stood with Bonhoeffer.  Many of them would be arrested and killed.  The majority failed to discern what was actually happening.  “They could not believe that the Nazis were devotedly anti-Christian—and that they were essentially atheist and pagan tribalists working to eventually obliterate the Christian Church” (p. 48).  Somewhat the same is now taking place—witness the rainbow banners and BLM flags on churches.  Such acts compromise the Faith, and churchmen who commit them must be resisted.   People need courageous leaders, and “God expects those who have a voice to speak out for those who do not—who most of all tend to be the poorest among us” (p. 13).  The COVID pandemic has receded and the church doors have opened, but today we’re besieged by Critical Race Theorists who want to indoctrinate our children and by homosexual and transgender ideologists who work to undermine the Christian Way.  It’s time, Metaxas thinks, for some new Bonhoeffers!  

374 The War on Masculinity

Himself childless, C.S. Lewis still wrote:  “Children are not a distraction from more important work.  They are the most important work.”  Yet one of the more distressing developments during the past half-century is the failure of men to embrace their traditional roles as fathers of children and providers/protectors of women.  Whether this results from men simply discarding their responsibilities or from women emasculating them is hotly debated, but Nancy Pearcey offers a valuable perspective in The Toxic War on Masculinity:  How Christianity Reconciles the Sexes (Grand Rapids:  Baker Publishing Group, c. 2023; Kindle Edition).  Pearcey has written numerous highly-praised books, including Total Truth: Liberating Christianity from Its Cultural Captivity and How Now Shall We Live? (coauthored with the late Chuck Colson).  She was praised by The Economist as “America’s pre-eminent evangelical Protestant female intellectual,” and is currently a professor and scholar in residence at Houston Christian University. 

Though reared in a Christian home Pearcey struggled with her faith—in large part because of her highly-respected but abusive father.  In high school she discarded Christianity and became a committed feminist.  Then, wandering about Europe in search of something to live for, she stumbled into Francis Schaeffer’s L’Abri.  There, “for the first time I discovered that there exists something called Christian apologetics, and I was stunned.  I had no idea that Christianity could be supported by logic and reasons and good arguments.  Eventually I found the arguments persuasive and I reconverted to Christianity” (p. 14).  This move prompted her to rethink her feminist agenda in the light of biblical truth.  “So in a sense,” she says,  “I’ve been writing this book my entire life. As a little girl, I wondered how a man could sometimes be so wonderful and at other times so cruel.  As an adult, I have had to spend literally decades thinking through how to define a healthy, biblical concept of masculinity.  What is the God-given pattern for manhood?  How did Western culture lose it?  And how can we recover it?” (p. 14)

       She begins by noting that if masculinity is considered “toxic”—as it is by many—the best solution is emasculation!  Rip the maleness out of men!  Thus in 2018 the American Psychological Association (APA) issued guidelines for counseling men and boys, denouncing “traditional masculinity ideology” as “psychologically harmful.”  Influential gender studies professors justify hating men simply because they’re men.  There’s even a hashtag, #KillAllMen, and books titled I Hate Men, The End of Men, and Are Men Necessary?  From many cultural sectors comes a strong message:  masculinity, like arsenic, is toxic!  But Pearcey wants to celebrate what’s good about men and help them live up to the goodness of their creation.  She says:  “Because of testosterone, men are typically larger, stronger, and faster than women.  In general, they are also more physical, more competitive, and more risk-taking. We need to affirm these God-given traits as good when used to honor and serve others” (p. 18).  In fact:  “We should not make the mistake of equating masculinity with men’s bad behavior.  A biblical worldview tells us that men were originally created to live by the ideal of the Good Man, exercising traits such as honor, courage, fidelity, and self-control.  A healthy society is one that teaches and encourages a God-centered view of masculinity” (p. 22).

      The Good Man, Pearcey insists, generally attends church!  Contrary to the stereotypical patriarch—an angry man ruling the family with an iron hand and traumatizing  women and children—the best research shows that devout, conservative evangelicals, regularly going to church, are the least abusive, most admirable males in America.  Citing Brad Wilcox, a professor of sociology at the University of Virginia, director of the National Marriage Project, and author of Soft Patriarchs, New Men: How Christianity Shapes Fathers and Husbands, she argues that the more devout the man the better he is as husband and father.  Wilcox says:  “‘the happiest of all wives in America are religious conservatives. . . .  Fully 73 percent of wives who hold conservative gender values and attend religious services regularly with their husbands have high-quality marriages’” (p. 39).  

Though American evangelicals may never have heard of St John Chrysostom, they’re living out his admonition, given 1600 years ago:  “Let everything take second place to care for our children, our bringing them up in the discipline and instruction of the Lord.”  Rooted in New Testament teachings, Ancient Church fathers such as Chrysostom proposed a “mutuality in conjugal rights.  It was a symmetry ‘at total variance . . . with pagan culture,’ writes sociologist Rodney Stark” (p. 53).  Christian women enjoyed a much higher status in the church than in pagan society and played a significant role in its development.  As the “head” of the family the father should act as a servant seeking others’ well-being rather than a tyrant exercising his authority.  He should sacrificially enable his wife and children to find their calling and exercise their spiritual gifts.  Such men are, Wilcox says, “soft patriarchs.”  

Unlike conservative Christian men, however, American males are struggling, and there is, Pearcey thinks, a toxic side to their worldview and behavior.  To understand why, she conducts an in-depth historical search and finds the Industrial Revolution largely responsible.  As long as families worked together on farms or in cottage industries, most men took responsibility for their families and lived rightly.  During the colonial era, in New England “the ideal for manhood was not personal ambition or self-fulfillment but the subordination of one’s private interests for the common good.  As historian Gordon Wood explains, men ‘were expected to suppress their private wants and interests and develop disinterestedness—the term the eighteenth century most often used as a synonym for civic virtue’” (p. 77).  They were urged to be “Christian gentlemen.”  Hundreds of religious publications in the 17th century praised men for being, one historian says, “‘forgiving, magnanimous, benevolent, virtuous, moderate, self-controlled, and a worthy citizen’” (p. 101).  

As they moved from farms to factories, however, American men embraced a more competitive, acquisitive philosophy and relied less on Christian principles.  Rather than embracing moral standards, men in the 19th century were, one historian says, urged to be ambitious and strong in a competitive marketplace.  “By taking husbands and fathers out of the home, industrialization created the material conditions that made it more difficult to fulfill a biblical ideal of manhood.  Men were no longer physically present enough to be fully engaged husbands and fathers.  They spent most of their time in the public realm, which was growing increasingly secular.  The Industrial Revolution thus became a catalyst for the acceptance of secular views of masculinity” (p. 101).  With their men working away from home the women, almost by default, became the teachers and exemplars of virtue.  

So, as Frances Parkes said in 1825, the “world corrupts, home should refine.”  Thirty years later Ralph Waldo Emerson would hail women as the “civilizers of mankind.”  Harriet Beecher Stowe urged wives to “mother” their husbands for in time, she said, “the true wife becomes a mother to her husband; she guides him, cares for him, teaches him, and catechizes him in the nicest way possible.”  Given these cultural upheavals, women effectively took charge of families, schools and churches.  By the end of the century they constituted nearly 90 percent of Sunday morning churchgoers.  They generally had minimal doctrinal concerns but enthusiastically championed various reform movements—urging women’s suffrage, the abolition of slavery, and closing “down taverns, saloons, brothels, and gambling houses.”  However well-intended, these reform endeavors easily alienated men because they generally singled out male vices.  “As historian Mary Ryan points out, ‘Almost all the female reform associations were implicit condemnations of males; there was little doubt as to the sex of slave masters, tavern-keepers, drunkards, and seducers’” (p. 124). 

Deeply impacted by their fathers working away from home and their mothers taking charge of it were young men—fatherless sons.  More than a century ago Frances Willard, president of the Woman’s Christian Temperance Union, saw this as a serious problem needing attention, saying “God is the father, but how many families there are where the prototype of the divine is practically absent from Sunday to Sunday.”  When mothers tried to replace fathers their sons often rebelled, preferring to be a “bad” boy rather than a feminized weakling.  Consequently they were seen by some women as “Goths and Vandals”—little barbarians!  Boys read books celebrating cowboys, soldiers, and frontiersmen who found solace in wild, solitary places.  They found in the Boy Scouts an organization appealing to their “Noble Savage” urge.  The novelist Henry James spoke for many men in his novel The Bostonians (1886).  In the words of his male protagonist, Ransom:  “‘The whole generation is womanized, the masculine tone is passing out of the world; it’s a feminine, hysterical, chattering, canting age.’  Ransom announces his intention to recover ‘the masculine character, the ability to dare and endure’” (p. 146).  

Throughout the past century men have struggled to rightly recover their masculine character.  They’ve done so amidst the growing problem of fatherless boys, a problem that has now become a crisis.  Neither the government nor the schools nor the churches have figured out how to restore the family to health.  The greatest challenge we face may very well be getting men to be good fathers.   To do so will entail significant changes and sacrifices.  Work needs to be reduced to a secondary vocation, making fathering a man’s real work.  Taking time to attend church—and support her activities helping children grow up—must become a priority.  If Pearcey’s right, there’s little to hope for in the secular world.  But if Christians heed the call they can make a difference and become Good Men.  

                      * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * 

Whereas Pearcey still upholds many egalitarian aspects of her early feminism, Anthony Esolen sharply attacks it in No Apologies: Why Civilization Depends on the Strength of Men (Washington:  Regnery Gateway, c. 2022; Kindle Edition).  The acclaimed translator of Dante’s Divine Comedy and one of the best Catholic writers in America, Esolen writes “to return to men a sense of their worth as men, and to give to boys the noble aim of manliness, an aim which is their due by right” (p. vii).  He urges us to look around and see how much men have accomplished.  “Every road you see was laid by men.  Every house, church, every school, every factory, every public building was raised by the hands of men” (p. x).  Wherever hard, necessary, physical things get done men (not women) do them.  “The whole of your civilization rests upon the shoulders of men who have done work that most people will not do—and that the physically weaker sex could not have done” (p. x).  Men have nothing to apologize for! 

“Acquit yourselves like men,” Paul said in I Corinthians 16:13.  The Greek text is quite clear, for andrizeisthe means, literally, “Be men!”  Jerome’s Vulgate is equally clear:  “viriliter agite”—“Be men!”  Many modern translations, however, soothe feminist sensibilities by simply saying “be courageous” or “be brave.”  The admonition to men is erased!  So it goes even in the world of Bible translators!  To those who label masculinity toxic, Esolen replies:  “Who is toxic?  The word suggests something hidden, secret, sly.  Imagine someone sprinkling a bit of strychnine in the soup—not enough to kill, but certainly enough to make the diner sick.  That is similar to what is being done to boys in our schools and in mass entertainment.  They are told that there is something wrong with them because they are not like girls.  They are also told that girls can do all of the physical things they can, and perhaps do them better—an absurd falsehood.  Telling boys these things is poisonous, and I daresay it is intended to be so:  those who speak this way want the boys to be weaklings, to despise their own sex, to doubt their natural and healthy inclinations” (p. xiii).  Indeed, shouts Esolen, stop poisoning our boys!  Stop the teachers trying to make our boys little girls!  Enough!   

  Begin by dealing honestly with the facts.  Men are physically stronger—much stronger—than women.  Hundreds of high school boys run faster than female world champions.  “The strongest and fastest women in the world would be pulverized by a men’s professional football team.  You would not ask the score.  You would ask whether the women could stop a single play from scrimmage.  You would ask whether the women ended up in the hospital.  In fact, the best female athletes in the world would be made into mincemeat by a half-decent high school boys’ team.  They would be in danger of serious harm, because the boys would be heavier than they are, taller, faster, stronger, and with much more of that quick-surge muscle action that packs power into the shortest impulses” (p. 3).   Proving his point, recently “the Australian women’s World Cup soccer team was trounced, seven to two, by an under-sixteen boys’ team, and a similar thing happened to the American women’s team that actually won the World Cup” (p. 3). 

As in athletics, wherever you find mechanical systems sustaining modern technology you’ll almost certainly find that men designed and continue to maintain them.  So, Esolen says:  “If you call a plumber to deal with a sewer pipe that has backed up into your basement, it is a practical certainty that it is going to be a man, because the sheer strength required to deal with the valve rusted shut or with a section of pipe that has to be cut or muscled into place is like a threshold” (p. 40).  Sadly enough, our nation’s infrastructure (roads, bridges, etc.) is fraying and needs strong men to do the hard work necessary to repair it.  Where are we going to find such men when our boys are all told to go to college and get a desk job?  Women can’t do it and we’re not rearing boys to respect and embrace hard work.  It’s as if we think the world will run by “magic,” maintaining our comforts without requiring the hard work necessary to make it work.

Men often accomplish great things because they’re team players.  Eccentric geniuses certainly operate alone, but men typically want to get together to accomplish things.  Feminists frequently complain about the “old boys clubs” that keep women from succeeding, but in fact men simply like to be with other men.  They like to plan projects.  They launch hunting expeditions, as did Sioux men hunting bison, because working together is the only way to succeed.  “Out of the individual strengths and wills of the different men, you must create a new thing, a hunting party, whose members at work are less like separate individuals than like the limbs of a body” (p. 64).  The same instinct is at work when neighborhood boys come together to play football.  Esolen notes:  “For a very long time now, there have been girls’ basketball teams, and yet you rarely see a group of girls spontaneously organizing themselves for a game on a basketball court or spontaneously organizing themselves for a pickup game of softball.  Boys will invent more games in a year than girls have adopted from boys in fifty.  It is in their nature to do so” (p. 70).  

As they organize teams men embrace hierarchies.  Some men will be in charge of others, some skills will be more important than others.  There’s no egalitarian ethos on a sandlot baseball field or the NFL draft day!  “That men form hierarchies without embarrassment, and without necessarily destroying the real and important equality among them, is one of the most astonishing things we can say about them; it is something so common and so obvious that we do not even notice it.  But I say: if you do not have hierarchy, you will not only fail at civilization, you will fail even to have a strong tribe of savages in the woods.  You will not kill the bison” (p. 72).  A quarterback orders ten other players to carry out their assignment.  Should every man in the huddle be given equal opportunity to call the play?  Should every workman erecting a cathedral be allowed to design the building?  Effective teams can never be egalitarian.  Yet, apart from the task, such men may very well be best friends, comrades committed to treating each other as equals!  A team’s quarterback and cornerback have hugely different roles to play on the football field but may be inseparable friends attending the same church where the cornerback is considered an outstanding Bible teacher giving guidance to the team’s leader.  In a criminal trial a male prosecutor and his antagonist (the defense attorney) fight for their assigned side, then go out to dinner together with no injured egos.  They illustrate “the masculine capacity to set things in proper emotional compartments, to bracket, to feel and express great passion at one moment and then to set it aside as if it were irrelevant” (p. 94).

To Esolen:  “The miracle of culture and of civilization is the miracle of the transformation and redirection of masculine energy from the willful self to the team, the work crew, the school, and the army—for the sake of the home and the women at the center of the home, and, in the end, for the sake of the city and the nation” (p. 86).  So throughout the centuries men have worked together to build great things.  All-male Renaissance art studios gave us Michelangelo, Raphael, Titian, Tintoretto, and thousands of artists all over Europe.  We should be grateful!  That women were frequently excluded from such groups doesn’t trouble Esolen:  “No apologies, then, for the masculine institutions of the past.  Instead, we should question our refusing to grant to men and boys the opportunity or even the legal permission to form groups that are natural to them and that have proved to be so marvelously productive” (p. 87).  When boys build tree houses with signs saying “No Girls Allowed” let them be!  It’s part of the process of becoming a man as well as granting the “freedom of association” guaranteed by the Constitution.  

Nowhere are strong men needed more than at home.  Yet it’s everywhere evident that families are jeopardized by the shifting sands of modernity.  To Gabriel Marcel there was an “inexpressible sadness which emanates from great cities,” something rooted in “a self-betrayal of life” bound up in the most intimate fashion with the decay of the family.  In part this results from a feminist ideology saturated with envy.  One of the seven deadly sins, “envy is always looking cross-eyed—that is what the Latin invidia means—at something good that someone else enjoys, and wishing to ruin the enjoyment.  It is spiritual poison for weaklings.  Specifically, envy is the spiritual poison for feminists who see what healthy men and women enjoy, do not themselves enjoy it, and therefore want to ruin it for everyone else.  We can see this in academe.  Feminist scholars have discovered no neglected female Chaucer, so they must tear the actual Chaucer down and make sure that nobody else learns from him, calling him a racist and a rapist and whatnot.  They cannot of themselves produce a Shakespeare, so they must tear him down or wrench his meaning away from the Christian faith he so often portrays in dramatic action.  And on it goes. They have discovered no neglected female Titian, no neglected female Bach. There are none to discover” (p. 100).  So too they hate the traditional family and want to destroy it.

We need fathers—patriarchs—who rule wisely.  When they’re absent, boys turn aggressive and girls long for what’s gone.  Both go bad.  “If women lead men,” as is often the case today, Esolen asks, “where are the happy female bosses—and the joyful men they lead?  . . . .  Why do people in an egalitarian wonderland not sing their love of the sexes?  The truth is, as C. S. Lewis says, that love does not speak the language of equality.  It speaks the language of gratitude and superiority, of awe at the unique characteristics that make the beloved different from oneself. . . .  When fathers go absent, do not expect women to take their place” (p. 103).  We “can have patriarchy or not.  If not, you will either suffer anarchy—moral, intellectual, and civic—or you will suffer tyranny in your attempt to keep the anarchy from ruining everything . . .   You can have fathers who govern, or else you can have unattached and unaccountable males who take a dismal pleasure in doing nothing or a ferocious pleasure in destroying things—or sometimes alternate between one and then the other” (p. 105).  We need patriarchs.  Nothing else works.  It’s rooted in our nature as human beings.  No apologies!

Esolen brings to his discussion a deeply informed knowledge of the West’s best literature.  Citing Dante and Chaucer and Shakespeare and C. S. Lewis enables him to draw upon the wisdom of our civilization in building his case for men.  He also writes as a committed Christian, knowing the truth revealed in creation as well as Scripture.  He’s often ostracized for speaking the truth as he sees it—and he doubtlessly overstates some of his views—but he’s worth reading and heeding.  No apologies!  

373 “Science at the Doorstep to God”

For more than a decade Robert Spitzer, S.J., Ph.D., has been publishing a series of thoughtful treatises touching on science, philosophy, and theology.  His recent Science at the Doorstep to God:  Science and Reason in Support of God, the Soul, and Life after Death  (San Francisco:  Ignatius Press, c.2023; Kindle Edition) digs into current evidence lending credence to the Christian tradition.  He believes the intellectual “landscape is changing” with many of the old objections to the Christian faith collapsing.  Interestingly enough, younger scientists (66 percent) are more likely to believe in God than older ones and only one-third identify as agnostic or atheist.  Among physicians, three-fourths believe in God while only one-fifth claim to be skeptics.  “It is also worth noting,” says Spitzer, “that most of the originators of modern physics were religious believers, including Galileo Galilei (the father of observational astronomy and initial laws of dynamics and gravity), Sir Isaac Newton (father of calculus, classical mechanics, and quantitative optics), James Clerk Maxwell (father of the classical theory of electromagnetic radiation), Max Planck (father of quantum theory and co-founder of modern physics), Albert Einstein (father of the theory of relativity and co-founder of modern physics), Kurt Gödel (one of the greatest modern mathematicians and logicians and originator of the incompleteness theorems), Sir Arthur Eddington (father of the nuclear fusion explanation of stellar radiation), Werner Heisenberg (father of the matrix theory of quantum mechanics and the uncertainty principle), and Freeman Dyson (originator of multiple theories in contemporary quantum electrodynamics)” (p. 16).

Such intellectual giants were fully aware of the limitations of natural science, restricted as it is to observational data and inductive reasoning.  Scientific truths are not universal truths because they are focused on the empirical world, which can never be known in toto.  It’s certainly an important way of knowing—but not the only way.  Scientists (as scientists) cannot know, as do mathematicians, that numbers are quantifiable universal ideas, not empirical data.  Scientists (as scientists) cannot know, as philosophers do, that some truths are a priori, necessarily true, as in the laws of thought.  Scientists (as scientists) cannot know history as historians do, relying upon what Aristotle said are testimonies, credible eyewitness accounts.  Virtually all logicians insist that “intrinsic contradictions (like square circles or asserting a propositional statement is simultaneously right and wrong) are impossible (and therefore false) at all times everywhere, without exception.”  We also know many things about ourselves, derived from introspection and memory, that afford us important truths.  So Spitzer endeavors to show how evidence from a variety of trustworthy sources lends credence to trans-physical realities such as God, freedom, and immortality.  He believes the evidence will show that there must be a Creator and that man has “a transphysical soul capable of surviving bodily death, which is self-conscious, conceptually intelligent, transcendentally aware, ethical/moral, empathetic/loving, aesthetically aware, and capable of freely initiated actions” (p. 30).  

The best current scientific evidence shows that the universe came into being in an instant—the “big bang.”  Since Monsignor Georges Lemaître, a colleague of Einstein’s, set forth the Big Bang theory in 1927, a hundred years of studies have led, almost inexorably, to the conclusion that the material world is not eternal.  Lemaître “showed with great mathematical precision that the expansion of the universe as a whole was the best explanation of the recessional velocities of distant galaxies, but his conclusion was so radical that Einstein and others found it difficult to accept” (p. 35).  But in 1929 Lemaître’s theory was confirmed by Edwin Hubble at the Mount Wilson Observatory.   Hubble invited Einstein and Lemaître to speak at the observatory in 1933, and “Einstein reputedly said, ‘This is the most beautiful and satisfactory explanation of creation to which I have ever listened.’  Since that time, Lemaître’s theory has been confirmed in a variety of different ways, making it one of the most comprehensive and rigorously established theories in contemporary cosmology” (p. 36).  Everything points to an instantaneous beginning point!  “If a beginning of physical reality is a point at which everything physical (including mass-energy, space-time, and physical laws and constants) came into existence, then prior to this beginning, all aspects of physical reality would not have existed—they would literally have been nothing” (p. 52).  Ex nihilo—from nothing—everything that now exists began to be.  So Christians had proclaimed, purely on the basis of Scripture, for centuries.  Now cosmologists favor that view.

Still more:  the more we know, the more it appears that the universe is “fine-tuned” with a precision that defies chance and accident.  Consider, as did Roger Penrose, the low entropy of our universe in light of the Big Bang.  He calculated the improbability of this combination as a number “so large that if it were written out (with every zero being 10-point type), our solar system could not contain it.  It is the same odds as a monkey perfectly typing the manuscript of Shakespeare’s Macbeth with random tapping of the keys in a single try!  The odds of this happening by a one-off random occurrence is, by most physicists’ reckoning, virtually impossible.  Yet this low entropy did occur at the Big Bang, which allowed an abundance of life forms to develop within our very spacious and complex universe” (p. 65).  How could this be unless the world is more than mere matter-in-motion?  When Sir Fred Hoyle (one of the last stout, atheistic defenders of the “steady-state” theory) “discovered the need for exceedingly precise fine-tuning in the resonance levels of oxygen, carbon, helium, and beryllium needed for carbon bonding and carbon abundance, his atheism was shaken to the core.  Upon considering the options for how such precise fine-tuning might occur, he concluded as follows:  ‘Would you not say to yourself, “Some super-calculating intellect must have designed the properties of the carbon atom, otherwise the chance of my finding such an atom through the blind forces of nature would be utterly minuscule.  A commonsense interpretation of the facts suggests that a superintellect has monkeyed with physics, as well as with chemistry and biology, and that there are no blind forces worth speaking about in nature.  The numbers one calculates from the facts seem to me so overwhelming as to put this conclusion almost beyond question”’” (p. 58).  

Noted scholars have calculated that our “nearly flat universe” was most unlikely.  For it to be as it is, only one nanosecond after the Big Bang its mass density “would have to have been very close to 10²⁴ kilograms per cubic meter.  If the mass-energy had been only one kilogram per cubic meter more, the universe would have collapsed in on itself by now (inhibiting the formation of life), and if it had been one kilogram less per cubic meter (out of 10²⁴ kilograms per cubic meter), the universe would have expanded so rapidly that it would have never formed stars or galaxies necessary for life” (p. 69).  How it possibly happened is hard to imagine—but it seems to have happened precisely that way.  Still more:  “All four universal forces—gravitational, strong nuclear, electromagnetic, and weak—are exceedingly fine-tuned for life” (p. 70).  In the light of so many factors, Spitzer says:  “The ultimate explanation for fine-tuning will have to be not only transphysical (immaterial), but also intelligent to conceive the mathematical systems underlying our physical laws.  This transphysical intelligence will also have to transcend all material/physical processes, structures, and realities so that it can both conceive of those realities and infuse them with mathematical determinations and structures.  The ultimate explanation of fine-tuning, therefore, seems inescapably to be a transphysical/transmaterial conscious intelligence” (p. 95).  After compiling mountains of additional scientific evidence pointing to the fine-tuning of the universe, Spitzer says Fred Hoyle’s “superintellect” is in fact God, the “maker of heaven and earth.”  Such evidence points to the high probability of God’s existence, though empirical science can never definitively prove or disprove it.  “Recall that all scientific evidence must be grounded in observable data.  
But since God (an unrestricted reality transcending space and time) is not only beyond our universe (the furthest extent of our observational data), but also transcends our sensorial apparatus (and therefore can remain hidden), science will never be able to disprove His existence by its proper method” (p. 102).  Indeed, as the Psalmist said:  “The heavens declare the glory of God; and the firmament sheweth his handiwork” (Ps 19:1).

Turning from empirical science to philosophical metaphysics, Spitzer updates and defends Thomas Aquinas’ famous “proofs” for the existence of God.  The Angelic Doctor “seems to have been the first philosopher to have recognized the full implications of an uncaused reality existing through itself” (p. 111).  (This chapter builds on material presented in his earlier treatise, New Proofs for the Existence of God.)  He believes Aquinas had two important metaphysical insights:  a “distinction between existence and essence” and the “priority of existence over essence” (p. 132).  Rooted in these principles, he argued, in various ways, that whatever exists must have a cause, saying (in Spitzer’s words):  “Since everything in reality (except the one uncaused reality) must be a caused reality, and since all caused realities require an uncaused reality to be their first cause, then the one uncaused reality must be the first cause of everything else in existence.  This is what is meant by ‘the Creator of everything else in existence’.  Therefore, the one uncaused reality is the Creator of everything else in existence.  Conclusion:  Therefore, there must exist one and only one uncaused, unrestricted reality that is the Creator of everything else in existence.  To say otherwise requires you to argue a contradiction (an impossibility) or to deny the existence of everything (including yourself).  The unique, uncaused, unrestricted Creator is referred to as God.  Therefore, God, as defined, exists” (p. 113).  Inasmuch as things exist there must be an “uncaused, unrestricted Creator” sustaining them.  

Spitzer also presents evidence showing we are, by nature, more than mere mortals.  We have a non-material or transphysical soul that explains why we are able to do some very interesting and significant things.  This is especially evident in the many persuasive near-death experiences that have received scholarly attention for several decades.  Millions of people have reported having such experiences, and the research shows that they “cannot be thinking, seeing, recalling past memories, or remembering new data” with their biological brain.  There’s something more than grey matter at work here!  To Spitzer:  “Here is the mystery.  Even though these patients have no meaningful brain functions, they report being able to think, see, remember, and move.  What’s more, they report highly unusual data that can be validated by independent researchers after resuscitation” (p. 144).  Most amazingly, blind people actually see during their out-of-body states.  “The phenomenon of people blind from birth accurately reporting data throws all known natural explanations of near-death experiences into question because blind people have no visual images in their physical brains that could be projected into imagination, visualization, or hallucination” (p. 147).  To those reporting on their near-death experiences there is no question regarding the reality of their souls.  More than mere matter, we are most deeply spiritual beings.

A single anecdote (involving persons whom two of my recently deceased friends, Terry and Loretta Arnholt, knew quite well) is telling.  A young boy, Colton Burpo, the son of a Wesleyan pastor in Nebraska, had a near-death experience when he was four years old.  He told his parents he had sat on Jesus’ lap, heard angels sing, and met his great-grandfather.  “Most interestingly, he described an encounter with his deceased sister, who ran up to him and hugged him while he was in ‘heaven’.  She told him that she died in her mother’s tummy, and that she had not been named by their parents.”  When Colton told his mom he had two sisters she was perplexed, since she had never told him about her miscarriage.  But he insisted:  “I have two sisters.  You had a baby die in your tummy, didn’t you?”  She asked, “Who told you I had a baby die in my tummy?” … “She did, Mommy.  She said she died in your tummy.”  His mother Sonja tried to be calm but “was overwhelmed.  Our baby … was—is!—a girl, she thought.”  Sonja focused on Colton and asked, “So what did she look like?”  “She looked a lot like [his sister] Cassie,” Colton said.  “She is just a little bit smaller, and she has dark hair….”  Asked to name her, Colton said, “She doesn’t have a name.  You guys didn’t name her….”  “You’re right, Colton,” Sonja said.  “We didn’t even know she was a she.”  Then Colton said something that, the account notes, “still rings in my ears:  ‘Yeah, she said she just can’t wait for you and Daddy to get to heaven.’”  When Colton went to another room all his mom could say was, “Our baby is okay” (pp. 152-154).  

That we are souls indwelling bodies is further evident in our remarkable ability to think.  In defining us as “rational animals” the ancient Greeks were right on target.  We want to know—as young journalists learn—answers to questions regarding who, what, where, when, how, and why.  We cannot not think!  It’s ingrained in us to ask questions.  We do more than perceive things, as do animals, for we take sense perceptions and develop mental concepts.  Our language reveals this.  Only “3 percent of our words signify perceptual ideas, and about 97 percent, conceptual ideas” (p. 168).  We want to know what causes things to be as they are.  Aristotle’s enduring genius was evident when he showed how we invoke material, formal, efficient, and final causes to fully explain things.  To build a house we need wood and nails (materials), a plan (the form), a builder (the efficient cause), and a reason for building it (to secure shelter, the final cause).  Few ways of thinking make more sense—yet all too many moderns consider only material and efficient causes.  We can see differences and similarities in things.  We can understand that some things occur earlier or later than other things.  We can think abstractly, as is most evident in our use of language.  “In sum, without an understanding of high-order concepts such as ‘similarity’ and ‘difference’ (with respect to the question of what), ‘cause’ and ‘effect’ (with respect to the question of why), and ‘earlier’ and ‘later’ (with respect to the question of when), we would have no understanding whatsoever—no conceptual ideas, no predicates, no syntactically significant language; we would be reduced to the level of perceptual ideas alone” (p. 173).  Consequently, we must, Spitzer says, have a “preexperiential awareness of high-conceptual ideas” revealing “a transphysical origin capable of grasping relatability without reference to what is related.  This points to the existence of a transphysical soul” (p. 169).  

As does our self-consciousness!  “Self-consciousness was recognized to be transmaterial by Saint Augustine and Saint Thomas Aquinas, both of whom noticed that this act of self-reflectivity requires that the same act of consciousness be both content of thought and thinker of thought simultaneously” (p. 197).  We not only think—we know we are thinking.  We’re aware of ourselves and continually make decisions rooted in our ability to think.  We can make decisions because we’re free to do so.  Our reason and will make us free.  Whereas hard-core evolutionary determinists deny it, Spitzer counters with persuasive evidence favoring free will.  Scholars of religious experience such as Rudolf Otto have documented “a fundamental, prerational experience of what he termed ‘the numen’ (a spirit or divine power) underlying these experiences.  The numen is experienced as an interior presence of a transcendent ‘wholly Other’, which is mysterious, overwhelming, fascinating, and awe-inspiring, as well as desirable, inviting, and enchanting” (p. 216).  In such moments the sacred dimensions of reality impress us and we have a “spiritual awakening” that often makes all the difference in how we thenceforth live.  To ignore or deny such experiences diminishes us, for we are most fully human when knowing what’s ultimately real.  

We’re also deeply human when acknowledging our moral consciences.  We cannot not know that some things are right and wrong.  Just ask a seven-year-old boy whose bike has been stolen if he thinks it was right or wrong!  When we do wrong we generally want to make it right.  Our conscience generally speaks in a still small voice rather than through a loudspeaker, but it’s almost always speaking, and “John Henry Newman held that this guiding moral force is one of the most important spiritual dimensions of human beings.  He showed that closely examining it could reveal the presence of God within us” (p. 228).  Spitzer draws upon great literature (e.g., Dostoevsky’s Crime and Punishment) to illustrate conscience’s power, and shows how it points us toward God as “a divine, loving, Fatherly authority figure.”  As Spitzer notes, “Newman puts it this way:  ‘[When we are] contemplating and revolving on this feeling the mind will reasonably conclude that it is an unseen father who is the object of the feeling’” (p. 230). 

Long ago Plato identified five kinds of uniquely human, transcendental desires:  “the desire for perfect truth, perfect love, perfect goodness/justice, perfect beauty, and perfect being/home.  Saint Augustine and Saint Thomas Aquinas, as well as many contemporary philosophers such as Karl Rahner, Bernard Lonergan, Josef Pieper, and Jacques Maritain, have spoken of these same desires through the centuries.  What these philosophers recognized is that these five transcendental desires reveal that God is present to our consciousness, showing that we must be spiritual as well as physical beings” (p. 232).  To deny our transcendental desires is to reduce us to purely physical beings, which is too often done in the modern academy.  But to be truly human is to see in our desires something essential about us revealing something about the world beyond us.  However often we’re told there is no “truth” we keep coming back to affirm it by confessing our knowledge is imperfect.  Yet we would not know it’s imperfect unless we had a hunger for its perfection!  “Without at least a tacit awareness of perfect knowledge, we would not be able to grasp that our current knowledge is imperfect” (p. 234).  Similarly, our desires for love, justice, and beauty all point toward an ultimate Source Who simply IS the transcendentals. 

Though Spitzer writes for a general audience, at times his scientific expertise taxes this reader’s competence!  So when he endeavored to link quantum physics to the soul I was awed without fully grasping it all!  Nevertheless, his discussion of “Quantum Hylomorphism” is quite fascinating.  He notes that for centuries monistic materialists denied the soul and propounded theories widely embraced by scientists. “The whole of reality,” they said, “can be explained by material reality organized in more and more complex layers giving rise to higher-level activities, such as self-consciousness and thought” (p. 240).   Plato and Aristotle, Augustine and Aquinas, of course, challenged the materialists, and, interestingly enough, quantum physics may show how right they were!  Modern physicists generally talk of fields of energy rather than bits of matter in motion.  Consequently:  “If we consider material particles to be excited states of more fundamental quantum fields (as in quantum field theory) interacting in space-time (the curvature of which gives rise to gravitational effects), then we could say that the constituents of the physical world are not purely material in the way that early philosophers . . . conceived them.  Rather, physical reality has something in common with the content of a human mind—information fields that can be reduced to instantiated states capable of interacting with other physical realities and systems” (p. 243).  Then, perhaps, “a transphysical soul with conceptual ideas could act as a higher-order information field influencing all layers of lower-order information fields all the way down to quantum fields intrinsic to particles.  This would enable a free creatively intelligent self-conscious soul to interact with material reality at the lowest levels without being reduced to them” (p. 244).  
To Spitzer, these recent developments in science provide clear evidence that we are essentially spiritual beings by tying together “the laws of quantum mechanics, general relativity, and classical physics while allowing for an autonomous, self-conscious, rational, and transcendent soul integrated with the material world through the layering of information fields” (p. 249). 

Science at the Doorstep to God is a challenging read!  But it’s worth the effort—and it’s certainly worthwhile to know there are fully-informed Christians working to defend the faith once delivered unto the saints!  Surveys show that many youngsters abandon the Christian faith because they think science has disproved it.  Militant atheists such as Christopher Hitchens have persuaded them of this.  What they need to know, as Spitzer shows, is that many atheists have only a superficial knowledge of science and philosophy, while truly deep thinkers frequently acknowledge there must be a Mind behind our visible world, seeing that “the worlds were framed by the word of God” (Heb 11:3).

372 Still They Hate

Since September 11, 2001, there have been 35,000 terrorist attacks around the world.  Virtually all of them were orchestrated by Muslims.  Before 9/11 I’d rarely studied Islam and knew only elementary facts about Muslim history, so I then read and reviewed a number of books.  Considering the brutal attack of Hamas on Israel on October 7, 2023, I’ve decided to republish three of those reviews (with some slight updating).  To understand what motivates Islamic jihadists one should read first-hand accounts such as Brigitte Gabriel’s Because They Hate:  A Survivor of Islamic Terror Warns America (New York:  St. Martin’s Press, c. 2006), which give specificity to events in the Middle East.  Born in Lebanon when it was still a peaceful, prosperous, predominantly Christian country, she witnessed the chaos and destruction that followed the Islamic Palestinians’ invasion of her homeland 40 years ago.  Living in the United States, she wrote this book to warn Americans “that what happened to me and my country of birth could, terrifyingly, happen here in America” (p. 2).  We simply must know this:  “The main objective in the radical Islamists’ strategy to dominate the world is the destruction of the United States.  They know that if America, the keystone, falls, then the arch of Western civilization will collapse” (p. 169).  

The only child of elderly, prosperous parents in southern Lebanon, Gabriel enjoyed (for a decade) an idyllic childhood, blessed with parties, religious holidays, good schools, and friendly neighbors.  Things changed rapidly, however, as the nation’s “open door” immigration policies allowed thousands of Palestinians to enter the country.  Following the successful establishment of the nation of Israel, growing numbers of Palestinians lived in PLO refugee camps in Jordan and launched terrorist raids against Israel.  Weary of their troublesome presence, Jordan’s King Hussein expelled them in 1970.  Subsequently, “Lebanon was the only one of twenty-two Arab countries that was willing to open its borders to a third wave of Palestinian refugees” (p. 18).  These refugees quickly seized control of their host country.  Gabriel’s home and village, located near the Israeli border, were reduced to rubble as Muslims routinely shelled them.  “To a ten-year-old, all this—the civil war and the attack against us—was bewildering.  Just as people asked ‘Why do they hate us?’ after 9/11, one evening I asked my father, ‘Why did they do this to us?’  He took a long breath and paused, deeply concerned about what he was about to say.  ‘The Muslims bombed us because we are Christians.  They want us dead because they hate us’” (p. 33).  To Americans mystified by the terrorists’ attacks on 9/11—and by the Muslims’ rejoicing thereafter—she says:  “There is a three-word answer that is both simple and complex:  because they hate.  They hate our way of life.  They hate our freedom.  They hate our democracy.  They hate the practice of every religion but their own.  They don’t just disagree.  They hate” (p. 145).  

In 1982 Israeli troops occupied southern Lebanon and brought blessed peace to Gabriel’s region.  It was a military action that brought the kind of relief Europe experienced when the Nazis were defeated in 1945.  Protected by Israeli soldiers, she and her neighbors moved about freely and rebuilt their lives.  When her mother became seriously ill, Jewish military medical personnel took her to a hospital in Israel, where she received first-class treatment.  In that hospital a lifetime of anti-Jewish prejudice drained away from Gabriel.  The Israelis were even treating Islamic terrorists!  “I realized at that moment,” she says, “that I had been sold a fabricated lie by my government and culture about the Jews and Israel that was far from reality.  I knew for a fact, as someone raised in the Arab world, that if I had been a Jew in an Arab hospital, I would have been lynched and then thrown to the ground, and joyous shouts of ‘Allahu Akbar’ would have echoed through the hospital and the surrounding streets” (p. 79).  

In that Jewish hospital, Gabriel volunteered to serve as a translator.  This led in time to a job with a Jerusalem television station, where she worked for six years.  There the contrast between Judaism and Islam was striking.  On the Jewish side, “you see order, structure, cleanliness, and beautiful flowers planted everywhere” (p. 103).  A block away, in the Muslim section, dirt and disorder prevailed.  The “clash of civilizations” shone forth every day in Jerusalem.  At work, helping prepare daily newscasts, the clash seemed overwhelmingly clear, and she “began to realize that the Arab Muslim world, because of its religion and culture, is a natural threat to civilized people of the world, particularly Western Civilization” (p. 105).  Working as a journalist, Gabriel saw the astoundingly favorable treatment Western media gave homicidal thugs like Yasser Arafat.  Ever portraying the Palestine Liberation Organization in positive ways—and Israelis as villains—American journalists greatly helped the jihadists.  “Unable to defeat Western military superiority, our enemy depends on negative themes throughout the media to create disunity, opening schisms on the home front in our communities, on our campuses, and in our government” (p. 111).  Gabriel notes that “General Bui Tin, who served on the general staff of North Vietnam’s army, was asked why America was defeated in Vietnam.  He said:  ‘America lost because of its democracy; through dissent and protest it lost the ability to mobilize a will to win’” (p. 112).  In our “fight against Islamo-fascism” these words should give us pause.  Living in Jerusalem, she watched foreign TV “journalists” who “blew in, blew around, and blew out.  
They came with their preconceived ideas, toed the network editorial policy line, and perpetuated,” albeit unconsciously, the “subtle Arab and PLO propaganda, which had reached them wherever they came from.”  They loved to photograph “wailing Palestinians” and “kids throwing stones against border patrol soldiers firing tear gas and rubber bullets.  Because I could speak the language and read the Arabic press and knew the nuances behind events, I sensed that reporters were being manipulated” (p. 119).  Thus it was with both amazement and anger that Gabriel “watched the West fall further under the spell of anti-West, anti-Israeli propaganda, just as it did during its coverage of Lebanon, which portrayed the Palestinians and Islamo-fascists as the victims instead of the aggressors” (p. 119).  

Gabriel was alarmed by this because she had carefully observed developments in the Middle East—and America’s response to them—since 1975.  When, in the 1979 hostage crisis in Iran, President Jimmy Carter “alternately groveled and bungled, Ayatollah Khomeini exultingly proclaimed, ‘America cannot do a damn thing!’  This became a slogan and a battle cry throughout the Middle East” (p. 125).  Though markedly different from Carter in many ways, President Ronald Reagan behaved similarly in Lebanon.  When Hezbollah (subsidized and controlled by Iran) “blew up the marines in Lebanon in 1983, America turned tail and ran, leaving the Christians to be slaughtered in town after town.  It sent a strong, loud, and clear message to the Muslim radicals of the world, including Osama bin Laden:  America is no longer the power it used to be” (p. 125).  That being so, Sudanese Muslims, in 1983, launched a genocidal “jihad to impose Islam on black African Christians and animists in the southern part of the country” (p. 125).  Some two million innocent people were killed within a decade.  

She further provides brief accounts of other Islamic aggressions since 1979.  It’s a world-wide phenomenon with enormous implications.  And it’s taking place within the United States as well.  Radical Muslims, funded by Saudi Arabian petrodollars, are working hard to Islamize this country, though they present a benign face to the public.  “Masquerading as a civil rights organization,” for example, CAIR (the Council on American-Islamic Relations) “has had a hidden agenda to Islamize America from the start” (p. 138).  Gabriel documents and laments the degree to which Saudi money and compliant professors have established influential footholds for radical Islam on many university campuses.  For dealing with this threat, at home and abroad, this book, with its many suggestions concerning what to do, is most helpful.  

* * * * * * * * * * * * * * * 

An equally readable book, addressing the same issue and coming to basically the same conclusion, though from a markedly different perspective, is Nonie Darwish’s Now They Call Me Infidel:  Why I Renounced Jihad for America, Israel, and the War on Terror (New York:  Sentinel, c. 2006).  Darwish was born into an elite Egyptian family, and her father was a highly placed officer in Gamal Abdel Nasser’s army, considered “one of the most brilliant analytical minds found in the Egyptian military” by an Israeli historian (p. 255).  Unfortunately, he was assassinated by Israeli agents while stationed in Gaza in 1956.  In death, however, he became a celebrated “shahid,” a martyr for Islam, a national hero.  Subsequently, the family settled in Cairo, where Darwish received an excellent education in a Catholic girls’ school and then the American University in Cairo.  She enjoyed the unique economic and social privileges of her class.  But she was also fully immersed in the culture of Islam.  From the radio, as well as the mosques, came “calls to war and songs praising President Nasser.  Arab leaders were treated as gods and they acted as gods” (p. 33).  The call for jihad was ubiquitous.  “No Arab could avoid the culture of jihad.  Jihad is not some esoteric concept.  In the Arab world, the meaning of jihad is clear:  It is a religious holy war against infidels, an armed struggle against anyone who is not a Muslim” (p. 33).  Yet she found herself inwardly torn by some of the incongruities of her world, especially when dealing with “marriage and family dynamics.”  She managed to avoid the arranged marriages expected of  Muslim women.  And she observed that “at the heart of Islamic fundamentalism lies the most precious and important object, the woman.  She is the source of pride or shame to the Muslim man who rules and is ruled by the most despotic, tyrannical, and humiliating forms of governments on earth” (p. 66).  
Muslim men’s “honor is totally dependent on their female blood relatives” (p. 66).  Personal honor and integrity are not particularly important.  It is their women who establish their “honor”!  

Darwish also struggled with the reality of polygamy and its power in Islamic culture.  Married women fear their husbands will take a second wife—often secret liaisons divulged only at the man’s death, when his estate must be divided among all his wives.  Muslim women, consequently, distrust both their husbands and any single women who might attract them.  Then there is the “temporary marriage,” also known as “pleasure marriage,” empowering men to have one-night stands, “usually in exchange for money (calling it a dowry), and still feel that it is acceptable in the eyes of God” (p. 68).  Men may easily divorce their wives, whereas women must beg (often unsuccessfully) for a dissolution of a dysfunctional union.  Consequently, Darwish found “very few happy marriages around me” (p. 79).  As a single woman Darwish worked for several years at the English desk of the Middle East News Agency.  This gave her a unique perspective on the world and also occasionally allowed her to travel abroad.  She became aware of a world quite different from that described by the Egyptian media.  She also made friends with Copts—Christian Egyptians who had suffered for centuries.  In fact she fell in love with and married a Coptic man, with whom she immigrated to the United States in 1978.  

Landing in Los Angeles, she acknowledges that she “loved America even before seeing it” (p. 113).  She found Americans friendly and helpful, courteous, hard-working, generous and honest—virtues  largely absent in Egypt.  She worked for a Jewish businessman and found that most everything she had heard about Jews in Egypt was wrong.  “I asked myself, Why the hate?  What purpose does it serve?  What are Arabs afraid of?”  Indeed, she concluded:  “The Arab-Israel conflict is not a crisis over land, but a crisis of hate, lack of compassion, ingratitude, and insecurity” (p. 126).  American women differed from the women she’d grown up around.  They were supportive of each other, complimenting and helping in various ways.  “Moving to America,” she says, in a memorable passage, “was like being catapulted to another time in history.  America for me was not just a place for making money, having a job, a house, and car, it was a place for becoming a human being” (p. 130).  

Part of that process was religious.  Though she remained a Muslim she hungered for an authentically personal relationship with God.  “The truth is that most Muslims are a part of ‘political Islam’ rather than a religion and a personal relationship with God” (p. 136).  Islam, for her (and most Muslims) is a matter of birth and politics.  Mosques are mainly for men, whom women are expected to obey.  To her dismay, she found “that rabid anti-American feeling is rampant in the majority of U.S. mosques, where Muslims are encouraged to stand out as mujahadeen in America” (p. 140).  Using America’s democratic processes, these Muslims seek to ultimately control the nation.  Knowing the history of Islam, Darwish says:  “The current onslaught against our society is nothing new.  Conquering the world for Islam has been going on since the seventh century using pretty much the same tactics” (p. 144).  

In time, Darwish rejected her family’s religion, Islam.  One Sunday morning she was watching a Christian preacher on TV who was expounding the love chapter in Paul’s letter to the Corinthians.  She heard about “the love of God I was desperate for but was unable to find in my culture of origin” (p. 160).  Her daughter came in and announced that the TV preacher pastored the church that sponsored the Christian school she attended.  So Darwish determined to visit the church the following Sunday.  She did and heard “a message of compassion, love, acceptance, tolerance, and prayer for all of humanity” (p. 159).  This message differed radically from Muslim preachers’ hate-filled diatribes, urging hearers to “destroy the infidels.”  At that moment, sitting for the first time in that Christian church, this Muslim woman “was faced with a challenge, nothing less than the choice between love and hate” (p. 159).  She made a decision, and it made a difference.  Evaluating this, she writes:  “Many immigrants come to this great nation in search of material gain, which is fine; however, the biggest prize I gained was my religious freedom and learning to love.  For me it was nothing short of cataclysmic.  I had turned from a culture of hatred to one of love” (p. 161).  Though still nominally a Muslim, her God is not a jihadist!

Her new perspective provides readers a lens with which to evaluate developments in the Middle East.  When she made a brief trip to Egypt (arriving home in L.A. the night before the terrorist attacks of September 11, 2001), she saw again the deadening hand of Islam upon her land.  She heard again the lies about the Jews.  She sensed the irrational anti-Americanism promoted by the media, including the only U.S. media outlet available to Egyptians, CNN!  “To my surprise,” she says, “CNN contributed to Arab hatred and suspicion of America by regularly criticizing America and President Bush” (p. 175).  She noted the pernicious impact of money from Saudi Arabia, funding radical jihadists.  And she sorrowed at the injurious impact of Islam upon the nation’s women, including many in her own family.

Mystified at the silence of allegedly “moderate” Muslims who failed to denounce the jihadists, Darwish began writing and speaking, trying to inform America about the threat of radical Islam.  “In the Arab world,” she insists, “there is only one meaning for jihad, and that is:  a religious holy war against infidels” (p. 201).  That’s what we now face everywhere.  Portraying Islam as a peaceful religion “can only bring disaster between the two worlds” (p. 202).  She especially critiqued America’s universities, where Muslims are afforded unusual support and easily propagandize naïve students.  “The war of words and propaganda,” she warned, “could be as vital as the actual military war” (p. 211).  The message she declared is clear.  “After 9/11, my fellow Americans should never be in the dark again.  They must understand the brutality and persistence of their enemy” (p. 212).  Radical Muslims intend to conquer the world “and to usher in a Caliphate—that is, a supreme totalitarian Islamic government” (p. 212).  They will do anything possible to accomplish this goal.  “They are willing to bring about an Armageddon to conquer the world to Islam.  We are already in World War III and many people in the West are still in denial” (p. 212).  

She hopes that reading Now They Call Me Infidel will shake some of us out of such denial!  

                                              * * * * * * * * * *

Serge Trifkovic’s The Sword of the Prophet (Boston:  Regina Orthodox Press, Inc., 2002) set forth an admittedly “politically incorrect” perspective on Islam, its “history, theology, and impact on the world,” and portrayed today’s Middle East conflicts as a recent manifestation of an ancient religious struggle.  He basically reiterates the evaluation rendered by the great French thinker, Alexis de Tocqueville, whose analysis was unusually prescient:  “‘I studied the Kuran a great deal. . . .  I came away from that study with the conviction that by and large there have been few religions in the world as deadly to men as that of Muhammad.  As far as I can see, it is the principal cause of the decadence so visible today in the Muslim world, and, though less absurd than the polytheism of old, its social and political tendencies are in my opinion infinitely more to be feared, and I therefore regard it as a form of decadence rather than a form of progress in relation to paganism itself’” (p. 208).  Trifkovic began his book with the assertion that the 9/11 Muslim terrorists’ attack on the United States demonstrated an antipathy against the Christian world deeply rooted in Islam.  That so many in the media refer to Islam as a “religion of peace” shows that “the problem of collective historical ignorance–or even deliberately induced amnesia–is the main difficulty in addressing the history of Islam in today’s English speaking world, where claims about far-away lands and cultures are made on the basis of domestic multiculturalism assumptions rather than on evidence” (p. 8).  Just as left-leaning journalists and professors long avoided condemning the evils of Stalinist Russia, pro-Muslim “experts” have skillfully spread propaganda to gloss over the truth concerning Islam.  
Trifkovic wrote this book to set forth the facts and to counteract the propaganda.

Importantly, we learn much about Muhammad.  Born in Mecca in 570 A.D., he spent his early years working at menial jobs.  Then, fortuitously, he met a wealthy widow, Khadija, for whom he worked and whom he ultimately married.  Freed from financial concerns, he spent time in solitude and, in A.D. 610, received a message from an angel designating him as “the Messenger of God.”  In A.D. 622, he and 70 followers moved to the more hospitable city of Medina.  This event—the hijrah—marks Islam’s true beginning point.  Importantly, Muhammad turned from religion to politics, relying less on persuasion than coercion.  His followers raided camel caravans and enriched themselves and their prophet.  Evaluating the prophet’s career, Trifkovic says:  “Muhammad’s practice and constant encouragement of bloodshed are unique in the history of religions.  Murder, pillage, rape, and more murder in the Kuran and the Traditions seem to have impressed his followers with a profound belief in the value of bloodshed as ‘opening the gates of Paradise’ and prompted countless Muslim governors, caliphs, and viziers to refer to Muhammad’s example to justify their mass killings, looting, and destruction.  ‘Kill, kill the unbelievers wherever you find them’ is an injunction both unambiguous and powerful” (p. 51).  His example and teachings led quickly to what Trifkovic calls “jihad without end.”  The century following Muhammad’s death witnessed the success of Muslim armies, conquering much of the known world, creating “an Arab empire ruled by a small elite of Muslim warriors who lived entirely on the spoils of war, the poll and land taxes paid by the subjugated peoples” (p. 89).  Under Muslim rule, lush agricultural lands slowly withered away and became deserts.  Thriving economies, subordinated to Muslim dictates, slowly sank into impoverishment.  
“The periods of civilization under Islam, however brief, were predicated on the readiness of the conquerors to borrow from earlier cultures, to compile, translate, learn, and absorb.  Islam per se never encouraged science, meaning ‘disinterested inquiry,’ because the only knowledge it accepts is religious knowledge” (p. 196).  

Turning to the “fruits” of Islam, Trifkovic discusses such things as the absolute lack of religious liberty, the subjugation of women, and the widespread practice of enslaving non-Muslims.  He also shows how deeply embedded is the hatred for Jews in the Muslim world.  For example, during WWII the Mufti of Jerusalem and former President of the Supreme Muslim Council of Palestine, Haj Mohammed Amin al-Husseini, urged Muslims to support Hitler.  In a radio broadcast from Berlin, he said:  “’Arabs!  Rise as one and fight for your sacred rights.  Kill the Jews wherever you find them‘” (p. 186).  “Kill the Jews!”  That’s still chanted in Gaza and on American university campuses.  Still they hate!  As do Hamas’ American supporters!  

371. Favale & Feminism

Reading Abigail Rine Favale’s spiritual autobiography—Into the Deep:  An Unlikely Catholic Conversion (Eugene, OR:  Cascade Books, c. 2018; Kindle Edition)—prods one to consider both the strengths and inadequacies of evangelicalism, the breakthroughs and fallacies of feminism, and the reasons that led her to enter the Catholic Church.  Living in Utah and Idaho, surrounded by Mormons, she and her family “took refuge in a conservative evangelical bubble” (p. 9).  She cannot remember not being “Christian.”  At the age of three, responding to her father’s question concerning her readiness to accept Jesus into her heart, she said yes and was thus “saved.”  Subsequently, however, she wondered what had really happened.  Was she truly saved?  Again and again she would repeat the sinner’s prayer just to make sure she was right with God.  Though taught to believe “once saved, always saved,” she continually struggled to feel at ease with that position.  She’d learned “that I should turn to prayer and ask for forgiveness, but this led to a dizzying loop:  I was saved and thus already forgiven for my sins, freed from their penalty, yet I needed to ask for forgiveness for new sins I committed, from which I’d already been forgiven, because I was saved” (p. 128).  But for all its limitations, the religion of her childhood provided a secure social world and ample exposure to the teachings and stories of the Bible, and she was particularly fascinated by Old Testament women who seemed resourceful and self-reliant.

       When she was a senior in high school her parents began attending a large, growing church.  Though they thought it would be good for her it mainly provided opportunities to “troll for boys.”  Her earlier religious fervor faded as she wearied of swinging from enthusiasm to apathy, so she sought comfort by “careening from boyfriend to boyfriend, from love to crush to meaningless hook-up, without much care or awareness of how anything I did affected anyone else.”  Her “drug of choice was male attention” (p. 16).  Losing her virginity, she considered herself “damaged goods” and distanced herself from both her parents and the church.  “Toward the end of that year, I smoked and drank a bit, as if to round out the archetype of the rebellious teenager,” and dabbled with a Ouija board.  Yet:  “Even though I was living in an array of colorful sins, marooning myself from the grace of God, I was not particularly concerned for my soul and fancied myself impervious to demonic forces—because, after all, I was ‘saved’” (p. 16).  

       Graduating from high school and hoping life would somehow improve, Favale enrolled in a Christian university.  Though she never names it, a quick online search shows she attended George Fox University in Oregon.  She resolved to make a new start and in time met the man she’d ultimately marry.  During her freshman year she took a required New Testament class and began thinking about the male-female roles spelled out in Scripture.  Doing some research in the library, she “made a life-altering discovery:  Christian feminism.  On these shelves, I found ample resources to interpret the Bible in a way that confirmed my belief in the equality of the sexes before God.  By the end of the semester, just weeks away, I had embraced an evangelical feminist hermeneutic and wrote a term paper for that class with the provocative title ‘God is a Feminist’” (p. 21).  She began reading the Bible differently.  It was not to “be taken at face value as a clear-cut instruction guide for life, free from tensions and ambiguities.  No, this Bible was richer, scarier, multivalent, and in need of careful interpretation.”  She could now do hermeneutics, and that allowed her to believe in “the equal dignity of the sexes” and reject “strict gender roles” (p. 22).  Subsequently she moved from reinterpreting the Bible to disregarding it, losing confidence in its inspiration and authority, even trying “to pray to God as Mother, convinced that masculine language for God was a hangover from patriarchy” (p. 25).  Yet addressing God as Mother left her strangely unable to pray.

       In her freshman year Favale also took an introductory class in philosophy.  It opened her mind to thinking more deeply, examining her faith in ways never expected in her childhood.  Her professor was an Anglican priest who encouraged students to consider a sacramental form of Christianity that appealed to her.  She’d earlier tasted a bit of liturgical, eucharistic worship in a Lutheran church, but attending Anglican services awakened her to “a deeper sense of the sacred.”  She then began meeting with a small group of students, using the Anglican Book of Common Prayer as a devotional guide.  “Eventually, I decided, along with several of my compatriots—including boyfriend Dave—to be confirmed in the philosopher-priest’s small Anglican denomination” (p. 29).  Nevertheless, her Christianity steadily weakened as she more fully embraced the feminist creed.  She took a generally “postmodern outlook:  ultimate truth cannot be known by finite human beings, so we collectively create metanarratives of meaning to connect with what remains beyond us.  Christianity is one such narrative, perhaps the best and truest one, but not necessarily actually or absolutely true in its entirety” (p. 35).

       As she recalls:  “Untethering myself from tradition, Scripture, creed—at first this all seemed so liberating.  Everything was boundless potential; I could salvage what was meaningful and purge myself from the rest of it.  There were no limits.  I lopped those branches from the great ancient tree, just enough to keep afloat, and I paddled away from the shore alone, without a clear heading, whispering to no one in particular, ‘I am free, I am free, I am free’” (p. 43).  In fact, she was adrift, rootless and foundering, professing to be autonomous and empowered but inwardly quite otherwise.  “When I think about this time,” she says, “this is the image I see:  a girl adrift in the ocean, no land in sight, clutching onto a tiny, wooden raft, just a row of logs tied together.  Her feet churn in the water, touching nothing but dark fathoms beneath.  She is clutching the raft, which holds her afloat, but the raft itself is anchorless, rudderless.  They bob in the water together, waiting to wash up on some shore, any shore, so the world can seem steady again” (p. 42).

       Nevertheless she went off to graduate school in Scotland and returned to George Fox University as a professor of literature.   “In this Christian academic setting, I saw myself as an iconoclast in the trenches, battling for the soul of Christianity against the fundamentalists” (p. 46).  But then she became a mother!  She experienced “the sea change that happens to a woman once she gives birth, the inner transformation that occurs, one simultaneously subtle and earthmoving” (p. 48).  Seeing an ultrasound of her baby boy at 10 weeks of age led her to question her abortion-rights feminism.  She then encountered “the intractable reality of maleness and femaleness.  These are not mere social constructs” (p. 51).  Her pretended autonomy faded as she discovered her “I” becoming “We.”  She hadn’t imagined “the wild motherlove that would pull me out of myself” (p. 52).  She simultaneously sensed a need for church, something she’d disregarded  for too many years, so she made “a radical, unanticipated move” and shifted from being “a disaffected post-evangelical feminist on the brink of atheism” to considering becoming a Catholic! 

       For years she’d embraced a form of Christianity that mainly espoused social justice and love.  Theology and dogma mattered little.  But now she needed something more.  She found herself needing what Catholics call “actual grace”—not a forensic matter of “getting something you don’t deserve” but of receiving a supernatural infusion of God, enlivening and remaking a person.  With St. Augustine she confessed:  “Too late have I loved you, O Beauty so ancient, O Beauty so new.  Too late have I loved you!”  She’d at last begun to really love Him and began, for the first time in years, to pray, finding (with Benedict XVI) that “prayer, properly understood, is nothing other than becoming a longing for God.”  To be a Christian is to become one with God through the infilling of His Spirit.  Salvation “for the Catholic, involves actual transformation, an ongoing process of sanctification, so the love of God, which has been poured into our hearts by faith, can be kept alive and continually refine us.  Since salvation ultimately culminates in union with the triune God, the soul must be purged from sin altogether, not merely freed from sin’s consequences.  Contrary to popular misconceptions, this process is not something we accomplish on our own, through our own works and merits—that would be Pelagianism, a heresy rejected by the Church in the fifth century.  No, sanctification is only possible through supernatural aid—divine grace—and our active participation with that grace” (p. 87).

       Having explained her reasons for entering the Catholic Church, Favale devotes the rest of her book to explaining how that decision enabled her to become whole, nourished by the teachings and sacraments of the Church.  The power of God is evident as she enters into a truly devout life, relishing her role as a wife and mother.  Her prose is fluent, her story is compelling, and we learn much from her about our world.  

                                                * * * * * * * * * * * * * * * * * * * *

       In The Genesis of Gender:  A Christian Theory (San Francisco:  Ignatius Press, c. 2022), Abigail Rine Favale expands upon ideas set forth in her engaging autobiography, Into the Deep.  After a decade of deep engagement with postmodern feminism, earning a Ph.D. in women’s writing and gender theory at Edinburgh University, she found herself in 2015 teaching a course on gender theory at her alma mater, George Fox University.  Though she had abandoned orthodox Christianity years earlier, she saw herself as a revisionist called to “construct a new Christianity, fully purged of sexism, hierarchy, and sin” (p. 18).  She’d taught the course many times, but now she “was in the midst of two dramatic upheavals in my personal life:  the birth of my second child” and “a tumultuous conversion to Catholicism, which was upending everything I thought I knew.  I found myself both giving birth and being born—my body turned inside out to bring forth a daughter; my soul turned inside out to make room for Christ.”  A year earlier she’d joined the Church, thinking she could “become a ‘cafeteria Catholic’, lugging my cherished progressive beliefs into the Church and taking shelter under the canopy of conscience.  Then something terrible happened.  My conscience started to rebel.  The progressive beliefs I was carrying began to feel less like personal belongings and more like baggage:  burdensome and out of place” (p. 8).

       In the following weeks she began questioning the feminist dogmas that had long shaped her life.  She realized she’d been living in a darkness of illusions, mistaking rhetoric for reality.  Consequently, she felt like she had “been giving my students poison to drink.”  Sadly enough:  “For so many years, I’d been careless, careless with their minds and, most disturbingly, their souls” (p. 10).  Listening to this confession, one of her colleagues, uninterested in coddling her, came to the point:  “‘You know that verse in Matthew?  The one that says if anyone causes the little ones to stumble, it would be better for him to have a millstone hung around his neck and be drowned in the sea?  I’ve always thought it would be a good idea for us professors to have that tattooed on our arms’” (p. 10).  She needed to repent and change her location.

       Recently appointed a professor in the McGrath Institute for Church Life at the University of Notre Dame and fully aware of feminist ideology, Favale says college students now “inhabit a world where feminism has become mainstream, even in Christian circles.  Not to be a feminist is a major faux pas, tantamount to being anti-woman,” and virtually all universities offer classes in gender theory and feminist philosophy.  A few years earlier she’d “had to go out of my way to find feminism . . . but this is no longer necessary” (p. 25).   There is, she thinks, “an authentically Christian feminism,” but it’s not what’s taught in the universities, so her distinctive form of orthodox Christian feminism makes her a “heretic.”  She’s thankful for some of the good feminism afforded her, but “it ultimately brought me to a place at odds with Christianity, a place I will call the gender paradigm.  The gender paradigm affirms a radically constructivist view of reality, then reifies it as truth, demanding that others assent to its veracity and adopt its language” (p. 26).

       Clearly explaining the positions she now rejects, Favale devotes a chapter to the history and theories of feminism, a term which has been used for a century to describe a 19th century social reform movement, akin to abolitionism and prohibitionism.  Most of those feminists—constituting the first wave—focused on getting the right to vote, and “were not radicals or revolutionaries.  Most were middle-class wives and mothers, committed Christians who opposed abortion” (p. 50).  Their goals were reached with the passing of the 18th and 19th amendments to the Constitution.  Following World War II a second wave of feminism turned in more radical directions.  Using the recipe concocted by Betty Friedan’s The Feminine Mystique, “the Women’s Liberation Movement caught fire,” and “feminists began to actively rethink women’s roles within the home and in the workforce.  A major part of this effort was a renewed emphasis on so-called ‘reproductive freedom’—that is, unlimited access to birth control and abortion” (p. 51).  In the 1990s a “third wave” of feminism called for unbridled sexual expression, making consent “the lone benchmark for sex to be considered licit.  If a woman chooses a particular sex act, that sex act is good, even if it involves prostitution, pornography, or sadomasochism” (p. 52).  Then recently a “fourth wave” feminism made an about face, revealing a “growing ambivalence toward unrestrained sexual license, an emerging awareness that women can be mistreated even within the boundaries of what is technically consensual” (p. 52).  “Me too” became its litany.  

       Analyzing these feminist movements, Favale finds the atheistic existentialist/Marxist philosopher Simone de Beauvoir enormously important.  She wrote The Second Sex in 1949 and “was the first philosopher to give an account of male domination that pervades all spheres of human life and thought.”  Women were objectified by men, she insisted, and “female human beings are socialized to conform to this understanding of womanhood from birth.  This idea is behind her well-known line, ‘One is not born, but rather becomes, a woman.’  That statement is the mustard seed of gender theory” (p. 54).  To de Beauvoir, as to her mentor Jean-Paul Sartre, the world is meaningless and nothing is natural.  “Meaning must be made; it cannot simply be found.  It is up to us to justify our existence, to give it purpose.  We are not created; rather, we create ourselves, and failing to take up this work of self-creation is a moral transgression” (p. 55).  The second feminist thinker Favale examines is Judith Butler, who shifted the focus from “women’s studies toward gender studies.”  Following Michel Foucault and postmodern philosophy, her “primary goal as a theorist is to dismantle the normalization of heterosexual relationships—the tendency to see the male and female sexual relationship as normal and natural, which in theory-speak is called heteronormativity.  The idea that humankind is split into two sexes that are biologically complementary is, for Butler, a social fiction rather than a matter of fact” (p. 64).  Butler dismissed virtually all sexual standards, including the incest taboo.  Building on this, a black feminist theorist, Kimberlé Crenshaw, added “intersectionality” to today’s feminist agenda—making a worldview Favale calls “the gender paradigm.”  Importantly, “this paradigm is a godless one” and is “diabolic, in the literal sense.”  No God made us—we’re merely effusions of a material or social process, as Marx declared.  
“Reality, gender, sex—everything, even truth—is socially constructed.”  We are, as humans, nothing by nature.  So we can make ourselves whatever we want to be.  Today’s feminism lacks coherence because classical logic has been discarded as an aspect of toxic masculinity.  But Favale thinks it important to evaluate its framework so as to “understand how this framework differs from a Christian one.  Only from that foundation—from a solid understanding of competing worldviews—is it possible for Christians to mine feminist thought and praxis for hidden gems and to partner with secular feminists toward shared goals” (p. 73).  Doing so occupies several chapters of the book, as the author challenges us to think clearly about contraception, abortion, women’s liberation, autonomy, social engineering, preferred pronouns, transgender surgery, etc.  

       Having rejected the feminist gender paradigm, Favale sets forth her “Christian theory” of feminism—obviously relying on St. John Paul II’s theology of the body, a “personalism” that sees “each human being as a person, rather than a collection of ever-proliferating labels, and, more importantly, to attune our awareness to the sacramentality of every human body.  Bodies are not ‘just’ bodies.  Bodies are persons made manifest.  The sacramental principle is always at work: the visible reveals the invisible.  The body reveals to us the eternal and divine reality of the person—a reality that can only break into the tangible, sensible world through embodiment” (p. 120).  John Paul II insisted that “the body, in fact, and only the body, is capable of making visible what is invisible:  the spiritual and the divine.  It has been created to transfer into the visible reality of the world the mystery hidden from eternity in God, and thus to be a sign of it.”  

       Favale builds her case by contrasting the Babylonian and Hebrew creation stories.  In Genesis, God creates the cosmos ex nihilo.  “The God of Genesis has no parents; he does not come into being.  This absence of an origin testifies to his eternal presence.  He is not a being, like Marduk, but Being itself, the infinite ground of all finite existence” (p. 30).  God brought into being an orderly world, a good world, that includes human beings, male and female, who shared “a unique dignity, marked by the image of their Creator, and entrusted with the sacred work of cultivating life.  Sexual difference is not an extraneous or faulty feature of the cosmos but an essential part of its goodness” (p. 31).  Importantly:  “Genesis affirms a balance of sameness and difference between the sexes.  This is a delicate balance that is difficult, but necessary, to maintain.  Most theories of gender lose this balance, veering into extremes of uniformity (men and women are interchangeable) or polarity (men are from Mars, women are from Venus).  Both extremes lose the fruitful tension expressed here in Genesis” (p. 33).  Using the miracle of language, man describes what God created.  With Adam we can see things as they are—their essence—and name them.  Language does not construct reality—it describes it.  God created by saying “let it be,” and man sees and names what it is.  Contrary to a pivotal tenet of feminism, we do not construct reality—we should see and revere it.   

       Favale believes “that the constructionist view of language is a complete inversion of the correspondence view depicted in Genesis” (p. 37).  The Bible declares that we’re created beings, designed to behold and revere what God has made, to study and understand it as best we can, and to live in accord with its design.  The Christian tradition teaches us to “see the world as a created cosmos of which we are a part, this transfigures everything: embodiment, sex, suffering, freedom, desire—this is gathered up into an all-embracing mystery, an ongoing interplay between the human and divine. This imbues all-that-is with renewed significance” (p. 198).  There is a deeply teleological aspect to Christian thought enabling it to discern purpose and meaning in creation, for the “‘whatness’ of a thing, its essential identity, is connected to its purpose” (p. 198).  Drinking deeply from the well of Scripture and saintly writers, Favale rejoices to find how she fits into the cosmos and can walk humbly with her Lord.    

       This means accepting one’s body as a gift from God, something wondrous to be revered, not to be ignored or transformed.  Designed in His image, we ought to nurture and develop our body’s potential.  Openness to God, following the Virgin Mary in accepting His will, is our ultimate good and purpose.  “Our bodies are continual reminders to us that we are not autonomous, that the fantasy of self-creation is no more than a fever dream, a symptom of underlying illness” (p. 203).  Rejecting Genesis, postmodern thinkers declare that “we are not bodies animated by interior souls, but bodies shaped by external forces.”  Once the language shifts from “sex” to “gender,” sexual differences are seen as cultural stereotypes rather than naturally embedded realities.  Endless varieties of “gender” are now celebrated, none of them fixed by anything other than one’s inner feelings.  Consequently:  “In our postmodern moment, discussions about gender tend to revolve around appearance and roles.  To be a woman is to fulfill a particular social role, or to mimic typical feminine behavior and attire.  Feminism and its progeny, gender theory, centers the conversation on doing rather than being” (p. 206).

       But being truly matters.  Being open and yielded to God, choosing to say Yes to Him enables us to find what we long for.  “We can choose to receive all these things as gift.  We can choose to say yes to a Love that is stronger than death.  We can enter, even now, the eternal moment of Annunciation, when the yes of one woman becomes the fulcrum of redemption” (p. 211).  Be it done to me according to Your will!

370 America’s Cultural Revolution

       Thirty years ago I read and reviewed James Davison Hunter’s Culture Wars:  The Struggle to Define America.  A sociologist at the University of Virginia, he reported that Americans were deeply divided over such issues as abortion, homosexuality, and public school curricula.  On both sides (folks he labels “orthodox” and “progressive”) there were passionately committed individuals.  Those committed to “orthodoxy,” Hunter said, shared an allegiance to “an external, definable, and transcendent authority” whereas those committed to “progressivism” embraced modernity and tended “to resymbolize historic faiths according to the prevailing assumptions of contemporary life.”  Whereas the orthodox defined “freedom” economically, the progressives defined it socially (e.g. permissive sexuality); the orthodox defined “justice” socially (criminals should get what’s due them) while the progressives defined it economically (welfare should provide all for all). 

         The primary “fields of conflict” included:  family; education; media and the arts; law; and electoral politics.  Hunter detailed the struggles going on in these areas and concluded:  “the culture war is rooted in an ongoing realignment of American public culture and has become institutionalized chiefly through special-purpose organizations, denominations, political parties, and branches of government. . . .  In the end, however, the opposing moral visions become, as one would say in the tidy though ponderous jargon of social science, a reality sui generis:  a reality much larger than, and indeed autonomous from, the sum total of individuals and organizations that give expression to the conflict.  These competing moral visions, and the rhetoric that sustains them, become the defining forces of public life.”

       What Hunter described three decades ago is more ominously explained and analyzed by Christopher F. Rufo in America’s Cultural Revolution:  How the Radical Left Conquered Everything (New York:  HarperCollins, c. 2023; Kindle Edition).  “This book,” he says, “is an effort to understand the ideology that drives the politics of the modern Left, from the streets of Seattle to the highest levels of American government,” and its lesson “is a serious one.  There is a rot spreading through American life” (p. xi).  It clearly began spreading in 1968 when the world was jarred by nihilistic “student uprisings, urban riots, and revolutionary violence that has provided the template for everything that followed” (p. 2).   Subsequently,  the revolutionary ideas unleashed in the ‘60s have shaped our world.     

       The most influential “father” of the revolution was Herbert Marcuse, a German philosopher who sought refuge in America when Hitler took control of his country.  Rather than embrace the country that gave him refuge, he worked to destroy it, supporting radical groups such as the Weather Underground (led by Bernardine Dohrn and Bill Ayers, who later helped Barack Obama) and the Black Liberation Army.  While teaching at the University of California, San Diego, Marcuse gave a lecture in London in 1967 urging hearers to launch a counter-culture, fomenting a cultural revolution to upend Western Civilization.  In the audience were black militants (including Stokely Carmichael and Angela Davis) who quickly responded to Marcuse’s revolutionary rhetoric, joining students around the world marching to slogans celebrating “Marx, Mao, Marcuse” and calling for the seismic changes needed to usher in a communist utopia.  

      This “New Left” would generally cite racial injustices, feminist woes, and environmental concerns rather than economic inequities to promote the cause.  Having early abandoned hopes of waging guerrilla warfare a la Che Guevara, its members would use “critical theories” to foment the revolution.  Rufo effectively shows how we now live “inside Marcuse’s revolution,” wherein he “posited four key strategies for the radical Left:  the revolt of the affluent white intelligentsia, the radicalization of the black ‘ghetto population,’ the capture of public institutions, and the cultural repression of the opposition.”  In Rufo’s opinion:  “all of these objectives have been realized to some degree,” thus instantiating the “‘transvaluation of all prevailing values’ that Marcuse had envisioned” (p. 11).  In setting forth a “cultural” Marxism, Marcuse envisioned a “dictatorship of the intellectuals” rather than Lenin’s “dictatorship of the proletariat”—putting folks like himself and his university-trained devotees in charge of things.  Writing to Rudy Dutschke, a leader of Germany’s “new left,” he declared that the “long march through the institutions” was the way to effectively orchestrate the revolution.  And the primary institutions to be taken captive would be the universities and, through them, the public school systems.

     Radicals such as Angela Davis (one of Marcuse’s most devoted disciples) would find positions in America’s finest universities.  Davis herself would teach at UCLA, Vassar, and ultimately (for many years) at the University of California, Santa Cruz.  Weather Underground terrorist Bernardine Dohrn became a professor of law at Northwestern University.  Her husband and fellow terrorist, Bill Ayers, joined the faculty at the University of Illinois as a professor of education.  They found intellectual ammunition in Paulo Freire, whose Pedagogy of the Oppressed became a manual for revolutionaries.  The Brazilian Marxist’s book, which praises Lenin, Mao, Guevara, and Castro, “sold more than one million copies and is now the third-most-cited work in the social sciences.  It has become a foundational text in nearly all graduate schools of education and teacher training programs” (p. 145).  Freire’s devotees in America began publishing books and finding positions in universities.  In the 1980s “the critical theorists of education began methodically deconstructing the existing curricula, pedagogies, and practices, and replacing them, brick by brick, with the ideology of revolution” (p. 162).

       To illustrate the radicalization of the schools Rufo devotes a chapter to the “Child Soldiers of Portland.”  The city has few minority residents, but “it has become the headquarters of race radicalism in the United States.  The city has elevated white guilt into a civic religion.  Its citizens have developed an elaborate set of rituals, devotions, and self-criticisms to fight the chimeras of ‘systemic racism’ and ‘white supremacy.’  The ultimate expression of this orthodoxy is violence:  street militias, calling themselves ‘anti-racists’ and ‘anti-fascists,’ are quick to smash the windows of their enemies and burn down the property of anyone who transgresses the new moral law” (p. 188).  Just as Bolsheviks recurrently set forth “five year plans,” Portland’s “government has adopted a series of Five-Year Plans for ‘equity and inclusion,’ shopkeepers have posted political slogans in their windows as a form of protection, and local schools have designed a program of political education for their students that resembles propaganda” (p. 189).  By asserting “America is fundamentally evil, steeping children in the doctrine of critical pedagogy and lionizing the rioters in the streets, the schools have consciously pushed students in the direction of revolution” (p. 189).  Thus the city’s “child soldiers” occupy and vandalize downtown Portland!  

       Though the radicals first targeted the schools, their ultimate objective was political:  gaining control of the country.  To do so “critical race theory” proved pivotal.  After a distinguished career as a civil rights attorney, Derrick Bell became a Harvard professor of law and wrote a “thousand-page casebook called Race, Racism, and American Law, outlining ‘critical race theory.’  At the same time, he was intimately connected to the left-wing radical milieu:  Bell had provided legal support to Angela Davis at her murder trial, studied the critical pedagogy of Paulo Freire, and maintained a close relationship with Black Panther Party members such as Kathleen Cleaver, the wife of Eldridge Cleaver” (p. 205).  At Harvard he spent little time with colleagues but enlisted zealous students hungering for his “left-wing racialist ideology” and attuned to “the rhetoric of elite grievance.”  He rooted his position in the works of Antonio Gramsci (an Italian communist) and Paulo Freire, setting “the stage for the racial politics of our time” (p. 206).  His followers, claiming to be “critical race theorists,” would assail “the founding principles of the country, making the argument for dismantling colorblind equality, curtailing freedom of speech, supplanting individual rights with group-identity-based entitlements, and suspending private property rights in favor of racial redistribution” (p. 207).  They rapidly found positions in the nation’s elite law schools and courts, as the career of Supreme Court justice Elena Kagan illustrates.  

       The ever-insightful conservative black economist Thomas Sowell, whom Bell had attacked as a race traitor, offered an explanation of Bell’s predicament.  “Derrick Bell was for years a civil-rights lawyer, but not an academic legal scholar of the sort who gets appointed as a full professor at one of the leading law schools.  Yet he became a visiting professor at Stanford Law School and was a full professor at Harvard Law School.  It was transparently obvious in both cases that his appointment was because he was black, not because he had the qualifications that got other people appointed to these faculties,” Sowell said.  “Derrick Bell’s options were to be a nobody, living in the shadow of more accomplished legal scholars—or to go off on some wild tangent of his own, and appeal to a radical racial constituency on campus and beyond.  His writings showed clearly that the latter was the path he chose.”  And this path, in Sowell’s view, was a tragic turn.  Bell’s “previous writings had been those of a sensible man saying sensible things about civil-rights issues that he understood from his years of experience as an attorney.  But now he wrote all sorts of incoherent speculations and pronouncements, the main drift of which was that white people were the cause of black people’s problems” (p. 227).  Sadly enough, Sowell concluded:  “He’s turned his back on the ideal of a colorblind society and he’s really for a getting-even society, a revenge society” (p. 231).  

       Bell’s influential followers (including Kimberlé Crenshaw, who would set forth the notion of “intersectionality”) substituted race for class in Marxism and embraced some of “the most acidic parts of modern thought, beginning from the assertion that ‘objective truth, like merit, does not exist,’ continuing to the Derrick Bell–style posture of ‘deep dissatisfaction with traditional civil rights discourse,’ and ending with a call for a ‘war of position’ against whiteness, colorblindness, private property, and traditional constitutional theory” (p. 233).  Citing postmodernists such as Jacques Derrida and Michel Foucault, they insisted “truth is a social construct created to suit the purposes of the dominant group” and rejected the natural law tradition basic to America’s founding.  “They wanted to replace the old system of colorblindness, equality, and individual rights with a new system one might call a theory of ‘racial reasoning’” (p. 235).  Rather than reason logically from propositions to conclusions, they substituted “lived experiences,” anecdotes of oppression, allegations of victimization.  Consequently:  “personal offense becomes objective reality; evidence gives way to ideology; identity replaces rationality as the basis of intellectual authority” (p. 237).

       “Critical race theory,” Rufo says, “was never designed to reveal truth—it was designed to achieve power.  The real history of the discipline is not a story of its intellectual discoveries, but of its blitz through the institutions” (p. 249).  Its success is best evident in the widespread emphasis on DEI (Diversity, Equity, Inclusion) throughout much of America.  Embraced by virtually all universities, it’s now spreading through the nation’s public schools and is fully endorsed by the National Education Association.  The Obama Administration, implementing the Dodd-Frank bill, created Offices of Minority and Women Inclusion in numerous federal agencies, so bureaucracies such as the FBI and EPA now require employees to sit through training sessions designed to ensure DEI throughout their ranks.  So, following the unrest sparked by the killing of George Floyd, “the National Credit Union Administration told employees America was founded on ‘white supremacy.’  The Department of Homeland Security told white employees they have been ‘socialized into oppressor roles.’  The Centers for Disease Control and Prevention hosted a thirteen-week training program denouncing the United States as a nation dominated by ‘White supremacist ideology’” (p. 255).  

      The nation’s preeminent corporations—especially those needing governmental support—have fallen in line.  “Lockheed Martin, the nation’s largest defense firm, sent white male executives on a mission to deconstruct their ‘white male privilege.’  The instructors told the men that ‘white male culture’ and the values of ‘rugged individualism,’ ‘hard work,’ and ‘striving towards success’ were ‘devastating’ to minorities” (p. 256).  Conducting the training sessions are folks like Johnnetta Cole, a Marxist “scholar-activist” with a history of leading communist organizations such as Venceremos Brigade (a pro-Castro group) and the July 4 Coalition (an ally of the Weather Underground).   She is now hired to conduct training sessions for federal bureaucrats.  Fixated on slavery, Cole holds all whites responsible for it inasmuch as they benefit from “a system that’s based on racism.”  Even “good and decent [white] people” stand guilty for the “racial terrorism” still harming the nation.  Blacks, she claims, suffer from “post-slavery traumatic syndrome” and the “deep emotional and physiological toll of racism.”  Earlier she had championed the Soviet Union and served on the editorial board of the journal Rethinking Marxism. “Now she was promoting it as an official contractor of the United States government.  After fifty years, the long march had been completed.  The radical Left had finally won its Gramscian ‘war of position’ and attained ideological power within the American state.”  Illustrating this ideological victory, “on his first day in office, President Joseph Biden issued an executive order seeking to nationalize the approach of ‘diversity, equity, and inclusion’ and ‘embed equity principles, policies, and approaches across the Federal Government.’  In business, every Fortune 100 corporation in America has submitted to the ideology of ‘diversity, equity, and inclusion’” (p. 265).  

       Concluding his study, Rufo says:  “The story of America’s cultural revolution is one of triumph.  The critical theories have become the dominant frame in the academy.  The long march through the institutions has captured the public and private bureaucracy.  The language of left-wing racialism has become the lingua franca of the educated class” (p. 269).  

       But he hopes a “counter-revolution” will restore health to this nation.  In part this is because “the radical Left cannot replace what it destroys” (p. 275).  History shows this.  The 1789 French Revolution collapsed into the abyss of Thermidor; the revolutions throughout Europe in 1848 were quickly co-opted by the much-maligned bourgeoisie; the 1917 Bolshevik Revolution fell rapidly into Stalinist tyranny.  All these revolutions were essentially nihilistic, bent on destroying.  We need a counter-revolution of hope, a positive commitment to enduring values and just political systems.  Who might lead it remains to be seen!

                                    * * * * * * * * * * * * * * * * * * * * * *

       Joanna Williams is an English journalist who has taught in universities and published in prestigious newspapers and journals.  She has recently written How Woke Won:  The Elitist Movement that Threatens Democracy, Tolerance and Reason (London:  John Wilkes Publishing, c. 2022; Kindle Edition).  Jonathan Haidt, co-author of The Coddling of the American Mind, says:  “This book is the essential guide for our era of confusion and incoherence as moral revolutionaries tear down statues, institutions and widely held values.  With clear thinking and gripping storytelling, Williams explains how a minority of the elites in Britain and America were able to intimidate the rest of the elites into silence or complicity, imposing a ‘revolution from above’ that is anti-democratic and cruel.  Anyone who wants to restore sanity, beauty or simple humanity to our public life should read How Woke Won.”

       After sketching a short history of “wokism,” Williams surveys the current “culture wars” and finds wokism firmly established in powerful English-speaking institutions.  That “Woke has conquered the West” became clear when President Joe Biden, on his first day in office, “signed an Executive Order permitting boys who identify as girls to compete on female sports teams and enter female changing rooms” (p. 14).  To be “woke” means to be conscious of and committed to overcoming racism and social injustice.  The word gained currency following the death of Michael Brown in Ferguson, Missouri, when activists proclaiming “black lives matter” urged folks to “stay woke” and dismantle the nation’s racist establishment through identity politics.  Woke folks especially stress proper words enabling them to identify folks as oppressors and oppressed.  So one must say “people of color” rather than “colored people,” “Latinx” rather than “Hispanic,” and “sex assigned at birth” rather than “male” or “female.”  For those of us who are puzzled by these continually-shifting phrases, it’s nice to know that woke language is generally “convoluted, indecipherable and alienating” (p. 35).  

       However convoluted their language, woke writers rigorously censor their predecessors for linguistic sins, so the “battle for the past” looms large in the culture wars.  They disdain traditionally celebrated virtues such as “stoicism, courage, resilience, duty, sacrifice and self-control” and celebrate victimhood as the most admirable and coveted attribute.  Statues of Winston Churchill must be torn down because he supported the British Empire.  One of the finest novels of the past calling for tolerance and racial justice—Harper Lee’s To Kill a Mockingbird—is now dismissed as “problematical” for featuring a white protagonist and thus espousing a “white saviour motif.”  Contemporary writers, including the fabulously successful JK Rowling, must be censored for supporting women-only athletics and spaces.   Wokists have effectively reshaped the educational systems in the English-speaking world, establishing inflexible speech codes, as illustrated by a law student at a Scottish university who faced expulsion for “stating, in the context of a seminar on gender, feminism and the law, that ‘women have vaginas’” (p. 119).  Without doubt “‘correct’ speech—be it declaring pronouns or pledging allegiance to Black Lives Matter—is compelled” (p. 120).   

      Summing it all up, Williams says:  “Woke has won.  It has won because its fundamental assumptions have become so widely accepted among the cultural elite that they are considered not just uncontroversial but common sense.  Woke has won because its values have been adopted by members of the professional-managerial class, who have allowed woke thinking to take root within public institutions and to shape policies, practices and laws.  Woke has won because it has become embedded in schools and universities.  Teachers and lecturers cultivate woke attitudes in children and young adults who take for granted that what they are taught is factually accurate and morally correct. After graduation, they carry the lessons imbibed back out into the world.  Woke has won because its leading advocates appropriated the rhetoric of the civil-rights-era struggles for equality” (p. 267).  

       What we who oppose it can do remains to be seen.  Perhaps seeing it as it is is all we can do!

# # #

369 Lee, Grant & Twain

        At a time when obtuse mobs pull down or vandalize historic statues and politicians placate the vandals by removing public monuments, serious scholars continue studying great men, illustrating their value in understanding ourselves and our nation.  They realize, as William Faulkner said:  “The past is never dead.  It’s not even past.”  In Clouds of Glory:  The Life and Legend of Robert E. Lee (New York:  Harper, c. 2012; Kindle), Michael Korda provides a well-written, admiring account of the general.  As a young man Korda took part in the Hungarian Revolution of 1956, then graduated from Oxford University and now writes histories.  He begins his book not with details of Lee’s early life but with his role in suppressing John Brown’s attack on Harper’s Ferry in 1859—an incendiary incident helping provoke the Civil War.  Brown was revered by Northern abolitionists because of his guerrilla activities in “bleeding” Kansas and helping slaves escape to Canada.  Dispatched to quash the insurrection, Lee and a small army detachment did so, treating the captured survivors “with kindliness and consideration,” but overseeing Brown’s hanging.  Present with him were a number of soldiers who would serve with him during the Civil War—most notably J.E.B. Stuart and Thomas J. “Stonewall” Jackson.  Considering the event, Herman Melville “described Brown prophetically as the ‘meteor of the war’” and his phrase rang true, for it would “be only seventeen months between John Brown’s execution and the firing on Fort Sumter that brought about the war.”  

       As a veteran U.S. Army officer, Robert E. Lee had served in various parts of the nation.  “He was a cosmopolitan, who felt as much at home in New York as he did anywhere in the South; he was opposed to secession; he did not think that preserving slavery was a goal worth fighting for; and his loyalty to his country was intense, sincere, and deeply felt.  He was careful, amid the vociferous enthusiasm for secession in Texas once Lincoln was elected, to keep his opinions to himself, but in one instance, when asked ‘whether a man’s first allegiance was due his state or the nation,’ he ‘spoke out, and unequivocally.  He had been taught to believe, and he did believe, he said, that his first obligations were due Virginia’” (#520).  A singular commitment to one’s state was not at all unusual in those days.  Lee’s first ancestor had settled in Virginia in 1639; two of his descendants signed the Declaration of Independence, two others would become generals, and one, Zachary Taylor, would become a president.  To John Adams, when the American Revolution began, the Lees had “more men of merit . . . than any other family.”  They were all loyal Americans, but above all they were Virginians!  

       During the War for Independence, Robert E. Lee’s father, “Light Horse” Harry, became a celebrated military officer.  Like his father, “Robert was tall, physically strong, a born horseman and soldier, and so courageous that even his own soldiers often begged him to get back out of range, in vain of course.  He had his father’s gift for the sudden and unexpected flank attack that would throw the enemy off balance, and also his father’s ability to inspire loyalty—and in Robert’s case, virtual worship—in his men.”  But neither man worked well with politicians.  The father was “voluble, imprudent, fond of gossip, hot-tempered, and quick to attack anybody who offended or disagreed with him.”   But the son “kept the firmest possible rein on his temper,” and disliked confronting or arguing with others.  “These characteristics, normally thought of as virtues, ultimately became Robert E. Lee’s Achilles’ heel, the one weak point in his otherwise admirable personality, and a dangerous flaw for a commander, perhaps even a flaw that would, in the end, prove fatal for the Confederacy.  Some of the most mistaken military decisions in the short history of the Confederacy can be attributed to Lee’s reluctance to confront a subordinate and have it out with him on the spot, face-to-face” (pp. 30-31).

       Lee’s mother, wanting her son to eschew her husband’s example, sought to instill in Robert a strong Christian faith.  “For this task she was extraordinarily well suited; her few surviving letters reveal formidable theological knowledge, as well as a precise sense of right and wrong and a deep spiritual belief.  ‘Self-denial, self-control, and the strictest economy in all financial matters were part of the code of honor she taught [him] from infancy,’ and in his later years Robert E. Lee frequently said that he ‘owed everything’ to his mother.”  Though an Anglican, “Ann Carter Lee was in many ways a child of the Second Great Awakening that swept through America in the early nineteenth century, creating sometimes startling new religious denominations and laying greater emphasis on the need to be saved and on personal piety rather than simply attending traditional religious services.  Her beliefs were what we would now call evangelical, and she had the strength of mind and purpose to impress them on her son Robert for life—indeed the most striking thing about his letters is his lifelong, simple, unshakable belief in the need to accept God’s will uncomplainingly, and his deep faith.  ‘It is all in God’s hands’ is a phrase he used often, not in a spirit of fatalism, but in one of confidence.  The intensity of Lee’s religious convictions was one of the elements that would make him a formidable warrior, and also one of the reasons why he remains so widely respected not just in the South, but in the North as well—not only as a hero, but as a kind of secular saint and martyr” (pp. 35-36).

       Korda takes the reader through Lee’s education, military service in Mexico, and work in various army posts (usually devoted to supervising engineering projects).  In most ways it was a rather prosaic career, with little possibility of attaining distinction until he captured the attention of Winfield Scott, who found him a fine field officer during the Mexican War.  As “Scott’s protégé, prized particularly for his uncanny eye for terrain,” Lee helped win the war and was made a “brevet lieutenant colonel.  No other officer in the Mexican War received such universal praise, or won such widespread admiration” (p. 255).  Indeed, General Scott declared Lee “’the very best soldier that I ever saw in the field’” (p. 266).  Following the war Lee returned to working in army posts (including an assignment dealing with Comanche raiders on the Texas frontier), serving a stint as superintendent of West Point, and caring for his family (seven children), struggling with finances, serving as executor for his father-in-law’s estate, and wondering if he’d made the right vocational choice.  

       But everything changed when Abraham Lincoln was elected President and southern states began seceding.  Though Lee personally opposed slavery, he also opposed abolitionism.  Generally abstaining from politics, he was something of a Whig.  As the war began, President Lincoln made Lee a colonel and ultimately offered him the rank of a “major general in command of the largest army in American history” (p. 391).   But when Virginia seceded Lee felt obliged to serve his beloved state and soon headed the Confederate military forces therein.  “Lee amazed everyone by his energy and professional skill, putting together in a matter of weeks an army of 40,000 troops” (p. 410).  He led them in various battles in Virginia, Maryland and Pennsylvania, ultimately surrendering to Federal forces led by Ulysses S. Grant in 1865.  Korda describes and analyzes the various battles, though he is less concerned with military details than with the person, General Lee, who always “set his men an example of resilience, confidence, and devotion to duty” (p. 1054).  

       Following the war, Lee enjoyed what Korda calls an “apotheosis.”  Rarely has a man come “not only to embody but to glorify a defeated cause.”  Amazingly, Lee became “a national, not just a southern hero,” with a U.S. Navy submarine named for him, a postage stamp carrying his picture, a U.S. Army tank named after him, and his citizenship posthumously restored by President Gerald Ford in 1975.  “It is hard to think of any other general who had fought against his own country being so completely reintegrated into national life, or becoming so universally admired even by those who have little or no sympathy toward the cause for which he fought” (p. 1141).  Rather than accept more prestigious and remunerative positions, Lee became the president of Washington College, a tiny school with almost no students in Lexington, Virginia.  Under his guidance the college flourished, and its new president sought to provide Southerners an example for adjusting to post-war realities.  Henry Ward Beecher, a prominent New York minister, declared that Lee “‘was entitled to all honor,’” and praised him for devoting himself “‘to the sacred cause of education’” (p. 1170).  

       Korda’s portrait of Lee is consistently positive, if not quite as admiring as Douglas Southall Freeman’s famed four-volume biography.  He does show that a Hungarian emigrant, rather free from the many biases of native-born Americans, can carefully study the life of Robert E. Lee and find much in it worth celebrating.

                                     * * * * * * * * * * * * * * * * * * * * 

       Notably more critical of the general, Allen C. Guelzo’s Robert E. Lee:  A Life (New York:  Vintage Books, c. 2021; Kindle Edition) seeks to appreciate Lee’s strengths without glossing over his faults.  To the author Lee remains very much a “mystery” inasmuch as he was both upright and errant.  No one who met Lee, “no matter what the circumstances of the meeting,” ever seemed to fail to be impressed by the man:  “His dignity, his manners, his composure, all seemed to create a peculiar sense of awe in the minds of observers” (p. 18).  And yet he fought for a rebellious confederacy committed to preserving slavery.  A Princeton professor who has published a number of historical works, Guelzo sums up his stance with words Lee himself wrote in a letter after the Civil War:  “My experience of men has neither disposed me to think worse of them or indisposed me to serve them; nor in spite of failures, which I lament, of errors which I now see and acknowledge; or of the present aspect of affairs; do I despair of the future.  The truth is this:  The march of Providence is so slow, and our desires so impatient; the work of progress is so immense and our means of aiding it so feeble; the life of humanity is so long, that of the individual so brief, that we often see only the ebb of the advancing wave and are thus discouraged.  It is history that teaches us to hope” (p. 13).

      As is expected of a scholarly biographer, Guelzo digs into Lee’s ancestry, family, education, and career.  Though he doesn’t consider Lee a first-rate intellectual, he was certainly a well-tutored youngster, reading Caesar, Sallust, Virgil, Cicero, Horace, and Tacitus in Latin, plus Xenophon and Homer in Greek.  Most importantly, since he sought admission to West Point, he mastered arithmetic, algebra, and geometry.  After doing well at the academy he joined the Corps of Engineers—“a small cadre of brainy technicians who prided themselves on their superiority to lesser graduates who ended up in” other branches of the army (p. 69).  His academic work was exemplary, but it was “his almost unbearable gentility” that most impressed his classmates.  Lee was, Joseph E. Johnston remembered, “full of sympathy and kindness, genial and fond of gay conversation, and even of fun, that made him the most agreeable of companions” (p. 73).  When he returned to West Point to serve as superintendent in the 1850s, he similarly impressed cadets as “‘the personification of dignity, justice, and kindness . . . the ideal of a commanding officer’” (p. 201).  

       Following the Compromise of 1850, slavery became a smoldering issue.  Lee favored neither slavery nor its abolition, saying:  “‘In this enlightened age, there are few, I believe, but what will acknowledge, that slavery as an institution, is a moral & political evil in any country’” (p. 227).  Yet he apparently saw no way to actually end it.  As southern states began severing their ties with the Union, many observers wondered if he would retain his position as an officer in the federal army.  He had served with distinction in the Mexican War and enjoyed the favor of General Winfield Scott, who  “did not hesitate to endorse him in the most dramatic terms:  ‘If I were on my death-bed tomorrow, and the President of the United States should tell me that a great battle was to be fought for the liberty or slavery of the country, and asked my judgment as to the ability of a commander, I would say with my dying breath, Let it be Robert E. Lee’” (p. 211).  

       President Lincoln apparently considered giving Lee command of the Union army and might have done so if Virginia had not joined the Confederacy.  Lee could not “draw his sword” against his native State and devoted himself to serving her.  Thus he made “a decision in which he irrevocably, finally, publicly turned his back on his service, his flag, and, ultimately, his country.  All of this was done for the sake of a political regime whose acknowledged purpose was the preservation of a system of chattel slavery that he knew to be an evil and for which he felt little affection and whose constitutional basis he dismissed as a fiction” (p. 306).  In time he became the commanding general of the Army of Northern Virginia and for four years fought to win the war.  Guelzo carefully describes the various battles and evaluates Lee’s effectiveness as a strategist, noting that Lee’s triumphs were often due to his opponents’ failures and that he relied too much on his subordinate generals to implement his general orders.  He did, however, inspire his men to fight courageously and merits commendation for his leadership during the war.  “Only Grant emerged in the war with military gifts on a par with Lee,” and there is a rightful “glory for Lee in that achievement” (p. 655).

      Lincoln, Lee and Grant all deeply desired peace and reconciliation.   (An interesting illustration of this is the fact that Ulysses Grant’s widow ultimately became good friends with Varina Davis, the widow of former Confederate president Jefferson Davis!)  Ulysses S. Grant believed the officers and men who had received parole at Appomattox should not be prosecuted, asserting:  “‘I should have resigned the command of the army rather than have carried out any order directing me to arrest Lee or any of his commanders who obeyed the laws’” (p. 567).  When Lincoln was slain Lee considered it “‘not only a crime against our Christian civilization’ but ‘a terrible blow to the vanquished.’”  And he praised Grant, whose treatment of Southern soldiers was “‘without parallel in the history of the civilized world’” (p. 574).  Though granted a parole when he surrendered at Appomattox—and though President Lincoln, in his last cabinet meeting, had spoken “very kindly” of him—some Northerners wanted Lee to be indicted and imprisoned.  Though he was, in fact, indicted by a prosecutor, he was never brought to trial or imprisoned.  

     When he died, Lee was mourned throughout the South and rather admired in many sections of the North.  Thus Philadelphia’s Evening Telegraph declared that “‘the passionate feelings engendered by the conflict have so far died away that there is a general disposition to dwell upon his personal virtues rather than to follow him to the grave with denunciations’” (p. 630).  At the end of his presentation Guelzo—admiring the man but perplexed by his service for the Confederacy—concludes:  “Mercy—or at least a nolle prosequi—may, perhaps, be the most appropriate conclusion to the crime—and the glory—of Robert E. Lee after all” (p. 662).  His footnotes and bibliography show Guelzo’s  diligence in thoroughly researching his subject, and his portrait of Lee merits serious consideration.  

                                                   * * * * * * * * * * * * * * * * * * * * * *

       Mark Perry wrote a fascinating account in Grant and Twain:  The Story of a Friendship that Changed America (New York:  Random House Publishing Group, c. 2004; Kindle Edition).  Following his presidency Ulysses S. Grant settled in New York and engaged in various business ventures.  Taking the advice of a man he trusted, he invested in (and lent his name to) an endeavor that failed in 1884.  Rather than file for bankruptcy, “Grant vowed that he would repay every penny of the debt he owed and pledged that before his death, he would find a way to provide for his wife and children” (p. 20).  Then Mark Twain—a “Grant intoxicated man”—determined to help out by encouraging him to write his life story.  The two men met, and in 15 months “Ulysses S. ‘Sam’ Grant and Mark Twain—Samuel Clemens—became the best of friends.  Seemingly so different and yet with so much in common, Grant and Twain would, in that short time, transform the world of American writing.  For as Grant was struggling to write the story of his life, he was helped in his final battle by a man who had just completed the story of his.  Within that single fifteen-month period—perhaps the most creative in American literary history—Grant would not only write his Personal Memoirs, Twain would reach the peak of his career with the publication of Adventures of Huckleberry Finn.  Those two books, perhaps the finest work of American nonfiction ever written and the greatest of all American novels, defined their legacy.  In the end, the struggle of both men—Grant’s struggle to retrieve his fortune and Twain’s to make his—was not about wars or books or even money.  Over a period of fifteen months, Grant and Twain wrote the story of their country and ours” (p. 23).  

       After sketching biographies of the two men Perry describes their interactions.   Robert Underwood Johnson, hoping to make the Century Publishing Company successful, had earlier talked with Grant about writing an article covering some aspect of the Civil War.  Grant had been uninterested, but now he told Johnson he needed money and might do some writing.  Johnson said his magazine would publish whatever he wrote and “suggested that Grant write four articles, one each on the Battle of Shiloh, the Vicksburg Campaign, the Battle of the Wilderness, and the surrender of Lee.  Johnson said the magazine would pay him $500 for each article.  It was an extraordinary sum for the time.  Grant agreed to this arrangement” (p. 82).  He would be the first Civil War commander to write a memoir, and when he submitted his first article it was obvious he had a gift for writing, for he could recall “small incidents that gave color to the larger theme—and he had a prodigious memory.  At times his prose was almost electrifying” (p. 84).  A century later the literary critic Edmund Wilson said:  “‘The thick pair of volumes of the Personal Memoirs used to stand, like a solid attestation of the victory of the Union forces, on the shelves of every pro-Union home.’”  Indeed:  “It may well be the most powerful military memoir in print, vying with Julius Caesar’s commentaries as (in Wilson’s words) ‘the most remarkable work of its kind’” (p. 278).  

      In 1884 Grant discovered he had throat cancer with little hope of recovery.  His physician prescribed pain killers but sometimes refused “to treat his patient, hoping that it would more quickly bring about his death, thereby putting an end to his suffering” (p. 95).  Facing his demise, financially broke “and now mortally ill, he viewed the publication of his memoirs not only as a fitting coda for his life, but as the sole means at his disposal to retrieve his reputation and leave his family financially secure” (p. 97).   At the same time Twain was a celebrated writer and humorist but had yet to write truly fine fiction.  Off and on, over the years, he had worked on a manuscript that would become Huckleberry Finn, but it was not yet finished.  In it he explored the nation’s “original sin” and its devastating impact on the South.  He sensed, deep within, “that the central and singular fact that had shaped his time and shaped him was the question of slavery—that ‘bald, grotesque and unwarrantable usurpation’ of human freedom that ‘stupefied humanity.’  And at the heart of slavery was the question of race, of racism—which is what made slavery possible” (p. 261).  And as Twain devoted himself to the story he “realized that Huck Finn might be the one book for which he would always be remembered” (p. 147).  

       Encouraged by Twain’s promise to help him publish the manuscript, Grant worked hard.  Some weeks he was “particularly prolific, writing upward of ten thousand words on some days, while spending others editing and correcting what had already been written.”  Twain, who saw Grant nearly every day during this period, was stunned by Grant’s abilities.  “‘It kills me these days to write half of that,’” he commented (p. 197).  He was also struck by the general’s “‘gentleness, goodness, sweetness.’”  Volume one of his Personal Memoirs of U. S. Grant was published on December 10, 1885, and within two months “Twain presented Julia Grant with a check for $200,000.  To that time it was the largest royalty payment ever made in U.S. publishing history” (p. 277).  Ultimately she received nearly $450,000 and Twain’s publishing firm turned a nice profit in the process.  Millions of people rejoiced when reading Grant’s autobiography, and, as Perry writes, “the final words of Grant’s Memoirs came to symbolize the lesson of a war that divided a nation and cost six hundred thousand lives.  ‘Let us have peace,’ Grant wrote.  They were the last words of his book” (p. 277).  Amen!

368 Today’s Totalitarians

         During the past century dystopian novels have highlighted fears regarding the personal and societal impact of what Jacques Ellul termed “the technological society.”  Running from Aldous Huxley’s Brave New World through C.S. Lewis’s That Hideous Strength to George Orwell’s Animal Farm and 1984, these fictions force us to face the unintended consequences of mechanistic thinking that makes machines which may ultimately destroy us.  What would be most harmful, these writers suggest, is the development of a technological totalitarianism that reduces human beings to cogs in an impersonal social machine, making a man little more than what Walker Percy called a “poor lonesome ghost locked in its own machinery.”  These thinkers discerned “the hard pull of the technological revolution moving us along at lightning speed toward a digital slavery,” echoing the thought of Émile Durkheim, who thought the industrial revolution was upending society and burdening it with anomie—the debilitating feeling of aloneness even while surrounded by masses of people.  Aware of their power over lonely people, totalitarians forever try to abolish those “little platoons” Edmund Burke celebrated—the families and churches and local organizations that most easily resist tyranny.  

     No scholar devoted more attention to this threat than Hannah Arendt, who wrote a two-volume treatise—The Origins of Totalitarianism and Imperialism: Part Two of The Origins of Totalitarianism.  Soon after WWII ended she declared:  “We no longer hope for an eventual restoration of the old world order with all its traditions, or for the reintegration of the masses of five continents who have been thrown into a chaos produced by the violence of wars and revolutions and the growing decay of all that has still been spared.  Under the most diverse conditions and disparate circumstances, we watch the development of the same phenomena—homelessness on an unprecedented scale, rootlessness to an unprecedented depth.”  When men are “isolated against each other” they easily fall prey to tyrannical governments.  Still more:  “It is as though mankind had divided itself between those who believe in human omnipotence (who think that everything is possible if one knows how to organize the masses for it) and those for whom powerlessness has become the major experience of their lives.”  Thus the subjects for totalitarianism are homeless, rootless people, split between those who aspire to omnipotence and those who crave subservience, with the former manipulating the latter. 

        Some 70 years later, younger writers have sought to update Arendt’s analysis.  Probing the implications of today’s homeless, rootless masses, Stella Morabito (a former CIA analyst and current Federalist contributor) has written The Weaponization of Loneliness:  How Tyrants Stoke Our Fear of Isolation to Silence, Divide, and Conquer (New York:  Bombardier Books, c. 2022; Kindle Edition).   As a young scholar she studied the history of the Soviet Union and detected “patterns of weaponized isolation” that enabled “totalitarian forces” to destroy private life and set up a surveillance state that makes everyone dependent upon the state (p. 10).  Leaving the academic world to rear her children, she sensed similar developments in America, evident in the wide-spread support for abortion, euthanasia, “political correctness, identity politics, family breakdown, K–12 reforms, radical environmentalism, campus speech codes, and woke-creep in religious institutions, in the military, and in corporate America.  I watched the growth of gender ideology, including nascent propaganda on ‘transgender kids,’ which I detected in my local public library in the mid-1990s.”  Orchestrating these developments, the media began censoring dissidents, reviling anyone disagreeing with the dominant message and labeling dissenters “bigots.”  As she read and thought about such things, she “finally concluded that there is a machinery at work—a machinery of loneliness.  Tyrants operate that machinery—wittingly or not—in order to disarm those they wish to control” (p. 11).  In response, she urges us to “aggressively defend the private sphere of life because that is the only safe haven for developing the power of human connection.  Only then can we start defending ourselves against attempts to isolate us, especially from those we love and those who love us” (p. 12).

       Symptoms of a creeping authoritarianism have been evident worldwide since the 1968 upheavals.  Had we rightly understood the outcries over racism and sexism, had we subsequently grasped the implications of multiculturalism and identity politics, had we detected the growing hostility to free speech and religion, we’d not be surprised by the current situation.  We should have seen how government-decreed lockdowns due to the COVID-19 virus fully revealed totalitarian impulses gaining traction around the globe.  Incessant propaganda, mainly promulgated through TV and social media, “stoked fear of random death from the virus,” and polarizations followed.  “After elites in government, media, and Big Tech demonized anyone not in line with the mandates, many people responded by disowning friends and family members who weren’t with the program.  In fact, the COVID-19 mandates blatantly enforced our isolation from one another, often in the most intimate and brutal ways.”  Social distancing, masks, lockdowns, vaccine mandates, “censorship of several reputable medical experts who offered different opinions on treatment”—a cascade of decrees thoroughly altered our customary behaviors.  These social controls not only meant that people could no longer attend church but also that “patients in hospitals were not allowed any visitors at all.  Brutally separated from loved ones, many were left to die alone” (p. 18).  

       These dicta (always veiled in mantras of “public health”) remind us of C.S. Lewis’s judgment:  “Of all tyrannies, a tyranny sincerely exercised for the good of its victims may be the most oppressive.  It would be better to live under robber barons than under omnipotent moral busybodies.”  Morabito devotes considerable attention to failed utopian endeavors, including Oliver Cromwell’s Puritan Commonwealth with its “godly citizens,” Robespierre’s “republic of virtue,” Stalin’s “dictatorship of the proletariat,” Hitler’s “Third Reich,” and Mao’s “cultural revolution.”  All were celebrated as wonderful ideals!  But they quickly ushered in reigns of terror.  So too legions of busybodies recently had a field day issuing warnings and orders designed to corral COVID-19.  Following the well-weathered authoritarian script, they aimed “to remold human beings into purely collectivist creatures that serve the utopian model” (p. 36).   Atomized folks easily fall prey to utopian fantasies when they “feel alienated and yearn for a perfect society of peace and justice, particularly during times of economic and social upheaval” (p. 27).   Utopians envision a perfect community wherein our fondest desires are satisfied and our loneliness dissipates as we’re absorbed into a community.  Rejecting reality, utopians choose to live in an imaginary world (free from poverty, disease, sin, etc.) that simply cannot be.  To impose that world, a small corps of revolutionary elites (be they fascists or communists) mobilizes mobs and violently imposes its agenda. 

       All these utopian movements cultivated a religious fervor shrouded with appeals to “sciences” such as eugenics or historical materialism.  “Fake science,” says Morabito, “is always the result of nonstop propaganda coupled with censorship of alternative views” and was fully on “constant display” during the COVID-19 panic.  A powerful coalition of Big Media and Big Tech “easily de-platformed any physician who had a different opinion about COVID-19 treatments or the origins of the virus.”  Under the banner of “science says” we were told to repudiate any “views that didn’t align with narratives on other topics such as global warming, abortion, or gender dysphoria in children.  The push to enforce critical race theory got the same treatment of heavy propaganda and censorship” (p. 63).  As always the ruling elites “seek to invade and destroy the private sphere of life.  All weaponize the human fear of ostracism—and our hardwired need for connection with others—to coerce conformity and compliance” (p. 72).

      One effective way to coerce conformity is to mobilize mobs.  Rudderless civilizations lend themselves to takeovers by the masses, and manipulating them “in order to seize power is integral to all totalitarian schemes” (p. 98).  Morabito lists six factors fueling mobs:  a “malady,” e.g. systemic racism; a “cure,” e.g. joining a group championing equity; an “enemy,” e.g. white racists; an “ideology,” e.g. social justice; a “sense of urgency,” e.g. the catastrophic culmination of global warming; and a “monopoly on narrative,” e.g. silencing dissident speakers on university campuses (p. 112).  A decade ago few of us imagined one’s sex could be anything other than what was “assigned at birth.”  Amazingly, “practically overnight,” we were told to accept the notion that a man could simply “identify” as a woman and be treated accordingly—an idea “institutionalized into a mob mindset via media control by those pushing the agenda” (p. 121).

       In a series of enlightening chapters Morabito examines the re-segregation of blacks under the guise of “identity politics,” the “estrangement of women” under the banner of “political correctness,” the radicalization of youngsters leading to mob behavior, and the resulting dehumanization of whites in America.  Breaking people into ever-smaller groups—highly evident in “intersectionality” rhetoric—illustrates the truth of Carl Jung’s comment:  “The mass state has no intention of promoting mutual understanding and the relationship of man to man; it strives, rather, for atomization, for the psychic isolation of the individual.”  In a chapter titled “Cloning Lonely Puppets: The Subversion of Education” (a piece every parent should read) the author shows how the schools have contributed to our current malaise.  In 1901 an influential progressive sociologist, Edward A. Ross, wrote Social Control, urging Americans “‘to replace community, family and church with propaganda, education, and mass media . . . the State shakes loose from the Church, reaches out to School. . . .  People are only little plastic lumps of human dough’” (p. 205).  Such views were soon embraced by John Dewey, who believed schools should train citizens for a socialistic society.  Subsequent educators would promote social engineering through government schools, which were, in the wake of the ’60s revolutions, infiltrated by the likes of Bill Ayers—the former Weatherman, friend of Barack Obama, and current professor of education—seeking to substitute identity politics and political correctness for classical, content-based subjects. 

       What’s true for the schools is equally true for government, the military, medical and legal organizations, the judiciary, the media, et al.  Utopian revolutionaries forever try to isolate the individual and break up subsidiary institutions such as the family and church.  Marx and Engels, in The Communist Manifesto, urged the abolition of the family and burial of religion.  “As Robert Nisbet noted, ‘The State becomes powerful not by virtue of what it takes from the individual but by virtue of what it takes from the spiritual and social associations which compete with it for men’s devotions’” (p. 234). 

       So what’s to be done?  Morabito says we must begin in a very small but utterly momentous manner:  daring to speak freely.  Decades ago Jacques Ellul, in his masterful work Propaganda, said:  “Propaganda ends where simple dialogue begins.”  People of faith must “live out” their faith and endure possible ridicule.  “Strong communities of faith have a bigger impact than most realize.  If you’re part of one, invest in it and guard it vigilantly” (p. 261).  Forge strong families and make good friends.  “It’s up to us to shed as much light as possible on the methods of the madness in the machinery of loneliness.  We can save ourselves in the process, making the world a more civil and less lonely place” (p. 267). 

                                                  * * * * * * * * * * * * * * * * * * * * * * * * * *

       In The Psychology of Totalitarianism (White River Junction, VT:  Chelsea Green Publishing, c. 2022; Kindle Edition), Mattias Desmet, a young Belgian psychologist, draws on insights from Hannah Arendt to analyze “the emergence of a new totalitarianism, no longer led by flamboyant ‘mob leaders’ such as Joseph Stalin or Adolf Hitler but by dull bureaucrats and technocrats” (p. 9).  Under certain conditions, masses of people behave as if hypnotized, losing both rationality and ethical surety.   We’re immersed in the current “Grand Narrative of Western Civilization” that reduces reality to material entities and processes.  In his doctoral dissertation Desmet had examined flaws in scientific publications.  “Sloppiness, errors, biased conclusions, and even outright fraud had become so prevalent in scientific research that a staggeringly high percentage of research papers [including medical research]—up to 85 percent in some fields—reached radically wrong conclusions.”  Scientific papers were written by scholars who actually thought they were doing first-rate research without understanding how their work “was not bringing them closer to the facts but instead was creating a fictitious new reality.”  Much the same may be said about legions of scientists’ response to the coronavirus crisis:  “a maze of errors, sloppiness, and forced conclusions, in which researchers unconsciously confirm their ideological principles” (p. 163).   It all fit Arendt’s diagnosis:  “The undercurrent of totalitarianism consists of blind belief in a kind of statistical-numerical ‘scientific fiction’ that shows ‘radical contempt for facts’” (pp. 10-11).  Folks no longer able to recognize the difference between true and false, right and wrong, easily succumb to totalitarianism. 

      This became all too real amidst the COVID-19 panic.  Almost overnight nearly every country followed China’s response and placed “huge populations of people under de facto house arrest, a situation for which the term ‘lockdown’ was devised.”  Elected leaders stepped aside and granted bureaucratic “experts” the power to dictate what we could or could not do.  “Expert virologists were called upon as George Orwell’s pigs—the smartest animals on the farm—to replace the unreliable politicians.”  Such experts, however, soon proved anything but infallible.  “In their statistics and graphs, they made mistakes that even ‘ordinary’ people would not easily make.”  They arrogantly promised they could control the virus but failed.  Masks, social distancing, hand washing, shutting down churches while opening up marijuana dispensaries—all irrational, failing endeavors!   “And just like Orwell’s pigs, they sometimes changed the rules overnight, inconspicuously.”  They were going to “flatten the curve,” then “crush” it (pp. 12-13), yet never admitted their many abject failures.  

        The experts’ failures should not have surprised us, however, says Desmet.  A bit of historical study reveals:  “The coronavirus crisis did not come out of the blue.  It fits into a series of increasingly desperate and self-destructive societal responses to objects of fear:  terrorists, global warming, coronavirus.  Whenever a new object of fear arises in society, there is only one response and one defense in our current way of thinking:  increased control” (p. 15).  With increased control comes a creeping totalitarianism, which “is the logical consequence of mechanistic thinking and the delusional belief in the omnipotence of human rationality.  As such, totalitarianism is the defining feature of the Enlightenment tradition” (p. 15).  And this is the central thesis of the book.  We cannot escape totalitarianism without discarding the hyper-rationalism of the Enlightenment that reigns in virtually every aspect of the modern world.  We don’t need better technologies.  We need a better philosophy, “a new view of man and the world, to find a new foundation for our identity, to formulate new principles for living together with others, and to reappraise a timely human capacity—speaking the truth” (p. 17).

       In the wake of great scientific work in the 17th century, a growing corps of true believers reduced “science” to a mechanistic ideology rather than a humble search for truth.  In this view the world has neither meaning nor purpose, religious perspectives are to be disdained, and all our hopes reside in a humanistic paradise.  As Arendt said:  “Science [has become] an idol that will magically cure the evils of existence and transform the nature of man.”  To Desmet:  “With the coronavirus crisis, this utopian goal seemed very close at hand.  For this reason, the coronavirus crisis is a case study par excellence in subjecting the trust in measurements and numbers to critical analysis” (p. 63).  This mechanistic ideology has been tried and manifestly failed.  Relying solely on experts and their numbers proved deadly.  During the coronavirus panic we were inundated by graphs and tables, numbers of cases and deaths.  The endless repetition of these data prompted us to accept extraordinary restrictions.  Dissenters were “stigmatized by a veritable Ministry of Truth, crowded with ‘fact-checkers’; freedom of speech is curtailed by censorship and self-censorship; people’s right to self-determination is infringed upon by imposed vaccination, which imposes almost unthinkable social exclusion and segregation upon society” (p. 82).  But these allegedly objective data varied significantly in different hands!  Yet the “dominant ideology,” working through a compliant media, crafted a “fictitious reality” fully accepted by the masses.  “Whether it concerns the origin of the virus (bat or laboratory), the efficacy of hydroxychloroquine, the (side) effects of vaccines, the usefulness of face masks, the validity of the PCR test, transmissibility among schoolchildren, or the effectiveness of the Swedish approach, scientific studies lead to the most conflicting conclusions” (p. 78).  

       Moving toward his conclusion, Desmet says we now see that:  “The mechanistic ideology has put more and more individuals into a state of social isolation, unsettled by a lack of meaning, free-floating anxiety and uneasiness, as well as latent frustration and aggression. These conditions led to large-scale and long-lasting mass formation, and this mass formation in turn led to the emergence of totalitarian state systems. Therefore, mass formation and totalitarianism are in fact symptoms of the mechanistic ideology.” (p. 179).    To escape its tentacles we must recover a more ancient understanding of Reality, acknowledging there is much in the universe that we can never understand scientifically.  There is, of course, the material world empirical scientists endlessly examine.  But as Heisenberg and other 20th century physicists discovered, “matter” cannot be understood as hard little bits of stuff randomly streaming about.  While we study and manipulate it we cannot fully understand what it actually is! 

       Yet there is also an immaterial reality, deeper than matter, what Desmet routinely calls the “Other.”  He twice cites the great physicist Max Planck’s statement:  “‘As a man who has devoted his whole life to the most clearheaded science, to the study of matter, I can tell you as a result of my research about the atoms this much:  There is no matter as such!  All matter originates and exists only by virtue of a force which brings the particles of an atom to vibration and holds this most minute solar system of the atom together.…  We must assume behind this force the existence of a conscious and intelligent Mind.  This Mind is the matrix of all matter.  Both religion and science require a belief in God.  For believers, God is in the beginning, and for physicists He is at the end of all considerations.  To the former He is the foundation, to the latter, the crown of the edifice of every generalized world view.  That God existed before there were human beings on Earth, that He holds the entire world, believers and non-believers, in His omnipotent hand for eternity, and that He will remain enthroned on a level inaccessible to human comprehension long after the Earth and everything that is on it has gone to ruins; those who profess this faith and who, inspired by it, in veneration and complete confidence, feel secure from the dangers of life under protection of the Almighty, only those may number themselves among the truly religious’” (pp. 217-218).

       Though Desmet, like Planck, is anything but an orthodox Christian, he realizes we need much more than the aging, inadequate Enlightenment-style commitment to rationalism.  The multiple problems we face cannot be solved by better machines, faster computers, or better bureaucrats.  “The real task facing us as individuals and as a society is to construct a new view of man and the world, to find a new foundation for our identity, to formulate new principles for living together with others, and to reappraise a timely human capacity—speaking the truth” (p. 17).  Reiner Fuellmich, a German attorney who helped found Berlin’s Corona Investigative Committee, says:  “Mattias Desmet is the world’s expert on the phenomenon of mass formation . . . . If you want to understand why and how the coronavirus pandemic response unfolded the way it did at a societal level and—even more importantly—how to prevent such a travesty from happening again, The Psychology of Totalitarianism is essential reading. Desmet shows us how to reclaim our humanity in an increasingly dehumanized and mechanized world.”  Reclaiming our humanity, however, will require more than the psychological insights of Mattias Desmet!  Only when his “Other” is beheld as the “Holy One” will we truly do so.

367 GOD’S GRANDEUR

           “The world is charged with the grandeur of God,” wrote Gerard Manley Hopkins.  “It will flame out, like shining from shook foil; / It gathers to a greatness, like the ooze of oil / Crushed.  Why do men then now not reck his rod?”  That theme pervades Ann Gauger’s edited collection, God’s Grandeur:  The Catholic Case for Intelligent Design (Manchester, NH:  Sophia Institute Press; Kindle Edition, c. 2023).  Admittedly, declaring the grandeur of God as evident in creation runs counter to the current academy’s climate of opinion, which reduces reality to particles of some sort randomly moving about in space.  “Sweet is sweet,” said Democritus long before Christ:  “bitter is bitter, hot is hot, cold is cold, color is color; but in truth, there is nothing in Nature but atoms and the void.”  So too a contemporary physicist, Brian Greene, declares in Until the End of Time that “you and I are nothing but constellations of particles whose behavior is fully governed by physical laws.”  All that happens anywhere at any time is merely particles moving about.  Following the Big Bang that spewed particles into space, everything in “cosmic history has been dictated by the nonnegotiable and insensate laws of physics, which determine the structure and function of everything that exists. . . .  We are no more than playthings knocked to and fro by the dispassionate rules of the cosmos” (p. 147).  Atomists such as Democritus thought “ultimate reality isn’t intelligent.  What fundamentally exists are atoms and empty space in which the atoms collide.”  For them, “highly organized beings like ourselves self-organize by accident” (p. 221).

       Thus Logan Paul Gage notes that two narratives have jousted for thousands of years.  The world and its grandeur result from either “accidental events or intelligent foresight.”  Differing from materialistic monists such as Democritus, Socrates thought Ultimate Reality is more mind than matter and set forth “an explicit design argument” subject to divine providence.  Entering into this ancient debate, today’s exponents of “intelligent design” are embracing Socrates and refuting Democritus.  They do so, Brian Miller says, because 20th century scientists came to believe the universe began in a moment—a Big Bang—which reaffirms the claim of Genesis that “in the beginning God created the heavens and the earth” (Gen. 1:1).  Amazingly, evidence began piling up suggesting the universe seems “fine tuned” for human life on planet earth.  It looked as if only a Divine Mind could have miraculously created our wonderful world.  

       Michael Behe, a biochemist who’s written Darwin’s Black Box and other significant works, notes that:  “For all of recorded history until modern times, practically everyone — educated or not, devout or not — attributed the elegance of the world in general and life in particular to a designing mind, which many identified as God” (p. 63).  Then in 1859 came Darwin’s magnum opus, On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life, “which sought to explain how the elegance and functionality of life might arise from a mindless process” (p. 64).  But we now know far more about cells and bacteria than Darwin did, and modern biology, Behe believes, shows “that at the foundational molecular level of life, Darwin’s mechanism of random mutation and natural selection works chiefly by squandering genetic information for short-term gain.”  What’s needed, he thinks, is a recovery of “the same reasoning as Anaxagoras and Galen did in ancient times, and as the English clergyman William Paley used right before Darwin’s age,” attributing the intricate designs everywhere evident to an omniscient Designer.

       Günter Bechly, a distinguished paleontologist who chairs the Center for Biocomplexity and Natural Teleology in Austria, has written over 160 scientific publications and discovered 180 new species.  His essay, “The Fossil Record,” asserts that:  “In His providential wisdom God allowed for the process of fossilization to give us a window into the past and let us reconstruct all its wonders” (p. 91).  Unexpectedly, within a brief time:  “Complex cellular life popped into existence right when conditions first allowed for life, as if it was planted there by a Creator” (p. 95).  Suddenly “unique life forms appeared out of nowhere, without any intermediate precursors in the preceding geological layers,” and “20 of the 33 known animal phyla appear suddenly, without any precursors in the fossil record” (p. 96).  Equally impressive, there occurred what “mainstream paleoanthropologists” call “the big bang of the genus Homo.”   Abruptly, about 40,000 years ago, there appeared beings with a “globular braincase and a chin.  Might this event correlate with the origin of real humans as the image-bearers of God?  It certainly looks like a possibility” (p. 100).

       In one of the essays written by philosophers, Benjamin Wiker finds “The Intelligibility of Nature as Proof for God’s Existence,” suggesting the simple existence of oxygen points to a Creator.  For millions of years plants and animals survived because of oxygen, but “no one prior to Lavoisier knew that oxygen existed, let alone that its existence could be demonstrated” (p. 185).  It was there but no one knew it!  We had to learn, through exhaustive research and thought, that it existed and what it was.  Most importantly:  “The more we know about oxygen, and everything else, the more intelligible nature becomes to us.  Since the advancement of science exists, then we can rule out both chance as the cause of nature and a God who did not condescend to make nature intelligible to us.  Therefore, there is an intelligent cause of nature’s order, and this cause, for whatever reason, created nature to be known by us.  The intelligibility of nature therefore proves God’s existence, and this is seen, in the very clearest way, in the demonstration that oxygen exists” (p. 191).  

       Another fine philosopher, J. Budziszewski, in “The Natural Moral Law,” says that even as we feel without thinking the power of gravity we also “have a dim awareness of the natural law even if we know nothing about the philosophy of natural law.”  He believes “the reality of the natural law gives good reason to believe in the reality of God — even apart from revelation, which imparts additional data, such as the plan of salvation” (p. 228).  Citing the “law of gravity,” scientists describe how things actually do happen in the world.  Ethicists, citing precepts “such as the Golden Rule,” describe how things “ought to happen in the world, and serve as standards for the conduct of beings capable of grasping them.  But how things ought to happen is just as truly a structure of reality as how they do happen, and just as truly knowable by the use of our natural intellect” (p. 232).  We cannot help knowing it’s right to treat others rightly!  Our conscience demands it.  We have “an interior witness to a standard that we do not make up, which directs us and by which we are judged, and which we cannot change to suit ourselves” (p. 233).  

       A corps of theologians add their insights to God’s Grandeur.  John Bergsma contends that:  “The consistent teaching of Scripture is that God created the world intelligently (in ‘wisdom,’ Heb. ḥokhmāh, Gk. sophia; Ps. 104:24), that the design in nature is obvious to human observation (Rom. 1:18–21), and that said design constitutes evidence for God’s existence, attributes, and activity (Rom. 1:20; Ps. 19:1–4)” (p. 262).  God spoke into being all that is.  Speech “necessarily conveys information, and in presenting the Creation of the cosmos by acts of divine speech, the ancient author communicates that the physical world was created by being ‘in-formed’ by information whose source was God” (p. 263).  He gave form to the cosmos and filled it with wondrous beings.  As is evident in Psalm 104:  “The grass is for the livestock, the plants are for man to cultivate; likewise, the wine, oil, and bread are to ‘gladden the heart,’ ‘make [the] face shine,’ and ‘strengthen [the] heart.’  The Psalmist is approaching a ‘Privileged Planet’ or ‘Rare Earth’ perspective by recognizing that the terrestrial habitat is remarkably suited to supply the needs of a wide diversity of life forms, but especially to nourish and delight man (v. 15)” (p. 272).

       In the book’s conclusion, Anthony Esolen celebrates “A Living and Symphonic Order,” seeing the universe not as “a machine but a symphony; not a formula but an epic poem; not a goose-step of determinism, chaotic in its unmeaning, but the play of a dance, cosmic in its measures of indeterminacy and in the glorious liberty of its sign-bearer and sign-maker, man.”  A machine combines lifeless things but:  “A living thing is a whole in which the whole is present in every part, as every part makes sense as a part only in intimate relation to the whole” (p. 387).  Reductionistic materialists see the cosmos as “a bundle of equations and some primal particles,” but in so doing they “murder to dissect” and fail to behold the grandeur of it all.  Reading these essays helps us rejoice in its reality.  

                                                  * * * * * * * * * * * * * * * * * * * * * * 

       The great Latin poet Vergil once declared:  “The moon’s bright globe, the sun and stars are nurtured / By a spirit in them.  Mind infuses each part / And animates the universe’s whole mass.”  Thinkers ancient and modern have rejoiced at the grandeur of God, as Melissa Cain Travis shows in Science and the Mind of the Maker: What the Conversation Between Faith and Science Reveals about God (Eugene, OR:  Harvest House Publishers, c. 2018; Kindle Edition).  She endeavors to build a case “for the Maker Thesis in three ways:  (1) by using modern scientific evidence to support philosophical arguments for the existence of a Maker,  (2) by explaining some of the many features of our universe and planetary home that had to converge for the investigation of nature to be possible, and (3) by demonstrating the necessity of a rational Mind and ensouled creatures to account for the effective practice of the natural sciences” (#299).  

       Travis begins by noting that Pythagoras, 500 years before Christ, sensed in mathematics overtures of an immaterial, orderly world.  He influenced Plato, who “agreed that number is related to the organization of the visible cosmos” but developed a theory of visible “forms” imperfectly copying eternal, immaterial, transcendent “Forms” (#391).  His views, eloquently set forth in the Timaeus, deeply shaped centuries of subsequent thinkers who believed “that the beauty, regularities, and intelligibility of nature are explained by a benevolent craftsman who brought order out of formlessness and purposively framed the universe according to the eternal, mathematical Forms.”  “In Timaeus, Plato draws a connection between the rationality of nature and the powers of the human mind” (#398), and from Plato and Aristotle, through Athanasius and Augustine and Aquinas, the best ancient and medieval thinkers crafted a natural philosophy celebrating God as the Creator of all that is, visible and invisible.  

       This natural philosophy gained scientific precision and detail in the hands of thinkers such as Nicholas Copernicus and Johannes Kepler, whose laws of planetary motion “transformed the field of astronomy into a sophisticated theoretical science.  He was convinced that the universe operated according to laws put in place by its Maker, much like a clock is subject to a clockmaker” (#1001).  He believed God created in a rational, mathematical way, and that man has been given a mind akin to God’s enabling him to understand it.  He famously said:  “‘To God there are, in the whole material world, material laws, figures and relations of special excellency and of the most appropriate order . . .  Those laws are within the grasp of the human mind; God wanted us to recognize them by creating us after his own image so that we could share in his own thoughts’” (#1024).  Indeed, he “called the universe ‘our bright Temple of God’ and described astronomers as ‘priests of the highest God in regard to the book of nature’” (# 1043).  

      Subsequent centuries featured scientific masters such as Sir Isaac Newton, who “believed that one of the important goals of natural philosophy was to formulate convincing arguments for the existence of God” (#1121), and Robert Boyle, who “was a man of passionate Christian faith, and his desire to further illuminate the mechanical philosophy of nature was partly due to his deep conviction that the regularities and harmony of the material world reflected the omniscience and foresight of the Creator, who had made an orderly world intelligible to mankind.  Like Kepler, Boyle saw his work as a theological vocation and described natural philosophers as priests who deciphered truths about the natural world—the temple of God.”  He wrote that “if the world be a temple, man sure must be the priest, ordained (by being qualified) to celebrate divine service not only in it, but for it” (#1152).  

       In the 19th century this deeply religious perspective was challenged by Charles Darwin and his supporters, who said a mindless cosmos forever evolves through chance and necessity.  But today that purely materialistic view appears less and less persuasive, and eminent physicists and cosmologists are increasingly open to the “God Hypothesis.”  A century ago Max Planck showed that “classical physics” fails to explain how sub-atomic particles behave.  “As a result, the field of quantum mechanics was born.  Planck regarded science and faith as compatible and complementary enterprises.  He was particularly fascinated by the congruence between the mathematical, law-governed structure of the material world and human rationality; he saw this correspondence as indicative of a designing Mind” (#2251).  Convinced there was no necessary conflict between science and religion, he ultimately declared “On to God!”  Yes indeed!  “On to God!”

       Working out the implications of quantum mechanics has occupied some of the finest minds of the past century—Einstein, Eddington, Heisenberg, et al.  They work within a truly strange world, filled with unexpected and highly mathematical realities.  Many of them now espouse varieties of “substance dualism,”  believing that along with the material world there is an equally real mental (or spiritual) world.  There is a non-material mind as well as a biological brain; there is a non-material Mind as well as a physical world.  Some scientists sound much like St Athanasius who, in the fourth century, declared:  “Like a musician who has tuned his lyre, and by the artistic blending of low and high and medium tones produces a single melody, so the Wisdom of God, holding the universe like a lyre, adapting things heavenly to things earthly, and earthly things to heavenly, harmonizes them all, and leading them by His will, makes one world and one world order in beauty and harmony” (#3214). 

       Melissa Travis has written a readable, coherent account supporting her “Maker Thesis,” finding in Kepler the notion that the cosmos is orderly, following laws that are understandable to us inasmuch as we are rational beings capable of actually thinking God’s thoughts.  Kepler’s “unapologetically Christian philosophy of nature—that it is rationally ordered in a manner compatible with the mind of man, a creature made in God’s image—harmonized exceptionally well with both early Christian teaching on natural revelation and Pythagorean-Platonic thought about the intelligible structure of the cosmos” (#1033).  To recover the robust faith of Kepler and do so in the light of contemporary science is Travis’s laudable goal.  

                                                 * * * * * * * * * * * * * * * * * *

       For several decades Hugh Ross has led an apologetics ministry—“Reasons to Believe”—and published a number of fine treatises proclaiming the compatibility of contemporary science and biblical thought.  In Designed to the Core (Covina, CA:  Reasons To Believe Press, c. 2022; Kindle Edition), Ross details evidence from astronomy showing how a “fine-tuned” universe makes life on earth possible.  As Michael Strauss, a professor of physics at the University of Oklahoma, says, Ross takes us “on an unprecedented journey to explore the necessary requirements for a planet to support complex life.  His truly comprehensive approach to the subject examines every aspect of Earth’s life-friendly environment, from the cosmic supercluster that we inhabit to our location in the Milky Way to our unusual solar system and even deep inside Earth’s core. The sheer number and scope of the needed parameters is mind-boggling and unambiguously answers the question of whether Earth is unique in its capability of supporting complex life.  The only question left for the reader to ponder is how such a fortunate planet could have come into existence at all. Many of us who have pondered that question will agree with Dr. Ross, that such exquisite design requires an intelligent and powerful Designer” (pp. 3-4).   

       Successive chapters in the book move from the vastest dimensions of the universe to planet earth—all showing how improbable it all is.  No random concoctions of matter-in-motion could possibly have arranged the cosmos!  “An abundance of evidence now indicates that if the cosmic mass, size, age, inflation, elements, and ratio of elemental abundances weren’t structured exactly as they are, no one would be here to learn of them or to ponder how they came to be” (p. 20).  In fact:  “No planet like Earth and no physical life would be possible if the universe were not precisely as massive as it is” (p. 22).  And if the increasingly evident “fine-tuning is multifaceted and every facet crucial to the outcome, then the fine-tuning source must be more than a mindless, impersonal force or process” (p. 16).  Determined to present the best current conclusions of astronomers, Ross goes into deep detail, presenting data (and mathematical equations) much beyond my pay grade!  All I can say is that if one knows a great deal about physics and astronomy he will be able to truly digest the book’s contents.  

       For me, breezing through many pages filled with complexities I could not fathom, it was rewarding simply to know that a man such as Ross really understands the subject and makes his conclusions clear in summary sections.  “Those who pay attention to the scientific literature,” he says, “can attest to the progress of research.  Daily, new data accumulates, more than any one researcher in the investigative quest can keep track of or digest.  The challenge I faced in writing this book was determining which of the compelling anthropocentric design evidences to include and which to let go, for brevity’s sake.  Design, to use the word so commonly seen in the literature, increasingly appears ubiquitous.  There appears to be no end to the evidence of fine-tuning and design coming from scientific discovery.  Yet, design was evident even a few thousand years ago, as recorded in ancient writings about a man named Job, who commented on the long list of evidences drawn from observation of nature’s realm. . . . Job rightly discerned a Designer behind all the evidence in the natural realm.  Considering all the scientific exploration humans have done over the past four thousand years, we’ve gained deeper glimpses of the Fine Tuner’s works, though only glimpses, with infinitely more to see and understand” (p. 283).  Better still, “the One whose planning, power, and fine-tuned precision made our human existence possible” has also provided a Way for us to know him in Christ.      

                                                * * * * * * * * * * * * * * * * * * * * * * 

       While he was serving as prefect of the Congregation for the Doctrine of the Faith, Joseph Cardinal Ratzinger gave six lectures in a German monastery, translated into English as The Divine Project: Reflections on Creation and the Church (San Francisco:  Ignatius Press, c. 2022; Kindle Edition).  They give us important insight into the future Pope Benedict XVI’s wide-ranging theological interests, especially regarding creation.  He began by noting the importance of rightly interpreting Scripture.  Throughout the ancient and medieval eras it was assumed “that the only way for someone to understand each individual text of Scripture is always to understand it as part of Scripture in its totality; and not like the totality of a textbook, either, but as a dynamic totality” (p. 21).  As the Hebrew Scriptures were written, God came to be understood as a unique, “only one” God “who had all lands and all peoples at his command.  This was because he himself had created the heavens and the earth, because they were his own” (p. 24).  Jews came to believe “that God alone, the eternal Reason who is eternal Love, created the world, and that it rests in his hands” (p. 26).  Ultimately there developed in Israel what’s called “wisdom literature,” which is, Ratzinger says, “the final bridge on a long road, one that leads to the message of Jesus Christ, to the New Covenant.  And only there do we find the ultimate, definitive creation account in Scripture, the one that provides Christians with the standard for interpreting every other creation text.  This ultimate and definitive biblical creation account opens with the key verses: ‘In the beginning was the Word, and the Word was with God, and the Word was God. . . . All things were made through him, and without him was not anything made that was made’ (Jn 1:1, 3).”  The Logos, Reason, the Divine Mind indwelling creation enables us to say what Aristotle said 400 years before Christ, “against those who claimed that everything came into being by chance,” that a Divine Mind designed all that is in wisdom-wrought ways.