378 Endangered Virtues

  Michael Phillips is a Christian novelist who recently published Endangered Virtues and the Coming Ideological War: A Challenge for Americans to Reclaim the Historic Virtues of the Nation’s Christian Roots (Sterling, VA:  Fidelis Publishing, c. 2023; Kindle Edition).  “Rapid changes” around the world deeply concern him.  The culture war is very real and its outcome “will determine our future and that of generations to come” (p. 10).  It looks much too much like the pre-Civil War 1850s, when it was still a war of words rather than shells and bayonets.  Today’s combatants are traditional Christians who are assailed by cultural and political progressives ruthlessly determined to destroy them.  We need “to be wise thinkers to properly absorb and respond intelligently to the important idea-crisis that is overtaking us.  We cannot take ideas haphazardly as they come, catching them, as the late . . . Francis Schaeffer once said, like measles.  We have to be aware of how we think, aware of our presuppositions and worldviews, and the implications of both.  Too much is at stake to be sloppy thinkers” (p. 11).  As Christians we need to be ever perceptive, discerning the spiritual dimensions of the cultural conflicts we face.  Unfortunately Phillips identifies the problem without providing much help resolving it, other than living godly lives.  To better grasp what’s needed requires a more adept, and well-grounded, thinker, Alasdair MacIntyre.  

   When I began teaching Ethics fifty years ago I used textbooks that addressed various issues by contrasting positions and encouraging students to analyze and formulate their own views.  The professor’s task was to clearly explain what a variety of philosophers thought while allowing students to freely pick and choose the views they preferred.  Then came Alasdair MacIntyre’s After Virtue:  A Study in Moral Theory (Notre Dame:  University of Notre Dame Press, c. 1981)!  Reading this book changed my approach to the course and led me to make Aristotle’s Ethics the basic text used every semester, supplemented by more modern (and more popular) readings.  “Simply put,” MacIntyre said, “the moral life aims at virtue” (p. ix), and he sought to determine how we become ethical persons rather than asking how to answer ethical questions.  Along with Josef Pieper, he upheld the “older view which held that our intellects are not to be creative but to be conformed to the truth of things—and that such conformity is increasingly possible only as we grow in virtue” (p. 23).

This is no purely “academic” question!  The fate of the world, the well-being of mankind, rests in the balance.  For we live in troubled times.  Indeed, MacIntyre suspected we face the kind of disintegrating culture historians describe in the centuries which marked the transition from the Ancient to the Medieval World, an era of “barbarism and darkness” (p. 263).  Today “the barbarians are not waiting beyond the frontiers; they have already been governing us for quite some time” (p. 263).  He tried to show how our modern (Enlightenment-shaped) culture has lost its moral integrity, collapsing into the emotivist ethics widely evident in the West.  When school children are taught—in “values clarification” sessions—to decide what’s right, on a case-by-case basis in accordance with their feelings, emotivism reigns.  It’s as deeply rooted in our culture as PC and TV.  Since different folks have different feelings our culture lacks any ethical coherence, any rational basis or objective standards.

Emotivists believe that saying “this is good” merely means “I approve of this and want you to feel likewise.”  So relativism reigns and everyone makes up his own rules.  Autonomous individuals insist on it.  Every man is his own ethicist—as well as his own historian, theologian, etc.  Thus:  “Seeking to protect the autonomy that we have learned to prize, we aspire ourselves not to be manipulated by others; seeking to incarnate our own principles and standpoint in the world of practice, we find no way open to us to do so except by directing towards others those very manipulative modes of relationship which each of us aspires to resist in our own case” (p. 68).  Friedrich Nietzsche’s celebration of the “will-to-power”—the “might-makes-right” ethics of unhampered autonomy—represents the final gasp of an Enlightenment-nurtured “morality” turned “immoral.”  A moment’s thought reveals the inevitable chaos concealed in such views, but Nietzsche discarded reason as well as morality, so we believe lies as well as behave immorally.

But not so fast, said MacIntyre!  There is an ancient and eminently defensible ethical philosophy with roots in Aristotle and Aquinas.  To doubt Nietzsche, to oppose the drift of his antinomian ethics, forces one to examine the tradition Nietzsche (and the Enlightenment) rejected:  Aristotle.  To MacIntyre there are only two options:  Nietzsche or Aristotle!  “For if Aristotle’s position in ethics and politics—or something very like it—could be sustained, the whole Nietzschean enterprise would be pointless.  This is because the power of Nietzsche’s position depends upon the truth of one central thesis:  that all rational vindications of morality manifestly fail and that therefore belief in the tenets of morality needs to be explained in terms of a set of rationalizations which conceal the fundamentally non-rational phenomena of the will” (p. 117).

    This leads MacIntyre to argue on behalf of Aristotle, whose ethical ideas have thrived through the centuries in Greek, Muslim, Jewish and Medieval Christian circles.  Each of these cultures rooted the virtues in “a cosmic order which dictates the place of each virtue in a total harmonious scheme of human life.  Truth in the moral sphere consists in the conformity of moral judgment to the order of this scheme” (p. 142).  Aristotle represents a “pre-modern” way of thinking, but in fact he may be the wisest guide through the tar pits of what many call “post-modernism.”   To Aristotle, “Virtues are dispositions not only to act in particular ways, but also to feel in particular ways.  To act virtuously is not, as Kant was later to think, to act against inclination; it is to act from inclination formed by the cultivation of the virtues” (p. 149).  A good man doesn’t “follow the rules” but becomes the kind of rightly-educated person who habitually, naturally does what is right.  

     Just as we learn to play the piano or throw a baseball by working with a teacher, so we learn to act ethically through discipline, instruction, habit.  Thus:  “A virtue is an acquired human quality the possession and exercise of which tends to enable us to achieve those goods which are internal to practices and the lack of which effectively prevents us from achieving any such goods” (p. 191).  We learn about the virtues primarily through stories.  Children need to hear stories which praise good and condemn evil.  We all need, to live virtuously, a steady diet of uplifting, challenging, admirable examples.  We need Bible stories, King Arthur stories, Jane Austen stories, C.S. Lewis stories.  The stories we tell, the songs we sing, the heroes we acclaim, the villains we despise, fundamentally shape our ethics.  

     MacIntyre’s After Virtue rewards reading and re-reading.  It provides the reader with many penetrating insights into the essence of “modernity” and some powerful suggestions as to the course we should take if we care for the welfare of coming generations.

                                     * * * * * * * * * * * * * * * * * * * * * * * * *

        John H. Garvey served as president of The Catholic University of America from 2011 until 2022.  Earlier he served as dean of Boston College Law School after teaching law at the University of Notre Dame.  Throughout his distinguished career at all these institutions he was constantly concerned with moral formation in the educational and legal world and routinely devoted his commencement remarks to the importance of the classical virtues.  He recently compiled his addresses in The Virtues (Washington, D.C.:  The Catholic University of America Press, c. 2022).  He begins by citing a review of Tom Wolfe’s I Am Charlotte Simmons in The New York Times, noting that Wolfe had “located one of the paradoxes of the age.  Highly educated young people are tutored, taught and monitored in all aspects of their lives, except the most important, which is character building . . . they find themselves in a world of unprecedented ambiguity . . . where it’s not clear if anything can be said to be absolutely true.”

Garvey gave the speeches and wrote this book to say yes, some things are absolutely true and should give you guidance throughout your life.  Unfortunately, the history of Harvard reveals a different trajectory.  Chartered in 1636, over the centuries Harvard has had three mottos:  Veritas (“Truth”); In Christi Gloriam (“to the glory of Christ”); and Christo et Ecclesiae (“for Christ and Church”).  But Harvard has changed.  Christ is no longer honored, nor is any truth considered absolute.  Harvard and other universities certainly provide a moral education, but it’s a libertine celebration of individual freedom to do whatever one feels.  And, like Mozart’s Don Giovanni, students mainly “want to be amused.”  Thus, in the infamous words of Supreme Court Justice Kennedy in Planned Parenthood v. Casey:  “At the heart of liberty is the right to define one’s own concept of existence, of meaning, of the universe, and of the mystery of human life.”  On the other hand, John Paul II insisted real freedom is not casting off restraints but doing what’s right.  So Garvey says:  “The virtues are habits that channel our freedom in the direction we ought to go.  They are principles of action that move us to do good things.  ‘Only virtuous people are capable of freedom,’ Ben Franklin says” (p. 15).  And virtuous people become so by forming good habits, developing a “second nature” that enables them to do what’s right almost instinctively.  Just as we develop the ability to play the cello through painstaking practice, so we develop our moral character through the disciplined doing of right things.

As a Christian, Garvey begins by focusing on the “theological virtues” basic to the Faith.  By grace, believers are given the faith, hope, and love needed to partially restore what was lost in the Garden of Eden.  We do not manufacture them—they come to us from another realm of reality.  Faith enables us to believe in God’s revelation.  Faith is never blind—it actually “enlarges our field of vision” to include invisible realms of reality.  It enables us to grasp and cling to God’s Word.  Just as I believe in the theory of relativity because Einstein demonstrated its truth (not because I understand it), so too I take God at His Word because I trust Him.  Hope places our ultimate joy in heaven alone.  It “is the virtue that connects our desire for heaven with God’s promises.  It links our deepest longing for happiness with God” (p. 45).  Love (or charity), the finest of all the virtues, puts God first and enables us to live out the Gospel.  “These three virtues are the heart of the Christian life.”  They bring a bit of heaven into our daily lives.  

Conjoined with the theological virtues are the “cardinal” virtues routinely cited by Plato, Ambrose and Aquinas:  Fortitude; Justice; Temperance; Prudence.  They are acquired by human effort though they certainly need God’s sustaining grace to develop.  First and most important is Prudence, the foundation of the other virtues.  It finds wise ways to get to the right end.  Given what seems to be a necessary choice between two apparent goods, prudence prescribes the best one.  At times it seems as if we must choose the “lesser of two evils,” and prudence helps us weigh the options and do what’s best in the situation.  We need to know ourselves, with all our strengths and weaknesses, so we need prudence to discern how to think and act well.  As the great novelist Flannery O’Connor once said:  “The older I get the more respect I have for Old Prudence.”  Justice assumes Cicero’s declaration:  Non nobis solum nati sumus (“We are not born for ourselves alone”).  We live a common life and need to rightly interact with others.  Doing justice means giving others what is due them.  Cicero insisted we do so only as we see justice as doing the will of God in our world.  To do so requires Courage or Fortitude.  It’s on display in heroic figures such as Joan of Arc and George Washington.  But it’s equally present in parents getting out of bed and going to work every day to support their families.  For most of us courage emerges in small ways—just doing what’s needed at home and in church, telling the truth, treating others kindly.  Tearing down things requires little fortitude, but building up institutions or persons really does.  Temperance, said St. Benedict, requires “moderation in all things.”  It’s not a headline-grabbing virtue but it really matters when living a good life.  It’s important, rather paradoxically in a culture committed to being amused, because it enables us to really enjoy the good things in life.  
As Fulton Sheen said:  “Happiness comes from self-possession through temperance, not from self-expression through license.”  

In addition to the supernatural and cardinal virtues, Garvey reminds us there are other “little virtues” worth commending.  St. Thomas Aquinas said the “entire universe, with all its parts, is ordained towards God as its end, inasmuch as it imitates, as it were, and shows forth the Divine goodness to the glory of God.”  Nothing is so small that it fails to celebrate God’s Being and Goodness.  So practicing the “little” virtues is a way to glorify and worship God.  Small traits such as “gentleness, modesty, and humility are ‘graces which ought to color everything we do,’” said St. Francis de Sales.  So Aristotle praised wittiness and liberality, Ben Franklin touted cleanliness, and constancy loomed large in Jane Austen’s fictional universe.  There are appropriate virtues for individual callings and stages of life.  

Consequently, Garvey considers some of the virtues most needed by younger folks who, John Paul II said, deeply desire truth.  They want to know what makes life meaningful, what gives one purpose and direction.  They want to know what career to follow, what person to marry, what worldview is worthwhile.  Youngsters easily substitute enthusiasm for wisdom, making horrendous mistakes.  But there’s much to praise in enthusiasm and the willingness to commit to high ideals and social change.  What they need is to cultivate “docility, humility, honesty, industriousness, studiousness, modesty, and silence” (p. 95).  Reading these brief sections, imagining how the students Garvey addressed might have responded, makes for enjoyable reflections.  

Docility means to be teachable.  Contrary to the “hermeneutics of suspicion” derived from Freud and Nietzsche, docility encourages listening (especially to the elderly and the witness of tradition).  Said Aristotle:  “we ought to attend to the . . . sayings and opinions of experienced and older people; because experience has given them an eye to see aright.”  Consequently:  “We call the University our ‘alma mater’ (our nourishing mother) because we trust that our professors are feeding us the truth.  We call the Church our Mother for similar reasons” (p. 105).  Humility allows us to rightly appraise ourselves, above all acknowledging God as Creator, imploring His aid in the many areas it’s needed.  Honesty, Garvey says in evaluating lawyers, is one of the two things that make good ones—trials and the judicial system ought to discover and uphold truth, dispassionately and without favoritism.  Modesty may apply to dress or comportment, but it’s essentially behaving appropriately, not making a spectacle of yourself.  

Industriousness makes good use of your time, investing your life in healthy and worthwhile things.  Studiousness “moderates our natural desire to know,” keeping us from unwise excesses.  Silence tempers the tongue!  So it’s a kind of temperance, recommended by Ben Franklin as the second of his thirteen virtues:  “Speak not but what may benefit others or yourself; Avoid trifling Conversation.”  

Turning to Middle Age, Garvey notes this is the time when we do most of our life’s work and shoulder major responsibilities, including marriage and family as well as occupations.  It’s a strenuous time but potentially packed with joy and satisfaction.  What we need in these years are truthfulness, patience, generosity, meekness, constancy, and hospitality.  “Above all,” wrote Dostoyevsky, “don’t lie to yourself.”  Tell the truth.  Don’t “live by lies.”  Basically, “Patience waits for the right means to do what is good.”  Off-the-cuff remarks (or emails), thoughtless actions, premature judgments all reveal a lack of patience.  Generosity means not “random acts of kindness” but thoughtful, loving things done for another person’s well-being.  Meekness enables us to restrain anger and keep control of our passions.  Constancy keeps us at our post, even when it’s not pleasurable.  Hospitality, evident in Mary and Jesus at the marriage in Cana, “creates the circumstances for the social dimension of love to flourish” (p. 144), and will be perfectly on display in heaven!  

Considering Old Age, Garvey thinks it has important purposes and requires significant virtues.  It’s a time for Repentance, coming to terms with failures early in life.  Speaking for himself, he says:  “I have learned that repentance is the duct tape of family life.  It can fix anything” (p. 152).  How many parents and children could be reconciled if only they repented!  Gratitude distinguishes happy old people!  It’s not an emotion evoked by passing celebrations, but a constant doxology—“praise God from whom all blessings flow.”  And Mercy, both accepted and given, ought to mark old age.  It’s a “grandparent’s virtue.”  Magnanimity enables the elderly to give, at times lavishly—and many of them have at last the means to do so.  Gentleness is much needed in a world filled with conflict and tension.  It’s a kind of charity that softens the blows and eases the sorrows of a fallen world.  Benignity (not a familiar word to many of us but meaning kindness) may well be uniquely evident in the elderly.  

Finishing his admonitions to virtue, Garvey praises Wisdom (one of the gifts of the Holy Spirit), Peace (primarily peace of soul), and Joy (a lasting satisfaction, unlike passing pleasures).  Above all, “try to find God in all things.”  The Virtues contains much to be praised—many illustrations and eminently comprehensible injunctions, all rooted in the great virtue tradition in ethics.  Would that all universities in America had men such as Garvey leading them!

                                * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Years ago I attended a scholarly conference and shared a dinner table with a young theologian, Matthew Levering.  In subsequent years he has published a number of scholarly works and joined an elite corps of gifted Christian thinkers.  Recently he published Dying and the Virtues (Grand Rapids:  William B. Eerdmans Publishing Company, c. 2018).  The great Church historian Jaroslav Pelikan once said:  “The core of Christian faith is pessimism about life and optimism about God, and therefore hope for life in God.”  How well we have lived certainly becomes manifest in our hour of death.  So, the author asks:  “What would it look like for a dying person to have been ‘hid with Christ in God’ and to ‘blossom’ on Christ’s grave?  My answer involves what I call the virtues of dying” (p. 4).  These virtues, he says, are “given by God” and totally dependent upon Him.  He believes these virtues exhibit that “it is only the cross of Christ that makes ultimate sense of human death, without which dying would be merely ‘the great wrecking ball that destroys everything’” (p. 5).  

Beginning with the “most excellent” of the virtues—Love—Levering cites Joseph Ratzinger:  “‘man’s longing for survival’ has its roots in ‘the experience of love,’ in which love wills eternity for the beloved and therefore for itself” (p. 13).  Having loved—and lost a loved one—almost automatically prompts one to hope for life everlasting wherein love endures.  Musing on this leads the author to ponder the message of Job, who “repeatedly returns to the question of whether God intends to annihilate him” (p. 15).  In the end Job finds assurance in the goodness of his Maker and the confidence that inasmuch as his Redeemer lives forever so shall he!  When we love we long for more time with our beloved.  Thus the virtue of Hope rightly focuses on eternal life.  Generally speaking, people who pathologically fear death have no hope for life after death.  But Christians who have (as St. Paul prescribed) died to the world and come alive in Christ face death peacefully.  The third of the supernatural virtues is Faith, the confidence that God is and will do what He said.  We who have walked by faith will pass through death’s valley relying on the One who designed and sustained us.  We’ve learned to “let go and let God” do His perfect will, enabling Him to work through us, perfecting such virtues as penitence, gratitude, solidarity, humility, surrender, and courage.  Death will be our final test, a step from mortality to immortality, from earth to heaven.  Now that Jesus has arisen, we need not fear the grave.  Ultimately, says Levering, “To understand our dying as an act of grateful living characterized by virtues, we need Christ” (p. 164).

Levering brings impressive erudition (35 pages of endnotes; a 27-page bibliography) and helpful illustrations to his discussion, making this a fine addition to the virtue tradition.  Not an easy read, but good things often cost us something.  

# # #


At a very young age children ask “why?”  A youngster may ask why there are 24 hours in the day.  In response his father will say because the earth rotates on its axis while it circles around the sun.  Then the child may very well want to know “how do you know that?”  The dad will probably explain that astronomers and physicists tell us how the solar system works.  The child may not fully understand, but he certainly wants to know, for we always wonder “how do you know what you know?”  Philosophically the realm of “epistemology” deals with this question, and it’s something that really matters.  If you’ve never pondered why numbers of men are “self-identifying” as women and competing in women’s athletics—or why the woman coaching South Carolina’s national championship basketball team supports such activities—you might not identify this as a deeply epistemological issue, but it is.  We confront the transgender question because of philosophical developments during the modern era.  Along with the ancient sophists, modern thinkers affirm that “man is the measure of all things.”  As philosophical nominalism gained traction, science displaced theology as the “queen” of academia, deism ousted theism, and what we call the “modern world” developed.  For six centuries now, thinkers have increasingly taken universals such as truth, goodness and beauty to be mere names we humans paste, like post-it notes, on things.  “The issue ultimately involved,” says Esther Meek, “is whether there is a source of truth higher than, and independent of, man; and the answer to the question is decisive for one’s view of the nature and destiny of humankind.  The practical result of nominalist philosophy is to banish the reality which is perceived by the intellect and to posit as reality that which is perceived by the senses.  With this change in the affirmation of what is real, the whole orientation of culture takes a turn, and we are on the road to modern empiricism” (p. 3). 

To overly simplify the story, in the 17th century two thinkers charted the course for our world.  Francis Bacon insisted we can only know empirical facts in the physical world and provided a handbook for the “scientific revolution” then unfolding.  He championed the “inductive” approach to knowledge.  At the same time René Descartes argued we can know with certainty only what’s absolutely self-evident and undeniable in our minds.  He followed an essentially deductive way of knowing.  Together they opened the way to an increasingly subjective notion of truth.  One of the best analyses of all this was set forth in Richard Weaver’s Ideas Have Consequences—published in 1948 but still remarkably prescient.  He wrote the book “as a challenge to forces that threaten the foundations of civilization,” fearing Western Civilization was collapsing under the assaults of nihilists who acknowledged no absolute truths, no permanent values.  Weaver called this a “vertical invasion of the barbarians”—a cultural catastrophe equal to that visited upon the Ancient world by the Goths and the Vandals.  To do battle with modern barbarians Weaver sought to defend “the mind itself, and its capacity to actually know the Reality designed by a Higher Mind.”

Weaver traced this struggle back to the 14th century when nominalism began replacing realism as the dominant epistemology.  William of Occam replaced Thomas Aquinas, skepticism replaced certainty, and the decline of the West began.  “The defeat of logical realism in the great medieval debate was the crucial event in the history of Western culture; from this flowed those acts which issue now in modern decadence.”  What’s been lost is the “power of the word”—the word which aligns our minds with the Word which was (and is) God.   Weaver sought to reestablish the realism of the ancients, insisting philosophy begins not with skepticism but with wonder and that “sentiment is anterior to reason.”  Ideas and ideals, virtues and virtuous heroes, a love for one’s ancestors and descendants, a vision of the eternal good and a commitment to its acquisition, must find roots in the hearts of those who would restore our culture.  But beyond diagnosing the ills we confront Weaver proposed no way, given the scientific-industrial world we live in, to recover the wisdom of the ancients.

Enter Esther Lightcap Meek, a Christian philosopher most recently teaching at St. Louis University, who finds in Michael Polanyi a thoughtful guide to help us think about thinking.  In her Contact with Reality:  Michael Polanyi’s Realism and Why It Matters (Eugene, OR:  Cascade Books, c. 2017; Kindle Edition) she set forth her case.  “In this lively book,” says D.C. Schindler (a noted Christian professor), “Esther Lightcap Meek does more than simply make a compelling case for Polanyi’s realism in the context of dominant epistemologies and philosophies of science; she also brings out a beautiful dimension of Polanyi’s thought that is not often seen, deepening its metaphysical underpinnings through creative engagement with contemporary thinkers.  This book makes a much-needed contribution to the reception of Polanyi—and offers a fresh, new way to think about reason more generally.”  

   Many years ago I gave a lecture at my alma mater dealing with “light as a symbol of truth,” pointing out that light may appear as either a wave or a stream of particles.  So too truth may appear as a broad pattern (a field) or as individual data.  Following the lecture a physics professor chatted with me and mentioned the importance of Michael Polanyi for scientists such as himself.  Subsequently I read most of Polanyi’s works (especially his magnum opus, Personal Knowledge) and found him both fascinating and frequently persuasive.  Reading Esther Meek’s treatise renewed within me an appreciation for Polanyi’s insights.  Her book begins with his statement:  “We can account for this capacity of ours to know more than we can tell [personal knowledge] if we believe in an external reality with which we can establish contact.  This I do.  I declare myself committed to the belief in an external reality gradually accessible to knowing, and I regard all true understanding as an intimation of such a reality which, being real, may yet reveal itself to our deepened understanding in an indefinite range of unexpected manifestations.” 

Michael Polanyi was a physical chemist of considerable renown whose interests turned ultimately to philosophy.  A Hungarian of Jewish descent born in 1891 to a prosperous, socially-eminent family, he interacted with the likes of Albert Einstein and sired a son (John) who won a Nobel Prize.  He gave the Gifford Lectures in Natural Religion in the early 1950s, which were published as Personal Knowledge in 1958.  Though Polanyi explored multiple fields, Meek wants to focus on his philosophical realism.  “At the heart of what Polanyi was about, especially in his stepping away from science to do philosophy, was his concern to offer a fundamentally different epistemology that, rather than undercutting science (not to mention all of Western culture)—as he felt the prevailing paradigm was doing—would save it and enhance it.”  What we label “modernity” took a skeptical approach that cut “us off from the natural trust and communion with reality that lies at the heart of humanness” (p. 5).  As an eighth-grader Meek inhaled this skepticism, thinking she could only know what resided in her own mind.  Her modernist guides sought to conquer nature rather than commune with it.  Truth was essentially subjective.  Then Postmodernism pushed this further, declaring we “construct” the world, even to the point of declaring a man is a woman!

Contrary to many 20th century epistemologists (including Esther Meek as a child), who were deeply skeptical regarding the possibility of knowing much of anything, Polanyi sought to give us ways to actually know and trust our knowing.  We can discover that our insights “ring true to what we actually do when we come to know—when we know, that is, not only in frontline scientific research and discovery, but throughout all the byways of ordinary life” (p. 3).  Our minds can actually come into correspondence with reality—we can know (as Aristotle, Aquinas, et al. insisted) what is.  Accordingly, Meek has carved out what she calls a “covenant epistemology” and “moved from child skeptic to seasoned intoxicated realist” (p. 4).  Though a Protestant in the Reformed tradition, Meek finds herself drawn to the work of the Swiss Catholic theologian Hans Urs von Balthasar, who “has uncannily and aptly portrayed the philosophical trajectory of my life—and possibly yours.”  We all wrestle with basic questions which “keep coming back” as we “drill more deeply into the mysterious abyss of being.”  We wonder, von Balthasar says:  “Does truth in fact exist?”  And that leads us to wonder whether “being exists at all” (p. 8). 

Realists think things exist whether or not we think about them.  They think we can truly know them—somewhat as an x-ray reveals what is under the skin—as we discern “essences” in what is.  Realists simply assume, without bothering to prove, that we are in a knowing relationship with the external world.  For them, we know what makes a circle a circle, a hawk a hawk, a woman a woman.  Polanyi certainly allowed for a subjective aspect to knowing—thus he emphasized personal knowledge.  But personal does not mean subjective!  In fact, Meek argues, it is simply an important component of his realism.  He also rejected “the false ideal of objectivity” entertained in the scientific community, which sometimes claims to function in detached, mechanistic ways.  Such knowledge Polanyi labels “explicit,” and it is espoused by many scientists who want an impersonal, mathematical standard of truth.  Polanyi, however, sought to think in personal, not mechanical, ways.  Indeed, he often said:  “We know more than we can tell.”  

We come to know what is through the process of discovery, discerning what we tacitly know and need to clarify.  We don’t “construct” truth—we discover things and bring our minds into correspondence with them.  Seeing something, whether a star or a snowflake, we assume it’s there and that we can truly discover things within it.  “Discovery involves the transference of information, not from one mind to another, but into the mind in the first place.  If knowledge is wholly explicit, there can be no learning, no discovery, and thus no scientific knowledge.  Discovery . . . involves the germination of new hunches and ideas and the pursuit of those hunches despite the absence of any sort of justification” (p. 21).  Such is part and parcel of the scientific method.  We can find explicit truths because we rely upon oft-unconscious tacit knowledge.  “Knowledge, therefore,” Meek says, “is objective by virtue of responsible personal involvement, explicit by virtue of its tacit root, and examinable by virtue of our foundational commitments” (p. 23).  Polanyi insisted that much of what we know is tacit rather than explicit.  “We know more than we can tell.”  We deal with—and synthesize—knowledge of particular things and comprehensive wholes.  Giving attention to the particulars cannot be severed from a deep-level awareness of their context. 

Still more:  this kind of thinking involves intuition and imagination.  Scientists probing the problems facing them as they do research frequently have sudden moments of insight, breakthrough intuitions that suddenly provide answers unavailable to computer-style computations.  As with Archimedes pondering how to discern real gold by measuring the water it displaced and then running through the streets of Syracuse shouting “Eureka!”, many scientists confess to an almost mystical awareness of solutions to their questions.  This is a “dynamic intuition” invaluable to deep-level thought, and it “recognizes clues and somehow ‘measures the distance’ between the present understanding and the intuited focus.”  To illustrate this Polanyi noted “that we have all experienced it in the common attempt to remember someone’s name; we know somehow that we are close and then closer to having it; we speak of its being ‘on the tip of my tongue’” (p. 45).   

Contact with Reality was a re-working of Meek’s Ph.D. dissertation, revealing all the strengths (careful research) and weaknesses (largely inaccessible to readers not grounded in philosophy) of such works.  But the main point, evident in the title, is this:  one of the finest 20th century thinkers provides a way to take a deeply realistic approach to knowledge.

                        * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

After teaching for many years and working with others in the Polanyi Society, Esther Meek published some reader-friendly books showing why she regards Polanyi so highly.  In Longing to Know:  The Philosophy of Knowledge for Ordinary People (Grand Rapids:  Brazos Press, c. 2003; Kindle Edition), she targeted “people who wrestle with questions concerning truth and the possibility of knowledge as a result of our culture’s recent consensus shift from modernism to postmodernism” (p. 7).  Postmodernism is deeply skeptical, claiming all we have are “narratives” telling various stories.  We may embrace one or more of the stories, but none of them is actually true.  So we must deal with people who not only deny objective truth but deny there’s any truth at all!  Everything’s “your opinion,” they say!  

In particular Meek wants to provide Christians a firm foundation for their faith.  She thinks “that many questions can be answered, at least preliminarily, and many puzzles solved, and personal hope of truth restored, by appropriating this [Polanyi-crafted] model of how we know.  I believe the model is confirmed by the ordinary day-to-day experiences of every human being” (p. 9).  Though the endeavor may be difficult—life itself, and philosophy itself, can be quite difficult!—knowing the truth is worth the work.  Ultimately, and above all else, she says, we can know God.  Nothing can be finer, for:  “If God is, what he is has far-reaching consequences for our lives—who we are, how we live, and what happens after death.  Perhaps the simplest way to say it is this:  If God is, and he is master of all, then he is master of you and your world.  If he isn’t, then you are.  You might see one or the other alternative as the preferable one.  But it’s impossible to be indifferent about the choice; it hits just too close to home for comfort” (p. 17).  

Meek was reared in a Christian home and believed in God, but she could not suppress many questions about Him and our ability to know anything about Him.  Over the years she has worked with students just like herself—wanting to believe but unsure if there is any warrant for belief.  To know, to engage in what she calls an “epistemic act,” requires much more than just taking someone’s word for something.  There’s a difference between thinking and knowing.  I may think it’s freezing outside and be wrong.  If I know it’s actually freezing, there’s a certainty as to what is.  “Know is a success word: when we use it we imply that we were successful at getting the truth right.  So we have thought that knowing something means that what we claim to know can’t be wrong or we cannot doubt—that it is infallible, or certain.  For knowledge to be knowledge at all, it must be infallible or certain.  Otherwise it is opinion, or belief, but not knowledge” (p. 26).  She writes:  “My point is going to be this: If knowledge is as philosophers have thought for centuries, if our efforts to know have certainty as their uncompromising ideal, then skepticism seems the inevitable alternative.  But our lived experience witnesses powerfully that this cannot be.  So maybe we need to revise how we think about knowledge” (p. 28).  

We do actually know things, and God is truly knowable.  We cannot know everything about Him—indeed we may be able to know just a little bit about Him—but it’s still trustworthy knowledge.  We know something when we integrate scattered bits of information into a more coherent pattern.  We see a leaf, then leaves, then the tree sustaining them.  We’re capable of grasping “a coherence, an integrated pattern, a making sense of things, that opens the world to us” (p. 50).  Certainly “all truth’s someone’s truth”—there is a personal perspective to all knowing.  But to acknowledge this does not mean, by any means, that “truth is relative” or nonexistent!  To know involves “commitment, love, and faith.  But it is not subjectivistic, relativistic, privatistic—those unfortunate labels that many have thrown at faith and that many have embraced as the death of truth.  It is not subjectivistic; it is human.  It is embodied, responsible human skill” (p. 60).  Michael Polanyi’s Personal Knowledge, Meek believes, delivers us from the skepticism embedded in modernity.  

Induction and deduction have their places in a theory of knowledge.  But much more is involved than collating and arranging data or following mathematical formulae.  We know things best when we deal with our world much like a detective, following clues and noticing patterns, unlocking mysterious boxes, finding traces in the creation leading us rightly.   It’s what Lewis and Clark did leading the famous expedition up the Missouri River, over the continental divide, and down the Columbia River.  They were learning as they went and discovered fascinating sites.  So too we learn “to know God” by getting “tips” from daily life.  There are momentary insights, curious coincidences, unexpected illuminations that enable us to know Him.  As an amazing “image of God” we’re bursting with information ranging from the inner workings of tiny cells to the mysterious processes of recalling long-dormant memories.  We learn to walk, type, play a piano and ultimately do so without consciously commanding actions.  Meek thinks such “bodily clues are included in our experience of God, and I don’t think of it as a mystical experience” (p. 93).  It’s one of the many ways we can come to know Him.  

This Polanyi-kind of knowing helps us immensely “when it comes to our main question—whether we can know God.  It offers hope about whether we can know anything at all.  It dissolves some of the puzzles about knowing that have plagued thinkers for centuries, puzzles generated by a faulty, unrealistic model of knowing.  And it helps us see things in fresh and exciting ways, for it aptly and evocatively fits our ordinary human experience” (p. 56).   Such knowing enables us to trust our insights into a very real world independent of ourselves.  We actually “contact” it.  More than believing our ideas “correspond” to the external world, Polanyi-kind of knowing assures us that we are truly in touch with what’s Real, including God.  

                                 * * * * * * * * * * * * * * * * * * * * * * * * *

In Loving to Know:  Introducing Covenant Epistemology (Eugene, OR:  Cascade Books, an imprint of Wipf and Stock Publishers, c. 2011; Kindle Edition), Esther Meek continued to build her case for the importance of Michael Polanyi in Christian theology.  Unlike her previous treatise, Longing to Know, this one is written for persons considering the claims of Christianity but unsure whether they can know anything at all about it.  For Meek nothing is more important than epistemology—knowing how and what we can truly know.  We’re knowers who know something about what can be known.  We deal with it all the time—only philosophers try to be more precise and provide illuminating terms to help us think well.  “‘Epistemological therapy’ is what I call my personal effort to help people reform their default epistemological settings in a way that brings health, hope, and productivity” (p. 6).

Meek’s on a mission to get people to “care about knowing.  Because not to care is to be dead. Indifference to one’s surroundings is a telltale sign of sickness, of impending death.”  Importantly:  “It is human to care.  Boredom, absence of wonder, is a sign of sickness.  If our outlook on knowledge is such that it leads to boredom, then something is amiss in our outlook on knowledge” (p. 31).  We need to be attentive to our deepest inner longings, for following them helps us find out why we’re here, what we should do, who we can become.  Ultimately we want to know Reality in its fullness.  To know it “calls for an attentiveness on our part that is far less like a dispassionate cataloguing of information and more like passionate indwelling of that half-hidden object of our care,” and if “knowing is care at its core, caring leads to knowing. To know is to love; to love will be to know” (p. 33).

To help her readers on this journey into covenant epistemology, Meek utilizes the works of Annie Dillard, Lesslie Newbigin, and Parker Palmer.  Dillard’s Pilgrim at Tinker Creek helped her see a covenantal aspect of knowing.  When we join Dillard and carefully study nature we discover a wonderful world of complex creatures bestowing upon us an awareness of grace.  Newbigin stressed the importance of finding Jesus as The Truth and entering into a personal relationship with Him.  Palmer, a Christian philosopher, developed a personalist epistemology that weds one to the Creator, finding truth within a loving relationship with Him.  Living within such a relationship—not standing apart and asking abstract questions—enables us to actually know Ultimate Reality.  Doing so brings great joy, the joy of discovering what we most deeply desire to know.  So we both give ourselves to and invite what’s Real to join us in discovering what’s of ultimate concern.  To our delight we find that the Real is most profoundly personal.  He’s Real and we can know Him if we attend to His Presence. 

# # #


Asking what’s gone wrong with our kids is an ancient endeavor, but these days we must deal with what seems to be an unusually troubled younger generation awash in a “youth mental health crisis.”  Abigail Shrier, in Bad Therapy:  Why Kids Aren’t Growing Up (New York:  Penguin Publishing Company, c. 2024; Kindle Edition), offers a thoughtful analysis that merits attention.  While acknowledging some youngsters need serious psychological treatment, she’s concerned about “the worriers; the fearful; the lonely, lost, and sad.  College coeds who can’t apply for a job without three or ten calls to Mom.”  They’re not mentally ill but they’re doing poorly and look for “diagnoses to explain the way they feel.”  Rarely does this help, but:  “We shower these kids with meds, therapy, mental health and ‘wellness’ resources, even prophylactically.  We rush to remedy a misdiagnosed condition with the wrong sort of cure” (p. xii).  

Shrier remembers how she was reared.  Parents spanked when necessary and rarely worried about their kids’ feelings.  They were told where to go, how to dress and behave.  Probing their kids’ psyches for some “repressed identity” never occurred to them.  “But as millions of women and men my age entered adulthood,” she says, “we commenced therapy.  We explored our childhoods and learned to see our parents as emotionally stunted.  Emotionally stunted parents expected too much, listened too little, and failed to discover their kids’ hidden pain.  Emotionally stunted parents inflicted emotional injury” (p. xv).  Resolving to do better, her generation determined to rear “happy” kids.  “We resolved to listen better, inquire more, monitor our kids’ moods, accommodate their opinions when making a family decision, and, whenever possible, anticipate our kids’ distress.  We would cherish our relationship with our kids. Tear down the barrier of authority past generations had erected between parent and child and instead see our children as teammates, mentees, buddies” (p. xvi).   And to do so they trusted a bevy of “wellness experts.”   

Such experts were anxious and willing to help!  Therapy would solve all problems.  To provide more help than professional therapists could give, school administrators jumped into the “crisis” and urged teachers to counsel and coddle their students, becoming “partners” with their parents in providing emotional comfort.  Shrier and millions more “bought in, believing they would cultivate the happiest, most well-adjusted kids.  Instead, with unprecedented help from mental health experts, we have raised the loneliest, most anxious, depressed, pessimistic, helpless, and fearful generation on record.  Why?  How did the first generation to raise kids without spanking produce the first generation to declare they never wanted kids of their own?  How did kids raised so gently come to believe that they had experienced debilitating childhood trauma?  How did kids who received far more psychotherapy than any previous generation plunge into a bottomless well of despair?” (p. xvii).  How indeed?  

The answer was therapy and more therapy.  Tragically, Shrier thinks, this resulted in an epidemic of iatrogenesis, illustrating how “healers” often make things worse.  “Well-meaning therapists often act as though talking through your problems with a professional is good for everyone.  That isn’t so” (p. 8).  Talking doesn’t always help, as careful studies dealing with policemen, burn victims, breast cancer patients, and grief-ridden mourners show.  Folks who say they don’t want to talk about their problems are often much wiser than those who insist they do!  Unfortunately we’ve been persuaded that lots of us are sick.  Most Gen Zers think they have mental health issues and almost “40 percent of the rising generation has received treatment from a mental health professional—compared with 26 percent of Gen Xers” (p. 17).  To meet their needs “wellness centers” have sprouted on most college campuses and professors are routinely advised to make allowances for all sorts of mental health problems.  Yet the problems proliferate with no indication that treatments succeed.  

   This suggests, to Shrier, that we’ve been overwhelmed by “bad therapy.”  She thinks this, in part, because of scholars like Camilo Ortiz, a “tenured professor and leading child and adolescent psychologist.”  His research shows that “individual therapy has almost no proven benefit for kids.”  If anyone needs help it’s the parents,  who too often “unwittingly transmit their own anxiety to their kids.”  However,  “numberless psychotherapists not only offer individual therapy to young kids, they practice techniques like ‘play therapy’ that have shown scant evidence of benefiting kids. In fact, there’s very little evidence that individual (one-on-one) psychotherapy helps young kids at all” (p. 40).  

Consulting Ortiz and other scholars, Shrier lists 10 Bad Therapy steps:  (1)  Teach Kids to Pay Close Attention to their Feelings—in fact they should learn to distrust their emotions and often repress them;  (2)  Induce Rumination—in fact rehashing often harms a person; (3)  Make “Happiness” a Goal but Reward Emotional Suffering—in fact happiness comes as a result of doing things well; (4) Affirm and Accommodate Kids’ Worries—in fact they need to confront and deal with them; (5) Monitor, Monitor, Monitor—in fact they need to be supervised less and left alone much more than they are; (6)  Dispense Diagnoses Liberally—in fact we need to stop labeling kids’ disorders (e.g. ADHD) and treat them as ordinary aspects of growing up; (7) Drug ’Em—in fact drugs such as Ritalin should be administered only as a last resort; (8) Encourage Kids to Share Their “Trauma”—in fact, the less they share the better they fare; (9) Encourage Young Adults to Break Contact with “Toxic” Family—in fact few families are truly “toxic” and severing oneself from those who love them best rarely helps anyone; (10) Create Treatment Dependency—in fact interminable therapy sessions enrich counselors while harming kids.  In sum:  “Bad therapy encourages hyperfocus on one’s emotional states, which in turn makes symptoms worse” (p. 64).

Compounding the bad therapy kids may get in counselors’ offices, the nation’s schools are redesigning themselves as therapeutic care centers, flying the flag of  SEL (social-emotional learning).  Add to SEL the “restorative justice” President Barack Obama urged in his 2014 “Dear Colleague Letter” threatening schools with loss of funding if they continued to suspend and expel a disproportionate number of minority kids.  This presented schools with a quandary:  How do you maintain order without punishment?  The Dear Colleague Letter spelled out the solution:  “‘restorative practices, counseling, and structured systems of positive interventions.’  Violent kids were rebranded as kids in pain.  Schools stopped suspending or expelling them.  And a newly invigorated era of mental health in public schools was born. ‘Restorative justice’ is the official name for schools’ therapeutic approach that reimagines all bad behavior as a cry for help” (p. 94). 

Consequently, says Shrier:  “For more than a decade, teachers, counselors, and school psychologists have all been playing shrink, introducing the iatrogenic risks of therapy to schoolkids, a vast and captive population” (p. 71).  Teachers—even in math classes—may very well begin the day by asking their students how they’re feeling and even engaging in forms of group therapy urging them to confess their deepest anxieties.  “Sometimes described by enthusiasts as ‘a way of life,’ social-emotional learning is the curricular juggernaut that devours billions in education spending each year and upward of 8 percent of teacher time” (p. 77).  In SEL sessions kids’ parents are often blamed for a variety of problems and often referred to as “caregivers” or “service providers” who fail to treat their clients well.  Some “experts” even dismiss parents as “morons” incapable of rightly parenting.  Meanwhile, under their guidance kids behave worse and schools appear increasingly anarchical.  

And their parents, determined to be “gentle” with their children, cooperate with the teachers and therapists who declare they know how kids should be reared.  Earlier generations, however, “had a more masculine style of parenting.”  Dads usually did the disciplining but moms certainly did their fair share.  “This is the style I’ve called ‘knock it off, shake it off’ parenting.  The sort that met kids’ interpersonal conflict with ‘Work it out yourselves,’ and greeted kids’ mishaps with ‘You’ll live.’  A loving but stolid insistence that young children get back on the horse and carry on” (p. 168).  The “Battered Mommy Syndrome”—kids punching and kicking their parents—was unheard of.  Three-year-olds weren’t asked for advice, nor did children dictate the dinner menu.  Youngsters were rarely considered particularly “sensitive” or “brittle” since they usually proved quite resilient in even difficult situations.  

Indeed they can be resilient if only they’re left alone to grow up as kids have done for centuries.  They don’t need drugs or counselors or constant monitoring.  Shrier has determined to relax and let her kids take risks, fall down and get up without dramatics, make friends on their own, etc.  She’s persuaded that many alleged childhood “disorders” can be corrected by assigning chores and demanding respect for elders.  Above all she urges readers to discern and flee bad therapy.  On a broader scale, perhaps all of us should free ourselves from the bad therapeutic culture advancing upon us in virtually all our institutions.

                                                          * * * * * * * * * * * * * * * * * * * * *

Much that Abigail Shrier describes in Bad Therapy was discerned by one of the finest thinkers of the past century, Philip Rieff, in his The Triumph of the Therapeutic: Uses of Faith After Freud (Chicago: University of Chicago Press, c. 1966), wherein he  warned educational and religious leaders to beware of psychological nostrums.  We’ve witnessed his prophetic insights as what Christian Smith called “moralistic, therapeutic deism” has secured a dominant position in the cultural landscape.  In our world, Rieff said,  “hospital and theater,” fitness centers and films, are replacing family and nation.  “Religious man was born to be saved,” he noted, but “psychological man is born to be pleased.  The difference was established long ago, when ‘I believe,’ the cry of the ascetic, lost precedence to ‘one feels,’ the caveat of the therapeutic” (p. 25).  Within the Christian tradition, this trend surfaced as early as 1857, when “Archbishop Temple put into clear English what had been muddled in German ever since the time of Schleiermacher:  ‘Our theology has been cast in a scholastic mould, all based on logic. We are in need of, and are actually being forced into, a theology based on psychology” (p. 42).  

Ancient and Medieval civilization developed through the subjugation of our sensual desires, choosing to follow moral standards rather than pleasures.  The highest kind of knowledge is attained through faith—knowing and obeying God.  The good life lies in conforming to creation and the Creator.  This tradition of self-discipline and responsibility—labeled by Rieff a “dialectic of perfection, based on the deprivational mode” celebrating martyrs and saints—“is being succeeded by a dialectic of fulfillment, based on the appetitive mode” (p. 50).  Rather than restraining himself, psychological man seeks to “be kind” to himself.  An egoistic ethic of self-esteem and tolerance replaces the ethic of repentance and sanctity.  The “ideal man,” from Plato to Tocqueville, was understood to be a “good citizen,” sacrificing his own interests for the welfare of others.  In the emergent therapeutic culture, however, the “ideal man”—as is evident everywhere from the Oval Office to the box office—knows how to amuse himself.

Beginning with Francis Bacon, however, a new approach, progressively shaped by psychoanalytic theory, has exerted control.  According to this theory, we must learn how to change what is, to “create our own” reality, to craft whatever suits our desires.  Marx wielded philosophy as a hammer and sickle for social change.  Freud proffered clients insights whereby they could choose whatever seemed desirable.  Jung and Adler and hosts of lesser folks followed suit, and our world is largely ruled by folks who want to rule!

This led Rieff into extensive discussions of Jung, Reich, and D.H. Lawrence—thorough, penetrating, illuminating analyses.  He showed, persuasively, how the “sexual revolution” has its roots in the likes of such intellectuals, and he makes clear how effectively they have subverted Western civilization.  
As a result of adopting this therapeutic approach, many folks in our society lack the arresting sense of sin which typified the classical culture.  Indeed, it is “incomprehensible to him inasmuch as the moral demand system no longer generates powerful inclinations toward obedience or faith, nor feelings of guilt when those inclination are over-ridden by others for which sin is the ancient name” (p. 245).  No longer haunted by sin, modern man feels no need for salvation, no desire for a Savior.  So churches emphasize “religious” experiences and advertise “spiritual” therapies designed to help vaguely distressed people feel better.  Many, indeed, have “become, avowedly, therapists, administrating a therapeutic institution–under the justificatory mandate that Jesus himself was the first therapeutic” (p. 251).  Such, Rieff insists, is quite wrong-headed and needs to be rejected, but it’s what’s happened under the reign of modernity.

                                 * * * * * * * * * * * * * * * * * * * * * *

In My Life among the Deathworks: Illustrations of the Aesthetics of Authority (Charlottesville: University of Virginia Press, c. 2006), the first volume of a trilogy entitled Sacred Order/Social Order, Philip Rieff explored the fact that “cultures give readings of sacred order and ourselves somewhere in it.”  Throughout human history, James Davison Hunter explains, all cultures have been “constituted by a system of moral demands that are underwritten by an authority that is vertical in its structure. . . .  These are not merely rules or norms or values, but rather doxa: truths acknowledged and experienced as commanding in character” (p. xix).  
First World (pagan) and Second World (Judeo-Christian) cultures—to use Rieff’s categories—humbly aligned themselves with a higher, invisible Reality:  the Sacred.

The modern (what Rieff labels “Third World”) culture shapers, working out the position espoused by Nietzsche’s Gay Science declaring that “God is dead,” have negated that ancient sacred order.  Turning away from, indeed assailing, any transcendent realm, they have rigidly restricted themselves to things horizontal—material phenomena and human perspectives.  Rather than reading Reality, they actively encourage illiteracy regarding it—e.g. idiosyncratic “reader responses” to “texts,” the venting of personal opinions, and the construction of virtual realities.  Their relentless attacks upon the sacred are what Rieff calls “deathworks” that are both surreptitious and ubiquitous, shaping the arts and education, dominating movies and TV, journalism and fiction, law schools and courtrooms.  As he says:  “There are now armies of third world teachers, artists, therapists, etc., teaching the higher illiteracy” (p. 92).  

Throughout his treatise, Rieff weighed the import of the raging culture war.  This Kulturkampf “is between those who assert that there are no truths, only readings, that is, fictions (which assume the very ephemeral status of truth for negational purposes) and what is left of the second culture elites in the priesthood, rabbinate, and other teaching/directive elites dedicated to the proposition that the truths have been revealed and require constant rereading and application in the light of the particular historical circumstance in which we live.  And that those commanding truths in their range are authoritative and remains so” (p. 17).  He especially emphasizes that:  “The guiding elites of our third world are virtuosi of de-creation, of fictions where once commanding truths were” (p. 4).  By denying all religious and moral truths, they have established an effectually godless “anti-culture.”

Rieff’s analyses of influential artistic works (many of them reproduced in the text) are particularly insightful and persuasive.  What was evident a century ago in only a few artists (James Joyce and Pablo Picasso) and psychoanalysts (Sigmund Freud and Carl Jung) now dominates the mass media and university classrooms, where postmodern gurus Michel Foucault and Jacques Derrida are routinely invoked.  One thing these elites will not acknowledge:  any transcendent “divine creator and his promised redemptive acts before whom and beside which there is nothing that means anything” (p. 58).  Nietzsche fully understood this, propounding “a rationalism so radical that it empties itself, as God the Father was once thought to have emptied himself to become very man in the Son” (p. 70).

Rieff’s grandfather, a survivor of the Nazi death camps, “was appalled to discover not only in the remnant of his family in Chicago but in the Jewish community of the family’s Conservative synagogue . . .  
that the Jewish sense of commanding truth was all but destroyed.  Those old traditions were treated as obsolete, replaced by the phrase that horrified my grandfather most:  everyone is entitled to their own opinion” (p. 82).  The nihilism of the Nazis flourished in Chicago!  To Rieff, Auschwitz signifies “the first full and brutally clear arrival of our third world” (p. 83).  But the death camps, both Nazi and Bolshevik, were simply the logical culmination of Hamlet’s ancient view that “there is nothing good or bad in any world except thinking makes it so” (p. 83).  What was manifest in Auschwitz, Rieff says, is equally evident in the world’s abortion mills!  In one of Freud’s letters, we read a “death sentence, casually uttered, upon sacred self:  ‘Similarly birth, miscarriage, and menstruation are all connected with the lavatory via the word Abort (Abortus).’  How many things,” Rieff muses, “turn before my eyes into images of our flush-away third world” (p. 104).  Rejecting “pro-choice” rhetoric, he insists:  “The abortionist movement does bear comparison with the Shoah [the Jewish Holocaust].  In these historic cases both Jews and ‘fetuses’ are what they represent, symbols of our second world God.  It is as godterms that they are being sacrificed” (p. 105).

My Life among the Deathworks, says Hunter, “is stunning in its originality, breathtaking in its erudition and intellectual range, and astonishing in the brilliance of its insights into our historical moment” (p. xv).  It is, however, “difficult, intentionally so,” because “Rieff wants the reader to work for the insight he has to offer; to read and then reread” (p. xvi).  The book rather resembles Pascal’s Pensées—a collage of aphorisms and illustrations (many of them paintings) rather than a systematic development of a thesis.  It does, however, richly reward the reader’s persistence.

* * * * * * * * * * * * * * * * * * * * *

Philosopher Mark Goldblatt would consider bad therapy a result of philosophical developments leading to the declaration: I Feel, Therefore I Am: The Triumph of Woke Subjectivism (New York:  Bombardier Books, c. 2022; Kindle Edition).  He cites G.K. Chesterton’s words from a century ago:  “We shall soon be in a world in which a man may be howled down for saying that two and two make four, in which furious party cries will be raised against anybody who says that cows have horns, in which people will persecute the heresy of calling a triangle a three-sided figure, and hang a man for maddening a mob with the news that grass is green.”  Reason is drowning in a sea of emotion wherein everyone decides what is true or false, right or wrong, on the basis of how it feels.  This is not exactly a unique moment, however, for Pilate asked Jesus “What is truth?”  He seemed to be tossing aside the possibility of Truth’s existence in any transcendent sense.  To him truth was simply instrumental, finding out what works to one’s own advantage.  He had the power to kill Jesus and did so, washing his hands in the process.  Today’s subjectivism, amply evident, is clearly rooted in the perspectivism of Friedrich Nietzsche, who called Pilate the “solitary figure worthy of honor” in the New Testament.  In Goldblatt’s view, we are now “having a Pontius Pilate moment.  What is truth?  Whatever you will.  Whatever you can.  Whatever you dare” (p. 7).

Many of the major issues confronting us are, most deeply, questions of truth.  Is truth a clear seeing and accepting of what is—a “correspondence” between what I think and Reality—or is it merely what I imagine things are or ought to be?  Is truth objective or subjective?  “Objective truth is revealed by a careful examination of evidence and the application of logic to that evidence.  
Objective truth is true regardless of our subjective feelings about it because it is anchored in the object of the belief or proposition; it is a relationship between out-there and in-here, an alignment between the two” (p. 15).  Philosophically it’s a form of realism.  Subjectivism, however, is a branch of idealism that “foregrounds not reality but perception.”  Following George Berkeley, “To be is to be perceived. Esse est percipi.”   As Goldblatt shows, many modern movements—Black Lives Matter, Transgenderism, et al—share this subjectivism.  MY truth is THE truth!  And feelings are triumphant!

375 An Apostolic Agenda

Taking seriously Pope Francis’ recent words to the Roman Curia—“Brothers and sisters, Christendom no longer exists!”—some scholars at the University of Mary (a college in Bismarck, North Dakota, that was established in 1987 and now enrolls some 5000 students) worked with James P. Shea to publish From Christendom to Apostolic Mission: Pastoral Strategies for an Apostolic Age (Bismarck, N.D.:  University of Mary Press, c. 2020; Kindle Edition).  This is hardly a novel concern, for in 1974 Archbishop Fulton Sheen said:  “We are at the end of Christendom.  Not of Christianity, not of the Church, but of Christendom.  Now what is meant by Christendom?  Christendom is economic, political, social life as inspired by Christian principles.”  It created a wonderful culture—gothic cathedrals, universities, hospitals, theologians and saints—for which we should give thanks.  But, Sheen insisted:  “That is ending — we’ve seen it die.”  Nevertheless:  “These are great and wonderful days in which to be alive.  . . . .  It is not a gloomy picture — it is a picture of the Church in the midst of increasing opposition from the world. And therefore live your lives in the full consciousness of this hour of testing, and rally close to the heart of Christ.”  Shea and his friends want to do precisely that by recovering an apostolic mindset.

Our formerly Western Christian culture has been slowly but surely disintegrating.  Dealing with it brings challenges not faced by early missionaries proclaiming the Good News to a pagan world.  C.S. Lewis said it is the difference between a man wooing a young maiden and a man winning a cynical divorcée back to her previous marriage.  More disquieting:  many non-Christians actually call themselves Christian!  Many things have contributed to this development, including the massive changes wrought by technology.  But the “key battles our culture faces are intellectual ones” (p. 11).  Until we learn to think in truly Christian ways we’ll never evangelize our world.  These challenges will not likely be met by academics, for our institutions of higher learning “are often so decayed in purpose (apart from technical training) that not much wisdom or light is to be hoped from them; for various reasons, they can tend to deform rather than enlighten the minds of those who come under their influence.  Rather, what is needed is the sort of intellectual life that was characteristic of the Church in her early centuries, a life possessed to some degree by every Christian.  It is not simply or primarily a matter of college degrees but of the conversion of the mind to a Christian vision of reality and of readiness to live out the ramifications of that vision.  A compelling Christian narrative is called for, one that provides a counter to the secular vision, that helps Christians understand and fend off false gospels” (p. 12).

We must deal with the “spirit of the age” by casting a fresh vision, rooted in a new way of discerning truth, that offers people something more than this world affords.  As is true of any worldview, it will need to include philosophy, art, science, religion, et al. in setting forth a narrative describing the “cosmic battle for souls between God and the devil” and declaring the way to join the winning side.  In its beginnings, the Christian Church worked in accord with “an apostolic mode, by which is meant that she was making her way against the current of the wider society and needed to articulate and maintain a distinct and contrasting vision” (p. 19).  Different strategies were employed when addressing Jews or Gentiles—early evangelists dealt with the audience at hand, and we must also open our hearts to Christ and follow His way in our world.  We need to recover an apostolic zeal with strategies shaped to reach our generation, with “new movements and religious communities being born or rediscovering their vitality; institutions being founded or reformed; a deepening life of prayer and communal witness being expressed” (p. 38).  As ever, this will come about not by orchestrating mass movements but by heeding creative minorities who deeply believe in and proclaim the “Good News”:  “that God in his mercy has come among us to set us free from our sins and from slavery to the devil, and for those who turn to their true allegiance, the nightmare of life apart from God can be transformed into a dawn of eternal hope.  They need to know, from their own experience, that obedience to the Gospel is perfect freedom, that holiness leads to happiness, that a world without God is a desolate wasteland, and that new life in Christ transforms darkness into light” (p. 43).

  The Good News is good for all peoples at all times.  It’s embraced by individuals, one at a time, who find it both true and efficacious.  It’s generally more effectively proclaimed by witnesses than scholars, by missionaries than moralists.   People need to turn around, learn to think differently, to be truly converted, and:  “Every conversion is a marvel of grace, an astonishing work of God.  Saint Augustine once said that it was a greater miracle for God to save one sinner than to have created the whole world. Augustine’s comment points to the attitude appropriate to an apostolic age” (p. 45).  Embracing an Apostolic Agenda, modern missionaries “should assume that the majority of their hearers are unconverted or half-converted in mind and imagination and have embraced to some degree the dominant non-Christian vision” (p. 71).  So the Good News must be set forth “in such a way that the minds of its hearers can be given the opportunity to be transformed, converted from one way of looking at the world to a different way” (p. 70).

Importantly, getting converted means replacing a mechanistic with a sacramental worldview.  There are invisible as well as visible realities in our world, “and the invisible world is incomparably more real, more lasting, more beautiful, and larger than the visible.  Our blindness to that world represents much of our predicament.  We are caught by the illusion of the merely seen and need to have our blindness cured.  This drama involves us not only with the awful and marvelous and incomprehensible being of God, who created us with a decisive purpose in mind, but also with a cosmic struggle among creatures of spirit more powerful than we are, who influence human life for both good and evil.  We have been born into a battle, and we are given the fearful and dignifying burden of choice: we need to take a side.”  We are designed and destined for eternity.  Our lives make a difference because we’ve been created for a reason.  “Not only are we meant to know good things, happiness, strength, length of existence, but we have been created to experience the unthinkable:  to share in the very nature of God, to become — in the language so beloved by Eastern Christians — ‘divinized.’  Created from the passing stuff of the material world fused with an invisible and immortal soul, we are each of us meant to be what we would be tempted to call gods: creatures of dazzling light and strength, beauty and goodness, sharing in and reflecting the power and beauty of the Infinite God” (p. 76).  Now that’s Good News!

Each of us has an assignment if we’re to live as apostles.  We have only one life to live and need to live it well.  We must both take the world lightly and earnestly work to make it better.  If we’re honest we realize we’ll not much matter, as the world calculates things.  But Christians need to remember “that in dealing with even the smallest details of life, they are working out an eternal destiny.  They fight the darkness within themselves and embrace the life of love laid out for them by Christ, delighting in conforming their wills to his, knowing that obedience to him does not limit them or impede their self-development but rather brings them to their true selves, to freedom and fulfillment.  They live as exiles, in hope and hard fighting, waiting for the final triumph of God, full of gratitude for what they have been given, full of hope for all they have been promised, full of love originating in Christ toward others who need to hear the good news of a merciful and forgiving and gift-giving God” (p. 79).

This is a book written by Catholics for Catholics.  But it provides an analysis and agenda for all believers.  Given the collapse of Christendom, we need not fear, for God is with us.  And empowered by His Spirit we can do what the apostles did long ago:  proclaim and live out the Good News.

                              * * * * * * * * * * * * * * * * * * * * * * * * * * * *

In a sequel to From Christendom to Apostolic Mission: Pastoral Strategies for an Apostolic Age, Jonathan Reyes and some professors from the University of Mary have set forth The Religion of the Day (Bismarck, N.D.:  University of Mary Press, c. 2023; Kindle Edition), attempting, as the title specifies, to analyze today’s dominant religion and propose ways for Christians to counter it—searching for clues in the first century when “God in Christ came among us to wage a spiritual battle and, in every age since the time of its founding by Christ, the Church has been engaged in a kind of three-front war.  On one front, Christians fight an external battle against the unbelief of a fallen world; a second front is an internal battle against disloyalty and corruption among Church members; and most importantly, the third front is a fight against the darkness and unbelief of one particular member of the Church:  namely, ourselves.  Much of the nature of that battle is the same in every age: Jesus Christ is the same yesterday, today, and forever (Heb 13:8), and human nature, despite what many current philosophies want to suggest, is fundamentally constant” (p. 10).  

Inasmuch as man is by nature incurably religious, the basic question in every era is what form religion takes.  The book’s authors think the religion of our age is Progressivism, which is basically Neo-Gnosticism (more definitively described as a “Modern Neo-Gnostic Progressive Utopian Revolutionary Religion”)—a revival of perhaps the most persistent heresy in Church history.  It was St Thomas Aquinas’s main adversary, and it has been clearly propounded by a series of thinkers since the Enlightenment.  Thus John Dewey, the American philosopher still influencing this nation’s educational and political classes (who helped draft the “Humanist Manifesto”), called for a “humanistic religion” focused on Man rather than God.  Progressives like Dewey think we can save ourselves, following a variety of self-help schemes, because all the evils in the world result from flawed material and social arrangements.  Remaking the world in accord with our needs and desires will enable us to become (as Eleanor Roosevelt famously said) “better and better” persons living with one another in perfect accord.  It is, as the authors perceptively declare, essentially “an expression of human pride” (p. 21).

The memorable lyrics of John Lennon embody the progressive religion:  “Imagine there’s no heaven; It’s easy if you try. / No hell below us; Above us, only sky. / You may say I’m a dreamer; But I’m not the only one. / I hope someday you’ll join us, And the world will be as one.”  Unpacking this more prosaically, the authors provide a helpful analysis of Twelve Aspects of Modern Progressive Religion, beginning with the typical Gnostic sense of alienation from the world, a feeling that something’s deeply wrong with the world as it is.  It’s not us—there’s no original sin in the Gnostic mind—but a world that’s deeply flawed.  “‘Not my fault!’ is the universal Progressive religious mantra” (p. 28).  Without remorse, without repentance, modern religionists must imagine or dream of something better attainable through esoteric knowledge of some sort, or remaking what is into what ought to be, even destroying the existing creation to make way for a better one.  If there is a Creator He failed to make things as they ought to be.  The first great Christian critique of Gnosticism, St. Irenaeus of Lyons, “wrote that the essence of all Gnostic sects was blasphemy against the Creator,” a trait still evident in Neo-Gnosticism. 

“Voltaire’s famous cry against the Church, ‘Écrasez l’infâme!’ (‘Crush the loathsome thing!’) is the battle cry of Progressive believers against the order of the current world and against the God who is perceived as somehow standing behind it, as they insist on the utter annihilation of the structure of an oppressive system as the necessary prerequisite for the new age of freedom to come” (p. 31).  Man must master his world, technologically transforming what is into what he wants it to be.  Man’s knowledge provides the key.  Not “Jesus saves” but “we will save” sets the agenda.  Drawing upon the Hegelian/Marxist dialectic, progressives seek to ever be on “the right side of history,” and “morally up-to-date,” making the world a better place.  As was true of the French leftists in 1789, today’s progressives champion revolution:  “Revolution – the annihilation of the structures of oppression – is the privileged means by which Progressive belief will bring about the new age of freedom.  . . . .  This is clear in Karl Marx’s famous revolutionary dictum, ‘The philosophers have hitherto only interpreted the world in various ways. The point, however, is to change it’” (p. 45).  

We see the revolutionary ethos in today’s youthful protesters who cheerfully embrace violence.  To gain their goals they promote the “cancel culture” so evident in American universities.  No dissent is allowed, no gradualism will suffice.  All must be uprooted and replaced.  Oppressive systems must be destroyed.  Consequently, a “program of willful systematic amnesia begins with artifacts – statues, texts, uses of language – but if the requisite power is gained it always moves on to eliminating living humans.  The logic of tearing down statues and erasing words is the same as the impetus behind the French Revolutionary Reign of Terror, the Soviet gulag, the Chinese cultural revolution, and the Nazi death camps. The sources and expressions of evil must be hunted down and eliminated so that the pure society can properly arrive” (p. 47).  As gnostic movements have risen and fallen in the past, so too its modern expression will ultimately fail.  But in our day it’s powerful and virulent.  Its power stands exposed in the 2005 treatise by sociologists Christian Smith and Melina Lundquist Denton—Soul Searching: The Religious and Spiritual Lives of American Teenagers—describing what young Americans believe.  “They famously coined the term ‘Moralistic Therapeutic Deism’ or ‘MTD’ to describe what those teenagers, including Christian ones, most commonly believed” (p. 65).  They think there’s a God out there somewhere who created the world and who mainly wants us to “be good, nice, and fair to each other, as taught in the Bible and by most world religions.  The central goal of life is to be happy and to feel good about oneself.  God does not need to be particularly involved in one’s life except when God is needed to resolve a problem.  Good people go to heaven when they die” (p. 66).  

This is of course anything but orthodox, traditional Christianity!  To address it we need not fear the darkness but learn to light candles illuminating it with Christ’s Light, to work with Him in rescuing the perishing.  We must begin by stressing the astounding fact of His Incarnation.  God really did become man.  The Maker of all that is actually lived among us—Immanuel, God with us.  “As C. S. Lewis once observed with the claim of the Incarnation in mind, ‘One must keep on pointing out that Christianity is a statement which, if false, is of no importance, and if true, is of infinite importance.  The one thing it cannot be is moderately important’” (p. 72).  Of all places, we must begin the battle for truth and righteousness within the Church!  The Progressive religion has poisoned too many professing Christians “who have abandoned key doctrines of the faith and have embraced some form of the neo-Gnostic gospel of personal self-creation” (p. 99).  They imagine themselves to be “Christians” but have never “encountered the risen Christ as their Lord and Savior” (p. 100).  They think everyone is basically good rather than sinful and needs not so much a redeemer as a cheerleader.  

Simultaneously we must do battle within our own souls.  “God’s kingdom is established on earth mainly by personal conversion and holiness:  the saints are the true movers and shakers of history” (p. 98).  Rather than agitating for social justice we need to focus on being justified, made right, by God.  We need less to march in the streets than stand patiently with Christ.  We fight for the Faith with spiritual weapons, resisting the devil and boldly declaring the Word of the Lord.  It’s better to be a martyr than an emperor.  Facing an increasingly non-Christian world we must nevertheless believe God providentially brought us into it.  Now is our time.  This is our time.  “We will neither be lost in nostalgia for a distant time in the past, nor will we fall into the trap of thinking that Christianity is now ‘outmoded’ and needing a fundamental change of belief or morality. Instead, we will seek wisdom to understand how Christ is responding to our times, as the Gospel of the One who is ‘ever ancient and ever new’ continues to reach out to save the lost” (p. 131).

                                      * * * * * * * * * * * * * * * * * * * * * * * * * *

Eric Metaxas, the author of the highly-acclaimed Bonhoeffer:  Pastor, Martyr, Prophet, Spy, recently published Letter to the American Church (Washington, D.C.:  Salem Publishing, c. 2022; Kindle Edition), applying insights gained from writing it.  He wrote “this book because I am convinced the American Church is at an impossibly—and almost unbearably—important inflection point.  The parallels to where the German Church was in the 1930s are unavoidable and grim” (p. ix).  We may fail to realize that we are as immersed in evil as were the Germans under Hitler’s control, but we are in fact facing anti-God ideologies such as “Critical Race Theory,” LGBTQ+ rationales,  and pro-abortion rhetoric.  Rather than identify and oppose them, too many churchmen have set aside the Gospel in order to please cultural elites championing such perversions.  Few German pastors in 1932 understood that one must act when there’s still time to do so and that small steps determine the course of one’s future.  

Too often the Church fears to appear judgmental, to condemn evil, to oppose persons and organizations promoting it.  But Metaxas wonders:  “Where did we get the idea that we shouldn’t be at the forefront in criticizing the great evil of Communist countries like China that brutally persecute religious minorities in ways that bring to mind the Nazis themselves?” (p. 5).  What we should learn from Bonhoeffer is the importance of resisting evil, discerning its presence and speaking out against its manifestations.  Pastors and theologians are especially responsible for doing so.  Unfortunately, in 1954 Senator Lyndon Johnson orchestrated legislation that forbade churches from endorsing political figures, threatening the churches’ tax-exempt status!  Inasmuch as they remained silent at this move to quiet them, “they behaved rather like many of the submissive pastors in Germany two decades earlier” (p. 8).  Still fearful of the taxman, all too many American churchmen still refuse to publicly hold politicians responsible for their behavior.  They’ve lost the courage to enter the public square and fight for justice.  Though few of us know much about the “Johnson Amendment” and the government’s capacity to quash religious freedom, churches saw that freedom vanish during the recent COVID-19 shutdowns.  Churches were actually deemed “non-essential” and ordered to close their doors.  Virtually all of them did!  Marijuana dispensaries and strip clubs stayed open but churches closed and pastors said nothing.  “When questionable medical procedures were being forced on their parishioners . . .  they meekly adopted the stance that it was the ‘Christ-like’ thing to submit and not to fight, nor even to mention such tremendously serious issues.  This was a deeply disgraceful moment for the American Church” (p. 12).

That moment came for Bonhoeffer when, on Reformation Sunday in 1932, he preached a message in an historic Berlin church.  “Rather than stroke the egos of those German elites slumbering in the pews, Bonhoeffer’s sermon was calculated to wake them up, if they were still able to be awakened” (p. 25).  Midway through his message, the authorities shut down the broadcast.  “To put it in our own modern parlance,” Metaxas says, he “had just been ‘cancelled.’”  Thenceforth he sought ways to resist the Nazis, helping lead the “Confessing Church” in opposition to the pro-Nazi, state-subsidized “Deutsche Christen” (“German Christians”).  In time, only 3,000 of Germany’s 18,000 pastors stood with Bonhoeffer.  Many of them would be arrested and killed.  The majority failed to discern what was actually happening.  “They could not believe that the Nazis were devotedly anti-Christian—and that they were essentially atheist and pagan tribalists working to eventually obliterate the Christian Church” (p. 48).  Somewhat the same is now taking place—witness the rainbow banners and BLM flags on churches.  Such acts compromise the Faith, and churchmen endorsing them must be resisted.  People need courageous leaders, and “God expects those who have a voice to speak out for those who do not—who most of all tend to be the poorest among us” (p. 13).  The COVID pandemic has receded and the church doors have opened, but today we’re besieged by Critical Race Theorists who want to indoctrinate our children and by homosexual and transgender ideologists who work to undermine the Christian Way.  It’s time, Metaxas thinks, for some new Bonhoeffers!  

374 The War on Masculinity

Himself childless, C.S. Lewis still wrote:  “Children are not a distraction from more important work.  They are the most important work.”  Yet one of the more distressing developments during the past half-century is the failure of men to embrace their traditional roles as fathers of children and providers/protectors of women.  Whether this is the result of men simply discarding their responsibilities or of women emasculating them is hotly debated, but Nancy Pearcey offers valuable perspective in The Toxic War on Masculinity:  How Christianity Reconciles the Sexes (Grand Rapids:  Baker Publishing Group, c. 2023; Kindle Edition).  Pearcey has written numerous highly-praised books, including Total Truth: Liberating Christianity from Its Cultural Captivity and How Now Shall We Live? (coauthored with the late Chuck Colson).  She was praised by The Economist as “America’s pre-eminent evangelical Protestant female intellectual,” and is currently a professor and scholar in residence at Houston Christian University. 

Though reared in a Christian home Pearcey struggled with her faith—in large part because of her highly-respected but abusive father.  In high school she discarded Christianity and became a committed feminist.  Then, wandering about Europe in search of something to live for, she stumbled into Francis Schaeffer’s L’Abri.  There, “for the first time I discovered that there exists something called Christian apologetics, and I was stunned.  I had no idea that Christianity could be supported by logic and reasons and good arguments.  Eventually I found the arguments persuasive and I reconverted to Christianity” (p. 14).  This move prompted her to rethink her feminist agenda in the light of biblical truth.  “So in a sense,” she says,  “I’ve been writing this book my entire life. As a little girl, I wondered how a man could sometimes be so wonderful and at other times so cruel.  As an adult, I have had to spend literally decades thinking through how to define a healthy, biblical concept of masculinity.  What is the God-given pattern for manhood?  How did Western culture lose it?  And how can we recover it?” (p. 14)

       She begins by noting that if masculinity is considered “toxic”—as it is by many—the best solution is emasculation!  Rip the maleness out of men!  Thus in 2018 the American Psychological Association (APA) issued guidelines for counseling men and boys, denouncing “traditional masculinity ideology” as “psychologically harmful.”  Influential gender studies professors justify hating men simply because they’re men.  There’s even a hashtag, #KillAllMen and books titled I Hate Men, The End of Men, and Are Men Necessary?  From many cultural sectors comes a strong message:  masculinity, like arsenic, is toxic!  But Pearcey wants to celebrate what’s good about men and help them live up to the goodness of their creation.  She says:  “Because of testosterone, men are typically larger, stronger, and faster than women.  In general, they are also more physical, more competitive, and more risk-taking. We need to affirm these God-given traits as good when used to honor and serve others” (p. 18).  In fact:  “We should not make the mistake of equating masculinity with men’s bad behavior.  A biblical worldview tells us that men were originally created to live by the ideal of the Good Man, exercising traits such as honor, courage, fidelity, and self-control.  A healthy society is one that teaches and encourages a God-centered view of masculinity” (p. 22).

      The Good Man, Pearcey insists, generally attends church!  Contrary to the stereotypical patriarch—an angry man ruling the family with an iron hand and traumatizing  women and children—the best research shows that devout, conservative evangelicals, regularly going to church, are the least abusive, most admirable males in America.  Citing Brad Wilcox, a professor of sociology at the University of Virginia, director of the National Marriage Project, and author of Soft Patriarchs, New Men: How Christianity Shapes Fathers and Husbands, she argues that the more devout the man the better he is as husband and father.  Wilcox says:  “‘the happiest of all wives in America are religious conservatives. . . .  Fully 73 percent of wives who hold conservative gender values and attend religious services regularly with their husbands have high-quality marriages’” (p. 39).  

Though American evangelicals may never have heard of St John Chrysostom, they’re living out his admonition, given 1600 years ago:  “Let everything take second place to care for our children, our bringing them up in the discipline and instruction of the Lord.”  Rooted in New Testament teachings, Ancient Church fathers such as Chrysostom proposed a “mutuality in conjugal rights.  It was a symmetry ‘at total variance . . . with pagan culture,’ writes sociologist Rodney Stark” (p. 53).  Christian women enjoyed a much higher status in the church than in pagan society and played a significant role in its development.  As the “head” of the family the father should act as a servant seeking others’ well-being rather than a tyrant exercising his authority.  He should sacrificially enable his wife and children to find their calling and exercise their spiritual gifts.  Such men are, Wilcox says, “soft patriarchs.”  

Unlike conservative Christian men, however, American males are struggling, and there is, Pearcey thinks, a toxic side to their worldview and behavior.  To understand why, she conducts an in-depth historical search and finds the Industrial Revolution largely responsible.  As long as families worked together on farms or cottage industries, most men took responsibility for their families and lived rightly.  During the colonial era, in New England “the ideal for manhood was not personal ambition or self-fulfillment but the subordination of one’s private interests for the common good.  As historian Gordon Wood explains, men ‘were expected to suppress their private wants and interests and develop disinterestedness—the term the eighteenth century most often used as a synonym for civic virtue’” (p. 77).  They were urged to be “Christian gentlemen.”  Hundreds of religious publications in the 17th century praised men for being, one historian says, “‘forgiving, magnanimous, benevolent, virtuous, moderate, self-controlled, and a worthy citizen’” (p. 101).  

As they moved from farms to factories, however, American men embraced a more competitive, acquisitive philosophy and relied less on Christian principles.  Rather than embracing moral standards, they were urged, as one historian says of the 19th century, to be ambitious and strong in a competitive marketplace.  “By taking husbands and fathers out of the home, industrialization created the material conditions that made it more difficult to fulfill a biblical ideal of manhood.  Men were no longer physically present enough to be fully engaged husbands and fathers.  They spent most of their time in the public realm, which was growing increasingly secular.  The Industrial Revolution thus became a catalyst for the acceptance of secular views of masculinity” (p. 101).  With their men working away from home the women, almost by default, became the teachers and exemplars of virtue.  

So, as Frances Parkes said in 1825, the “world corrupts, home should refine.”  Thirty years later Ralph Waldo Emerson would hail women as the “civilizers of mankind.”  Harriet Beecher Stowe urged wives to “mother” their husbands for in time, she said, “the true wife becomes a mother to her husband; she guides him, cares for him, teaches him, and catechizes him in the nicest way possible.”  Given these cultural upheavals, women effectively took charge of families, schools and churches.  By the end of the century they constituted nearly 90 percent of Sunday morning churchgoers.  They generally had minimal doctrinal concerns but enthusiastically championed various reform movements—urging women’s suffrage, the abolition of slavery, and closing “down taverns, saloons, brothels, and gambling houses.”  However well-intended, these reform endeavors easily alienated men because they generally singled out male vices.  “As historian Mary Ryan points out, ‘Almost all the female reform associations were implicit condemnations of males; there was little doubt as to the sex of slave masters, tavern-keepers, drunkards, and seducers’” (p. 124). 

Deeply impacted by their fathers working away from home and their mothers taking charge of it were young men—fatherless sons.  More than a century ago Frances Willard, president of the Woman’s Christian Temperance Union, saw this as a serious problem needing attention, saying “God is the father, but how many families there are where the prototype of the divine is practically absent from Sunday to Sunday.”  When mothers tried to replace fathers their sons often rebelled, preferring to be a “bad” boy rather than a feminized weakling.  Consequently they were seen by some women as “Goths and Vandals”—little barbarians!  Boys read books celebrating cowboys, soldiers, and frontiersmen who found solace in wild, solitary places.  They found in the Boy Scouts an organization appealing to their “Noble Savage” urge.  The novelist Henry James spoke for many men in his novel The Bostonians (1886).  In the words of his male protagonist, Ransom:  “‘The whole generation is womanized, the masculine tone is passing out of the world; it’s a feminine, hysterical, chattering, canting age.’  Ransom announces his intention to recover ‘the masculine character, the ability to dare and endure’” (p. 146).  

Throughout the past century men have struggled to rightly recover their masculine character.  They’ve done so amidst the growing problem of fatherless boys, a problem that has now become a crisis.  Neither the government nor the schools nor the churches have figured out how to restore the family to health.  The greatest challenge we face may very well be getting men to be good fathers.  To do so will entail significant changes and sacrifices.  Work needs to be reduced to a secondary vocation, making fathering a man’s real work.  Taking time to attend church—and support its activities helping children grow up—must become a priority.  If Pearcey’s right, there’s little to hope for in the secular world.  But if Christians heed the call they can make a difference and become Good Men.  

                      * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * 

Whereas Pearcey still upholds many egalitarian aspects of her early feminism, Anthony Esolen sharply attacks it in No Apologies: Why Civilization Depends on the Strength of Men (Washington:  Regnery Gateway, c. 2022; Kindle Edition). The acclaimed translator of Dante’s Divine Comedy and one of the best Catholic writers in America, Esolen writes “to return to men a sense of their worth as men, and to give to boys the noble aim of manliness, an aim which is their due by right” (p. vii).  He urges us to look around and see how much men have accomplished.  “Every road you see was laid by men.  Every house, church, every school, every factory, every public building was raised by the hands of men” (p. x).  Wherever hard, necessary, physical things get done men (not women) do them.  “The whole of your civilization rests upon the shoulders of men who have done work that most people will not do—and that the physically weaker sex could not have done” (p. x).  Men have nothing to apologize for! 

“Acquit yourselves like men,” Paul said in I Corinthians 16:13.  The Greek text is quite clear, for andrizesthe means, literally, “Be men!”  Jerome’s Vulgate is equally clear:  “viriliter agite,” that is, “Be men!”  Many modern translations, however, soothe feminist sensibilities by simply saying “be courageous” or “be brave.”  The admonition to men is erased!  So it goes even in the world of Bible translators!  To those who label masculinity toxic, Esolen replies:  “Who is toxic?  The word suggests something hidden, secret, sly.  Imagine someone sprinkling a bit of strychnine in the soup—not enough to kill, but certainly enough to make the diner sick.  That is similar to what is being done to boys in our schools and in mass entertainment.  They are told that there is something wrong with them because they are not like girls.  They are also told that girls can do all of the physical things they can, and perhaps do them better—an absurd falsehood.  Telling boys these things is poisonous, and I daresay it is intended to be so:  those who speak this way want the boys to be weaklings, to despise their own sex, to doubt their natural and healthy inclinations” (p. xiii).  Indeed, shouts Esolen, stop poisoning our boys!  Stop the teachers trying to make our boys little girls!  Enough!   

  Begin by dealing honestly with the facts.  Men are physically stronger—much stronger—than women.  Hundreds of high school boys run faster than female world champions.  “The strongest and fastest women in the world would be pulverized by a men’s professional football team.  You would not ask the score.  You would ask whether the women could stop a single play from scrimmage.  You would ask whether the women ended up in the hospital.  In fact, the best female athletes in the world would be made into mincemeat by a half-decent high school boys’ team.  They would be in danger of serious harm, because the boys would be heavier than they are, taller, faster, stronger, and with much more of that quick-surge muscle action that packs power into the shortest impulses” (p. 3).   Proving his point, recently “the Australian women’s World Cup soccer team was trounced, seven to two, by an under-sixteen boys’ team, and a similar thing happened to the American women’s team that actually won the World Cup” (p. 3). 

As in athletics, wherever you find mechanical systems sustaining modern technology you’ll almost certainly find that men designed and continue to maintain them.  So, Esolen says:  “If you call a plumber to deal with a sewer pipe that has backed up into your basement, it is a practical certainty that it is going to be a man, because the sheer strength required to deal with the valve rusted shut or with a section of pipe that has to be cut or muscled into place is like a threshold” (p. 40).  Sadly enough, our nation’s infrastructure (roads, bridges, etc.) is fraying and needs strong men to do the hard work necessary to repair it.  Where are we going to find such men when our boys are all told to go to college and get a desk job?  Women can’t do it and we’re not rearing boys to respect and embrace hard work.  It’s as if we think the world will run by “magic,” maintaining our comforts without requiring the hard work necessary to make it work.

Men often accomplish great things because they’re team players.  Eccentric geniuses certainly operate alone, but men typically want to get together to accomplish things.  Feminists frequently complain about the “old boys clubs” that keep women from succeeding, but in fact men simply like to be with other men.  They like to plan projects.  They launch hunting expeditions, as did Sioux men hunting bison, because working together is the only way to succeed.  “Out of the individual strengths and wills of the different men, you must create a new thing, a hunting party, whose members at work are less like separate individuals than like the limbs of a body” (p. 64).  The same instinct is at work when neighborhood boys come together to play football.  Esolen notes:  “For a very long time now, there have been girls’ basketball teams, and yet you rarely see a group of girls spontaneously organizing themselves for a game on a basketball court or spontaneously organizing themselves for a pickup game of softball.  Boys will invent more games in a year than girls have adopted from boys in fifty.  It is in their nature to do so” (p. 70).  

As they organize teams men embrace hierarchies.  Some men will be in charge of others, some skills will be more important than others.  There’s no egalitarian ethos on a sandlot baseball field or the NFL draft day!  “That men form hierarchies without embarrassment, and without necessarily destroying the real and important equality among them, is one of the most astonishing things we can say about them; it is something so common and so obvious that we do not even notice it.  But I say: if you do not have hierarchy, you will not only fail at civilization, you will fail even to have a strong tribe of savages in the woods.  You will not kill the bison” (p. 72).  A quarterback orders ten other players to carry out their assignment.  Should every man in the huddle be given equal opportunity to call the play?  Should every workman erecting a cathedral be allowed to design the building?  Effective teams can never be egalitarian.  Yet, apart from the task, such men may very well be best friends, comrades committed to treating each other as equals!  A team’s quarterback and cornerback have hugely different roles to play on the football field but may be inseparable friends attending the same church where the cornerback is considered an outstanding Bible teacher giving guidance to the team’s leader.  In a criminal trial a male prosecutor and his antagonist (the defense attorney) fight for their assigned side, then go out to dinner together with no injured egos.  They illustrate “the masculine capacity to set things in proper emotional compartments, to bracket, to feel and express great passion at one moment and then to set it aside as if it were irrelevant” (p. 94).

To Esolen:  “The miracle of culture and of civilization is the miracle of the transformation and redirection of masculine energy from the willful self to the team, the work crew, the school, and the army—for the sake of the home and the women at the center of the home, and, in the end, for the sake of the city and the nation” (p. 86).  So throughout the centuries men have worked together to build great things.  All-male Renaissance art studios gave us Michelangelo, Raphael, Titian, Tintoretto, and thousands of artists all over Europe.  We should be grateful!  That women were frequently excluded from such groups doesn’t trouble Esolen:  “No apologies, then, for the masculine institutions of the past.  Instead, we should question our refusing to grant to men and boys the opportunity or even the legal permission to form groups that are natural to them and that have proved to be so marvelously productive” (p. 87).  When boys build tree houses with signs saying “No Girls Allowed,” let them be!  It’s part of the process of becoming a man as well as exercising the “freedom of association” guaranteed by the Constitution.  

Nowhere are strong men needed more than at home.  Yet it’s everywhere evident that families are jeopardized by the shifting sands of modernity.  To Gabriel Marcel there was an “inexpressible sadness which emanates from great cities,” something rooted in “a self-betrayal of life” that is “bound up in the most intimate fashion with the decay of the family.”  In part this results from a feminist ideology saturated with envy.  One of the seven deadly sins, “envy is always looking cross-eyed—that is what the Latin invidia means—at something good that someone else enjoys, and wishing to ruin the enjoyment.  It is spiritual poison for weaklings.  Specifically, envy is the spiritual poison for feminists who see what healthy men and women enjoy, do not themselves enjoy it, and therefore want to ruin it for everyone else.  We can see this in academe.  Feminist scholars have discovered no neglected female Chaucer, so they must tear the actual Chaucer down and make sure that nobody else learns from him, calling him a racist and a rapist and whatnot.  They cannot of themselves produce a Shakespeare, so they must tear him down or wrench his meaning away from the Christian faith he so often portrays in dramatic action.  And on it goes. They have discovered no neglected female Titian, no neglected female Bach. There are none to discover” (p. 100).  So too they hate the traditional family and want to destroy it.

We need fathers—patriarchs—who rule wisely.  When they’re absent, boys turn aggressive and girls long for what’s gone.  Both go bad.  “If women lead men,” as is often the case today, Esolen asks, “where are the happy female bosses—and the joyful men they lead?  . . . .  Why do people in an egalitarian wonderland not sing their love of the sexes?  The truth is, as C. S. Lewis says, that love does not speak the language of equality.  It speaks the language of gratitude and superiority, of awe at the unique characteristics that make the beloved different from oneself. . . .  When fathers go absent, do not expect women to take their place” (p. 103).  We “can have patriarchy or not.  If not, you will either suffer anarchy—moral, intellectual, and civic—or you will suffer tyranny in your attempt to keep the anarchy from ruining everything . . .   You can have fathers who govern, or else you can have unattached and unaccountable males who take a dismal pleasure in doing nothing or a ferocious pleasure in destroying things—or sometimes alternate between one and then the other” (p. 105).  We need patriarchs.  Nothing else works.  It’s rooted in our nature as human beings.  No apologies!

Esolen brings to his discussion a deeply informed knowledge of the West’s best literature.  Citing Dante and Chaucer and Shakespeare and C.S. Lewis enables him to draw upon the wisdom of our civilization in building his case for men.  He also writes as a committed Christian, knowing the truth revealed in creation as well as Scripture.  He’s often ostracized for speaking the truth as he sees it—and he doubtlessly overstates some of his views—but he’s worth reading and heeding.  No apologies!  

373 “Science at the Doorstep to God”

For more than a decade Robert Spitzer, S.J., Ph.D., has been publishing a series of thoughtful treatises touching on science, philosophy, and theology.  His recent Science at the Doorstep to God:  Science and Reason in Support of God, the Soul, and Life after Death (San Francisco:  Ignatius Press, c. 2023; Kindle Edition) digs into current evidence lending credence to the Christian tradition.  He believes the intellectual “landscape is changing” with many of the old objections to the Christian faith collapsing.  Interestingly enough, younger scientists (66 percent) are more likely to believe in God than older ones, and only one-third identify as agnostic or atheist.  Among physicians, three-fourths believe in God while only one-fifth claim to be skeptics.  “It is also worth noting,” says Spitzer, “that most of the originators of modern physics were religious believers, including Galileo Galilei (the father of observational astronomy and initial laws of dynamics and gravity), Sir Isaac Newton (father of calculus, classical mechanics, and quantitative optics), James Clerk Maxwell (father of the classical theory of electromagnetic radiation), Max Planck (father of quantum theory and co-founder of modern physics), Albert Einstein (father of the theory of relativity and co-founder of modern physics), Kurt Gödel (one of the greatest modern mathematicians and logicians and originator of the incompleteness theorems), Sir Arthur Eddington (father of the nuclear fusion explanation of stellar radiation), Werner Heisenberg (father of the matrix theory of quantum mechanics and the uncertainty principle), and Freeman Dyson (originator of multiple theories in contemporary quantum electrodynamics)” (p. 16).

       Such intellectual giants were fully aware of the limitations of natural science, restricted as it is to observational data and inductive reasoning.  Scientific truths are not universal truths because they are focused on the empirical world which can never be known in toto. It’s certainly an important way of knowing—but not the only way.  Scientists (as scientists) cannot know, as do mathematicians, that numbers are quantifiable universal ideas, not empirical data.  Scientists (as scientists) cannot know, as philosophers do, that some truths are a priori, necessarily true, as in the laws of thought.  Scientists cannot (as scientists) know history as historians do,  relying upon what Aristotle said are testimonies, credible eye-witness accounts.  Virtually all logicians insist that “intrinsic contradictions (like square circles or asserting a propositional statement is simultaneously right and wrong) are impossible (and therefore false) at all times everywhere, without exception.”  We also know many things about ourselves, derived from introspection and memory, that afford us important truths.  So Spitzer endeavors to show how evidence from a variety of trustworthy sources lends credence to trans-physical realities such as God, freedom, and immortality.  He believes the evidence will show that there must be a Creator and that man has “a transphysical soul capable of surviving bodily death, which is self-conscious, conceptually intelligent, transcendentally aware, ethical/moral, empathetic/loving, aesthetically aware, and capable of freely initiated actions” (p. 30).  

The best current scientific evidence shows that the universe came into being in an instant—the “big bang.”  Since Monsignor Georges Lemaître, a colleague of Einstein’s, set forth the Big Bang theory in 1927, a hundred years of studies have led, almost inexorably, to the conclusion that the material world is not eternal.  Lemaître “showed with great mathematical precision that the expansion of the universe as a whole was the best explanation of the recessional velocities of distant galaxies, but his conclusion was so radical that Einstein and others found it difficult to accept” (p. 35).  But in 1929 Lemaître’s theory was confirmed by Edwin Hubble at the Mount Wilson Observatory.   Hubble invited Einstein and Lemaître to speak at the observatory in 1933, and “Einstein reputedly said, ‘This is the most beautiful and satisfactory explanation of creation to which I have ever listened.’  Since that time, Lemaître’s theory has been confirmed in a variety of different ways, making it one of the most comprehensive and rigorously established theories in contemporary cosmology” (p. 36).  Everything points to an instantaneous beginning point!  “If a beginning of physical reality is a point at which everything physical (including mass-energy, space-time, and physical laws and constants) came into existence, then prior to this beginning, all aspects of physical reality would not have existed—they would literally have been nothing” (p. 52).  Ex nihilo—from nothing—everything that now exists began to be.  So Christians had proclaimed, purely on the basis of Scripture, for centuries.  Now cosmologists favor that view.

       Still more:  the more we know, the more it appears that the universe is “fine-tuned” with a precision that defies chance and accident.  Consider, as did Roger Penrose, the low entropy of our universe in light of the Big Bang.  He calculated the improbability of this combination as a number “so large that if it were written out (with every zero being 10-point type), our solar system could not contain it.  It is the same odds as a monkey perfectly typing the manuscript of Shakespeare’s Macbeth with random tapping of the keys in a single try!  The odds of this happening by a one-off random occurrence is, by most physicists’ reckoning, virtually impossible.  Yet this low entropy did occur at the Big Bang, which allowed an abundance of life forms to develop within our very spacious and complex universe” (p. 65).  How could this be unless the world is more than mere matter-in-motion?  When Sir Fred Hoyle (one of the last stout, atheistic defenders of the “steady-state” theory) “discovered the need for exceedingly precise fine-tuning in the resonance levels of oxygen, carbon, helium, and beryllium needed for carbon bonding and carbon abundance, his atheism was shaken to the core.  Upon considering the options for how such precise fine-tuning might occur, he concluded as follows:  ‘Would you not say to yourself, “Some super-calculating intellect must have designed the properties of the carbon atom, otherwise the chance of my finding such an atom through the blind forces of nature would be utterly minuscule.”  A commonsense interpretation of the facts suggests that a superintellect has monkeyed with physics, as well as with chemistry and biology, and that there are no blind forces worth speaking about in nature.  The numbers one calculates from the facts seem to me so overwhelming as to put this conclusion almost beyond question’” (p. 58).  

Noted scholars have calculated that our “nearly flat universe” was most unlikely.  For it to be as it is, only one nanosecond after the Big Bang its mass density “would have to have been very close to 10²⁴ kilograms per cubic meter.  If the mass-energy had been only one kilogram per cubic meter more, the universe would have collapsed in on itself by now (inhibiting the formation of life), and if it had been one kilogram less per cubic meter (out of 10²⁴ kilograms per cubic meter), the universe would have expanded so rapidly that it would have never formed stars or galaxies necessary for life” (p. 69).  How it possibly happened is hard to imagine—but it seems to have happened precisely that way.  Still more:  “All four universal forces—gravitational, strong nuclear, electromagnetic, and weak—are exceedingly fine-tuned for life” (p. 70).  In the light of so many factors, Spitzer says:  “The ultimate explanation for fine-tuning will have to be not only transphysical (immaterial), but also intelligent to conceive the mathematical systems underlying our physical laws.  This transphysical intelligence will also have to transcend all material/physical processes, structures, and realities so that it can both conceive of those realities and infuse them with mathematical determinations and structures.  The ultimate explanation of fine-tuning, therefore, seems inescapably to be a transphysical/transmaterial conscious intelligence” (p. 95).  After compiling mountains of additional scientific evidence pointing to the fine-tuning of the universe, Spitzer says Fred Hoyle’s “superintellect” is in fact God the “maker of heaven and earth.”  Such evidence points to the high probability of God’s existence, though empirical science can never definitively prove or disprove it.  “Recall that all scientific evidence must be grounded in observable data.  
But since God (an unrestricted reality transcending space and time) is not only beyond our universe (the furthest extent of our observational data), but also transcends our sensorial apparatus (and therefore can remain hidden), science will never be able to disprove His existence by its proper method” (p. 102).  Indeed, as the Psalmist said:  “The heavens declare the glory of God; and the firmament sheweth his handiwork” (Ps 19:1).

Turning from empirical science to philosophical metaphysics, Spitzer updates and defends Thomas Aquinas’ famous “proofs” for the existence of God.  The Angelic Doctor “seems to have been the first philosopher to have recognized the full implications of an uncaused reality existing through itself” (p. 111).  (This chapter builds on material presented in his earlier treatise, New Proofs for the Existence of God.)  He believes Aquinas had two important metaphysical insights:  a “distinction between existence and essence” and the “priority of existence over essence” (p. 132).  Rooted in these principles he argued, in various ways, that whatever exists must have a cause and said (in Spitzer’s words) that:  “Since everything in reality (except the one uncaused reality) must be a caused reality, and since all caused realities require an uncaused reality to be their first cause, then the one uncaused reality must be the first cause of everything else in existence.  This is what is meant by ‘the Creator of everything else in existence’.  Therefore, the one uncaused reality is the Creator of everything else in existence.  Conclusion:  Therefore, there must exist one and only one uncaused, unrestricted reality that is the Creator of everything else in existence. To say otherwise requires you to argue a contradiction (an impossibility) or to deny the existence of everything (including yourself):  The unique, uncaused, unrestricted Creator is referred to as God.  Therefore, God as defined, exists” (p. 113).  Inasmuch as things exist there must be an “uncaused, unrestricted Creator” sustaining them.  

Spitzer also presents evidence showing we are, by nature, more than mere mortals.  We have a non-material or transphysical soul that explains why we are able to do some very interesting and significant things.  This is especially evident in the many persuasive near-death experiences that have received scholarly attention for several decades.  Millions of people have reported having such experiences, and the research shows that they “cannot be thinking, seeing, recalling past memories, or remembering new data” with their biological brain.  There’s something more than grey matter at work here!  To Spitzer:  “here is the mystery.  Even though these patients have no meaningful brain functions, they report being able to think, see, remember, and move. What’s more, they report highly unusual data that can be validated by independent researchers after resuscitation” (p. 144).  Most amazingly:  blind people actually see during their out-of-body states.  “The phenomenon of people blind from birth accurately reporting data throws all known natural explanations of near-death experiences into question because blind people have no visual images in their physical brains that could be projected into imagination, visualization, or hallucination” (p. 147).  To those reporting on their near-death experiences there is no question regarding the reality of their souls.  More than mere matter, we are most deeply spiritual beings.

A single anecdote (involving persons two of my recently deceased friends, Terry and Loretta Arnholt, knew quite well) is telling.  A young boy, Colton Burpo, the son of a Wesleyan pastor in Nebraska, had a near-death experience when he was four years old.  He told his parents he had sat on Jesus’ lap, heard angels sing, and met his great-grandfather.  “Most interestingly, he described an encounter with his deceased sister, who ran up to him and hugged him while he was in ‘heaven’.  She told him that she died in her mother’s tummy, and that she had not been named by their parents.”  When Colton told his mom he had two sisters she was perplexed, since she had never told him about her miscarriage.  But he insisted:  “I have two sisters.  You had a baby die in your tummy, didn’t you?”  She asked, “Who told you I had a baby die in my tummy?” … “She did, Mommy.  She said she died in your tummy.”  Mother Sonja tried to be calm but was overwhelmed.  Our baby … was—is!—a girl, she thought.  Sonja focused on Colton and asked:  “So what did she look like?”  “She looked a lot like [his sister] Cassie,” Colton said. “She is just a little bit smaller, and she has dark hair….”  Asked to name her, Colton said, “She doesn’t have a name. You guys didn’t name her….”  “You’re right, Colton,” Sonja said. “We didn’t even know she was a she.”  Then Colton said something that, in his father’s words, “still rings in my ears”:  “Yeah, she said she just can’t wait for you and Daddy to get to heaven.”  When Colton went to another room all his mom could say was, “Our baby is okay” (pp. 152-154).  

That we are souls indwelling bodies is further evident in our remarkable ability to think.  In defining us as “rational animals” the ancient Greeks were right on target.  We want to know—as young journalists learn—answers to questions regarding who, what, where, when, how, and why.  We cannot not think!  It’s ingrained in us to ask questions.  We do more than perceive things, as do animals, for we take sense perceptions and develop mental concepts.  Our language reveals this.  Only “3 percent of our words signify perceptual ideas, and about 97 percent, conceptual ideas” (p. 168).  We want to know what causes things to be as they are.  Aristotle’s enduring genius was evident when he showed how we invoke material, formal, efficient, and final causes to fully explain things.  To build a house we need wood and nails (materials), a plan (the form), a builder (the efficient cause), and a reason for building it (to secure shelter, the final cause).  Few ways of thinking make more sense—yet all too many moderns consider only material and efficient causes.  We can see differences and similarities in things.  We can understand that some things occur earlier or later than other things.  We can think abstractly, as is most evident in our use of language.  “In sum, without an understanding of high-order concepts such as ‘similarity’ and ‘difference’ (with respect to the question of what), ‘cause’ and ‘effect’ (with respect to the question of why), and ‘earlier’ and ‘later’ (with respect to the question of when), we would have no understanding whatsoever—no conceptual ideas, no predicates, no syntactically significant language; we would be reduced to the level of perceptual ideas alone” (p. 173).  Consequently, we must, Spitzer says, have a “preexperiential awareness of high-conceptual ideas” revealing “a transphysical origin capable of grasping relatability without reference to what is related.  This points to the existence of a transphysical soul” (p. 169).  

As does our self-consciousness!  “Self-consciousness was recognized to be transmaterial by Saint Augustine and Saint Thomas Aquinas, both of whom noticed that this act of self-reflectivity requires that the same act of consciousness be both content of thought and thinker of thought simultaneously” (p. 197).  We not only think—we know we are thinking.  We’re aware of ourselves and continually make decisions rooted in our ability to think.  We can make decisions because we’re free to do so.  Our reason and will make us free.  Whereas hard-core evolutionary determinists deny it, Spitzer counters with persuasive evidence favoring free will.  Scholars such as Rudolf Otto, studying religious experiences, have documented “a fundamental, prerational experience of what he termed ‘the numen’ (a spirit or divine power) underlying these experiences.  The numen is experienced as an interior presence of a transcendent ‘wholly Other’, which is mysterious, overwhelming, fascinating, and awe-inspiring, as well as desirable, inviting, and enchanting” (p. 216).  In such moments the sacred dimensions to reality impress us and we have a “spiritual awakening” that often makes all the difference in how we thenceforth live.  To ignore or deny such experiences diminishes us, for we are most fully human when knowing what’s ultimately real.  

We’re also deeply human when acknowledging our moral consciences.  We cannot not know that some things are right and wrong.  Just ask a seven-year-old boy whose bike has been stolen if he thinks it was right or wrong!  When we do wrong we generally want to make it right.  Our conscience generally speaks in a still small voice rather than through a loudspeaker, but it’s almost always speaking, and “John Henry Newman held that this guiding moral force is one of the most important spiritual dimensions of human beings.  He showed that closely examining it could reveal the presence of God within us” (p. 228).  Spitzer draws upon great literature (e.g., Dostoevsky’s Crime and Punishment) to illustrate conscience’s power, and shows how it points us toward God as “a divine, loving, Fatherly authority figure.”  Newman “puts it this way: ‘[When we are] contemplating and revolving on this feeling the mind will reasonably conclude that it is an unseen father who is the object of the feeling’” (p. 230). 

Long ago Plato identified five kinds of uniquely human, transcendental desires:  “the desire for perfect truth, perfect love, perfect goodness/justice, perfect beauty, and perfect being/home.  Saint Augustine and Saint Thomas Aquinas, as well as many contemporary philosophers such as Karl Rahner, Bernard Lonergan, Josef Pieper, and Jacques Maritain, have spoken of these same desires through the centuries.  What these philosophers recognized is that these five transcendental desires reveal that God is present to our consciousness, showing that we must be spiritual as well as physical beings” (p. 232).  To deny our transcendental desires is to reduce us to purely physical beings, which is too often done in the modern academy.  But to be truly human is to see in our desires something essential about us revealing something about the world beyond us.  However often we’re told there is no “truth” we keep coming back to affirm it by confessing our knowledge is imperfect.  Yet we would not know it’s imperfect unless we had a hunger for its perfection!  “Without at least a tacit awareness of perfect knowledge, we would not be able to grasp that our current knowledge is imperfect” (p. 234).  Similarly, our desires for love, justice, and beauty all point toward an ultimate Source Who simply IS the transcendentals. 

Though Spitzer writes for a general audience, at times his scientific expertise taxes this reader’s competence!  So when he endeavored to link quantum physics to the soul I was awed without fully grasping it all!  Nevertheless, his discussion of “Quantum Hylomorphism” is quite fascinating.  He notes that for centuries monistic materialists denied the soul and propounded theories widely embraced by scientists. “The whole of reality,” they said, “can be explained by material reality organized in more and more complex layers giving rise to higher-level activities, such as self-consciousness and thought” (p. 240).   Plato and Aristotle, Augustine and Aquinas, of course, challenged the materialists, and, interestingly enough, quantum physics may show how right they were!  Modern physicists generally talk of fields of energy rather than bits of matter in motion.  Consequently:  “If we consider material particles to be excited states of more fundamental quantum fields (as in quantum field theory) interacting in space-time (the curvature of which gives rise to gravitational effects), then we could say that the constituents of the physical world are not purely material in the way that early philosophers . . . conceived them.  Rather, physical reality has something in common with the content of a human mind—information fields that can be reduced to instantiated states capable of interacting with other physical realities and systems” (p. 243).  Then, perhaps, “a transphysical soul with conceptual ideas could act as a higher-order information field influencing all layers of lower-order information fields all the way down to quantum fields intrinsic to particles.  This would enable a free creatively intelligent self-conscious soul to interact with material reality at the lowest levels without being reduced to them” (p. 244).  
To Spitzer, these recent developments in science provide clear evidence that we are essentially spiritual beings by tying together “the laws of quantum mechanics, general relativity, and classical physics while allowing for an autonomous, self-conscious, rational, and transcendent soul integrated with the material world through the layering of information fields” (p. 249). 

Science at the Doorstep to God is a challenging read!  But it’s worth the effort—and it’s certainly worthwhile to know there are fully-informed Christians working to defend the faith once delivered unto the saints!  Surveys show that many youngsters abandon the Christian faith because they think science has disproved it.  Militant atheists such as Christopher Hitchens have persuaded them of this.  What they need to know, as Spitzer shows, is that many atheists have only a superficial knowledge of science and philosophy, while truly deep thinkers frequently acknowledge there must be a Mind behind our visible world, seeing that “the worlds were framed by the word of God” (Heb 11:3).

372 Still They Hate

       Since September 11, 2001, there have been 35,000 terrorist attacks around the world.  Virtually all of them were orchestrated by Muslims.  Before 9/11 I’d rarely studied Islam and knew only elementary facts about Muslim history, so I then read and reviewed a number of books.  Considering the brutal attack of Hamas on Israel on October 7, 2023, I’ve decided to republish three of those reviews (with some slight updating).  To understand what motivates Islamic Jihadists one should read first-hand accounts such as Brigitte Gabriel’s Because They Hate:  A Survivor of Islamic Terror Warns America (New York:  St. Martin’s Press, c. 2006), that give specificity to events in the Middle East.  Born in Lebanon, when it was still a peaceful, prosperous, predominantly Christian country, she witnessed the chaos and destruction that followed the Islamic Palestinians’ invasion of her homeland 40 years ago.  Living in the United States, she wrote this book to warn Americans “that what happened to me and my country of birth could, terrifyingly, happen here in America” (p. 2).  We simply must know this:  “The main objective in the radical Islamists’ strategy to dominate the world is the destruction of the United States.  They know that if America, the keystone, falls, then the arch of Western civilization will collapse” (p. 169).  

The only child of elderly, prosperous parents in southern Lebanon, Gabriel enjoyed (for a decade) an idyllic childhood, blessed with parties, religious holidays, good schools, and friendly neighbors.  Things changed rapidly, however, as the nation’s “open door” immigration policies allowed thousands of Palestinians to enter the country.  Following the successful establishment of the nation of Israel, growing numbers of Palestinians lived in PLO refugee camps in Jordan and launched terrorist raids against Israel.  Weary of their troublesome presence, Jordan’s King Hussein expelled them in 1970.  Subsequently, “Lebanon was the only one of twenty-two Arab countries that was willing to open its borders to a third wave of Palestinian refugees” (p. 18).  These refugees quickly seized control of their host country.  Gabriel’s home and village, located near the Israeli border, were reduced to rubble as Muslims routinely shelled it.  “To a ten year old, all this—the civil war and the attack against us—was bewildering.  Just as people asked ‘Why do they hate us?’ after 9/11, one evening I asked my father, ‘Why did they do this to us?’  He took a long breath and paused, deeply concerned about what he was about to say.  ‘The Muslims bombed us because we are Christians.  They want us dead because they hate us’” (p. 33).  To Americans mystified by the terrorists’ attacks on 9/11—and by the Muslims’ rejoicing thereafter—she says:  “There is a three-word answer that is both simple and complex:  because they hate.  They hate our way of life.  They hate our freedom.  They hate our democracy.  They hate the practice of every religion but their own.  They don’t just disagree.  They hate” (p. 145).  

In 1982 Israeli troops occupied southern Lebanon and brought blessed peace to Gabriel’s region.  It was a military action bringing the kind of relief Europe experienced when the Nazis were defeated in 1945.  Protected by Israeli soldiers, she and her neighbors moved about freely and rebuilt their lives.  When her mother became seriously ill, Jewish military medical personnel took her to a hospital in Israel, where she received first-class treatment.  In that hospital a lifetime of anti-Jewish prejudice drained away from Gabriel.  The Israelis were even treating Islamic terrorists!  “I realized at that moment,” she says, “that I had been sold a fabricated lie by my government and culture about the Jews and Israel that was far from reality.  I knew for a fact, as someone raised in the Arab world, that if I had been a Jew in an Arab hospital, I would have been lynched and then thrown to the ground, and joyous shouts of ‘Allahu Akbar’ would have echoed through the hospital and the surrounding streets” (p. 79).  

In that Jewish hospital, Gabriel volunteered to serve as a translator.  This led in time to a job with a Jerusalem television station, where she worked for six years.  There the contrast between Judaism and Islam was striking.  On the Jewish side, “you see order, structure, cleanliness, and beautiful flowers planted everywhere” (p. 103).  A block away, in the Muslim section, dirt and disorder prevailed.  The “clash of civilizations” shone forth every day in Jerusalem.  At work, helping prepare daily newscasts, the clash seemed overwhelmingly clear, and she “began to realize that the Arab Muslim world, because of its religion and culture, is a natural threat to civilized people of the world, particularly Western Civilization” (p. 105).  Working as a journalist, Gabriel saw the astoundingly favorable treatment Western media gave homicidal thugs like Yasser Arafat.  Ever portraying the Palestine Liberation Organization in positive ways—and Israelis as villains—American journalists greatly helped the jihadists.  “Unable to defeat Western military superiority, our enemy depends on negative themes throughout the media to create disunity, opening schisms on the home front in our communities, on our campuses, and in our government” (p. 111).  She notes that “General Bui Tin, who served on the general staff of North Vietnam’s army, was asked why America was defeated in Vietnam.  He said:  ‘America lost because of its democracy; through dissent and protest it lost the ability to mobilize a will to win’” (p. 112).  In our “fight against Islamo-fascism” these words should give us pause.  Living in Jerusalem, she watched foreign TV “journalists” who “blew in, blew around, and blew out.  They came with their preconceived ideas, toed the network editorial policy line, and perpetuated,” albeit unconsciously, the “subtle Arab and PLO propaganda, which had reached them wherever they came from.”  They loved to photograph “wailing Palestinians” and “kids throwing stones against border patrol soldiers firing tear gas and rubber bullets.  Because I could speak the language and read the Arabic press and knew the nuances behind events, I sensed that reporters were being manipulated” (p. 119).  Thus it was with both amazement and anger that Gabriel “watched the West fall further under the spell of anti-West, anti-Israeli propaganda, just as it did during its coverage of Lebanon, which portrayed the Palestinians and Islamo-fascists as the victims instead of the aggressors” (p. 119).  

Gabriel was alarmed by this because she had carefully observed developments in the Middle East—and America’s response to them—since 1975.  When, in the 1979 hostage crisis in Iran, President Jimmy Carter “alternately groveled and bungled, Ayatollah Khomeini exultingly proclaimed, ‘America cannot do a damn thing!’  This became a slogan and a battle cry throughout the Middle East” (p. 125).  Though markedly different from Carter in many ways, President Ronald Reagan behaved similarly in Lebanon.  When Hezbollah (subsidized and controlled by Iran) “blew up the marines in Lebanon in 1983, America turned tail and ran, leaving the Christians to be slaughtered in town after town.  It sent a strong, loud, and clear message to the Muslim radicals of the world, including Osama bin Laden:  America is no longer the power it used to be” (p. 125).  That being so, Sudanese Muslims, in 1983, launched a genocidal “jihad to impose Islam on black African Christians and animists in the southern part of the country” (p. 125).  Some two million innocent people were killed within a decade.  

She further provided brief accounts of other Islamic aggressions since 1979.  It’s a world-wide phenomenon with enormous implications.  And it’s taking place within the United States as well.  Radical Muslims, funded by Saudi Arabian petrodollars, are working hard to Islamize this country, though they present a benign face to the public.  “Masquerading as a civil rights organization,” for example, CAIR (the Council on American-Islamic Relations) “has had a hidden agenda to Islamize America from the start” (p. 138).  Gabriel documents and laments the degree to which Saudi money and compliant professors have established influential footholds for radical Islam on many university campuses.  To deal with this threat, at home and abroad, this book, with its many suggestions about what to do, is most helpful.  

* * * * * * * * * * * * * * * 

An equally readable book, addressing the same issue and coming to basically the same conclusion, though from a markedly different perspective, is Nonie Darwish’s Now They Call Me Infidel:  Why I Renounced Jihad for America, Israel, and the War on Terror (New York:  Sentinel, c. 2006).  Darwish was born into an elite Egyptian family, and her father was a highly placed officer in Gamal Abdel Nasser’s army, considered “one of the most brilliant analytical minds found in the Egyptian military” by an Israeli historian (p. 255).  Unfortunately, he was assassinated by Israeli agents while stationed in Gaza in 1956.  In death, however, he became a celebrated “shahid,” a martyr for Islam, a national hero.  Subsequently, the family settled in Cairo, where Darwish received an excellent education in a Catholic girls’ school and then the American University in Cairo.  She enjoyed the unique economic and social privileges of her class.  But she was also fully immersed in the culture of Islam.  From the radio, as well as the mosques, came “calls to war and songs praising President Nasser.  Arab leaders were treated as gods and they acted as gods” (p. 33).  The call for jihad was ubiquitous.  “No Arab could avoid the culture of jihad.  Jihad is not some esoteric concept.  In the Arab world, the meaning of jihad is clear:  It is a religious holy war against infidels, an armed struggle against anyone who is not a Muslim” (p. 33).  Yet she found herself inwardly torn by some of the incongruities of her world, especially when dealing with “marriage and family dynamics.”  She managed to avoid the arranged marriages expected of  Muslim women.  And she observed that “at the heart of Islamic fundamentalism lies the most precious and important object, the woman.  She is the source of pride or shame to the Muslim man who rules and is ruled by the most despotic, tyrannical, and humiliating forms of governments on earth” (p. 66).  
Muslim men’s “honor is totally dependent on their female blood relatives” (p. 66).  Personal honor and integrity are not particularly important.  It’s their women who establish their “honor”!  

Darwish also struggled with the reality of polygamy and its power in Islamic culture.  Married women fear their husbands will take a second wife—often secret liaisons divulged only at the man’s death, when his estate must be divided among all his wives.  Muslim women, consequently, distrust both their husbands and any single women who might attract them.  Then there is the “temporary marriage,” also known as “pleasure marriage,” empowering men to have one-night stands, “usually in exchange for money (calling it a dowry), and still feel that it is acceptable in the eyes of God” (p. 68).  Men may easily divorce their wives, whereas women must beg (often unsuccessfully) for a dissolution of a dysfunctional union.  Consequently, Darwish found “very few happy marriages around me” (p. 79).  As a single woman Darwish worked for several years at the English desk of the Middle East News Agency.  This gave her a unique perspective on the world and also occasionally allowed her to travel abroad.  She became aware of a world quite different from that described by the Egyptian media.  She also made friends with Copts—Christian Egyptians who had suffered for centuries.  In fact she fell in love with and married a Coptic man, with whom she immigrated to the United States in 1978.  

Landing in Los Angeles, she acknowledges that she “loved America even before seeing it” (p. 113).  She found Americans friendly and helpful, courteous, hard-working, generous and honest—virtues  largely absent in Egypt.  She worked for a Jewish businessman and found that most everything she had heard about Jews in Egypt was wrong.  “I asked myself, Why the hate?  What purpose does it serve?  What are Arabs afraid of?”  Indeed, she concluded:  “The Arab-Israel conflict is not a crisis over land, but a crisis of hate, lack of compassion, ingratitude, and insecurity” (p. 126).  American women differed from the women she’d grown up around.  They were supportive of each other, complimenting and helping in various ways.  “Moving to America,” she says, in a memorable passage, “was like being catapulted to another time in history.  America for me was not just a place for making money, having a job, a house, and car, it was a place for becoming a human being” (p. 130).  

Part of that process was religious.  Though she remained a Muslim she hungered for an authentically personal relationship with God.  “The truth is that most Muslims are a part of ‘political Islam’ rather than a religion and a personal relationship with God” (p. 136).  Islam, for her (and most Muslims) is a matter of birth and politics.  Mosques are mainly for men, whom women are expected to obey.  To her dismay, she found “that rabid anti-American feeling is rampant in the majority of U.S. mosques, where Muslims are encouraged to stand out as mujahadeen in America” (p. 140).  Using America’s democratic processes, these Muslims seek to ultimately control the nation.  Knowing the history of Islam, Darwish says:  “The current onslaught against our society is nothing new.  Conquering the world for Islam has been going on since the seventh century using pretty much the same tactics” (p. 144).  

In time, Darwish rejected her family’s religion, Islam.  One Sunday morning she was watching a Christian preacher on TV who was expounding the love chapter in Paul’s letter to the Corinthians.  She heard about “the love of God I was desperate for but was unable to find in my culture of origin” (p. 160).   Her daughter came in and announced that the TV preacher pastored the church that sponsored the Christian school she attended.   So Darwish determined to visit the church the following Sunday.  She did and heard “a message of compassion, love, acceptance, tolerance, and prayer for all of humanity” (p. 159).  This message differed radically from Muslim preachers’ hate-filled diatribes, urging hearers to “destroy the infidels.”  At that moment, sitting for the first time in that Christian church, this Muslim woman “was faced with a challenge, nothing less than the choice between love and hate” (p. 159).  She made a decision, and it made a difference.  Evaluating this, she writes:  “Many immigrants come to this great nation in search of material gain, which is fine; however, the biggest prize I gained was my religious freedom and learning to love.  For me it was nothing short of cataclysmic.  I had turned from a culture of hatred to one of love” (p. 161).  Though still nominally a Muslim, her God is not a jihadist!

Her new perspective provides readers a lens with which to evaluate developments in the Middle East.  When she made a brief trip to Egypt (arriving home in L.A. the night before the terrorist attacks of September 11, 2001), she saw again the deadening hand of Islam upon her land.  She heard again the lies about the Jews.  She sensed the irrational anti-Americanism promoted by the media, including the only U.S. media outlet available to Egyptians, CNN!  “To my surprise,” she says, “CNN contributed to Arab hatred and suspicion of America by regularly criticizing America and President Bush” (p. 175).  She noted the pernicious impact of money from Saudi Arabia, funding radical jihadists.  And she sorrowed at the injurious impact of Islam upon the nation’s women, including many in her own family.

Mystified at the silence of allegedly “moderate” Muslims who failed to denounce the jihadists, Darwish began writing and speaking, trying to inform America about the threat of radical Islam.  “In the Arab world,” she insists, “there is only one meaning for jihad, and that is:  a religious holy war against infidels” (p. 201).  That’s what we now face everywhere.  Portraying Islam as a peaceful religion “can only bring disaster between the two worlds” (p. 202).  She especially critiqued America’s universities, where Muslims are afforded unusual support and easily propagandize naïve students.  “The war of words and propaganda,” she warned, “could be as vital as the actual military war” (p. 211).  The message she declared is clear.  “After 9/11, my fellow Americans should never be in the dark again.  They must understand the brutality and persistence of their enemy” (p. 212).  Radical Muslims intend to conquer the world “and to usher in a Caliphate—that is, a supreme totalitarian Islamic government” (p. 212).  They will do anything possible to accomplish this goal.  “They are willing to bring about an Armageddon to conquer the world to Islam.  We are already in World War III and many people in the West are still in denial” (p. 212).  

She hopes that reading Now They Call Me Infidel will shake some of us out of such denial!  

                                              * * * * * * * * * * 

Serge Trifkovic’s The Sword of the Prophet (Boston:  Regina Orthodox Press, Inc., 2002) set forth an admittedly “politically incorrect” perspective on Islam, its “history, theology, and impact on the world,” and portrayed today’s Middle East conflicts as a recent manifestation of an ancient religious struggle.  He basically reiterates the evaluation rendered by the great French thinker, Alexis de Tocqueville, whose analysis was unusually prescient:  “‘I studied the Kuran a great deal. . . .  I came away from that study with the conviction that by and large there have been few religions in the world as deadly to men as that of Muhammad.  As far as I can see, it is the principal cause of the decadence so visible today in the Muslim world, and, though less absurd than the polytheism of old, its social and political tendencies are in my opinion infinitely more to be feared, and I therefore regard it as a form of decadence rather than a form of progress in relation to paganism itself’” (p. 208).  Trifkovic began his book with the assertion that the 9/11 Muslim terrorists’ attack on the United States demonstrated an antipathy against the Christian world deeply rooted in Islam.  That so many in the media refer to Islam as a “religion of peace” shows that “the problem of collective historical ignorance–or even deliberately induced amnesia–is the main difficulty in addressing the history of Islam in today’s English speaking world, where claims about far-away lands and cultures are made on the basis of domestic multiculturalism assumptions rather than on evidence” (p. 8).  Just as left-leaning journalists and professors long avoided condemning the evils of Stalinist Russia, pro-Muslim “experts” have skillfully spread propaganda to gloss over the truth concerning Islam.  To set forth the facts—to counteract the propaganda—this book was written.

Importantly, we learn much about Muhammad.  Born in Mecca in A.D. 570, he spent his early years working at menial jobs.  Then, fortuitously, he met a wealthy widow, Khadija, for whom he worked and whom he ultimately married.  Freed from financial concerns, he spent time in solitude and, in A.D. 610, received a message from an angel designating him as “the Messenger of God.”  In A.D. 622, he and 70 followers moved to the more hospitable city of Medina.  This event—the hijrah—marks Islam’s true beginning point.  In Medina, Muhammad turned from religion to politics, relying less on persuasion than coercion.  His followers raided camel caravans and enriched themselves and their prophet.  Evaluating the prophet’s career, Trifkovic says:  “Muhammad’s practice and constant encouragement of bloodshed are unique in the history of religions.  Murder, pillage, rape, and more murder in the Kuran and the Traditions seem to have impressed his followers with a profound belief in the value of bloodshed as opening the gates of Paradise and prompted countless Muslim governors, caliphs, and viziers to refer to Muhammad’s example to justify their mass killings, looting, and destruction.  ‘Kill, kill the unbelievers wherever you find them’ is an injunction both unambiguous and powerful” (p. 51).  His example and teachings led quickly to what Trifkovic calls “jihad without end.”  The century following Muhammad’s death witnessed the success of Muslim armies, conquering much of the known world, creating “an Arab empire ruled by a small elite of Muslim warriors who lived entirely on the spoils of war, the poll and land taxes paid by the subjugated peoples” (p. 89).  Under Muslim rule, lush agricultural lands slowly withered away and became deserts.  Thriving economies, subordinated to Muslim dictates, slowly sank into impoverishment.  
“The periods of civilization under Islam, however brief, were predicated on the readiness of the conquerors to borrow from earlier cultures, to compile, translate, learn, and absorb.  Islam per se never encouraged science, meaning “disinterested inquiry,” because the only knowledge it accepts is religious knowledge” (p. 196).  

Turning to the “fruits” of Islam, Trifkovic discusses such things as the absolute lack of religious liberty, the subjugation of women, and the widespread practice of enslaving non-Muslims.  He also shows how deeply embedded is the hatred for Jews in the Muslim world.  For example, during WWII the Mufti of Jerusalem and former President of the Supreme Muslim Council of Palestine, Haj Mohammed Amin al-Husseini, urged Muslims to support Hitler.  In a radio broadcast from Berlin, he said:  “‘Arabs!  Rise as one and fight for your sacred rights.  Kill the Jews wherever you find them’” (p. 186).  “Kill the Jews!”  That’s chanted in Gaza and at American universities.  Still they hate!  As do Hamas’ American supporters!  

371 Favale & Feminism

Reading Abigail Rine Favale’s spiritual autobiography—Into the Deep:  An Unlikely Catholic Conversion (Eugene, OR:  Cascade Books, c. 2018; Kindle Edition)—prods one to consider both the strengths and inadequacies of evangelicalism, the breakthroughs and fallacies of feminism, and the reasons that led her to enter the Catholic Church.  Living in Utah and Idaho, surrounded by Mormons, she and her family “took refuge in a conservative evangelical bubble” (p. 9).  She cannot remember not being “Christian.”  At the age of three, responding to her father’s question concerning her readiness to accept Jesus into her heart, she said yes and was thus “saved.”  Subsequently, however, she wondered what had really happened.  Was she truly saved?  Again and again she would repeat the sinner’s prayer just to make sure she was right with God.  Though taught to believe “once saved, always saved,” she continually struggled to feel at ease with that position.   She’d learned “that I should turn to prayer and ask for forgiveness, but this led to a dizzying loop:  I was saved and thus already forgiven for my sins, freed from their penalty, yet I needed to ask for forgiveness for new sins I committed, from which I’d already been forgiven, because I was saved” (p. 128).  But for all its limitations, the religion of her childhood provided a secure social world and ample exposure to the teachings and stories of the Bible, and she was particularly fascinated by Old Testament women who seemed resourceful and self-reliant.

       When she was a senior in high school, her parents began attending a large, growing church.  Though they thought it would be good for her, it mainly provided opportunities to “troll for boys.”  Her earlier religious fervor faded as she wearied of swinging from enthusiasm to apathy, so she sought comfort by “careening from boyfriend to boyfriend, from love to crush to meaningless hook-up, without much care or awareness of how anything I did affected anyone else.”  Her “drug of choice was male attention” (p. 16).  Losing her virginity, she considered herself “damaged goods” and distanced herself from both her parents and the church.  “Toward the end of that year, I smoked and drank a bit, as if to round out the archetype of the rebellious teenager,” and dabbled with a Ouija board.  Yet:  “Even though I was living in an array of colorful sins, marooning myself from the grace of God, I was not particularly concerned for my soul and fancied myself impervious to demonic forces—because, after all, I was ‘saved’” (p. 16).  

       Graduating from high school and hoping life would somehow improve, Favale enrolled in a Christian university.  Though she never names it, a quick online search shows she attended George Fox University in Oregon.  She resolved to make a new start and in time met the man she’d ultimately marry.  During her freshman year she took a required New Testament class and began thinking about the male-female roles spelled out in Scripture.  Doing some research in the library, she “made a life-altering discovery:  Christian feminism.  On these shelves, I found ample resources to interpret the Bible in a way that confirmed my belief in the equality of the sexes before God.  By the end of the semester, just weeks away, I had embraced an evangelical feminist hermeneutic and wrote a term paper for that class with the provocative title ‘God is a Feminist’” (p. 21).  She began reading the Bible differently.  It was not to “be taken at face value as a clear-cut instruction guide for life, free from tensions and ambiguities.  No, this Bible was richer, scarier, multivalent, and in need of careful interpretation.”  She could now do hermeneutics, and that allowed her to believe in “the equal dignity of the sexes” and reject “strict gender roles” (p. 22).  Subsequently she moved from reinterpreting the Bible to disregarding it, losing confidence in its inspiration and authority, even trying “to pray to God as Mother, convinced that masculine language for God was a hangover from patriarchy” (p. 25).  Yet addressing God as Mother left her strangely unable to pray.

       In her freshman year Favale also took an introductory class in philosophy.  It opened her mind to thinking more deeply, examining her faith in ways never expected in her childhood.  Her professor was an Anglican priest who encouraged students to consider a sacramental form of Christianity that appealed to her.  She’d earlier tasted a bit of liturgical, eucharistic worship in a Lutheran church, but attending Anglican services awakened her to “a deeper sense of the sacred.”  She then began meeting with a small group of students, using the Anglican Book of Common Prayer as a devotional guide.  “Eventually, I decided, along with several of my compatriots—including boyfriend Dave—to be confirmed in the philosopher-priest’s small Anglican denomination” (p. 29).  Nevertheless, her Christianity steadily weakened as she more fully embraced the feminist creed.  She took a generally “postmodern outlook:  ultimate truth cannot be known by finite human beings, so we collectively create metanarratives of meaning to connect with what remains beyond us.  Christianity is one such narrative, perhaps the best and truest one, but not necessarily actually or absolutely true in its entirety” (p. 35).

       As she recalls:  “Untethering myself from tradition, Scripture, creed—at first this all seemed so liberating.  Everything was boundless potential; I could salvage what was meaningful and purge myself from the rest of it.  There were no limits.  I lopped those branches from the great ancient tree, just enough to keep afloat, and I paddled away from the shore alone, without a clear heading, whispering to no one in particular, ‘I am free, I am free, I am free’” (p. 43).  In fact, she was adrift, rootless and foundering, professing to be autonomous and empowered but inwardly quite otherwise.  “When I think about this time,” she says, “this is the image I see:  a girl adrift in the ocean, no land in sight, clutching onto a tiny, wooden raft, just a row of logs tied together.  Her feet churn in the water, touching nothing but dark fathoms beneath.  She is clutching the raft, which holds her afloat, but the raft itself is anchorless, rudderless.  They bob in the water together, waiting to wash up on some shore, any shore, so the world can seem steady again” (p. 42).

       Nevertheless she went off to graduate school in Scotland and returned to George Fox University as a professor of literature.   “In this Christian academic setting, I saw myself as an iconoclast in the trenches, battling for the soul of Christianity against the fundamentalists” (p. 46).  But then she became a mother!  She experienced “the sea change that happens to a woman once she gives birth, the inner transformation that occurs, one simultaneously subtle and earthmoving” (p. 48).  Seeing an ultrasound of her baby boy at 10 weeks of age led her to question her abortion-rights feminism.  She then encountered “the intractable reality of maleness and femaleness.  These are not mere social constructs” (p. 51).  Her pretended autonomy faded as she discovered her “I” becoming “We.”  She hadn’t imagined “the wild motherlove that would pull me out of myself” (p. 52).  She simultaneously sensed a need for church, something she’d disregarded  for too many years, so she made “a radical, unanticipated move” and shifted from being “a disaffected post-evangelical feminist on the brink of atheism” to considering becoming a Catholic! 

       For years she’d embraced a form of Christianity that mainly espoused social justice and love.  Theology and dogma mattered little.  But now she found herself needing something more.  She found herself needing what Catholics call “actual grace”—not a forensic matter of “getting something you don’t deserve” but of receiving a supernatural infusion of God, enlivening and remaking a person.  With St Augustine she confessed:  “Too late have I loved you, O Beauty so ancient, O Beauty so new.  Too late have I loved you!”  She’d at last begun to really love Him and began, for the first time in years, to pray, finding (with Benedict XVI) that “prayer, properly understood, is nothing other than becoming a longing for God.”  To be a Christian is to become one with God through the infilling of His Spirit.  Salvation “for the Catholic, involves actual transformation, an ongoing process of sanctification, so the love of God, which has been poured into our hearts by faith, can be kept alive and continually refine us.  Since salvation ultimately culminates in union with the triune God, the soul must be purged from sin altogether, not merely freed from sin’s consequences.  Contrary to popular misconceptions, this process is not something we accomplish on our own, through our own works and merits—that would be Pelagianism, a heresy rejected by the Church in the fifth century.  No, sanctification is only possible through supernatural aid—divine grace—and our active participation with that grace” (p. 87).

       Having explained her reasons for entering the Catholic Church, Favale devotes the rest of her book to explaining how that decision enabled her to become whole, nourished by the teachings and sacraments of the Church.  The power of God is evident as she enters into a truly devout life, relishing her role as a wife and mother.  Her prose is fluent, her story is compelling, and we learn much from her about our world.

                                                * * * * * * * * * * * * * * * * * * * *

       In The Genesis of Gender:  A Christian Theory (San Francisco:  Ignatius Press, c. 2022), Abigail Rine Favale expands upon ideas set forth in her engaging autobiography, Into the Deep.  After a decade of deep engagement with postmodern feminism, earning a Ph.D. in women’s writing and gender theory at Edinburgh University, she found herself in 2015 teaching a course on gender theory at her alma mater, George Fox University.  Though she had abandoned orthodox Christianity years earlier, she saw herself as a revisionist called to “construct a new Christianity, fully purged of sexism, hierarchy, and sin” (p. 18).  She’d taught the course many times, but now she “was in the midst of two dramatic upheavals in my personal life:  the birth of my second child” and “a tumultuous conversion to Catholicism, which was upending everything I thought I knew.  I found myself both giving birth and being born—my body turned inside out to bring forth a daughter; my soul turned inside out to make room for Christ.”  A year earlier she’d joined the Church, thinking she could “become a ‘cafeteria Catholic’, lugging my cherished progressive beliefs into the Church and taking shelter under the canopy of conscience.  Then something terrible happened.  My conscience started to rebel.  The progressive beliefs I was carrying began to feel less like personal belongings and more like baggage:  burdensome and out of place” (p. 8).

       In the following weeks she began questioning the feminist dogmas that had long shaped her life.  She realized she’d been living in a darkness of illusions, mistaking rhetoric for reality.  Consequently, she felt like she had “been giving my students poison to drink.”  Sadly enough:  “For so many years, I’d been careless, careless with their minds and, most disturbingly, their souls” (p. 10).  Listening to this confession, one of her colleagues, uninterested in coddling her, came to the point:  “‘You know that verse in Matthew?  The one that says if anyone causes the little ones to stumble, it would be better for him to have a millstone hung around his neck and be drowned in the sea?  I’ve always thought it would be a good idea for us professors to have that tattooed on our arms’” (p. 10).  She needed to repent and change her location.

       Recently appointed a professor in the McGrath Institute for Church Life at the University of Notre Dame and fully aware of feminist ideology, Favale says college students now “inhabit a world where feminism has become mainstream, even in Christian circles.  Not to be a feminist is a major faux pas, tantamount to being anti-woman,” and virtually all universities offer classes in gender theory and feminist philosophy.  A few years earlier she’d “had to go out of my way to find feminism . . . but this is no longer necessary” (p. 25).  There is, she thinks, “an authentically Christian feminism,” but it’s not what’s taught in the universities, so her distinctive form of orthodox Christian feminism makes her a “heretic.”  She’s thankful for some of the good feminism afforded her, but “it ultimately brought me to a place at odds with Christianity, a place I will call the gender paradigm.  The gender paradigm affirms a radically constructivist view of reality, then reifies it as truth, demanding that others assent to its veracity and adopt its language” (p. 26).

       Clearly explaining the positions she now rejects, Favale devotes a chapter to the history and theories of feminism, a term which has been used for a century to describe a 19th century social reform movement, akin to abolitionism and prohibitionism.  Most of those feminists—constituting the first wave—focused on getting the right to vote, and “were not radicals or revolutionaries.  Most were middle-class wives and mothers, committed Christians who opposed abortion” (p. 50).  Their goals were reached with the passing of the 18th and 19th amendments to the Constitution.  Following World War II a second wave of feminism turned in more radical directions.  Using the recipe concocted by Betty Friedan’s The Feminine Mystique, “the Women’s Liberation Movement caught fire,” and “feminists began to actively rethink women’s roles within the home and in the workforce.  A major part of this effort was a renewed emphasis on so-called ‘reproductive freedom’—that is, unlimited access to birth control and abortion” (p. 51).  In the 1990s a “third wave” of feminism called for unbridled sexual expression, making consent “the lone benchmark for sex to be considered licit.  If a woman chooses a particular sex act, that sex act is good, even if it involves prostitution, pornography, or sadomasochism” (p. 52).  Then recently a “fourth wave” feminism made an about face, revealing a “growing ambivalence toward unrestrained sexual license, an emerging awareness that women can be mistreated even within the boundaries of what is technically consensual” (p. 52).  “Me too” became its litany.

       Analyzing these feminist movements, Favale finds the atheistic existentialist/Marxist philosopher Simone de Beauvoir enormously important.  She wrote The Second Sex in 1949 and “was the first philosopher to give an account of male domination that pervades all spheres of human life and thought.”  Women were objectified by men, she insisted, and “female human beings are socialized to conform to this understanding of womanhood from birth.  This idea is behind her well-known line, ‘One is not born, but rather becomes, a woman.’  That statement is the mustard seed of gender theory” (p. 54).  To de Beauvoir, as to her mentor Jean-Paul Sartre, the world is meaningless and nothing is natural.  “Meaning must be made; it cannot simply be found.  It is up to us to justify our existence, to give it purpose.  We are not created; rather, we create ourselves, and failing to take up this work of self-creation is a moral transgression” (p. 55).  The second feminist thinker Favale examines is Judith Butler, who shifted the focus from “women’s studies toward gender studies.”  Following Michel Foucault and postmodern philosophy, her “primary goal as a theorist is to dismantle the normalization of heterosexual relationships—the tendency to see the male and female sexual relationship as normal and natural, which in theory-speak is called heteronormativity.  The idea that humankind is split into two sexes that are biologically complementary is, for Butler, a social fiction rather than a matter of fact” (p. 64).  Butler dismissed virtually all sexual standards, including the incest taboo.  Building on this, a black feminist theorist, Kimberlé Crenshaw, added “intersectionality” to today’s feminist agenda—making a worldview Favale calls “the gender paradigm.”  Importantly, “this paradigm is a godless one” and is “diabolic, in the literal sense.”  No God made us—we’re merely effusions of a material or social process, as Marx declared.
“Reality, gender, sex—everything, even truth—is socially constructed.”  We are, as humans, nothing by nature.  So we can make ourselves whatever we want to be.  Today’s feminism lacks coherence because classical logic has been discarded as an aspect of toxic masculinity.  But Favale thinks it important to evaluate its framework so as to “understand how this framework differs from a Christian one.  Only from that foundation—from a solid understanding of competing worldviews—is it possible for Christians to mine feminist thought and praxis for hidden gems and to partner with secular feminists toward shared goals” (p. 73).  Doing so requires several chapters in the book as the author challenges us to think clearly about contraception, abortion, women’s liberation, autonomy, social engineering, preferred pronouns, transgender surgery, etc.  

       Having rejected the feminist gender paradigm, Favale sets forth her “Christian theory” of feminism—obviously relying on St. John Paul II’s theology of the body, a “personalism” that sees “each human being as a person, rather than a collection of ever-proliferating labels, and, more importantly, to attune our awareness to the sacramentality of every human body.  Bodies are not ‘just’ bodies.  Bodies are persons made manifest.  The sacramental principle is always at work: the visible reveals the invisible.  The body reveals to us the eternal and divine reality of the person—a reality that can only break into the tangible, sensible world through embodiment” (p. 120).  John Paul II insisted that “the body, in fact, and only the body, is capable of making visible what is invisible:  the spiritual and the divine.  It has been created to transfer into the visible reality of the world the mystery hidden from eternity in God, and thus to be a sign of it.”  

       Favale builds her case by contrasting the Babylonian and Hebrew creation stories.  In Genesis, God creates the cosmos ex nihilo.  “The God of Genesis has no parents; he does not come into being.  This absence of an origin testifies to his eternal presence.  He is not a being, like Marduk, but Being itself, the infinite ground of all finite existence” (p. 30).  God brought into being an orderly world, a good world, that includes human beings, male and female, who shared “a unique dignity, marked by the image of their Creator, and entrusted with the sacred work of cultivating life.  Sexual difference is not an extraneous or faulty feature of the cosmos but an essential part of its goodness” (p. 31).  Importantly:  “Genesis affirms a balance of sameness and difference between the sexes.  This is a delicate balance that is difficult, but necessary, to maintain.  Most theories of gender lose this balance, veering into extremes of uniformity (men and women are interchangeable) or polarity (men are from Mars, women are from Venus).  Both extremes lose the fruitful tension expressed here in Genesis” (p. 33).  What God created, man describes, using the miracle of language.  With Adam we can see things as they are—their essence—and name them.  Language does not construct reality—it describes it.  God created by saying “let it be” and man sees and names what it is.  Contrary to a pivotal tenet of feminism, we do not construct reality—we should see and revere it.

       Favale believes “that the constructionist view of language is a complete inversion of the correspondence view depicted in Genesis” (p. 37).  The Bible declares that we’re created beings, designed to behold and revere what God has made, to study and understand it as best we can, and to live in accord with its design.  The Christian tradition teaches us to “see the world as a created cosmos of which we are a part, this transfigures everything: embodiment, sex, suffering, freedom, desire—this is gathered up into an all-embracing mystery, an ongoing interplay between the human and divine. This imbues all-that-is with renewed significance” (p. 198).  There is a deeply teleological aspect to Christian thought enabling it to discern purpose and meaning in creation, for the “‘whatness’ of a thing, its essential identity, is connected to its purpose” (p. 198).  Drinking deeply from the well of Scripture and saintly writers, Favale rejoices to find how she fits into the cosmos and can walk humbly with her Lord.

       This means accepting one’s body as a gift from God, something wondrous to be revered, not to be ignored or transformed.  Designed in His image, we ought to nurture and develop our body’s potential.  Openness to God, following the Virgin Mary in accepting His will, is our ultimate good and purpose.  “Our bodies are continual reminders to us that we are not autonomous, that the fantasy of self-creation is no more than a fever dream, a symptom of underlying illness” (p. 203).  Rejecting Genesis, post-modern thinkers declare that “we are not bodies animated by interior souls, but bodies shaped by external forces.”  Changing the language from sex to “gender,” sexual differences are seen as cultural stereotypes rather than naturally embedded realities.  Endless varieties of “gender” are now celebrated, none of them fixed by anything other than one’s inner feelings.  Consequently:  “In our postmodern moment, discussions about gender tend to revolve around appearance and roles.  To be a woman is to fulfill a particular social role, or to mimic typical feminine behavior and attire.  Feminism and its progeny, gender theory, centers the conversation on doing rather than being” (p. 206).

       But being truly matters.  Being open and yielded to God, choosing to say Yes to Him enables us to find what we long for.  “We can choose to receive all these things as gift.  We can choose to say yes to a Love that is stronger than death.  We can enter, even now, the eternal moment of Annunciation, when the yes of one woman becomes the fulcrum of redemption” (p. 211).  Be it done to me according to Your will!

370 America’s Cultural Revolution

       Thirty years ago I read and reviewed James Davison Hunter’s Culture Wars:  The Struggle to Define America.  A sociologist at the University of Virginia, he reported that Americans were deeply divided over such issues as abortion, homosexuality, and public school curricula.  On both sides (folks he labels “orthodox” and “progressive”) there were passionately committed individuals.  Those committed to “orthodoxy,” Hunter said, shared an allegiance to “an external, definable, and transcendent authority” whereas those committed to “progressivism” embraced modernity and tended “to resymbolize historic faiths according to the prevailing assumptions of contemporary life.”  Whereas the orthodox defined “freedom” economically, the progressives defined it socially (e.g. permissive sexuality); the orthodox defined “justice” socially (criminals should get what’s due them) while the progressives defined it economically (welfare should provide all for all).

         The primary “fields of conflict” included:  family; education; media and the arts; law; and electoral politics.  Hunter detailed the struggles going on in these areas and concluded:  “the culture war is rooted in an ongoing realignment of American public culture and has become institutionalized chiefly through special-purpose organizations, denominations, political parties, and branches of government. . . .  In the end, however, the opposing moral visions become, as one would say in the tidy though ponderous jargon of social science, a reality sui generis:  a reality much larger than, and indeed autonomous from, the sum total of individuals and organizations that give expression to the conflict.  These competing moral visions, and the rhetoric that sustains them, become the defining forces of public life.”

       What Hunter described three decades ago is more ominously explained and analyzed by Christopher F. Rufo in America’s Cultural Revolution:  How the Radical Left Conquered Everything (New York:  HarperCollins, c. 2023; Kindle Edition).  “This book,” he says, “is an effort to understand the ideology that drives the politics of the modern Left, from the streets of Seattle to the highest levels of American government,” and its lesson “is a serious one.  There is a rot spreading through American life” (p. xi).  It clearly began spreading in 1968 when the world was jarred by nihilistic “student uprisings, urban riots, and revolutionary violence that has provided the template for everything that followed” (p. 2).   Subsequently,  the revolutionary ideas unleashed in the ‘60s have shaped our world.     

       The most influential “father” of the revolution was Herbert Marcuse, a German philosopher who sought refuge in America when Hitler took control of his country.  Rather than embrace the country that gave him refuge, he worked to destroy it, supporting radical groups such as the Weather Underground (led by Bernardine Dohrn and Bill Ayers, who later helped Barack Obama) and the Black Liberation Army.  While teaching at the University of California San Diego, Marcuse gave a lecture in London in 1967 urging hearers to launch a counter-culture, fomenting a cultural revolution to upend Western Civilization.  In the audience were black militants (including Stokely Carmichael and Angela Davis) who quickly responded to Marcuse’s revolutionary rhetoric, joining students around the world marching to slogans celebrating “Marx, Mao, Marcuse” and calling for the seismic changes needed to usher in a communist utopia.

      This “New Left” would generally cite racial injustices, feminist woes, and environmental concerns rather than economic inequities to promote the cause.  Having early abandoned hopes of waging guerrilla warfare à la Che Guevara, they would use “critical theories” to foment the revolution.  Rufo effectively shows how we now live “inside Marcuse’s revolution,” wherein he “posited four key strategies for the radical Left:  the revolt of the affluent white intelligentsia, the radicalization of the black ‘ghetto population,’ the capture of public institutions, and the cultural repression of the opposition.”  In Rufo’s opinion:  “all of these objectives have been realized to some degree,” thus instantiating the “‘transvaluation of all prevailing values’ that Marcuse had envisioned” (p. 11).  In setting forth a “cultural” Marxism, Marcuse envisioned a “dictatorship of the intellectuals” rather than Lenin’s “dictatorship of the proletariat”—putting folks like himself and his university-trained devotees in charge of things.  Writing to Rudi Dutschke, a leader of Germany’s “new left,” he declared that the “long march through the institutions” was the way to effectively orchestrate the revolution.  And the primary institutions to be taken captive would be the universities and, through them, the public school systems.

     Radicals such as Angela Davis (one of Marcuse’s most devoted disciples) would find positions in America’s finest universities.  Davis herself would teach at UCLA, Vassar, and ultimately (for many years) at the University of California Santa Cruz.  Weather Underground terrorist Bernardine Dohrn became a professor of law at Northwestern University.  Her husband and fellow terrorist, Bill Ayers, joined the faculty at the University of Illinois as a professor of education.  They found intellectual ammunition in Paulo Freire, whose Pedagogy of the Oppressed became a manual for revolutionaries.  Praising Lenin, Mao, Guevara, and Castro, the Brazilian Marxist’s book “sold more than one million copies and is now the third-most-cited work in the social sciences.  It has become a foundational text in nearly all graduate schools of education and teacher training programs” (p. 145).  Freire’s devotees in America began publishing books and finding positions in universities.  In the 1980s “the critical theorists of education began methodically deconstructing the existing curricula, pedagogies, and practices, and replacing them, brick by brick, with the ideology of revolution” (p. 162).

       To illustrate the radicalization of the schools Rufo devotes a chapter to the “Child Soldiers of Portland.”  The city has few minority residents, but “it has become the headquarters of race radicalism in the United States.  The city has elevated white guilt into a civic religion.  Its citizens have developed an elaborate set of rituals, devotions, and self-criticisms to fight the chimeras of ‘systemic racism’ and ‘white supremacy.’  The ultimate expression of this orthodoxy is violence:  street militias, calling themselves ‘anti-racists’ and ‘anti-fascists,’ are quick to smash the windows of their enemies and burn down the property of anyone who transgresses the new moral law” (p. 188).  Just as Bolsheviks recurrently set forth “five year plans,” Portland’s “government has adopted a series of Five-Year Plans for ‘equity and inclusion,’ shopkeepers have posted political slogans in their windows as a form of protection, and local schools have designed a program of political education for their students that resembles propaganda” (p. 189).  By asserting “America is fundamentally evil, steeping children in the doctrine of critical pedagogy and lionizing the rioters in the streets, the schools have consciously pushed students in the direction of revolution” (p. 189).  Thus the city’s “child soldiers” occupy and vandalize downtown Portland!  

       Though the radicals first targeted the schools, their ultimate objective was political:  gaining control of the country.  To do so “critical race theory” proved pivotal.  After a distinguished career as a civil rights attorney, Derrick Bell became a Harvard professor of law and wrote a “thousand-page casebook called Race, Racism, and American Law, outlining ‘critical race theory.’  At the same time, he was intimately connected to the left-wing radical milieu:  Bell had provided legal support to Angela Davis at her murder trial, studied the critical pedagogy of Paulo Freire, and maintained a close relationship with Black Panther Party members such as Kathleen Cleaver, the wife of Eldridge Cleaver” (p. 205).  At Harvard he spent little time with colleagues but enlisted zealous students hungering for his “left-wing racialist ideology” and attuned to “the rhetoric of elite grievance.”  He rooted his position in the works of Antonio Gramsci (an Italian communist) and Paulo Freire, setting “the stage for the racial politics of our time” (p. 206).  His followers, claiming to be “critical race theorists,” would assail “the founding principles of the country, making the argument for dismantling colorblind equality, curtailing freedom of speech, supplanting individual rights with group-identity-based entitlements, and suspending private property rights in favor of racial redistribution” (p. 207).  They rapidly found positions in the nation’s elite law schools and courts, as is evident in the career of Supreme Court Justice Elena Kagan.

       The ever-insightful “conservative black economist Thomas Sowell, whom Bell had attacked as a race traitor, offered an explanation of Bell’s predicament.  ‘Derrick Bell was for years a civil-rights lawyer, but not an academic legal scholar of the sort who gets appointed as a full professor at one of the leading law schools.  Yet he became a visiting professor at Stanford Law School and was a full professor at Harvard Law School.  It was transparently obvious in both cases that his appointment was because he was black, not because he had the qualifications that got other people appointed to these faculties,’ Sowell said.  ‘Derrick Bell’s options were to be a nobody, living in the shadow of more accomplished legal scholars—or to go off on some wild tangent of his own, and appeal to a radical racial constituency on campus and beyond.  His writings showed clearly that the latter was the path he chose.’  And this path, in Sowell’s view, was a tragic turn.  Bell’s ‘previous writings had been those of a sensible man saying sensible things about civil-rights issues that he understood from his years of experience as an attorney.  But now he wrote all sorts of incoherent speculations and pronouncements, the main drift of which was that white people were the cause of black people’s problems’” (p. 227).  Sadly enough, Sowell concluded:  “‘He’s turned his back on the ideal of a colorblind society and he’s really for a getting-even society, a revenge society’” (p. 231).

       Bell’s influential followers (including Kimberlé Crenshaw, who would set forth the notion of “intersectionality”) substituted race for class in Marxism and embraced some of “the most acidic parts of modern thought, beginning from the assertion that ‘objective truth, like merit, does not exist,’ continuing to the Derrick Bell–style posture of ‘deep dissatisfaction with traditional civil rights discourse,’ and ending with a call for a ‘war of position’ against whiteness, colorblindness, private property, and traditional constitutional theory” (p. 233).  Citing postmodernists such as Jacques Derrida and Michel Foucault, they insisted “truth is a social construct created to suit the purposes of the dominant group” and rejected the natural law tradition basic to America’s founding.  “They wanted to replace the old system of colorblindness, equality, and individual rights with a new system one might call a theory of ‘racial reasoning’” (p. 235).  Rather than reason logically from propositions to conclusions, they substituted “lived experiences,” anecdotes of oppression, allegations of victimization.  Consequently:  “personal offense becomes objective reality; evidence gives way to ideology; identity replaces rationality as the basis of intellectual authority” (p. 237).

       “Critical race theory,” Rufo says, “was never designed to reveal truth—it was designed to achieve power.  The real history of the discipline is not a story of its intellectual discoveries, but of its blitz through the institutions” (p. 249).  Its success is best evident in the widespread emphasis on DEI (Diversity, Equity, Inclusion) throughout much of America.  Embraced by virtually all universities, it’s now spreading through the nation’s public schools and is fully endorsed by the National Education Association.  The Obama Administration, implementing the Dodd-Frank bill, created Offices of Minority and Women Inclusion in numerous federal agencies, so bureaucracies such as the FBI and EPA now require employees to sit through training sessions designed to ensure DEI throughout their ranks.  So, following the unrest sparked by the killing of George Floyd, “the National Credit Union Administration told employees America was founded on ‘white supremacy.’  The Department of Homeland Security told white employees they have been ‘socialized into oppressor roles.’  The Centers for Disease Control and Prevention hosted a thirteen-week training program denouncing the United States as a nation dominated by ‘White supremacist ideology’” (p. 255).

      The nation’s preeminent corporations—especially those needing governmental support—have fallen in line.  “Lockheed Martin, the nation’s largest defense firm, sent white male executives on a mission to deconstruct their ‘white male privilege.’  The instructors told the men that ‘white male culture’ and the values of ‘rugged individualism,’ ‘hard work,’ and ‘striving towards success’ were ‘devastating’ to minorities” (p. 256).  Conducting the training sessions are folks like Johnnetta Cole, a Marxist “scholar-activist” with a history of leading communist organizations such as Venceremos Brigade (a pro-Castro group) and the July 4 Coalition (an ally of the Weather Underground).   She is now hired to conduct training sessions for federal bureaucrats.  Fixated on slavery, Cole holds all whites responsible for it inasmuch as they benefit from “a system that’s based on racism.”  Even “good and decent [white] people” stand guilty for the “racial terrorism” still harming the nation.  Blacks, she claims, suffer from “post-slavery traumatic syndrome” and the “deep emotional and physiological toll of racism.”  Earlier she had championed the Soviet Union and served on the editorial board of the journal Rethinking Marxism. “Now she was promoting it as an official contractor of the United States government.  After fifty years, the long march had been completed.  The radical Left had finally won its Gramscian ‘war of position’ and attained ideological power within the American state.”  Illustrating this ideological victory, “on his first day in office, President Joseph Biden issued an executive order seeking to nationalize the approach of ‘diversity, equity, and inclusion’ and ‘embed equity principles, policies, and approaches across the Federal Government.’  In business, every Fortune 100 corporation in America has submitted to the ideology of ‘diversity, equity, and inclusion’” (p. 265).  

       Concluding his study, Rufo says:  “The story of America’s cultural revolution is one of triumph.  The critical theories have become the dominant frame in the academy.  The long march through the institutions has captured the public and private bureaucracy.  The language of left-wing racialism has become the lingua franca of the educated class” (p. 269).  

       But he hopes a “counter-revolution” will restore health to this nation.  In part this is because “the radical Left cannot replace what it destroys” (p. 275).  History shows this.  The 1789 French Revolution collapsed into the abyss of the Thermidor; the revolutions throughout Europe in 1848 were quickly co-opted by the much-maligned bourgeoisie; the 1917 Bolshevik Revolution fell rapidly into Stalinist tyranny.  All these revolutions were essentially nihilistic, bent on destroying.  We need a counter-revolution of hope, a positive commitment to enduring values and just political systems.  Who might lead it remains to be seen!

                                    * * * * * * * * * * * * * * * * * * * * * *

       Joanna Williams is an English journalist who has taught in universities and published in prestigious newspapers and journals.  She has recently written How Woke Won:  The Elitist Movement that Threatens Democracy, Tolerance and Reason (London:  John Wilkes Publishing, c. 2022; Kindle Edition).  Jonathan Haidt, co-author of The Coddling of the American Mind, says:  “This book is the essential guide for our era of confusion and incoherence as moral revolutionaries tear down statues, institutions and widely held values.  With clear thinking and gripping storytelling, Williams explains how a minority of the elites in Britain and America were able to intimidate the rest of the elites into silence or complicity, imposing a ‘revolution from above’ that is anti-democratic and cruel.  Anyone who wants to restore sanity, beauty or simple humanity to our public life should read How Woke Won.”

       After sketching a short history of “wokism,” Williams surveys the current “culture wars” and finds wokism firmly established in powerful English-speaking institutions.  That “Woke has conquered the West” became clear when President Joe Biden, in his first day in office, “signed an Executive Order permitting boys who identify as girls to compete on female sports teams and enter female changing rooms” (p. 14).  To be “woke” means to be conscious of and committed to overcoming racism and social injustice.  The word gained currency following the death of Michael Brown in Ferguson, Missouri, when activists proclaiming “black lives matter” urged folks to “stay woke” and dismantle the nation’s racist establishment, which they propose to do through identity politics.  Woke folks especially stress proper words enabling them to identify folks as oppressors and oppressed.  So one must say “people of color” rather than “colored people,” “Latinx” rather than “Hispanic,” and “sex assigned at birth” rather than “male” or “female.”  For those of us who are puzzled by these continually-shifting phrases it’s nice to know that woke language is generally “convoluted, indecipherable and alienating” (p. 35).

       However convoluted their language, woke writers rigorously censor their predecessors for linguistic sins, so the “battle for the past” looms large in the culture wars.  They disdain traditionally celebrated virtues such as “stoicism, courage, resilience, duty, sacrifice and self-control” and celebrate victimhood as the most admirable and coveted attribute.  Statues of Winston Churchill must be torn down because he supported the British Empire.  One of the finest novels of the past calling for tolerance and racial justice—Harper Lee’s To Kill a Mockingbird—is now dismissed as “problematical” for featuring a white protagonist and thus espousing a “white saviour motif.”  Contemporary writers, including the fabulously successful JK Rowling, must be censured for supporting women-only athletics and spaces.  Wokists have effectively reshaped the educational systems in the English-speaking world, establishing inflexible speech codes, as illustrated by a law student in a Scottish university who faced expulsion for “stating, in the context of a seminar on gender, feminism and the law, that ‘women have vaginas’” (p. 119).  Without doubt “‘correct’ speech—be it declaring pronouns or pledging allegiance to Black Lives Matter—is compelled” (p. 120).

      Summing it all up, Williams says:  “Woke has won.  It has won because its fundamental assumptions have become so widely accepted among the cultural elite that they are considered not just uncontroversial but common sense.  Woke has won because its values have been adopted by members of the professional-managerial class, who have allowed woke thinking to take root within public institutions and to shape policies, practices and laws.  Woke has won because it has become embedded in schools and universities.  Teachers and lecturers cultivate woke attitudes in children and young adults who take for granted that what they are taught is factually accurate and morally correct. After graduation, they carry the lessons imbibed back out into the world.  Woke has won because its leading advocates appropriated the rhetoric of the civil-rights-era struggles for equality” (p. 267).  

       What we who oppose it can do remains to be seen.  Perhaps seeing it as it is is all we can do!

# # #

369 Lee, Grant & Twain

        At a time when obtuse mobs pull down or vandalize historic statues and politicians placate the vandals by removing public monuments, serious scholars continue studying great men, illustrating their value in understanding ourselves and our nation.  They realize, as William Faulkner said:  “The past is never dead.  It’s not even past.”  In Clouds of Glory:  The Life and Legend of Robert E. Lee (New York:  Harper, c. 2012; Kindle), Michael Korda provides a well-written, admiring account of the general.  As a young man Korda took part in the Hungarian Revolution of 1956, then graduated from Oxford University and now writes histories.  He begins his book not with details of Lee’s early life but with his role in suppressing John Brown’s attack on Harper’s Ferry in 1859—an incendiary incident helping provoke the Civil War.  Brown was revered by Northern abolitionists because of his guerrilla activities in “bleeding” Kansas and helping slaves escape to Canada.  Dispatched to quash the insurrection, Lee and a small army detachment did so, treating the captured survivors “with kindliness and consideration,” but overseeing Brown’s hanging.  Present were a number of soldiers who would later serve with him during the Civil War—most notably J.E.B. Stuart and Thomas J. “Stonewall” Jackson.  Considering the event, Herman Melville “described Brown prophetically as the ‘meteor of the war,’” and his phrase rang true, for it would “be only seventeen months between John Brown’s execution and the firing on Fort Sumter that brought about the war.”  

       As a veteran U.S. Army officer, Robert E. Lee had served in various parts of the nation.  “He was a cosmopolitan, who felt as much at home in New York as he did anywhere in the South; he was opposed to secession; he did not think that preserving slavery was a goal worth fighting for; and his loyalty to his country was intense, sincere, and deeply felt.  He was careful, amid the vociferous enthusiasm for secession in Texas once Lincoln was elected, to keep his opinions to himself, but in one instance, when asked ‘whether a man’s first allegiance was due his state or the nation,’ he ‘spoke out, and unequivocally.  He had been taught to believe, and he did believe, he said, that his first obligations were due Virginia’” (#520).  A singular commitment to one’s state was not at all unusual in those days.  Lee’s first ancestor had settled in Virginia in 1639; two of his descendants signed the Declaration of Independence; two others would become generals; and one, Zachary Taylor, would become a president.  To John Adams, when the American Revolution began, the Lees had “more men of merit . . . than any other family.”  They were all loyal Americans, but above all they were Virginians!  

       During the War for Independence, Robert E. Lee’s father, “Light Horse” Harry, became a celebrated military officer.  Like his father, “Robert was tall, physically strong, a born horseman and soldier, and so courageous that even his own soldiers often begged him to get back out of range, in vain of course.  He had his father’s gift for the sudden and unexpected flank attack that would throw the enemy off balance, and also his father’s ability to inspire loyalty—and in Robert’s case, virtual worship—in his men.”  But neither man worked well with politicians.  The father was “voluble, imprudent, fond of gossip, hot-tempered, and quick to attack anybody who offended or disagreed with him.”  But the son “kept the firmest possible rein on his temper,” and disliked confronting or arguing with others.  “These characteristics, normally thought of as virtues, ultimately became Robert E. Lee’s Achilles’ heel, the one weak point in his otherwise admirable personality, and a dangerous flaw for a commander, perhaps even a flaw that would, in the end, prove fatal for the Confederacy.  Some of the most mistaken military decisions in the short history of the Confederacy can be attributed to Lee’s reluctance to confront a subordinate and have it out with him on the spot, face-to-face” (pp. 30-31).

       Lee’s mother, wanting her son to eschew her husband’s example, sought to instill in Robert a strong Christian faith.  “For this task she was extraordinarily well suited; her few surviving letters reveal formidable theological knowledge, as well as a precise sense of right and wrong and a deep spiritual belief.  ‘Self-denial, self-control, and the strictest economy in all financial matters were part of the code of honor she taught [him] from infancy,’ and in his later years Robert E. Lee frequently said that he ‘owed everything’ to his mother.”  Though an Anglican, “Ann Carter Lee was in many ways a child of the Second Great Awakening that swept through America in the early nineteenth century, creating sometimes startling new religious denominations and laying greater emphasis on the need to be saved and on personal piety rather than simply attending traditional religious services.  Her beliefs were what we would now call evangelical, and she had the strength of mind and purpose to impress them on her son Robert for life—indeed the most striking thing about his letters is his lifelong, simple, unshakable belief in the need to accept God’s will uncomplainingly, and his deep faith.  ‘It is all in God’s hands’ is a phrase he used often, not in a spirit of fatalism, but in one of confidence.  The intensity of Lee’s religious convictions was one of the elements that would make him a formidable warrior, and also one of the reasons why he remains so widely respected not just in the South, but in the North as well—not only as a hero, but as a kind of secular saint and martyr” (pp. 35-36).

       Korda takes the reader through Lee’s education, military service in Mexico, and work in various army posts (usually devoted to supervising engineering projects).  In most ways it was a rather prosaic career, with little possibility of attaining distinction until he captured the attention of Winfield Scott, who found him a fine field officer during the Mexican War.  As “Scott’s protégé, prized particularly for his uncanny eye for terrain,” Lee helped win the war and was made a “brevet lieutenant colonel.  No other officer in the Mexican War received such universal praise, or won such widespread admiration” (p. 255).  Indeed, General Scott declared Lee “’the very best soldier that I ever saw in the field’” (p. 266).  Following the war Lee returned to working in army posts (including an assignment dealing with Comanche raiders on the Texas frontier), serving a stint as superintendent of West Point, and caring for his family (seven children), struggling with finances, serving as executor for his father-in-law’s estate, and wondering if he’d made the right vocational choice.  

       But everything changed when Abraham Lincoln was elected President and southern states began seceding.  Though Lee personally opposed slavery he also opposed abolitionism.  Generally abstaining from politics, he was something of a Whig.  As the war began President Lincoln made Lee a colonel and ultimately offered him the rank of a “major general in command of the largest army in American history” (p. 391).   But when Virginia seceded Lee felt obliged to serve his beloved state and soon headed the Confederate military forces therein.  “Lee amazed everyone by his energy and professional skill, putting together in a matter of weeks an army of 40,000 troops” (p. 410).  He led them in various battles in Virginia, Maryland and Pennsylvania, ultimately surrendering to Federal forces led by Ulysses S. Grant in 1865.  Korda describes and analyzes the various battles, though he is less concerned with military details than with the person, General Lee, who always “set his men an example of resilience, confidence, and devotion to duty” (p. 1054).  

       Following the war, Lee enjoyed what Korda calls an “apotheosis.”  Rarely has a man come “not only to embody but to glorify a defeated cause.”  Amazingly, Lee became “a national, not just a southern hero,” with a U.S. Navy submarine named for him, a postage stamp carrying his picture, and a U.S. Army tank named after him; President Gerald Ford even posthumously restored his citizenship in 1975.  “It is hard to think of any other general who had fought against his own country being so completely reintegrated into national life, or becoming so universally admired even by those who have little or no sympathy toward the cause for which he fought” (p. 1141).  Rather than accept more prestigious and remunerative positions Lee became the president of Washington College, a tiny school with almost no students in Lexington, Virginia.  Under his guidance, the college flourished, and its new president sought to provide Southerners an example for adjusting to post-war realities.  To Henry Ward Beecher, a prominent New York minister, “Lee ‘was entitled to all honor,’” and Beecher praised him for devoting himself “‘to the sacred cause of education’” (p. 1170).  

       Korda’s portrait of Lee is consistently positive, if not quite as admiring as Douglas Southall Freeman’s famed four-volume biography.  He does show that a Hungarian émigré, rather free from the many biases of native-born Americans, can carefully study the life of Robert E. Lee and find it worth celebrating.

                                     * * * * * * * * * * * * * * * * * * * * 

       Notably more critical of the general, Allen C. Guelzo’s Robert E. Lee:  A Life (New York:  Vintage Books, c. 2021; Kindle Edition) seeks to appreciate Lee’s strengths without glossing over his faults.  To the author Lee remains very much a “mystery” inasmuch as he was both upright and errant.  Everyone who met Lee, “no matter what the circumstances of the meeting—ever seemed to fail to be impressed by the man.  His dignity, his manners, his composure, all seemed to create a peculiar sense of awe in the minds of observers” (p. 18).  And yet he fought for a rebellious confederacy committed to preserving slavery.  Guelzo, a Princeton professor who has published a number of historical works, finds his stance nicely summed up by Lee himself, who after the Civil War wrote in a letter:  “My experience of men has neither disposed me to think worse of them or indisposed me to serve them; nor in spite of failures, which I lament, of errors which I now see and acknowledge; or of the present aspect of affairs; do I despair of the future.  The truth is this:  The march of Providence is so slow, and our desires so impatient; the work of progress is so immense and our means of aiding it so feeble; the life of humanity is so long, that of the individual so brief, that we often see only the ebb of the advancing wave and are thus discouraged.  It is history that teaches us to hope” (p. 13).

      As is expected of a scholarly biographer, Guelzo digs into Lee’s ancestry, family, education, and career.  Though Guelzo doesn’t consider Lee a first-rate intellectual, Lee was certainly a well-tutored youngster, reading Caesar, Sallust, Virgil, Cicero, Horace, and Tacitus in Latin, plus Xenophon and Homer in Greek.  Most importantly, since he sought admission to West Point, he mastered arithmetic, algebra, and geometry.  After doing well at the academy he joined the Corps of Engineers—“a small cadre of brainy technicians who prided themselves on their superiority to lesser graduates who ended up in” other branches of the army (p. 69).  His academic work was exemplary, but it was “his almost unbearable gentility” that most impressed his classmates.  Lee was, Joseph E. Johnston remembered, “full of sympathy and kindness, genial and fond of gay conversation, and even of fun, that made him the most agreeable of companions” (p. 73).  When he returned to West Point to serve as superintendent in the 1850s, he similarly impressed cadets as “‘the personification of dignity, justice, and kindness . . . the ideal of a commanding officer’” (p. 201).  

       Following the Compromise of 1850, slavery became a smoldering issue.  Lee favored neither slavery nor its abolition, saying:  “‘In this enlightened age, there are few, I believe, but what will acknowledge, that slavery as an institution, is a moral & political evil in any country’” (p. 227).  Yet he apparently saw no way to actually end it.  As southern states began severing their ties with the Union, many observers wondered if he would retain his position as an officer in the federal army.  He had served with distinction in the Mexican War and enjoyed the favor of General Winfield Scott, who  “did not hesitate to endorse him in the most dramatic terms:  ‘If I were on my death-bed tomorrow, and the President of the United States should tell me that a great battle was to be fought for the liberty or slavery of the country, and asked my judgment as to the ability of a commander, I would say with my dying breath, Let it be Robert E. Lee’” (p. 211).  

       President Lincoln apparently considered giving Lee command of the Union army and might have done so if Virginia had not joined the Confederacy.  Lee could not “draw his sword” against his native State and devoted himself to serving her.  Thus he made “a decision in which he irrevocably, finally, publicly turned his back on his service, his flag, and, ultimately, his country.  All of this was done for the sake of a political regime whose acknowledged purpose was the preservation of a system of chattel slavery that he knew to be an evil and for which he felt little affection and whose constitutional basis he dismissed as a fiction” (p. 306).  In time he became the commanding general of the Army of Northern Virginia and for four years fought to win the war.  Guelzo carefully describes the various battles and evaluates Lee’s effectiveness as a strategist, noting that Lee’s triumphs were often due to his opponents’ failures and that he relied too much on his subordinate generals to implement his general orders.  He did, however, inspire his men to fight courageously and merits commendation for his leadership during the war.  “Only Grant emerged in the war with military gifts on a par with Lee,” and there is a rightful “glory for Lee in that achievement” (p. 655).

      Lincoln, Lee and Grant all deeply desired peace and reconciliation.  (An interesting illustration of this was the fact that Ulysses Grant’s widow ultimately became good friends with Varina Davis, the widow of former Confederate president Jefferson Davis!)  Ulysses S. Grant believed the officers and men who had received parole at Appomattox should not be prosecuted, asserting:  “‘I should have resigned the command of the army rather than have carried out any order directing me to arrest Lee or any of his commanders who obeyed the laws’” (p. 567).  When Lincoln was slain Lee considered it “‘not only a crime against our Christian civilization’ but ‘a terrible blow to the vanquished.’”  And he praised Grant, whose treatment of Southern soldiers was “‘without parallel in the history of the civilized world’” (p. 574).  Though Lee was granted a parole when he surrendered at Appomattox—and though President Lincoln, in his last cabinet meeting, had spoken “very kindly” of him—some Northerners wanted him indicted and imprisoned.  Though he was, in fact, indicted by a prosecutor, he was never brought to trial or imprisoned.  

     When he died, Lee was mourned throughout the South and rather admired in many sections of the North.  Thus Philadelphia’s Evening Telegraph declared that “‘the passionate feelings engendered by the conflict have so far died away that there is a general disposition to dwell upon his personal virtues rather than to follow him to the grave with denunciations’” (p. 630).  At the end of his presentation Guelzo—admiring the man but perplexed by his service for the Confederacy—concludes:  “Mercy—or at least a nolle prosequi—may, perhaps, be the most appropriate conclusion to the crime—and the glory—of Robert E. Lee after all” (p. 662).  His footnotes and bibliography show Guelzo’s  diligence in thoroughly researching his subject, and his portrait of Lee merits serious consideration.  

                                                   * * * * * * * * * * * * * * * * * * * * * *

       Mark Perry wrote a fascinating account in Grant and Twain:  The Story of a Friendship that Changed America (New York:  Random House Publishing Group, c. 2004; Kindle Edition).  Following his presidency Ulysses S. Grant settled in New York and engaged in various business ventures.  Taking the advice of a man he trusted, he invested in (and lent his name to) an endeavor that failed in 1884.  Rather than file for bankruptcy, “Grant vowed that he would repay every penny of the debt he owed and pledged that before his death, he would find a way to provide for his wife and children” (p. 20).  Then Mark Twain—a “Grant intoxicated man”—determined to help out by encouraging him to write his life story.  The two men met, and in 15 months “Ulysses S. ‘Sam’ Grant and Mark Twain—Samuel Clemens—became the best of friends.  Seemingly so different and yet with so much in common, Grant and Twain would, in that short time, transform the world of American writing.  For as Grant was struggling to write the story of his life, he was helped in his final battle by a man who had just completed the story of his.  Within that single fifteen-month period—perhaps the most creative in American literary history—Grant would not only write his Personal Memoirs, Twain would reach the peak of his career with the publication of Adventures of Huckleberry Finn.  Those two books, perhaps the finest work of American nonfiction ever written and the greatest of all American novels, defined their legacy.  In the end, the struggle of both men—Grant’s struggle to retrieve his fortune and Twain’s to make his—was not about wars or books or even money.  Over a period of fifteen months, Grant and Twain wrote the story of their country and ours” (p. 23).  

       After sketching biographies of the two men Perry describes their interactions.  Robert Underwood Johnson, hoping to make the Century Publishing Company successful, had earlier talked with Grant about writing an article covering some aspect of the Civil War.  Grant had been uninterested, but now he told Johnson he needed money and might do some writing.  Johnson said his magazine would publish whatever he wrote and “suggested that Grant write four articles, one each on the Battle of Shiloh, the Vicksburg Campaign, the Battle of the Wilderness, and the surrender of Lee.  Johnson said the magazine would pay him $500 for each article.  It was an extraordinary sum for the time.  Grant agreed to this arrangement” (p. 82).  He would be the first Civil War commander to write a memoir, and when he submitted his first article it was obvious he had a gift for writing, for he could recall “small incidents that gave color to the larger theme—and he had a prodigious memory.  At times his prose was almost electrifying” (p. 84).  A century later the literary critic Edmund Wilson said:  “‘The thick pair of volumes of the Personal Memoirs used to stand, like a solid attestation of the victory of the Union forces, on the shelves of every pro-Union home.’”  Indeed:  “It may well be the most powerful military memoir in print, vying with Julius Caesar’s commentaries as (in Wilson’s words) ‘the most remarkable work of its kind’” (p. 278).  

      In 1884 Grant discovered he had throat cancer with little hope of recovery.  His physician prescribed pain killers but sometimes refused “to treat his patient, hoping that it would more quickly bring about his death, thereby putting an end to his suffering” (p. 95).  Facing his demise, financially broke “and now mortally ill, he viewed the publication of his memoirs not only as a fitting coda for his life, but as the sole means at his disposal to retrieve his reputation and leave his family financially secure” (p. 97).  At the same time Twain was a celebrated writer and humorist but had yet to write truly fine fiction.  Off and on, over the years, he had worked on a manuscript that would become Huckleberry Finn, but it was not yet finished.  In it he explored the nation’s “original sin” and its devastating impact on the South.  He sensed, deep within, “that the central and singular fact that had shaped his time and shaped him was the question of slavery—that ‘bald, grotesque and unwarrantable usurpation’ of human freedom that ‘stupefied humanity.’  And at the heart of slavery was the question of race, of racism—which is what made slavery possible” (p. 261).  And as Twain devoted himself to the story he “realized that Huck Finn might be the one book for which he would always be remembered” (p. 147).  

       Encouraged by Twain’s promise to help him publish the manuscript, Grant worked hard.  Some weeks he was “particularly prolific, writing upward of ten thousand words on some days, while spending others editing and correcting what had already been written.  Twain, who saw Grant nearly every day during this period, was stunned by Grant’s abilities.  ‘It kills me these days to write half of that,’ he commented” (p. 197).  He was also struck by the general’s “gentleness, goodness, sweetness.”  Volume one of Personal Memoirs of U. S. Grant was published on December 10, 1885, and within two months “Twain presented Julia Grant with a check for $200,000.  To that time it was the largest royalty payment ever made in U.S. publishing history” (p. 277).  Ultimately she received nearly $450,000 and Twain’s publishing firm turned a nice profit in the process.  Millions of people rejoiced when reading Grant’s autobiography, and “the final words of Grant’s Memoirs came to symbolize the lesson of a war that divided a nation and cost six hundred thousand lives.  ‘Let us have peace,’ Grant wrote.  They were the last words of his book” (p. 277).  Amen!