135 No Good Men Left?

Barbara Dafoe Whitehead (the author of The Divorce Culture) is one of the premier scholars writing about marriage and family.  Her most recent treatise, Why There Are No Good Men Left:  The Romantic Plight of the New Single Woman (New York:  Broadway Books, c. 2003), explores why so many highly successful career women–particularly in their 30s–fail to find a good man who will settle down and make a lasting commitment to marriage.  As she puts it:  “This book is about a contemporary crisis in dating and mating.  It explains why some of the best educated and most accomplished young single women in society today are discontented with their love lives, why romantic disappointment has emerged as a generational theme, and why many of these women have come to believe that ‘there are no good men left'” (p. 2). 

The women she interviewed while researching the book reveal their cultural milieu in the language they use to describe the loss of romance.  The poetry and song of traditional courtship have disappeared.  The traditional system, rooted in the concept of covenant, maintaining vows for a lifetime, has been replaced by a libertarian system, characterized by momentary interests.  One no longer “falls in love” or finds the “love of my life.”  Instead, there is much talk–using the more cerebral and “scientific” jargon of psychology–about “relationships,” about “being in a relationship.”  The “M” word, marriage, is rarely mentioned–”perhaps because they’ve been warned that talk of marriage can seem needy or desperate” (p. 4).  And that’s precisely what the young career women resist being!  To admit one actually needs a man, that one cannot live a fully satisfactory life on one’s own, rubs against all the feminist ideology most of them have absorbed. 

They illustrate the enormous success of the “Girl Project” launched in the ’70s and symbolized by the application of Title IX to athletics.  “Rather than prepare girls for future adult lives as wives and mothers, the Girl Project’s aim has been to prepare them for adult lives without dependence on marriage” (p. 77).  So girls began studying harder and now constitute a majority of students in colleges and universities.  Rather than look for husbands, increasing numbers of them focus single-mindedly on preparation for work.  They have successfully moved into medical schools, law schools, and business schools, and in some of these graduate programs now constitute a majority of students.  They engage in athletics and serve in the military.  Success in the workplace has come, with bewildering speed, to America’s young women. 

Yet, when the truth is told, most of these young women really want to marry and have children.  Indeed, a 2001 Gallup Poll indicated that 89 percent of them thought it “extremely important” to do so (p. 6).  The novels they voraciously buy and read reveal the depth of these young women’s hunger for a spouse.  Whitehead seriously studied the “Chick Lit” which has proven so popular in recent years.  Great literature it is not.  But it does demonstrate the indestructible desire for marriage in the hearts of most women.  Thirty years ago, when nearly 90 percent of the nation’s women married before they reached the age of 30, such aspirations were obviously satisfied.  Today, nearly one-fourth of all women are unmarried at that age.  There are today 2.3 million college-educated single women in the 25-34 age group–compared with 185,000 in 1960 (p. 25). 

Not finding a husband, however, does not mean these women are sexually chaste!  The average age of their first sexual intercourse is 17, and “the majority of young women today will live with a boyfriend before they live with a husband” (p. 11).  Cohabitation has become a widely practiced–and socially acceptable–pattern for folks in their 20s.   It’s the “signature union of the emerging relationships system” (p. 116), and more young women first live together with a man than marry one.   “Women often have sex with their boyfriend before they get to know him well as a human being” (p. 29).  Though cohabitation may be initially exciting and satisfying, women (unlike men) ultimately find it a dead-end road.  All too often, what they thought was a commitment that would lead to marriage was, from the man’s perspective, simply an attractive arrangement providing free sex and homey comforts.  Understandably, “the benefits of cohabitation for men help to explain why there is no courtship crisis for high achieving young men” (p. 124).   Indeed, Whitehead laments:  “If a corps of mischievous social engineers had deliberately set out to create confusion and uncertainty in the new single woman’s search for love, they couldn’t have come up with a more effective device than cohabitation-as-courtship” (p. 127).

Single women frequently find themselves dating–or living with–”Mr. Not Ready.”  After investing much energy and attention in a series of “Mr. Not Readys,” they remain unwed in their 30s.   Conversely, the family-oriented single men, rather than courting career women, more often select “younger women who are not as committed to serious careers or not as far along in their careers as she is” (p. 36).  These women discover, as a greeting card says:  “‘Why are men like parking spaces?  All the good ones are taken'” (p. 40).   Putting marriage on hold until 30 while pursuing a career may very well mean losing the opportunity to marry and have children.

By consenting to cohabit women discourage their “lovers” from marrying them.  “Because men see marital commitment as a status, they take seriously the formal, legal, and public events, ceremonies, and rituals that mark the change in their status from ‘not married’ to ‘married.’  They assign far less weight to the informal, intimate, and private gestures and understandings that serve, for a woman, as benchmarks along the way to marriage” (p. 143). 

Having described and explained the plight of the women she’s studied, Whitehead has little to say concerning a solution.  She–like her two unmarried daughters, now in their 30s–resolutely defends the career track that seems to create the very problem she laments.  Though she observes the problems associated with cohabitation, she makes no moral judgment concerning it.  If what Christians traditionally called “living in sin” is wrong to Whitehead, it’s only because it fails to lead to a more permanent “relationship” wherein children can be born and reared.  Her morally indifferent social science simply fails to provide any reason to condemn the very social patterns so manifestly harmful to both men and women.  Yet, for painting an honest portrait of women without “good men,” she must be praised.

* * * * * * * * * * * * * * * *

Providing a very personal perspective, bringing to the discussion the wisdom of a lifetime, incorporating  insights derived from rearing four children and taking delight in 10 grandchildren, Midge Decter has written a “memoir of my life as a woman,”  An Old Wife’s Tale:  My Seven Decades in Love and War (New York:  ReganBooks, c. 2001).  Born to a Jewish family in St. Paul, MN, during the Depression, she stands rooted in an America largely vanished but still worth remembering and emulating.  As a teenager, of course, she would hardly have agreed, for she left home as quickly as possible (dropping out of the University of Minnesota) and moved to New York to “make her way in the world.”  There she met and married her first husband, with whom she had two children.  Subsequently, when that marriage dissolved, she married Norman Podhoretz, the lasting “love of her life” and bore him two children.  In the midst of all this, she worked at various jobs, lived in both suburbs and the inner city, and thought much about the society surrounding her. 

She especially pondered “the true Woman Problem.  Not the oppression of women, to say the least a laughable proposition in the United States of America, nor the glass ceiling that so many have been relentlessly calling attention to, but rather a seemingly never-to-be-mediated internal clash of ambitions:  the ambition to make oneself a noticeable place in the world and the ambition to be a good mother” (p. 51).  She began to discern the problem when highly successful women, in private conversation, overflowed, like broken dams, with assorted grievances regarding their husbands (or their lack thereof).  She then began to study women’s literature, such as Betty Friedan’s The Feminine Mystique, a best-seller that she found “both intellectually and stylistically very crude.  It was also unbelievably insulting to ordinary housewives, written on the level and in exactly the kind of lingo previously used by a number of pop sociologists to denigrate the postwar lives of the ordinary people” of America (p. 69).  At the time, Decter failed to see that Friedan’s book was more than simply “another in the series of generally left-inspired attacks on the nature of American society” (p. 71).  It was, in fact, along with the anti-war protests and other manifestations of the rebellious ’60s, a thoroughly pernicious attack on the culture only strong families can sustain. 

Decter sensed that as women following Friedan became more vitriolic and aggressive, men retreated into silence, lest they be judged anti-female.  And yet, ironically, “the movement that began with the claim that it was out to make a real revolution in women’s lives began to define the various forms of male withdrawal from combat as victories, whereas the truth was they were for the most part expressions of the deepest (and in most cases to this day unrecognized) contempt” (p. 90).  As she read and thought and observed, staying at home with her youngsters, she decided to write a book, titled The New Chastity, “in which I faithfully stuck to the movement’s own sources and then compared it with the truths I knew on my own pulse about what women want and how they feel” (p. 93).  Published in 1972, calling into question the most passionately held articles of faith in the feminist movement, this book instantly catapulted Decter to something of a celebrity status–a woman willing to dispute the claims of the women’s movement!  For her efforts NOW gave her its “Aunt Tom” award, a badge of honor for her in the culture war just begun!

Her militancy solidified during the following decades as she watched the children of her “liberal” friends suffer under their parents’ ideological fantasies.  “It is,” she laments, “harrowing to remind oneself of the wreckage visited upon the children of the famous baby boom who grew up among the so-called enlightened classes” (p. 106).  Drug addictions, psychiatric treatments, lesbian experimentation, eating disorders–all symptoms of something seriously awry in the nation’s homes.  Summing up her views, she wrote Liberal Parents, Radical Children, an indictment of those who give children everything they need, of a material sort, and neglect the most important things, such as teaching them manners, how to treat members of the opposite sex, and how to live right.

At the same time she began to critique feminism, she and her husband slipped away from the liberal political ideology they had long espoused.  The McGovern presidential campaign in 1972 signified the “capture of the Democratic party by the hard Left” (p. 122).  A personal conversation with President Jimmy Carter revealed an intransigent opposition to the moderate views of Decter and her husband.  So in the ’70s they left the Left!  They “had to rethink most of what we had once thought, not only about politics but about a whole slew of things that fall under the category of what you might call the Nature of Man and God” (p. 125).  Consequently, Norman Podhoretz’s “neo-conservatism,” articulated in Commentary, helped guide the ascendant conservative movement that triumphed with Ronald Reagan. 

Because she’s known both the satisfactions of professional success and family bonds, she weeps to watch young women choosing careers rather than marriage:  “How sad it is,” she says, in a passage that rather sums up her treatise, “that the movement claiming to liberate women and give them control over their own lives should have adopted a program in which they deprive females of one of the most significant means of tasting power and control.  All the law and medical degrees in the world will not make up for what women have been losing in their relations with men, for to become tough and demanding as feminism has defined the process of their taking control is as nothing compared with being hungered for and, later on in life, indispensable” (p. 196).  

Astute, engaging, filled with the wisdom of maturity, An Old Wife’s Tale could help young women hoping to discover how to become one!


If you need a reason to be alarmed about the future of the nation’s youths, read Meg Meeker’s Epidemic:  How Teen Sex Is Killing Our Kids (Washington:  LifeLine Press, c. 2002).  A medical doctor who practiced pediatric and adolescent medicine for 20 years, a fellow of the American Academy of Pediatrics and a fellow of the National Advisory Board of the Medical Institute, she brings to this treatise both the data and the passion needed to alert us to a momentous problem.  In Elayne Bennett’s judgment, “I truly believe Epidemic is the most important book that anyone who lives or works with teenagers should read, and read now.  Not only does Meg Meeker vividly explain the problem, she explains the solution.”

In Part One Meeker declares “the epidemic is here.”  She blends both personal anecdotes and statistics to point out the pervasiveness of Sexually Transmitted Diseases.  In 1960, syphilis and gonorrhea were the two STDs that concerned physicians, and both of them could be treated if detected early.  Forty years later, there are dozens of them–perhaps 100!–and some have no known cure.  “Every day, 8000 teens will become infected with a new STD” (p. 3).  Of the sexually active teens, fully one-fourth carry an STD.  A British study indicates “that almost half of all girls are likely to become infected with an STD during their very first sexual experience” (p. 12). 

More than 45 million Americans carry an incurable herpes virus!  And kids engaging in oral sex, taking it to be “safe” since no pregnancy results, easily spread herpes throughout the population.  Sadly, President Bill Clinton’s affair with Monica Lewinsky “gave new meaning to the word ‘sex,’ and taught an entire nation of teenagers that as long as you didn’t have ‘vaginal penetration,’ you really weren’t having ‘sex'” (p. 145).  “HPV, one of the most prevalent sexually transmitted diseases,” directly causes “99.7% of cervical cancer cases and the deaths of nearly 5000 women each year” (p. 16).  Some 75% of sexually active people now carry HPV!  AIDS continues to haunt us, and increasing numbers of women now carry the HIV infection. 

Accompanying the physical problems, STDs also inflict grave emotional harm on youngsters.  Amazingly, Meeker has “asked hundreds of teenage girls whether or not they like having sex, and I can count on one hand those who said they did” (p. 78).  Severing the act from the lasting context of love and marriage renders it heartbreaking, for sex is ultimately “a spiritual experience” (p. 81).  Consequently, one of the main reasons for teen “depression is sex” severed from its proper context (p. 63).  Suicide now ranks as the third leading cause of young people’s deaths.  Fully one-third of our teens have contemplated suicide!  Rather than a joyous experience, sex has become a source of incredible pain!

“One classic example of how kids turn this rage inward is the preponderance of body piercing.  Punching holes in intimate parts of their bodies, such as their lips, tongue, belly button, or even vagina, sends a message to the world:  ‘I am hurting this intimate part of myself because I don’t like who I am.’  When girls pierce the sexual parts of their bodies, their labia and nipples (some so severely that they’ll never be able to nurse a baby), they’re saying:  ‘I am cutting on my womanhood.  This is anger turned upon the self'” (p. 72). 

All of this results, Meeker declares, from the sexual revolution birthed in the ’60s.  “With the coming of that revolution, my own generation demanded previously unheard-of sexual freedom and promiscuity.  We may have gotten what we thought we wanted, but the ride wasn’t free.  Countless children are now paying the price” (p. 33).  Yet it’s reinforced by the dominant powers of our culture.  Young women are “encouraged to expose every inch of skin they can get away with,” but “in doing so, girls are taught that their bodies are not worth protecting” (p. 73).  This, Meeker says, violates one of the most basic feminine instincts, for like self-preservation the preservation of one’s virginity is “hard-wired into our psyches” (p. 73). 

Television, arguably our most influential medium, broadcasts highly sexualized programs, with men and women sexually active, but only one percent of folks having sex on TV are married (p. 126)!  “On television today, teens are exposed to homosexual sex, oral sex, and multiple partner sex” (p. 126).  All this is done under the artifice of “artistic expression” or “free speech.”  How ironic, Meeker notes, for “Selling sex to teens is just as bad as selling them cigarettes and alcohol.  Can you imagine the public outrage of parents if movies, magazines, and music incorporated glamorous smoking imagery to the same degree they do sexual content?” (p. 140). 

The sexual revolution, of course, has been fueled by birth control devices.  For years Dr. Meeker cheerfully prescribed contraceptives for teenagers, thinking they would ensure the vaunted “safe sex” encouraged by the culture.  She failed to envision how contraceptives would contribute to the proliferation of STDs.  “While we physicians handed out oral contraceptives, chlamydia rates rose.  While we gave injections of Depo-Provera, the numbers of HPV rose.  And while we handed out condoms to teens, we saw syphilis outbreaks and genital herpes climb” (p. 95).  The proverbial law of unintended consequences seems demonstrated herein.  For though contraceptives prevent births they routinely fail to provide even minimal protection against STDs.  Condoms, especially, though they may protect against some diseases, have little value in preventing the spread of many of them.  Importantly, in giving teenagers condoms, adults have informed them that they aren’t expected to control their desires.  Since they’re going to “do it,” just make sure their activities cause the least harm! 

Epidemic paints a bleak portrait!  What little hope there is for our kids, as one might expect, comes from better parenting.  Kids need trustworthy parents who know what they believe and live in accordance with their beliefs, who care for them, who insist on good behavior.  Importantly, kids want strong family structures.  “Kids like having someone they love set high standards because it demonstrates faith that they can meet these standards” (p. 220).  Other adults–in family, church, school, or neighborhood–also help.  Sexually active teens are seeking something lacking in their lives.  If that something is satisfied by loving adults, they are less likely to go astray.  And they will be spared the anguish of the epidemic sweeping the nation. 

134 The Battle for the Trinity


We are, by nature, word-shaped and word-shaping creatures.  Consequently, we define and debate the meaning of words.  “What is the good of words,” asked G.K. Chesterton, “if they aren’t important enough to quarrel over?  Why do we choose one word more than another if there isn’t any difference between them?  If you called a woman a chimpanzee instead of an angel, wouldn’t there be a quarrel about a word?  If you’re not going to argue about words, what are you going to argue about?  Are you going to convey your meaning to me by moving your ears?  The Church and the heresies always used to fight about words, because they are the only things worth fighting about” (The Ball and the Cross {NY:  John Lane Co., 1910}, p. 96). 

Above all, words regarding the Trinity are worth fighting about!  As Emil Brunner, one of the giants of 20th century theology, said:  “We are not concerned with the God of thought, but with the God who makes His Name known.  But He makes His Name known as the Name of the Father; He makes this Name of the Father known through the Son; and He makes the Son known as the Son of the Father, and the Father as Father of the Son through the Holy Spirit.  These three names constitute the actual content of the New Testament message” (The Christian Doctrine of God, trans. Olive Wyon {Philadelphia:  Westminster Press, 1974}, p. 206).  No doctrine has been more essential–or more disputed–for 2000 years, for on it the Christian faith stands or falls. 

As we enter the 21st century, one of the strongest challenges to traditional trinitarianism is feminist theology, and probably the best assessment of that threat is found in Donald Bloesch’s The Battle for the Trinity:  The Debate over Inclusive God-Language  (Ann Arbor:  Servant Publications, c. 1986).  In her foreword to the book, the distinguished biblical scholar Elizabeth Achtemeier noted that feminist theology–much of which she contends is a return to “Baalism”–had impacted “the liturgy and worship of the church, its governing bodies, its witness, its doctrine, and its sacred literature” (p. xi).  She believes that “several feminist theologians are in the process of laying the foundations for a new faith and a new church that are, at best, only loosely related to apostolic Christianity” (p. xi).  Anticipating Achtemeier’s concern about Baalism, the great Jewish biblical scholar and philosopher Martin Buber noted that ancient Israel’s prophets forever struggled against the pagan religions of Canaan that featured mother goddesses in “which the inherent dynamism of nature is worshipped as the force which procreates life, and always more life.”  Such worship contradicted the way of Jahweh, and such female deities, Buber said, threatened both “the purity of the faith” and “the humanity of women”  (p. 40).  

While Elizabeth Achtemeier sympathizes with some of the pain responsible for the feminist movement, she refuses to justify its theology and warns that some of its most significant proponents seek to replace the Christian God with “a god or goddess” of their own making.  Consequently, she wonders why Christian scholars have “neither admitted any responsibility for current feminist misinterpretations of the Bible nor mobilized any effort to correct those misinterpretations.  On the contrary, many educators seem simply to accept feminists’ positions without questioning the fundamental theological issues involved” (p. xiii). 

Donald Bloesch–Achtemeier notes–had the courage and conviction to question such issues in The Battle for the Trinity.   Re-reading his treatise 20 years after it was written, reflecting upon developments during intervening years, one is struck by the prescience of his insights.  He listed some of the changes then proposed for mainline churches.  For example, a “United Church of Christ document says that we should ‘avoid use of masculine-based language applied to the Trinity as in ‘Father, Son, and Holy Ghost.'”  We are also instructed to avoid the use of masculine pronouns and adjectives in reference to God, such as he or his.  We are even asked to abandon masculine role names for God including “Lord,” “King,” “Father,” “Master,” and “Son” (p. 2).   

“At a United Methodist General Conference in Baltimore in May of 1984, Methodists were urged to begin finding new ways of referring to deity, such as alternating male and female pronouns or using genderless terms” (p. 2).  Inclusive language reformers especially targeted biblical translations.  Thus Princeton Theological Seminary’s Bruce Metzger, one of the world’s greatest scholars and the chairman of the RSV translation committee, disavowed such tinkering with the wording of the New Revised Standard Version, declaring:  “The changes introduced in language relating to the Deity are tantamount to rewriting the Bible.  As a Christian, and as a scholar, I find this altogether unacceptable” (p. 4).  The illustrative list need not be extended, for a visit to most any mainline church–or a reading of various contemporary scholars–will document the success of the terminological turn.  The recent controversy over quite modest moves to embrace “inclusive language” in the Revised New International Version indicates that the issue is now moving from mainline to evangelical circles. 

Probing beneath the new language, Bloesch explained the theological developments responsible for it.   This is what makes his treatise one of the best available.   Despite its modern expressions, an ancient tendency infuses feminism–the effort to shift from a Trinitarian to a Unitarian understanding of God, to envision Him as an immanent (virtually pantheistic) Power rather than a transcendent Person.  This becomes clear when reading the American women (e.g. Mary Daly; Rosemary Reuther; Nancy Hardesty; Elisabeth Schussler-Fiorenza) and men (e.g. Paul Jewett; Robin Scroggs; John Cobb) who have supported and contributed to feminist thought.  Among these thinkers, the German theologian Jurgen Moltmann (a major architect of “liberation theology”) has been especially influential, for he envisioned God as “bisexual,” contending the Shekinah denotes a “feminine principle within the Godhead” (p. 6).  Even earlier, Paul Tillich, who deeply influenced great numbers of 20th century theologians, espoused what he called an “‘ecstatic naturalism,'” and portrayed God “‘as Spirit or Spiritual Presence, and Spirit, it seems, is conceived basically as feminine rather than masculine'” (p. 7).  

While some feminists seek simply to do theology in a “different voice,” many more use it as a weapon, taking up arms as partisans, waging political battles against the Church and her traditions, seeking to establish a new religious regime.  Somewhat representative of the latter is Harvard University Divinity School Professor Elisabeth Schussler-Fiorenza, who boldly inveighs “‘against the so-called objectivity and neutrality of academic theology.'” She espouses a theology that “‘always serves certain interests'” and pledges “‘allegiance'” to a “‘partisan'” theology that becomes “‘incarnational and Christian'” by championing the cause “‘of the outcast and oppressed'” (p. 84).  (Her baneful influence has been recently evaluated by Eamonn Keane in A Generation Betrayed:  Deconstructing Catholic Education in the English-Speaking World.) 

Consequently–and most importantly–says Bloesch, “The debate in the church today is not primarily over women’s rights but over the doctrine of God.   Do we affirm a God who coexists as a fellowship within himself, that is, who is Trinitarian, or a God who is the impersonal or suprapersonal ground and source of all existence?  Do we believe in a God who acts in history, or in a God who simply resides within nature?  . . . .  Do we believe in a God who created the world out of nothing or in a God whose infinite fecundity gave rise to a world that is inseparable from his own being?” (p. 11).  Bloesch believes that the new feminist gospel is a resurgent Gnosticism, a refurbished Neoplatonic mysticism, that allows one to portray God in accord with one’s own desires rather than taking Him as revealed in Scripture and Christ.   Our God-talk either reveals to us truths concerning Him, or it’s nothing but man-made symbols ever groping for more satisfactory images of Him. 

After a probing discussion of what “symbol” means, Bloesch concludes that “Symbols may be either metaphors or analogies, and these are not the same.  I agree with Thomas [Aquinas] and [Karl] Barth that analogical knowledge is real knowledge, whereas metaphorical knowledge is only intuitive awareness or tacit knowledge” (p. 21).  Many biblical words are obviously metaphors–thus God is a Rock, in the same sense that my wife is my anchor.  Other words are analogies–God is Father or Lord, in the same sense that Roberta is my soul-mate.  A true analogy is not figurative but real.  So, as Hendrikus Berkhof said:  “‘When certain concepts are ascribed to God, they are thus not used figuratively but in their first and most original sense.  God is not ‘as it were’ a Father; he is the Father from whom all fatherhood on earth is derived” (p. 25).  Importantly, we must “understand that it is not we who name God, but it is God who names himself by showing us who he is'” (p. 25).  So when we refer to God as “Father” we are using a symbol appropriate to Him.  “Such words as Father, Son, and Lord, when applied to God, are analogies,” Bloesch says, “but they are analogies sui generis.  They are derived not from the experience of human fatherhood or sonship or lordship, but from God’s act of revealing himself as Father, Son, and Lord.  They are therefore more accurately described as catalogies than analogies insofar as they come from above” (p. 35).

            Upholding the Church’s traditional language preserves her confidence in God the Father, Creator of all that is.  Fathers bring into being beings apart from themselves.  In a sense, they are separate from and transcend the creative process.  Mothers, of course, bring into being beings conceived within their wombs.  Consequently, “Goddess spirituality is a perennial temptation in the life of the church, but it must be firmly rejected, for it challenges the basic intuitions of faith–that God is Lord and King of all creation, that the world was created by divine fiat rather than formed from the being of God as an emanation of God, that God utterly transcends sexuality.  Whenever biblical theism is threatened by philosophical monism, whether this takes the form of pantheism or panentheism, theologians must be vigilant in reaffirming the biblical principle of the infinite qualitative difference between God and the world . . . and the absolute sovereignty of God over his creation” (41).

* * * * * * * * * * * * * * * * * *

Mary A. Kassian writes as an Evangelical woman initially attracted to the feminist position but eventually disillusioned by its radical and ultimately non-Christian implications.  In her work, The Feminist Gospel:  The Movement to Unite Feminism with the Church (Wheaton:  Crossway Books, c. 1992), she provides an overview of feminist thought and a distinctly evangelical response.  She notes, for example, that “The major thesis [i.e. the equality of the sexes, as asserted by Margaret Mead] proposed by Christian feminists in the early 1960’s was identical to the thesis of secular feminism” (p. 31).  They sought liberation through “a castrating of language and images that reflect and perpetuate the structures of a sexist world.”  By “cutting away the phallocentric value system imposed by patriarchy,” Mary Daly said, they could design a better world in their own image (p. 70).   Whereas traditional Christians tried to glorify God and serve Him, feminists “shifted the emphasis:  God’s purpose was to assist humans to realize liberation, wholeness, and utopia for themselves” (p. 95).  Embracing feminism freed women from the “hisstory” that ignored them.  They could declare their own truth–write “herstory.”  Accordingly, everything can be questioned, there are no absolutes, meaning is socially constructed, and allegedly “natural” realities or ethical standards need not be heeded.

The process began incrementally, slowly and subtly so as to elicit a minimum of opposition.  First it was suggested that language for human beings be made inclusive.  Thus the generic “man” was anathematized.  “Mankind” was replaced by “humankind.”   A committee chairman must be called a “chair.”  Victorious in changing terms regarding fellow humans, the inclusivists then shifted to loftier terrain and proposed that pronouns referring to God could be changed without seriously changing one’s understanding of God.  You could, in fact, simply use the word God incessantly, abolishing the need for pronouns.  Gaining ground, they then “began to take greater liberties with interpretive hermeneutic methods, using women’s experience as the norm” (p. 171).  So Letty Russell “concluded that experience equals authority.  She stated that ‘the Bible has authority in my life because it makes sense of my experience and speaks to me about the meaning and purpose of my humanity in Jesus Christ.'”  Accordingly, the biblical “text only has authority as I agree with it and interpret it to my experience” (p. 171). 

Such women then felt free to re-vision and re-write reality in accord with their own experiences.  Re-casting reality in accord with themselves, they felt free to re-name God as well.  Hinting at things to come, a popular musician, Helen Reddy, accepting a Grammy Award for her 1972 song “I Am Woman” (“I am strong, I am invincible, I am woman”), said:  “I’d like to thank God because She made everything possible” (p. 135).    Such re-naming efforts, Kassian found, “logically led to an erosion of God’s independent personality.  God became a ‘force.'”  This was manifestly clear when an erstwhile “evangelical feminist,” Virginia Mollenkott, moved from calling God ‘He/She’ to ‘He/She/It'” (p. 145).  Or, one might add, “Whatever”!

Ultimately Kassian concluded that “Feminism and Christianity are like thick oil and water:  their very natures dictate that they cannot be mixed” (p. 217).  She fully agrees with Virginia Mollenkott that “The language issue is anything but trivial” (p. 237).  But because that’s true she rejects Mollenkott’s conclusions.  Indeed, Kassian says:  “It is important to understand that it is not we who name God, but it is God who names Himself by showing us who He is.  In the book of Exodus, God calls Himself “I am who I am” (Exod. 3:14).  He also reveals Himself as Lord and Master (Adonai), Self-existent One (Jehovah Yahweh), God Most High (El Elyon), and the Everlasting God (El Olam).  In the New Testament Jesus Christ is revealed as Lord (Kyrios) and Son, and the first person of the Trinity is called Father and Abba (dear Father).  The names of God are God’s self-designation of His person and being.  Such names do not tell us who God is exhaustively, but they are informative symbols having a basis in revelation itself” (p. 243). 

            Still more:  “God has a name, ‘I AM who I AM’ (Exod. 3:14).  The name of God is important.  The  symbols of faith that compose the Biblical witness–in the form of God’s own name–have been elected by God as means of revelation and salvation.  To challenge or change the name of God as God has revealed it is a denial of God.  It is a denial of who God is.  It is by God’s name that we know Him, it is by His name that we are saved, and it is by His name that we are identified.   Feminism’s attempt to rename God is a blasphemy that comes out of the very depths of Hell.  We have no right to name God.  The only right we have, as created beings, is to submit to addressing God in the manner He has revealed as appropriate.  It is not we who name God, it is God who names Himself” (p. 244). 


Another recent critique of feminist theology, from a Roman Catholic and European perspective, is Manfred Hauke’s God or Goddess?  Feminist Theology:  What Is It?  Where Does It Lead?  (San Francisco: Ignatius Press, c. 1995).  To explain some of the latent assumptions of feminism, Hauke takes us back to the Utopians of the 19th century, Saint-Simonist socialists, who referred to God “as both Father and Mother, as ‘Mapah'” (p. 21).  They further imagined that in the beginning “there was a mixed male and female being” (p. 21), and reasoned that there is no rigid difference between the sexes, postulating an androgynous ideal still embraced by radical feminists.  Consequently, modern thinkers, such as Simone de Beauvoir, assert:  “‘One does not arrive in the world as a woman, but one becomes a woman.  No biological, psychological, or economic fate determines the form that the female human being takes on in the womb of society'” (p. 29).   This androgynous assumption concerning male and female leads “Christian feminists” to insist that God is likewise androgynous.  So “Father” must be instantly coupled with “Mother” to fully name God.

These utopian socialist roots of the feminist movement are clear in the work of Rosemary Reuther, one of the most highly regarded feminist theologians, who praised the work on the family by Friedrich Engels, calling it “the fundamental text for consistent feminists” (p. 50).  She also spoke highly (perhaps influenced by Betty Friedan’s Stalinist views) of Russia’s Communist revolution “and praised the China of Mao Tse-Tung” (p. 50).  The revolution Reuther envisions, of course, is theological and ecclesiological, but the same Marxist antipathy to all forms of hierarchy is clear.  Whereas Mao overturned traditional Chinese society, she seeks to destroy the “hegemony” of the patriarchal Church and replace it with a kinder, gentler version.  To do so, one must destroy the “one-sidedly masculine symbols like ‘Father,’ ‘Lord’, and ‘King’.  Only then will the ‘male Church’ disappear.  Alongside the ‘our Father’, some then place an ‘our Mother’; ‘Jesus Christ’ is supplemented by ‘Jesa Christa’; while the third Person appears as the ‘Holy Spiritess'” (p. 49).

More radical than Reuther, more deeply rooted in Marxism, Mary Daly, long a professor at Boston College, wrote the influential Beyond God the Father and rejected all hierarchical structures.  Neither God, nor any man, would stand above her.  (After successfully refusing, for 20 years, to admit men to her classes, Daly was recently dismissed from BC as a result of a lawsuit brought against her for such discrimination!)  “For Daly, God’s Incarnation as a male human being is the decisive reason for rejecting Christianity.  ‘Christ-worship’, Daly says, ‘is idol-worship'” (p. 193).    Equally radical, if not more so, is Jurgen Moltmann’s wife, Elizabeth Moltmann-Wendel, who tries to blend Israel’s Yahweh with a Canaanite fertility goddess, Astarte–encouraging the worship of “Yahweh/Astarte.”  To Moltmann-Wendel, a loving God, acting like a mother, “would ‘unconditionally’ accept even the immoral person.”  Rather than worry about or confess our sins, we can simply affirm that:   “‘I am good.  I am whole.  I am beautiful'” (p. 169).

Having rather exhaustively studied the works of the most prominent modern feminists, Hauke concludes that they have clearly departed from the Christian faith.  He shares the view of Elke Axmacher, a professor of theology at Berlin University, who says:   “A feminist approach to language about God has as much chance of success as an atheist approach to belief in God” (p. 60).  To the extent that the Church tolerates it, he warns, a new religion will develop.

133 Real (Christian) Ethics

 

Most folks readily give “opinions” on various moral topics ranging from capital punishment to terrorist attacks.  Like the ancient Sophist, Protagoras, they take man to be “the measure of all things,” and every man supplies his own measuring rod.  To make decisions, many adopt pragmatic or utilitarian positions–assenting to democratic decisions favoring the greatest good for the greatest number, or whatever enables one to live more “successfully.”  But few of them venture to defend their “opinions” as moral absolutes, timelessly true.  Under the guise of tolerance, only a few evils–such as racism–elicit condemnation.  There are no absolutes, no “objective” truths.  So Hermann Goering quipped:  “I thank my Maker that I do not know what objective is.  I am subjective” (quoted in J.C. Fest, The Face of the Third Reich {Penguin, 1983}, p. 121).  Goering became a Nazi, he said, not because he took seriously Hitler’s ideology.  Rather, he found it a vehicle whereby to vent his revolutionary passion, his nihilistic feelings, his hunger for vandalism and destruction.  If it feels good, do it!

Such popular positions closely mirror views advanced in academia, where relativism reigns.  Fashionable “postmodernists” glibly insist that there are no universal “truths,” only interesting perspectives.  To John Rist, “the surcease of ethics can be seen to be parallel to and inextricable from the replacement of truth by assertion” (p. 151).  Learned professors, refusing to discriminate between good and evil, propound a fashionable nihilism that denies “reality” exists–only one’s interpretation of it.  Such intellectuals, as Tom Wolfe shrewdly observed decades ago, personify a “radical chic,” hosting terrorists and murderers at Manhattan cocktail parties.  They easily become “downwardly mobile,” taking seriously the criminal underclass or “gangsta rap” music, pretending to identify with the world’s “poor and oppressed,” defending despots like Fidel Castro or Palestinian assassins.

In the midst of such moral nihilism, an eminent classicist and philosopher, John M. Rist, now Professor Emeritus at the University of Toronto, argues that of all the ethical “theories” advanced across the centuries, the moral realism of Platonism provides the “only coherent account” yet designed, and he defends that stance in Real Ethics:  Rethinking the Foundations of Morality (Cambridge:  Cambridge University Press, c. 2002).  Responding to the widely-held notion that there is no metaphysical foundation, no higher source, for moral beliefs, Rist shows how Plato–who perennially pondered “How should I live?”–dealt with the same issue.  “He came to believe that if morality, as more than ‘enlightened’ self-interest, is to be rationally justifiable, it must be established on metaphysical foundations, and in the Republic he attempted to put the nature of these foundations at the centre of ethical debate” (p. 2).   

                The struggle between Socrates and Thrasymachus, in Plato’s Republic, illustrates the enduring difference between foundationalism and perspectivism.  Widely espoused by modern academics, perspectivism holds that “truth” cannot be found, so we must construct theories to realize our desires.  Following Thrasymachus, perspectivists like Nietzsche declare that we’re free to devise our own morality and seek, whenever possible, to subject others to our machinations.  There is no higher law, no transcendent source, for ethics.  In Nietzsche’s phrase, we may go “beyond good and evil,” devising our own rules for life.  The debate between Socrates, who insisted that morality is given to us from a higher source, and Thrasymachus, writes Rist, “is a debate between a transcendental realist and an anti-realist who disagree about the possibility of morality” (p. 19).  To Socrates and Plato, morality has a metaphysical basis.  Working within this tradition, theistic Platonists (especially Augustine) discerned that “God can create trees and men, men can make tables, but goodness and justice are not created by God (nor, it follows, by man), but subsist in God’s being or nature” (p. 38). 

Having established his benchmark in Plato and Platonic theism, Rist then compares a variety of ethicists in the history of philosophy who have sought to establish other bases for morality.  Epicurus and Machiavelli, Hobbes and Kierkegaard and Kant are carefully scrutinized, and Rist shows that despite many differences they all share a potent anti-realism.  Interestingly enough, all their “alternatives to ‘Platonic‘ objectivity in ethics may be forms of the claim–becoming explicit only after Kierkegaard but much indebted to him–that what matters in the world is what we prefer, what we choose to be ‘natural’, what we choose as our own–and precisely because we autonomous beings choose it as our own” (pp. 59-60).  Everything reduces itself to what I desire, what I know, what I can be.  Whether in self-help seminars or self-esteem educational publications, it’s clear that a fixation upon the self reigns in modern culture.  In its Nietzschean version (given a “Christian” twist by Paul Tillich), we’re encouraged to accept “ourselves as we are now, not in any responsibility for our actions, but simply in the being what we are” (p. 220).  Consequently, there has emerged–as is evident in various lawsuits and political appeals–“a choice-based, rights-claiming, largely consequentialist individualism, usually dressed up in democratic clothes” (p. 241).                

                In contrast, “For Plato what matters most about human beings is less that they can reason (though to some degree they can and that is important) than that they are moral or ‘spiritual’ subjects, capable of loving the good . . . and hence determining for themselves what kind of life they should live:  that is, whether we should live in accordance with a transcendent (and in no way mind-dependent) Form of the Good . . . or whether we should opt for the alternative life of force, fraud and rationalization, with, as its theoretical counterpart, the denial of metaphysical truths and concentration on the maximization of our desires:  a life in which reason is and ought only be . . . the servant of the passions, tyrannical as those passions will be over both ourselves and others” (pp. 86-87).   So too Aristotle, though he differed with his teacher in important ways, is a Platonist insofar as he emphasized the metaphysical foundations for ethical principles.  However foreign it may seem to modern philosophers, Aristotle thought that “there is something godlike about man,” namely his contemplative potential.   So endowed, despite our imperfections we can behold a transcendently perfect realm, “perfectly existing outside of man and independent of man’s control.  Man is not for Aristotle ‘the measure of all things’, but a variable creature” who finds his greatness through his awareness of and submission to “a superior being” (p.  145).  And Thomas Aquinas, Rist argues, was doubtlessly “a Platonist in that he believes in an ‘eternal law’ which is roughly the Platonic Forms seen in an approximately Augustinian manner as God’s thoughts” (p. 151). 

Though Rist’s placing of Aristotle and Aquinas alongside Plato may initially jar those of us who stress their differences, he makes his point persuasively, and I think he rightly insists that they are all moral realists.   Similarly, his insistence that the only answer to our need for ethical direction lies in a recovery of Platonic Realism makes sense.  What is good for us, as individuals, is what is good for mankind.  The common good, not the individual good, should concern us.  Ultimately, as Plato held, the “common good will itself depend on the fact of God as a transcendent Common Good, who has made man with his specific needs and limitations and thus gives intelligibility to a common good which is (or should be) the object of human striving in social and political life” (p. 241). 

                More importantly, following Plato leads us to God!  We cannot live rightly apart from God.  Knowing what we ought to do does not imply we can do it.  As the Roman poet Ovid lamented, we generally “know the better and follow the worse.”  Without the empowering presence of God, we routinely fail to attain goodness.  To put it “in traditional terms, for morality to function God must function both as final and (at least in great part) as efficient cause of our moral life” (p. 257).  Those philosophers who construct moral systems based upon prudence or self-interest merely dream utopian dreams.  Theists who hope to establish links with atheists, sharing ethical principles, fail to grasp the fact that purely natural norms ultimately collapse into those of Protagoras and Thrasymachus.

                What Plato shows us Christians is that real ethics must be rooted in Reality.  With him, we must realize “that to be moral is not only to be rational, but also, far more importantly, to be godlike insofar as we are able–as Plato also said, agreeing with the Old Testament’s ‘You shall be as gods'” (p. 276).

* * * * * * * * * * * * * * * * * * * *

When Professor Rist was asked to deliver The Aquinas Lecture, at Marquette University, in 2000, he titled his presentation On Inoculating Moral Philosophy Against God  (Milwaukee:  Marquette University Press, 2000), reducing to a few pages some of the more detailed views he set forth in Real Ethics.  As an Aquinas lecturer, he joins some of the most distinguished philosophers of the 20th century, including Mortimer J. Adler, Yves Simon, Jacques Maritain, and Werner Jaeger.  The event provides a pulpit for distinguished philosophers, an opportunity to amplify their convictions.  Rist’s desire, in this lecture, was “to expose and attempt to correct a rather mysterious phenomenon, that of a group of theistic, indeed Christian, philosophers who act as though it makes no great difference in ethics whether God exists at all, who seem inclined to assume that the question of whether there can be moral truths at all in his absence can be lightly put aside” (p. 96).  To use St. Paul’s terms, addressing Christian philosophers, “be not conformed to this world, but be transformed by the renewing of your minds.” 

Eminent ethicists today, such as Harvard University’s highly influential John Rawls, have openly sought to establish a public, political morality in purely secular, implicitly atheistic terms.  Thus J.L. Mackie titled his influential textbook Ethics:  Inventing Right and Wrong.  Such thinkers are working out the legacy of Immanuel Kant’s “Copernican Revolution”–the declaration “that theoretical reason was essentially impotent, and certainly has nothing to contribute to ethics” (p. 84).  To Kant, and his multiplied heirs, “practical reason” constructs morality in accord with human autonomy.  Nothing metaphysical, nothing supernatural, is knowable.  So man designs his own rules.  Whoever persuades–or coerces–others to follow him writes the laws of the land.  The Kantian “revolution” in philosophy, Nietzsche recognized, forces one to acknowledge that “morality either depends on God or it depends on the will and rationality of man.  We either find it or invent it; it rests either on fact or on choice” (p. 94).  Without God, whom Nietzsche declared to be “dead,” all things are possible.

Rist’s concern is the split that has severed philosophy from theology–a disastrous dichotomy underlying much that’s wrong with the modern world.  What he wants to recover is an Augustinian approach that envisions theology as “an advanced form of philosophy, a philosophy, that is, in which more data is available (even though by ‘belief’ and ‘in hope’ rather than by knowledge)” (p. 19).  Though he calls this position “Augustinian,” it is more broadly “the mind of the early Church at least from some time in the second century, in the days, let us say, of Justin Martyr” (p. 19).  Augustine incorporated Plato’s philosophy into his theology, but he pushed beyond Plato and relied upon divine Revelation for ultimate truths.  In Plato Augustine found reference to God and the Logos–but only in Christ did he behold the Logos incarnate.  “It was above all the Platonist picture of God,” says Rist, “as transcendent and as the source and nature of value, which appealed to the developing Christian thinkers, especially when coupled with a theory of the return of the soul through love to God” (p. 87). 

                The same needs to be done by Christian ethicists today, says Rist, for “believers in the Christian religion must propose an understanding of the virtues which is impossible for pagans:  which indeed is only possible for those who believe in a God” (p. 37) Who is the loving Creator of all that is.  To see God as revealed Love, Augustine thought, enables one to “claim that love itself is the basis of the other virtues, which thus become, in his language, ‘nothing other than forms of the love of God'” (p. 38). Augustine’s position, anchored in Plato’s metaphysics, provides modern Christians a way to respond to modern moralists.  Christians must clearly set forth and defend an alternative to the secular model.  Indeed:  “The theistic tradition of which some of us believe that Christianity is the developing fulfillment, started, as Augustine recognized, with Plato.  It is not just any metaphysics which can provide an adequate philosophical framework for the truths of Christianity, but a Platonizing framework” (p. 85). 

* * * * * * * * * * * * * * * * * * * *

                Rist’s roots in the thought of St. Augustine stand clear in his Augustine:  Ancient thought baptized (Cambridge:  Cambridge University Press, c. 1994), written to describe “the Christianization of ancient philosophy in the version which was to be the most powerful and the most comprehensive” (pp. 1-2).  In Augustine one finds a unique thinker, fully open to the truth of philosophy and devoutly committed to the authority of Scripture and Church.  Many “truths about the Truth had been discovered by the Platonists” (p. 62).  But the “Truth” can be nothing less than God!  The “forms” Plato discerned by reason are “illuminated” by God for Augustine, lifted to a level of clarity and certainty through Revelation.  “When we learn something, he observes . . . our sources are intelligence (ratio) and ‘authority’; he himself has determined never to depart from the authority of Christ” (p. 57).  Only He “is the way, the truth, and the life.”

To Plato, first-hand knowledge (episteme) excels second-hand knowledge (doxa), beliefs which may or may not be true.  To move from beliefs about, to knowledge of, what is, is the calling of truth-seekers.  To Augustine, so long as “we long for truth or wisdom we remain ‘philosophers,'” but the happiness we more deeply desire results from a rightly ordered love (p. 51).  To see truths about God may satisfy our minds, but to love God, with heart, mind, and soul, satisfies the soul.  And the reality and nature of the “soul” certainly concerned St. Augustine.  To know God and the soul, he thought, were the two great goals of man.  As Rist devotes a chapter to “soul, body and personal identity,” he makes clear that Augustine refused to reduce the soul to the body, ever insisting that there is a non-material essence to a human being, denouncing the “mad materialism” of Tertullian.  By nature we are, he said, both body and soul, mysteriously, indeed miraculously, “blended” together.  The body is the temple of the soul, worthy of reverence, and the body will be resurrected in the end, fulfilling God’s design for us. 

Failing to fulfill that design–the imago dei–results, Augustine held, from a weakness of our will, the result of Adam’s fall.  As he expressed it in his Confessions, “it is I who willed it, and I who did not–the same I” (8.10.22).  Impaired by sin, the lack of love, “man is unable to choose the good (non posse non peccare), either in the sense that his good actions are never ‘wholly good’, because not motivated by pure love . . . or in the sense that at some point the will will certainly choose evil” (p. 132).  Thanks be to God, however, His grace rescues us, restoring the freedom of the will, granting us sufficient strength to rightly respond to His love. 

And it’s for love we are designed, to love we are called.  “The whole life of a good Christian is a holy desire,” he said (On John’s Epistle, 4.6).  A good man is impelled by “blazing love” (The Happy Life, 4.35).  Love, of course, may be perverted–loving self or things rather than God.  But rightly ordered, informing the virtues, love for God and neighbor constitutes the good life.  “When Augustine wishes to express the goal of the good life, he often speaks of the need to be ‘glued’ to God or ‘to cleave to God in love'” (p. 162).  Rightly loving God enables one to love others.  “For we are justified in calling a man good,” Augustine wrote in The City of God (11:28), “not because he knows what is good, but because he loves the good.”  And he is able to love because God’s grace enables him to.  Consequently, Augustine’s famous injunction, Dilige et quod vis fac (“Love and do what you will”), has little in common with ancient antinomianism or contemporary “situation ethics.”  Rather, real love impels one to “wish to want what God wishes, loves and commands, and God wishes, loves and commands only what is constitutive of his own nature.  God’s nature is by definition unchanging; hence God’s love will be ‘eternal’, and hence we have an ‘eternal law'” (p. 191). 

                After dealing, in detail, with other aspects of Augustine’s thought, Rist concludes his book with a chapter entitled “Augustinus redivivus.”  Granting that Augustine has been misread and misinterpreted–note Calvin’s take on his view of predestination, for instance–we should seriously study and courageously proclaim “the power and persuasiveness of many of Augustine’s ideas, and the perspicacity of many of his observations” (p. 292).  Reading Rist enables one to understand how this should be done.  He writes for scholars, and his works require disciplined attention.  But the rewards are worth the effort.  Few philosophers offer meatier material for Christians seriously committed to the truth and its proclamation.

132 Dissolving Materialism

Materialism, both scientific and philosophical, undergirds modernity.  The physical world, ourselves included, must be reduced to simple material entities, and if we understand them we understand everything.  This was proclaimed by Julien Offray de la Mettrie in the 18th century, who asserted, in L’homme machine (1747), that the mind and the brain are simply two words for a single material entity.  Essentially the same is declared by “evolutionary psychologists” such as MIT’s Steven Pinker today.  Man himself can be fully explained in terms of cells and neurons, following mechanical biological and chemical laws.  There are but slight differences of degree separating man from other animals, and to understand him the empirical sciences alone provide the key.  Reducing man to a machine, portraying the mind as a purely material entity–akin to the clockwork universe derived from Newton’s Laws–provides the foundation for secularism.

Countering such a worldview with the best of recent scientific research stands Jeffrey M. Schwartz, a professor of psychiatry at the UCLA School of Medicine, who with Sharon Begley has written a fascinating and persuasive treatise, The Mind and the Brain:  Neuroplasticity and the Power of Mental Force (New York:  HarperCollins Publishers, c. 2002).  This book builds upon the research he’s engaged in for 20 years, blending it into far-reaching philosophical conclusions, for “If materialism can be challenged in the context of neuroscience, if stark physical reductionism can be replaced by an outlook in which the mind can exert causal control, then, for the first time since the scientific revolution, the scientific worldview will become compatible with such ideals as will–and, therefore, with morality and ethics” (pp. 52-53).  He argues, armed with recent research breakthroughs, a view earlier advocated by noted neurologists such as Wilder Penfield, Charles Sherrington, and John Eccles–impeccably qualified scholars who (generally after a lifetime of study) concluded that there’s simply something more to the mind than the brain.  As Penfield said, in 1975, “‘Although the content of consciousness depends in large measure on neuronal activity, awareness itself does not . . . .  To me, it seems more and more reasonable to suggest that the mind may be a distinct and different essence'” (p. 163).

Materialistic assumptions–not accurate scientific data, Schwartz says–explain the deeply rooted belief that the brain, as a biological entity, fully explains our thinking processes.  Fleshed out in the highly influential writing of behaviorists such as John Watson and B.F. Skinner, or of psychoanalysts such as Sigmund Freud, materialism scoffed at free will and any alleged ability of the thinking person to transcend the mechanical activities of his brain.  To materialists, reference to any immaterial “mind” denotes the superstitions of a pre-scientific era.  Taking their position, of course, eliminates the possibility of consciousness (“knowing that you know” [p. 26]), free will and moral responsibility.  Indeed:  “The rise of modern science in the seventeenth century–with the attendant attempt to analyze all observable phenomena in terms of mechanical chains of causation–was a knife in the heart of moral philosophy, for it reduced human beings to automatons” (p. 52).

Early enchanted by the mysterious inner workings of thought processes, Schwartz began to do research with people suffering obsessive-compulsive disorders (e.g., repetitively washing one’s hands).  Drawing upon the Buddhist notion of mindfulness, he taught them to stand apart from their compulsive thoughts, to evaluate and consciously correct them, allowing their “minds” to give directions to their “brains.”  Such therapy did more than help his patients, however, for with the assistance of PET data Schwartz began to document the amazing plasticity of the brain.  “This was the first study ever to show that cognitive-behavior therapy–or, indeed, any psychiatric treatment that did not rely on drugs–has the power to change faulty brain chemistry in a well-identified brain circuit.  What’s more, the therapy had been self-directed, something that was and to a great extent remains anathema to psychology and psychiatry” (p. 90).  The conscious mind, supervising brain activities, actually re-wires the brain!

Schwartz’s neurological research linked him up with Henry Stapp, an eminent physicist working at the Lawrence Berkeley National Laboratory in Berkeley, California, who has devoted his scholarly career to the study of quantum physics.  Working out the implications of quantum theory as enunciated by John von Neumann, the great Hungarian mathematician who said “‘that the world is built not out of bits of matter but out of bits of knowledge–subjective, conscious knowings’” (p. 31), Stapp had, fortuitously, reached conclusions paralleling Schwartz’s:  as is evident in Mind, Matter and Quantum Mechanics, quantum physics also reveals how the immaterial “mind” shapes the material world.  “‘The replacement of the ideas of classical physics by the ideas of quantum physics,’” says Stapp, “‘completely changes the complexion of the mind-brain dichotomy, of the connection between the mind and the brain’” (p. 48).  The reigning assumption, entrenched since Descartes, that only material entities could causally affect other material entities, dissolves in the world of quantum mechanics.  This is illustrated by the phenomenon of nonlocality, perhaps one of the most important scientific breakthroughs in the history of science.  Quantum physics shows that a specific “action here can [instantly] affect conditions there” (p. 348), even though here and there are light years apart!  Physical causation requires no material medium!  So Stapp and Schwartz both now believe that the power of the will, freely exercised, independent of physical stimuli, “generates a physical force that has the power to change how the brain works and even its physical structure” (p. 18).

In the process of building his philosophical case, Schwartz provides an extensive and fascinating discussion of what we know about the brain, a truly marvelous and mysterious three-pound ball of neurons.  He details how the brain develops, how it responds to various stimuli, how experiments with monkeys have opened for us deeper understandings of how it functions.  Virtually all the studies he discusses–and the high-level scholarly conferences he’s attended–have taken place during the past decade, and one easily grasps how up-to-date and pertinent is his presentation.  Within the past five years, for example, important and encouraging work has been done with small groups of stroke victims, who were once thought permanently disabled.  A new kind of therapy, constraint-induced (CI), reveals, for “the first time,” a demonstrable “re-wiring of the brain” following a stroke (p. 195).  Children suffering specific language impairment (SLI) may hope, given recently developed therapies, to overcome their affliction.

What’s being proved in such experiments is what researchers a decade ago widely doubted:  the reality of neurogenesis, neuroplasticity–consciously directed brain developments.   This further means we are truly free to think and to act.  Locked into classical physics, even Einstein in 1931 declared that it is “man’s illusion that he [is] acting according to his own free will'” (p. 299).  Ever resisting quantum theory, with its indeterminism, Einstein represents a worldview in the process of dissolving, Schwartz believes.  And he cites recent, carefully crafted experiments, documented in a special 1999 issue of the Journal of Consciousness Studies devoted to “The Volitional Brain:  Towards a Neuroscience of Free Will,” that demonstrate the growing openness to human freedom in the brain research community.  Much of this is to say that William James was right, a century ago, when he insisted that “Volitional effort is effort of attention.”  What we freely attend to, in our consciousness, shapes us.  “The mind creates the brain” (p. 364).  Obviously the brain is the material with which the mind works.  But mind is more than the brain.  As Anthony Burgess wrote, in A Clockwork Orange, “‘Greatness comes from within . . . .  Goodness is something chosen.  When a man cannot choose he ceases to be a man'” (p. 290).

The Mind and the Brain is one of the most fascinating books I’ve read in some time.  Dealing with some of the most difficult theoretical issues imaginable, the authors succeed in making clear the implications of the most recent scientific research.  And, equally important, they understand the philosophical implications of their study and develop them persuasively.

* * * * * * * * * * * * * * * * * *

Coming at the same issue from a very different perspective is Benjamin Wiker’s Moral Darwinism:  How We Became Hedonists (Downers Grove:  InterVarsity Press, c. 2002).  The book’s plot, as William Dembski says, is this:  “Epicurus set in motion an intellectual movement that Charles Darwin brought to completion” (p. 9).  Still more:  “Understanding this movement is absolutely key to understanding the current culture war” (p. 9).  Underlying both the ancient and the modern versions of hedonism is an anti-supernatural cosmological materialism.  Consequently, theists who see God at the center of their worldview cannot but do battle with Epicureans of every century, and Wiker wants to help arm us for active combat.

Materialism pervades virtually all branches of science, ranging from astronomy to microbiology, as naturalistic thinkers insist that everything that exists can be reduced to simply material entities.  The basic reason for this, Wiker says, is that “modern science itself was designed to exclude a designer.  Even more surprising, modern science was designed by an ancient Greek, Epicurus,” who lived three centuries before Christ (p. 18).  “The argument of this book, then, is that the ancient materialist Epicurus provided an approach to the study of nature–a paradigm, as the historian of science Thomas Kuhn called it–which purposely and systematically excluded the divine from nature, not only in regard to the creation and design of nature, but also in regard to divine control of, and intervention in, nature.  This approach was not discovered in nature; it was not read from nature.  It was, instead, purposely imposed on nature as a filter to screen out the divine” (p. 20).  To support his hedonistic ethics, to feel at ease with his lifestyle, Epicurus set forth a materialistic cosmology.  Centuries later, “Modernity began by embracing his cosmology and ends by embracing his morality” (p. 23).

Wiker develops his argument by tracing historical developments of Epicurean thought.  Embracing Democritus’s scientific hypothesis–that nothing exists but atoms-in-motion–Epicurus developed a consistent materialism that reduces moral questions to preferences of pleasure rather than pain.  Good is what feels good.  Evil is what feels bad.  So do whatever feels good, however much it may change from time to time and place to place.  Epicurus’s ideas were picked up and given poetic expression by Lucretius, one of the great Latin stylists.  Though Hedonism certainly impacted the ancient world, it wilted under the weight of Platonic and Aristotelian philosophy and the dynamic growth of Christianity.  The world is as it is, Christians insisted, because God designed it.  The godless cosmos and normless ethos of Epicurus slipped into the cellar of discarded errors as Christians shaped Western Christian Culture during the Medieval Era.  But errors are often dragged back to light, dressed up in new clothes, and such happened to Epicureanism.  During the late Middle Ages the authority of Aristotle was questioned and nominalism made powerful inroads in key quarters.  As the Renaissance developed, Lucretius was rediscovered, along with other classical texts, paving the way for the “scientific revolution” of the 17th and 18th centuries.  “We are materialists in modernity,” Wiker says, “in no small part, because we were lovers of Lucretius at the dawn of modernity” (p. 59).

Shaping modernity were gifted scientists such as Galileo and Newton, in whom Wiker sees “the vindication of atomism through the victory of mathematics” (p. 112).  Consequently, under the guidance of increasingly irreligious scientists, a triumphant worldview was established which demonstrates “the complete theoretical victory of Epicurean materialism, all the essential elements of Epicurus’s system–the eternal and indestructible atoms, the infinite universe with the unlimited number of worlds, the banishment of the creator God, the rejection of miracles, the displacement of design in nature by chance and material necessity, and the elimination of the immaterial soul–fell into place during the eighteenth and nineteenth centuries” (p. 112).  Laplace’s answer to Napoleon’s question concerning the place of God in his scientific work sums up the consummation of this process:  “Sire, I have no need of that hypothesis.”

Without God, objective morality disappears as well.  Such is starkly evident in the work of Thomas Hobbes, one of the architects of modern thought.  By nature, we war against each other; only the fittest survive–nothing is naturally right or wrong.  To secure a peaceful society, however, we assent to the rule of a sovereign, who prescribes the rules.  Hobbes also helped subvert the authority of any divinely inspired Scripture, devising an approach to interpretation consonant with his Epicurean materialism, denying the reality of the immaterial, immortal soul, questioning the possibility of miracles and of heaven and hell.  Benedict Spinoza picked up on such ideas, and the corrosive acid of biblical criticism gained momentum.  So it follows that Thomas Jefferson, who “considered himself an Epicurean and studied Epicurus in Greek” (p. 207), put together his own sacred text, entitled The Life and Morals of Jesus of Nazareth. 

Importantly, Wiker concludes, Epicureanism shaped Darwinism.  A materialistic metaphysics, evident in both positions, cannot but shape the ethical views it dictates.  Neither Epicurus nor Darwin had demonstrable evidence for their theories, but they both had a solid faith in their explanatory powers.  Eminent scientists, such as Lord Kelvin (relying on statistical probability) and Louis Agassiz (the reigning expert on fossils), resolutely critiqued the theory of evolution through natural selection.  But philosophers (Spencer and Marx) and publicists (Huxley) found it perfectly designed for their moral and social agendas.  Importantly, Wiker says, “We must always keep this in mind:  for Darwin nature did not intend to create morality, any more than nature intended to create certain species; morality was just one more effect of natural selection working on the raw material of variations in the individual” (p. 244).

In an amoral cosmos, of course, anything goes.  Thus Darwinian science has incubated Epicurean Hedonism.  Here Wiker guides us through the development of eugenics, from Darwin through Haeckel (whose books sold hundreds of thousands of copies in Germany) to Hitler himself.  Eugenics easily justifies abortion and euthanasia, also proposed by Haeckel as ways whereby to purify the race and later employed by Hitler’s henchmen.  Nearer home, Margaret Sanger embraced Darwinism and promoted various eugenic measures.  She championed birth control, for example, in order “‘To Create a Race of Thoroughbreds'” (p. 266).  Sexual activity itself, Sanger believed, should involve anything that feels good, for nothing is moral in the world of evolution through natural selection.

Even more abandoned to amorality was Alfred Kinsey, long regarded as an eminent man of science, a “sexologist” who allegedly informed the nation how people actually behaved.  Recent studies reveal that Kinsey was an incredibly perverted man, engaging in various forms of deviant behavior, including pedophilia.  His allegedly “scientific” studies were, in fact, fraudulent screeds designed to encourage the breakdown of sexual restraint.  However untrue, his views entered the nation’s textbooks and journalistic assumptions, powerfully evident in an episode on the recent PBS Evolution series, where viewers were encouraged to see the similarities between the sex life of humans and some primates called “bonobos,” who engage in all sorts of sexual activity (heterosexual and homosexual, adults with juveniles) simply for pleasure.  Consequently:  “Just as Kinsey’s views on the naturalness of premarital sex and homosexuality became the scientific foundation for the transformation of sexual morality from a Christian natural law position to that of the Epicurean, so also Kinsey’s views on the naturalness of pedophilia have become the foundation of the slow but sure revolution going on right now pushing adult-child sex as natural” (p. 285).  And, according to Darwinian principles, anything that feels good is natural and thus allowed.

Wiker sets forth a fascinating historical thesis.  To see modernity in the light of Epicurus certainly clarifies the deeply philosophical premise that shapes our culture.  To do as well as our ancient Fathers in the Faith, responding to hedonism, is clearly our challenge.

131 Islam: Past & Present

A widespread scholarly consensus exists concerning the Middle East:  to understand it historically one must read the works of Bernard Lewis, Professor of Near Eastern Studies Emeritus at Princeton University.  Having written over two dozen scholarly studies, he is well qualified to explain, in his most recent publication, What Went Wrong?  Western Impact and Middle Eastern Response (New York:  Oxford University Press, c. 2002).  For three centuries, he says, Muslims have asked this question, and it underlies much of the anger and envy evident in the terrorism that now haunts the West.  Indeed, “In the course of the twentieth century it became abundantly clear in the Middle East and indeed all over the lands of Islam that things had indeed gone badly wrong.  Compared with its millennial rival, Christendom, the world of Islam had become poor, weak, and ignorant” (p. 151).

This reversed the conditions of the world Muslims once ruled.  Following Mohammed’s death in 632 A.D., his followers rapidly conquered much of the formerly Christian world–Syria, Palestine, Egypt, North Africa, Spain, Sicily.  By 732 they were in central France, and in 846 “a naval expedition even sacked Ostia and Rome” (p. 4).  In 1453 Muslims conquered Constantinople, finally burying the last remnants of the once powerful Byzantine Empire, and added the Balkans to their hegemony.  By 1529, as Luther was orchestrating his Reformation in Wittenberg, Muslim armies threatened Vienna, only to be repelled by Charles V.  Indeed, “Islam represented the greatest military power on earth,” Lewis says, and sustained it with a sophisticated (albeit exploitative) economic system (p. 6).

Then, abruptly, things changed.  Europeans, after a millennium of defending themselves against Islam, took the offensive and rapidly overwhelmed their oppressors.  Incubated by the Renaissance and Enlightenment, new technologies provided Europeans the means with which to outmaneuver and overwhelm their foes.  Portuguese and Spanish explorers bequeathed colonies to their monarchs, encircling the Muslims and disrupting their trade monopolies, funneling gold and silver and agricultural products into Europe.  Whereas a Muslim army had merely been repulsed at Vienna in 1529, the second siege of Vienna, in 1683, resulted in a disastrous defeat, followed by a rout.  In the words of an Ottoman chronicler:  “This was a calamitous defeat, so great that there has never been its like since the first appearance of the Ottoman state” (p. 16).  Further east, Russia’s tsars, recovering lands lost during the Mongol invasions and occupation, began pushing south and east, challenging Muslim dominance.  By 1696, Peter the Great had occupied Azov, providing Russia a port on the Black Sea.

For the next three centuries, Muslims struggled to cope with their new, largely inferior status vis-à-vis Europe, trying to understand “what went wrong.”  One lesson, Lewis says, was learned on the battlefield.  Once almost omnipotent in battle, Muslims found themselves shocked by Europe’s military superiority.  Technologically, whether in naval vessels or soldiers’ arms, the West had advanced in military equipment, whereas Muslims still tended to rely upon their swords and personal valor.  By 1798, when Napoleon and a small corps of French soldiers invaded and occupied Egypt, the disparity was clear, and during the 20th century most Arab lands were reduced to the humiliating status of European colonies.

Muslim inferiority was similarly evident in trade and commerce.  During the Renaissance and Enlightenment, Europeans began to study other languages and understand other cultures, whereas Muslims (elitists who disdained lesser cultures) rarely bothered to learn about their Christian foes.  To travel outside Muslim realms, to study under infidels, to acknowledge the achievements of non-Muslim peoples, was discouraged.  Though certain Western technologies were coveted and appropriated, the widespread resistance to everything associated with the Christian world prevented Muslims from assimilating many of the “modern” developments that transformed the world.  Illustrating the outcome of this process, Lewis says that today:   “the total exports of the Arab world other than fossil fuels amount to less than those of Finland, a country of five million inhabitants.  Nor is much coming into the region by way of capital investment.  On the contrary, wealthy Middle Easterners prefer to invest their capital abroad, in the developed world” (p. 47).

Turning to “social and cultural barriers,” Lewis focuses on three oppressed groups within Islam:  unbelievers, slaves, and women.  Though unbelievers enjoyed a degree of “tolerance,” economic restrictions and social pressures severely reduced their standing.  While Europeans largely outlawed slavery in the 19th century, the institution still persists in Muslim circles.  And virtually every Westerner visiting Muslim lands immediately notices the subordinate status of women under Islam.  Resurgent Islam, directed by radicals like the Ayatollah Khomeini, insists that “the emancipation of women–specifically, allowing them to reveal their faces, their arms, and their legs, and to mingle socially in the school or the workplace with men–is an incitement to immorality and promiscuity, and a deadly blow to the very heart of Islamic society, the Muslim family and home” (p. 70).

However embedded in Muslim traditions, such social and cultural factors contributed to the isolation and progressive impoverishment of their nations.  So they fell victim to European superiority.  Yet while Europeans–and now Americans–flexed their muscles in Arab countries, an abiding resentment seethed within Arab hearts.  So too, as Israel attained statehood–and developed a flourishing society in an area long reduced to a desert under Arab rule–a virulent anti-Semitism boiled to the surface.  Prophetically, writing this book in 1999, Lewis noted:   “If the peoples of the Middle East continue on their present path, the suicide bomber may become a metaphor for the whole region, and there will be no escape from a downward spiral of hate and spite, rage and self-pity, poverty and oppression, culminating sooner or later in yet another alien domination; perhaps from a new Europe reverting to old ways, perhaps from a resurgent Russia, perhaps from some new, expanding superpower in the East” (pp. 159-160).

*************************************

For anyone interested in a more detailed history, Bernard Lewis’s The Middle East:  A Brief History of the Last 2,000 Years (New York:  Simon & Schuster, c. 1995) is probably the best available.  Accurate, analytical, up-to-date, readable, it deserves the accolades, such as “masterpiece,” routinely given it.

After sketching the pre-Christian societies in the Middle East, explaining the various peoples living therein, Lewis charts Christianity’s effective expansion and establishment–from Ethiopia to Persia, from Macedon to Arabia–during the first six centuries of the Christian Era.  Then came Mohammed!  His teachings inspired devotees to conquer much of the world in the seventh century.  More importantly, Lewis says:  “It is the Arabization and Islamization of the peoples of the conquered provinces, rather than the actual military conquest itself, that is the true wonder of the Arab empire” (p. 58).  Amidst the success of Arab armies, however, the empire developed internal tensions.  Mohammed’s immediate successors, the “caliphs,” quarreled among themselves.  Indeed, during the “golden age” of Islam three of the first four caliphs were assassinated.  Mohammed’s blood relatives struggled against those who claimed to better represent the prophet.  So factions developed–Shi’ite battling Sunni–that still divide the Muslim world.

Despite internal turmoil, however, the Arab Empire prevailed, dominating much of the globe for 1,000 years.  Providing accurate information, without getting buried in the details, Lewis gives a cogent overview of the ‘Abbasid Caliphate, then charts the “steppe people’s” invasions from the north and east, including the conquests of Jenghiz Khan’s Mongol warriors.  First absorbing the blows of the invaders, then slowly converting them to Islam, Muslims preserved the essential character of Islam, though the center of power constantly shifted as the dominance of one group (e.g., Egypt or Persia) dictated its trajectory.

Following a chronological overview, Lewis discusses various aspects of Muslim culture, explaining such things as the politics, economics, the elites, religion and law.  To Muslims, he explains, there is no clear distinction between politics and religion.  In accord with Mohammed’s teaching and example, “the choice between God and Caesar, that snare in which not Christ but so many Christians were to be entangled, did not arise.  In Muslim teaching and experience, there was no Caesar.  God was the head of the state, and Muhammad his Prophet taught and ruled on his behalf” (p. 138).  Since Muhammad himself was a trader and warrior, and his Arab followers were nomadic herdsmen and warriors, they tended to have little interest in agriculture.  Consequently, as the great Muslim historian Ibn Khaldun noted in the 14th century, under Islam “‘ruin and devastation prevail’ in North Africa, where in the past there was ‘a flourishing civilization, as the remains of buildings and statues, and the ruins of towns and villages attest'” (p. 166).  Warriors from the Arabian desert generally made deserts wherever they settled!

Lewis clearly explains Islam’s core elements, such as its “five pillars.”  Given the current world scene, his discussion of “jihad” (holy war) clarifies the perennially militant stance Muslims assume, for they embrace a sacred obligation to conquer the world and bring all peoples into submission to Islamic law (and thence, encourage conversion to the Islamic faith).  Consequently, “the Christian crusade, often compared with the Muslim jihad, was itself a delayed and limited response to the jihad and in part also an imitation.  But unlike the jihad it was concerned primarily with the defense or re-conquest of threatened or lost Christian territory” (p. 233).  Muslims, Lewis shows, were preoccupied with internal controversies and paid little attention to the Christian crusades.  And they certainly did not condemn them as do modern Westerners who wield the Crusades as a bludgeon with which to batter Christianity.

**********************************************************

Far more critical of Islam, Bat Ye’or, an Egyptian-born scholar living in France, recounts what Christians suffered under Muslim rule in The Decline of Eastern Christianity Under Islam:  From Jihad to Dhimmitude (Cranbury, NJ:  Associated University Presses, c. 1996).  In an enlightening foreword to the book, Jacques Ellul notes that there exists in the West a “current of favorable predispositions to Islam,” notably evident in the many euphemistic discussions of jihad.  By setting forth the historical facts, Bat Ye’or dares to contradict the prevailing assumptions regarding Islam.  “Historians,” Bat Ye’or says, “professionally or economically connected to the Arab-Muslim world, published historical interpretations relating to the dhimmis, which were either tendentious or combined facts with apologetics and fantasy.  After World War II, the predominance of a left-wing intelligentsia and the emergence of Arab regimes which were ‘socialist’ or allied to Moscow consolidated an Arabophile revolutionary internationalism” that remains strong in much of the contemporary world (pp. 212-213).

Jihad, in fact, helps constitute Islam, Ellul says, for it is a sacred duty for the faithful.  Indeed “it is Islam’s normal path to expansion.”  Unlike the “spiritual” combat imagined by some pro-Islamic writers, jihad  advocates “a real military war of conquest” followed by an iron-handed “dhimmitude,” the reduction of conquered peoples to Islamic law (p. 19).  Muslims divide the world into two–and only two–realms:  the “domain of Islam” and “the domain of war” (p. 19).  At times, strategy dictates tactical concessions and “peaceful” interludes.  But ultimately, Muslims are committed to conquer and control as much of the globe as possible.  Ellul stresses this “because there is so much talk nowadays of the tolerance and fundamental pacifism of Islam that it is necessary to recall its nature, which is fundamentally warlike!” (p. 20).  Writing presciently, in 1991, Ellul declared:  “Hostage-taking, terrorism, the destruction of Lebanese Christianity, the weakening of the Eastern Churches (not to mention the wish to destroy Israel) . . . all this recalls precisely the resurgence of the traditional policy of Islam” (p. 21).

Turning from Ellul’s remarks to Bat Ye’or’s treatise, we enter into a carefully crafted description of what happened to non-Muslim peoples under the yoke of Islam in the Mediterranean basin, Turkey, Armenia, Mesopotamia, and Iran, a subject heretofore distinguished by a paucity of reliable studies.  She meticulously defines jihad, noting that it may be waged through both overt war and more covert means:  “proselytism, propaganda, and corruption” (p. 40).  Whatever means necessary for Muslims to conquer and control lands and non-Muslim peoples find justification as jihad.  Thus motivated, Muslims established an enormous empire by the time of Charlemagne (ca. 800 A.D.), though in truth Muslim warriors were often brutal and booty-hungry pillagers, driven more by greed than holy zeal.

So too, when Muslims ruled a region, reducing all non-Muslims to dhimmitude, they exploited and oppressed (especially through onerous, discriminatory taxation) their subjects.  Forcibly occupying highly-civilized realms such as Egypt, Muslim rulers slowly and surely reduced them to wastelands, economically and culturally depressed shadows of ancient glory.  Everywhere the Muslims went, there resulted “the agricultural decline, the abandonment of villages and fields, and the gradual desertification of provinces–densely populated and fertile during the pre-Islamic period” (p. 102).   All the land under Muslim rule was “administered by Islamic law for the benefit of Muslims and their descendents” (p. 70).  More systematically and thoroughly than Europeans appropriating American Indian lands, the Muslims impoverished conquered peoples.  Even the much-vaunted “Islamic civilization” was derived, sucked out of dying corpses, not created.  “Islamic literature, science, art, philosophy, and jurisprudence,” Bat Ye’or says, “were born and developed not in Arabia, within an exclusively Arab and Muslim population, but in the midst of conquered peoples, feeding off their vigor and on the dying, bloodless body of dhimmitude” (p. 128).

Theoretically, Jews and Christians had religious freedom, but in fact “at no period in history was it respected” (p. 88).  Theoretically, conversions to Islam were to be voluntary.  In fact, massacres, torture, slavery and intimidation punctuated the process.  In Spain, two centuries after occupation, “in 891 Seville and its surrounding areas were drenched in blood by the massacre of thousands of Spaniards–Christian and muwallads.  At Granada in 1066, the whole Jewish community, numbering about three thousand, was annihilated” (p. 89).   To understand the much-maligned Christian Crusades, one must see them as defensive, just wars designed to relieve the suffering of oppressed and enslaved believers.  Centuries later, in 1915, “the genocide of the Armenians was a combination of massacres, deportations, and enslavement.  In the central regions of Armenia, the male population over the age of twelve was wiped out en masse:  shot, drowned, thrown over precipices, or subjected to other forms of torture and execution” (p. 196).

In short, Bat Ye’or says, “irrefutable historical and archaeological sources confirm” that the “process of Islamization” in conquered lands, “was perhaps the greatest plundering enterprise in history” (p. 101).  Reading this book certainly sobers one!  She supports her presentation with extensive footnotes and 175 pages of illustrative documents and finds little admirable in Islamic rule.  The weight of the evidence, the factual refutation of Arabophile histories, persuades one that the terrorists operating in the world today are hardly an aberration of Islam!

******************************************

For a brief, handy overview of the subject, James L. Garlow’s A Christian’s Response to Islam (Tulsa:  RiverOak Publishing, c. 2002) sets forth a pastor’s response to 9/11, including a clear critique of the gushy universalism that “referred to every deceased person as ‘being in Heaven’” (p. 83).  Such sentimentality was further evident when a “United Church of Christ fellowship announced it would substitute readings from the Koran for Bible readings for eight consecutive Sundays.  The pastor of one of the nation’s largest Methodist churches declared in a magazine article that God is the same one worshipped in ‘mosques, synagogues, and churches’” (p. 72).  Against such sentiments Garlow protests, for his concern is not so much with fully understanding Islam as with rightly responding as committed Christians to the contemporary scene.  The book began as a series of ever-expanding e-mailings to friends following the terrorists’ attacks, and, without pretending to be the definitive study of Islam or to provide a scholarly appraisal of its history, “it has one agenda:  to increase love and boldness for Christ with the result that we more effectively share Him with all (including Muslims), rather than simply ‘blending in with our multireligious culture’” (p. 6).

Garlow roots his presentation in the ancient biblical account of Ishmael and Isaac, then explains how Mohammed and the Muslims, following the Koran’s message, have impacted the world.  In response, Christians must avoid either “Muslim-bashing” or “the knee-jerk reaction of platforming Muslims in Christian churches, thus implying that ‘We all worship the same God’ or buying into the politically correct line that ‘Islam is a religion of peace’” (p. 85).  There is, for example, a distinctive difference between Jehovah, revealed in the Old Testament, and Allah, highlighted in the Koran.  Jesus, to the Muslim, is merely one of 25 prophets, with Mohammed the last and the most important.  To Christians, of course, He is the Eternal Son of God.  Consequently, Christians should take the opportunity to proclaim ever more vigorously that Jesus is the name above all names, the sole Savior of all mankind!  Without compromising their faith, Christians must also extend the hand of friendship to Muslims, building good relationships with them, learning the truth about their faith and their culture.  Having established a position of trust, dealing with them in very personal ways, Christians can bear witness to the faith that is within them, especially emphasizing the centrality of Christ.

# # #

130 Laws of Leadership

            

For many years John Maxwell has both exemplified and written about “leadership.”  Though his concern has always been the local church, having long pastored San Diego’s Skyline Wesleyan Church, his influence now includes the corporate world as well.  His The 21 Irrefutable Laws of Leadership:  Follow Them and People Will Follow You (Nashville:  Thomas Nelson Publishers, c. 1998) contains, he says, a “short list” of all he has learned.  The book became quite a “best seller,” garnering plaudits from diverse corners. 

Such plaudits include these words from Tom Landry, former coach of the Dallas Cowboys:  “John Maxwell understands what it takes to be a leader, and he puts it within reach with The 21 Irrefutable Laws of Leadership.  I recommend this to anyone who desires success at the highest level, whether on the ball field, in the boardroom, or from the pulpit.”  The founder of Promise Keepers, Coach Bill McCartney, agrees:  “In typical Maxwell style, filled with wisdom, wit, and passion, John provides a wealth of practical insights on what it takes to be a successful leader.”

            Let me simply list Maxwell’s “laws.”  1.  THE LID.  Leadership Ability Determines a Person’s Level of Effectiveness.  2.  INFLUENCE.  The True Measure of Leadership is Influence–Nothing More, Nothing Less.  3.  PROCESS.  Leadership Develops Daily, Not in a Day.  4.  NAVIGATION.  Anyone Can Steer the Ship, But It Takes a Leader to Chart the Course.  5.  E.F. HUTTON.  When the Real Leader Speaks, People Listen.  6.  SOLID GROUND.  Trust Is the Foundation of Leadership.  7.  RESPECT.  People Naturally Follow Leaders Stronger than Themselves.  8.  INTUITION.  Leaders Evaluate Everything with a Leadership Bias.  9.  MAGNETISM.  Who You Are Is Who You Attract.  10.  CONNECTION.  Leaders Touch a Heart Before They Ask for a Hand.  11.  INNER CIRCLE.  A Leader’s Potential Is Determined by Those Closest to Him.  12.  EMPOWERMENT.  Only Secure Leaders Give Power to Others.  13.  REPRODUCTION.  It Takes a Leader to Raise Up a Leader.  14.  BUY-IN.  People Buy Into the Leader, Then the Vision.  15.  VICTORY.  Leaders Find a Way for the Team to Win.  16.  BIG MO.  Momentum Is a Leader’s Best Friend.  17.  PRIORITIES.  Leaders Understand That Activity Is Not Necessarily Accomplishment.  18.  SACRIFICE.  A Leader Must Give Up to Go Up.  19.  TIMING.  When to Lead Is as Important as What to Do and Where to Go.  20.  EXPLOSIVE GROWTH.  To Add Growth, Lead Followers–To Multiply, Lead Leaders.  21.  LEGACY.  A Leader’s Lasting Value Is Measured by Succession. 

            Given the appeal of Maxwell’s work, the current pastor of Skyline Wesleyan, Jim Garlow, decided to illustrate its principles through a survey of historical leaders, titling his spin-off The 21 Irrefutable Laws of Leadership Tested by Time:  Those Who Followed Them . . . And Those Who Didn’t (Nashville:  Thomas Nelson Publishers, c. 2002).  To help him with the research, Pastor Garlow asked me to join him in the project, and he graciously credits me, on the title page, for my assistance, so I confess a vested interest in the publication.

            Prior to his pastoral ministry, Garlow earned an M.Th. from Princeton University and a Ph.D. in church history from Drew University.  He has an absorbing interest in history and believes that “history is a great teacher.  By looking at the successes and failures of those who have gone before us, we can hopefully avoid their errors and gain from their strengths” (p. 2).  During one’s lifetime critics and lapdogs easily err, but judicious historians more accurately appraise a man’s true worth.  To them it becomes clear that some folks sacrifice their lives for “things that do not retain value.”  Conversely, others loom large for wisely investing in those permanent things that matter most.  Looking to the past, we discern those “who understood the principles of leadership” as well as those who tragically failed. 

            Maxwell’s first law, “The Law of the Lid,” insists that “leadership ability determines a person’s level of effectiveness.”  This law stands revealed in men who had great talents, unusual potential, but failed for lack of leadership skills.  “Leadership skill,” notes Garlow, “is the difference between success and failure; it is the difference between creative vitality and mediocre maintenance” (p.2).   This is dramatically illustrated in one of the two father-son teams that served as presidents of the United States, John Adams and John Quincy Adams. 

            “The second and sixth presidents of the United States came to that position thoroughly gifted and prepared–or so it seemed” (p. 7).  When elected President in 1796, John Adams enjoyed great prestige.  He’d excelled in virtually every previous endeavor, serving as a leader in the Continental Congress and as George Washington’s Vice President.  Furthermore, he was widely respected for his integrity.  He was, however, somewhat egotistical and bullheaded, adept at alienating both friends and foes.  Benjamin Franklin, who knew him well, quipped that he was “always honest, often great, but sometimes mad.”

            Taking up the reins of the presidency in 1797, Adams quickly showed how a gifted man fails as a leader.  Like many who personally perform well, he “was unable to delegate” (p. 8).   Like the Lone Ranger, “he tried to do most everything himself” (p. 8).  Compounding the problem, he frequently absented himself from his office!  “He loved his home in Quincy, Massachusetts, and was unusually unhappy in Philadelphia,” so he  “spent a shocking one-fourth of his presidency away from the nation’s capital, in Boston, in an era without phones, faxes, computers, or any other means of communication faster than horse travel!  He was an absentee president” (p. 8).  As is typical of highly intelligent men, Adams often saw too many sides of various issues and failed to act when crises demanded it.  Consequently, Adams lost the election of 1800.  “Inability to delegate, absenteeism, communication deficiencies, indecisiveness, and lack of discernment have one thing in common: lack of leadership skills,” Garlow says.  “Was he honest? Yes. Was he bright? Yes. Was he good?  Yes” (p. 10).   

            In 1824, John Adams’s son, John Quincy Adams, was elected the nation’s sixth president.  No one could ask for better parents!  “He had a loving father who guided him. His mother, Abigail Adams, was one of the most outstanding colonial women. Son John inherited much of his parents’ intellectual brilliance and Puritan ethic” (p. 10).   He was an unusually gifted man, obviously one of the most intelligent and most experienced of America’s presidents.  But despite his “uninterrupted success” in earlier assignments, he almost immediately failed.  Like his father, he had poor “relationship skills,” proving himself “exceptionally able to offend and alienate people.”  When he met Andrew Jackson, who had received more votes than Adams in the election, he refused to shake hands with the general,  “who graciously greeted him and offered his hand. Petulantly, Adams stood immobile, disdaining Jackson’s gesture, and replied in a manner designed to offend” (p. 11). 

            When he addressed the nation as President, Adams spoke apologetically, inviting criticism through his own lack of confidence in his abilities.  He was, without question, a good man, dedicated to his work.  “But he failed as a leader” (p. 12).  In Samuel Eliot Morison’s appraisal, he “‘was a lonely, inarticulate person unable to express his burning love of country in any manner to kindle the popular imagination.’”  As John Maxwell so often says, “He who thinks he is a leader, but has no followers, is only taking a walk” (p. 13).  “Much like his father,” Garlow says, “John Quincy Adams illustrates the ceiling principle. Utterly competent on one level, he failed to grow with his opportunities and failed to effectively serve as president. And that effectiveness hung on one thing: leadership” (p. 13).

            In contrast to the two Adams, another President, Theodore Roosevelt, provides a pattern for great leaders.  He illustrates the second “irrefutable law,” The Law of Influence.   “Leadership ultimately is influence” (p. 22).  “In 1910, at the Sorbonne in Paris, Roosevelt gave a speech that has been quoted by leaders ever since. It depicts his vigorous view of life and contains a profound challenge to everyone who reads the words today:

         “It is not the critic who counts: not the man who points out how the strong man stumbles or where the doer of deeds could have done better. The credit belongs to the man who is actually in the arena [italics the author’s], whose face is marred by dust and sweat and blood, who strives valiantly, who errs and comes up short again and again, because there is no effort without error or shortcoming, but who knows the great enthusiasms, the great devotions, who spends himself for a worthy cause; who, at the best, knows in the end the triumph of high achievement, and who, at the worst, if he fails, at least he fails while daring greatly [italics the author’s], so that his place will never be with those cold and timid souls who knew neither victory nor defeat” (p. 23).

Garlow challenges readers to note TR’s “words: ‘the man who is actually in the arena,’ ‘at least he fails while daring greatly.’  Those words ignite human hearts. That is the language of a leader. Those are the concepts of an influencer” (p. 23).  

            Roosevelt’s exploits, from the Spanish-American War through his years as President, reveal his ability to influence others.  The men he recruited for his famous “Rough Riders” followed him because he inspired them.  He truly cared for them and they loved him for it.  “Leaders draw others to themselves and their causes, even when the cause is difficult,” Garlow notes.  “Roosevelt’s cause was one that demanded a tough love, which calls men to risk their very lives in serving a higher good. Only leaders can inspire others to that level. There’s a name for it: influence” (p. 27).  His influence stemmed, in part, from his infectious courage.  In his Autobiography, he confessed, “There were all kinds of things I was afraid of at first, ranging from grizzly bears to ‘mean’ horses and gun-fighters; but by acting as if I was not afraid I gradually ceased to be afraid” (p. 27).  Whether leading soldiers or declaiming from the “bully pulpit” in the White House, TR inspired men by his courageous confidence. 

            Moving to the third “irrefutable law of leadership,” The Law of Process, we discover that “Leadership Develops Daily, Not in a Day.”  Here Pastor Garlow provides some personal background, saying:  “I am uniquely qualified to write this book. Of the six billion persons on earth, I am the only one who had to follow John Maxwell in a leadership position since he has become so knowledgeable on leadership.”  Maxwell pastored San Diego’s Skyline Wesleyan Church for 14 years, and when he resigned Jim Garlow was asked “to consider coming to Skyline as the new senior pastor.  I immediately declined, saying, ‘Anyone who tries to follow John Maxwell is a fool.’ (Several years have passed since I made that comment. I think the statement might still be true!) Four months later, I found myself accepting the senior pastoral role at Skyline Church.  I did follow–or attempted to follow–Maxwell. And it has been a challenge” (p. 36).

            The challenge came from trying to succeed (in both senses of the word) a highly gifted pastor.  Garlow had much to learn!  And learn he did, as the church’s continued growth and ministry testify.  Learning “process” skills, however, stretched him.  He “underestimated” its importance.  In part this stemmed from the fact that he tends to be “event driven.”  As he confesses,  “I was an ‘event king.’  In fact, I can ‘out event’ anybody.  At ‘eventing,’ I’m good!  But leaders are not produced in events. They are made in process.  So I have been on a huge learning curve for the past few years.  I wish I could say that I have changed, and that I have conquered the process concept.  I haven’t. But I’m growing. I’m not where I want to be. But I’m not where I used to be. And while I see how far I have to go, I am thankful for the progress” (p. 37).

            The importance of process appears in a careful study of the difference between the followers of two 18th century “exceptionally gifted” evangelists, George Whitefield and John Wesley.  “Both commanded enormous respect. Tens of thousands followed them” (p. 37).  They had “much in common, but they had one noticeable difference. As the years went by, Whitefield’s followers dissipated.  His organization faltered.  Wesley’s did not. What was the difference? Both men were brilliant. Both were winsome and compelling communicators. Both experienced phenomenal success in their lives. But Wesley understood process. Whitefield, it would appear, did not” (p. 38).  

            Whitefield was a powerful orator, probably the greatest of his generation.  He preached some 18,000 times, both in England and the American Colonies.  He helped ignite the Great Awakening in America.  “Thousands responded to his booming voice, which could be heard by a crowd of 20,000 (some have dared to say 40,000) without present-day public address systems” (p. 38).  He received generous financial support and established charitable foundations, especially orphanages.  Many gave of their finances to help support the orphanage that his wife operated in the Georgia Colony. 

            Wesley, like Whitefield, attended Oxford University and became a priest in the Church of England.  Transformed by his Aldersgate experience in 1738, where his “heart was strangely warmed,” he joined his friend Whitefield in an innovative technique, preaching in open fields.   His preaching (some 40,000 times!) helped launch the “Evangelical Revival” which renewed religious life in England.  He continually traveled and preached.  “His energy level was amazing. He arose every morning at four o’clock, working eighteen-hour days. He rode on horseback a quarter of a million miles. He stopped riding a horse when he reached about seventy years of age, but he continued the rigorous travel schedule by horse and buggy. He traveled 4,000 to 5,000 miles a year, as many as 80 miles a day! It is believed that Wesley may have spent more time in the saddle than any other man who ever lived, including Bonaparte and Caesar. Equally amazing was his ability to convert the saddle to a library chair, reading literally hundreds of books while riding on horseback” (p. 40).

            In addition to preaching he wrote or edited some 233 books.  “At the time of his death in 1791, he led an enormous organization: 120,000 members in the Methodist movement, with some suggesting that the total adherents numbered one million” (p. 40).  More importantly, “Wesley’s Methodist movement flourished globally after Wesley’s death. Today there are scores of denominations that point to Wesley as their inspiration. There are millions of believers who see him as father of their denominations. In contrast, George Whitefield’s denomination, the Calvinist Methodists, had insignificant impact, eventually ceasing to exist. Why? What was the difference between Wesley’s leadership style and Whitefield’s leadership style?” (p. 40). 

            This happened because “Wesley understood the Law of Process.  He quickly saw that gaining followers was not the key issue; sustaining them was the real challenge. To that end, Wesley began to organize his new converts” (p. 41).   He organized “classes” and “bands” and “societies.”  Local leaders accepted responsibility for guiding, and holding accountable, fellow Methodists.  Lay preachers were encouraged to exercise their gifts.  Conversely, “Whitefield’s followers had no such structure to assist them in their personal growth.  Once converted, they were simply to gather in churches.  But that did not happen. What was lacking was a process, a system or device by which a person is enabled to go to the next level of growth” (p. 42).  Both men were gifted.  Both were devout.  But only one, Wesley, left a lasting imprint.  Wesley understood the importance of process!

            For purposes of illustration, I’ve focused on only three of the twenty-one “laws.”  Since I helped research and write the book I obviously recommend it!  And I think it’s worth perusing because I share Pastor Garlow’s conviction:  the study of the past reveals how significant leaders have responded to the challenges of their day, providing time-tested principles well worth heeding.

###

129 The Question of God

 

                For more than two decades Armand M. Nicholi Jr., a psychiatrist and professor at Harvard Medical School, has taught a course at Harvard College and Medical School.  Through assigned readings, lectures and class discussions, he engaged students in a dialogue between Sigmund Freud and C.S. Lewis.  Freud, though his luster has dimmed considerably as his theories seem increasingly suspect, certainly helped shape the “therapeutic culture” which now reigns throughout the West.  Lewis, resolutely defending the “permanent things” at the heart of classical Christian culture, stands permanently enshrined as their great apologist.  The core of his course at Harvard has been put in print by Professor Nicholi in The Question of God:  C.S. Lewis and Sigmund Freud Debate God, Love, Sex, and the Meaning of Life (New York:  The Free Press, 2002).  He tries to present both men’s views on important subjects, portraying each accurately and interjecting his own explanations, interpretations, and final evaluations in the process. 

                After short biographical introductions to the two men, Nicholi presents their views on “The Creator.”  Freud, who emphatically embraced philosophical materialism, acknowledged no Creator and judged all religions illusionary.  Reworking Feuerbach’s famous thesis, Freud thought that “believers” simply project deeply-held desires into outer space and fantasize, like children, notions such as a loving Heavenly Father.  He did, however, at times admit to a deep longing–a Sehnsucht, a hunger for something beyond earthly things–that haunted him all his life.  He attributed it to memories of long-lost days when he escaped from his father, finding solace in some woods near his boyhood home. 

                Lewis, on the other hand, after espousing atheism for more than 15 years, underwent a profound conversion at the age of 31 and defended theism for the rest of his life.  Believing in a Creator, he insisted, brought one into contact with the ultimate Reality of the universe, morally demanding and fearsomely holy–hardly the kind of “god” we would conjure up if we wanted to comfort ourselves.  Still more, perhaps our inner longings (for which he used the same German word that Freud used, Sehnsucht), our hungers, accurately orient us to realities that will satisfy them.  Lewis said:  “If I find in myself a desire which no experience in this world can satisfy, the most probable explanation is that I was made for another world” (p. 47).    Lewis’s conversion, those who knew him testified, wrought deeply rooted changes in him.  “A buoyant cheerfulness replaced his pessimism and despair.   In the last days before he died, those who were with Lewis spoke of his ‘cheerfulness’ and ‘calmness’” (p. 77).  Freud, conversely, though he often cited Scripture in his letters and dealt with spiritual themes in his books, apparently never tasted a religious experience.  Styling himself an “infidel Jew,” he discounted reports detailing life-changing spiritual breakthroughs, judging them a form of  “hallucinatory psychosis.”

                Both men sought to understand and explain man’s “Conscience,” wondering if any Universal Moral Law existed.  No! said Freud.  One’s conscience, engrafted into him by parents and culture as a “superego,” obviously regulates behavior.  But it certainly contains no timeless truths.  Behavioral rules are crafted to lubricate social relationships and change continually as cultures evolve.  This position enabled Freud to consider himself a “very moral person” who compared favorably with the rest of mankind.  Yet, paradoxically, in one of his letters Freud claimed to “subscribe to the excellent maxim of Th. Vischer:  ‘What is moral is self-evident'” (p. 66). 

                Freud’s admission that there is “self-evident” truth that gives moral guidance would have pleased C.S. Lewis.  Such an admission, Lewis reasoned, ultimately leads one to acknowledge an ultimate Source, a Lawgiver, who prescribes righteous behavior for us.  Moral laws, like mathematical laws, Lewis believed, are discovered when we honestly investigate the manifold structures of the cosmos.  They reveal themselves to us.  We cannot “create” them.  Consequently, as he noted in Mere Christianity, two phenomena stand out in human history:  “First . . . human beings, all over the earth, have this curious idea that they ought to behave in a certain way, and cannot really get rid of it.  Secondly . . . they do not in fact behave in that way. . .  These two facts are the foundation of all clear thinking about ourselves and the universe we live in” (p. 61). 

                To live well, Lewis thought (sharing Aristotle’s view), makes one happy.  Freud also noted that one can hardly deny that most everyone seeks “happiness; they want to become happy and to remain so” (p. 99).  Nevertheless, he despaired of its attainment.  Imbibing early of Schopenhauer’s pessimism, he apparently believed that “Man is never happy, but spends his whole life striving after something he thinks will make him so,” as Schopenhauer said (p. 98).  What satisfaction there is, he thought, comes from control of things, much as Friedrich Nietzsche, another of Freud’s mentors, insisted.  To Nietzsche one answers the “happiness” question by defining it as:  “The feeling that power increases–that resistance is overcome” (p. 98).  Nevertheless, Freud was frequently depressed, resorted to drugs like cocaine to numb his mind to his despair, and declared that as soon as you think happiness is “in your grasp” it slips away (p. 109). 

                In his atheist years, Lewis shared Freud’s morose assessment of life.  “I was at that time living, like so many Atheists,” he wrote, “in a whirl of contradictions.  I maintained God did not exist.  I was also angry with God for not existing.  I was equally angry with Him for creating a world” (p. 113).  With his conversion, however, came unexpected happiness, pure joy.  Conversion turned his attention from himself to God and others, and he began to take delight in the good times he enjoyed with his friends and, late in life, with his wife.  One of his best friends for 40 years, Owen Barfield, remembered Lewis as “unusually cheerful,” taking “an almost boyish delight in life” (p. 115).  Miraculously, Nicholi says, following his conversion, Lewis “changed from an introvert who, like Freud, was highly critical and distrustful of others, to a person who reached out and appeared to value every human being” (p. 185). 

                Turning to the topic of sex, Nicholi stresses its centrality in the thought of Freud.  Unfortunately, popular misrepresentations have portrayed him as an advocate of libertine “free love.”  What he wanted to do freely was talk about sex and understand its importance.  “To believe that psycho-analysis seeks a cure for neurotic disorders by giving a free rein to sexuality,” he wrote, “is a serious misunderstanding which can only be excused by ignorance.  The making conscious of repressed sexual desires in analysis makes it possible, on the contrary, to obtain mastery over them which the previous repression had been unable to achieve.  It can be more truly said that analysis sets the neurotic free from the chains of his sexuality” (p. 132).   Freud himself apparently lived according to the restrained “Victorian” ethos of his era.  He (at the age of 30) and his wife were virgins when they married.  They had six children in the next eight years, whereafter he apparently discontinued sexual relations with his wife.  Amazingly enough, for those who see him as a libertine, he said in a 1916 lecture:  “We . . . describe a sexual activity as perverse if it has given up the aim of reproduction and pursues the attainment of pleasure as an aim independent of it” (p. 149). 

                Lewis, though he married quite late in life, thought much about sex as part of the human condition.  Contrary to Freud, Lewis found “love” to be vaster than “sex.”  Rooted in the great works of literature, he “thought Freud’s understanding of love and relationships was incomplete” (p. 165).  Couples in love certainly taste the delights of “Eros,” but this must not be reduced to “Venus,” the sex act itself.  “Perhaps the greatest contribution Lewis makes to understanding sexuality and love,” Nicholi says, “is his clear distinction between being in love and love in its deeper, more mature form” (p. 141).  Love, even love between the sexes, is more than sublimated sexual desire. 

In a profound analysis, The Four Loves, Lewis distinguished between “Gift-love” and “Need-love.”  In Lewis’s words:  “Need-love says of a woman ‘I cannot live without her’; Gift-love longs to give her happiness, comfort, protection–if possible, wealth” (pp. 165-166).  Still more, Lewis utilized four Greek terms to indicate more fully the ramifications of love:  “(1) Storge, affection between members of a family; (2) Philia, friendship; (3) Eros, romantic love between people ‘in love’; and (4) Agape, the love one has toward God and one’s neighbor” (p. 166).  As Nicholi discusses these distinctions, he clearly prefers Lewis to Freud, who scoffed at the biblical injunction to “love your neighbor as yourself” and rejected the possibility of loving one’s enemy.  Speaking personally, Nicholi writes, “As a clinician, I have observed that Agape is the key to all successful relationships, even those within groups and organizations” (p. 177).  Importantly, as Lewis clarified the meaning of Agape, he insisted that “Love is something more stern and splendid than mere kindness.”  This is because “love, in its own nature, demands the perfecting of the beloved” whereas “mere ‘kindness’ which tolerates anything except suffering in its object is, in that respect, at the opposite pole from Love” (p. 211).   This enabled him to deal effectively with another of life’s great questions:  the reality of pain and suffering. 

                Both Freud and Lewis personally suffered, and both thought deeply about it.  Freud, the atheist, routinely railed against God for making an anguished world, though a consistent atheist, of course, can hardly complain about pain since there is only a deaf, irrational, unfeeling cosmos responsible for it. Lewis, on the other hand, set forth, in one of his early books, The Problem of Pain, persuasive intellectual reasons as to how a good God could allow suffering:  it “is not good in itself.  What is good in any painful experience is, for the sufferer, his submission to the will of God, and, for the spectators, the compassion aroused and the acts of mercy to which it leads” (p. 203).  Though intellectually persuasive, however, such words failed to fully comfort Lewis in the midst of his wife’s dying.  Here we read, in A Grief Observed, the heart cry of a broken man at a loss for answers.  He raged and he doubted.  But (contrary to the impression left by the movie Shadowlands) he emerged from his sorrow with an even deeper faith, knowing that even death cannot destroy the soul that trusts in God. 

                Dealing with death further polarizes the two thinkers.  Agreeing with Schopenhauer, Freud quoted him to the effect that “the problem of death stands at the outset of every philosophy.”  Freud feared it all his life, finding each birthday a painful event, reminded thereby of his mortality.  When his own mother died he refused to attend her funeral.  When possible, he seemed to avoid thinking about it!  During his final days, he read and pondered Balzac’s The Fatal Skin, a story (much like Faust, Goethe’s classic that he often cited) about a “young scientific man” selling his soul to the devil.  Then, asking his doctor to follow his instructions, Freud was injected with a lethal dose of morphine, dying by (physician-assisted) suicide. 

                But Lewis, sustained by his Christian faith, believed, Nicholi says, that “the only person to decide the time of one’s death was the Person who gave one life” (p. 230).  He fully enjoyed each passing year, apparently relishing the very process of aging.  “Yes,” he wrote, “autumn is the best of the seasons; and I’m not sure that old age isn’t the best part of life” (p. 232).  He spent his final days contentedly, reading favorite authors, including Homer (in Greek), Virgil (in Latin), and other classic works of literature.  “Never was a man better prepared” to die, said a man who lunched with Lewis shortly before his death.  His brother, Warren, reported that Lewis said to him, a week before he died:  “I have done all that I was sent into the world to do, and I am ready to go.”  His brother added:  “I have never seen death looked in the face so tranquilly” (p. 239). 

                Two men.  Two ways to live.  In his Epilogue, Nicholi emphasizes that the great difference between Freud and Lewis was God.  The book’s final paragraph merits repeating as Nicholi’s position:  “The answer to the question of God has profound implications for our lives here on earth, both Freud and Lewis agree.  So we owe it to ourselves to look at the evidence, perhaps beginning with the Old and New Testaments.  Lewis also reminds us, however, that the evidence lies all around us:  ‘We may ignore, but we can nowhere evade, the presence of God.  The world is crowded with Him.  He walks everywhere incognito.  And the incognito is not always easy to penetrate.  The real labor is to remember to attend.  In fact to come awake.  Still more to remain awake'” (p. 244).  Looking for some direction in life?  Try Lewis, says Nicholi!


                In God:  The Evidence–The Reconciliation of Faith and Reason in a Postsecular World (Rocklin, CA:  Forum, c. 1997, 1999), Patrick Glynn explains how he recently came to believe in God and the immortality of the soul.  In the book’s first chapter, “The Making and Unmaking of an Atheist,” he explains his early embrace of atheism.  Attending a Catholic grade school, he encountered Darwin’s theory of evolution.  “It immediately occurred to me,” he says, “that either Darwin’s theory was true or the creation story in the Book of Genesis was true” (p. 3).  Siding with Darwin, he declared his position by standing up in class and making his case.  Though still a child, Glynn saw clearly the ultimate import of Darwin, for his theory “breathed fresh life into the atheist position–a fact immediately recognized across the globe.  Notably, that other famous nineteenth-century atheist, Karl Marx, asked Darwin if he could dedicate the English translation of Capital to the great naturalist” (p. 37).  Darwin demurred, but Marx rightly saw Darwin as an asset to his agenda. 

                Entering Harvard in 1969, Glynn fell in with the “New Left” and its Marxist views, solidifying his adolescent agnosticism.  Ultimately earning a Ph.D. in philosophy from Harvard, he settled into a deeply-entrenched atheism.  “Ironically,” he writes, “at the very time I was plumbing the depths of philosophical nihilism, science itself, unbeknownst to me and to many other people, was taking a surprising new turn” (pp. 6-7).  Physicists, acknowledging the reality of the “Big Bang,” were working out some of its implications, including the “anthropic principle,” the notion “that all the myriad laws of physics were fine-tuned from the very beginning of the universe for the creation of man” (pp. 22-23).  Rightly understood, this involves “a refutation of the original premise of the overarching modern philosophical idea:  that of the ‘random universe'” (p. 7).  Dealing honestly with this new evidence, Glynn began to ponder the implications of “A Not-So-Random Universe.”  Amazingly enough, “the picture of the universe bequeathed to us by the most advanced twentieth-century science is closer in spirit to the vision presented in the Book of Genesis than anything offered by science since Copernicus” (p. 26).  Design, not random material developments, better explains the way things really are!  Glynn laces his discussion with clear explanations of the most recent scientific discoveries, further indicating their philosophical importance by placing them within a historical framework.  Cracks are appearing in the foundations of the scientific-secularism that has reigned in the West for more than two centuries.

                Something similar, Glynn says, is transpiring in the inner world.  In a chapter entitled “Psyche and Soul:  Postsecularism in Psychology,” he documents the ebbing away of Freud’s substitute religion, psycho-analysis.  Awakening from its naturalistic slumbers, “Slowly but surely, modern psychology is belatedly rediscovering the soul” (p. 63).  Spirituality seems resurgent.  Witness the massive success of books such as Scott Peck’s The Road Less Traveled!  “It is more than a little ironic,” Glynn says, “that after its long odyssey into the unconscious and its multiplication of dark modernistic concepts of mental life, modern psychology at the end of the twentieth century should have arrived at a formula for mental well-being and happiness hardly distinguishable from that of traditional religion–faith, hope, love, self-discipline, and a life lived in conformity with solid, traditional moral principles” (p. 74).  The Ten Commandments make more sense than the Oedipus Complex!   

                In yet another realm, Glynn finds a growing bond between “faith and the physicians.”  In the words of a Harvard Medical School professor, Herbert Benson, we’re “wired for God” (p. 80).  With that come some “intimations of immortality,” the accumulating data from near-death testimonials that something at the heart of us survives the body’s demise.  Careful studies indicate that people, while apparently “dead,” see out-of-body details that cannot be naturalistically explained apart from the reality of a “spirit.”  Indeed, Glynn maintains that the ancient view, preeminently the New Testament view, that we are by nature primarily spiritual still holds.  This means that the Enlightenment apotheosis of Reason must dissolve.  For centuries men have sought to replace God with human Reason, with dismal results, including what Martin Buber perceptively called the “deactualized self.”  Discarding God, man debases himself in the process.  Glynn argues:  “Reason, freed from divine guidance, originally promised humanity freedom; but its culmination in the moral realm is postmodernism, and the spirit of postmodern thought is nothing if not the spirit of [what Buber called] ‘caprice'” (p. 146).  Taking nothing seriously, postmodern man does whatever appeals to him for the moment, taking no thought for eternity.  The chaos consequent upon the Sexual Revolution of the ’60s illustrates this pattern. 

                “What I am suggesting,” Glynn writes, with reference to these recent developments, “and what it seems to me history tends to corroborate, is this:  The knowledge of the Spirit is prior to the knowledge of reason.  Where reason follows Spirit, the results are good; where it rejects or parts ways with the Spirit, the results are invariably disastrous, whether one speaks of the political, societal, or personal spheres” (p. 166).  Indeed, he writes in his final sentence:  “If the history of this century offers any lesson, it is that goodness–and a relationship to God, to the Absolute by whatever name He is called–is not only the beginning of wisdom but the only path by which it can be attained” (p. 169). 

                Scholarly in its depth, popular in its presentation, God:  The Evidence makes a strong case, giving us a treatise ideally suited for serious thinkers who wonder whether there are, in fact, views worth considering.

###

128 Gifford Lecture Contrasts: Hauerwas & McInerny

 

                If we believe Time Magazine, Stanley Hauerwas is “America’s best theologian,” indeed, “contemporary theology’s foremost intellectual provocateur.” His stature was recently established when he was invited to deliver the prestigious Gifford Lectures in St. Andrews, Scotland. The lectures, given in 2001, are entitled With the Grain of the Universe: The Church’s Witness and Natural Theology (Grand Rapids: Brazos Press, c. 2001). “My aim,” he says, “is nothing less than to tell the theological story of the twentieth century by concentrating on three of the greatest Gifford lecturers — William James, Reinhold Niebuhr, and Karl Barth. I argue that Karl Barth is the great ‘natural theologian’ of the Gifford Lectures because he rightly understood that natural theology is impossible abstracted from a full doctrine of God” (pp. 9-10).  

                Before turning to this task, Hauerwas tries to explain why someone like himself (who like Karl Barth basically denies the possibility of “natural theology”) would accept the invitation to give the Gifford Lectures.  In self-defense, he notes that another Gifford lecturer, Alasdair MacIntyre, refused to do the “scientific” work mandated by Lord Gifford’s will, which amply endowed the project.  Rather, following the lead of St. Thomas Aquinas, MacIntyre set forth a “natural theology” rooted in the analogy of being, following principles quite different from the “natural theology” shaped by the Enlightenment. 

                The Enlightenment, as Hauerwas has incessantly argued, birthed the “modernity” that has subverted the Christian faith and community.  Citing a recent work by Matthew Bagger, Hauerwas says that “‘the rise of human self-assertion following the breakdown of the medieval world-view captures the central features of modern thought and culture.  Modernity represents the outcomes of a dialectic motivated by contradictions within medieval theology.  Self-assertion requires that humans give themselves the standards of thought and action rather than seeking them from an external source, like God'” (p. 32, quoting Religious Experience, Justification, and History, p. 212).  Consequently, Immanuel “Kant became the exemplary Protestant theologian, and Religion Within the Limits of Reason Alone became the great text in Protestant moral theology” (p. 38).  Rooted in Kant, F.D.E. Schleiermacher, Albrecht Ritschl, and Ernst Troeltsch fashioned the “Protestant Liberalism” that has significantly shaped the theology Hauerwas rejects. 

                Though hardly a theologian, William James illustrates the religious sentiments of liberalism–and the religious pragmatism that so distresses Hauerwas.  Under Darwin’s influence, James had discarded the classical Christian doctrine of God and salvation.  He turned, instead, to the Emersonian Transcendentalism so evident in the Boston of his youth.  In a revealing note to a friend, he said “‘You will class me a Methodist, minus a Savior'” (p. 63).  James’s Gifford Lectures, The Varieties of Religious Experience, delivered at the beginning of the 20th century, proved both revealing and prescient.  So long as “religious experience” enabled one to deal more effectively with life, James considered it “true” and “good.”  As he earlier wrote, in The Will to Believe, “‘there are then cases where faith creates its own verification.  Believe, and you shall be right, for you shall save yourself; doubt, and you shall again be right, for you shall perish.  The only difference is that to believe is greatly to your advantage'” (p. 57). 

                Such pragmatism, Hauerwas rightly avers, has deeply dyed 20th century Christianity.  Discarding doctrine, under the impression that science has disproved its traditional assertions, modernists easily appropriated James’s approach:  believe whatever helps you cope with life, affirm whatever enables you to succeed, embrace whatever makes you feel good.  Such a “natural theology,” focused upon “natural man,” proposed an optimistic humanism fleshed out for popular consumption by preachers such as Norman Vincent Peale and Robert Schuller.  Reducing theology to psychology, joining arms with secularists in shaping today’s therapeutic culture, the followers of William James are legion.  So to carefully critique James is most helpful.

                Unlike James, Reinhold Niebuhr defined himself as a Christian theologian, though his real concern was social ethics.  Sometimes lumped with “Neo-Orthodox” thinkers, in that he rejected some of the liberalism of his early years, Niebuhr was, Hauerwas insists, fully committed to the liberal agenda.  Indeed, Hauerwas argues, “Niebuhr’s Gifford Lectures [The Nature and Destiny of Man] are but a Christianized version of James’s account of religious experience” (p. 87).  Politically, this was markedly evident in Niebuhr’s support for Norman Thomas (perennially the Socialist Party candidate for President) and alignment with the notoriously left-wing Americans for Democratic Action.  Consequently, Hauerwas caustically observes, “Niebuhr’s theology seems to be a perfect exemplification of Ludwig Feuerbach’s argument that theology, in spite of its pretentious presumption that its subject matter is God, is in fact but a disguised way to talk about humanity” (p. 115). 

                That Hauerwas may not be overly severe in his criticism finds support in a 1947 letter John Dewey wrote.  An atheist, fully committed to his own version of pragmatism, Dewey was a reasonably dispassionate critic.  He noted that both Niebuhr and Kierkegaard “‘have completely lost faith in traditional statements of Christianity, haven’t got any modern substitute and so are making up, off the bat, something which supplies to them the gist of Christianity–what they find significant in it and what they approve of in modern thought–as when two newspapers are joined.  The new organ says “retaining the best features of both”‘” (p. 97). 

                So Niebuhr, Hauerwas says, shares James’s pragmatic approach and fails to uphold authentic Christianity.  His critique of liberalism fails because he never really abandoned liberalism.  Having myself recently read The Nature and Destiny of Man, however, I suspect Hauerwas protests too much!  While Niebuhr’s “theology” may be faulted for various failures, he is primarily a social ethicist and apparently had little aptitude for or interest in the classical issues of theology.   I suspect Hauerwas dislikes Niebuhr’s politics, particularly his approval of America, as much as his theology.

                Repudiating the approach of James and Niebuhr, both of whom certainly set forth a form of “natural theology,” Hauerwas appropriates, as an ally, Karl Barth, well known for his staunch “Nein!” to Emil Brunner’s defense of natural theology.  In Barth Hauerwas finds the man, and the theology, worth celebrating.  Amazingly, Hauerwas endeavors to show that Barth rightly set forth a “natural theology.”  He says this despite Barth’s vehement opposition to such!  He claims Barth “provides the resources necessary for developing an adequate theological metaphysics, or, in other words, a natural theology.  Of course, I assume that ‘natural theology’ simply names how Christian convictions work to describe all that is as God’s good creation” (p. 142).  However problematic, this “assumption” allows Hauerwas to build his case!

                This leads Hauerwas to an intricately detailed discussion, rooted in an appreciative reading of Barth’s Church Dogmatics, designed to show why–and in what ways–he is remarkably akin to Thomas Aquinas!  This is because both men, Hauerwas insists, relied upon the analogia fidei, the analogy of faith.  We can think about God only in terms of “like” and “as,” taking clues from visible realities to discern invisible Reality.  Moving from the created world, the natural world, to the Creator, involves thinking analogically.  Barth’s understanding of God, derived from Revelation, works itself out, Hauerwas says, in metaphysical categories and ethical imperatives. 

                Both Barth himself and his Dogmatics were “witnesses” to this endeavor, Hauerwas says.  This leads him, in the book’s eighth and final chapter, to set forth his own position, “The Necessity of Witness.”  Here familiar Hauerwas themes appear.  Whereas Barth was mainly concerned that we “let God be God,” Hauerwas’s message is “let the church be the church,” living out the radical imperatives of the Gospel.  In an authentic community of faith, worship and praise incubate and shape theological reflection.  He cites John Howard Yoder and Pope John Paul II as demonstrations of how this is done in our day–especially insofar as they espouse non-violence (Hauerwas’s special passion).   

                This book’s value, in my judgment, lies in its probing, richly footnoted discussion of James, Niebuhr and Barth.  Though Hauerwas’s interpretations can never be taken at face value, they prod one to think and see new dimensions to these thinkers.  When he sets forth his own views, however, things turn more problematic.  Take, for instance, his contention that “witness” is crucial for the church.  There must be no disparity between one’s beliefs and acts.  Thus he sternly rebukes allegedly “Christian universities” for failing to be Christian.  Indeed, he declares, “we should not be surprised that the most significant intellectual work in our time may well take place outside the university” (p. 232).  Yet, one must remember, Professor Hauerwas himself teaches at Duke University, where he is lavishly paid for propounding his “countercultural” views. 

                Still more, it seems to me that as one considers Hauerwas’s allegedly “radical” positions, it becomes clear that he almost unfailingly appeases the modern academic intelligentsia, of which he is a celebrated insider.  To criticize liberalism, in today’s post-modern academic environs, costs one very little.  To share Stanley Fish’s constructivist, reader-response approach to hermeneutics places one comfortably at the center of today’s triumphant secularism.  To trumpet one’s anti-Americanism, as Hauerwas routinely does, enables one to garner accolades from university colleagues.  To support pacifism, multiculturalism, feminism, socialism, etc. hardly severs connections with the liberal establishment. 

                Finally, though Hauerwas condemns James and Niebuhr for their pragmatism, his own approach to the Christian faith is ultimately pragmatic.  Faith, to Hauerwas, works!  It works in different ways for him than for James and Niebuhr.  Whereas to James faith is personal, and what works brings personal satisfaction, to Hauerwas faith is corporate.  The community, above all, is what matters.  What the worshiping community discerns as true and good, what enables the community to function well, is what counts.  For Hauerwas, the church community validates itself in non-violence, in social justice, in Anabaptist separation from political powers.  But, ultimately, “witness” means validating one’s faith, not testifying to the Risen Lord Jesus!


                Unlike Hauerwas, the 1999-2000 Gifford lecturer, Ralph McInerny, a professor of philosophy at Notre Dame University (as well as the author of the “Father Dowling” mystery stories which were serialized in a television series several years ago), cheerfully embraced the calling to do “natural theology” in Characters in Search of Their Author (Notre Dame:  University of Notre Dame Press, c. 2001).  He writes clearly, directly, determined to uphold the philosophy of St. Thomas Aquinas.  Whereas Hauerwas employs irony, polemic, sometimes tortuous expositions, McInerny writes with a certain structured serenity.  In part, as he says, “There are two kinds of philosopher:  one kind denies the obvious, the other kind states the obvious.  I am of the latter kind” (p. 119).

                The book’s title is explained thusly:   “It has been said that life is a book in which we set out to write one story and end by writing another.  Deflective surprises are due to chance or, as men have thought from time immemorial, to another author in whose drama we are but players.  A play within a play.  How can we not be in search of our author?” (p. 3).  There is an Ultimate Playwright, McInerny believes, and “We are to God as characters to their author” (p. 4).  To grasp the plot, to follow the action, much can be learned through a careful study of the natural world He has made. 

                Trusting one’s reason, upholding the dignity of traditional philosophy, puts one in a counter-cultural position today.  Since Rene Descartes shifted philosophers’ attention from the external to the internal world in the 17th century, increasing numbers of thinkers have assumed that “There is no reality sans phrase, only interpreted reality, what we make of it” (p. 43).  Descartes’ stance undergirds a “fashionable nihilism among influential philosophers,” markedly akin to the “radical chic” Tom Wolfe detailed in the plush Manhattan parties which celebrated terrorists of various sorts.  Black Panthers, paroled murderers, Weatherman renegades–all enjoyed the embrace of luminaries like Norman Mailer!  Amazingly, McInerny says, having abandoned its traditional pursuit of truth and wisdom, “Philosophy itself has now become a form of Radical Chic” (p. 44).  Consequently, McInerny laments, “Philosophy has become a bone yard.  Having passed through the abattoir of doubt, linguistic reduction, and nihilism, philosophy is but a skeleton of its former self” (p. 73). 

                Noting the same pragmatic tendencies Hauerwas condemns, McInerny says that for great numbers of thinkers today “Language is no longer the sign of thought and thought is no longer the grasp of nature, of essence, of the way things are.  We are thrown back on language itself, and to language is assigned the great task of constructing the self we are and the world in which we live.  Language is a set of rules we adopt for purely pragmatic or utilitarian reasons.  We no longer seek to achieve the true and avoid the false.  Forget about both of those.  The only question is, does it work, is it successful” (p. 26).  Ultimately, this relativistic, nihilistic view, clearly evident in Nietzsche, cannot endure, for it conflicts with reality.  But it is difficult to rationally refute because its proponents deny the legitimacy of reason! 

                Ironically, he says, facing the nihilistic irrationalism of post-modernism, Catholic philosophers like himself are called to uphold the integrity of the mind and the natural ability of man to know truth, even truth about God.  As John Paul II said, in his great encyclical, Fides et ratio:  “‘One may define the human being, therefore, as the one who seeks the truth'” (p. 121).  Ultimately, this means, as the Second Vatican Council affirmed, that “Human dignity rests above all on the fact that man is called to communion with God.  This invitation to converse with God is issued to a man as soon as he is born, for he only exists because God has created him with love and through love continues to keep him in existence.  He cannot live fully in the truth unless he freely acknowledges that love and entrusts himself to his creator” (p. 30, citing Gaudium et Spes, n. 19). 

                This is, of course, no new task!  In the ancient world, Sophists propounded versions of nihilism, relativism, subjectivism.  Protagoras, the Sophist who declared that “man is the measure of all things,” was the “first of the Pragmatists as well” (p. 45).  In his dialogue entitled Cratylus, Plato recorded that Protagoras taught “that as things appear to me, then, so they actually are for me, and as they appear to you, so they actually are for you” (p. 45).  Strongly reacting to such teachers, Socrates, Plato and Aristotle carefully carved out the lineaments for classical philosophy, a perennial philosophy ever ancient, ever new.

                Centuries later, St. Augustine faced the same challenge — skeptical, nihilistic philosophers — and “wrote the Contra academicos to confront thinkers who held that nothing could be known. It is significant that Augustine as a believer saw the importance of addressing this attack on reason” (p. 45). Thinking rightly, he knew, means bringing one’s mind into alignment with the world that is. Right thinking, logic, reflects the logos, the Word enstructuring the world.

                Aristotle, for example, insisted there are inescapable, undeniable “first principles.” So, he said, enunciating the basic laws of thought:

  1. It is impossible to affirm and deny the same thing of the same subject simultaneously and in the same sense.
  2. It is impossible for a proposition and its contradictory to be simultaneously true.
  3. It is impossible for a thing to be and not to be at the same time and in the same respect. (pp. 47-48).

Centuries later, “Thomas Aquinas, like Aristotle, uses these three self-evident principles as if they were synonymous.  When he is speaking of the first principles of practical reasoning, the precepts of Natural Law, he draws an analogy between them and the first principles of reasoning as such.  He gives as the most fundamental judgment reason makes, non est simul affirmare et negare” (p. 48, citing Summa Theologiae, 1-2.94.2).  One cannot affirm and deny that a given thing such as a tree or a wildfire exists. 

                As Aquinas studied various pagan philosophers, he appreciated their understanding of such principles.  By nature, without supernatural assistance, they reasoned rightly.  Still more:  many of them discerned truths identical with biblical truths.  For example, “Aristotle called the philosophical discipline that culminates the lengthy task of philosophy theologia.  It has come to be called metaphysics, and is in effect the wisdom the seeking of which gives philosophy its name” (p. 77).  Thus, to Aquinas, doing “natural theology” was obviously possible.  So he “coined a phrase to cover these naturally knowable truths about God that had nonetheless been revealed.  He called them praeambula fidei.  They were distinguished from the other sort of truth about God, the kind that dominates Scripture, which he dubbed mysteria fidei” (p. 66).   Mysteries are not, however, irrational.  Were we wiser, we would understand the reasonableness–the logos–of our faith.  Indeed, Aquinas thought, “If some of the things that have been revealed can be known to be true–the preambles–then it is reasonable to accept that the others–the mysteries–are, as they claim to be, true” (p. 67).

                Turning to one of the most basic questions in natural theology, McInerny argues that God’s existence is rationally demonstrable.  Rooted in Thomas Aquinas and the common sense tradition, he notes that one thinks well, “not by sweeping away or casting a skeptical eye on the thinking of ordinary folk, but by seeking there the well-springs of human thinking as such.  The amazing assumption is that everybody already knows all sorts of things” (p. 118).  Moving from things any normal person knows, one discovers both the necessary ontological truth that there must be a First Cause of all that is, as well as certain moral “principia per se nota, precepts of natural law” (p. 119). 

                So faith and reason conjoin.  “Thomas Aquinas discusses the act of religious faith in terms of Augustine’s definition of it as “cum assensione cogitare:  thinking with assent” (p. 124).  Assisted by God’s grace, however limited by our human weakness, we can think.  And the more we think rightly the better we grasp certain truths concerning God, man, and salvation.  Richard John Neuhaus’s appraisal of this book in First Things merits repeating:  “Ralph McInerny never ceases to amaze.  This book is another such occasion.  Here erudition is joined by wit and lucidity in examining fundamental questions of human existence in a manner that is both accessible to the general reader and an intellectual challenge to the specialist.  Prof. McInerny provides a reliable, and enjoyable, guide to reasoned faith and faithful reason.” 

###

127 Ropke’s “Humane Economy”

                

John Zmirak, in Wilhelm Ropke:  Swiss Localist, Global Economist (Wilmington:  ISI Books, 2001), introduces readers to one of the finest (if largely unknown to Americans) economists of the 20th century, “a key intellectual architect of postwar prosperity in Europe” (p. 5).  Following WWII Germany lay prostrate, devastated by the war.  The Allies, ironically, initially imposed the same economic agenda favored by Hitler:  full employment; price controls; inflation.  Ludwig Erhard persuaded U.S. General Lucius Clay, the only Allied occupation leader who favored free markets, to help him restore a free-market economy in West Germany.  Far more important than the Marshall Plan, funneling American dollars into a ravaged Europe, Erhard’s economic reforms freed his country from the legacies of the Third Reich and the temptations to seek socialistic solutions.  He was mentally prepared for the task because, during the war, “Erhard worked as an obscure advisor to a cigarette company and schooled himself in market economics by reading Ropke’s works.  These books, banned by the Gestapo, had to be smuggled in from Switzerland” (p. 6).  Labeling his agenda a “social market economy,” he (advising Konrad Adenauer and the Christian Democrats) helped orchestrate the “German miracle” that so quickly restored West Germany to economic health.  

                In Ropke’s writings, Erhard saw a way out of the totalitarian structures of socialism–be it Hitler’s National Socialism or Stalin’s Soviet Communism.  Both systems, Ropke insisted, “rested their platforms on implicit or explicit economic arguments, especially promises of increased prosperity, more fairly distributed throughout the population.  It is no accident that each movement claimed the title ‘socialist'” (p. 50).   And he also discerned that all forms of socialism are deeply immoral, for they “‘give too little to man, his freedom, and his personality; and too much to society'” (p. 56).  Ropke’s works brought readers like Erhard “‘words of transformation, offering them once more firm ground under their feet and an inner faith in the value and blessings of freedom, justice and morality'” (p. 6). 

                Born in 1899 in Schwarmstedt, Germany, he grew up relishing the colorful, productive, soul-satisfying generosity of village life, lingering memories of which helped shape his economic thought.  He served with distinction as a soldier in WWI, following which he studied at three universities, earning his doctorate in 1921.  After a brief stint working for the Weimar Republic, “In 1924 Ropke was appointed extraordinarius (professor) at the University of Jena, making him the youngest professor in the German-speaking world” (p. 30).  Successive academic appointments led him to Graz, Marburg, and Frankfurt, where he openly opposed both socialists and nationalists.  When Adolf Hitler came to power, most academics–Heidegger, Bultmann, et al.–maneuvered to keep their positions.  But not Ropke!  “At Frankfurt that day in February 1933, he rose to deliver a wry, acid account of the Nazi movement as ‘a mass revolt against reason, freedom, humanity, and against the written and unwritten millennial rules that enable a highly differentiated human community to exist without degrading individuals into slaves of the state'” (pp. 35-36).  He denounced being “‘lukewarm, lazy, and cowardly in the hour of utmost danger, with having been an obfuscated worshipper of the childish twaddle of the day!'” (p. 38). 

                Predictably, the Nazis moved against Ropke.  Within months he lost his tenured position.  Fleeing for his life, he sought shelter first in Holland and then in Turkey, where he taught for four years in Istanbul.  In 1937 he moved to the Graduate Institute of International Studies in Geneva, Switzerland, where he remained for nearly 30 years.  He found among the Swiss not only political refuge but a model for healthy economics.  The “founding fathers” of the United States, such as John Adams and Benjamin Franklin, had openly admired “the Helvetic Republic” as an example of “limited government and political liberty” (p. 16).  Then, when the Swiss set forth a new constitution in 1848, they turned to the Constitution of the United States as their model. 

                By contrast, while localism has suffered setbacks in America–as is evident in the courts’ disdain for the ninth and tenth amendments–the Swiss now enjoy “a system that is still more successfully decentralized than any on earth” (p. 17).  Swiss citizens and cantons–not the federal government–exercise real power in the nation.  This allows intermediate institutions–families, churches, social groups–considerable influence in shaping and maintaining society.  It also assures low taxes.  Such localized power centers provide buffers against both the harsher edges of the free-market economy and the voracious, totalitarian hunger of highly centralized states. 

                Living amongst the Swiss, Ropke saw how efficiently–and how justly–their system worked.  It was, he concluded, in Zmirak’s words, “no accident that the Swiss enjoy the highest standard of living, per capita, in the world; it is the concrete fruit of localism, liberalism, and direct economy” (p. 21).  So he made it the model for his proposals for the rest of Europe.  “Put briefly, Ropke centered his economics in the dignity of the human person, who lives not alone but as part of a family and a community; who thrives or suffers according to the health of those institutions; and who regulates his own economic activity according to financial and personal incentives that he–and not the State–is best equipped to interpret.  Ropke further held that economic incentives are most efficiently conveyed through the price system, while non-economic goods are best preserved through private associations such as the extended family, the village, and the church” (p. 53). 

                Seeking to chart a course between ruthless free enterprise capitalism and brutal state-run socialism, Ropke proposed a “third way” requiring “‘the powerful influences of religion, morality, and law'” to sustain it (p. 82).  Entrepreneurs, vital to an efficient economic system, must be restrained lest they trample the weak.  Bureaucrats, necessary for any government, must be restrained lest they bloat themselves at the public trough.   Ropke especially feared the growth of the welfare state in Europe and America, seeing it as a democratically established dictatorial system.  The “‘formidable problem of our times is the leviathan of omnipotent government,'” he said (p. 157).  The very “national socialism” the Allies fought to destroy in Germany silently slipped into the “welfare” systems they devised to “help” people at home.  In doing so, however innocently, they lost the very thing folks most need:  freedom!

                Ropke insisted that bigness–whether corporate or governmental–and all concentrations of inordinate power must be resisted.  “‘Away from centralization in every connection, from accumulations of property and power which corrupt the one and proletarianise the other, from the soullessness and lack of dignity of labour through mechanised production and towards decentralisation in the widest and most comprehensive sense of the word; to the restoration of property; to shifting of the social centre of gravity from above downwards; to the organic building-up of society starting with the family through parish and county to the nation; to a corrective for exaggerations in organization, in specialisation, and in division of labour (with at least a minimum of self-maintenance from one’s own soil); to the bringing back of all dimensions and proportions from the colossal to the humanly reasonable; . . . .'” (p. 175). 


                To explore Ropke’s own writings, the most accessible is a recent reprint of his 1960 treatise, A Humane Economy:  The Social  Framework of the Free Market (Wilmington:  ISI Books, 1998).  Such an economy, writes Dermot Quinn, “is only, in the end, a shadowy reflection of the divine one” (p. xviii).  He begins by declaring that “the technique of socialism–that is, economic planning, nationalization, the erosion of property, and the cradle-to-grave welfare state–has done great harm in our times; on the other hand, we have irrefutable testimony of the last fifteen years, particularly in Germany, that the opposite–the liberal–technique of the market economy opens the way to well-being, freedom, the rule of law, the distribution of power, and international co-operation” (p. 3).   Collectivism denies personal freedom, crushes the very image of God wherein we are created.  (Ignorantly, naively, tragically, Christians supporting socialistic economics have embraced a movement intent on destroying the verities of their faith.)

                Ropke, however, envisions man as defined by the Christian tradition.  Consequently, the truly important “things are those beyond supply and demand and the world of property.  It is they which give meaning, dignity, and inner richness to life, those purposes and values which belong to the realm of ethics in the widest sense.”  Man does not live by bread alone!  And yet, importantly, “There is a profound ethical reason why an economy governed by free prices, free markets, and free competition implies health and plenty while the socialist economy means sickness, disorder, and lower productivity” (pp. 5-6).  Freedom’s our birthright!  When it’s respected and protected, a “humane economy” results.

                 Freedom, however, has been an endangered fugitive in the 20th century.  “In all fields, mass and concentration are the mark of modern society; they smother the area of individual responsibility, life, and thought and give the strongest impulse to collective thought and feeling.  The small circles–from the family on up–with their human warmth and natural solidarity, are giving way before mass and concentration, before the amorphous conglomeration of people in huge cities and industrial centers, before rootlessness and mass organization, before the anonymous bureaucracy of giant concerns and, eventually, of government itself, which holds this crumbling society together through the coercive machinery of the welfare state, the police, and the tax screw.  This is what was ailing modern society even before the Second World War, and since then the illness has become more acute and quite unmistakable” (p. 7). 

                 The totalitarian sickness of mass democracy remains rooted in the Jacobin ideology spawned by radicals like Robespierre in the French Revolution.  It is, above all, religious in nature.  In his earlier writings Ropke dealt almost singularly with economics.  By 1960, in A Humane Economy, he decided to clarify his deepest convictions:  “the ultimate source of our civilization’s disease is the spiritual and religious crisis which has overtaken all of us and which each must master for himself.  Above all, man is Homo religiosus, and yet we have, for the past century, made the desperate attempt to get along without God, and in the place of God we have set up the cult of man, his profane or even ungodly science and art, his technical achievements, and his State” (p. 8).  All our technical grandeur, without God, cannot but destroy us.

                 Replacing God, the collectivist State is now our gravest enemy, “the most immediate and tangible threat” we face (p. 33).  In 1787 Goethe prophetically said:  “I must say, I believe that humanism will eventually prevail; but I am afraid that at the same time the world will become a huge hospital, with everyone nursing his neighbor” (pp. 163-64).  Whereas classic liberalism defends human freedom, seeking to maximize individual liberty, Jacobins stress human equality and try to redistribute the world’s wealth.  Accordingly, “The state and the concentration of its power, exemplified in the predominance of the budget, have become a cancerous growth gnawing at the freedom and order of society and economy.  Surely, no one has any illusions about what it means when the modern state increasingly–and most eagerly before elections, when the voter’s favor is at stake–assumes the task of handing out security, welfare, and assistance to all and sundry, favoring now this and now that group, and when people of all classes and at all levels, not excluding entrepreneurs, get into the habit of looking on the state as a kind of human Providence” (p. 33).

                In a lengthy chapter entitled “Welfare State and Chronic Inflation,” Ropke addresses two closely linked and equally destructive aspects of the post-WWII era.  Voters in a democracy easily succumb to the allure of free social services and cheap money, so they periodically indulge in a legalized “robbery by the ballot.”  “It cannot be repeated too often that what is given to the one must be taken from the others, and whenever we say that the state is to help us, we are laying a claim to somebody else’s money, his earnings or his savings” (p. 174).   Though disguised by adroit politicians as “compassion,” the driving passion of the masses is envy, the desire to bring down those who prosper and distribute their wealth.  “It is,” he says, “a state which deprives people of the right to dispose freely of their income by taking it away from them in taxes and which, by compensation, and after deduction of the extraordinarily high administrative costs of the system, takes over the responsibility for the satisfaction of the more essential needs, either wholly (as in the case of education or medical care) or in part (as in the case of subsidized housing or food).  What people eventually retain from their income is pocket money, to be spent on television or football pools” (p. 158). 

                To resist such statism, Ropke continually praises sound currency and private property, essentials for a free society.  A free market economy requires “the institution of private ownership, in the true sense of legally safeguarded freedom to dispose of one’s own property, including freedom of testation” (p. 94).  Inflation, encouraged by economists like John Maynard Keynes and implemented by politicians like Franklin D. Roosevelt, eats away at the innards of a good society.  A certain euphoria always surrounds inflation, but generally “things happen just as they are described in the second part of Faust in the famous paper-money scene:  ‘You can’t imagine how it pleased the people.’  But that is precisely the dangerous seduction of inflation:  it begins with the sweet drops and ends with the bitter” (p. 195).  And, equally important, free folks need houses, lands, savings accounts–resources sustaining their independence from the state.  Unfortunately, John Locke’s firm conviction–that we are by nature entitled to life, liberty, and property–no longer stands.  Private property has increasingly been subjected to state control.   

                 Though Ropke mastered the arcane and technical details of his discipline, he insisted that economics is fundamentally a “moral” rather than a “natural” science.  In A Humane Economy one encounters a morally committed Christian economist who provides us with one of the finest books I’ve read on this subject.


                Soon after fleeing Germany, while teaching in Turkey, Ropke published Economics of the Free Society in 1937, shortly before the Nazis marched into Vienna (Grove City, PA:  Libertarian Press, Inc., 1994).   One finds herein a marvelous introduction to the discipline of economics–readable and understandable without compromising its scholarly integrity.  One learns about the importance of the monetary system, the division of labor, marginal utility, etc.  Ropke knows how to use illustrations and simplify theoretical abstractions.  He also defends the classical, liberal, free market economy.  Amazingly enough, the apparent anarchy of the bewilderingly complex free market produces order.  Unlike the “commanded order” of socialist systems, the “spontaneous order” of the free market provides the very best economy known to man.  Importantly, however, “He who chooses the market economy must, however, also choose:  free formation of prices, competition, risk of loss and chance for gain, individual responsibility, free enterprise, private property” (p. 268). 

                He explains, for example, why money–serving as a medium of exchange, a common denominator–is a positive good.  Money enables the free market to thrive, makes possible the credit system, savings and investment, efficiently enabling the distribution of goods.  “It is, as Dostoievsky once expressed it, ‘coined freedom'” (p. 88).  Consequently, debasing the currency, indulging in inflation, deeply harms the economy–and ultimately persons.  Sound money (preferably tied to the gold standard, Ropke insists) truly blesses mankind.  Reams of learned articles have sought to escape the fact, but “for thousands of years men have continued to regard gold as the commodity of highest and surest worth and as the most secure anchor of wealth.  One may protest this as often as one likes–the fact remains” (p. 107). 

                He also helps one understand the rich and the poor.  And understanding is truly needed in an area routinely distorted by demagogues of various sorts.  Railing against the rich on behalf of the poor gains considerable currency for public figures.  “But instead of trying to acquire the facile reputation of a ‘social-minded’ man by vague demands for a ‘just wage,’ by railing against ‘interest slavery’ and ‘profiteering,’ by emotional outpourings over ‘gluttonous landlords’ and real estate ‘speculators,’ and instead of shoving aside as ‘liberalistic’ the objections of those who understand something of these matters, one would serve his country better by applying himself to an unprejudiced study of the complex interrelationships of the economy” (p. 195).  Would that professors and preachers and politicians learned something before pontificating on economic injustices! 

                The free market, basic to a healthy economy, rightly functions only within a deeply moral society.  Healthy markets “cannot function unless there is general acceptance of such norms of conduct as willingness to abide by the rules of the game and to respect the rights of others, to maintain professional integrity and professional pride, and to avoid deceit, corruption, and the manipulation of the powers of the state for personal and selfish ends.  The big question of our time is whether we have been so heedless and unsparing in the use of our moral reserves that it is no longer possible to renew these vital props of our economic system and whether it is yet possible to discover new sources of moral strength” (p. 27). 

                Summing up his case, Ropke says that his “‘third road’ of economic policy is, above all a road of moderation and proportion.  It is incumbent upon us to make use of every available means to free our society from its intoxication with big numbers, from the cult of the colossal, from centralization, from hyper-organization and standardization, from the pseudo-ideal of the ‘bigger and better,’ from the worship of the mass man and from addiction to the gigantic.  We must lead it back to a natural, human, spontaneous, balanced, and diversified existence” (p. 271).  Sadly enough, amidst incessant progress man “forgot man himself:  forgot his soul, his instincts, his nerves and organs” (p. 271).  What Ropke proposed, in 1937, was that we take the road “favoring of the ownership of small and medium-sized properties, independent farming, the decentralization of industrial areas, the restoration of the dignity and meaning of work, the reanimation of professional pride and professional ethics, and the promotion of communal solidarity” (p. 271). 

# # #

126 Recalling Education

                Hugh Mercer Curtler, Professor of Philosophy and Director of the Honors Program at Southwest State University in Marshall, Minnesota, urges us, in Recalling Education (Wilmington:  ISI Books, 2001), to undertake “a revolution in American education” (p. ix).  Appropriating Thomas Jefferson’s axiom that a nation needs a revolution every 20 years, Curtler argues that higher education should, above all else, liberate young people from the various shackles that threaten to imprison them.  Importantly, the liberty he champions is (as Dostoevsky defined it) “only the mastering of one’s self” (p. 1).  Accordingly, John Locke cautioned us not to allow someone “unrestrained liberty before he has reason to guide him” (p. 42).

                Curtler believes that the educational ‘situation has never been as bad as it is at present’ (p. 162).  Despite the vast numbers of students and universities, despite the superficial glamour they exude, ‘Our experiment with higher education, on balance, must be regarded as one of the great disappointments of the twentieth century’ (p. 162).  This is so, in part, because much ‘education’ has been reduced to ‘schooling’ and ‘job training,’ replete with the acquisition of information rather than understanding.  Accordingly, students now graduate from universities without anything resembling a ‘liberal education.’ 

                The schools, of course, mouth slogans celebrating ‘multiculturalism,’  ‘self-actualization,’ and ‘self-esteem,’ but these are, Curtler insists, the converse of real liberty, which comes with the cultivation of character, the discipline of desires, the conquest of irrational instincts.  Education, as the ancient Greeks insisted, should incubate arete:  human excellence.  Such cannot be programmed or indoctrinated, but it can be encouraged.  Educators, as Aristotle said, can provide a positive environment within which moral and intellectual virtues flourish.  Teaching the right skills, such as reading, and telling the right stories, illustrating the difference between right and wrong, provide youngsters with the mental and moral muscles necessary to become excellent human beings. 

                So, Curtler argues, colleges and universities must recover their main mission, rooted in the liberal arts.  Restoring required general education courses to one-third of the graduation requirements (as done by PLNU and other Nazarene universities) would be a major step in that direction. 


                Sharing Curtler’s concern, Richard T. Hughes, in How the Christian Faith Can Sustain the Life of the Mind (Grand Rapids:  William B. Eerdmans Publishing Company, c. 2001), urges believers to blend heart and mind in their walk with God.  A Distinguished Professor of Religion and director of the Center for Faith and Learning at Pepperdine University, Hughes is a proven scholar with an established commitment to higher education.  There is, he believes, a fatal dichotomy in the nation’s mind (a dichotomy, I might add, as ancient as the Athens-Jerusalem tension that pitted Tertullian against Clement of Alexandria):  secular institutions ignore religious realities; church institutions, reacting, minimize the life of the mind.

                Some non-Christians know nothing about the Bible; some Christians know nothing but the Bible.  Hughes, however, argues that ‘dynamic Christian faith requires that we learn to make connections and to think creatively about the meaning of what we believe.  We call this kind of thinking ‘theology,’ and if we have any hope that Christian faith might sustain the life of the mind, every Christian scholar must learn to work as a theologian in his or her own right’ (p. 6).  We’re called to live in two worlds.  Or perhaps we’re called to live in the real world, where neither secular nor sacred is depreciated. 

                Hughes thus examines various stances, ranging from the Catholic ‘sacramental principle’ to the Reformed concern for ‘transformation’ to the Mennonites’ holism to the Lutheran ‘theology of the cross’ and its faith/doubt paradoxes.  Each approach has its strengths, duly acknowledged.  What Hughes insists is that one take a stand and address his world from within a clearly Christian position.  As a member of the Church of Christ, he himself takes a Reformed approach, sizeably influenced by Lutheranism. 

                When he describes how he teaches, and the sources he uses, however, it becomes clearer what Hughes envisions.  Paul Tillich’s theology, he says, enables him to engage students in suitable ways.  Howard Zinn’s A People’s History of the United States he finds ‘profoundly Christian in its orientation, not because the book is written by a Christian since, of course, it is not.  Rather this book embraces the same ‘upside-down’ values that we have been taught by the Christian faith’ (p. 121).  If ‘upside-down’ means devious and distorted, no doubt Zinn should be used!  Zinn’s radical ‘history’ mentions Pilgrims and Puritans only in their role as Indian-killers, deletes Jonathan Edwards and Charles Finney from the nation’s story, and generally celebrates socialism (Zinn’s personal agenda) everywhere.  That Hughes ‘embraces’ one of the most radical, slanted ‘New Left’ history texts ought to give us reason to question his reliability! 

                When we learn what texts he assigns in his ‘Religion and Race in America’ class, we grasp how the good professor–earlier opposed to ‘indoctrinating’ students in theology–fervently does precisely that when he approaches really important issues such as racism.  Christian doctrines must be dealt with dispassionately–or even dutifully doubted.  But on issues such as race prejudice there can be no questions!  Finally, we’re urged to duplicate the ‘passion’ of Chris Lovdjieff, a San Quentin inmate who so impressed his fellow inmate, Eldridge Cleaver, that he called him ‘The Christ.’  Given Cleaver’s subsequent trajectory, one rather wonders how his jailhouse teacher provides us a model for emulation.  How Hughes does all this while claiming to stay rooted in a cogently Christian worldview frankly eludes me!

                From Lovdjieff in prison, we segue to Parker Palmer’s treatise, The Courage to Teach, with its Quaker concern for the ‘Inner Light’ and a ‘circle of seekers’ gathered together to discuss ‘great things.’  We ‘teach who we are,’ Palmer says (p. 145), and Hughes agrees.  He’s teaching himself, properly clad in Tillich, Zinn, and Eldridge Cleaver!


                Robert Benne, a professor of religion at Roanoke College, a ‘partly secularized church-related’ Lutheran college, offers us Quality with Soul:  How Six Premier Colleges and Universities Keep Faith with Their Religious Traditions (Grand Rapids:  William B. Eerdmans Publishing Company, c. 2001).  Arguing against doomsayers, such as James Burtchaell in The Dying of the Light, Benne finds much to commend in six exemplary schools:  Calvin, Wheaton, Valparaiso, Notre Dame, Baylor, and St. Olaf.

                Benne admits that virtually all Christian colleges, in time, slip away from their denominational ties and theological commitments.  Certain trajectories, such as declining numbers of students and faculty from the sponsoring denominations, the shunting aside of chapel, the elevation of humanitarian service projects over personal piety, and the loss of a clearly articulated theological vision, generally reveal this secularizing process.  But the process is neither inevitable nor irreversible.  What Benne wants to discover, in the ‘premier’ institutions he studies, is the secret to keeping Christian colleges Christian. 

                Calvin College, for example, clearly subscribes to the tightly wrapped theology of the Christian Reformed Church.  Professors both understand and articulate the college’s theologically-grounded worldview.  ‘With its emphases on the importance of education, cultural formation and preservation, and even covenant theology, the Christian Reformed subculture is arguably Protestantism’s counterpart to Judaism’ (p. 70).  Similarly, Wheaton College, a non-denominational evangelical institution, blends the characteristic themes of biblicism, conversionism, activism, and crucicentrism, and sustains a vigorous Christian educational program.  Both colleges have a clearly articulated vision, a vibrant chapel program, and demonstrate the possibility of ‘quality with soul.’ 

                Baylor University (Southern Baptist) and Notre Dame University (Roman Catholic) lack some of the clear indices of Christian institutions, but Benne thinks they have maintained a solid commitment to their traditions and demonstrate much religious vitality.  In his own theological tradition, two Lutheran colleges, Valparaiso and St. Olaf, seem even more tenuously ‘Christian’ in their vision and commitment.  But Benne stresses that there are possibilities for revival and renewal.  Though for some it may be a ‘long road back’ to the vision and spiritual vitality of their founders, such schools may very well find it. 

                Benne provides us with positive models–especially Calvin and Wheaton–evidence sustaining hope for the risky endeavor of keeping Christian colleges Christian.  But compared with the substantial research and sobering data of Burtchaell, the largely anecdotal and impressionistic materials in Quality with Soul leave one wondering about its generally Pollyanna-style spin.  Having read (in World Magazine) that Congressman Tom DeLay urges folks not to send young people to Baylor because of its liberalism, one finds it perplexing that Benne recommends it for its ‘Christian’ commitment.  


                Rather than lamenting the demise of liberal education, Jeffrey Hart reveals how it should be done in Smiling Through the Cultural Catastrophe:  Toward the Revival of Higher Education (New Haven:  Yale University Press, c. 2001).  He’s now Professor Emeritus of English at Dartmouth College.  “For years,” Tracy Lee Simmons says, “Hart delivered a series of lectures to incoming students at Dartmouth on the theme ‘How to Get a College Education, Even If You’re in the Ivy League.’  The point was that no college or university guaranteed such an education:  It had to be pursued consciously, zealously, step by step.  Hart gave his audience practical pointers to use when choosing coursework and professors.  One was to stay away from courses with the word ‘studies’ in their titles–like ‘African-American Studies’ or ‘Women’s Studies’–they’re likely to be seedbeds of radical ignorance.  Keep to normal courses like ‘American Colonial History’ and ‘Seventeenth Century English Poetry.’  (These are risky enough.)  Another pointer was to avoid any professor who doesn’t come to class wearing a coat and tie ‘unless he’s won a Nobel Prize’” (Crisis {April 2002}, pp. 51-52). 

                Hart’s persuaded that, as Aleksandr Solzhenitsyn says, ‘A people that no longer remembers has lost its history and its soul’ (p. vi).  Educators’ great task, Hart insists, is to help coming generations remember.  College professors, especially, must sustain the memory of the great works which anchor the grandeur of Western Civilization.  Quoting one of his own influential teachers, the philosopher Eugen Rosenstock-Huessy, he asserts that education should enculturate citizens–and “a citizen is a person who, if need be, can re-create his civilization” (p. ix). 

                Western Civilization developed, Hart holds, as a result of the creative tension–a dialectic–between Athens and Jerusalem, blending the lofty goals of spiritual and intellectual perfection.  “Plato and the Prophets,” said Hermann Cohen, “are the most important sources of modern culture” (p. 3).  The views of the Greeks and the Jews came together when important thinkers, beginning with St. Paul, brought them into the synthesis that defined the West. 

                First Hart discusses (by carefully reading their texts) Homer and Moses, two heroic figures who tower over Athens and Jerusalem.  Commenting on the Commandment–’You shall have no other gods before me’–Hart asserts:  ‘Everything else follows from that.  In the great Psalm about the Law, 119, the longest and most elaborately wrought in the Psalms, the psalmist describes the Law as ‘true’ (emeth).  That is, the Law is rooted in Being, in actuality, in the way things are.  Its religious and ethical injunctions are not opinions or recommendations; there are moral and religious rules that are true just as there are observable principles operative in the world of nature’ (p. 63).  Consequently, the Law serves as an operator’s manual for both individuals and societies.

                Then come Socrates and Jesus, who internalized the heroic attributes of their progenitors.  Socrates, Plato shows, ‘internalized the Greek heroic tradition that came down to him as refracted through Homer.  The heroism of the battlefield and the pursuit of arete became heroic philosophy and the pursuit of truth, even at the cost of life itself.’  Four centuries later, ‘Jesus radically internalized the heroic tradition of the patriarchs, Moses, and the Prophets, refining it to an intense concentration on the inward condition of holiness, anchoring the older Law in the purified soul’ (p. 73). 

                Hart gives extensive, perceptive attention to Jesus.  His words, recorded in the four gospels, are ‘eloquent, memorable, often mysterious.  Into the world of the narrative voices there comes this entirely different voice.  . . . .  What this seems to show is that Jesus could not have been created as a fictional or semifictional character even by men who were close to him but virtually had to be part of a recollection they shared, however derived, of an extraordinary person.  Those who wrote the narrative prose could not have imagined the man who spoke as their central figure’ (p. 89). 

                Above all else, Jesus calls us to holiness.  ‘Jesus wants not only good behavior but a radical purification of being’ (p. 95).  ‘We begin to grasp Jesus’ goal for all of us:  the condition of holiness in which the inner self is so disciplined, so perfect, that no stain can possibly adhere to it’ (p. 96).  This call shines forth most clearly in the Sermon on the Mount.  Indeed, Hart wonders, ‘Might it not be that the state of perfect holiness that Jesus asks for in his Sermon on the Mount resembles the mind of God encountered in Genesis?’ (p. 97).  Yes, indeed: ‘What Jesus does in his Sermon on the Mount is concentrate the theme of holiness that can be found in the Hebrew Bible, concentrate it to a sharp point and, as he says, ‘fulfill’ it.  It could be argued that the Hebrew Bible in its deep structure yearns for fulfillment in such a hero as this, who embodies the triumph of holiness in word and act’ (p. 101).

                Drawing together Athens and Jerusalem, St. Paul ‘stands at the center of a mighty transformation, the coming together of biblical tradition and Greek philosophy’ (p. 105), giving birth to the Western mind.  ‘When you trace Western thought back along its many roads you find Paul standing there at a moment of strategic crystallization.  Read Augustine, Dante, Luther, Shakespeare, Milton, Swift, Dostoyevsky–you are aware of Paul’ (p. 120).  So Professor Hart helps us read and think about some of the great ‘classics’ of the Western tradition, working within the Pauline synthesis. 

                To illustrate Hart’s approach, consider his comments on Dante.  ‘The souls in Dante’s Inferno are not placed there by some external agency, throwing them into jail against their wills.  In fact they go willingly to the location in Hell appropriate for them.  They had chosen their Hell while still alive.  Their wills never turned against their choice.  Dante often speaks in his poem of the ‘sweet world’ and gives many examples of it in his similes.  Those in Hell lost this sweet world while they were in it through the distortions of their actual choices, their defective wills.  In external appearance, while in the world, they might have been handsome or fair, and prosperous and powerful, but internally they had turned away from the sweet world and also from their highest good.  Their destiny in Hell, as Santayana says, ‘is just what their passion, if left to itself, would have chosen.  It is what passion stops at, and would gladly prolong forever.’  In Hell, to put it another way, they achieve the ideal form of what they had willed all along without ceasing to will it’ (p. 152).

                The book’s title is somewhat misleading, for it exudes thanksgiving for the riches of the Western tradition, relished by a thoroughly experienced scholar who puts it in the best light.  However, in his ‘Afterword,’ Hart ventures to explain the ‘cultural catastrophe’ responsible for crushing liberal arts education.  When he began his studies, as an undergraduate at Columbia University in 1948, the Western tradition, represented by the ‘great books,’ was at the core of a student’s studies.  Twenty years later, amidst a cultural conflagration, that tradition went up in flames.  ‘What these eruptions appear to have had in common was an antinomian dislike of rules, a rebellion against genuine learning and authority, and an egalitarian abandonment of distinctions between the important and the unimportant, even between the prose on a cereal carton and the poetry of Shakespeare.  In their overall thrust, which amounted to a kind of reverse sentimentalism and unjustified rage, these moods appeared to be hostile to Western civilization itself’ (p. 246).

                And indeed they were.  Now hosts of ‘critics’ deconstruct rather than interpret.  Ever alert to various villains, ‘Their own tone is often snarling and accusatory.  Needless to say, the villain always turns out to be variously white, male, Western, racist, imperialist, sexist or homophobic–or, with luck, all of them together.  The result of this is not literary experience but an endless repetition of slogans and cliches’ (p. 246).  Consequently, students rarely receive the ‘education’ they deserve. 

                And yet . . . and yet there’s hope!  Hart ends his treatise with an encomium to Chaucer’s ‘clerc,’ the scholar who forfeited food in order to buy a set of Aristotle’s works. 

                                Of studie took he most cure and most hede.

                                Nought o word spak he more than was neede . . .

                                Souninge in moral vertu was his speche.

                                And gladly wolde he lerne and gladly teche. (p. 249)

So be it!  However desperate things appear, we need clercs of Chaucer’s stripe.  He gladly learned and taught.  So it is indeed possible, even necessary, to smile and go on teaching–amidst the cultural catastrophe.