215 “The Battle”

While Arthur C. Brooks was Professor of Business and Government Policy at Syracuse University he wrote Who Really Cares and Gross National Happiness—books I’ve favorably reviewed in the past.  Now he is president of the American Enterprise Institute for Public Policy Research and recently released The Battle:  How the Fight Between FREE ENTERPRISE and BIG GOVERNMENT Will Shape America’s Future (New York:  Basic Books, c. 2010), a brief treatise eminently worth pondering.  In a profound way the book strongly reaffirms the message of the Liberty Bell—a phrase taken from Lev. 25:10—“proclaim liberty throughout the land.”  

Though today’s “culture war” appears to be primarily economic and political it is actually quite philosophical, involving “a struggle between two competing visions of America’s future. In one, America will continue to be a unique and exceptional nation organized around the principles of free enterprise.  In the other, America will move toward European-style statism grounded in expanding bureaucracies, increasing income redistribution, and government-controlled corporations.  These competing visions are not reconcilable:  We must choose” (p. 1).  For himself, Brooks chooses free enterprise—“the system of values and laws that respects private property, encourages industry, celebrates liberty, limits government, and creates individual opportunity” (p. 3) that certainly characterized this nation in its formative years.  

America’s founders, after all, waged a war of independence to escape onerous taxation.  “Give me liberty or give me death,” said Patrick Henry.   “‘They who can give up essential liberty to obtain a little temporary safety, deserve neither liberty nor safety,’ declared Benjamin Franklin” (p. 3).  Furthermore, said Thomas Jefferson:  “‘To take from one, because it is thought his own industry and that of his fathers has acquired too much, in order to spare to others, who, or whose fathers, have not exercised equal industry and skill, is to violate arbitrarily the first principle of association, the guarantee to everyone the free exercise of his industry and the fruits acquired by it’” (p. 4).  And James Madison asserted that “the ‘first object of government is the protection of different and unequal faculties of acquiring property’” (p. 104).  Uniquely this “land of our fathers” has been a land of liberty—affording the freedom to live free from either tyrannical control or paternalistic care.  

Attuned to this nation’s tradition, free enterprise retains widespread popular support.  Brooks insists that at least 70 percent of the American people prefer free enterprise capitalism to various versions of centralized state socialism.  The other 30 percent, however, form a powerful coalition “led by people who are smart, powerful, and strategic” (p. 13).  They are the university professors and Hollywood celebrities, the journalists and judges—the “intellectual upper class:  those in the top 5 percent of the population in income, who hold graduate degrees, and work in intellectual industries such as law, education, journalism, and entertainment” (p. 13).  The 30 percent coalition emerged in FDR’s New Deal, and President “Obama wants to finish the job by turning it into a permanent ruling majority” (p. 66).  

“America is a 70-30 nation in favor of free enterprise.  Yet the 30 percent coalition is firmly in charge” (p. 27).  The reason is simple:  the financial crisis in 2008 enabled the elite minority (personified by Barack Obama) to take control of the country and orchestrate sweeping “change.”  No expenditures were considered inordinate, and within two months “the U.S. government and the Federal Reserve had spent, lent, or pledged some $12.8 trillion of America’s future prosperity.  It was an enormous amount, rivaling the value of everything produced in the U.S. economy in 2008” (p. 32).  All this was demanded, Obama intoned, because the George Bush administration and Wall Street had failed to rightly regulate markets and curb capitalism’s excesses.  

“Unfortunately for America,” Brooks says, “the Obama Narrative is wrong on every point” (p. 35).  Government (not business) and politicians (not bankers) actually caused the 2008 financial crisis.  “The government’s failure is most blatant in the implosion of Fannie Mae and Freddie Mac.  Through these two government-sponsored enterprises (GSEs), politicians pulled off some of their most dramatic and costly efforts at social engineering.  At the same time, they enriched their political campaigns.  And in the process, they perverted the most basic rules of the free enterprise system” (p. 36).  The housing market’s collapse sparked the financial collapse, and the housing market collapsed because ideologues like Barney Frank and Chris Dodd “sparked the fire that burned down our financial system” (p. 41).  

Trusting politicians such as Barney Frank to resolve the recession they helped cause is, of course, ludicrous.  But the Obama Narrative insists we do precisely that—as was evident in the bill recently passed to regulate Wall Street.  Still more:  the president insists we can “spend our way out of this recession” and continually calls for bail-outs and subsidies of various sorts.  Unfortunately, “attempts to shore up the economy with massive public spending have done little to improve matters and have served primarily to chain future generations with debt” (p. 55).  Unemployment persists and the GDP stalls amidst profligate government spending!  

Many big government devotees pursue their agenda fueled by faith that they are providing “happiness” for the masses.  They imagine that we all crave equality, so giving everyone the same amount of money or entitlements or assets will make everyone equally happy.  As philosophical materialists they fail to understand “that the secret to human flourishing is not money but earned success in life” (p. 71).  Unearned income—welfare in various guises—is especially pernicious, almost guaranteeing unhappiness!  Only “earned success”—often attained in non-monetary realms—satisfies the soul.  This necessarily follows:  “If money without earned success does not bring happiness, then redistributing money won’t make for a happier America” (p. 81).  A truly good government, then, does not dole out goods but grants the freedom to pursue the happiness that only comes through earned success.  

Unfortunately, Brooks acknowledges, advocates of free enterprise such as himself too often rely on strictly economic rather than ethical categories.  But, he argues, though “we often use the language of commerce and business, what we really believe in is human flourishing and happiness.  We must articulate a set of moral principles that set forth our fundamental values and principles” (p. 97).   Importantly, this “is the first and most important of these moral principles:  The purpose of free enterprise is human flourishing, not materialism” (p. 97).  Added to this are four subsidiary principles:  2) seek equal opportunity, “not equality of income;” 3) promote prosperity rather than alleviate poverty; 4) celebrate America as “a gift to the world;” and 5) commit to ethical principles, “not political power” (p. 103).  If such principles are clearly explained and vigorously defended, Brooks believes, the battle can be won.

* * * * * * * * * * * * * * * * * * * * *

Angelo M. Codevilla, professor emeritus of international relations at Boston University, shares Brooks’ conviction that there is a “battle” raging in America that will determine the nation’s destiny.  He discerns a class conflict dividing America, but it is not the one Marx envisioned; it is essentially moral rather than economic, spiritual rather than material.  In The Ruling Class:  How They Corrupted America and What We Can Do About It (New York:  Beaufort Books, c. 2010) Codevilla sets forth “the massive fact that underlies all these issues and makes each a battlefield on which vie partisans of radically different Americas” (p. xv).  The battling partisans are the Ruling Class and the Country Class, and they are waging a Kulturkampf involving marriage and family, sexual orientation and values, as well as political and economic questions.  

The Ruling Class congealed as the 19th century closed, embracing Progressivism as its ideology, confident “that because man is a mere part of evolutionary nature, man could be improved, and that they, the most highly evolved of all, were the improvers” (p. 17).   Professionals—educated experts—needed to orchestrate the development of the modern industrial state.  Progressive leaders, generally upper class intellectuals such as Woodrow Wilson, “imagined themselves to be the world’s examples and the world’s reformers,” and they “dreamed big dreams of establishing order, justice, and peace at home and abroad” (p. 17).  They favored “making the world safe for democracy” abroad and social reform (e.g. a graduated income tax and women’s suffrage and prohibition) at home.  Doing so, as Wilson recognized, meant the government must expand beyond the clear limits of the Constitution, envisioning a “‘living’ Constitution that does not so much restrict government as it confers ‘positive rights’—meaning charters of government power.  Thus they slowly buried eighteenth-century words with twentieth-century practice” (p. 37).  

With Franklin D. Roosevelt’s New Deal progressives gained firm control of the federal government and determined to further Wilson’s agenda and fundamentally change the nation.  In short order, Codevilla declares:  “The America described in civics books—in which no one could be convicted or fined except by a jury of his peers for having violated laws passed by elected representatives—started disappearing when the New Deal inaugurated today’s administrative state, in which bureaucrats make, enforce, and adjudicate nearly all the rules” (p. 41).  Illustrating the culmination of this process was Speaker of the House Nancy Pelosi, who, in 2010, when asked where the Constitution permits the federal government to “force every American to purchase health insurance . . . replied:  ‘Are you kidding?  Are you kidding?’  It’s no surprise, then, that lower court judges and bureaucrats take liberties with laws, regulations, and contracts” (p. 45).  

Our Ruling Class (whether Republican or Democrat), Codevilla insists, “does not like the rest of America” (p. 25).  Rather, they look down on ordinary folks.  Certainly they pity—feel “compassion” for—their inferiors.  But they find the masses too backward and ignorant and religious—too enamored of God and guns and heterosexuality—to know what’s really good for them.  He cites an illuminating anecdote from Mikhail Gorbachev, who recalled “that in 1987, then-Vice President George H. W. Bush distanced himself from his own administration by telling Gorbachev, ‘Reagan is a conservative, an extreme conservative.  All the dummies and blockheads are with him.’  This,” Codevilla concludes, “is all about a class of Americans distinguishing itself from its inferiors” (p. 25).  These inferior folks need to be led (or coerced) by their betters through the mechanism of a strong, centralized government.  Dexterously pulling the strings of power (as Aristotle feared would transpire in democracies), our rulers transfer “money or jobs or privileges—civic as well as economic”—to themselves and their favored special interest groups (p. 28).   One aspect of this is the “crony capitalism” evident in the close alliance between politicians and financial institutions.  A Republican administration rescued Bear Stearns in 2008 and a Democrat administration did the same for auto industry labor unions in 2009.  “The regulators and the regulated become indistinguishable, and they prosper together because they have the power to restrict the public’s choices in ways that channel money to themselves and their political supporters” (p. 31).  

Battling the Ruling Class is a Country Class that resents the “ever-higher taxes and expanding government, subsidizing political favorites, social engineering, approval of abortion, etc.” (p. 52).  It’s distinguished by a commitment to “marriage, children, and religious practice” (p. 53). Individuals, not officials, should make decisions regarding what constitutes the good life, and churches, not bureaucracies, should determine ultimate truths.  The Country Class believes in self-government and shares Thomas Jefferson’s notion that “good government” never takes “‘from the mouth of labor the bread it has earned’” (p. 69).  Codevilla obviously sides with the Country Class and concludes his essay with practical suggestions regarding battle strategies—reducing taxes, returning schools to local control, restoring citizens to their rightful place in the republic, etc.  

Though Codevilla’s essay blends jeremiad with manifesto, it makes clear some of the major issues in America and provides a helpful analysis of underlying aspects of our very real Kulturkampf.   As the distinguished political author of The Almanac of American Politics, Michael Barone, says:  “Angelo Codevilla puts into words what has been troubling an increasing number of Americans about our politics and our government.  It’s a cri de coeur that needs to be read by anyone trying to understand what’s happening in public life in America today.”  

* * * * * * * * * * * * * * * * * * * * *

Whereas Angelo Codevilla and Arthur Brooks write for ordinary folks, a decade ago the celebrated Peruvian economist Hernando de Soto brought a related message to a more scholarly audience in The Mystery of Capital:  Why Capitalism Triumphs in the West and Fails Everywhere Else (New York:  Basic Books, c. 2000).  Prosperity obviously accompanies capitalistic economic systems, but these have only thrived in the West.  Capitalism, despite well-intentioned efforts, doesn’t easily transplant.  Since people work hard and save money everywhere, one wonders why this is so.  “In this book,” he says, answering that question, “I intend to demonstrate that the major stumbling block that keeps the rest of the world from benefiting from capitalism is its inability to produce capital” (p. 5).  Capital is produced only when property rights are legally secured.  Impoverished peoples have lots of “things, but they lack the process to represent their property and create capital.  They have houses but not titles; crops but not deeds; businesses but not statutes of incorporation” (pp. 6-7).  They have lots of personal assets but not the representational system—formal property—necessary to exploit them.  “Formal property,” de Soto says, “is more than a system for titling, recording and mapping assets—it is an instrument of thought, representing assets in such a way that people’s minds can work on them to generate surplus value.  That is why formal property must be universally accessible:  to bring everyone into one social contract where they can cooperate to raise society’s productivity” (p. 218).  

To explain this situation de Soto explores the five “mysteries” of capital that structure his book.  First there is “the mystery of missing information.”  Allegedly impoverished people in Third World and formerly Communist countries are not, he insists, materially impoverished!  “The poor have already taken control of vast quantities of real estate and production” (p. 30).  They may live in shacks in dismal slums, but they have cell phones and TVs and state-of-the-art tools.  This fact never appears in international agencies’ (or humanitarian and religious organizations’) reports (which continually stress distress) because it is not reported.  It cannot be reported because these possessions have no legal standing, and the poor are forced to rely on informal ownership pacts and operate outside the law.  “They have trillions of dollars in dead capital, but it is as if these were isolated ponds whose waters disappear into a sterile strip of sand, instead of forming a mighty mass of water that can be captured in one unified property system and given the form required to produce capital” (p. 210).  

Secondly, there is “the mystery of capital.”  Just as Ludwig Wittgenstein recognized that “the sense of the world must lie outside the world,” so too the meaning of capital is found outside the physical world in an immaterial, symbolic realm wherein people can “grasp with the mind values that human eyes could never see and to manipulate things that hands could never touch” (p. 63).  Capital must be differentiated from money, which merely facilitates commerce.  Capital comes when property, legally secured with written contracts, deeds etc., is used to creatively envision and develop resources.  “The moment you focus your attention on the title of a house, for example, and not on the house itself, you have automatically stepped from the material world into the conceptual universe where capital lives” (p. 50).  

Thirdly, de Soto deals with “the mystery of political awareness.”  Around the world people are flocking from rural to urban areas, determined to prosper (whether illegally or legally) by entering the world marketplace.  Unfortunately the legal structures in many countries hamper their entrepreneurial endeavors.  Consequently, “people are spontaneously organizing themselves into separate, extralegal groups until government can provide them with one legal property system” (p. 73).  Without the security of legal protection they are limited to a relatively small circle of trusted trading partners.  Any significant division of labor and its resultant prosperity is denied them.  

Fourthly, we should study “the missing lessons of U.S. history,” because 150 years ago the U.S. was a “Third World” country that rapidly developed and prospered.  In the New World, particularly on the western frontier, many European systems (notably England’s common law) dissolved as squatters and settlers and miners and claims associations simply took and held desirable lands.  “Squatters began inventing their own species of extralegal property titles known as ‘tomahawk rights,’ ‘cabin rights,’ and ‘corn rights’” (p. 116).   In time these frontiersmen petitioned and pressured politicians to grant them suitable legal titles.  Soon there developed a “legal innovation” known as “‘preemption’—a principle that would be the key to the integration of extralegal property arrangements in American law over the next two hundred years” (p. 120).  During the 19th century, especially, laws were adapted to the peoples’ needs and the modern American system, strongly defending property rights, emerged.  In de Soto’s judgment, “the American experience is very much like what is going on today in Third World and former communist countries:  The official law has not been able to keep up with popular initiative, and government has lost control” (p. 149).  When and if (and only when and if) property rights are secure, we can expect dynamic developments around the world.  

Basically, then, the poverty problem is a legal problem.  Infinite amounts of “aid” (whether governmental or private) will hardly heal the endemic abscesses of poverty.  “Without an integrated formal property system, a modern market economy is inconceivable” (p. 164).  Equally important is the elimination of ponderous bureaucratic procedures and self-serving lawyers defending the status quo that severely limit the abilities of industrious individuals to build their own homes and launch small businesses.  For example, in Lima, Peru it takes “728 bureaucratic steps” to “acquire legal title to a home” (p. 191)!  To provide entrée to the modern economy what’s needed is leaders such as Thomas Jefferson, who “increased the fungibility of property by abolishing, among other things, the practice of entail (not being able to transfer property outside the family)” (p. 188).  Such leaders, de Soto emphasizes, must “do at least three specific things:  take the perspective of the poor, co-opt the elite, and deal with the legal and technical bureaucracies that are the bell jar’s current custodians” (p. 190).  

To Margaret Thatcher, “The Mystery of Capital has the potential to create a new, enormously beneficial revolution, for it addresses the single greatest source of failure in the Third World and the ex-communist countries—the lack of a rule of law that upholds private property and provides a framework for enterprise. It should be compulsory reading for all in charge of the ‘wealth of nations.’”  She knew whereof she spoke, and little more needs saying to endorse this book.  

214 Life After Death

“The immortality of the soul,” said Pascal in his Pensées, “is something of such vital importance to us, affecting us so deeply, that one must have lost all feeling not to care about knowing the facts of the matter.”  From the building of Egypt’s pyramids to Socrates’ arguments in Plato’s Phaedo to the obituary notices in the most recent newspapers, the hope for life after death shines clearly in the human story.  This is, no doubt, because, as Ecclesiastes reveals:  God “has put eternity in their hearts” (3:11).  Thus it makes sense that Dinesh D’Souza endeavors—in Life After Death:  The Evidence (Washington, DC:  Regnery Publishing, Inc., c. 2009)—to explain and defend one of the most deeply engrained intuitions of our species.  In what Rudolf Otto called the “sense of the numinous” we find a persuasive awareness of a “world behind the world” (p. 42).  This awareness informs a quip by Woody Allen, who said:  “I don’t want to achieve immortality through my work.  I want to achieve immortality through not dying.  I don’t want to live on in the hearts of my countrymen.  I would rather live on in my apartment” (p. 4).  

Questions regarding immortality clearly haunt Woody Allen, but many of the “Enlightened People” who dominate our universities and media claim to give it little thought.  Atheists such as Richard Dawkins and Daniel Dennett—reductive materialists to the core—deny any prospect of eternal life along with their denials of any traces of God in our world.  They all take for granted the dubious Verification Principle of David Hume without recognizing either its flaws or their own unstated (and perhaps unrecognized) faith in non-provable assumptions.  “In reality,” as D’Souza shows, “Hume’s principle not only wipes out all metaphysical claims; it also wipes out the whole of science” (p. 30)—the very discipline that scientists such as Dawkins revere.  To refute the atheists, to encourage the seekers, to comfort the believers, D’Souza has written this book, the “core” of which “consists of three independent arguments for life after death:  one from neuroscience, one from philosophy, and one from morality” (p. 18).  

Following Einstein and Heisenberg—envisioning a world that gets stranger each decade—physicists have increasingly questioned the materialist assumptions of biologists such as Dawkins.  They wonder at mysterious (and utterly invisible) realities such as dark matter and dark energy which may well constitute 95 percent of all that is!  They revel in the amazing fine-tuning details of the universe—all of which suggest the Anthropic Principle.  It really seems, as “explicitly stated by astronomer Fred Hoyle, that a ‘super-intellect must have monkeyed with the laws of physics’” (p. 85).  Furthermore:  “Fantastic though it sounds, modern physics has legitimated the possibility of the afterlife” (p. 74).   Even biologists increasingly seem to grant an “undeniable teleology” to the living world.  Paul Davies says, “‘The laws of nature are rigged not only in favor of life, but also in favor of mind.  Mind is written into the laws of nature in a fundamental way’” (p. 91).  Studying even the simplest cell reveals thousands of intricately interrelated molecules enabling it to function like a sophisticated factory programmed with digital software, containing information “equivalent to that found in several encyclopedias” (p. 99).  Thus Francis Crick, famed as co-discoverer of the structure of DNA, confessed:  “‘An honest man, armed with all the knowledge available to us now, could only state that in some sense the origin of life appears at the moment to be almost a miracle, so many are the conditions that would have had to have been satisfied to get it going’” (p. 100).  

Drawing upon both physics and biology, neuroscience now opens up vistas for our understanding of life after death, for there’s an inner mental reality that cannot be reduced to matter-in-motion.  “Our minds cannot be accounted for exclusively in terms of our neurons” (p. 107).  Whether I think or feel or remember or dream or love or hate or hope, I’m ever aware that I am—there’s a real me, a conscious self ever aware of a vast variety of undeniable realities.  So we must turn from neuroscience to philosophy as we wonder at the world, seeking meaning to existence.  In our minds we see beauty as well as rocks in mountains, love as well as color in another’s eyes.  There’s a rich inner world that defies all efforts to explain it in purely physical terms.  There’s an immaterial self—a soul, a mind—that most clearly defines us as a person and is most evident in consciousness and free will.  Invisible under a microscope, the soul is self-evidently real to common sense thinkers who recognize realities that cannot be scientifically explained.  Indeed, as Immanuel Kant argued, “there is a part of human nature that transcendentally operates outside the physical laws governing material things” (p. 143).  

An important part of our nature that transcends nature is our moral sense, what Adam Smith called the “impartial spectator.”  Immanuel Kant, in his Critique of Practical Reason, insisted that we must live perfectly in accord with the moral law in order to attain the holiness that constitutes our summum bonum.   We cannot, however, attain that end on earth, so it is imperative for us to enjoy “endless progress” in “an endless duration of the existence and personality of the same rational being (which is called the immortality of the soul).”  Following Kant’s argument—a “postulate” of the practical reason, an ingredient of his “moral faith,” a necessary implication of his “categorical imperative”—D’Souza says:  “There has to be cosmic justice in a world beyond the world in order to make sense of the observed facts about human morality” (p. 172).  Careful study of most religions, as well as philosophy, reveals a link between hope for life after death and cosmic justice:  after death comes the judgment!    

D’Souza finishes his argument for life after death with a rather conventional examination of the claims Christians (such as he) make regarding the Resurrection of Jesus Christ.  Given the historical information we have, nothing makes quite so much sense as the traditional position that He really died and was buried and rose again on the third day.  And, as Augustine stressed, “when we become Christians, we immediately become citizens of the heavenly city.  We don’t have to wait for the next life to get our membership cards; we get them right here and now.  Philosopher Dallas Willard writes that when it comes to eternity there is no waiting period; rather ‘Eternity is now in process’” (p. 234).  

* * * * * * * * * * * * * * * * * 

Whereas D’Souza sets forth arguments for life after death, Carlos Eire simply gives us A Very Brief History of Eternity (Princeton:  Princeton University Press, c. 2010).  Eire is a Professor of History and Religious Studies at Yale University and approaches his subject with an irenic air of ironic detachment.  He’s obviously curious about eternity (and eternal life) and appreciates its importance to individuals and societies, but he’s not committed to any position regarding it.  

The most ancient remnants of human history certainly reveal our species’ preoccupation with time and eternity.  “Many experts think that the cave paintings and fertility figurines were religious in nature, and an attempt to transcend mundane existence.  Paleolithic burial customs lend credibility to this hypothesis, for the caring respect shown to the dead, and the ritualistic behavior implied by such care, point to a belief in something beyond the material world” (p. 10).  For our Paleolithic ancestors, as for ourselves:  “Thinking and feeling that one must exist is part and parcel of human experience.  Conceiving of not being and of nothingness is as difficult and as impossible as looking at our own faces without a mirror” (p. 11).  Thus we find, Eire says, “eternity conceived” to comfort us as we acknowledge the reality of death.  Our most ancient written records testify to our deep hunger for immortality.  Our earliest philosophical speculations, emerging primarily in Greece, quickly led to metaphysical (or ontological) inquiries, wondering about realities beyond the material world.  Plato—condescendingly called “naive old Plato” by the author (p. 224)—built upon the insights of earlier philosophers such as Xenophanes and Parmenides and concluded “there must be some ultimate, underlying all of existence” (p. 42).  

In Christianity there was a merger of Greek metaphysics and Jewish religious thought.  Hope for life everlasting was basic to the Gospel of Jesus Christ, and the earliest saints and martyrs testified to their confidence in it.  Thus St Augustine, in The City of God, rejoiced in “the promise of a resurrected eternal existence as the greatest and most universal of all hopes shared by human beings.  ‘I know you want to keep on living,’ he said to his readers:  ‘You do not want to die.  And you want to pass from this life to another in such a way that you will not rise again as a dead man, but fully alive and transformed.  This is what you desire.  This is the deepest human feeling; mysteriously, the soul itself wishes and instinctively desires it’” (p. 64).  With the emergence of a distinctively Christian culture deeply rooted in the thought of Augustine in the Middle Ages, we find what Eire describes as “eternity overflowing.”  Benedictine monks, working and praying, reciting the daily office, gave witness to their eternal orientation.  The flying buttresses and rose windows in lofty Gothic cathedrals (such as Chartres and Notre Dame) continue to lift heavenward the gaze of all who enter.  Mystics such as St Francis of Assisi and St Bonaventura, tasting eternity in their ecstasies, bear witness to the fact that Medieval Christians lived with an intense awareness and expectancy of eternal life, of heavenly realities.  Erudite theologians—and preeminently the “angelic doctor” Thomas Aquinas—made it clear that highly educated and rational thinkers could make transcendent realities their ultimate concerns.  Importantly, “Christ was made physically present in the eucharistic bread and wine, the mass itself, and the consecrated bread, especially, became the supreme locus divinitatis, the ultimate materialization of the divine and eternal” (p. 84).  In the “age of faith” eternity was right at hand.  

This Medieval confidence that the dead “were never really dead and gone” began to fade in the 16th century as the Protestant Reformation wrought subtle and ultimately profound changes throughout Europe.  Luther and the Reformers certainly hoped to go to heaven, but they abandoned the important Catholic belief in the “communion of saints.”  The dead were simply dead and gone.  “This life and this world, then, became the sole focus of religion, as did the individual over the community and even over all of history itself” (p. 152).  

The past three centuries Eire characterizes as “from Eternity to Five-Year Plans.”  In his judgment, the awareness of and hope for eternal life have faded as Enlightenment skepticism and scientific discoveries dissolved Christian beliefs.  “Banished from physics, heaven went into exile in metaphysics, a location that Immanuel Kant (1724-1804) would soon unmask as an imaginary island” (p. 171).  To substantiate his assertions, Eire cites the usual sources, from Voltaire to Nietzsche to Stephen Hawking.  He fails, however, to cite equally brilliant scientists and philosophers who devoutly believed in life eternal.  And, importantly, he notes but fails to appreciate the fact that for the huge majority of ordinary folks this question remains highly pertinent.  Though he’s devoted his scholarly career to religious studies, he apparently agrees with the noted filmmaker Luis Buñuel, who “concluded that ‘when all is said and done, there’s nothing, nothing but decay and the sweetish smell of eternity’” (p. 223).

Eire’s work gives us a readable narrative, a description of various views of eternity, but it strangely lacks the existential urgency the issue commands.  The philosophical relativism, the chronological snobbery so evident throughout his treatise, renders it in the final analysis an exercise in futility.  Despair, however genial, demonstrates the loss of hope, the deadliest of all sins!  

* * * * * * * * * * * * * * * * * 

In Evidence of the Afterlife:  The Science of Near-Death Experiences (New York:  HarperOne, c. 2010), Jeffrey Long (a medical doctor specializing in radiation oncology) provides empirical data that “all converge on one central point:  There is life after death” (p. 4).  In 1998 Long established the Near Death Experience Research Foundation, whose website has now compiled the testimonies of more than 1300 individuals who responded to extensive questionnaires and described their near-death experiences.  As a scientist, he shares Einstein’s view:  “A man should look for what is, and not what he thinks should be.”  So he and his team carefully studied the responses and “followed a basic scientific principle:  What is real is consistently seen among many different observations” (p. 3).  

Long discerns and details nine lines of evidence to reach his conclusion.  Though he’d read a bit about near-death experiences, he had no serious interest in the issue until he listened to a friend’s account—the persuasive witness of a highly credible person.  She’d “had an allergic reaction to a medication during” an elective surgery.  Her heart stopped, but then, she said:  “Immediately after my heart stopped I found myself at ceiling level.  I could see the EKG machine I was hooked to.  The EKG was flatlined.  The doctors and nurses were frantically trying to bring me back to life.  The scene below me was a near-panic situation.  In contrast to the chaos below, I felt a profound sense of peace.   I was completely free of any pain.  My consciousness drifted out of the operating room and moved into a nursing station.  I immediately recognized that this was the nursing station on the floor where I had been prior to my surgery.  From my vantage point near the ceiling, I saw the nurses bustling about performing their daily duties” (p. 21).   Soon she passed through a tunnel and entered a wonderful, peaceful place where she was reunited with deceased loved ones.  Given the choice to return to her body on earth, she reluctantly did so and awakened in the ICU.  But she always remembered how she had entered a “realm of overwhelming love” where she “was truly home” (p. 28).   This woman’s account prompted Long to begin his scholarly investigation.  He’d become proficient in computer technology and decided to establish a website (www.nderf.org) carefully constructed to filter out spurious accounts and dedicated to near-death case studies.  This enables people around the world to contribute their information as well as read others’ accounts.  “By studying thousands of detailed accounts of NDErs, I found the evidence that led to this astounding conclusion:  . . . it is reasonable to accept the existence of an afterlife” (p. 44).  
This leads him, in successive chapters, to set forth nine proofs of his assertion.  

Proof #1 is “Lucid Death.”  Witnesses clearly remember their deaths and provide details regarding their surroundings.  Unlike dreams or hallucinations, these moments are filled with sharp details and vibrant visions.  Proof #2 cites “Out of Body” remembrances.  “Approximately half of all NDEs have an OBE that involves seeing or hearing earthly events” (p. 69).  They remember seeing quite tangible things in areas they’d never seen before.  Proof #3 is “Blind Sight.”  Amazingly enough, people born blind see clearly as they leave their bodies!  Their testimonies “required me to consider what I would have thought unthinkable early in my medical career:  perhaps NDErs are actually describing another real, transcendental realm of existence” (p. 91).  Proof #4 deals with the “Impossibly Conscious” awareness of anesthetized persons who apparently die and yet retain a perfect sense of consciousness.  Proof #5, the “Perfect Playback,” provides evidence regarding the “life review” many NDErs experience.  They do, in fact, see their lives flashing before their eyes!  They saw everything they’d said and done—good and bad—and realized the moral dimensions to all of life.  Proof #6 is “Family Reunion.”  “Many near-death experiencers describe dramatic and joyous reunions with people known to them who died long before their near-death experience took place” (p. 121).   As Mark Twain said, “‘Death is the starlit strip between the companionship of yesterday and the reunion of tomorrow’” (p. 133).  Proof #7, “From the Mouths of Babes,” takes seriously the stories of children, who affirm (without the possibility of cultural indoctrination) the same truths declared by adult NDErs.  Proof #8 focuses on the “Worldwide Consistency” of NDE witnesses.  All around the world people of various religions, educational levels, and ethnic traditions tell the same story:  life after death is utterly real.  
Proof #9 sets forth “Changed Lives” as evidence regarding the credibility of the persons who’ve experienced NDEs.  What they’ve experienced leads them to live better and face death fearlessly.  

Though you need patience and persistence to move through the evidence presented in this book, its tabulations lend credence to the philosophical and theological arguments available elsewhere.

* * * * * * * * * * * * * * * *

Having earlier published a theological treatise—Heaven and the Afterlife—James Garlow and Keith Wall have compiled and organized anecdotal evidence for their position in their most readable Encountering Heaven and the Afterlife (Minneapolis, Minnesota, c. 2010).  They present more than 30 first-person accounts by specific, identifiable individuals, describing visions of heaven and hell, and encounters with angels and ghosts.  Many of the persons survived a near-death experience, but some of them simply slipped into a deeper realm of reality and recall details concerning it.  The book is, they declare, “an eclectic collection, offering an intriguing look into the lives of ordinary people who have had extraordinary spiritual encounters” (p. 14).  

Garlow makes clear that this subject challenges him, since he is by nature and academic training rather skeptical of such accounts.  Having never personally survived a near-death experience or seen visions of supernatural realities, he strongly identifies with the apostle Thomas and finds great comfort in the biblical passage declaring “I do believe; help me overcome my unbelief.”  Thus he and his co-author “used reasonable and consistent vetting techniques.  To accept stories as valid, one of us either needed to know the people personally—with firsthand knowledge of their integrity and credibility—or we had to know someone with a high degree of reliability who could vouch for the person being profiled” (p. 19).  

Taking all the testimonies together, Garlow and Wall find three distinct themes.  First:  “The division or distance between the physical world and the spiritual world is incredibly thin—like tissue paper.  It’s probably more accurate to say there really is no distance.  Beings with bodies and beings without occupy the same space, just on different planes.”  Second:  “The more we learn about life beyond the here and now, the less likely we are to be unnecessarily fearful.”  Third:  “The mystery and magnificence of God make life (this one and the one to come) an amazing adventure” (p. 15).  

Typical of the book’s presentation is the story of little Kennedy Buettner, whose father is “a Tuscaloosa physician and the University of Alabama football team doctor” (p. 35).  Kennedy somehow slipped away from a crowd and fell into a backyard swimming pool, where he lay on the bottom for more than 12 minutes.  When found, his skin was blue, his body bloated, and his pupils dilated.  His father frantically sought to revive him, but he “began to thrash around and exhibit behavior that doctors call ‘abnormal posturing,’ a kind of muscle seizure that indicates severe brain damage—and usually precedes death” (pp. 37-38).  In time the paramedics arrived and Kennedy was taken to a hospital.   Miraculously, he came back to life, telling about an angel who intervened to save him and describing his visit to heaven (where he saw Jesus and his recently deceased uncle) as well as earthly places he’d not seen before.  Precisely one week after the accident, Kennedy was back home playing baseball with his siblings!  

Reading this highly readable book deepens our hope for life everlasting, providing experiential evidence for the biblical promises that have comforted Christians for two millennia.  

213 “The World Turned Upside Down”

Fifty years ago Bob Dylan sang “the times they are a-changin’,” and indeed they were.  But our times are not so much changing as confusing!  So Melanie Phillips, an eminent and learned English journalist, says:  “It is as if one has wandered onto the set of a Buñuel movie scripted by Kafka.  Nothing is really as it is said to be.  Society seems to be in the grip of a mass derangement” (p. x).  Phillips details and seeks to understand her bewilderment in The World Turned Upside Down:  The Global Battle over God, Truth, and Power (New York:  Encounter Books, c. 2010), a probing analysis of significant phenomena that threaten the survival of our civilization.  She writes as “a journalist who believes in telling truth to power and following the evidence.  What I have concluded is that power has now hijacked truth and made it subservient to its own ends.  The result is a world turned upside down” (p. xi).  This results from two centuries of intellectual developments which can “be summed up as man first dethroning God in favor of reason, then dethroning reason in favor of man, and finally dethroning man himself.  This was done by replacing objective knowledge with ideology, which grew out of the belief that man was all-powerful and could reshape the world in whatever image he chose” (p. 303).  

This loss of rationality haunts Phillips.  “The replacement of objective truth by subjective experience has turned some strands of science into a branch of unreason, as evidence is hijacked by ideology.  The perceptive Anglican bishop Lesslie Newbigin grasped this fact back in the 1980s.  In his essay The Other Side of 1984, he wrote, ‘I have started from the perception, which I believe to be valid and widely shared, that we are nearing the end of the period of 250 years during which our European culture has been confidently offering itself to the rest of the world as a torchbearer for human progress’” (p. 393).  President Barack Obama provided “a startling example of this genuflection to the forces of irrationality and antimodernity” in his “speech of conciliation to the Muslim world in Cairo in June 2009.”  He amplified the twisted Arabic rendition of the history of Palestine, “sanitized Islam and its history,” and “selectively and misleadingly quoted the Qur’an to present a passage that is a prescription for violence and murder against Jews and ‘unbelievers’ as instead a precept affirming the value of preserving human life; and he also claimed that Islam played a major role in the European Enlightenment” (p. 399).  

Phillips’ treatise charts the unexpected parallels and shared perspectives of left-wing “progressives” such as Obama, Islamists, environmentalists, fascists, militant atheists and religious fanatics.  “From manmade global warming to Israel, from Iraq to the origin of the universe, the West has replaced truth with ideology.  Faced with an enemy that has declared war upon reason, the West has left the citadel undefended” (p. 406).  All these movements are “united by the common desire to bring about through human agency the perfection of the world, an agenda which history teaches us leads invariably—and paradoxically—to tyranny, terror and crimes against humanity” (p. xiii).  This pervasive utopian desire is, at heart, a repudiation not only of Western Civilization but of its Judeo-Christian roots.  Its blatant irrationality reveals a deeper betrayal of the very civilization that birthed it.  There thus exists a curious but powerful chain—“the Red-Black-Green-Islamic Axis”—that gives Phillips’ treatise its synthesizing persuasiveness.  These very disparate movements share a deep commonality:  all are utopians who seek to establish their “alternative reality” (p. 219).   

To do so—to transform the world—Jews and Christians and their theological worldview must be marginalized if not banished.  This requires replacing the Mosaic God and His moral standards with something better—an evolving Mother Earth or benevolent Nanny State or Islamic Sharia.  Consequently:  “In Britain and America, dominant ways of thinking have simply reversed the notions of right and wrong, normal and abnormal, victim and victimizer, truth and lies” (p. 289).  Formerly immoral behavior “such as sexual promiscuity or having children without a father, was treated as normal.  Correspondingly, those who advocated mainstream, normative values such as fidelity, chastity or duty were accused of bigotry because they made those who did not uphold these values feel bad about themselves—now the ultimate sin.  Alternative lifestyles became mainstream.  The counterculture became the culture” (p. 286).  Still more:  “the tyrannical ideologies of the modern age . . . [have] forgotten that the reason upon which it prides itself and the science that flows from that reason owe their existence to religion” (p. 337).  

To explore this transformation, Phillips begins with a brief and illuminating examination of “cults and conspiracies from Diana to Obama” which illustrate “an increasing tendency to live in a fantasy world where irrational beliefs in myths are thought to restore order to chaotic lives” (p. 6).  The Brits’ response to Princess Diana’s death—an “orgy of sentimentality” (p. 7)—revealed “the extent of Britain’s transformation—from a country of reason, intelligence, stoicism, self-restraint and responsibility into a land of credulousness, sentimentality, emotional excess, irresponsibility and self-obsession” (p. 6).  For mourners emoting over Diana, feeling bereaved validated their character and shedding copious tears proved their goodness.  

Across the Atlantic an equally irrational frenzy appeared as Barack Obama tugged on heartstrings and elicited the fantasy that “he would both redeem America’s shameful history of slavery and racial prejudice and bring peace to the world” (p. 8).  “Brushed aside were highly troubling details of his personal history:  his ambivalence about his fractured identity, his efforts to conceal or misrepresent crucial details about his background, and a pattern of unsavory or radical associations.  The fact that his pre-election statements were intellectually and politically incoherent, frighteningly naive or patently contradictory was of no consequence” (p. 8).  Obama invoked messianic claims, saying his election signaled “‘the moment when the rise of the oceans began to slow and our planet began to heal’” (p. 10).  “Presented with this absurd display of hubris and narcissism, Americans reacted by junking rationality altogether and elevating Obama not just to the presidency but to divinity” (p. 10).  Haunted by guilt for America’s “original sins of slavery and racism” (p. 259), Obama supporters elected a President in hopes of national (and personal) redemption.  

The irrationality evident in the cults of Princess Diana and Barack Obama is equally evident in “the myth of environmental Armageddon,” one of the most appalling hysterias now plaguing planet earth.    The scandal that rocked the scientific community—the 2009 revelations regarding the deceptions perpetrated by the Climatic Research Unit at the University of East Anglia—should give us pause!  Many of the apocalyptic claims, many of the end-of-time anthropogenic global warming pronouncements of Al Gore, may well prove to be nothing more than myths.  Indeed, Phillips says, it “is perhaps the single most dramatic example of scientific rationality being turned on its head” (p. 15).  When carefully examined, empirical evidence fails to demonstrate the computer-driven theory of anthropogenic global warming:  sea levels are not rising, polar bears are not disappearing, Arctic temperatures are not soaring, and Antarctic ice sheets are not declining.  

Historians, of course, know that the earth has (for three thousand years) experienced significant temperature changes.  To misinform the public about this, however, climatologist Michael Mann constructed his famous “hockey-stick” graph that simply airbrushed away both the Medieval Warming Period and subsequent Little Ice Age!  Mann’s graph buttressed his claim that industrialization caused the 20th century’s rapid and atypical rise in temperatures, and one of the scientists promoting his agenda sent colleagues an “email that said ‘We have to get rid of the Medieval Warm Period’” (p. 27).  Falsifying data is fine as long as it leads to social change, it seems!  Indeed, Paul Watson (a Greenpeace leader) admitted:  “‘It doesn’t matter what is true; it only matters what people believe is true. . . .  You are what the media define you to be.  [Greenpeace] became a myth and a myth-generating machine’” (p. 31).  Phillips concludes:  “Manmade global warming theory lies in shreds, and yet this fact is denied and ruthless attempts are made to suppress it, even as the counterargument has gained ground and exposed the hollowness of its claims.  That is because the theory is not science.  . . . it is rather a quasi-religious belief system; and the only reason it was sustained for so long was through the abuse of authority and intimidation of dissent” (p. 32).  

Radical environmentalists, promoting a return to a pagan “progressive spirituality,” should have met resistance from Christian thinkers.  But significant sections of the Church have embraced rather than repudiated the pantheism basic to much of environmentalism.  This was particularly evident in the Church of England when, in 1989, the World Wide Fund for Nature “took over Canterbury Cathedral, its precincts and other church property for a Celebration of Faith and the Environment, including contributions by Buddhists, Sikhs and other Eastern faiths—but no mention of Jesus.  One of the highlights was a Celebration of the Forest, which took place in the cathedral” with a high school choir singing “‘The trees have power.  We worship them. . .  because they give us life’” (p. 358).  A year earlier, Prince Philip, Duke of Edinburgh, husband of Queen Elizabeth (titular head of the Church of England), “made this notable remark:  ‘In the event that I am reincarnated, I would like to return as a deadly virus, in order to contribute something to solve overpopulation’” (p. 299).  Alarmed by such developments, Phillips says:  “In the face of the obscurantist and pagan onslaught against truth, reason and enlightenment, the most ‘progressive’ forces within the church thus not only have failed to hold the line for the civilization to which Christianity had given rise but have chosen to forsake its doctrines and join in the attack” (p. 363).  

The environmentalists’ quasi-religious belief system is, ironically, rooted in the “scientific triumphalism” or “scientism” evident in various claims made by scientists such as Oxford University’s Richard Dawkins, whose “belief that matter had probably arisen from literally nothing at all seemed itself to be precisely the kind of irrationality, or ‘magic,’ that he scorns” (p. 73).  His fellow atheist, Francis Crick, confessed that the information-packed DNA molecules could not have resulted from simple natural selection and proposed instead a “directed panspermia”—the floating to earth of living organisms from outer space!  Anything but God suffices to explain the mystery of life on earth!  In fact:  “Although the view that living systems arose from inorganic matter is widespread enough to amount to an orthodoxy of thought, it is hard to find any evidence to support it.  The experiments proving the origins of life all depend crucially on the intervention of the chemist conducting the experiment.  If the current orthodoxy is not true, then the only alternative—let us be honest—is some form of designing intelligence.  Which leads inescapably to religious or metaphysical belief” (p. 84).  

Materialists such as Dawkins and Crick, upholding the “creation myth of scientific naturalism,” apparently fail to discern the difference between matter and information.  But the more we know about DNA and RNA etc. the more we realize that information, not bits of matter, orchestrates all that transpires in the physical world.  “To understand the role and character of biological information,” Phillips says, “is to see the limits of scientific materialism.  As the director of the German Federal Physics and Technology Institute, Professor Werner Gitt, has observed, ‘A physical matter cannot produce an information code.  All experiences show that every piece of creative information represents some mental effort and can be traced to a personal idea-giver who exercised his own free will, and who is endowed with an intelligent mind. . . .  There is no known law of nature, no known process and no known sequence of events which can cause information to originate by itself in matter. . . .  Information is something different from matter.  It can never be reduced to matter.  The origin of information and physical matter must be investigated separately’” (p. 86).  

So too, much of the information spread regarding the Iraq War was fundamentally flawed.  Dispassionate examination of the evidence reveals that neither Prime Minister Tony Blair nor President George W. Bush launched the war because of stockpiled Weapons of Mass Destruction.  “The overwhelming emphasis was instead on Saddam’s refusal to obey binding United Nations resolutions and the need to enforce the authority of the UN” (p. 35).  Nevertheless, when no WMDs were discovered, the left’s mantra mushroomed:  Bush lied; people died.  “Yet at every level, this claim was itself demonstrably untrue” (p. 35).  Still more:  it’s quite likely, Phillips shows (relying on the testimony of Georges Sada, Iraq’s air vice-marshal), that the WMDs were removed from Iraq shortly before the war began.  Sada, an Assyrian Christian who is now president of the National Presbyterian Church in Baghdad, says “he had lived and worked with the ever-present daily reality of Saddam’s tactics of hiding his WMD from the weapons inspectors” (p. 43).  Sada’s firsthand evidence has been ignored by the mainstream media, but it probably indicates the truth regarding Saddam Hussein’s WMD.  To Phillips:  “The sustained distortion, misrepresentation, selective reporting and systematic abandonment of evidence and reason over the war in Iraq clearly reflect something rather more profound than simple opposition to a divisive war” (p. 49).  There is, she suspects, a deep unwillingness to see Islam as a militant movement engaged in a systematic Jihad against the West.  Consequently, opinion-makers in England and America prefer to blame America and Israel for Muslim rage and terrorism.  

Indeed, one’s stance on the Arab-Israeli conflict serves as an accurate litmus test regarding one’s moral compass!  To Phillips, “The Middle East impasse is the defining issue of our time.  It is not an exaggeration to say that the position an individual takes on the conflict between Israel and the Arabs is a near-infallible guide to their general view of the world.  Those who believe that Israel is the historic victim of the Arabs—and that its behavior, while not perfect, is generally as good as could be expected given that it is fighting for its existence against an enemy using the weapons of religious war—typically have a rational, nonideological approach to the world, arriving at conclusions on the basis of evidence.  Those who believe that Israel is the regional bully hell-bent on oppressing the Palestinians, and who equate it with Nazism or apartheid, are generally moral and cultural relativists who invert truth and lies, right and wrong over a wide range of issues, and are incapable of seeing that their beliefs do not accord with reality” (p. 365).  The fact that only in Israel are Christians safe in the Middle East validates this generalization.  

Repeatedly Phillips reconsiders the animosity towards and “misrepresentation of Israel” that is everywhere evident.  Indeed:  “The fraught issue of Israel sits at the epicenter of the West’s repudiation of reason” (p. 53).  The history of Israel’s emergence as a nation is sorely distorted by both academics and journalists.  “History is turned on its head; facts and falsehoods, victims and victimizers are reversed; logic is suspended, and a fictional narrative is now widely accepted as incontrovertible truth.”  To set the record straight she provides a brief account of the Jews’ “historic claim to the land of Israel,” the “false ‘Palestinian’ claim to the land of Israel,” the “myth of the Arab expulsion from Palestine,” the “myth of Israel’s ‘illegal’ occupation,” the “myths of Israel’s ‘genocide’ and ‘apartheid,’” and “false allegations against Israel.”  Sadly enough, “There is no other world conflict that is so obsessively falsified.  Where Israel is involved, truth and reason are totally suspended.  Irrationality and hysteria rule instead” (p. 71).   

This is evident in “the Middle East Witch-Hunt” wherein Israel and her supporters are demonized.  Claims in pamphlets published by the notorious antisemite Lyndon LaRouche were touched up and certified by the New York Times and the New Yorker!  The most strident anti-Israel statements come from elite academics such as Noam Chomsky.  Virulent Arabic propaganda, cooked up by Hamas and other radical Islamists, takes on the simulacrum of truth in many Western newspapers and journals and “is also specific to the intelligentsia.  It correlates overwhelmingly with education and social class” (p. 182).  The highly educated social elites (Harvard professors, Nobel Prize winners, “humanitarians” controlling dozens of NGOs) are bastions of antisemitism!  Thus former President Jimmy Carter, in Palestine:  Peace Not Apartheid, actually alleges that Israel illustrates “‘worse instances of apartness, or apartheid, than we witnessed even in South Africa’” (p. 189).  

In truth, “Muslim hatred of the Jews is the root cause of the war between Israel and the Arabs” (p. 160).  Such hatred, clearly enunciated in the Koran, has shaped Muslim-Jewish relations since Mohammed.  “Islam seeks to obliterate Judaism altogether by appropriating its foundational story and doctrines, radically altering them, and claiming them to be authentic Judaism, while accusing the Jews of falsifying their own sacred text so as to disguise the alleged priority of Islam” (p. 164).  Islamic hatred is especially fueled by the role of women in Jewish society.  The immorality of “Jewish women dressed in shorts, enjoying relative sexual and political freedom and near equal status with men” deeply offends radical Muslims.  Venturing an explanation of this, Phillips says:  “Women embody fecundity, earthiness, a bodily commitment to this world and to the human senses.  Female sexuality is therefore essentially life-giving and life-affirming.  To Islamists such as [Sayyid] Qutb, however, sexuality was intrinsically ‘animalistic’—precisely because it affirms this life and not the next world.  . . . .  So the Islamists hate them because the Jews love life, have tenaciously hung on to it and have pursued happiness and fulfillment as the highest goals of existence.  Islamists by contrast define death and the afterlife as the highest goal and believe in the abnegation of the self and the denial of humanity” (p. 171).  

Radical Muslims, environmentalists, and scientific triumphalists, drawing a “false polarity between religion and science,” deny the Judeo-Christian understanding of the world that enables science to flourish.  Only in the West, where it was believed that One God created an orderly universe that could be rationally studied, has science developed.  As C.S. Lewis said:  “‘Men became scientific because they expected law in nature, and they expected law in nature because they believed in a lawgiver’” (p. 327).  Though Muslims obviously believe in One God, Islam’s “concept of reason departs radically from that of the Hebrew Bible and Christianity.  It presents Allah not as the creator of a universe that runs according to its own natural laws, but as an active God who intrudes on the world as he deems appropriate.  Natural laws are thus deemed blasphemous, for they deny Allah’s freedom to act.  So Islam does not teach that the universe runs along lines laid down by God at Creation but assumes that the world is sustained by his will on a continuing basis” (p. 328).  

Jews and Christians believe God created a world that runs according to His natural and moral laws, His Logos.  “In repudiating Jewish teaching and its moral codes,” Phillips says in the book’s concluding paragraph, “the West has turned upon the modern world itself.  In turning upon the State of Israel, the West is undermining its defense against the enemies of modernity and the Western civilization that produced it.  The great question is whether it actually wants to defend reason and modernity anymore, or whether Western civilization has now reached a point where it has stopped trying to survive” (p. 408).  

# # # 

212 Modernity on Trial

Fortunately for the general reader, first-rate philosophers (whose scholarly tomes frequently target a select audience) often write more accessible essays, addressing both current issues and perennial truths.  Thus Leszek Kolakowski, a Polish thinker rightly renowned for his magisterial, three-volume Main Currents of Marxism, published a score of short essays in Modernity on Endless Trial (Chicago:  The University of Chicago Press, c. 1990) that offer serious readers valuable insights into some main intellectual currents of the 20th century.  Whenever an erstwhile Marxist casts a favorable glance at Christianity it makes sense for believers to consider his reasons.   

One fourth of the essays focus “On Modernity, Barbarity, and Intellectuals.”  Strangely enough, a corps of intellectuals has orchestrated the barbarism that has resurfaced during the last three centuries—an era labeled “modernity.”  Since Kolakowski cannot see how “postmodern” differs from “modern,” he discerns the loss of religion (and loss of taboos) as the primary current in modern (and postmodern) times, leading to “the sad spectacle of a godless world.  It appears as if we suddenly woke up to perceive things which the humble, and not necessarily highly educated, priests have been seeing—and warning us about—for three centuries and which they have repeatedly denounced in their Sunday Sermons.  They kept telling their flocks that a world that has forgotten God has forgotten the very distinction between good and evil and has made human life meaningless, sunk into nihilism” (pp. 7-8).  A series of influential, secularizing skeptics prepared the way for the destructiveness of “Nietzsche’s noisy philosophical hammer” (p. 8).  The “intellectuals” responsible for this process were not the scholars—scientists or historians—who “attempt to remain true to the material found or discovered” (p. 36) apart from themselves.  A barbarizing “intellectual” is someone who wishes not “simply to transmit truth, but to create it.  He is not a guardian of the word, but a word manufacturer” (p. 36).  Invariably, such intellectuals are seductive, spinning wondrous tales of utopian vistas.  

To Nihilists such as Nietzsche, truth is illusory.  Consequently, various cultures’ “truths” are equally “true” even if they are obviously contradictory!  Such cultural relativism—declaring all cultures are equal, praising the Aztecs as well as the Benedictines—easily ends in admiration for various forms of what was once judged barbarism.  The sophisticated, scholarly “tolerance” so mandatory in elite universities and journals ends by granting “to others their right to be barbarians” (p. 22).  What we are witnessing is the Enlightenment devouring itself!  In Kolakowski’s judgment:  “In its final form the Enlightenment turns against itself:  humanism becomes a moral nihilism, doubt leads to epistemological nihilism, and the affirmation of the person undergoes a metamorphosis that transforms it into a totalitarian idea.  The removal of the barriers erected by Christianity to protect itself against the Enlightenment, which was the fruit of its own development, brought the collapse of the barriers that protected the Enlightenment against its own degeneration, either into a deification of man and nature or into despair” (p. 30).  

Another fourth of the essays deals with “the Dilemmas of the Christian Legacy,” for modernity’s secularizing process has significantly, if indirectly, shaped much of the Christian world “through a universalization of the sacred,” sanctifying worldly developments as “crystallizations of divine energy” (p. 68).  The “Christianity” rooted in process theology—as propounded by Teilhard de Chardin, for example—envisions universal salvation and unending evolutionary progress.  “In the hope of saving itself, it seems to be assuming the colors of its environment, but the result is that it loses its identity, which depends on just that distinction between the sacred and the profane, and on the conflict that can and often must exist between them” (p. 69).  Kolakowski detects and dislikes what he finds in these circles—“the love of the amorphous, the desire for homogeneity, the illusion that there are no limits to the perfectibility of which human society is capable, immanentist eschatologies, and the instrumental attitude toward life” (p. 69).

Losing their sense of the sacred, this-worldly philosophies and religions fail to provide any basis for culture.  Indeed:  “With the disappearance of the sacred, which imposed limits to the perfection that could be attained by the profane, arises one of the most dangerous illusions of our civilization—the illusion that there are no limits to the changes that human life can undergo, that society is ‘in principle’ an endlessly flexible thing and that to deny this flexibility and this perfectibility is to deny man’s total autonomy and thus to deny man himself” (p. 72).  A rejection of the sacred invites the denial of sin and evil.  

Though not overtly Christian, Kolakowski rejected the atheistic Marxism of his early years and found Christianity the best hope for the world.  “There are reasons why we need Christianity,” he argues, “but not just any kind of Christianity.  We do not need a Christianity that makes political revolution, that rushes to cooperate with so-called sexual liberation, that approves our concupiscence or praises our violence.  There are enough forces in the world to do all these things without the aid of Christianity.  We need a Christianity that will help us move beyond the immediate pressures of life, that gives us insight into the basic limits of the human condition and the capacity to accept them, a Christianity that teaches us the simple truth that there is not only a tomorrow but a day after tomorrow as well, and that the difference between success and failure is rarely distinguishable” (p. 85).

Given his critique of modernity, Kolakowski has little patience with the modernist (or liberal) Christianity that focuses on “social justice,” peace, and ephemeral earthly progress—the this-worldly political agenda so routinely proclaimed in some quarters.  “Christianity is about moral evil, malum culpae, and moral evil inheres only in individuals, because only the individual is responsible” (p. 93).  To even speak of “a ‘morally evil’ or ‘morally good’ social system makes no sense in the world of Christian belief” (p. 93).  The hopeless “demythologization” project of modernists such as Bultmann elicits Kolakowski’s erudite refutation, for it was merely a fitful gasp of the irrational skepticism launched centuries ago by Occam and the nominalists, by Hume and the empiricists.  In truth, “there is no way for Christianity to ‘demythologize’ itself and save anything of its meaning.  It is either-or:  demythologized Christianity is not Christianity” (p. 105).  

Demythologized Christianity contradicts itself.  In this respect it’s simply another utopian political ideology.  Having early advocated the Marxist version of utopia, Kolakowski easily detects the many currents of such blissful imagining—popularly expressed in John Lennon’s song “Imagine.”  Consider the fantasies of folks who envision a world wherein fraternity is realized, where equality prevails in every realm.  They “keep promising us that they are going to educate the human race to fraternity, whereupon the unfortunate passions that tear societies asunder—greed, aggressiveness, lust for power—will vanish” (p. 139).  Inevitably they establish dictatorships designed to enforce equality.  Allegedly admirable goals—caring for the impoverished and weak—require the abolition of private property and a state-controlled economy, the abolition of the free market.  However noble the intentions, “the abolition of the market means a gulag society” (p. 167).

In the name of compassion, giving preferential treatment to various disadvantaged groups, societies easily “retreat into infantilism” (p. 173).  Citizens become dependent, childlike welfare recipients.  The State assumes more and more responsibility to care for everyone’s needs, and we “expect from the State ever more solutions not only to social questions but also to private problems and difficulties; it increasingly appears to us that if we are not perfectly happy, it is the State’s fault, as though it were the duty of the all-powerful State to make us happy” (p. 173).  The State, of course, cannot possibly do this.  Yet this blatantly utopian longing drove some of the most powerful mass movements of the 20th century, most of them Marxist to some degree.  Marx, of course, didn’t envision the gulags that would result from the implementation of his socialistic ideas!  But Lenin and Trotsky were, in fact, faithful to his precepts, installing a “dictatorship of the proletariat” that could not but violently pursue its agenda.  “By denouncing the ‘fables about ethics’ and asserting that ethics was to be an instrument of the class struggle, by sneering at bourgeois inventions such as the distinction between aggressive and defensive wars or the principle that one should keep international agreements, by insisting that there are no permissible limits in political struggle—in all these, Lenin did not depart from Marxist principles” (p. 211).  

* * * * * * * * * * * * * * * * *

Andreas Kinneging holds the Chair in Legal Philosophy at the Law Faculty of the University of Leiden and has collected seventeen of his essays in The Geography of Good and Evil:  Philosophical Investigations (Wilmington, Delaware:  ISI Books, c. 2009).  He joins Kolakowski in condemning modernity for its ethical emptiness.  (T.S. Eliot’s “The Hollow Men” inhabiting “The Waste Land” presciently depicted, a century ago, things to come.)  In his Preface he says:  “This book is the work of a convert who once firmly believed in the blessings of modernity and its intellectual sources, the Enlightenment and Romanticism, but at some point suddenly grasped that . . . [what they] brought us constitutes in more than a few respects a decline and a deterioration, instead of progress and improvement.  The author who converted me was Cicero.”  Rather than finding the ancients obsolete, he “perceived that it was not their thinking but ours that is primitive and inadequate.  A complete about-face.  What I had set out to describe as an outmoded worldview was far superior to the new worldview, or views, that had replaced it” (p. vii).

  This became even more evident to him when he traced the course of classical thought from antiquity through the Middle Ages, for in Augustine and Aquinas he found “that Christianity possessed an understanding of certain essential moral and existential truths that had eluded the Greeks and Romans—even Plato—and that these truths cannot be forgotten without doing great harm to ourselves and the world.  Hence, I came to admire Christianity as an indispensable source of wisdom that can benefit anyone—even the most inveterate atheist” (p. viii).  The thinkers he now admires uphold the objective and universal nature of good and evil; “they are part of the world outside of us.  They cannot be posited but have to be discovered . . . [and] to write about good and evil is to map the geography of good and evil” (p. ix).  Furthermore, these thinkers insist upon the importance of virtue—the ethical integrity of a good conscience inculcated through moral education.  

Importantly, classical thinkers pondered basic questions such as “what is man?”  They focused on basic human realities, including the importance of culture, a word derived from “the Latin colere, which originally meant ‘working the field.’  If the field is not worked, there will be nothing to harvest” (p. 57).  Human beings, like productive fields, must be carefully tended.  “The need for cultivation exists not only for the nonhuman world around us but equally for human nature” (p. 57).  Moral education—instilling virtues—has been largely abandoned in our nihilistic world, but without it our species can hardly survive, much less thrive.  So Kinneging urges us to return to the sources of our tradition, to recover the culture that makes a man, “what Plato called ‘order in the soul’” (p. 59).    

Unlike the light evident in Ancient and Medieval thinkers, there is a “solid darkness” to the 18th century Enlightenment, with its denial of human limits and inclination to sin.  “The Enlightenment doctrine par excellence is the view that evil should not be sought in man but in society—civilization, Christianity, feudalism, property, capitalism, the law, education, the family, and so forth—and that it can therefore be erased by bringing about a better society.  This view forms the basis of what Napoleon once called ‘le terrible esprit de nouveaute,’ a reformist verve that is directed outward, at the world, and not inward, at the soul” (p. 21).  Rather than conform to Reality as given us, reformers endeavor to reshape the world in accord with our desires.  Detached from what is, they easily embrace fantasies as to what might be.  Today’s elites are increasingly ignorant of Classical and Christian thought, especially since “the cultural revolution of the 1960s dealt the deathblow to the tradition” (p. 35).  Yet without that tradition, the heirs of the Enlightenment cannot but turn “man into a barbarian armed with an unprecedented array of weapons, a creature that controls everything except himself” (p. 36).  

This is especially evident whenever social (or communitarian) ethics edge aside personal morality—as if you could shape a healthy physique with diseased organs.  In much of the West today, “Solidarity, care for the weak and the poor, is commonly elevated to the alpha and omega of public morality” (p. 121).  In itself, of course, such compassion is most admirable, but it ignores the truth discerned by classical and Christian ethicists:  “a good society is only possible if large sections of the population have thoroughly internalized certain values, thus turning them into what used to be called virtues” (p. 122).  Social virtues require preexistent personal virtues, and “with the decline in individual virtues, social virtues are also bound to disappear, since the first are the spiritual capital that make the second possible” (p. 123).  What’s needed is a restoration of individual morality, for the “root of the crisis of our time is the thinning out of our moral consciousness, our demoralization” (p. 124).  This has taken place rather dramatically since the 1960s under the influence of the liberalism spawned by 18th century Enlightenment and 19th century Romantic thinkers.  

Convinced that modernity is bankrupt, Kinneging urges us, throughout the book, to go back—ad fontes!  Without a return to the Classical and Christian ethical systems we cannot establish the justice—giving to each his due—needed in any good society.  “Via Plato, Aristotle, Cicero, Augustine, Thomas Aquinas, and Roman law, this notion of justice became widely known and accepted in Western thinking.  Curiously enough, however, it is barely mentioned today” (p. 130).  But it must be, because without it we’re consigned to the relativistic, degenerate nihilism evident everywhere.  And to establish justice, “the foundation of morality” (p. 142), we must cultivate the other classical virtues—prudence, fortitude, temperance.  These virtues were early celebrated in the work of Homer, the “educator of Hellas.”  Then skeptical Sophists such as Protagoras promoted relativism and taught (Plato said in Theaetetus) that “‘things are to me such as they appear to me, but to you they are such as they appear to you’” (p. 164).  Citing Heraclitus, the Sophists claimed that “‘everything is in motion and that is all there is.’  This means that ‘nothing exists as invariably one, itself by itself, but always comes into being in relation to something, and the category of “being” should be altogether abolished.’  ‘Good’ is not something specific but becomes something, depending on the circumstances.  But ‘if all things are in motion, any answer to any question whatsoever is equally correct’” (p. 164).  Thus there is “the ghost of subjectivism or relativism” at the very “beginning of Western philosophy” (p. 175).  And one need only listen to politicians’ speeches—or students in the classes I recently taught—to realize how such relativism permeates our world.

Plato, of course, rejected the Sophists’ assertions, holding (with Parmenides) that there is an eternal, unchanging, objective world of Forms or Ideas that we rationally discern.  He thus established “what is sometimes called ‘the onto-theological tradition’ in philosophy, which” Kinneging calls “the classical-Christian tradition” (p. 176).  Plato and his heirs, rather than the Sophists, powerfully shaped Western Civilization for a millennium or so.  Late Medieval Nominalists following Occam, however, rejected Platonic Forms and claimed that “universals” are malleable generalizations summing up sense experiences.  The past several centuries of philosophy (running from Descartes through Hume and Kant, Nietzsche and Heidegger, Dewey and Derrida) may be summed up as “a continuous and radicalizing effort to destruct the onto-theological tradition and to affirm Greek Sophism” (p. 177). 

Still, some gifted philosophers strongly affirmed and sought to restore the onto-theological tradition.  Phenomenologist Edmund Husserl’s “transcendental turn” took him back to a more ancient and medieval perspective.  “As Scheler expressed it once, from a historical point of view phenomenology can be seen as a ‘renewal of an intuitive Platonism’” (p. 183).  Thus Husserl’s concern for Sosein (the essence of a thing) rather than Dasein (the passing hic et nunc of existing things) led him in directions quite different from modernity.  Some of his significant followers—Max Scheler, Nicolai Hartmann, Dietrich von Hildebrand—developed the ethical implications of phenomenology, insisting “that the ontological status of ethics is comparable to the ontological status of logic” and “that the principles of ethics cannot be reduced to the subject—whether empirical or transcendental—but, instead, constitute an objective eidetical sphere, an a priori moral order within the order of being” (p. 185).  Ethical truths, like geometric truths, are found within the ultimate structure of being, not within the perspectives of finite persons. 

Among other ancient goods Kinneging urges us to recover is the traditional family, “the rock on which society was built” (p. 198).  Clearly things have gone wrong with family life for nearly half-a-century, so much so that “at the beginning of the twenty-first century the traditional idea of the family has completely lost its hold on the minds—and hearts—of most Westerners” (p. 208).  Indeed, the “nuclear family is crumbling” (p. 196), and nearly one-third of today’s population live in single households.   Though various “material factors” help explain the familial disintegration, at bottom the problem is philosophical, because the “world is ruled by ideas and little else” (p. 201).  “The crisis of the nuclear family is to a large extent the result of the emergence—in the ‘70s—of a number of Romantic notions regarding marriage and family, including the notion that only love justifies the institution.  These relate first to sexual gratification, second to Romantic merging, and third to self-fulfillment” (p. 201).  The joy of sex “became endowed with a cosmic dignity, while its harm, even demonic aspect was trivialized” (p. 201).  Becoming “one flesh,” as Christians always envisioned, gave way to an ethereal mystical union, finding one’s “soul-mate,” experiencing a semi-divine mystical union.  “To the Romantics love in the sense of la grande passion is ideally ‘eternal.’  In reality the romantic notion of love guarantees that this ‘eternal bond’ will last a few months at most” (p. 225).  And, finally, marriage in the ‘70s became a vehicle whereby one finds self-fulfillment, realizing his or her own potential rather than sacrificing for another’s good.  

Kinneging resolutely defends patriarchy!  “Matrimony is hierarchical:  the man is the head of the household.  He rules over his wife and children, ‘[b]ecause the male is by nature better suited to lead than the female, and the older and more educated person is better suited than the younger and less educated one,’ as Aristotle puts it” (p. 211).  This is not to justify tyrannical rule, however, for the man must rule wisely and well.  The woman’s “place is in the home, but in that home she is mistress, not servant” (p. 211).  Our ancient sources—the Bible, Greek literature—portray “exemplary marriages” wherein “the wife is not portrayed as any less intelligent or reasonable than her husband” (p. 212).  The man may “lead,” but such leadership entails protecting and serving his wife and children.  Unlike Romantic illusions and aspirations regarding love as an ecstatic feeling, “the traditional concept of marriage and family emphasizes the mutual responsibilities of husband and wife, the duties of all concerned toward each other” (p. 223).

211 “Money, Greed, and God”

While attending a nominally Methodist university in the 1980s Jay W. Richards easily absorbed (from classes and assigned readings as well as TV and secular media) the anti-capitalistic bias of eminent academics such as John Kenneth Galbraith and “evangelicals” such as Jim Wallis.  He rather enjoyed, he now realizes, the sophomoric “chance to rebel against authority and feel self-righteous doing it” (p. 10).  In time, however, as he more carefully studied economics and observed the world, he changed his mind and has written Money, Greed, and God:  Why Capitalism Is the Solution and Not the Problem (New York:  HarperOne, c. 2009).  He argues that “despite what you’ve been told, the essence of capitalism is not greed.  It’s not even competition, private property, or the pursuit of rational self-interest.  . . . .  What we now know is that market economies work because they allow wealth to be created, rather than remaining a fixed pie.  Economics need not be a zero-sum game in which someone wins only if someone else loses.  We have discovered an economic order that creates wealth in abundance—capitalism.  And only the creation of wealth will reduce poverty in the long run” (pp. 7-8).

To demonstrate this truth Richards addresses nine prevalent anti-capitalist “myths” that often beguile Christians.  First, there is “The Nirvana Myth” (contrasting capitalism with an unrealizable ideal rather than with its live alternatives).  All of us imagine, at times, how to build a just society.  Yet a brief glance at recent endeavors to actually do so in the USSR and China and Cambodia should quickly disabuse one of fantasies regarding social construction!  Scholarly studies, such as The Black Book of Communism, detail the somber truth of utopian socialist experiments.  Similarly, scholarly studies reveal how first century Christians, unlike the dreamlike portraits of today’s social justice advocates, never made the brief communal life in Jerusalem “the norm for Christians everywhere” (p. 23).  Nor have Christian communities, such as the Pilgrims in New England, found sharing all things in common a viable way to coexist.

Social justice devotees such as Jim Wallis and Ron Sider often want us to ignore history and just imagine “what would Jesus do?” when crafting public policies—championing a “living wage” as well as a “minimum wage,” favoring “fair trade” coffee, etc.  Clearly Jesus, in accord with the prophetic tradition of Judaism, calls His disciples to care for the poor.  Given this truth, however, Richards insists we must use our minds as well as our hearts, exercising the virtue of prudence.  And when dealing with economics we must heed Henry Hazlitt’s admonition:  “‘The art of economics consists in looking not merely at the immediate but at the longer effects of any act or policy; it consists in tracing the consequences of that policy not merely for one group but for all groups’” (p. 36).  

Government-run welfare programs, Richards says, illustrate a century’s failure to weigh the consequences of what’s been done.  FDR’s New Deal sought to end the Great Depression but exacerbated it instead, and the Great Society of LBJ, with its “War on Poverty,” actually turned into a “War on the Poor” (p. 47).  Lots of well-intended endeavors, costing trillions of dollars, have wrought pernicious (if unintended) consequences.  For example, throughout most of America’s “history, the federal government cost every citizen about twenty dollars a year (in current dollars, not the more valuable dollars of the past).  Now it costs every one of us, on average, about ten thousand dollars” (p. 53).  We’re paying the bills for politicians who relish redistributing the nation’s wealth.  But, Richards insists:  “We don’t have the right to take the property of one person and give it to another.  Therefore, we can’t rightfully delegate that function to the state.  Delegated theft is still theft” (p. 53).

Goods best circulate through society via free trade rather than government edicts and programs.  Richards learned the power of free trade as a sixth grader, when his teacher facilitated a simple session demonstrating its positive sum or win-win nature.  He learned that free enterprise capitalism doesn’t breed brutal competition.  Rather it maximizes the opportunities one has to acquire what appeals to him.  There is certainly a mystery to the market—it’s the “invisible hand” of Adam Smith, whose belief in God led him to see “this invisible hand as God’s providence over human affairs, since it creates a more harmonious order than any human being could contrive” (p. 75).    No human being, however intelligent, no human institution, however sophisticated, could even begin to design the millions of transactions that make the marketplace work.  

In this marketplace—a win-win forum—no one is impoverished when someone else prospers.  Whenever filmmakers such as Oliver Stone or Michael Moore or liberation theologians such as Gustavo Gutierrez or Ron Sider lament the lot of the poor, blaming the rich for exploiting them, they distort the truth.  Today’s “poor” hardly resemble the truly “poor” of a century ago.  The evidence reveals that as the rich prosper so do the poor—a rising tide lifts all boats.  And even if the “gap” between rich and poor grows greater, “The relevant issue is whether the lot of the poor improves over time, not how close they are to the richest member of their society” (p. 90).  Wealth is not taken from a common store of natural resources!  Rather it is created by free, innovative individuals (who design things such as e-mail and the iPod), and then circulated through the marketplace.

Entrepreneurs are driven not by greed but by the desire to offer their products to the public.  Despite the stereotypes, celebrated in the novels of Ayn Rand (severely critiqued by Richards) and personified by the likes of Ivan Boesky (the businessman who declared that “Greed is good”), the driving impetus of capitalism is the desire to develop and offer things of value—goods—to others.  To Christians, as well as ancient ethicists such as Aristotle, greed is indeed contemptible.  But “capitalism is not based on greed” (p. 112).  It takes for granted our limited self-interest, as well as our sinful nature.  But, as Adam Smith saw, “in a free market, each of us can pursue ends within our narrow sphere of competence and concern—our ‘self-interest’—and yet an order will emerge that vastly exceeds anyone’s deliberations” (p. 122).  

Many great Christian theologians, contrary to popular myths, have endorsed capitalism.  Certainly “usury” was condemned—but it must be understood in the light of ancient, agricultural economies.  During the High Middle Ages, as trade and technology began transforming Europe, Scholastic theologians thoroughly analyzed money and banking, discerning the difference (distinguished by Jewish theologians centuries earlier) between loaning to a person who needs a winter coat and loaning to a person who wants to start a carpentry business.  “Usury isn’t charging interest on a loan to offset the risk of the loan and the cost of forgoing other uses for the money; it’s unjustly charging someone for a loan by exploiting them when they’re in dire straits.  That’s the work of loan sharks, not banks” (p. 144).  Similarly, though Christian thinkers soundly condemned gluttony, attributing “conspicuous consumption” to capitalism is fundamentally wrong.  Saving (not spending) sustains capitalism.  Wealth must first be created and then saved and reinvested.  “Delaying gratification is restraint; it’s the opposite of gluttony.  So consumerism is hostile to capitalist habits and institutions” (p. 165).

Richards closes his treatise with a chapter on natural resources, arguing that we’re not actually exhausting the earth.  Indeed, various unexpected innovations seem to continually enable us to produce ever more food and energy.  Wealth results from our ingenuity and innovation—and there’s an endless supply of this immaterial resource!  As John Paul II said in 1991:  “‘besides the earth, man’s principal resource is man himself.  His intelligence enables him to discover the earth’s productive potential and the many different ways in which human needs can be satisfied’” (pp. 206-207).  Clearly we’re to be good stewards of creation, but that doesn’t entail regressing to some imaginary “sustainable” state of equitably distributed poverty.  Moreover, he says, “remember:  every predicted global environmental catastrophe based on current trends has proved false.  If we look at long-term historical trends, in contrast, the evidence of declining energy costs, increasing energy abundance, and growing prosperity provides no basis for such pessimism” (p. 202).  

This is an eminently readable, persuasive treatise, making the case for a Christian capitalism which seems to be “just what we might expect of a God who, even in a fallen world, can still work all things together for good.  Seen in its proper light, the market order is as awe inspiring as a sunset or a perfect eclipse.  . . . .  At the very least, it should settle the question we started with:  Can a Christian be a capitalist?  The answer is surely yes” (p. 215). 

* * * * * * * * * * * * 

In Jesus and Money:  A Guide for Times of Financial Crisis (Grand Rapids:  Brazos Press, c. 2010) Ben Witherington III, a Professor of New Testament for Doctoral Studies at Asbury Theological Seminary, proffers a critique of the prosperity gospel and those who imagine the popular “Prayer of Jabez” applies to them.  “If there is to be a prosperity gospel worthy of its name,” he declares, “it should be all about the great blessing of giving and living self-sacrificially and how freeing it is to be trusting God day to day for life and all its necessities” (p. 77).  What we need to do, he asserts, is to learn to live with less rather than looking (and praying) for more.  To prove his point he devotes most of the book to careful exegesis and exposition of selected biblical passages—though I (ever uneasy with highly paid seminary professors’ pleas for simplicity!) suspect his personal agenda (all too reminiscent of the “simple life” counterculture of the ‘60s) overly shapes his presentation.

The Old Testament, Witherington acknowledges, has many passages indicating God prospers his people.  But many of these texts are in the Wisdom literature and are quite situation-specific, not universal in scope.  Turning to the world of Jesus, we must remember that he lived in an agricultural world that had been incorporated into the Roman Empire with its sophisticated trading networks.  Consequently, Witherington makes “a few key points about economics in the NT world:

“1.  The ancient economy was not a money economy, and money was mainly used to pay taxes, tolls, tribute.  . . . . 

“2.  There was no free market capitalism in Jesus’s world.  . . . .

“3.  Money had explicit religious connotations in antiquity that it seldom does today.  . . . .

“4.  Religious values affected how one viewed property, money, and prosperity, and undergirding Jewish views was the belief in a single creator to whom all things ultimately belonged.

“5.  By Jesus’s day there had been a long history of Jews not ruling their own country, and a long history of oppression, even in the Holy Land.  Thus the attainment of wealth was often a matter of collusion with the oppressors of your own people” (pp. 54-55).  

Within this context we must seek to understand Jesus’ words on money, neither over-spiritualizing them nor deeming them irrelevant.  It’s evident, especially in Luke’s Gospel, that Jesus condemned “persons who are all about enhancing their own assets, portfolios, standards of living, or retirement accounts, which in one sense is what the rich fool envisioned.  Jesus has only warnings for rich fools, warnings about the danger of unrighteous mammon” (p. 66).  Similarly, as we read James and John and Paul we need to discern and share their positions on wealth and poverty—admonitions to work honestly and share with those in need.

“What is clear,” Witherington says in a concluding chapter, “is that we must not silence the repeated New Testament warnings about the deleterious effects of wealth on one’s spiritual life.  The New Testament as a whole encourages us to have generous hearts.  It encourages us not to live our lives working for ‘unrighteous mammon’ in a self-seeking and self-centered manner.  It encourages us to put our ultimate trust in God, and be willing to demonstrate that trust through sacrificial giving.  It encourages us to be wary of, and wise about, the fallen economic and political institutions of this world, and to do our best to disengage from their unethical practices.  The New Testament urges us to have a theology of enough, that is, to live by a principle that godliness with contentment leads to great gain in ways that can’t be monetarily quantified” (p. 151).

Rightly used, especially when studying and preaching from the texts analyzed, this volume proves helpful.  It betrays but a superficial understanding of modern capitalism, however, and it too often rings with utopian aspirations rather than realistic advice for today’s believers.  

* * * * * * * * * * *

For several years, early in my teaching career, I assigned Ron Sider’s Rich Christians in an Age of Hunger for my Ethics students to read.  Sider launched the Evangelicals for McGovern organization in 1972 and Evangelicals for Social Action in 1974 and remains a fixture in the “Evangelical Left,” along with Jim Wallis and Tony Campolo.  Sider seemed, as a committed Evangelical (with Mennonite and Wesleyan leanings), to rightly reflect a solidly Christian understanding of “social justice.”  As a seminary professor (at Eastern Baptist Seminary), he clearly laid out the biblical injunctions regarding care for the poor, and I naively took for granted that his economic analysis was equally perceptive.  In time, however, while upholding his biblical analysis, Sider acknowledged that he is not a trained economist and that the book’s economics were flawed.  

Consequently, his Just Generosity:  A New Vision for Overcoming Poverty in America (Grand Rapids:  Baker Books, c. 1999) endeavors to better address the issues of wealth and poverty in America, declaring:  “We must combine solid, sophisticated socioeconomic analysis with normative biblical principles of justice if we are to formulate wise, effective social policy” (p. 14).  Importantly, this means reducing “injustice by providing the poor genuine opportunities to work their way out of poverty” (p. 46).  

To tell us what poverty looks like, Sider marshals statistics designed to persuade us that there “are 36 million people in the United States poor in the midst of enormous wealth, they are becoming poorer while the rich grow richer” (p. 42).  He acknowledges, as Robert Rector contends (in America’s Failed $5.4 Trillion War on Poverty), that if one considers only the basics—food, clothing, and housing—there are very few poor people in this nation.  But he insists poverty must be comparatively defined.  Relative to a prosperous plumber, for example, Sider says a welfare recipient is poor even if he has no unmet basic needs.  Having an apartment with cable TV, getting food stamps sufficient for healthy meals, and owning a car need not disqualify one from being considered “poor” in America.  

In view of this, we need “a holistic, biblical vision for empowering the poor.”  This leads Sider to sketch the biblical foundation necessary for a vision that includes both respect for human freedom and communal solidarity.  As persons we must be free to choose, and poverty often accompanies “disobedient, lazy neglect of our responsibilities” (p. 52).  To work hard and create “wealth is one important way persons obey and honor the Creator” (p. 54).  But we are social beings, and we need healthy social systems to function justly.  Families are, indeed, primarily responsible for economic development, but the state has its proper and necessary role as well.  Thus honest courts of law, following proper procedures and dealing out commutative justice, are needed, as are special provisions designed to care for those unable to care for themselves (evident in the OT’s sabbatical years, the Jubilee Year, and the practices of gleaning and Sabbath observance).  

A good society—a “civil society” in Sider’s terms—enables families, churches, and civic clubs, as well as governmental organizations, to thrive and contribute to everyone’s well-being.  It will not only care for the disadvantaged but provide ways for them to become more self-reliant.  In accord with the principle of subsidiarity, articulated a century ago by Pope Leo XIII, Sider says:  “When a social problem emerges the first question should not be, What can government do?  The first question should be, What institutions in society have primary responsibility for and are best able to correct this problem” (p. 91).  Consequently, “We must reject liberals’ automatic preference for government solutions” (p. 91).  

But in many areas government must take charge.  Indeed, “paying taxes is one important way we love our neighbors and promote the common good” (p. 199).  So Sider argues that Christians should support such things as the Earned Income Tax Credit, food stamps, the minimum wage, child care subsidies and child tax credits, government-funded jobs for everyone willing to work, job training programs, unemployment insurance, universal health care (Sider even includes a form letter to be sent to Senators and Congressmen calling for rapid action), better schools, stronger unions, safe streets, gun control laws, Social Security, summer jobs for inner-city adolescents, and the progressive income tax.  A commitment to “biblical justice,” it seems, mandates support for almost all of the Democratic Party platform!  (That Sider printed an endorsement of the book by President Obama’s former Chicago pastor, Jeremiah Wright, indicates something of its politics!  In fairness, the book is also endorsed by Jim Wallis, Chuck Colson, and former U.S. Senator Paul Simon.)  

Apart from the government, however, Sider insists families and churches have crucial roles.  Could Americans be persuaded to follow Christian teaching regarding sexual ethics and marital fidelity, many social and economic problems would be solved.  Especially important is the preservation of families, for there is, as Sara McLanahan reports, this inescapable truth:  “‘The more single parents, the more poverty’” (p. 121).  Indeed, “Children in one-parent families are eleven times more likely to experience persistent poverty than children in two-parent families” (p. 121).  By every measurable standard, illegitimate children suffer from their parents’ irresponsibility.  Fatherless boys have an especially dismal prospect—failing in school, behaving violently, serving time in jail.  Churches as well as families can help eliminate poverty.  Sider cites successful efforts of various inner-city congregations, where it is obvious that faith-based programs effectively help the poor in their neighborhoods.  

There is no question that Sider seeks to help the poor.  His compassion is most evident.  His commitment to those biblical texts that call for “social justice” is also transparent.  But though his economic analyses and prescriptions merit consideration, they amount to little more than a reiteration of the “progressive” ideology that has invested trillions of dollars to “solve” poverty without significantly reducing it.  

# # # 

210 “A God Who Hates”

Wafa Sultan is a Syrian woman who unleashes her anger at Islam in A God Who Hates:  The Courageous Woman Who Inflamed the Muslim World Speaks Out Against the Evils of Islam (New York:  St. Martin’s Press, c. 2009).  She attained notoriety for bravely speaking her mind on the Al Jazeera TV network in a series that was quickly cancelled because of the wrath it elicited from viewers.  She then determined to put her positions in print, knowing that her life would be endangered, because she believes “that good will ultimately triumph over evil” (p. 7).  

The book’s message is simply stated by Sultan as she answers Americans who ask, following the 9/11 attacks, “Why do they hate us?”  She answers:  “Because Muslims hate their women, and any group who hate their women can’t love anyone else.”  And Muslims hate their women, she believes, “Because their God does” (p. 7).  From her perspective, Islam retains the ancient fears and angers of the desert dwellers who brought it into being and spread it throughout the world.  Out of their needs Muslims created a God—Allah—“this ogre, and then allowed it to create them” (p. 52).  And they have never ceased their ancient ways of fighting, raiding, and despoiling their enemies.  Since coming to America nearly 20 years ago Sultan has learned of “a different God than the one I knew in my village” (p. 9), though that discovery has not drawn her to belief in any Divine Being.  Religions, for her, are projections of human fears, needs, and desires; they may be good or bad, but they open no paths to any truly transcendental Reality.  

Sultan tells the story of her childhood, growing up in an oppressive male-dominated world that significantly changed when “the tentacles of the Saudi octopus [radical Islam funded by petrodollars] began to extend gradually into Syrian public life, where they still wreak havoc today” (p. 37).  Fortunately she had loving female relatives—and she had books!  Indeed, for her “life really began in the third grade when I learned to read” (p. 11).  She read and read and a wonderful world opened for her.  In time she was admitted to medical school in Aleppo and subsequently worked in a medical clinic.  She also met (defying Muslim customs) a young man who became her husband and the father of her children.

Her awakening to the evils of Islam escalated when, finishing her medical studies in 1979, she “witnessed the death of our ophthalmology lecturer” (p. 45).  As the shots rang out she heard “the killer’s voice shouting from the loudspeaker:  ‘Allahu akbar . . .  Allahu akbar!’” (p. 45).  “When Muslims kill,” Sultan explains, “they shout ‘Allahu akbar!’—Allah is the greatest!” (p. 45).  The slain professor was “a man I had looked up to as an ideal of morality and humanity—an upright, generous, and cultured man” who had studied in Europe.  “Ever since that moment, Allah has been equated in my mind with the sound of a bullet and become a God who has no respect for human life.  From that time on I embarked upon a new journey in quest for another God—a God who respects human life and values every human being” (p. 45).  

Her journey, fortunately for her and her family, led to the United States.  In “1988 I got a heaven-sent gift:  I received my American visa, after spending three days and nights outside the U.S. consulate in Damascus” (p. 93).  Living in Los Angeles, she found the freedom she’d always craved, even though it meant initially working at a gas station!  Amazingly, Americans treated her better as a gas station attendant than Syrians had treated her as a medical doctor.  As she studied to improve her English and take medical exams in her new country, she also began to read voraciously, turning to books that helped her understand the difference between America and Syria.  And she began writing and speaking, in both English and Arabic, identifying Islam as the source of the backwardness and evils in much of the world.  That, ultimately, led to her appearance on Al Jazeera and her international celebrity as a woman willing to challenge Muslim clerics.  

She is deeply distressed by Muslims who fail to repudiate Islam when granted freedom in the West.  Indeed:  “I have no hopes for Muslims, men or women, who live in the West.  They are, quite simply, hypocrites.  They are trying to have the best of both worlds” (p. 145).  Even worse, they are rearing their children to espouse radical Islam and its terrorist tendencies.  For her, there is no hope for any reformation of Islam, for “Islam is a sealed flask.  Its stopper allows no ventilation” (p. 155).  There is no freedom, in any realm of life, for the Muslim.  All is dictated.  Just as Allah dictates, without reason, so too Muslim rulers dictate to their passive followers.  “Obey Allah and the Apostle and those in authority among you,” says the Koran.  Independent thinking and speaking and acting cannot be tolerated, for “Islam is a closed market.”  Nor does “the concept of responsibility” have much standing.  Muslim men still approach life as warriors, killing and raiding, rather than working and building.  And when they fail they become victims, convinced everyone is against them—especially the Jews, who are routinely scapegoated.  

To Sultan, there is an inescapable “clash of civilizations” that can be resolved only by the defeat and destruction of Islam.  She is never shocked by terrorist attacks—it’s all a part of the Islamic approach to the world.  Nor is she surprised by the deceitful strategies of Muslims living in the West.  Americans especially, she laments, “are not expert at either debate or trickery.  They say what they mean and mean what they say, and have no idea that they are dealing with people skilled in saying what they don’t mean and meaning what they have never said” (p. 204).  Living in her adopted country, daily thankful for the many freedoms she now enjoys, she loves “America as few people do, and my love for it makes me feel concern for it” (p. 235).  

The election of Barack Obama in 2008 has intensified her concern because the “Islamists are not particularly interested in whether Obama is a Muslim or not:  The fact that the American president bears a Muslim name like Hussein is enough to convince them that Islam is marching into America and has already infiltrated the White House” (p. 239).  Even more alarming, to Wafa Sultan, was a remark made by Colin Powell during the election campaign, when he declared that even if Obama were a Muslim that would be fine!  How amazing that the former “secretary of state couldn’t see what was wrong with America’s choosing a Muslim president, even though it is the country that has suffered most from Muslim terrorism and paid the highest price because of it” (p. 240).  Powell’s naïveté, the author fears, typifies this nation’s leaders—and could easily lead to its downfall.  To prevent this—to awaken America to the real nature of Islam and the designs of its practitioners—Sultan has written this provocative treatise.  

* * * * * * * * * * * * * * * * * * * * 

Allah—the “God who hates” so detested by Wafa Sultan—is the God one finds in the “holy book” of Islam, says Robert Spencer in The Complete Infidel’s Guide to the Koran (Washington, D.C.:  Regnery Publishing, Inc., c. 2009).  It’s the latest of his nine critical treatises on Islam, and he believes every American needs to know what’s in the Koran since public pronouncements generally describe it as a depository of peaceful admonitions and prescriptions for righteousness.  On the contrary, he argues, the sacred text of Islam continually calls the faithful to violently spread Mohammed’s message, to engage in ceaseless jihad until everyone bows in submission to Allah.  Radical jihadists take this mandate literally.  The Pakistani Beitullah Mehsud, for example, declared that “‘Allah on 480 occasions in the Holy Koran extols Muslims to wage jihad.  We only fulfill God’s orders.  Only jihad can bring peace to the world’” (p. 7).  And Mehsud makes it clear that jihad means military action, not inner discipline.  

The glaring discrepancy between what American leaders say about Islam and what the Koran actually teaches was evident when Barack Obama spoke in Cairo soon after he became president.  He praised the compassionate, peaceful message of the “holy Koran,” citing an illustrative verse.  Importantly, however, he ignored the next verse, which calls for crucifying or amputating the limbs of all who dare resist Islamic expansion.  Obama then cited a verse apparently calling for religious tolerance, whereas “this Koranic passage is actually about fighting unbelievers and doesn’t remotely advocate peaceful coexistence” (p. 12).  “Out of this command to wage jihad warfare against unbelievers, Obama cherry-picked one sentence that made it appear as if the Koran were simply counseling one to speak the truth, mindful of the divine presence.  He took a passage about warfare and division and passed it off as a call for us all to come together and sing ‘Kumbaya’” (p. 13).  

To provide us with a better perspective than that espoused by Obama, Spencer endeavors to explain “what exactly is the Koran,” revered as “the Book” by devout Muslims and “absolutely central to Muslim life and culture” (p. 25).  It is allegedly a record of Allah’s dictates, transmitted through Mohammed, perfect and eternal in every aspect—including its Arabic tongue (the only approved language for reading and prayer).  It is, however, obviously situated in a very particular historical period and largely reveals answers to a variety of Mohammed’s questions.  To an “infidel” such as Spencer, the historical record of the book’s composition and canonization makes it much less than a fully divine book, but to understand Islam one must appreciate the high and infallible status it enjoys within the Muslim world.  

Attentively reading the Koran reveals its real message:  Mohammad is the last and greatest of the prophets.  In short:  it’s all about him!  Prior prophets, from Abraham to Jesus, have value only as they prefigure and prepare the way for Mohammad.  Jewish and Christian scriptures are twisted beyond recognition in order to justify his agenda.  And the series of revelations recorded in the Koran primarily provide the prophet with guidance or justification as he shifts from a religious reformer to a military leader (as well as adding to his growing collection of wives).  

Mohammad also enunciated Allah’s hatred for all who reject Islam, asserting they are “immensely, inveterately corrupt” (p. 96).  There is, quite simply, no tolerance in the Koran for other religions:  “And in rejecting the message of Muhammed and refusing to worship Allah alone, Infidels commit the worst of all sins” (p. 98).  Those Jews and Christians who may be unaware of the truth of Islam are “proto-Muslims” and better off than polytheistic pagans, but once they encounter the Truth they must embrace the message of Mohammad or be damned.  Consequently, “most Muslim commentators believe that the Jews are those who have earned Allah’s wrath and the Christians are those who have gone astray” (p. 103).  Though Mohammad spoke positively about Jews in his early years, when they opposed him in Medina he began to verbally abuse and physically attack them, declaring that “Allah transforms disobedient Jews into apes and pigs” (p. 126).  Trinitarian Christians necessarily violate the strict monotheism Mohammad preached, and though Jesus receives some high marks as a prophet He was merely a man, significantly inferior to Mohammad.  

The Koran’s message on women elicits some of Spencer’s most critical comments.  Clearly women are inferior to men and regarded as their possessions.  Indeed, “the Koran likens a woman to a field (tilth), to be used by a man as he wills” (p. 158).  Consequently, a “woman’s legal testimony is worth half that of a man” (p. 158) and a son inherits twice as much as a daughter.  Polygamy is widely approved and practiced, even in Western nations where Muslim immigrants gather.  Diligent researchers believe there are “tens of thousands of polygamous unions” in the United States (p. 170).  A man is entitled to up to four wives and may (following Mohammad’s example) marry pre-pubescent girls.  (The Prophet himself married a six-year-old child and consummated the union three years later.)  To divorce his wife a Muslim man simply says “I divorce you,” whereas a woman must go through a different and much more difficult process.  Shi’ite men may also “enter into marriages that have a time limit:  the couple may marry one another for one night, or a weekend, or a month, or any limit they choose” (p. 164).  

Needless to say, Spencer finds the Koran responsible for much Islamic intolerance and violence.  He shares the position of two Hindus who petitioned “the Calcutta High Court alleging the Koran violated Indian law because it ‘incites violence, disturbs public tranquility, promotes, on ground of religion, feelings of enmity, hatred and ill-will between different religious communities and insults other religions or religious beliefs of other communities in India’” (p. 215).  He also cites, with guarded approval, the views of Geert Wilders, the Dutch Parliamentarian renowned for his hostility to Islam.  “‘The Koran’s core theme,’ he said, ‘is about the duty of all Muslims to fight non-Muslims; an Islamic Mein Kampf, in which fight means war, jihad.  The Koran is above all a book of war—a call to butcher non-Muslims’” (p. 218).  

So, “for Infidels, the Koran is a dangerous book” (p. 232).  

* * * * * * * * * * * * * * * * * * * * 

Shortly after the 9/11 terrorist attacks, Ravi Zacharias recorded his reaction in Light in the Shadow of Jihad (Sisters, Oregon:  Multnomah Publishers, Inc., c. 2002).  Born in India, Zacharias has extensive knowledge of world religions and has written and spoken widely as an esteemed Christian apologist.  Regarding 9/11, he says, “the world came to a standstill” (p. 9).  Clearly a war began and America is engaged, but it “is a different type of war” (p. 11).  Jihad has come to our shores!  “We are in a struggle for survival, and we face an uncertain future before a cruel and ultimately homeless enemy” (pp. 12-13).  

Reflecting on the struggle—which is, in fact, a “war between good and evil”—he argues:  1) the Muslims who attack us fear “a morally strong America;” 2) they are emboldened by the fact that “genuine faith in God suffers in this country at the hands of radical academics among us who have tossed out the Creator in our national ethos and worldview” (p. 14).  One need only visit Harvard Law School to find such academics.  Consider the words of Professor Alan Dershowitz:  “‘I do not know what is right,’ he contends.  It all sounds very honest and real, until he points his finger at his audience and says, ‘And you know what?  Neither do you’” (p. 19).  Neither he nor anyone else knows what is good or true.  

When Americans cannot tell the difference between good and evil, when they are convinced relativists, they cannot defeat Jihadists.  “This is America’s quandary.  How do we determine what is evil?” (p. 25).  There was no quandary for Thomas Jefferson and the American Founders!  They believed that there are “certain unalienable rights, that among these are life, liberty and the pursuit of happiness.”  And they fought to secure them.  Then they established a union to sustain them, knowing, as George Washington declared:  “‘Of all the dispositions and habits that lead to political prosperity, religion and morality are indispensable supports.  In vain would that man claim the tribute of patriotism who should labor to subvert these great pillars of human happiness—these firmest props of the duties of men and citizens….  And let us with caution indulge the supposition that morality can be maintained without religion’” (p. 31).  

The “struggle between truth and falsehood,” Zacharias believes, is the real war.  If Christianity is true, Islam is false.  It’s an either/or issue, and ultimately religion shapes a people’s mind-set.  Zacharias then provides readers with a brief history of Islam along with an explanation of its basic tenets.  An honest reading reveals Jihad—violently imposing Islam wherever possible—as central to Muslim life.  Members of the Muslim Brotherhood, for instance, read materials such as The Missing Religious Precept that “seethes with hate, incites to kill and destroy, calls for the spread of fear among the ‘backsliders and infidels,’ for the murder of leaders, for the scorching of a nation’s vegetation and its livelihood, and for building absolute fearlessness in its followers” (p. 47).  

To fight against such radical Jihadists, Zacharias urges Christians to remain rooted in their Source, ponder the message of the Bible, and win the battle for men’s minds.

* * * * * * * * * * * * * * * * * * * * * 

Samir Khalil Samir, a Jesuit priest of Egyptian and Italian descent, was born in Cairo and educated in Europe.  The president of the International Association of Christian Arabic Studies, he now lives in Beirut and is considered one of the greatest scholars in the Mideast.  Two journalists, Giorgio Paolucci and Camille Eid, interviewed Samir, and their edited transcripts appear as 111 Questions on Islam (San Francisco:  Ignatius Press, c. 2008).  This is best approached as a short, accessible reference work (with a helpful chronology, bibliography, glossary, and index) rather than a sustained treatise, but it certainly contains the considered, carefully nuanced judgment of one of the world’s most knowledgeable scholars.  

Samir responds to questions regarding the foundations of Islam, the possibilities of change within the Muslim world, the problems of human rights wherever Sharia is established, the issues raised by immigrants to the West, and relations between Islam and Christianity.  Explaining the importance of the Muslims’ belief that the Qur’an is “the tongue of God,” he notes that interpreting the text is virtually impossible.  Consequently, rote memorization and repetition of the sacred book is mandatory—you may apply the words to current contexts, but reasoning about and interpreting them is not allowed.  “For Muslims, the Qur’an can be compared to Christ:  Christ is the Word of God made flesh, while the Qur’an—please forgive my play on words—is the word ‘made paper’” (p. 45).  

This means, of course, that it is virtually impossible for Islam to significantly change, to reform, to adjust to the modern world.  Thus “the term ‘jihad’ indicates the Muslim war in the name of God to defend Islam” (p. 62).  Efforts to portray it otherwise simply find no basis in the Qur’an, and those in the West who take Islam to be tolerant and peaceful “usually know very little about Islam” (p. 65).  Similarly, since all truth is found in the Qur’an there is no room for human reason.  Do what you’re told without questioning why!  The natural law, so significant in shaping the philosophy and theology of Western Civilization, has no real parallel in Islam, and Muslims think “it inconceivable to speak of natural law apart from religious law (shari’a) given by God to man, being persuaded that there is no universal given that is not already included in the Islamic conception of life.  While in Christianity one starts from reason and arrives at revelation, in the classic Islamic conception revelation comes before reason and prevails upon it, engulfs it” (p. 201).  

Throughout 111 Questions on Islam one fact routinely appears:  Islam lacks real respect for reason.  “In Islamic thought, the argument of authority prevails (‘God established this’) over that of rationality (‘reason allows man to reach the knowledge of moral law’).  The qur’anic norm is more authoritative than reality” (p. 91).  Thus a noted Muslim theologian, explaining why the faithful should make a pilgrimage to Mecca, said:  “‘The pilgrimage is the most irrational thing in Islam.  There we perform gestures and rites that are absolutely irrational.  For this reason, the pilgrimage is the place where we can, better than any other place, demonstrate our faith because reason does not understand anything at all of it and only faith makes us do those actions.  Blind obedience to God is the best evidence of our Islam’” (pp. 179-180).  Little more need be said!

In many ways the carefully nuanced, restrained language of Samir—along with his obvious concern for the welfare of Muslims—makes his criticism of Islam more persuasive than that of those who write with more passion.  

209 Roger Kimball’s Cultural Critique

Few things matter more than our “culture.”  Few current writers better appreciate its power and explain its texture than Roger Kimball, the managing editor of the New Criterion and an art critic for the Spectator of London and National Review.  And, to speak personally, few writers have better enabled me to get a grasp on developments within education and art, ethics and philosophy.  (His Tenured Radicals:  How Politics Has Corrupted Our Higher Education is as relevant today as when it was published in 1990.)  During the past decade he has published four volumes—all essay collections—that I recommend for anyone wondering about the formative currents of our culture.

First consider The Long March:  How the Cultural Revolution of the 1960s Changed America (San Francisco:  Encounter Books, c. 2000), wherein Kimball observes that “our culture seems to have suffered some ghastly accident that has left it afloat but rudderless:  physically intact, its ‘moral center’ a shambles” (p. 4).  The “ghastly accident” was the revolutionary ‘60s, whose “paroxysms” still create the cultural chaos best evident “in our educational and cultural institutions, and in the degraded pop culture that permeates our lives like a corrosive fog” (p. 5).  The result has been revolutionary, “not in toppled governments but in shattered values” (p. 7).  Symbolic of the era were the rock musicians of the day, including such dissimilar groups as The Beatles and The Rolling Stones, who effectively promoted a Dionysian philosophy of antinomian, hedonistic excess.  

Their values—the ethos of the “counterculture”—quickly infiltrated our schools and colleges, our families and churches, our media and politics.  When they failed to orchestrate a political revolution, the radicals of the ‘60s moved from marching in the streets to launching a “long march through the institutions.”  So doing, they embraced the strategy of Antonio Gramsci (an Italian Marxist) and celebrated Mao Tse-tung’s “long march” and “cultural revolution.”  (Remember the voguish Mao jackets of those days!)  Urged on by Herbert Marcuse (perhaps “the philosopher” of the counterculture in the United States), young radicals like Tom Hayden and Bill Ayers determined to work within established institutions (universities, churches, media) while plotting their destruction.  

Ever attuned to historical developments, Kimball locates Jean-Jacques Rousseau as “an important intellectual and moral grandfather of so much that happened in the cultural revolution of the 1960s” (p. 18).  Rousseau, of course, provided a Romantic strain to the revolutionary turmoil that has blemished the world since the French Revolution.  He talked expansively about “freedom” and “virtue”—as have his devotees, beginning with Robespierre—but his version of virtue “had nothing to do with acting or behaving in a certain way toward others.  On the contrary, the criterion of virtue was his subjective feeling of goodness.  For Rousseau, as for the countercultural radicals who followed him, ‘feeling good about yourself’ was synonymous with moral rectitude.  Actually behaving well was irrelevant if not, indeed, a sign of ‘inauthenticity’ because it suggested a concern for conventional approval” (p. 17).    To understand the frenzy that overwhelmed the churches and schools in the ‘80s—sanctifying self-esteem as the noblest of human traits—one need only turn to Rousseau!  

The turn to Rousseau was evident in the “beatniks” of the 1950s who provided a preview of the coming counterculture.  Allen Ginsberg, Jack Kerouac et al. launched “one of the most toxic cultural movements in American history” (p. 38).  In Kimball’s words:  “The adolescent longing for liberation from conventional manners and intellectual standards; the polymorphous sexuality; the narcissism; the destructive absorption in drugs; the undercurrent of criminality; the irrationalism; the naïve political radicalism and reflexive anti-Americanism; the adulation of pop music as a kind of spiritual weapon; the Romantic elevation of art as an alternative to rather than as an illumination of normal reality; the pseudo-spirituality, especially the spurious infatuation with Eastern religions:  in all this and more the Beats provided a vivid glimpse of what was to come” (p. 46).  

Having established his premises, Kimball carefully analyzes a number of important representatives of the counterculture, including the novelist Norman Mailer, who “is an important figure in the story of America’s cultural revolution not because people found him ridiculous but, on the contrary, because many influential people took the ideas of this ridiculous man seriously” (p. 73).  Then there was Susan Sontag, an “archetypical New-Left writer” who celebrated both the exploits of Fidel Castro and “the pornographic imagination.”  She and other acolytes of Sigmund Freud espoused “sexual liberation,” using the spurious “research” of Alfred Kinsey to justify their liberation from traditional norms.  Giving a Marxist philosophical rationale to the movement, Herbert Marcuse published Eros and Civilization (“a book that became a bible of the counterculture”), and Charles Reich’s The Greening of America celebrated the counterculture as the wave of the future destined to radically improve the nation.  

Concluding his treatise, in a chapter titled “What the Sixties Wrought,” Kimball asserts that the ideology of the ‘60s has “triumphed so thoroughly that its imperatives became indistinguishable from everyday life:  they became everyday life” (pp. 247-248).  If we look clearly, comparing where we are with where we were, we recognize that much in our schools and churches, our music and TV, bears the imprint of an enormous cultural revolution which succeeded through infiltration and subversion, not by challenging and openly defeating traditional ways.  What few of us could have imagined in 1960 has transpired and we now live in a cultural world shaped by the ‘60s.  And to Kimball, at least, this is unmitigated bad news!  

* * * * * * * * * * * * * *

Roger Kimball’s Experiments Against Reality:  The Fate of Culture in the Postmodern Age (Chicago:  Ivan R. Dee, c. 2000) amplifies a phrase of Hannah Arendt, who described “totalitarianism as a sort of ‘experiment against reality’—one that, among other things, encouraged people to believe that ‘everything was possible and that nothing was true’” (p. vii).   Her words prophetically delineated today’s Postmodernists, for whom creating one’s “own reality” and denying “objective truth” are normative; such thinkers follow the injunctions of Friedrich Nietzsche, whose thought “is an indispensable background to almost every destructive intellectual movement this century has witnessed” (p. 6).  It was Nietzsche who “boldly demanded that ‘the value of truth must for once be experimentally called into question’” and declared “‘that there are no facts, only interpretations’” (p. 6).  

Sensing the drift of Western culture a century ago, contemporaries of Nietzsche, including T.E. Hulme and T.S. Eliot, sought to shore up traditional religion and classical thought.  Thus Hulme scorned some of the popular fantasies of his day, including pacifism and socialism, caustically noting:  “‘It is a widespread but entirely mistaken idea to suppose that you amend for the advantages of wealth by asserting verbally that you are a Socialist’” (p. 55).  Eliot shared Hulme’s “craving for reality” and reached a larger audience with basically the same message.  “‘Man is man,’” he said, “‘because he can recognize supernatural realities, not because he can invent them’” (p. 81).  

But the classical approach, with its craving for reality, largely lost the battle for men’s minds.  Take, for instance, the poetry of Wallace Stevens, whom Kimball calls a “metaphysical claims adjuster.”  Influenced by William James, under whom he studied at Harvard, he “devoted his entire life to” working out James’ philosophy.  “‘The final belief,’ he wrote in one typical reflection, ‘is to believe in a fiction, which you know to be a fiction, there being nothing else.  The exquisite truth is to know that it is a fiction and you believe in it willingly’” (p. 90).  Believe it because you know it’s not true!  The relativism and skepticism now pervading Postmodernism are evident in this assertion, yet both ultimately failed Stevens, who in a final poem asked:  “‘I wonder, have I lived a skeleton’s life, / As a disbeliever in reality’” (p. 93).  

The turn-of-the-century Pragmatism of William James, which so shaped 20th century American thought, strongly resembles the Utilitarianism of John Stuart Mill, the main architect of the “liberalism” that fundamentally altered politics and religion, economics and education.  Though some readers of Mill’s On Liberty assume he championed individual liberty, he actually sought, Maurice Cowling insists, to ensure “‘that Christianity would be superseded by that form of liberal, rationalizing utilitarianism which went by the name of the Religion of Humanity.  Mill’s liberalism was a dogmatic, religious one, not the soothing night-comforter for which it is sometimes mistaken.  Mill’s object was not to free men, but to convert them, and convert them to a peculiarly exclusive, peculiarly insinuating moral doctrine.  Mill wished to moralize all social activity. . . .  Mill, no less than Marx, Nietzsche, or Comte, claimed to replace Christianity by ‘something better’” (p. 166).  

One of the most discerning of Mill’s critics, James Fitzjames Stephen, published Liberty, Equality, Fraternity in 1873 and effectively challenged Mill’s liberalism by insisting it mirrored the objectives of the French Revolution and offered a substitute for Christianity—a “Religion of Humanity.”  To Stephen, liberty (like fire) is per se neither good nor bad, and the liberty Mill espoused “boils down to the exhortation:  Let everyone please himself in any way he likes so long as he does not hurt his neighbor” (p. 174).  Assuming the basic goodness of man, Mill naively declared “that if men are all freed from restraints and put . . . on an equal footing, they will naturally treat each other as brothers, and work together harmoniously for their common good” (p. 177).  To believe that now, in the light of gulags and death camps, demonstrates the sheer irrationality of the liberalism that still reigns in the hearts of our school teachers and social planners.  

Markedly different from Mill (though sharing his disdain for Christianity), Friedrich Nietzsche conducted his own experiment against reality.  He especially wanted to destroy the foundation of Western morality, and without question he cast a long shadow over 20th century developments.  “Nietzsche’s glorification of power and his contention that ‘there are altogether no moral facts’ are grim signatures of the age.  So, too, is his enthusiasm for violence, cruelty and the irrational” (pp. 189-190).  Declaring “God is dead,” he “foresaw the rise of anomie, the spreading sense of angst and meaninglessness, what the Czech novelist Milan Kundera called ‘the unbearable lightness of being’:  the whole existentialist panoply of despair and spiritual torpor.  All this Nietzsche diagnosed under the heading of nihilism:  the situation, he wrote, in which ‘the highest values devalue themselves’ and the question ‘Why?’ finds no answer” (p. 192).  

Such nihilism now figures prominently in academia, where cadres of professors “parrot his ideas and attitudes.  Nietzsche’s contention that truth is merely ‘a moveable host of metaphors, metonymies, and anthropomorphisms,’ for example, has become a veritable mantra in comparative literature departments across the country” (p. 193).  And his nihilism marks modern times, Kimball believes.  “He defines the good as that which enhances the feeling of life.  If ‘to see others suffer does one good, to make others suffer even more,’ then violence and cruelty may have to be granted the patent of morality and enlisted in the aesthete’s palette of diversions.  In more or less concentrated form, Nietzsche’s ideal is also modernity’s ideal.  It is an ideal that subordinates morality to power in order to transform life into an aesthetic spectacle.  It promises freedom and exaltation.  But as Novalis points out, it is really the ultimate attainment of the barbarian” (p. 213).  Since WWII, the toxic ideas of Nietzsche have been popularized by Jean-Paul Sartre and Michel Foucault (“Nietzsche’s ape”).  Despite the fact that Foucault’s scholarship “is riddled with errors and deliberate obfuscations” (p. 256) and “his arguments rest on shoddy scholarship, distorted history and untenable generalizations” (p. 257), he became a guiding light for many “postmodern” thinkers.  

In fact, there is a radical difference between Foucault and the makers of Western Civilization who generally join John Henry Newman in declaring that “to think correctly is to think like Aristotle,” humbly exploring and knowing Reality rather than conducting “experiments against” it.  Newman, of course, was a philosophical realist who took the very traditional notion of philosophy as “an attitude of openness,” a commitment to contemplative thought, a “patient receptiveness to reality” (p. 342).  To this task Kimball directs us if we want to effectively respond to the barbarism and cultural chaos of modernity.  

* * * * * * * * * * * * * * *

In The Lives of the Mind:  The Use and Abuse of Intelligence from Hegel to Wodehouse (Chicago:  Ivan R. Dee, c. 2002), Roger Kimball probes the positions of a variety of influential thinkers.  To introduce this collection of essays he refers us to Raymond Aron’s masterpiece, The Opium of the Intellectuals, which excoriated those utopian ideologies (chiefly Marxism) that have proved so alluring and destructive—the intellectuals’ “drug of choice.”  Aron recognized the power of ideas, however deranged and dangerous.  So did Irving Kristol, who said (in a 1973 essay entitled “Utopianism, Ancient and Modern”):  “‘The truth is that ideas are all-important.  The massive and seemingly solid institutions of any society—the economic institutions, the political institutions, the religious institutions—are always at the mercy of the ideas in the heads of the people who populate these institutions.  The leverage of ideas is so immense that a slight change in the intellectual climate can and will—perhaps slowly but nevertheless inexorably—twist a familiar institution into an unrecognizable shape’” (p. 17).  

Consider the fact that Plutarch’s Lives are no longer read in the schools.  Once a staple of the liberal arts (Plutarch was widely revered as “Europe’s schoolmaster”), his masterpiece constantly stressed the “issue of character.”  Following good ideas made good men; bad ideas, such as those espoused by the traitorous Alcibiades, led to disaster and destruction, both for himself and Athens.  History, for Plutarch, was “a moral theater whose performances it was his task to recapitulate for the edification of himself and his readers” (p. 31).  That he is today almost totally neglected illustrates the poverty of our culture.  Sadly enough we now feed our young an ideological diet shaped by the notions of Schiller and Schopenhauer, Kierkegaard and Kant, Hegel and Marx, Russell and Wittgenstein—all of whom receive both biographical and analytical treatment in Kimball’s essays.

Addressing “the difficulty with Hegel,” who so powerfully influenced many 19th century intellectual developments, Kimball says:  “I am perfectly happy to acknowledge that Hegel was a genius.  But so what?  That doesn’t mean he was right.  It doesn’t even mean that he was intelligible.  As the English essayist Walter Bagehot observed in another context, ‘in the faculty of writing nonsense, stupidity is no match for genius’” (p. 126).  And, indeed, “Hegel wrote a great deal of nonsense” (p. 126), primarily because (as George Santayana noted) he engaged in “the preposterous effort of ‘making things conform to words, not words to things’” (p. 131).  To Hegel, and other absolute idealists, things are so because I think they are—“the real is rational and the rational is real.”  Declaiming such nonsense amidst an effusion of erudition, Hegel and his confreres departed from the knowable world to the quicksand of imagination.  

Though Kimball faults most of his subjects, he finds a few intellectuals who merit serious attention.   The great French writer, Alexis de Tocqueville, for instance, rightly discerned the fatal flaws in much that paraded as “progress” and “wisdom” in 19th century political and intellectual circles.  Anthony Trollope, the English novelist, wrote with a distinctive moral purpose, extolling in his characters the virtues that lead to the truly good life.  Both P.G. Wodehouse and Charles Peguy receive high praise, as does the Australian David Stove, one of the few contemporary philosophers who provided lucid corrections to much that has been written in the past 200 years.  In his works (including Darwinian Fairytales) Stove’s sharply argued positions “flew in the face of just about every intellectual cliché going, from relativism and irrationalism to doctrinaire Darwinism to the whole smorgasbord of established liberal orthodoxy about (e.g.) art, race, sex, nationalism, J.S. Mill, tobacco, education, and foreign policy” (p. 249).  Unfortunately, Kimball argues, the best thinkers have received the least attention, and our culture therefore suffers.

* * * * * * * * * * * * * * * 

Roger Kimball’s concern for the state of our fine arts is evident in The Rape of the Masters:  How Political Correctness Sabotages Art (San Francisco:  Encounter Books, c. 2004).   As he does in almost all of his criticism, he laments both the tawdry quality and highly subjective, unrealistic nature of most modern art.  Witness, for example, Marcel Duchamp adding a moustache to a reproduction of the Mona Lisa and signing (as “R. Mutt”) a urinal that was solemnly placed in an exhibit!  Yet art historians and critics have fallen, lock-step, into a celebration of “artists” such as Duchamp.  Amazingly, the very folks who should be elevating and ennobling our lives actually endeavor “to transform art into an ally in the campaign of decivilization” (p. 11).  So Kimball wrote this book “to provide an antidote—or at least an alternative—to the poison that has infiltrated the study of art history” (p. 27).  

Arguing his case, Kimball pillories recent treatments given seven great artists:  Gustave Courbet, Mark Rothko, John Sargent, Peter Paul Rubens, Winslow Homer, Paul Gauguin, and Vincent van Gogh.  Courbet, for example, was “the leader of the [French] Realist school of painting in the 1850s” (p. 33) and “produced art that was accessible, generally unproblematic, easily understood” (p. 36).  In the fevered mind of a contemporary art historian, however, his “realist paintings provide ‘an archetype of the perfect reciprocity between production and consumption that Karl Marx in the “General Introduction” to the Grundrisse posited’” (p. 42).  Even better, he was a harbinger of radical feminism, illustrating the “trendy academic criticism” that relishes endless “talk about sex, the more outlandish the better” (p. 49).  In short, there is little actual correlation between the gifted work of Courbet and his modern interpreters!  

Mark Rothko, born in Russia, emigrated to the United States and “produced some of the most ineffably fetching abstract pictures ever painted” (p. 58).  He defined himself as a “realist” and sought to embody in his paintings “the philosophical and moral gravitas of his favorite authors” such as Shakespeare (p. 61).  Needless to say, art critics saw much more in his work, including subtle manifestations of pietas or Hegelian dialectics!  Rather than simply taking pleasure in the beauty of what’s there, they find in his painting whatever pleases them.  Similarly there are eminent academics who project their own fantasies upon John Singer Sargent, a 19th century American whose “dazzling career” brought him fame and fortune.  One such scholar, Professor David Lubin, an art “historian” teaching at Wake Forest University, “regards the past less as a window than a mirror.  He gazes steadily at his subject and he sees—himself” (p. 80).  After all, he argues, the past is unknowable, so all he can say, studying the paintings of Sargent, is what he finds in his own mind.  

Such outrages characterize the academic discussions of the other artists Kimball covers.  Influencing them all are Heidegger and Derrida and Foucault and a choir of “postmodernists.” And they all illustrate Roger Scruton’s observation that “‘There is no greater error in the study of human things than to believe that the search for what is essential must lead us to what is hidden.’  This is the deep truth behind Oscar Wilde’s quip that only a very shallow person does not judge by appearances” (p. 161).  

# # # 

208 Love & Responsibility

The late Pope John Paul II (Karol Wojtyla), now routinely called “the Great” by many of his admirers, not only presided over the Catholic Church during an unusually tumultuous time but also left a strong legacy of written works, bound to help shape theological discourse in coming decades.  One of his most important works, Love and Responsibility, tr. by H. T. Willets (San Francisco:  Ignatius Press, c. 1981), first published in Polish as the product of a lecture series at the Catholic University of Lublin in 1958-1959, merits serious study.  “The present book,” wrote Wojtyla, “was born principally of the need to put the norms of Catholic sexual morality on a firm basis, a basis as definitive as possible, relying on the most elementary and incontrovertible moral truths and the most fundamental values or goods” (p. 16).  Determined to reaffirm the traditional Christian position, he sought to do so in ways appropriate for the 20th century, attuned to the philosophical and psychological currents of his day.   And ultimately he wanted to joyously declare “the fundamental appeal of the New Testament, embodied in the commandment to love and in the saying ‘Be ye perfect’, a call to self-perfection through love” (p. 257).  

He began by examining “the person and the sexual urge,” emphasizing the difference between objective and subjective realities.  There’s an inner subject, as well as an outer object, to each of us.   Unfortunately, subjects can be reduced to objects, treated as something less than persons.  Importantly, then:  “The term ‘person’ has been coined to signify that a man cannot be wholly contained within the concept ‘individual member of the species’, but that there is something more to him, a particular richness and perfection in the manner of his being, which can only be brought out by the use of the word ‘person’” (p. 22).  Persons, unlike other terrestrials, can reason, as Boethius famously declared, defining a person as an individual being of a rational nature (individua substantia rationalis naturae).  

Capable of reasoning, man is by nature a spiritual being with a rich inner life that “revolves around truth and goodness.”  We wonder about ultimate things—why we’re here, where we came from, where we’re going—ultimately meaningful only within an invisible world of God’s making.  And in amazingly gracious ways we encounter “the most profound logic of revelation:  God allows man to learn His supernatural ends, but the decision to strive towards an end, the choice of course, is left to man’s free will.  God does not redeem man against his will” (p. 27).  He freely chooses to love or not.  “Man’s capacity for love depends on his willingness consciously to seek a good together with others, and to subordinate himself to that good for the sake of others, or to others for the sake of that good.  Love is exclusively the portion of human persons” (p. 29).  Still more deeply:  “Love is the unification of persons” (p. 38).  We’re called to love persons, not to use them for selfish pleasures.  Such love, importantly, must remain rooted in “the order of justice,” which is “more fundamental than the order of love” (p. 42).  Love goes beyond giving another person what is due him, but it never violates that basic principle of fairness, the commitment to doing what is right for the other.  

Such justice extends to God as well as man.  We do justice to God when we recognize Him as Creator of all things, ourselves in particular.  He alone gives and sustains life.  The world works in accord with His wisdom, and we live well when we desire to fit in with His designs.  “And this understanding and rational acceptance of the order of nature—is at the same time recognition of the rights of the Creator when he recognizes the order of nature and conforms to it in his actions” (p. 246).  Still more:  “Man, by understanding the order of nature and conforming to it in his actions, participates in the thought of God, becomes particeps creatoris, has a share in the law which God bestowed on the world when He created it at the beginning of time” (p. 246).  

Living rightly, doing justice, we must rightly understand our sexuality, “one of the crucial problems in ethics” (p. 54), recognizing that the real “end of the sexual urge is the existence of the species, Homo, its continuation (procreatio), and love between persons, between man and woman, is shaped, channeled one might say, by that purpose and formed from the material it provides” (p. 53).  Sex, basically, is all about babies!  The primary reason for sexual relations, evident in nature, is the preservation of the species.  And yet, unlike other animals, we have a spiritual as well as “libidinistic” nature, so sex necessarily means more than mere procreation.  Thus the Church always insists “that the primary end of marriage is procreatio” but it also provides a way for men and women to both enjoy physical pleasure and establish a spiritual union—a “mature synthesis of nature’s purpose with the personalistic norm” (p. 67).  

Turning to a metaphysical and psychological analysis of “love,” Wojtyla insists it must rightly fuse freedom and truth, “the primary elements of the human spirit” (p. 116).  To freely affirm the worth of another, to deeply desire the happiness of one’s beloved, to fully commit oneself to establishing and maintaining the well-being of him or her, is the authentic way of loving.  Still more:  “to desire ‘unlimited’ good for another person is really to desire God for that person:  He alone is the objective fullness of the good, and only His goodness can fill every man to overflowing” (p. 138).  Taking personal responsibility, doing one’s duty, validates love.  Mutually giving and receiving establishes the ontological union—becoming one flesh—that we by nature deeply desire.  

Such love presumes chastity, continence—properly controlling sexual behavior, developing self-mastery, perfecting our being.  Sadly rare in our sensual culture, without this virtue there can be no true love.  The “love” celebrated in films, little more than “carnal concupiscence,” promulgates “a ‘love’ which is not love, a love which provokes erotic feelings based on nothing but sensual desire and its satisfaction.  These feelings have as their object a person of the other sex, yet do not rise to the level of the person, since they do not go beyond ‘the body and sex’, as their proper and sole content” (p. 15).  Two people “having sex” engage in what’s properly defined as “bilateralism”—using each other, doing things together that provide pleasure but never establish a reciprocal, unifying personal relationship.  They confuse the fervor of their “feelings” with the reality of love, when in fact:  “‘Authenticity’ of feeling is quite often inimical to truth in behavior” (p. 163).  Indulging a “sinful love” that betrays their deepest desire for happiness, they exchange ultimate goods for passing pleasures.  They sincerely “feel” and thus misinterpret their “love,” for it’s an illusion, a fiction:  “‘Sinful  love’ comes into being when affirmation of the value of the person, and intentness on the true good of the person, (which are at the core of true love) are absent, and instead a hankering after mere pleasure, mere sensual enjoyment connected with ‘sexual experiences’ invades the relationship between man and woman.  ‘Enjoying’ then displaces ‘loving’” (p. 164).  

Chastity, however, says “yes” to loving rather than enjoying, manifesting a “tenderness” that fuses benevolence and devotion.  “The essence of chastity consists in quickness to affirm the value of the person in every situation and in raising to the personal level all reactions to the value of ‘the body and sex’” (p. 171).   Refusing to use another person as a pleasurable object, chaste lovers manifest a “loving kindness” rooted in their commitment to each other’s well-being.  It also calls for modesty.  Thus, wrote Wojtyla:  “A woman wants to be loved so that she can show love.  A man wants to love so that he can be loved.  In either case sexual modesty is not a flight from love, but on the contrary the opening of a way towards it.  The spontaneous need to conceal mere sexual values bound up with the person is the natural way to the discovery of the value of the person as such.  The value of the person is closely connected with its inviolability, its status as ‘something more than an object of use’.  Sexual modesty is as it were a defensive reflex, which protects that status and so protects the value of the person” (p. 179).  

Living chastely, one may give himself to another person.  Giving one’s self is the highest form of love.  Dying to self we find our self, for self-gift is the highest and most distinctive potential we have as humans designed in God’s image.  Death to self—the ancient key to holiness—opens the door to a joyous intimacy in marriage as well as union with the Lord.  Love “makes for unification through the reciprocal gift of self” (p. 127).  Indeed:  “Love consists of a commitment which limits one’s freedom—it is a giving of the self, and to give oneself means just that:  to limit one’s freedom on behalf of another” (p. 135).  Taking responsibility for the well-being of another being, giving one’s life (often in routine domestic chores) to preserve the life of another, brings to fruition all that is good in a person.   

* * * * * * * * * * * * * * *

Soon after his election as Pope, John Paul II began exploring, in his weekly addresses, Man and Woman He Created Them:  A Theology of the Body, tr. by Michael Waldstein (Boston:  Pauline Books and Media, c. 2006, 1997).  We discover in Genesis, he says, “the ‘beginning’ of the theology of the body.  The fact that theology also includes the body should not astonish or surprise anyone who is conscious of the mystery and reality of the Incarnation.  Through the fact that the Word of God became flesh, the body entered theology—that is, the science that has divinity for its object—I would say, through the main door” (p. 221).  Though obviously focused on man, woman, and marriage, the Pope had far more in mind as he addressed the subject.  As was evident in Love and Responsibility, these messages sought to correct (as Christoph Cardinal Schonborn notes) the egregious “habit widespread among intellectuals of confusing the order of nature with the biological order” (p. xxiii).  By nature we’re embodied souls.  By nature we’re “endowed with certain unalienable rights.”  By nature we’re capable of recognizing the difference between right and wrong.  There’s obviously more to “the order of nature” than matter-in-motion, for it rightly “includes all these richer relationships among real beings” (p. xxiv).  In his Letter to Families, John Paul II said that “man is a person in the unity of his body and his spirit.  The body can never be reduced to mere matter:  It is a spiritualized body, just as man’s spirit is so closely united to the body that he can be described as an embodied spirit” (p. 96).    

So John Paul’s “catechesis,” Christopher West explains, “illumines the entirety of God’s plan for human life from origin to eschaton with a splendid supernatural light.  It’s not only a response to the sexual revolution, it’s a response to the Enlightenment.  It’s a response to modern rationalism, Cartesian dualism, super-spiritualism, and all the disembodied anthropologies infecting the modern world.  In short, the theology of the body is one of the Catholic Church’s most critical efforts in modern times to help the world become more ‘conscious of the mystery and reality of the Incarnation’—and through that to become more conscious of the humanum, of the very purpose and meaning of human life” (p. xxvii).  

Though various editions of the Pope’s weekly addresses have been printed, this one excels because of the work of its translator.  “Michael Waldstein,” says George Weigel (the author of a fine biography of John Paul II), “is going to put many people in his debt with this superb piece of work, a labor of love shaped by an acute intelligence.  The illuminating translation, the brilliant [128 page] introduction, and the carefully crafted index will make this the standard English-language edition throughout the twenty-first century for scholars, for pastors, for students, and indeed for anyone interested in exploring John Paul II’s most creative contribution to human self-understanding.”  Waldstein asserts that “the full greatness of John Paul’s vision only emerges when one sees his concern for spousal love in the larger context of his concern about our age, above all for the question of scientific knowledge and power over nature, that is, the characteristically modern question of ‘progress.’  He argues that ‘the essence of the church’s teaching’ about contraception lies in a more critical judgment about ‘the domination of the forces of nature’ by human power” (p. 3).   

The desire to dominate nature gained philosophical traction (as ecologists have long recognized) in the works of Francis Bacon (a British empiricist) and Rene Descartes (a French rationalist), the 17th century architects of “modern” philosophy.  To John Paul II, the “scientific rationalism spearheaded by Descartes is above all an attack on the body” (p. 95).  Following Descartes and drawing on scientists such as Isaac Newton, various forms of mechanistic thinking soon dominated elite scientific and philosophical circles.  As John Paul II noted, in Evangelium Vitae:  “Nature itself, from being ‘mater’ (mother), is now reduced to being ‘matter,’ and is subjected to every kind of manipulation” (p. 100).  Consequently we now live in a world that “rejects the very idea that there is a truth of creation which must be acknowledged, or a plan of God for life which must be respected” (p. 100).  Even Immanuel Kant, despite his insistence that we treat persons as ends rather than means to an end, took for granted the essentially agnostic, materialistic perspectives of Thomas Hobbes and David Hume.  The inner, subjective, autonomous realm for Kant had little congruence with the external physical world, where “objective” scientific procedures prevailed.  

The German philosopher Max Scheler, however, critiqued Kant and rejected much of the “modern” philosophical endeavor; he insisted that love for the world, not the desire to manipulate it, opens the mind to the Truth.  To Waldstein:  “The opposition between Kant and Scheler goes to the very roots of philosophy.  For Scheler the central animating principle of philosophy is the desire to dwell with love and devotion in a receptive, contemplative vision in order to grasp what is truly evident.  Against the ‘constructions’ of Kantian Idealism, he insists that philosophy must have a supple and obedient regard for what is given in experience.  Philosophy must be an account (logos) of what is truly evident (phainomenon).  In short, it must be phenomenology.”  In agreement with Scheler, John Paul II emphasizes love as the animating principle of phenomenology.  “Phenomenology is primarily a style of thought, a relationship of the mind with reality, whose essential and constitutive features it aims to grasp, avoiding prejudice and schematisms.  I mean that it is, as it were, an attitude of intellectual charity to the human being and the world, and for the believer, to God, the beginning and end of all things” (p. 65).  

To Scheler’s phenomenology John Paul II added the profoundly Christian personalism of St. John of the Cross, whom he encountered as a student during WWII.  To master John’s mystical theology in the original language, which routinely compares the soul’s bond with God with the union of husband and wife, the future pope learned Spanish.  A few years later he wrote his doctoral dissertation on Faith according to St. John of the Cross and ever after confessed his debt to the writings of the 16th century Carmelite.  “To him I owe so much in my spiritual formation,” said the Pope, and “I have found in him a friend and master who has shown me the light that shines in the darkness for walking always toward God” (p. 26).  In his mystical works, St. John prescribed reciprocal giving as the key to union with God—and he used the spousal union as the finest earthly example of this truth.  “’This spiritual marriage,’” he said, in Spiritual Canticle, “‘is total transformation in the Beloved, in which each surrenders the entire possession of self to the other with a certain consummation of the union of love’” (p. 31).  “God so loved the world that He gave His only-begotten son.”  Loving means giving, and we imitate God inasmuch as we give ourselves away.

           I have devoted most of this review of Man and Woman He Created Them to Michael Waldstein’s introduction to the book for two reasons:  1) he makes clear the philosophical and theological issues basic to the text; 2) he provides a cogent analysis of the underlying structure of the Pope’s message—difficult to perceive when simply reading the weekly addresses, which are frequently repetitious and intellectually challenging.  Basically, however, the addresses are extended analyses of pivotal biblical texts—including Genesis 2:5-7; Matthew 5:27-28; Matthew 19:3-12; Ephesians 5:21-33.  Though the Pope’s academic training was in philosophy, his weekly meditations refer almost exclusively to the Bible.  Though not renowned as a biblical scholar, he certainly reveals his immersion in the Scriptures and commitment to their authority and insight.  

            In Genesis we find justification for “the theology of the body,” he explains, for it “is linked from the beginning with the creation of man in the image of God, [and] becomes in some way also a theology of sex, or rather a theology of masculinity and femininity” (p. 165).   When stressing the importance of marriage, Jesus referred back to the “beginning,” stressing that the primordial union of Adam and Eve—their “one flesh”—serves as the model for understanding our sexual nature as human beings, for here “God reveals himself above all as Creator” (p. 179).  Man and woman He created them!  “Uniting so closely with each other that they become ‘one flesh,’ they place their humanity in some way under the blessing of fruitfulness, that is, of ‘procreation,’ about which the first account speaks (Gen 1:28).  Man enters ‘into being’ with the consciousness that his own masculinity-femininity, that is, his own sexuality, is ordered to an end” (p. 184).  Following the first of all commandments—“be fruitful and multiply”—Adam and Eve brought children into the world. “The first woman to give birth has full awareness of the mystery of creation which renews itself in human generation.  She also has full awareness of the creative participation God has in human generation, his work and that of her husband, because she says, ‘I acquired a man from the Lord’” (p. 213).   

            Man’s fall into sin disoriented his sexual desires and behaviors.  But in Jesus’ Sermon on the Mount we are reminded of how we ought to behave—“as it was in the beginning.”  Curtailing concupiscence—the lust of the flesh (illicitly “looking to desire”)—his disciples are called to avoid adultery and be faithful to their spouses.   “What Christ demands from all his actual and potential listeners in the Sermon on the Mount clearly belongs to that interior space in which man—precisely the one who listens—must rediscover the lost fullness of his humanity and want to regain it” (p. 301).  Such fullness can only be regained through love—caring for another person so fully that one is willing to sacrifice to give him or her all that is good.  “Called precisely to this supreme value, which is love.  Called as a person in the truth of his humanity, and thus also in the truth of his masculinity and femininity, in the truth of his body.  Called in that truth which has been his inheritance ‘of the beginning,’ the inheritance of his heart, which is deeper than the sinfulness inherited, deeper than the threefold concupiscence.  Christ’s words, set in the whole reality of creation and redemption, re-activate that deepest inheritance and give it real power in human life” (p. 314).  

            The truths of Genesis and the Sermon on the Mount are finally confirmed by St. Paul in Ephesians 5, which celebrates “the interpersonal covenant proper to marriage” (p. 473) through the reciprocal submission of both husband and wife.  This intimate, sacramental union, akin to that between Christ and His Church, “is a revelation and realization in time of the mystery of salvation, of the election of love ‘hidden’ from eternity in God” (p. 476).  There is no higher calling.  There is no better means to holiness, for thereby our body “is capable of making visible what is invisible:  the spiritual and the divine.  It has been created to transfer into the visible reality of the world the mystery hidden from eternity in God, and thus to be a sign of it” (p. 505).   

207 Back to Aristotle & Aquinas with Edward Feser

It’s always delightful to discover an erudite scholar engaging his world with wit and wisdom.  These qualities mark The Last Superstition:  A Refutation of the New Atheism (South Bend:  St. Augustine’s Press, c. 2008), wherein Edward J. Feser takes seriously St. Thomas Aquinas’ warning that “‘A small error in the beginning of something is a great one at the end’” (p. vii).  To illustrate, he begins his treatise by examining the California Supreme Court’s 2008 decision to permit homosexuals to “marry,” arguing that “the most important thing to know about ‘same sex marriage’ is not that it has been lawlessly imposed by certain courts” but that “it is a metaphysical absurdity and a moral abomination” (p. ix).  In truth, neither the courts nor the people can “define” marriage any more than they can decide whether or not the axioms of Euclidean geometry are true.  Marriage, if it stands for anything at all, is “an objective metaphysical fact determined by its final cause, inherently procreative, and thus inherently heterosexual.  There is no such thing as ‘same-sex marriage’ any more than there are round squares.  Indeed, there is really no such thing as ‘sex’ outside the context of sexual intercourse between a man and a woman” (p. 149).  Furthermore, “if ‘same-sex marriage’ is not contrary to nature, then nothing is; and if nothing is contrary to nature, then . . . there can be no grounds whatsoever for moral judgment” (p. 150).  

The California court, however, revealed the progressive, secularist mindset (considered “the last superstition” by Feser) which “is, necessarily and inherently, a deeply irrational and immoral view of the world, and the more thoroughly it is assimilated by its adherents, the more thoroughly do they cut themselves off from the very possibility of rational and moral understanding” (p. 3).  Committed, as they are, to a thorough (and irrationally narrow) materialistic conception of reality, considering only the findings of empirical science, progressive secularists have divorced themselves from the rich, classical philosophical heritage of the West.  Consequently:  “The irony is that to anyone who actually knows something about the history and theology of the Western religious tradition for which [Sam] Harris, [Daniel] Dennett, [Richard] Dawkins, and [Christopher] Hitchens show so much contempt, their books stand out for their manifest ignorance of that tradition and for the breathtaking shallowness of their philosophical analysis of religious matters” (p. 4).   

Feser knows this position well, for he was “for many years a convinced atheist and naturalist” (p. 6).  Slowly, by carefully reading philosophers such as Gottlob Frege, he became persuaded that “there exists, in addition to the material world and the ‘world’ within the human mind, a ‘third realm’ of abstract entities, in particular of meanings and of mathematical objects like numbers” (p. 6).  He then discovered, through Elizabeth Anscombe and Alasdair MacIntyre, the perennial relevance of Aristotle.  Finally, notable philosophers of religion, such as Alvin Plantinga and William Lane Craig, whetted his appetite for St. Thomas Aquinas.   “All of this led me eventually to a serious reconsideration of the Aristotelian tradition in philosophy in general, and of Aquinas’s adaptation of it in particular, and the end result was that I became convinced that the basic metaphysical assumptions that modern secular philosophers rather unreflectively take for granted, and which alone can make atheism seem at all plausible, were radically mistaken” (p. 7).  Thus this book!

To build his case, Feser begins with a brief survey of philosophy’s Greek origins.  With Plato it becomes evident that “when we grasp the essence or nature of being a triangle [for example], [what] we grasp is not something material or physical, and not something we grasp or could grasp through the senses” (p. 33).  We “know the essence of triangularity is something universal rather than particular, something immaterial rather than material, and something we know through the intellect rather than the senses” (pp. 33-34).  Such universals cannot be reduced, however, to mental states—mere ideas within the mind.  “For what we know about triangles are objective facts, things we have discovered rather than invented” (p. 34).  What we know about things are their essences—their immaterial, objective forms.  And knowing forms leads us to discern, with Plato, the ultimate “Form of the Good,” the “source of all being” (p. 37).   Knowing any form (whether triangularity or beauty, squareness or justice) “itself requires in turn knowing the Form of the Good by reference to which it counts as a perfect archetype.  The Form of the Good is thus the highest of the Forms, their source, and indeed the source of all being” (p. 37).  Plato does not specify this ultimate Form as God, though his followers (including Christians such as Augustine) certainly did.

However modified and reformulated, “something like Plato’s theory is notoriously very hard to avoid if we are to make sense of mathematics, language, science, and the very structure of the world of our experience” (p. 40).  Having granted Plato’s greatness, however, Feser finds Aristotle even greater, for his “is the most powerful and systematic realist metaphysics ever developed” (p. 51).  Lamentably, as Feser demonstrates later on:  “Abandoning Aristotelianism, as the founders of modern philosophy did, was the single greatest mistake ever made in the entire history of Western thought” (p. 51).  Aristotle’s insights, such as actuality and potentiality, causation (material; formal; efficient; final) and logic, cannot be abandoned without compromising the whole philosophical endeavor.  In short, “it was the logical development of Aristotelian ideas (primarily by his medieval Scholastic admirers) that provided the most powerful and systematic intellectual foundation for traditional Western religion and morality—and for that matter, for science, morality, politics, and theology in general—that has ever existed” (p. 52).  

Coupled with an appreciation for the ancient Greeks, Feser urges us to learn from Medieval thinkers such as Thomas Aquinas, the “greatest philosopher of the Middle Ages, and among the greatest philosophers, period” (p. 77).  His intellectual debt to St. Thomas Aquinas is further evident in his Aquinas:  A Beginner’s Guide (Oxford:  Oneworld Publications, c. 2009), devoted to his metaphysics, natural philosophy, psychology, and ethics.  When one turns to Aquinas after reading “new atheists” such as Christopher Hitchens and Richard Dawkins (who fail to grasp even elementary differences between science and metaphysics), the luminous superiority of the Angelic Doctor becomes instantly apparent.  Dawkins et al., for example, cannot think of creation except within a linear time frame, but when Aquinas endeavored to develop arguments demonstrating God’s existence, he aimed “to show that given that there are in fact some causes of various sorts, the nature of cause and effect entails that God is necessary as an uncaused cause of the universe even if we assume that the universe has always existed and thus had no beginning.  The argument is not that the world wouldn’t have got started if God hadn’t knocked down the first domino at some point in the distant past; it is that it wouldn’t exist here and now, or undergo change or exhibit final causes here and now unless God were here and now, and at every moment, sustaining it in being, change, and goal-directedness” (Last Superstition, p. 86).  

Aquinas held this because, along with other Medieval thinkers, he grasped the difference between “accidentally ordered” and “essentially ordered” events.  The former occur as a series within time, as illustrated in the biblical lists of sons begotten by their fathers; the latter must be traced not backward but “‘downward’ in the present moment [as when a batter swings the bat, moving his hands and the bat simultaneously, the arm and shoulder and hands causing the bat to move] since they are series in which each member depends simultaneously on other members which simultaneously depend in turn on yet others, and so on” (Last Superstition, p. 93).  When we grasp this distinction, and understand that God is “Pure Act,” we can begin to see why it is that He must, necessarily, exist.  Inasmuch as everything has an “essence,” what it is, it attains being by being actualized, deriving its existence from another, more basic Reality—Being Itself.  “As Peter Geach puts it, for Aquinas the claim that God made the world ‘is more like “the minstrel made music” than “the blacksmith made a shoe”’; that is to say, creation is an ongoing activity rather than a once-and-for-all event.  While the shoe might continue to exist even if the blacksmith dies, the music necessarily stops when the minstrel stops playing, and the world would necessarily go out of existence if God stopped creating it” (Aquinas, p. 88).  

The natural world, though composed of mindless matter, appears mysteriously ordered.  As Aquinas noted, “things which lack intelligence, such as natural bodies, act for an end, and this is evident from their acting always, or nearly always, in the same way, so as to obtain the best result” (Summa Theologiae, 1.2.3).  Trees grow upward, seeking the light; arrows follow a predictable trajectory; acorns drop to the earth, prepared if buried rightly to develop into a lofty oak.  In view of such manifest facts:  “It follows,” Feser says, “that the system of ends or final causes that make up the physical universe can only exist at all if there is a Supreme Intelligence or intellect outside the universe which directs things towards their ends.  Moreover, this intellect must exist here and now, and not merely at some beginning point in the past, because causes are here and now, and at any point at which they exist at all, directed towards certain ends” (Aquinas, p. 117).  Far from being a probable hypothesis, an intelligently ordered cosmos demonstrates, as Aquinas argued, the reality of an Intelligent Agent.  

Emphatically, Feser asserts:  “the classical theistic arguments, and certainly the arguments of such major philosophical theologians as Anselm, Aquinas, and Leibniz, are not properly interpreted as ‘God of the gaps’ arguments at all.  They are not ‘hypotheses’ or attempts to ‘postulate’ a quasi-scientific explanation for particular phenomena that science has not yet accounted for, but which it could in principle account for someday.  They are rather attempts conclusively to demonstrate the existence of a Necessary Being or First Cause of the world on the basis of premises (concerning the metaphysics of causation, say, or the contingency of the material world, or the concept of a greatest possible being) about which empirical science has nothing to tell us.  The question of whether they succeed or fail as proofs is thus independent of the current state of our scientific knowledge” (Philosophy of Mind, pp. 236-237).  

Aristotle and Aquinas also developed the notion of “hylomorphism” (matter-formism) to explain the reality of and differences between nutritive (vegetative), sensitive (animal), and uniquely rational (human) souls.  Feser discusses this in more detail in Philosophy of Mind:  A Beginner’s Guide (Oxford:  Oneworld Publishers, c. 2005, reprinted, with a Postscript, 2006), as well as in his introduction to Aquinas, wherein he defends what philosophers call hylomorphic or Thomistic dualism.  He probes such questions as:  “Is the mind nothing but the brain?  Do you have an immaterial and immortal soul, inaccessible to science and knowable only via metaphysical inquiry?  Can computers think?” (Philosophy of Mind, p. vi).  Debates concerning such things were launched by René Descartes, who “set the agenda for modern philosophy in general and philosophy of mind in particular” (p. 1).  His metaphysical dualism—severing the purely physical res extensa from the purely non-physical res cogitans—has been misrepresented and caricatured and largely rejected by modern thinkers.  But his fundamental insight, noting that there is a radical difference between things and thinking—as real and discrete as the difference between apples and oranges—retains its basic truth.  That we think with our brains does not mean that our minds are nothing but brains.

Philosophers who insist there is no difference between minds and brains—various sorts of materialists—now dominate academic discussions.  So Feser devotes much of Philosophy of Mind to an examination of their arguments.  Following the example of Aquinas, he explains their views judiciously and clearly.  He acknowledges their power and persuasiveness, especially when one limits his thinking to empirical science.  But in the final analysis, they all fail to adequately explain such evident realities as qualia (an inner awareness of such things as pain and color), subjectivity, intentionality, consciousness, and rational thought.  To Aristotle and Aquinas:  “Rationality—the ability to grasp forms or essences and to reason on the basis of them—has as its natural end or final cause the attainment of truth, of understanding the world around us.  And free will has as its natural end or final cause the choice of those actions that best accord with the truth as it is discovered by reason, and in particular in accord with the truth about a human being’s own nature or essence” (Last Superstition, p. 122).  In sum:  to Feser, “Descartes’s basic contention that the mind is irreducible to the brain or body has not been refuted” (Philosophy of Mind, p. 211), though he famously failed to suggest how mind and body interact.  This resulted from his discarding Aristotle and Aquinas, losing their understanding of an immanent teleology in all that is.  He, and most modern philosophers, forfeited a better “conception of matter—in particular a conception in which matter isn’t utterly devoid of mental properties” (p. 219).  This we find in Aristotelian hylomorphism!  Form and matter are two distinctive aspects of all that is, explicable in terms of material, efficient, formal, and final causes.  

            In us humans, the soul (from the moment of conception) forms the body, but it enjoys an existence apart from it.  Amazingly, recent scientific discoveries affirm this hylomorphic perspective, for “the nature and structure of DNA is exactly the sort of thing we should expect to exist given an Aristotelian metaphysical conception of the world, and not at all what we would expect if materialism were true” (Last Superstition, p. 129).   Importantly, to Aquinas, rational thought is “not strictly speaking a bodily operation at all, but an immaterial one.  But if the rational soul operates independently of the body, it cannot depend for its continued existence on the continued existence of the body.  In short, the human soul, unlike the souls of plants and animals and unlike any form of any other kind, is a subsistent form:  it is capable, in principle anyway, of continuing in existence as a particular thing after its separation from the body in death, and even after the destruction of that body” (Philosophy of Mind, p. 225).  

While praising and appropriating insights from the Ancient and Medieval thinkers, Feser certainly casts a critical gaze at the “descent of the modernists.”  Following the trajectory of Medieval nominalists such as William of Ockham, who rejected the realism of Aristotle and Aquinas, modern philosophers have repudiated the very principles that justify the existence of their calling.  Rather than explore “the ultimate causes and meaning of things” they have sought “means of increasing ‘human utility and power’” through the “‘mechanical arts’” or technology (Bacon) and of making us “‘masters and possessors of nature’” (Descartes).  “Usefulness would replace wisdom, and pampering the body in this life would push aside preparing the soul for the next” (p. 175).  In the soil poisoned by Bacon and Descartes, Hobbes and Hume, there sprouted a Mechanical Philosophy determined to reduce all that is to matter-in-motion, ratified (it was assumed) by Newtonian physics.  “The modern metaphysical picture entailed by mechanism, especially when conjoined with nominalism, thus opens up an unbridgeable ‘gap’ between mind and reality” (p. 201).  Yet each of them, as Elizabeth Anscombe remarked about Hume, should probably be ranked as a “‘mere—brilliant—sophist’” (p. 254).   Skepticism, determinism, nihilism all follow, working out the implications of the course chosen centuries ago.  

As the philosopher W.T. Stace noted in an important article sixty years ago:  “‘The real turning point between the medieval age of faith and the modern age of unfaith came when the scientists of the seventeenth century turned their backs upon what used to be called “final causes,”’” which were “‘basic to the whole of Western civilization.’”  Consequently, Stace continued:  “‘The conception of purpose in the world was ignored and frowned upon.  This, though silent and almost unnoticed, was the greatest revolution in human history, far outweighing in importance any of the political revolutions whose thunder has reverberated through the world. . . .  The world, according to this new picture, is purposeless, senseless, meaningless.  Nature is nothing but matter in motion.’”  Everything runs according to chance and necessity, without purpose or design.  Simultaneously “‘there went the ruin of moral principles and indeed of all values,’” which were reduced to little more than personal preferences and social conventions (pp. 225-226).  

Evaluating such developments, Feser celebrates “Aristotle’s revenge.”   Modern philosophy, so pervasively materialistic, largely founders because it fails to recognize the perennial truth Aristotle discerned.  The world, in short, makes sense only when we think in an Aristotelian manner, for it is simply impossible to reason otherwise!  Thus Max Delbruck, who won a Nobel Prize for his work in biophysics, suggested we grant Aristotle a Nobel Prize “‘for the discovery of the principle implied in DNA,’ and ‘the reason for the lack of appreciation, among scientists, of Aristotle’s scheme lies in our having been blinded for 300 years by the Newtonian view of the world’” (p. 257).  One cannot “eliminate final causality or teleology from the explanation of human action.  As Alfred North Whitehead once put it, ‘those who devote themselves to the purpose of proving that there is no purpose constitute an interesting subject for study’” (p. 247).  

So for Feser it’s back to Aristotle!  There seems to be no doubt that a broadly Aristotelian philosophical worldview is still as rationally defensible today as it ever was.  To think rationally cannot but involve following in his footsteps.  Indeed, he has been rejected largely for ethical reasons, for if his metaphysics is true his ethics necessarily follow.  “Even in Aristotle’s own work, we find a very conservative ethics grounded in human nature, a doctrine of the immateriality of the human intellect, and an Unmoved Mover of the universe contemplation of whom is the highest end of human existence.  By the time Aquinas and the other Scholastics were done refining and drawing out the implications of the Aristotelian system, it was evident that it entailed nothing less than the entire conception of God enshrined in classical monotheism, the immortality of the soul, and the natural law system of morality.  To acknowledge the truth of the Aristotelian metaphysical picture of the world is thus unavoidably to open the door to everything the Scholastics built on it.  In short, Aristotle’s revenge is also Aquinas’s revenge; and for that reason alone, contemporary secular intellectuals cannot allow themselves to acknowledge it.  For the project of the early moderns is their project too” (p. 267).  

Feser is an Assistant Professor of Philosophy at Pasadena City College.  That a scholar of his caliber teaches at a community college reminds me of a remark Alexander Solzhenitsyn made when he addressed a gathering at Harvard University.  Cognizant of Harvard’s elite standing in the academic world, he noted that much of the best thinking in the nation took place in non-distinguished, out-of-the-way places where individuals were not subservient to the reigning dogmas of fashion and ideology.  I’d guess he’d be delighted to find a mind such as Feser’s working in an undergraduate college in Pasadena!   

Commending Feser’s endeavor in The Last Superstition, the noted literary critic and co-editor of The New Criterion, Roger Kimball, describes it as:  “A thoughtful and theologically sophisticated sally into the ranks of the New Atheism.  Feser has written a lively and well informed polemic against the latest crop of Village Atheists . . . who have provided the public with so much entertainment and so little enlightenment these past few years.  This is a serious and passionately engaged challenge to the latest effort to impose a dehumanizing orthodoxy by religious illiterates.”  

206 A Signature in the Cell

In 2004, the Proceedings of the Biological Society of Washington, a scholarly journal housed at the Smithsonian Institution, published a peer-reviewed article by Stephen Meyer advocating intelligent design.  Angry scientists, alarmed at the article’s deviation from the party line, demanded the journal’s editor be censured for allowing such heresy to gain traction.  In due course, the editor was demoted and assigned another position within the institution.  Such is the opposition that meets any scholar who dares study biology with anything less than a philosophical commitment to monistic materialism—a stance nicely illustrated in Francis Crick’s admonition that biologists, while marveling at the mystery of DNA, must “‘constantly keep in mind that what they see was not designed, but rather evolved’” (p. 12).  

Stephen Meyer, however, has amplified his case for intelligent design in Signature in the Cell:  DNA and the Evidence for Intelligent Design (New York:  HarperCollins Publishers, c. 2009).  Meyer received his Ph.D. from the University of Cambridge in the philosophy of science and now directs the Center for Science and Culture at the Discovery Institute in Seattle.  This book, he says, “attempts to make a comprehensive, interdisciplinary argument for a new view of the origin of life.  It makes ‘one long argument’ for the theory of intelligent design” (p. 8).  Consequently it is, writes Steve Fuller (a professor at the University of Warwick), “at once a philosophical history of how information has come to be central to cutting-edge research in biology today and one man’s intellectual journey to the conclusion that intelligent design provides the explanation for that fact.”  

Meyer primarily focuses on the reality of non-material information within the biological world.  Just as you can load “information” into a computer without adding any weight to its material components, so too information (evident in DNA) has been programmed into all that lives.  Consequently, evolutionary biologists, though deeply committed to reductionistic materialism, cannot avoid using teleological language that refers to intentionality and purpose when describing what they behold.  Thus we read them refer to such things as:  “‘genetic code,’ ‘genetic information,’ ‘transcription,’ ‘translation,’ ‘editing enzymes,’ ‘signal-transduction circuitry,’ ‘feedback loop,’ and ‘information processing system’” (p. 21).  Overtly denying design, they cannot find words that fail to imply it!  

As a young scientist working in the oil industry, Meyer himself became enamored with the mystery of life while attending a conference that addressed “three big scientific questions—the origin of the universe, the origin of life, and the nature of human consciousness” (p. 24).   One of the speakers was “Charles Thaxton, the chemist who with his coauthors had proposed the controversial idea about an intelligence playing a role in the origin of biological information” (p. 28).  Intellectually challenged, Meyer decided to pursue his interests in one of the world’s elite research universities (Cambridge) and discern the degree to which information, not mindless matter-in-motion, best explains life’s mysterious origins.  

Today’s scholars who embrace a “matter first” theory follow the lead of Soviet biochemist Aleksandr Oparin, who sought to harmonize Darwinian science with Marxist materialism.  His position was allegedly confirmed by a famous experiment at the University of Chicago, when Stanley Miller distilled some amino acids from a chemical mix charged with high-voltage electricity.  Despite the textbook popularity of the Miller-Urey experiment, however, recent research reveals its virtual irrelevance in origins-of-life inquiries.  We now know that pre-biotic atmospheric conditions were hostile to life and quite unlike those Miller assumed.  This is largely due to the mounting evidence of the complexity of even the simplest forms of life.  As Watson and Crick and their successors began to fathom the double helix structure of the DNA, biologists confronted the massive amount of “information” contained therein.  Renowned information scientists differentiate between “mere complexity” (routine patterns observed in crystals, for example) and “specified complexity” (evident in proteins and cells as well as meaningful sentences).  As Meyer describes the “molecular labyrinth,” carefully identifying the incredible process (through transcription and translation) whereby cells function, he also ponders the deeply philosophical questions regarding life’s beginning.  Are chance and necessity, as many materialists insist, sufficient causes for the realities we observe?  

Or is there something more?  Something rational supplying the rationality, the information so evident in all that lives?   To Meyer it was clear that Alfred North Whitehead rightly asserted:  “‘There can be no living science unless there is a widespread instinctive conviction in the existence of an Order of Things.  And, in particular, of an Order of Nature.’  Whitehead argued that confidence in this proposition was especially inspired by the ‘medieval insistence upon the rationality of God’” (p. 142).   Most great scientists in the history of science—giants such as Kepler and Newton—saw divine design everywhere in the world they studied.  They did so because they inferred the makings of history from its results—employing what Charles Sanders Peirce would label “abduction.”  None of us has seen Abraham Lincoln, but we cannot, as historians, explain the Civil War without affirming his existence and activities.  Historical scientists (as Peter Lipton explained in Inference to the Best Explanation) consider competing hypotheses as they endeavor to explain what happened in the past, what provides “causal adequacy.”  Cosmologists now embrace the Big Bang as the best explanation of the origin of the universe, remarkably unlike the “steady state” theories that long dominated the field.  

Reading Lipton’s treatise on inference to the best explanation, “a light came on for me,” Meyer says.  “I immediately asked myself:  What causes now in operation produce digital code or specified information?  Is there a known cause—a vera causa—of the origin of such information?” (p. 171).  Everywhere it seems evident that intelligent agents—and only intelligent agents—are responsible for specified information.  To explain (using abductive reasoning) all the information in living organisms, it simply makes sense to posit the possibility of an Intelligent Agent as its source.  Mathematical calculations render the possibility of chance (random material events) creating life vanishingly small.  Indeed:  “The odds of getting even one functional protein of modest length (150 amino acids) by chance from a prebiotic soup is no better than 1 chance in 10^164” (p. 212).  To illustrate this astronomical number, Meyer notes that there are only about 10^80 protons, neutrons, and electrons in the entire universe—an exponent only half as large!  And there have only been 10^16 seconds since the Big Bang beginning of the universe!  We’re faced with the fact, as G.K. Chesterton said a century ago, that “Evolution as explanation, as an ultimate philosophy of the cause of living things, is still faced with the problem of producing rabbits out of an empty hat; a process commonly involving some hint of Design” (CW, XVII, p. 291).  After carefully explaining (and rejecting) various naturalistic hypotheses, Meyer declares that since “evidence for the causal adequacy of intelligence is all around us both inside and outside the lab” (p. 340) it makes sense to attribute the vera causa of life to intelligent design.   “Experience shows that large amounts of specified complexity or information (especially codes and languages) invariably originate from an intelligent source—from a mind or a personal agent” (p. 343).  
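
The scale of Meyer’s numbers can be checked with simple arithmetic.  A minimal sketch in Python, using only the figures cited above (odds of 1 in 10^164, roughly 10^80 particles, 10^16 seconds); the assumption that every particle could perform one trial per second is purely illustrative, not Meyer’s own framing:

```python
# Back-of-the-envelope check of the probabilistic argument cited above.
odds = 10**164       # chance of one modest functional protein (150 amino acids)
particles = 10**80   # rough count of protons, neutrons, electrons in the universe
seconds = 10**16     # approximate seconds elapsed since the Big Bang

# Illustrative assumption: every particle attempts one trial every second
# for the whole history of the universe.
max_trials = particles * seconds            # 10**96 trials
shortfall = odds // max_trials              # 10**68

print(f"possible trials: 10^{len(str(max_trials)) - 1}")   # 10^96
print(f"shortfall factor: 10^{len(str(shortfall)) - 1}")   # 10^68
```

Even on this wildly generous assumption, the available trials fall short of the cited odds by a factor of 10^68, which is the force of the illustration in the paragraph above.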

Meyer meticulously considers a multitude of issues in 500 pages, adding another 50 pages of endnotes and 30 pages of bibliographical references to scholarly literature.  Yet for an academic treatise it is quite readable, reflecting the author’s pedagogic skill in using illustrations, diagrams, and drawings.  “In Signature in the Cell,” says Scott Turner, a SUNY professor of environmental and forest biology, “Stephen C. Meyer gives us a fascinating exploration of the case for intelligent design theory, woven skillfully around a compelling account of Meyer’s own journey.  Along the way, Meyer effectively dispels the most pernicious caricatures:  that intelligent design is simply warmed-over creationism, the province of deluded fools and morons, or a dangerous political conspiracy.  Whether you believe intelligent design is true or false, Signature in the Cell is a must-read book.”  

* * * * * * * * * * * * * * * * * *

A decade ago Michael J. Behe, Professor of Biological Science at Lehigh University, published Darwin’s Black Box:  The Biochemical Challenge to Evolution, one of the finest works proposing Intelligent Design as a rational rival to the theory of purely Naturalistic Evolution.  He has recently added to his case with The Edge of Evolution:  The Search for the Limits of Darwinism (New York:  Free Press, c. 2007).  He notes that current orthodoxy in the scientific community defends a Darwinism composed of “random mutation, natural selection, and common descent” (p. 1).  Of the three, random mutation is most crucial for understanding the emergence of novel life forms, but “except at life’s periphery, the evidence for a pivotal role for random mutations is terrible” (p. 4).

In truth, because the evidence regarding historical development is so sparse “by default most biologists work within a Darwinian framework and simply assume what cannot be demonstrated” (p. 9).  We need the kind of precise, empirical data evident in engineering and anatomy.  For this we must plumb the mysterious realms of tiny molecules, proteins, and DNA.  Rather than relying on anecdotal items, such as Darwin’s domesticated pigeons and Galapagos finches, Behe insists:  “The only way to get a realistic understanding of what random mutation and natural selection can actually do is to follow changes at the molecular level.  It is critical to appreciate this:  Properly evaluating Darwin’s theory absolutely requires evaluating random mutation and natural selection at the molecular level” (p. 10).  

To do so Behe focuses on malaria—“the single best test case of Darwin’s theory” (p. 12).  Because of its widespread devastation, malaria has been carefully studied for a century.  Because population numbers, not time, “is the chief factor in evolution” (p. 153), we can see, in 100 years of malaria parasites’ development, what has taken 100 million years in other species.  Amazingly, “the number of malarial parasites produced in a single year is likely a hundred times greater than the number of all the mammals that have ever lived on earth in the past two hundred million years” (p. 194).  Reacting to the parasite, “Hundreds of different mutations that confer a measure of resistance to malaria cropped up in the human genome and spread through our population by natural selection.  These mutations have been touted by Darwinists as among the best, clearest examples of the abilities of Darwinian evolution” (pp. 12-13).  And in one sense they are quite right.  But, in a profounder sense, “the molecular changes underlying malaria resistance . . . tell a much different tale than Darwinists expected—a tale that highlights the incoherent flailing involved in a blind search” (p. 13).  The many mutations failed to provide the immunity man needs to successfully resist the pathogen.  And, importantly:  “What greater numbers of malaria can’t do, lesser numbers of large animals can’t do either” (p. 200).  

“Over the centuries,” Behe says in a pivotal paragraph, the human genome has variously responded to malaria’s assault.  Massive scholarly attention to the disease has led to the following conclusions:  “1) Darwinian processes are incoherent and highly constrained; and 2) the battle of predator and prey (or parasite and host), which has often been portrayed by Darwinist writers as a productive arms-race cycle of improvements on each side, is in fact a destructive cycle, more like trench warfare, where conditions deteriorate.  The changes in the malaria genome are even more highly instructive, simply because of the sheer numbers of parasites involved.  From them we see:  3) Like a staggering, blindfolded drunk who falls after a step or two, when more than a single tiny step is needed for an evolutionary improvement, blind random mutation is very unlikely to find it.  And 4) extrapolating from the data on an enormous number of malaria parasites allows us to roughly but confidently estimate the limits of Darwinian evolution for all of life on earth over the past several billion years” (p. 19).  

Darwinism holds that evolution proceeds as random mutations produce beneficial changes that enable individuals within a species to survive.  Careful studies of DNA in various human populations reveal remarkably few such mutations in the past 10,000 years.  In the case of adaptations to malaria, all mutations have been harmful, all of them “acts of desperation to stave off an invader” (p. 38).  Unlike the progressive “arms races” portrayed by Richard Dawkins and other evolutionists, “the data show trench warfare, with acts of desperate destruction, not arms races, with mutual improvements” (p. 42).  Where we have the best imaginable evidence, “The thrust and parry of human-malaria evolution did not build anything—it only destroyed things” (p. 42).  And this is as true of the malaria parasites (mutating in response to chloroquine) as it is of the human populations they invade.

The study of malarial parasites enables us to evaluate human evolution.  Perhaps a trillion creatures “preceded us in the past ten million years.  Although that’s a lot, it’s still much, much less than the number of malaria parasites it takes to develop chloroquine resistance” (p. 60).  For our species to develop “any single mutation of the kind required for malaria to become resistant to chloroquine” (a shift of only two amino acids), is mathematically improbable (p. 60).  For such a mutation to occur, “we would need to wait a hundred million times ten million years.  Since that is many times the age of the universe, it’s reasonable to conclude the following:  No mutation that is of the same complexity as chloroquine resistance in malaria arose by Darwinian evolution in the line leading to humans in the past ten million years” (p. 61).   

The Darwinian doctrine of common descent (detecting duplicated genes or anatomical structures shared by all creatures) certainly explains some things.  But not the really important things!  The eminent French geneticist Francois Jacob famously wrote that Darwinian evolution is a “tinkerer,” not an engineer (p. 119).  It explains small variations but not intricate designs.  There is a realm within which Darwinism suffices, but the significant development of living creatures has taken place in other realms, for “it does not even begin to explain where those commonalities came from, or how humans subsequently acquired remarkable differences.  Something that is nonrandom must account for the common descent of life” (p. 72).  In short:  “although Darwin hoped otherwise, random variation doesn’t explain the most basic features of biology” (p. 83).  

In his earlier book, Darwin’s Black Box, Behe carefully delineated the “irreducible complexity” of cilia—large cellular structures that “help cells move around in liquid, acting like propellers” (p. 86).  Despite voluminous (and at times vituperative) attacks from Darwinists, “in the more than ten years since I pointed it out the situation concerning missing Darwinian explanations for the evolution of the cilium is utterly unchanged” (p. 95).  Indeed, advanced microscopic technology reveals yet more complexity.  This is because such things as cilia and flagella “are far past the edge of evolution.  Such coherent, complex, cellular systems did not arise by random mutation and natural selection, any more than the Hoover Dam was built by the random accumulation of twigs, leaves, and mud” (p. 102).  

At the most fundamental level, proteins reveal the mystery of life.  They not only do requisite work within cells, but we now know that they must “fit specifically with other proteins” and most of them “actually work as teams of half dozen or more” (p. 124).  Still more:  “proteins must self-assemble” (p. 125).  Furthermore:  “to acquire some new useful property, not just one but two new protein-binding sites had to develop.”  In the case of malaria parasites developing resistance to chloroquine, it would take “a hundred billion billion organisms—more than the number of mammals that has ever existed on earth” (p. 135).  And, as one would suspect, “all known malarial evolutionary responses to human drugs includes no novel protein-protein interactions” (p. 136).  

What’s true for malaria is likewise true for the HIV virus, which mutates at least ten thousand times more rapidly than cells.  To exhaust all possible “mutations in HIV requires only 10^20 viruses, which have in fact appeared on earth in recent decades.”  Scientists have documented HIV’s run through the “gamut of all possible substitution mutations, a gamut that would require billions of years for cells to experience.  Yet all those mutations have changed the virus very little” (p. 154).  HIV studies “also shed light on the topic of the origin of life on earth.  It has been speculated that life started out modestly, as viral-like strings of RNA, and then increased in complexity to yield cells.  The extremely modest changes in HIV throw cold water on that idea.  In 10^20 copies, HIV developed nothing significantly new or complex.  Extrapolating from what we know, such ambitious Darwinian early-earth scenarios appear to be ruled out” (p. 155).  Consequently, Behe says:  “there is no evidence that Darwinian processes can take the multiple, coherent steps needed to build new molecular machinery, the kind of machinery that fills the cell” (p. 163).  

Inasmuch as random mutation fails to explain the intricacies of the cell, Behe suggests that nonrandom selection may well fit the bill.  “That is, alterations to DNA over the course of the history of life on earth must have included many changes that we have no statistical right to expect, ones that were beneficial beyond the wildest reach of probability.  Over and over again the past several billion years, the DNA of living creatures changed in salutary ways that defied chance” (p. 165).  He concludes that the “elegant, coherent, functional systems upon which life depends are the result of deliberate intelligent design” (p. 166).  To illustrate, he shows the “logic maps” evident in the gene regulatory networks that guide the inner workings of animal bodies.  When visually portrayed they reveal an “obvious, impressive coherence,” much like “a complex electronic or computer-logic circuit” that “was very likely purposely designed” (p. 197).  

Behe insists his notion of “intelligent design” has no necessarily religious foundation, though it is of course compatible with many religious beliefs.  One can, irreligiously, simply argue that there is an intelligent aspect to the purely natural world.  Yet it’s increasingly clear that “the origin of life was deliberately, purposely arranged, just as the fundamental laws and constants and many other anthropic features of nature were deliberately, purposely arranged” (p. 216).  Fine-tuning just seems evident wherever we study the natural world, whether as astronomers or microbiologists.  “Like it or not, the more science has discovered about the universe, the more deeply fine-tuning is seen to extend—well beyond laws, past details, and into the very fabric of life, perhaps beyond the level of vertebrate classes.  If that level of design required continuing ‘interference,’ that’s what it required, and we should be happy to benefit from it” (p. 230).  But one need not go that far and insist on a divine power “interfering” with the natural world.  Behe refuses to rule out randomness and contingency or evolution by natural selection in some areas.  It just means we must move beyond “the edge of evolution” to fully fathom the reality of our world.

He regards “design as a completely scientific conclusion,” relying solely on “detailed physical evidence, plus standard logic” (p. 233).  He makes his case as a skilled scientist, anchoring it in the best research available.  He writes clearly, persuasively prodding us to open our minds to the unfolding probability of a cosmos that is truly a cosmos—as beautifully designed as the child’s ear that moved Whittaker Chambers (as recorded in Witness) from atheism to theism.