253 What Darwin Got Wrong

Two distinguished professors of cognitive science—Jerry Fodor (Rutgers) and Massimo Piattelli-Palmarini (University of Arizona)—argue, in What Darwin Got Wrong (New York:  Farrar, Straus and Giroux, c. 2010), “that there is something wrong—quite possibly fatally wrong—with the theory of natural selection” (p. xiii).  The theory makes two claims:  1) natural selection is an observable process wherein “creatures with adaptive traits are selected”—i.e. survivors procreate; 2) natural selection is a guiding mechanism whereby “creatures are selected for their adaptive traits” (p. xv).  The first premise is historical—certain things happened; the second is philosophical—why these things happened.  To infer the second premise from the first is clearly illogical—what philosophers call an “intensional fallacy.”  But this is precisely what Neo-Darwinians do and thereby render the theory suspect.  As self-identified atheists, the authors firmly pledge allegiance to the philosophical naturalism their guild demands, but they do insist that clear thinking demands doubt regarding the “just so” Darwinian story.

To build their case, Fodor and Piattelli-Palmarini rigorously evaluate the evidence now available (especially the information in-forming genetic activity) and conclude that natural selection fails to explain it.  Rather than sheer randomness, there seem to be “laws of form” giving direction to (i.e. causing) biological formation.  But neither Darwin nor his modern epigones set forth a credible theory of causation, though they claim to do so under the rubric of “natural selection.”  Darwinians, as natural historians, trace what happened, often in the dim and distant past.  They tell us what apparently happened—but not what “had to happen,” which is “the domain of theory, not of history; and there isn’t any theory of evolution” (p. 152).  “Natural history isn’t a theory of evolution; it’s a bundle of evolutionary scenarios.  That’s why the explanations it offers are so often post hoc and unsystematic” (p. 159).  Along with Marx, Darwin imagined he could extract scientific theory from history; both men were grievously mistaken.

“‘OK; so if Darwin got it wrong,’” the authors write by way of summary, “‘what do you guys think is the mechanism of evolution?’  Short answer:  we don’t know what the mechanism of evolution is.  As far as we can make out, nobody knows exactly how phenotypes evolve.  We think that, quite possibly, they evolve in lots of different ways; perhaps there are as many distinct kinds of causal routes to the fixation of phenotypes as there are different kinds of natural histories of the creatures whose phenotypes they are” (p. 153).  Dogmatically insisting there is no God or Intelligent Designer or Mother Nature or any kind of supervising Mind, the authors simply leave unanswered the really important question:  why did all this occur?    But they do, at least, have the intellectual fortitude to point out why Natural Selection cannot be the answer.

* * * * * * * * * * * * * * * * * * 

Not long ago I heard the leader of an atheist movement in England elucidate why he is establishing fellowship centers to substitute for churches for folks disbelieving in God.  Explaining his views, he said “we come from nothing and go to nothing,” so living as painlessly as possible here is life’s only goal.  To say we come from nothing, of course, violates one of the clearest logical principles, for obviously nothing could come from nothing.  That a reasonably articulate man could  so cheerfully espouse nonsense illustrates one of the distressing marks of modernity:  the lack of philosophical perspicacity evident wherever scientism reigns.  As a healthy antidote, one of the 20th century’s greatest philosophers, Etienne Gilson, provides a valuable perspective on modern science in From Aristotle to Darwin and Back Again:  A Journey in Final Causality, Species, and Evolution (San Francisco:  Ignatius Press, 2009, a new translation of Gilson’s 1971 treatise).  

He begins where anyone concerned with meticulous science, meaningful distinctions, and coherent logic must:  with Aristotle.  Importantly, Aristotle understood that different subjects demand different ways of thinking—doing math differs significantly from composing music, though the two disciplines certainly share some commonalities.  When they considered questions concerning the origins of things, Pre-Socratic thinkers had, by and large, invoked chance and necessity, churning along in a mechanical fashion.  To Aristotle, however, it made more sense to see things as designed, with discernible purpose, much like an artistic work reflecting the mind of its maker.  Such reasoning led him to posit, when explaining things, four essential causes:  material; efficient; formal; final.  In the common sense tradition following Aristotle’s paradigm, it makes sense to understand a house as composed of material things, put together by workmen, following a blueprint, in order to provide suitable shelter.  To eliminate formal and final causes from the equation—as has been done for three centuries by scientists fixated solely on material and efficient causes—renders reality ultimately unintelligible.

For Aristotle there is an undeniable telos—an end-oriented ingredient—to all that is.  His voracious investigations of the natural world filled him with wonder:  “‘Every realm of nature is marvelous’” (p. 25).  Still more:  “‘Absence of haphazard and conduciveness of everything to an end are to be found in Nature’s works in the highest degree, and the resultant end of her generations and combinations is a form of the beautiful’” (p. 25).   Thus he “found teleology so evident in nature that he asked himself how his predecessors had been able to avoid seeing it there, or, still worse, had denied its presence” (p. 21).  Two millennia later Gilson asks the same question!  How can highly intelligent people not see the obvious design in things?  “In brief, if there is in nature at least an apparently colossal proportion of finality, by what right do we not take it into account in an objective description of reality?” (p. 31).  

Gilson takes us on a 400-year historical journey, showing how philosophical naturalism, with its mechanistic explanations, has become dominant in the West.  At the apex of the account stands Charles Darwin, who scrupulously expunged any hint of design (with its powerful suggestion of a Creator) from his version of evolution through Natural Selection.  Unfortunately, in his and his followers’ writings the word “Evolution has served the purpose of hiding the absence of an idea” (p. 103).  Claiming to explain everything, it explains very little, if anything.  Thus a distinguished French naturalist, Paul Lemoine, lamented:  “‘The theories of evolution with which our studious youth are lulled to sleep actually compose a dogma which everyone continues to teach; but, each in his specialty, zoologist or botanist, takes cognizance of the fact that any of the explications furnished cannot stand’” (p. 104).  “‘The result of this exposé,’” Lemoine said in closing his article in the Encyclopédie française, “‘is that the theory of evolution is impossible’” (p. 105).  And it is impossible because it cannot withstand the kind of rigorous analysis given it by philosophers such as Gilson.

Living things cannot be explained mechanistically.  Aristotle saw this clearly centuries ago, and the “facts that Aristotle’s biology wished to explain are still there.  He is reproached, sometimes bitterly, with having explained them poorly, but up to the present no one has explained them any better” (p. 141).  To truly understand our world we must, as did Lemoine, allow that “‘vital phenomena tend toward a precise end, from which tendency the name of “final causes” is derived’” (p. 142).  And indeed, most biologists, when seeking to explain anything, silently rely on (and develop euphemisms for) such final causes.

* * * * * * * * * * * * * * * * 

In Darwin’s Doubt:  The Explosive Origin of Animal Life and the Case for Intelligent Design (New York:  HarperOne, c. 2013), Stephen C. Meyer takes seriously Charles Darwin’s personal doubt regarding his celebrated theory, popularly portrayed as the evolutionary tree of life.  Darwin was deeply troubled by the lack of fossil evidence for the universal common ancestry of all living things—gradually flowering into various species through natural selection—that was basic to his theory of evolution.  Indeed, his work was vigorously disputed by the most celebrated fossil expert of his day, the Swiss paleontologist then teaching at Harvard, Louis Agassiz.  During a markedly brief period of time—the Cambrian Era—a great variety of animal species suddenly appeared, with no hint of common ancestry.  To Agassiz, this “posed an insuperable difficulty for Darwinian theory” (p. 8).

Darwin recognized this difficulty, noting:  “‘The abrupt manner in which whole groups of species suddenly appear in certain formations has been urged by several paleontologists . . . as a fatal objection to the belief in the transmutation of species.  If numerous species, belonging to the same genera or families, have really started into life all at once, the fact would be fatal to the theory of descent with slow modification through natural selection’” (p. 17).  What Darwin did, to preserve his fossil-less theory, was to insist that in time further paleontological expeditions would uncover a fuller geological record that would confirm his belief in common ancestry and natural selection.

But a series of meticulous 20th century paleontological expeditions have failed to find the evidence Darwin envisioned.  Instead there stands exposed in the fossils of the Cambrian Era a “geologically abrupt appearance of a menagerie of animals as various as any found in the gaudiest science fiction.  During this explosion of fauna, representatives of about twenty of the roughly twenty-six total phyla present in the known fossil record made their first appearance on earth” (p. 31).  Though much of the field work was done in the Burgess Shale in the Canadian Rockies, there is an even richer fossil depository in China—the Maotianshan Shale near Chengjiang—which affords us “an even greater variety of Cambrian body plans from an even older layer of Cambrian rock” (p. 50).  This site, inspected by Chinese scientists, shows “that the Cambrian animals appeared even more explosively than previously realized” (p. 51).  Thus the renowned Chinese paleontologist J. Y. Chen declares that the evidence turns “upside down” Darwin’s tree of life imagery.  Ironically, Chen noted:  “‘In China we can criticize Darwin, but not the government.  In America you can criticize the government but not Darwin’” (p. 52).  The Chinese scientists have also, during the past 20 years, refined the methodology for dating the geological record, leading them to believe that the “Cambrian Explosion” took place within five to ten million years—a mere moment in earth’s four-and-a-half billion year history.

Evidence regarding biological development in the Cambrian Explosion, Meyer argues, calls into question the hallowed “tree of life” depicted in standard textbooks.  Rather than a single tree, it looks like a score or more separate bushes, all beginning at the same time.  For over 3 billion years, only single-celled organisms (notably bacteria and algae) existed; then, some 560 million years ago, complex multicellular organisms such as sponges appeared; shortly thereafter came the Cambrian Explosion and “the oceans swarmed with animals” as a “carnival of novel biological forms arose”—all given structure by “an explosion of genetic information unparalleled in the previous history of life” (p. 163).  And the more we understand about genetics the more difficult it is to even imagine, much less demonstrate, how such living creatures emerged and evolved in accord with the Darwinian theory.

Consequently, various naturalistic alternatives to the standard evolutionary model have been proposed.  Some biologists envision “self-organizing” patterns following certain natural laws, rather as crystals seem to spontaneously assemble.  Decades ago Stephen Jay Gould theorized that evolution advanced not gradually but through inexplicable jumps, suddenly developing new life-forms.  More recently Jeffrey Schwartz, in Sudden Origins, admitted:  “‘We are still in the dark about the origin of most major groups of organisms.  They appear in the fossil record as Athena did from the head of Zeus—full blown and raring to go, in contradiction to Darwin’s depiction of evolution as resulting from the gradual accumulation of countless infinitesimally minute variations’” (p. 318).  Instead, he postulated that regulatory “Hox genes” might better explain such sudden appearances.

“Clearly,” Meyer says, “standard evolutionary theory has reached an impasse” (p. 337).  So he proposes a better approach:  Intelligent Design.  If one is not committed to a purely materialistic metaphysic, if one is open to the possibility of a mental dimension to reality, then looking for an information-giving intelligent milieu or agent might make sense.  Meyer’s approach “affirms that there are certain features of living systems that are best explained by the design of an actual intelligence—a conscious and rational agent, a mind—as opposed to a mindless, materialistic process.  The theory of intelligent design does not reject ‘evolution’ defined as ‘change over time’ or even universal common ancestry, but it does dispute Darwin’s idea that the cause of major biological change and the appearance of design are wholly blind and undirected” (p. 339).  

Meyer’s fascination with intelligent design began when, as a young scholar, he encountered the work of a chemist, Charles Thaxton, whose book The Mystery of Life’s Origin demonstrated the improbability of nonliving chemicals evolving into living biological organisms.  Thaxton and his co-authors “suggested that the information-bearing properties of DNA might point to the activity of a designing intelligence—to the work of a mind, or an ‘intelligent cause’ as they put it.  Drawing on the analysis of the British-Hungarian physical chemist Michael Polanyi, they argued that chemistry and physics alone could not produce the information in DNA any more than ink and paper alone could produce the information in a book.  Instead, they argued that our uniform experience suggests a cause-and-effect relationship between intelligent activity and the product of information” (p. 341).  

This possibility drew Meyer to the University of Cambridge in England, where he pursued his interests in the history and philosophy of science.  There he discovered the important role historical scientists assign to “abductive inference,” a method that infers “past conditions or causes from present clues.”  Unlike deductive logic, abductive reasoning leads to plausibility rather than certainty.  It is an “inference to the best explanation.”  Historians who understand their discipline know it’s as much an art as a science since they deal with particular events rather than universal laws.  Thus there have ever been rival hypotheses regarding past events such as the fall of the Roman Empire.  So too scholars studying evolution necessarily engage in historical work, and “whether they always realize it or not . . . typically use the method of inference to the best explanation” (p. 351).

Applying this method to the Cambrian Era, we encounter creatures possessing layers of highly sophisticated digital information akin to “systems known from experience to have arisen as a result of intelligent activity.  In other words, standard materialistic evolutionary theories have failed to identify an adequate mechanism or cause for precisely those attributes of living forms that we know from experience only intelligence—conscious rational activity—is capable of producing” (p. 358).  Reading Macbeth we reasonably infer it was written by a literary genius, a mind telling the story—Shakespeare.  Listening to The Messiah we reasonably infer it was composed by a musical genius, a mind orchestrating text and score—Handel.  Encountering pictographs on the rocks near Boise, Idaho, we reasonably infer they were drawn centuries ago by rational human beings—Indians residing in that area.  Inevitably we think materials containing and conveying information necessarily come from conscious minds.  

Thus, as we now know, living organisms—whether molecules or cells, plants or animals, as preserved in the fossils of the Cambrian Era—“require specified and highly improbable (information-rich) arrangements of lower-level constituents in order to maintain their form and function” (p. 365).  Still more:  “Conscious and rational agents have, as part of their powers of purposive intelligence, the capacity to design information-rich parts and to organize those parts into functional information-rich systems and hierarchies.  We know of no other causal entity or process that has this capacity” (p. 366).  And “both the Cambrian animal forms themselves and their pattern of appearance in the fossil record exhibit precisely those features that we should expect to see if an intelligent cause had acted to produce them” (p. 379).  These ancient animals suddenly appeared, “without any clear material antecedent; they came on the scene complete with digital code, dynamically expressed integrated circuitry, and multi-layered, hierarchically organized information storage and processing systems” (p. 381).  

In the light of all this, might it make sense to infer, as the best explanation, an intelligence of some sort as the cause of it all?  Yes, Meyer insists, it does.  As the popular novelist Dean Koontz says, “Meyer writes beautifully.  He marshals complex information as well as any writer I’ve read. . . .  This book—and his body of work—challenges scientism with real science and excites in me the hope that the origins-of-life debate will soon be largely free of the ideology that has long colored it . . . a wonderful, most compelling read.”  

* * * * * * * * * * * * * * 

Though Charles Darwin is generally presented to the public as a virtuous scientist, motivated by a dispassionate desire to understand the world, Benjamin Wiker argues, in The Darwin Myth:  The Life and Lies of Charles Darwin (Washington:  Regnery Publishing, Inc., c. 2009), that Darwin often made misleading statements regarding his life and his claims to intellectual originality, and that these assertions were naively embraced by most of his biographers, who have portrayed him as exemplary in every way—a “secular saint who single-handedly brought enlightenment to a world shrouded in the darkness of superstition and ignorance” (p. ix).  To provide the context showing that Darwin was notably less than honest about himself and fair to his rivals, Wiker sketches a succinct overview of Darwin’s life and intellectual development, giving considerable care to his theological and philosophical orientation.

By 1859, Darwin was ready to provide the public with a “long argument” in “two long books,” setting forth his notions of “evolution through natural selection” and the purely naturalistic “descent of man.”  Especially notable in his presentation—and the clandestine case he actually advocated, Wiker contends—was the absence of God in creation.  While he at times made vague references to some higher power, Darwin clearly envisioned an essentially godless world.  “Darwin’s principle of natural selection was chosen by him precisely because it excluded any creative action by God” (p. 139).

Darwin also envisioned an essentially amoral world, for “morality does not govern evolution.  If it did, then we might expect a divine overseer” (p. 92).  Darwin knew that “if there were a moral standard outside the process of natural selection, if the evolution of morality progressed toward that standard, if the actions of men and societies were judged by that standard, then we would be admitting a theistic account of evolution” (p. 145).  Unwilling to grant this, he considered morality a survival technique—constantly changing, relative to various environments, without real substance.  Social Darwinism, celebrated by eugenicists and might-makes-right thinkers such as Nietzsche and dictators such as Stalin, naturally and necessarily followed Darwin’s philosophical views.  Consequently, Wiker suggests, he, along with Marx and Freud, may rightly be acknowledged for his importance while decried for his influence.

252 Admirable Autobiographies

For forty years Michael Novak has been a very visible and influential public intellectual, writing generally at the crossroads of politics and religion.  In his recent autobiography, Writing from Left to Right:  My Journey from Liberal to Conservative (New York:  Image, c. 2013), he details his remarkable career, filled with insights into many of the most powerful men of our time as well as his own intellectual development.  Gracefully written, irenic and generous in presentation, his memoirs open for us vistas of understanding.  Though autobiographies are perhaps the most particularistic of all historical materials, Novak sees in his life broader patterns; thus:  “This book,” he tells us, “is about political and economic upheavals between the years 1960 and 2005, and the navigation through heavy waves that many of us chose.  This is not just my story, but the story of thousands, even millions.  Many more are likely to join us over the next decade.  Reality does not flinch from teaching human beings hard lessons” (Kindle, #100).

All four of Novak’s grandparents came from the same part of rural Slovakia and settled in western Pennsylvania.  He was born and reared in Johnstown, a place in the Allegheny Mountains best known as the site of a disastrous 1889 flood.  Reared in a solidly Catholic, working class family, he early sensed a call to the priesthood and for 12 years immersed himself in studies for the religious life.  But in 1960 he “judged that my true vocation was in lay, not priestly, life” (p. 13) and resolved to become a writer.  Settling in New York, he finished and saw published his first book (a novel) and began reading widely in politics and economics.  Offered a position as a writer for an aspiring Democratic politician, he wrote a speech about “The New Frontier” that was never given, but the phrase made its way into a memorable address by John F. Kennedy, whose ultimate election proved deeply satisfying.  Politically, in accord with JFK, Novak was happily situated as a middle-of-the-road Democrat.  

Sensing the need for further academic preparation, he entered the philosophy department at Harvard in pursuit of a Ph.D.  But his professors were immersed in such things as symbolic logic, whereas Novak was more interested in metaphysics and ethics.  While there he managed to publish several papers and two books, but he was hardly attuned to the main concerns of his professors.  He did, however, meet a visiting scholar, Gabriel Marcel, whose lectures “fanned the sparks in me of a lifelong interest in the ‘person’” (p. 27).  He also encountered, while at Harvard, the works of Reinhold Niebuhr and determined to write his doctoral thesis on Niebuhr’s Christian Realism.  His graduate studies were interrupted, however, when he was given the opportunity to spend four months in Rome, observing and writing about the second session of the Second Vatican Council.  In time he published The Open Church, momentarily aligning himself with the “progressives” who saw the Council as a vehicle whereby the Church could be significantly updated and improved.  “Within a few years after the Council, I found myself reacting more and more negatively to the large faction of the ‘progressives’ who failed to grasp the truly conservative force of Vatican II—its revival of ancient traditions, its sharper disciplines, its challenges to mere worldliness and mere politics” (p. 52).  His religious convictions prompted an intellectual shift, as noted in the book’s title, from left to right.

His position on the left, shared by many younger folks in the ’60s, took shape when he began teaching philosophy at Stanford University, where he spent some of his happiest days.  Identifying with his students, he gradually embraced their anti-war radicalism and even left Stanford to help start an experimental college of the State University of New York at Old Westbury.  Designed to align with the ethos of the counter-culture—making students colleagues rather than apprentices—the experiment degenerated into “crazy, rebellious, and anarchic” chaos.  As Novak “faced the full implications of the deep leftist principles” (p. 93) he was, to put it mildly, appalled.  To imagine them applied to the nation—as envisioned by Tom Hayden and the Students for a Democratic Society—left him aghast.

Soon departing Old Westbury, in 1970 Novak was hired by Sarge Shriver (whom LBJ almost picked—and George McGovern ultimately did pick—as a vice presidential running mate) to work (as a writer) for the election of Democrats to Congress.  Shriver, of course, was married to Eunice Kennedy, so Novak entered and intellectually helped shape a significant faction of the Democrat Party.  “Shriver loved the vein of Catholic thought that wanted to ‘reconstruct the social order,’ ‘put the yeast of the gospel in the world,’ ‘feed the hungry, comfort the afflicted’—generally, that is, to make a difference in the world” (p. 112).  In addition to working for Shriver, the prolific Novak wrote articles and finished a book, The Rise of the Unmeltable Ethnics, anticipating themes enunciated within a decade by Ronald Reagan.

This publication demonstrated his “declaration of independence from the cultural left” (p. 124), a rather painful move inasmuch as that left presided imperiously over much of the nation’s intellectual life.  Novak also had to acknowledge that left-wing Democrats were taking control of the party and locking it into a pro-abortion, anti-Jewish/Christian, multicultural agenda.  He also began re-thinking his loosely socialistic economic positions, prodded by Richard John Neuhaus and Peter Berger to consider the empirical evidence of the actual good wrought by capitalism around the world.  “Socialism, I was beginning to infer, is not creative, not wealth producing.  It is wealth consuming (it consumes the wealth of others); it is parasitical” (p. 149).  LBJ’s “Great Society,” like nearly all welfare states, has clearly failed to fulfill its promises.

Fortuitously invited to join (as a theologian) the American Enterprise Institute in 1977, Novak secured “a front row seat at the great economic debates of the next three years” (p. 174).  He found “supply side” economics, as espoused by Jack Kemp and Ronald Reagan, surprisingly persuasive.  He was appointed by President Reagan as U.S. ambassador to the UN Commission on Human Rights and published (in 1982) what is probably his most influential book, The Spirit of Democratic Capitalism, to provide an ethical rationale for free enterprise.  When she met him, Margaret Thatcher exclaimed:  “‘I have so wanted to meet you.  I have been reading your book.  You are doing the most important work in the world.’”  “‘Exposing the moral foundations of capitalism,’ she went on, ‘is so important.  The fate of the poor all around the world depends on it’” (p. 218).  That truth was confirmed by Václav Havel, who with a handful of friends met secretly to discuss the book, chapter by chapter, and determined to implement its principles in Czechoslovakia.

“In my lifetime,” Novak says, “I have been favored to meet and often work with many great political leaders, presidents, artists, and builders of brand-new industries in new technologies of the Electronic Age.  I especially prized working with Reagan, Thatcher, and Václav Havel.  But of all the great human beings I have met—and even been invited into friendship with—none is closer to my heart than John Paul II” (p. 298).  This leads to a warmly-written chapter on “the Pope who called me friend.”  John Paul II had apparently read Novak’s The Spirit of Democratic Capitalism as he prepared his social encyclical, Centesimus Annus, in 1991.  Being invited, on several occasions, to dine and discuss ideas, as well as to celebrate mass with the Pope, crowned, in a profound way, Novak’s life as a Christian writer.

* * * * * * * * * * * * * * 

For many years Melanie Phillips was an acclaimed reporter and columnist—“a darling of the left”—for the Guardian, probably the leading leftist newspaper in England, “the paper of choice for intellectuals, the voice of progressive conscience, and the dream destination for many, if not most, aspiring journalists” (#347).  Thus she titles her judicious autobiography Guardian Angel:  My Story, My Britain (New York:  emBooks, c. 2013).  As a serious writer she has ever sought to tell the truth and follow the evidence wherever it led.  This, however, challenged her ideological assumptions, and in time she found a journalistic home in more conservative quarters.  Consequently, looking back on a career of “charting social and political trends for more than thirty years,” she declares “there were now two Britains:  the first adhering to decency, rationality and duty to others, and the second characterized by hatred, rampant selfishness, and a terrifying repudiation of reason” (Kindle, #23).  There were obviously battles raging, so her memoir “is the story of my culture war:  the account of my battles with the hate-mongering left” (#28).

Reared in London as an only child in a working-class, somewhat dysfunctional Jewish family, she found solace in “the magical world of books,” excelled in school and entered Oxford University, where she studied English Literature and “dabbled in student politics and mildly left-wing circles” (#309).  Inwardly insecure, she often compensated by thrusting herself to “the center of the stage in order to validate my existence by the approval of an audience” (#314).  Needing employment, she fell into a training post with a suburban newspaper, began learning the craft of journalism, and was named Young Journalist of the Year.  This led, in 1977, to a staff position on the Guardian.  She was 26 years old and shared, for many years, the staff’s conviction that they, speaking for the liberal left, “were the embodiment of virtue itself” (#406).  They espoused “a set of dogmatic mantras.  Poverty was bad, cuts in public spending were bad, prison was bad, the Tory government was bad, the state was good, poor people were good, minorities were good, sexual freedom was good” (#449).  Still more:  by the ’90s post-modernism had infiltrated journalism under the label of “attachment” journalism.  “Truth was now said to be an illusion; objectivity was a sham; journalists who tried to be dispassionate were therefore perpetrating a fraud upon the public.  The only honest approach was for journalists to wear their hearts on their sleeves; this was not called bias, but honesty” (#999).

And they were, Phillips laments, very much the vanguard of the cultural revolution that has totally transformed Britain!  Her disillusionment with the Left began when she honestly followed the evidence while researching and writing articles on a wide variety of subjects—immigration, education, environmentalism, marriage and family, feminism, multiculturalism, health care, Israel and foreign affairs.  Though only nominally Jewish, she found to her surprise that her colleagues on the Guardian branded her as a Jew who could not deal dispassionately with Israel.  Indeed, anything but a pro-Palestinian stance was anathematized by Britain’s leftists, who routinely equate Israelis with Nazis!  “The more I read, the more horrified I became by the scale of the intellectual and moral corruption that was becoming embedded in public discourse about the Middle East—the systematic rewriting of history, denial of law and justice and the corresponding demonization and delegitimisation of Israel” (#1669).  

As a mother of two, she became increasingly distressed with the nation’s schools, which had demonstrably been “hijacked by left-wing ideology.  Instead of being taught to read and write, children were being left to play in various states of anarchy on the grounds that any exercise of adult authority was oppressive and would destroy the innate creativity of the child” (#738).  When she dared suggest that teachers should actually teach something, the left’s reaction was vitriolic—she was instantly branded “right-wing” and ostracized by many colleagues.  By 1990 she “realized something very bad indeed was happening to Britain.  What was being described was more akin to life in a totalitarian state.  Dissent was being silenced, and those who ran against the orthodoxy were being forced to operate in secret; still more, the very meaning of concepts such as education, teaching, and of knowledge was being unilaterally altered, and thousands of children, particularly those at the bottom of the social heap, were being abandoned to ignorance and institutionalized disadvantage” (#812).  Her alienation from the Left “was not so much political as moral.  The left was rejecting all external authority and embracing instead moral and cultural relativism—the idea that ‘what is right’ is ‘what is right for me’, and declaring any hierarchy of values illegitimate” (#878).  

Aligned with her critique of education, Phillips’ sharpest point of disagreement with the Left derived from her defense of the traditional family.  All the evidence proved that children flourished best in intact homes, where mothers and fathers shared responsibility for rearing them.  Divorce, single-parenting and step-parenting all harm children.  But despite the data, the cultural left triumphantly validated such behaviors.  The facts were never cogently debated, only denied.  She was by no means refuted, only reviled.  Ultimately she concluded “that the destruction of the traditional family had as its real target the destruction of Biblical morality.  I thought I was merely standing up for evidence, duty and the protection of the vulnerable.”  But her foes saw clearly “that the banner behind which I was actually marching was the Biblical moral law which put chains on people’s appetites” (#980).

When she began investigating the claims of environmentalists, including the furor over global warming, she immediately “smelled charlatanry.”  Determined to follow the evidence, she found none!  Instead what posed as environmentalism simply “brought together deeply obnoxious strands of thinking” hitherto evident in the anti-Western, anti-capitalist frothing of leftist ideologues.  “To me,” she says, “the clear message of environmentalism was that the planet would be fine if it wasn’t for the human race.  So it was a deeply regressive, reactionary, proto-fascist movement for putting modernity into reverse, destroying the integrity of science, and threatening humanity itself” (#892).  Her position was reinforced by discussions with dozens of first-rate scientists who demonstrated that “the science” was not at all “settled” and the alarmists’ views were often akin to a “scam.”  

* * * * * * * * * * * * * 

During the past decade Joseph Pearce has emerged as one of the better writers dealing with literary figures—Literary Converts; Wisdom and Innocence:  A Life of G.K. Chesterton; C.S. Lewis and the Catholic Church; Tolkien, Man and Myth; Solzhenitsyn:  A Soul in Exile.  Bradley Birzer, the author of J.R.R. Tolkien’s Sanctifying Myth, says:  “Pearce writes with historical insight on one hand and poetic imagination on the other.  Perhaps our greatest living biographer, Pearce has the uncanny ability to get into the minds, hopes, fears, and motivations of his subjects.”  Having written at length and with discernment about others, in Race with the Devil:  My Journey from Racial Hatred to Rational Love (Charlotte, NC:  Saint Benedict Press, c. 2013) he now tells his own life story.  

Reared in a rather conventional middle-class home near London, Pearce had little exposure to or interest in religion as a child, though there was a residual Christianity here and there.  His father, remarkably self-educated (and the man to whom the book is dedicated), was the strongest influence in his life, and he has “nothing but gratitude to him for all the good things he taught me and all the love he bestowed upon me” (Kindle Loc #276).  He did, however, have a strong anti-Catholic bias “rooted in a general misunderstanding of the English Reformation and its aftermath” (#295).  He was also quite hostile to the immigration policies of Great Britain, which were transforming the land he loved into a multicultural jungle.  Yet unlike “the philanthropist who proclaims his love for Man but despises men,” his father “loved men but despised those who spoke in the abstract about the brotherhood of man” (#310).  Consequently, “the extent to which I love my fellow man is attributable, under grace, to the example my father gave me.  My love of poetry and history has its roots in his love for these things.  My omnivorous hunger for knowledge is a gift that he gave me.  The path of the autodidact, which he took through life, is the path that I have followed also.  I am happy to have followed in my father’s footsteps, though equally happy that I ceased to do so when I came to realize he was not always walking in the right direction” (#357).

Following the elder Pearce’s example of learning on his own, young Joseph read widely—far beyond his classroom assignments—and developed a consuming passion for politics.  Indeed, at the age of 17 he published an article entitled “‘Red Indoctrination in the Classroom,’ which critiqued the Marxist orientation of my high school education” (#458).  Still more, he became active in the National Front, a political party noted for its racialist agenda.  This involved him in numerous street demonstrations and occasional fights with militant Marxists.  Determined to advance the cause, he founded Bulldog, a magazine designed to incite racial hatred and to target young people with the National Front message.  This made him a highly visible and controversial figure, and he began to work full-time for the Party in 1978.  In time he was twice arrested and imprisoned for “hateful” articles he had written.

Looking back on this period of his life, Pearce says:  “The animus of my political creed to which I subscribed was not animosity towards aliens but a love of my own people, albeit a love that became an idol, a false god that I worshipped at the expense of my own spiritual wellbeing” (#799).  Providentially, he began to discover, through a variety of experiences, a better way that began with an awakening to beauty.  For instance, visiting the family of a friend in rural England sparked within him “a fuller and better vision of the beauty of life, particularly country life, to which I had been largely unaware until then” (#774).  Away from city lights he clearly saw, for the first time, the stars in all their majesty.  “Having my eyes awakened to such beauty was a baptism of the imagination—a baptism of desire—which I now see as foundational to my path to religious conversion” (#789).

He was also nourished by the works of Alexander Solzhenitsyn, which sowed “seeds of faith and hope in my understanding of reality” and exorcised “the demons of nihilism and pessimism that lurked in the darkest recesses of my soul” (#1049).  (Much later, while writing a biography of Solzhenitsyn, he was privileged to meet him—a rare encounter granted when Solzhenitsyn learned Pearce sought to emphasize his Christian convictions.)  His love of pop music exposed him to Elvis Presley, whose gospel recordings played a vital role in opening Pearce to the Gospel!  And then he was (much like C.S. Lewis, decades earlier) “surprised by Chesterton”!  First fascinated by some of his “distributist” economic ideas, he then stumbled into the works of G.K.C. and would never be the same again!  In Chesterton he “found a new friend who would become the most powerful influence (under grace) on my personal and intellectual development over the following years” (#1729).

Such influences—in concert with an inner “baptism of desire” that sharpened his hunger for truth—prepared him for some unexpected changes while serving his second prison term.  It was, for him, the dark night of the soul described by St John of the Cross.  Longing for freedom, disillusioned with his political work, and strangely drawn to theological inquiries and halting efforts to pray, he emerged from prison determined to change his life.  He began visiting a small Catholic chapel and sensed therein the answer to his heart’s disquiet.  In time he converted to the Church and began to live in accord with her doctrines and ethics, finding self-sacrifice the key to the good life.  Rather than writing racist screeds he began to consider Christian materials and wrote a biography of his beloved G.K. Chesterton.  “Whereas my previous writing had led people astray, I hoped that my gifts as a writer could now help lead people to the truth.”

251 Vietnam Revisited

When I began my college teaching career in January 1966, America’s involvement in the Vietnam War was beginning to evoke controversy.  But since I didn’t teach recent American history and had other compelling concerns I naively accepted the news as reported in Time Magazine and by Walter Cronkite on CBS.  Thus I first supported what I understood to be a just war defense of the people of South Vietnam; subsequently, accepting what was said about the Pentagon Papers and promoted by Cronkite et al., I came to believe it impossible to win the war and that we had been misled regarding the reasons we had entered it.  (Especially important in shaping my own views in those days was Sojourners Magazine, whose editor, Jim Wallis, claimed to provide an Evangelical appraisal of world affairs—something I now realize was deeply, if not deviously, flawed.)  And when Saigon fell and South Vietnam was melded into the Communist orbit I effectively closed my mind to what seemed then to have been a sad and misguided American endeavor.  Older and wiser now, I have recently revisited the conflict with the assistance of sources suitably critical of the popular opinions established by the media 45 years ago.

Uwe Siemon-Netto is a German journalist who spent five years covering the war in Vietnam.  Unlike most American correspondents, who stayed in the safety of Saigon, he spent much time in the countryside, where the war was waged, and developed meaningful relationships with a variety of individuals.  He talked with and knew not only the political and military elites but ordinary people (such as the boy Duc, for whom the book is named).  In his deeply moving memoir—Duc:  A Reporter’s Love for the Wounded People of Vietnam (c. 2013)—he laments the many casualties of a conflict sadly won by the Communists in 1975.  He personally observed the “heinous atrocities the Communists committed as a matter of policy,” serving as “a witness to mass murder and carnage beside which transgressions against the rules of war perpetrated on the American and South Vietnamese side—clearly not as a matter of policy or strategy—appear pale in comparison” (p. xii).  As “a collection of personal sketches of what I saw, observed, lived through and reported in my Vietnam years,” Duc gives us an enlightening slice of a story we need to better understand a critical phase in our nation’s history.

One message Siemon-Netto makes clear:  ordinary Americans were misinformed by intellectually dishonest “apologists for the Hanoi regime, such as philosopher Noam Chomsky” and New Left leaders, personified by Jane Fonda, who denied demonstrable truths regarding such events as the Hue Massacre (where Communists ruthlessly slaughtered thousands of innocent civilians) and the Tet Offensive, which ended as a decisive victory for both the American military and the South Vietnamese government.  “More than half of the 80,000 Communist soldiers who participated in the Tet Offensive were killed; the Vietcong infrastructure was smashed.  This was a big military victory.  It was a hard-won victory for the allies, but a victory it was.  All things being equal, this should have been the Allied triumph bringing this war to a successful end.  We combat correspondents could testify to this, irrespective of the pacifist and defeatist spin opinion makers, ideologues and self-styled progressives in the United States and Europe put on this pivotal event” (p. 209).

Indeed, as Peter Braestrup said:  “‘Rarely has contemporary crisis-journalism turned out, in retrospect, to have veered so widely from reality.  Essentially, the dominant themes of the words and film from Vietnam . . . added up to a portrait of defeat for the allies.  Historians, on the contrary, have concluded that the Tet Offensive resulted in a severe military-political setback for Hanoi in the South’” (p. 140).  But on February 27, 1968, less than two months after the military triumph, “CBS Evening News anchorman Walter Cronkite, home from a flying visit to Vietnam after the Tet Offensive, pronounced sonorously before an audience of some 20 million viewers the Vietnam War unwinnable.  This flew in the face of everything many combat correspondents . . . had lived through at Tet” (p. 221).  “But Walter Cronkite’s opinion trumped reality, turning a military victory into a political defeat” (p. 221) and the antiwar movement would surge significantly with Senator Eugene McCarthy surfing the swell.  Soon President Lyndon Baines Johnson would acknowledge he had lost middle America along with Cronkite and abandon his re-election campaign.  

Communist leaders (such as Gen. Vo Nguyen Giap, who died in 2013 at the age of 102) shrewdly manipulated mouthpieces such as Fonda, knowing quite well that the war would be won in America’s living rooms as well as the battlefields of Vietnam.  “‘During the latter half of the 15-year American involvement,’ wrote Robert Elegant, ‘the media became the primary battlefield.  Illusory events reported by the press as well as real events within the press corps were more decisive than the clash of arms or the contention of ideologies.  For the first time in modern history, the outcome of the war was determined not on the battlefield but on the printed page and, above all, on the television screen.  Looking back coolly, I believe it can be said . . . that American and South Vietnamese forces actually won the limited military struggle’” (p. 113).  Wielding their pens to oppose the war, the “new journalists” employed their skills, less concerned with accurate writing than with improving the world.  Rather than researching and reporting, they “followed a drift in journalism that became fashionable when the profession changed its character from a down-to-earth craft to another pseudo-academic ivory tower” (p. 74).  American journalists in Vietnam tended to “preach, pontificate and browbeat like the scribes of Joseph Goebbels’ propaganda ministry in Nazi Germany or of the Soviet agitprop service” (p. 74).  “The traditional journalists, the craftsmen, were still on the job in my time in Vietnam.  They had their stories published, albeit in many cases further and further in the back pages.  The limelight was reserved for the new journalists, the pundits, stars who opined rather than reported.  They flew in and out of Saigon on ‘special assignments’ and, clad in freshly pressed fatigues, pontificated before millions of television viewers, not on the basis of what they had experienced in the jungles and villages . . . but on the basis of the stereotypical antiwar ideology they themselves were imposing on the American public square” (p. 75).

Reading Duc, however, can help rectify the record regarding Vietnam.  Siemon-Netto’s  compelling concern for the people of Vietnam—as well as the American servicemen who sacrificed so much in that conflict—finds its voice in this memorable account.  

* * * * * * * * * * * * * * * * * * * * *

Phillip E. Jennings served as a Marine Corps helicopter pilot in Vietnam and has written The Politically Incorrect Guide to the Vietnam War (Washington:  Regnery Publishing, Inc., c. 2010) to challenge some of the widely-embraced assumptions regarding the history of that conflict.  The author of comic novels as well as the CEO of Molecular Resonance Corporation, he writes not as an academic (though he’s consulted 300 books) or journalist but as a war veteran determined to expose falsehoods and defend the propriety of America’s involvement in Southeast Asia.

After a quick overview of the labyrinthine developments leading to the division of Vietnam by the Geneva Accords in 1954, he notes that the United States came to the aid of South Vietnam (as it had done in Korea in 1950) following North Vietnam’s 1959 decision to dispatch fighters to enlarge Ho Chi Minh’s communist dictatorship.  President John F. Kennedy, who strongly supported America’s commitment to the Ngo Dinh Diem regime and the South’s military (the ARVN), approved increased military aid and involvement in the war.  Unfortunately, JFK’s ambassador (Henry Cabot Lodge) and two young journalists (Neil Sheehan and David Halberstam, who falsely claimed that 30 Buddhist monks had been killed by the government) orchestrated a process culminating in a military coup and the execution of Diem, the only man who conceivably could have effectively ruled and defended his country.

Following Kennedy’s assassination in 1963, President Johnson decided to massively escalate America’s role in Vietnam.  His ill-focused and inept strategies, generally attuned to domestic politics rather than military developments, resulted in what often appeared to be an insoluble stalemate.  Nevertheless, “the situation in South Vietnam in 1967 was far from dire for the Americans and their allies” (p. 88).  Much had gone wrong, but not all was lost!  That was demonstrated in the 1968 Tet Offensive, wherein the Communists were decisively defeated.  Indeed, “Had the United States followed up on the destruction of the Viet Cong in the Tet Offensive by mining Haiphong Harbor and bombing Hanoi (as Nixon did in 1972), the war might have ended in 1968” (p. 103).  Even then, though the American media refused to report it, during the most important years of the war (’68-’73) there was an “unheralded victory.”

With the ’68 election of Richard Nixon as President, America’s strategies shifted—and so did the course of the war, during which South Vietnamese soldiers took “over the war on the ground, and pacified 90 percent of the countryside” (p. 105).  “Nixon was decisive where Kennedy waffled; and he was a tough-minded statesman while Johnson was an over-promoted congressional enforcer.  Nixon succeeded where his Democratic predecessors (and political opponents) failed” (p. 115).  He launched the Christmas bombing of military targets in Hanoi in 1972—next to the Tet Offensive the most successful American campaign.  Reeling under the assault, Hanoi began cooperating with the peace talks in Paris that promised to secure a future for South Vietnam.

Tragically, Nixon’s successes unraveled amidst the Watergate scandal.  Emboldened Democrats, enjoying majorities in both houses of Congress, moved to curb the president’s powers and (following his resignation in 1974) curtail aid to South Vietnam.  In short order Laos and Cambodia, as well as South Vietnam, “were sacrificed to Communism” (p. 145).  

* * * * * * * * * * * * * * * * *

One of the best analyses of the Vietnam War was published 30 years ago by Norman Podhoretz, entitled Why We Were in Vietnam (New York:  Simon and Schuster, c. 1982, 1983).  It is simply structured to answer four questions:  why we went in; why we stayed in; why we withdrew; and whose was the immorality.

We entered the war under the guidance of President John F. Kennedy, who had declared, in a 1956 speech given while still a senator from Massachusetts, that South Vietnam was “‘the cornerstone of the Free World in Southeast Asia,’” an outpost of democracy which “‘would be threatened if the red tide of Communism overflowed into Vietnam’” (p. 19).  Still more:  it was in America’s national interest to protect her representatives and investments in that region.  JFK was clearly committed to the Truman Doctrine of “containment” and thought South Vietnam worth defending.  Following this policy, Truman had involved the U.S. in the Korean War, and “as Guenter Lewy puts it in his authoritative history of the Vietnam War, ‘no serious discussion or questioning appears to have taken place of the importance of Southeast Asia to American security interests, of the correctness of the dire predictions regarding the consequences of the loss of the area’” (p. 34).

Having gone into Vietnam under JFK, we stayed in because President Lyndon B. Johnson supported the Truman Doctrine and “‘made a solemn private vow’” to “devote himself to ‘seeing things through in Vietnam’” (p. 64).  He orchestrated the passage of the Gulf of Tonkin resolution in 1964 and deftly enlisted the support of Arkansas’s J. William Fulbright and Idaho’s Frank Church—senators who later became some of his harshest critics.  Though promising during the ’64 electoral campaign to limit our involvement in Vietnam, LBJ dramatically expanded America’s involvement in the war, costing the nation considerable blood and treasure.  Professors, protesters, and politicians soon surfaced to denounce the effort, siding “with the enemy with complete impunity” (p. 85).  Skillfully infiltrating the “Movement,” hard-core “communist groups worked on increasingly close terms with the non-Communist radicals who made up the ever-swelling constituency of what had only recently become known as the New Left” (p. 89).

Pro-Communist intellectuals such as Susan Sontag, Mary McCarthy and Noam Chomsky praised and supported North Vietnam, many of them making pilgrimages to Hanoi to bask in the limelight Ho Chi Minh provided.  To McCarthy Hanoi was a wonderful place, full of well-fed cheerful children and free of prostitutes and refuse.  So too in the countryside, she “‘saw no children with sores and scalp diseases  . . . .  no rotten teeth or wasted consumptive-looking frames’” (p. 93).  All was well in the workers’ paradise, whereas south of the border she found nothing but chaos and corruption.  Sadly enough, General Edward Lansdale lamented, Ho and his minions malevolently devastated Vietnam, yet our public intellectuals never sought to portray them as earlier writers had done with the Kaiser in WWI or Hitler in WWII.  “‘For some baffling reason, we accepted the self-portrait of Ho Chi Minh as a benevolent old “uncle” who was fond of children—and of other Politburo leaders as speakers for a people they did not permit to have opinions.  So we let their claims to leadership go unchallenged while their people suffered and died’” (p. 108).

Determined to defend America’s effort in Vietnam, Podhoretz condemned the media for enlisting in the anti-war movement and deliberately misleading the public.  Given his illustrations and citations, no fair-minded reader could deny that influential journalists and academics effectively supported North Vietnam and shaped public opinion to that end.  “Thus did the North Vietnamese go on fighting in the reasonably secure belief that even if they lost on the battlefield, American public opinion—like French public opinion before it—would force the United States to withdraw on terms that would eventually ensure the Communist conquest of the south” (p. 130).

Consequently we withdrew from the war.  LBJ decided not to run for another presidential term in 1968 and Richard Nixon was elected promising to end the war, though he certainly wanted to save South Vietnam from Communism and, following the Christmas bombing in 1972, stood poised to actually prevail in the struggle.  In the opinion of Sir Robert Thompson, one of the most knowledgeable authorities on the war:  “‘In my view, on December 30, 1972, after eleven days of those B-52 attacks on the Hanoi area, you had won the war.  It was over!  . . . They would have taken any terms’” (p. 156).  And, indeed, Hanoi’s representatives to the Paris Peace Accords quickly signed on to the Nixon-Kissinger proposals.  But the antiwar movement in America despised any hint of victory!  Prominent Democrats, such as George McGovern and Harold Hughes, railed against the “immorality” of this nation’s support of South Vietnam, and the collapse of the Nixon presidency brought to power anti-war ideologues who rapidly orchestrated the process of exiting Southeast Asia.

Clearly distressed by this nation's failure in Vietnam, Podhoretz addresses an important ethical question:  who was in the wrong?  In retrospect, as we consider the millions who died in Vietnam and Cambodia as a result of Ho Chi Minh's aggressions, as we calculate the atrocities wrought by the Communists, as we reflect on the radical shifts in American foreign policy under Jimmy Carter, it becomes clear to Podhoretz that we should have persevered in the war and spared the world a series of catastrophes.  Despite the many failures of the U.S. military, including isolated atrocities such as My Lai, American soldiers could hardly be accused of "war crimes," whereas nothing short of "genocide" took place under Ho Chi Minh's and Pol Pot's direction.  

* * * * * * * * * * * * * * * * 

For many years Bruce Herschensohn was an influential member of California's Republican establishment, working for both Richard Nixon and Ronald Reagan.  In 1992 he would probably have been elected to the United States Senate had not his opponent, Barbara Boxer, issued a last-minute and utterly false smear asserting he habitually visited strip clubs.  Narrowly defeated, he turned to writing and lecturing at universities such as Claremont, Pepperdine, and Harvard.  Determined to rectify a slice of this nation's historical record, so badly distorted by left-wing ideologues, he recently published An American Amnesia:  How the U.S. Congress Forced the Surrender of South Vietnam and Cambodia (New York:  Beaufort Books, c. 2010).  "Voluntary amnesia," he asserts, "is a crime against history" (Kindle #2127), and we must at all costs avoid it.  He fully understands the story he tells, having been a participant and having consulted the notes and clippings he made while serving as a speechwriter for Richard Nixon.

The 1973 Paris Peace Accords, precipitated by the Christmas bombing a month earlier and negotiated by Secretary of State Henry Kissinger, established two independent Vietnams.  But when President Nixon was forced from office, says North Vietnam's Colonel Bui Tin, "'we knew we would win'" (#828), and, defying the accords, within three years North Vietnam's forces had successfully invaded and conquered the South while Pol Pot's Khmer Rouge took control of Cambodia.  Sustained by enormous assistance from China and the Soviet Union, Communist forces prevailed while the United States Congress did everything possible (overriding President Gerald Ford's repeated objections) to abandon and ignore Indochina.  Evaluating all this, Senator J. William Fulbright announced that he was "'no more depressed than I would be about Arkansas losing a football game to Texas'" (#778).  

In this endeavor the Congress was aided and abetted by the media, typified by Sidney Schanberg (egregiously celebrated in the 1984 film The Killing Fields), who was awarded the Pulitzer Prize in 1976 for his reporting on Cambodia.  According to him, "'I have seen the Khmer Rouge and they are not killing anyone'" (#543).  "NBC's Jack Perkins watched Saigon's War Memorial being toppled into the street by North Vietnamese soldiers, and he said to his American television audience that the statue had been 'an excess of what money and bad taste accomplish.  I don't know if you call it the fall of Saigon or the liberation of Saigon'" (#759).  That "liberation" led quite quickly to the renaming of Saigon as Ho Chi Minh City and the expulsion of many residents to the countryside, where "reeducation camps" imposed Ho's ideology.  Ultimately millions of innocents were slain.  Amazingly, due to the agitation of anti-war protesters and the stratagems of the 94th Congress, more died in the year following Saigon's fall "than during the preceding decade of war" (#857).  But the American media studiously ignored the genocide!  

America’s reaction to the war in Vietnam, says Herschensohn, had unintended consequences, revealed in incidents such as 9/11.    The 94th Congress not only refused to grant President Ford funds to defend Cambodia and South Vietnam but enacted policies to hamstring the CIA, leading to a series of intelligence failures around the world.  Under Jimmy Carter, America retreated everywhere—turning away from El Salvador, Nicaragua and Iran.  When the Ayatollah Khomeini seized control of Iran, Carter’s “Ambassador to the United Nations, Andrew Young, stated, ‘Khomeini will be somewhat of a Saint when we get over the panic’” (#2067).  A saint indeed!  And a murderous saint to boot!  

Thus Herschensohn warns:  “Because of congressional actions taken in the mid-1970s, the nation today faces risks to our survival, and risks to the very survival of civilization as we know it” (#2127).  

250 The Roots of Radical Islam

 To understand radical Islam's emergence during the last half of the 20th century, Lawrence Wright's The Looming Tower:  Al-Qaeda and the Road to 9/11 (NY:  Alfred A. Knopf, 2006) remains one of the best-researched, most readable surveys.  He tells how a small cadre of religious zealots—most notably Sayyid Qutb, Ayman al-Zawahiri, and Osama bin Laden—deliberately upended our world.  And they did so, in part, because the United States routinely failed to understand, withstand, and respond to their assaults.  Amazingly, despite recurrent warnings and violent episodes, almost no one in America took them seriously.  "It was too bizarre, too primitive and exotic.  Up against the confidence that Americans placed in modernity and technology and their own ideals to protect them from the savage pageant of history, the defiant gestures of bin Laden and his followers seemed absurd and even pathetic" (p. 6).  

Wright begins his account by portraying Sayyid Qutb, an Egyptian schoolteacher who came to the United States in 1949.  Filled with hatred for the new nation of Israel and shocked by its triumph over Arab armies, he found in America added fuel for the Islamic zeal consuming his soul.  The shame and shock at the establishment of Israel "would shape the Arab intellectual universe more profoundly than any other event in modern history.  'I hate those Westerners and despise them!' Qutb wrote after President Harry Truman endorsed the transfer of a hundred thousand Jewish refugees into Palestine.  'All of them, without any exception:  the English, the French, the Dutch, and finally the Americans, who have been trusted by many'" (p. 9).  His hatred, interestingly enough, didn't deter him from coming to study in America! 

Though generally well-treated by the ordinary folks in Washington D.C. and Greeley, Colorado, where Qutb briefly studied and continued his writing projects, he looked for and found proof of America's degeneracy in such things as a church dance, the freedom enjoyed by women, and publications such as the spurious Kinsey Report.  Hostility to America meshed easily with hostility to Israel to form the core of his world view, and when he returned to Egypt he believed:  "Modern values—secularism, rationality, democracy, subjectivity, individualism, mixing of the sexes, tolerance, materialism—had infected Islam through the agency of Western colonialism.  America now stood for all that" and he was persuaded "that Islam and modernity were completely incompatible" (p. 24).  Ultimately imprisoned by Gamal Abdel Nasser—the first truly native-born Egyptian to rule Egypt in 2500 years—he wrote Milestones, an enormously influential treatise, to recall Muslims to the pristine purity of their 7th century origins.  For radical Muslims, Qutb's Milestones resembles Hitler's Mein Kampf or Lenin's What Is to Be Done?  

The second significant Islamist was Ayman al-Zawahiri, a medical doctor from a prominent family who was reared in an upscale Cairo suburb.  During his student years he absorbed and quickly promoted Qutb's version of Islam.  He too was distressed by the mere existence of Israel and felt humiliated by Egypt's collapse in the 1967 war—a decisive "psychological turning point in the history of the modern Middle East.  The speed and decisiveness of the Israeli victory in the Six Day War humiliated many Muslims who had believed until then that God favored their cause.  . . . The profound appeal of Islamic fundamentalism in Egypt and elsewhere was born in this shocking debacle" (p. 38).  Fiercely nationalistic, Zawahiri envisioned reestablishing the Muslim Caliphate centered in Egypt and enabling Islam to authentically flourish, dominating planet earth.  He launched an underground movement (al-Jihad) designed to overthrow the secular regime in his country.  Accused of involvement in the assassination of President Anwar Sadat, Zawahiri was imprisoned for three years and soon emerged as the spokesman for the defendants.  

The third and most infamous protagonist in Wright's story is "The Founder," Osama bin Laden, one of the many sons of Mohammed bin Laden, one of Saudi Arabia's most prosperous businessmen.  The elder bin Laden was especially close to King Abdul Aziz and did much of the construction work on the renovation of the Grand Mosque in Mecca, which can hold a million worshippers.  Though expected to take his place in his father's extensive business empire, young Osama bin Laden joined the Muslim Brothers while in high school and began to show less interest in making money than in establishing Islamic states.  While studying at King Abdul Aziz University in Jeddah he turned increasingly religious, taking the Salafist position that declares heretical all versions of Islam other than that espoused by Saudi Arabia's Wahhabis.  Like Zawahiri he was deeply moved by the writing of Sayyid Qutb and embraced his anti-American agenda.  "Bin Laden would later say that the United States had always been his enemy.  He dated his hatred for America to 1982, 'when America permitted the Israelis to invade Lebanon and the American Sixth Fleet helped them'" (p. 151).  

When Soviet troops invaded Afghanistan in 1979, radical Muslims rallied to defend Islam, so both Zawahiri and bin Laden made their way to the fields of conflict.  Much of their activity, however, took place in nearby Pakistan, where bin Laden proved especially useful in fundraising.  Here they and their followers engaged in endless discussions regarding jihadist strategies and sought to train young warriors to give their lives to the cause.  The Afghans fought and won the war against Russia, whereas the Arabs recruited by Zawahiri and bin Laden mainly looked for opportunities to die as martyrs for Islam.  Thus forged amidst the Afghan War, the ideology and methodology of Al Qaeda were basically in place by 1988.  In particular, Islamic rationalizations for suicide missions and terrorist attacks on innocent civilians coalesced within the principle of takfir—a license for true believers "to kill practically anyone and everyone who stood in their way; indeed, they saw it as a divine duty" (p. 125).    

As the war in Afghanistan wound down, bin Laden returned to Jeddah, Saudi Arabia, where he enjoyed celebrity status for his "divine mission" in Afghanistan.  In his native land the Wahhabi version of Islam had gained strength:  theaters were closed, music ("the flute of the devil," bin Laden said) virtually disappeared, and women's activities were seriously circumscribed.  But for radicals like bin Laden even this was not sufficient, and his activities increasingly irritated King Fahd and the princes ruling the Kingdom.  When, for example, Iraq conquered Kuwait and threatened Saudi Arabia, bin Laden objected to allowing American troops to defend the kingdom.  He and his jihadists, he declared, could (with Allah's aid) repel any invasion of the Arabian peninsula's sacred soil.  But King Fahd, trusting in tanks rather than jihadists, invited the Americans to establish bases and successfully overturn Saddam Hussein's conquests.  

At odds with Saudi rulers (who ultimately revoked his citizenship), bin Laden then moved to Sudan, where he bought land near Khartoum and tried both to farm and to launch various business enterprises.  His very presence added considerably to Sudan's economy, and he seemed momentarily content.  But he soon fell in with a radical Imam (Abu Hajer) who encouraged him to attack the United States, "the last remaining superpower" threatening Islam.  He and al-Qaeda would henceforth target American troops and murder innocents—concentrating "not on fighting armies but on killing civilians" (p. 175).  By this time he had come to despise the United States as "weak and cowardly," urging his followers to remember Vietnam and Lebanon.  When a few of their soldiers die, he said, Americans retreat!  "For all its wealth and resources, America lacks convictions.  It cannot stand against warriors of faith who do not fear death" (p. 187).  President Bill Clinton's cowardly withdrawal from Somalia in 1993 further confirmed bin Laden's growing contempt for the USA.    

Amidst deteriorating conditions, bin Laden left Sudan in 1996 financially ruined, his family scattered, and his organization broken.  "He held America responsible for the crushing reversal that had led him to this state" (p. 223).  On August 23, 1996, in his "Declaration of War Against the Americans Occupying the Land of the Two Holy Places," he said:  "You are not unaware of the injustice, repression, and aggression that have befallen Muslims through the alliance of Jews, Christians, and their agents, so much so that Muslims' blood has become the cheapest blood and their money and wealth are plundered by the enemies" (p. 234).  Barred from returning to Saudi Arabia, he settled in Afghanistan, now controlled by Mullah Mohammed Omar and the Taliban.  Joined by a group of Egyptians following Zawahiri, he began training terrorists such as Mohammed Atta to take down America.  In 1998, Zawahiri drafted a document calling on "all of the different mujahideen groups that had gathered in Afghanistan" to launch "a global Islamic jihad against America" (p. 259).  

This fatwa, signed by bin Laden as well as Zawahiri, declared that the killing of "Americans and their allies—civilian and military—is an individual duty for every Muslim who can do it in any country in which it is possible to do it" (p. 260).  Soon thereafter the jihadists orchestrated the nearly simultaneous bombings of American embassies in Kenya (killing 213 and injuring thousands) and Tanzania (killing 11 and wounding 85).  Two years later the USS Cole was nearly sunk by a suicide attack in Aden, Yemen's deep water port, killing 17 sailors.  To bin Laden:  "The destroyer represented the capital of the West, and the small boat represented Mohammed."  

But other than haphazardly launching a few missiles and issuing threats, Bill Clinton and his administration did nothing.  In the waning days of his presidency he tried "to burnish his legacy by securing a peace agreement between Israel and Palestine" (p. 331).  Within a year, however, the jihadist offensive culminated on September 11, 2001, and with the collapse of the New York towers the world woke up to al-Qaeda, bin Laden, and the threat posed by radical Islam!  

* * * * * * * * * * * * * * * * * * * 

In Nazi Propaganda for the Arab World (New Haven:  Yale University Press, c. 2009), Professor Jeffrey Herf "documents and interprets Nazi Germany's propaganda efforts aimed at Arabs and Muslims in the Middle East and North Africa" (p. 1).  From 1939 to 1945 a steady stream of anti-Semitic, anti-Allies propaganda reached millions of Muslims via shortwave radio.  These broadcasts both "attributed enormous power and enormous evil to the Jews" (p. 2) and promised that an Axis victory would free "the countries of the Middle East from the English yoke and thus realize their right to self-determination" (p. 3).  In time, of course, the Allies won WWII and little came of the Nazi endeavor to establish a foothold in the Islamic world.  But the broadcasts' rhetoric, it can be argued, helped shape the mindset of today's radical Muslims, for the same anti-Semitic, anti-Western message routinely circulates throughout their world.

Central to the story is Haj Amin el-Husseini, the Grand Mufti of Jerusalem, who resided in Berlin during WWII and was “the most important public face and voice of Nazi Germany’s Arabic-language propaganda” (p. 8).  He and his family were influential and he “led opposition to the Balfour Declaration and to Jewish immigration to Palestine” (p. 8).  In Berlin, he met and associated with Adolf Hitler, Heinrich Himmler, and other important Nazis.  He assured Hitler that “the Fuhrer was ‘admired by the entire Arab world.’  He thanked him for the sympathy he had shown to the Arab and especially the Palestinian cause” (p. 76).  Hitler responded by assuring Husseini that Arabs would be liberated from English domination, Jews in North Africa and the Middle East would be destroyed, and “the Mufti would be the most authoritative spokesman of the Arab world” (p. 78).  “Husseini was a key figure in finding common ideological ground between National Socialism, on the one hand, and the doctrines of Arab nationalism and militant Islam, on the other” (p. 8).  Following the war he mysteriously “escaped” and found shelter in Cairo, where he was protected and lauded for the remainder of his life, ever promoting an anti-Jewish, anti-American agenda.  

From one perspective, this book is a chronological record of what was said by the Nazi propaganda machine.  Chapter by chapter, Herf describes the shifting nature of the broadcasts, reflecting the course of WWII.  As the war began, the Nazis sought to enlist Arab support in the Middle East, where England in particular controlled considerable territory.  Thus similarities between Islam and National Socialism were stressed.  As General Rommel seemed on the verge of victory in North Africa, the broadcasts promised both the extermination of the Jews (primarily in Palestine) and freedom from British rule.  When the Allies began to turn back the German advance, the broadcasts shifted to emphasize the potential harm Muslims would suffer should the British, American, and Soviet armies succeed.  As the Third Reich collapsed, the broadcasts turned to conspiracies afoot in Islamic lands, blaming Jews and their supporters (especially America) for various evils.

From another perspective, however, there was a constancy to the broadcasts:  hostility to Jews and their allies.  "Radical anti-Semitism was a central component throughout the broadcasts" (p. 11).  No labels were too vicious, no rumors too unfounded, no accusations too malicious for assertion on the radio!  Arabs in North Africa and the Middle East were urged to kill Jews, following the Nazi example, aiming at the "final solution."  They were reminded that the prophet Mohammed expelled the Jews from Arab lands and then urged to follow his example.  A broadcast in 1942 was titled "Kill the Jews before They Kill You."  Egyptians were urged to do their duty "'to annihilate the Jews and to destroy their property'" (p. 125).  Husseini always made it clear that "his hatred of the Jews was ineradicably bound to his Muslim faith and to his reading of the Koran" (p. 154).  He charged that "'they lived like a sponge among peoples, sucked their blood, seized their property, undermined their morals yet still demand the rights of local inhabitants'" (p. 185).  Still more, he cried out:  "'Arabs!  Rise as one and fight for your sacred rights.  Kill the Jews wherever you find them.  This pleases God, history and religion.  This serves your honor.  God is with you'" (p. 213).  

As was evident in the Grand Mufti's messages, the Koran was the great authority invoked to appeal to Arab listeners.  Neither Hitler's Mein Kampf nor The Protocols of the Elders of Zion was much discussed, nor were the speeches of Hitler or Himmler invoked.  Rather, texts from the Koran were continually cited to justify Nazi propaganda.  "Nazism thus stood with the 'faithful' and 'noble' Muslims against traitors who deviated from the path laid down in the Koran" (p. 197).  Himmler even urged his German scholars to link the Shi'ite hope for the coming of the Twelfth Imam to Hitler, suggesting that "'the Koran predicts and assigns to the Fuhrer the mission of completing the Prophet's work'" (p. 199).  Hitler could be portrayed, Himmler said, "'as Jesus (Isa) who the Koran predicts will return and, as a knight . . . defeats giants and the king of the Jews who appear at the end of the world'" (p. 199).  

In the broadcasts, hostility to the Jews was conjoined with hostility to the Allies (preeminently England and America).  Despite the fact that the British restricted Jewish immigration to Palestine and the Americans equivocated regarding the establishment of a Jewish state, both nations were accused of actively promoting such activities.  Egyptians particularly were portrayed as victims of British oppression and urged to drive out the foreigners.  As American troops increasingly played a role in the war, the broadcasts besmirched the USA and President Franklin D. Roosevelt in particular.  He was declared to be not only a tool in the hands of conniving Jews (such as Henry Morgenthau and Bernard Baruch) but a Jew himself!  In one of his broadcasts Husseini "stated that the 'wicked American intentions toward the Arabs are now clearer, and there remain no doubts that they are endeavoring to establish a Jewish empire in the Arab world.  More than 400,000,000 Arabs oppose this criminal American movement'" (p. 213).  

Though Herf focuses almost exclusively on the historical details, it takes little imagination to apply his insights to current affairs.  Virtually the same rhetoric employed by the Nazis is evident throughout Islamic lands.  The link is quite clear in the Muslim Brotherhood, which was founded by Hassan al-Banna (a graduate of the most prestigious Islamic university, Al-Azhar in Cairo), who had “‘made a careful study of the Nazi and Fascist organizations’” (p. 225).  “The Brotherhood wanted to establish a government based on pure Koranic principles and sought to counter reliance on Western culture, which it regarded as having brought about an ‘abasement of morals, conduct and character, for having increased the complexity of society and for having exposed the people to poverty and misery’” (p. 225).  

Picking up on Muslim Brotherhood themes following WWII, Sayyid Qutb furthered the fanatical message of radical Islam, writing Our Struggle with the Jews.  “The title itself,” notes Professor Herf, “evokes disconcerting comparisons to Hitler’s Mein Kampf (My Struggle).  Most important, in its views of the Jews and in its conspiratorial mode of analysis the book displayed a striking continuity with the themes of Nazism’s wartime broadcasts, with the important difference that it was far more embedded in the Koran and Islamic commentaries” (p. 255).  Qutb asserted that the Koran “‘spoke much about the Jews and elucidated their evil psychology’” (p. 257).  This alone authorized “war against the Jews in Israel” (p. 258).  Qutb probably “listened to Nazi broadcasts and traveled in the pro-Axis intellectual milieu of the radical Islamists in and around Al Azhar University.” Thus, Herf reasons:  “Just as the Nazis had threatened the Jews with ‘punishment’ for alleged past misdeeds, so Qutb offered a religious justification for yet another attempt to ‘mete out the worst kind of punishment’ to the Jews then in Israel.  In terms that his audience understood, Our Struggle with the Jews was a call to massacre the Jews living in Israel” (p. 259).  Executed in Egypt in 1966, Qutb “became both a martyr and an ideological inspiration for such radical Islamist groups as Al Qaeda, Hezbollah, and Hamas” (p. 255).  His influence clearly permeates the thought and action of radical Muslim terrorists, including Osama bin Laden.  The vitriol regarding Jews, the anger at America, the dishonest renditions of history, the constant complaints of victimization—nothing much has changed in more than half-a-century!  

Professor Herf draws upon previously untapped documentary sources, especially a cache of materials—transcriptions of the broadcasts made by the American embassy in Cairo and sent to Washington—that provide extensive evidence for his case.  A professor of history at the University of Maryland, he has written extensively about the Third Reich's animosity towards the Jews.  His books include:  Reactionary Modernism:  Technology, Culture, and Politics in Weimar and the Third Reich; The Jewish Enemy:  Nazi Propaganda During World War II and the Holocaust; and Divided Memory:  The Nazi Past in the Two Germanys.  He writes as a scholar for scholars, an historian for historians, meticulously footnoting every assertion.  Above all he wants to fully, conclusively document his argument.  Consequently, as he demonstrates the recurrent message, year after year, there is an unavoidable redundancy to the presentation that taxes the reader's patience.  But his treatise provides important evidence that enables us to understand important aspects of Islam, then and now—from Mohammed onwards, Muslims have distrusted and detested Jews and anyone else disinclined to submit to Allah.  This, rightly understood, is part and parcel of Islam, which means "surrender to God's will" as manifest in the Prophet's followers.  

249 “Mind and Cosmos”

   Academic philosophers rarely grace the covers of newsmagazines, but the March 25, 2013, issue of The Weekly Standard portrayed Professor Thomas Nagel, bound with ropes, surrounded by demonic monks, and roasting in a fire, in an article titled "The Heretic—professor, philosopher apostate."  The reason for such attention was the recent publication of Nagel's Mind and Cosmos:  Why the Materialist Neo-Darwinian Conception of Nature Is Almost Certainly False (New York:  Oxford University Press, c. 2012).  A professor at New York University, he enjoys an eminent position within the elite galaxy of revered intellectuals.  Before publishing this treatise he had refrained from openly questioning the entrenched naturalistic Weltanschauung of his peers so starkly set forth by Francis Crick:  "You, your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules.  Who you are is nothing but a pack of neurons."  By taking issue with such reductive materialism and publishing a slender treatise questioning its guiding assumptions, Nagel elicited outrage and abuse from his erstwhile colleagues; but in doing so he did the real work of a philosopher—following the evidence and seeking the truth rather than tacking to the winds of opinion.  

He admits that "for a long time I have found the materialist account [given "canonical exposition" in Richard Dawkins' The Blind Watchmaker] of how we and our fellow organisms came to exist hard to believe, including the standard version of how the evolutionary process works.  The more details we learn about the chemical basis of life and the intricacy of the genetic code, the more unbelievable the standard historical account becomes" (p. 5).  The more we understand about life and the cosmos the less adequately Neo-Darwinism explains things.  Though personally a humanist atheist, he finds common ground with the advocates of Intelligent Design such as Michael Behe and Stephen Meyer, persuasive critics of the dominant paradigm.  He's come to seriously consider the possibility that mind, rather than matter, shapes Reality.  Consequently:  "My guiding conviction is that mind is not just an afterthought or an accident or an add-on, but a basic aspect of nature" (p. 16).  This is particularly evident when we turn our attention to what we know best—ourselves!  "Something more is needed to explain how there can be conscious, thinking creatures whose bodies and brains are composed of those elements.  If we want to try to understand the world as a whole, we must start with an adequate range of data, and those data must include the evident facts about ourselves" (p. 20).  Unfortunately:  "Evolutionary naturalism implies that we shouldn't take any of our convictions seriously, including the scientific world picture on which evolutionary naturalism itself depends" (p. 18).  

The mysterious and absolutely indubitable reality of human consciousness highlights the inadequacy of evolutionary naturalism.  "Organisms such as ourselves do not just happen to be conscious; therefore no explanation even of the physical character of those organisms can be adequate which is not also an explanation of their mental character.  In other words, materialism is incomplete even as a theory of the physical world, since the physical world includes conscious organisms among its most striking occupants" (p. 45).  Scholars like Dawkins and Crick, who reduce consciousness to material entities, fail to properly distinguish between description and explanation; observing neurons firing in the brain does not begin to adequately explain the phenomenon of consciousness.  Far better, Nagel says, is the ancient Aristotelian conception of "teleological laws" guiding natural processes.  In addition to matter-in-motion, there may well be "something else, namely a cosmic predisposition to the formation of life, consciousness, and the value that is inseparable from them" (p. 123).  Old Aristotle may well have erred, but he now appears wiser than his modern antagonists!  As for theists, a creative God certainly provides a satisfactory explanation.  No final explanation for consciousness fully persuades Nagel, but he knows that the Neo-Darwinian answer lacks cogency.  What we must seek, he argues, is "a form of understanding that enables us to see ourselves and other conscious organisms as specific expressions simultaneously of the physical and mental character of the universe" (p. 69).  

What’s true for consciousness is even truer for cognition—our incredible ability to reason.  We are not only aware of ourselves as thinking beings but we can transcend our personal perspectives and objectively discover momentous realities such as the law of gravity.  Evolutionary naturalism fails, abysmally, to explain the existence and unique mental powers of our species, so properly labeled Homo sapiens.  "Rationality, even more than consciousness, seems necessarily a feature of the functioning of the whole conscious subject, and cannot be conceived of, even speculatively, as composed of countless atoms of miniature rationality" (p. 87).  

Then add to cognition conscience!  Add to speculative reason practical reason.  We do, countless times a day, evaluate things, judging them good and evil, right and wrong.  And such judgments range far beyond our individual feelings or interests.  I may very well be more outraged by the former San Diego Mayor Bob Filner’s abusive behavior than by an undeserved personal insult.  I may very well be more concerned with the national debt’s impact on future generations than by the sharp increase of my electric bill, though both result from irresponsible politicians’ decisions.  To Nagel, only the “moral realism” expounded by traditional thinkers such as Aristotle and C.S. Lewis enables us to craft ethical principles and render moral judgments; and “since moral realism is true, a Darwinian account of the motives underlying moral judgment must be false, in spite of the scientific consensus in its favor” (p. 105).  

Inasmuch as consciousness, rationality and morality define us as human beings—and inasmuch as evolutionary naturalism cannot explain these fundamental realities—we must, Nagel says, open our minds to better ways of thinking and understanding the universe, taking “the appearance and evolution of life as something more than a history of the development of self-reproducing organisms, as it is in the Darwinian version” (p. 122).  A better version is wanted!  For, Nagel concludes:  “I would be willing to bet that the present right-thinking consensus will come to seem laughable in a generation or two” (p. 128).  No wonder “the present right-thinking” guardians of secular orthodoxy turned venomous when confronted with Nagel’s intellectual rigor and incisive logic!  

* * * * * * * * * * * * * * * * * *

In many of his writings C.S. Lewis trenchantly critiqued the philosophical naturalism masquerading as “science” in the modern world.  This he labeled “scientism,” carefully differentiating it from authentic “science,” with its rigorous methodology and tentative hypotheses.  The intrinsic nihilism and potential brutality of “scientism” was philosophically exposed in Lewis’s The Abolition of Man and memorably portrayed in his That Hideous Strength, one of the great dystopias of the 20th century.  The same message is manifest (though without Lewis’s theistic foundation) in Raymond Tallis’ recent Aping Mankind:  Neuromania, Darwinitis and the Misrepresentation of Humanity (Durham, U.K.:  Acumen Publishing Limited, c. 2011).   As a medical doctor (and “atheistic humanist”) who taught for many years at the University of Manchester, devoting himself to brain science, he is thoroughly aware of neuroscience and its implications for understanding human nature.  But he has become increasingly distressed by the unwarranted supposition (what he dubs “neuromania”) that we are no more than our brains, ignoring the importance of common sense, consciousness and culture, art and religion.  As widely propounded in both scholarly and popular circles:  “The neurophysiological self is at best the locus of ‘one damn thing after another’, which hardly comes near to the self of a human being who leads her life, who is a person reaching into a structured future with anticipations, aims and ambitions, that are themselves rooted in an almost infinitely complex accessible past that makes sense of them” (p. 135).   

Even on a purely material level man’s brain eludes easy analysis.  Though specific neurological sections clearly do specific things (e.g. seeing, hearing), they are capable of alternative and adaptive roles.  Rather than being “hard-wired” like a computer, the brain has a beguiling “plasticity” enabling it to reorganize under certain conditions.  The brain is clearly necessary for us to think—but it is not necessarily a sufficient explanation of our thinking.  Neurologists may chart correlations between neurons firing and mental activity, but, as elementary logic reminds us, a correlation must never be equated with causation.  “The errors of muddling correlation with causation, necessary condition with sufficient causation, and sufficient causation with identity lie at the heart of the neuromaniac’s basic assumption that consciousness and nerve impulses are one and the same, and that . . . ‘the mind is a creation of the brain’” (p. 95).  Quite the contrary, Tallis argues:  “mental events are not physical events in the brain” (p. 133).  

Undergirding the notion that the mind is the creation of the brain is the evolutionary assumption that we are nothing but clever animals—part and parcel of what Daniel Dennett calls “Darwin’s Dangerous Idea,” the “universal acid” that cuts away all confidence in what philosophers call qualia—intentionality and meaning, morality and justice, freedom and responsibility, beauty and love.  To Tallis, any theory that discounts such qualia (intensely felt personal realities basic to human experience) demands disbelief!  Obviously “nerve impulses are not at all like qualia” (p. 95), and any attempt to explain away the latter by describing the former cannot but miscarry.  Indeed, “we shall find, again and again, that we cannot make sense of what the brain is supposed to do—in particular postulating an intelligible world in which it is located—without appealing to talk about people who are not identical with their brains or with material processes in those brains” (p. 111).  

The “Darwinitis” infecting “neuromaniacs” is similarly suspect to Tallis.  “If they only looked at what was in front of their noses they would not have to be told that there are differences between organisms and people:  that a great gulf separates us from even our nearest animal kin” (p. 147).  In almost every significant way we differ from other animals!  “Many of our strongest appetites—for example, for abstract knowledge and understanding—are unique to us” (p. 151).  Our finest endeavors—writing and reading books, composing and listening to symphonies—have no parallel in the animal kingdom.  Importantly, to Tallis, embracing Darwinism as an explanation for human origins does not necessarily entail accepting “Darwinitis (which purports to explain everything about people in terms of biological evolution)” (p. 153).  Especially problematic is any Darwinian explanation of human consciousness, the fundamental reality known to us.  “In short, if it is difficult (although not in principle impossible) to see how living creatures emerged out of the operation of the laws of physics on lifeless matter, it is even less clear how consciousness emerged or why it should be of benefit to those creatures that have it.  Or, more precisely, why evolution should have thrown up species with a disabling requirement to do things deliberately and make judgments” (p. 179).  

Whether humanizing animals or animalizing man, Darwinitis demands its disciples deny non-material realities of any sort.  Consequently they remain “bewitched” by figures of speech comparing us with computers or machines, dolphins or chimps.  Unlike computers, however, we think and skillfully program computers, which are not conscious and cannot reason.  Even the most sophisticated supercomputers “are as zombie-like as pocket calculators” (p. 195).  We, conversely, uniquely use languages that are not at all computational!  In our languaging we reveal our freedom and dignity (realities necessarily denied by neuromaniacs) as human beings, and in our literature we revel in our uniquely human creativity.  

Sadly enough, Tallis says, even our current humanities (history, philosophy, art and music) have fallen captive to Darwinitis and neuromania, reducing literally all our activities to “animalities,” i.e. matter-in-motion.  Thus we find Shakespearean scholars studying Macbeth’s grasping for an imaginary dagger and declaring:  “‘when moving his right hand, an actor playing Macbeth would activate the right cerebellar hemisphere and the left primary cortex’” (p. 294)!  Such “scholarship,” relentlessly marching through academia, should give us pause, Tallis says, because it ruthlessly destroys all that grants grandeur to our literary treasures.  Similarly, we must be alerted to the flourishing academic discipline of “neuro-evolutionary ethics” espoused by thinkers such as Patricia Churchland, who insists that “‘it is increasingly evident that moral standards, practices and policies reside in our neurobiology’” (p. 317).  Thus, as Albert Einstein asserted in 1932, in our “thinking, feeling, and acting” we do nothing freely “‘but are just as causally bound as the stars in their motion’” (p. 312).  

Such views, coming to the foreground in our world, lead Tallis to warn:  “Be afraid, be very afraid.”  Indeed, Tallis is sufficiently afraid to look favorably on “at least the idea” of God (p. 325).  The consequences of the atheism he embraces embarrass him!  Though irreligious himself, he finds the traditional notion of God preferable to the simplistic “biologism” espoused by prominent atheists such as Richard Dawkins, whose “devastating reductionism . . . disgusts even an atheist like me.  In defending the humanities, the arts, the law, ethics, economics, politics and even religious belief against neuro-evolutionary reductionism, atheist humanists and theists have a common cause and, in reductive naturalism, a common adversary:  scientism” (p. 336).  

* * * * * * * * * * * * * * * * * 

For many years Alvin Plantinga has effectively represented the Christian perspective among academic philosophers.  Illustrating his prestige among his peers, he was invited to deliver the Gifford Lectures in 2005.  In print the lectures are titled:  Where the Conflict Really Lies:  Science, Religion, and Naturalism (New York:  Oxford University Press, c. 2011).  Unlike some Gifford lecturers (e.g. William James, in The Varieties of Religious Experience), Plantinga writes almost exclusively for his peers, and this treatise will be accessible only to folks with ample background in the denser realms of science, philosophy and theology.  His thesis, in short, claims:  “there is superficial conflict but deep concord between science and theistic religion, but superficial concord and deep conflict between science and naturalism” (#89 in Kindle ed.).  Theists such as himself need not deny evolutionary evidence, but they cannot abide a naturalistic “add-on to the scientific doctrine of evolution:  the claim that evolution is undirected, unguided, unorchestrated by God (or anyone else)” (#142).  Though vociferously denied by its secular proponents, scientific “naturalism” assumes a religious role in their thinking and may be understood as a “quasi-religion” whose presumptions clearly conflict with the data of consciousness and cognition.  Unfortunately, as Paul Feyerabend wisely noted years ago:  “Scientists are not content with running their own playpens in accordance with what they regard as the rules of the scientific method; they want to universalize those rules, they want them to become part of society at large” (Against Method, p. 220).  

We Christians especially should take seriously the scientific discoveries and insights of our time.  Created in God’s image, we are uniquely equipped “to know and understand something of ourselves, our world, and God himself” (p. 4).  All truth is God’s truth and we can (in part, looking through a dark glass) know it.  Unfortunately, all too many modern “scientists” abandon their circumscribed vocation and become amateur philosophers when promoting their naturalistic (and generally atheistic) convictions.  By carefully reading Richard Dawkins’ The Blind Watchmaker and The God Delusion—and demanding such necessities as demonstrable evidence and cogent explanation, logical rigor and unambiguous definitions—Plantinga effectively illustrates Dawkins’ sophomoric shortcomings.  Dispatching Dawkins, he then dissects Daniel Dennett’s Darwin’s Dangerous Idea—“a paradigm example of naturalism” (p. 36).  Though by profession a philosopher (whereas Dawkins is a biologist dispensing philosophy), Dennett apparently has failed to do the honest intellectual toil necessary to actually engage the great theists of the past!  Consequently, many of his arguments (like Dawkins’) prove jejune to a first-rate thinker such as Plantinga—“about as bad as philosophy (well, apart from the blogosphere) gets” (p. 45).  

Much the same can be said of “evolutionary psychologists” who “explain distinctive human traits—our art, humor, play, love, sexual behavior, poetry, sense of adventure, love of stories, our music, our morality, and our religion itself—in terms of adaptive advantages accruing to our hunter-gatherer ancestors” (p. 131).  Thus Harvard’s Steven Pinker devoted only eleven pages of his 660-page How the Mind Works to music; he explained that music “‘was useless’” in terms of human evolution and development and should be regarded “as ‘auditory cheesecake,’ a trivial amusement that ‘just happens to tickle several important parts of the brain in a highly pleasurable way, as cheesecake tickles the palate’” (p. 132).  Such Pinkerian statements simply illustrate the intellectual vacuity of celebrated academics!  

On a more constructive level, after meticulously answering objections to the possibility of God’s intervention in the world, Plantinga suggests that God could easily work through both the “macroscopic” and “microscopic” realms, exercising providential guidance over both “cosmic” and “evolutionary” history and doing so “without in any way ‘violating’ the created natures of the things he has created” (p. 116).  Still more, Plantinga confesses:  “perhaps God is more like a romantic artist; perhaps he revels in glorious variety, riotous creativity, overflowing fecundity, uproarious activity.  . . .  Perhaps he is also very much a hands-on God, constantly active in history, leading, guiding, persuading and redeeming his people, blessing them with ‘the Internal Witness of the Holy Spirit’ (Calvin) or ‘the Internal Instigation of the Holy Spirit’ (Aquinas) and conferring upon them the gift of faith.  No doubt he is active in still other ways.  None of this so much as begins to compromise his greatness and majesty, his august and unsurpassable character” (p. 107).  Equally possible, God may very well have created “a theater or setting for free actions on the part of human beings and other persons”—“a world of regularity and predictability” wherein we function in accord with our imago dei status (p. 119).  

Thus good science poses no “defeaters” for Christian faith.  There is in fact deep concord between them.  As Sir Isaac Newton said in his Principia Mathematica:  “This most beautiful system of the sun, planets and comets, could only proceed from the counsel and dominion of an intelligent and powerful being. . . .  This Being governs all things, not as the soul of the world, but as Lord over all” (p.).  Today’s cosmologists often reflect on the apparent “fine tuning” of the universe, which suggests a profound teleological process culminating in a world “just right” for us human beings.  Current advocates of “Intelligent Design” such as Michael Behe have marshaled detailed evidence and argued that “irreducibly complex” structures cannot be adequately explained by Neo-Darwinians who insist the evolutionary process is absolutely unguided.  Plantinga effectively demonstrates that belief in a Mind behind the Cosmos is a thoroughly rational and philosophically defensible position.   

248 “The Rising Tyranny of Ecology”

In 1973, Alston Chase abandoned his academic career—as a tenured philosophy professor with degrees from Harvard, Oxford and Princeton—and “returned to nature” in Montana’s mountains.  At the time he considered himself an “environmentalist” and sought to live accordingly.  Successfully relocated, he undertook a writing project to document the development of Yellowstone National Park under the reigning “ecosystems management” principle adopted by park managers.  What he discovered—and detailed in Playing God in Yellowstone (Boston:  The Atlantic Monthly Press, c. 1986)—was the destructiveness of misguided good intentions, leaving the park significantly degraded and endangered.  Particularly informative is his philosophically nuanced treatment of “the environmentalists” who naively assumed and promoted “the subverted science” articulated by Rachel Carson, whose deeply flawed Silent Spring (labeled “the Uncle Tom’s Cabin of modern environmentalism”) recruited so many of them to the movement.  Such environmentalists include “the new pantheists” who believe, with Thoreau, that “in Wildness is the preservation of the world.”  They’re often enamored with “California Cosmologists” such as Theodore Roszak, Alan Watts, Fritjof Capra, and assorted Native American shamans, whose thoroughly Gnostic notions (e.g. panpsychism) promise a mystical union with Nature conducive to the inner bliss of self-realization.  And they follow an assortment of “hubris commandos”—well-heeled urban elites with political connections who want to reserve the wilderness for backpackers.  

His work on Yellowstone piqued Chase’s concern for broader environmental policies impacting the nation, so he researched and wrote In A Dark Wood:  The Fight over Forests and the Rising Tyranny of Ecology (New York:  Houghton Mifflin Company, c. 1995), one of the most probing and important ecological studies in print.  Combining a detailed narrative of events with a penetrating analysis of deeper issues, In A Dark Wood effectively exposes the heart of America’s confusion regarding how to live rightly with the natural world.  Anyone concerned with the health of the natural world—and of the cultural and political world—should read and reflect on Chase’s work.  Evaluating his conclusions at the beginning of his treatise, Chase confesses they “were far more disturbing than I had anticipated.  An ancient political and philosophical notion, ecosystems ecology, masquerades as a modern scientific theory.  Embraced by a generation of college students during the campus revolutions of the 1960s, it had become a cultural icon by the 1980s.  Today, not only does it infuse all environmental law and policy, but its influence is also quietly changing the very character of government.  Yet, as I shall show, it is false, and its implementation has been a calamity for nature and society” (p. xiii).  Those collegians—drinking deeply from Rachel Carson’s Silent Spring and adopting the pantheism of Emerson and John Muir—began an effective long march through the nation’s institutions and transformed environmentalism into a religious faith with a radical political agenda.  

That process stands revealed in their determination to preserve the forests of Washington, Oregon, and California.  Rejecting the traditional notion that loggers with their sawmills could wisely manage the forests and provide for their sustained rejuvenation, the activists demanded the forests be quarantined in what was imagined to be a primitive paradise and allowed to flourish free from human contamination.  Chase shows how timber “harvests soared during the postwar building boom” and “trees grew faster than they were being cut” (p. 73).  This was especially true of California’s redwood forests.  The trees were healthy and provided a healthy income for thousands of folks throughout the Pacific Northwest.  Admittedly, “old growth” forests were declining, but they had been effectively replaced by faster growing, younger, healthier trees.  Still more:  those “old growth” forests were largely a figment of the activists’ imagination!  The best historical evidence shows that pre-Columbian Indians had carefully set and controlled fires, and the Pacific Northwest forests two centuries ago were “‘more open than they are now,’” containing “‘islands of even-aged conifers, bounded by prairies, savannas, groves of oak, meadows, ponds, thickets and berry patches.’”  Largely due to broadcast Indian burning, they “‘were virtually free of underbrush and coarse woody debris that has been commonplace in forests for most of this century’” (p. 404).  

But the defenders of the mythical “old growth” forests, draped in the mantle of “ecology” and taking “the balance of nature” as axiomatic, believe “nature knows best” and requires us to promote her sustainability.  This position is less a scientific schema than an ancient ideological stance shaped by evolutionists such as Ernst Haeckel (who coined the word ecology in 1866) and Aldo Leopold (whose Sand County Almanac became an instructional manual for environmental activists).  It is less an agenda fueled by evidence than a faith founded in improbable historical and metaphysical assumptions.  The Ecosystem became God!  Enamored of “deep ecology,” environmentalists “unwittingly embraced ideas that synthesized an American religion of nature with German metaphysics:  a holism that supposed individuals were only parts of a larger system and had no independent standing; antipathy to Judaic and Christian values of humanism, capitalism, materialism, private property, technology, consumerism, and urban living; reverence for nature; belief in the spiritual superiority of primitive culture; a desire to go ‘back to the land’; support for animal rights; faith in organic farming; and a program to create nature reserves” (p. 129).  Fortuitously for them, the Endangered Species Act both enacted their aspirations and opened legal portals whereby they could effectively attain their goal of transforming America.  

The ecological activists, looking for an opportunity to kill the logging industry with its “clear-cutting” and access roads, chanced on an “endangered species” in the Pacific Northwest—the spotted owl.  Only a few birds (14 in the first important study) were found, and they appeared to prefer “old growth” forests.  To preserve these owls’ “ecosystem” a massive effort was almost immediately launched to halt all activities in the forests that might endanger it, though in fact “spotted owl policy would be built on the thin air of uneducated guesswork” (p. 251).  Well funded by environmental organizations such as the Nature Conservancy and the Sierra Club, and by wealthy eastern aristocrats such as Teresa Heinz Kerry who finance foundations (e.g. Pew, Rockefeller, Heinz and the Tides), the activists successfully manipulated the media, academia, and the judiciary to preserve the extensive lands allegedly needed for the spotted owl to flourish.  They deliberately ignored accumulating scientific studies finding ever more spotted owls thriving throughout the region, especially in recently harvested private timber properties.  “By 1993 six thousand to nine thousand would be estimated to live in northern California alone, and perhaps an equal number in Oregon and Washington.  Yet federal demographic studies continued to claim that the species remained in deep decline” (p. 365).  True believers cannot be deterred by the facts!  

So they successfully pursued their agenda, primarily through the courts, and managed to earmark large sections of the Pacific Northwest as “old growth” forests, forever inviolable and off-limits to cutting of any sort.  Timber harvests in California dropped by 40% within two decades.  Loggers lost their jobs, sawmills closed, and small towns shriveled.  Unlike the urban environmentalists (burnishing their self-esteem by supporting the Sierra Club, which paid skilled lawyers to pursue their agenda through the courts), the working folks in the forests lacked both the money and organizational skills with which to resist the lock-down of their region.  “Saving the owl had effectively shut down an area larger than Massachusetts, Vermont, New Hampshire, and Connecticut combined, costing the economy tens of billions of dollars and casting tens of thousands out of work” (p. 398).  When grass-roots groups (identified as the Wise Use Movement) tried to rally and defend themselves and their livelihood, environmentalists invested millions of dollars to discredit them, calling them “a mob” bankrolled by the evil timber industry.  Environmentalists orchestrated meetings with President Bill Clinton and his Vice President Al Gore, who then packed the President’s Council on Sustainable Development with leaders of various well-heeled environmental organizations.  Remarkably, within three decades the ecological “movement became a war launched by the haves against the have-nots.  It is a situation analogous to what the late Christopher Lasch has called ‘the revolt of the elites’ whereby ‘upper-middle-class liberals have mounted a crusade to sanitize American society.’  Indeed, Lasch could have been thinking of environmentalists when he added that ‘when confronted with resistance, members of today’s elite betray the venomous hatred that lies not far beneath the smiling face of upper-middle-class benevolence’” (p. 415).  

Tragically, the natural world would suffer harm along with the loggers and the small town economies they sustained.  “The great effort to save old growth would eventually destroy the very landscapes it was intended to preserve.  For it demonstrated an important principle:  that seeking to halt change merely accelerates it.  Nothing more clearly revealed this truth than the rising threat of wildfire” (p. 400).  Allegedly “saving” forests and wildlife, the preservationists paved the way for the fires we now witness throughout the West.  Trees that could have been logged and provided a living for thousands of people now burn, for wherever old trees die and underbrush thrives the potential for massive fires increases.  For instance:  “Officials in southern California, following the 1993 firestorm, attributed the lack of prescribed burning that could have reduced or eliminated much of the destruction to public opposition, some of which was based on concern for the habitat of the Stephens kangaroo rat and the gnatcatcher” (p. 401).  The raging fires should awaken us to the truth of Dante’s words that provide Chase the title for his book.  In The Divine Comedy the great poet said:  “I went astray / from the straight road and woke to find myself / alone in a dark wood.  How shall I say / what wood that was!  I never saw so drear, / so rank, so arduous a wilderness.”  Today we’re in a “dark wood” that results from our captivity to an ancient philosophical error:  identifying the good with what is “natural,” imagining the “state of nature” as ideal for man, formulating “new values based on systems ecology, which from the beginning was less a preservation science than a program for social control.  Supposing that protecting ecosystems was the highest imperative for government, it increasingly viewed the exercise of individual liberty as a threat” (p. 413).  

* * * * * * * * * * * * * * * * * *

Nothing better illustrates the “rising tyranny of ecology” than Elizabeth Nickson’s Eco-Fascists:  How Radical Conservationists Are Destroying Our Natural Heritage (New York:  HarperCollins Publishers/Broadside Books, c. 2012).  She claims to “walk the green walk more than anyone I’ve met” and lives in a geothermally heated house on 16 acres of older-growth forest on a Canadian island (Salt Spring) in the Salish Sea.  There she witnessed and recorded, in fascinating detail, how we have been misled by “a corrupt idea—that an ecosystem has to be in balance, with all its members present in the correct proportion, to be considered healthy” (p. 6).  According to the litany:  “Nature knows best.  Man is a virus and a despoiler and must be controlled” (p. 18).  Though touted as “conservation biology,” it is demonstrably a “bad science” that is doing great harm.  “Evil may be too strong a word for us modernists to use comfortably, but what else do you call an idea that ruins everything it touches?” (p. 173).  “In just thirty-five years, conservation biology has created one disaster after another, in something that observers are now calling an error cascade.  Tens of millions have been removed from their beloved lands.  Immensely valuable natural resources have been declared off-limits to the most desperate in the developing world” (p. 200).  Consequently:  “Range, forest, and farm are dying; water systems have been destroyed.  Conservation biology has created desert and triggered the dying of entire cultures” (p. 200).  

A seasoned journalist who had reported from various parts of the world for Time magazine, Nickson went home to care for her dying father and remained on the land because she learned to love it.  She “built a cottage at the top of my hill” and “resolved to stay” (p. 12).  Whenever she then tried to do literally anything on the land she owned and sought to improve, however, she encountered the front line of a totalitarian movement—“an uncompromising and rigid bureaucratic command-and-control structure, which is creating yet another hybrid of the totalitarian state” (Kindle #82)—that saddled her with irrational and onerous restrictions, leaving her not only angry at the fanatical environmentalists on her island and the senseless bureaucracy but concerned for the future of our world.  In the process she effectively “lost all but 4 acres of the original 28.  I still pay taxes on the 16.5 acres I supposedly have left, but I’m lucky I am allowed to walk on it” (p. 177).  She had to deal with “grim zealots [many of them angry, wealthy, divorced women] seeking to remake the world” in accord with their mantra of “sustainability,” zealots who find allies in affluent NGOs and “fervent true believers in federal and state agencies” such as the EPA (#98).  And she discovered a “labyrinthine public planning process” aptly described by historian David Clary as “the eternal generation of turgid documents to be reviewed and revised forever.”  

True to their Leftist principles, environmental zealots follow the utopian visions formulated by Rousseau and Marx, validating the oft-uttered generalization that “when the Iron Curtain fell, fellow travelers migrated to the environmental movement.  And when they arrived to transform the rural world—a world few of us visit except on vacation, when no one is paying attention—they brought their planning with them” (p. 45).  Consequently:  “There is no starker way to describe what is taking place right now in the country than as the full flourishing of the bureaucratic state.  Private property rights have been largely removed, the culture is dying, but the state, consisting of federal, state, and local ministries and departments, has bulked out so that a giant superstructure of bureaucrats with rulebooks piled high around their desks flourishes, grows, and feeds on ever-diminishing wealth” (p. 47).  

Facilitating this process (and feathering their own nests while granting rare privileges to their wealthy political friends such as Harry Reid) are powerful organizations such as the Nature Conservancy (TNC), the world’s 10th largest NGO, “the biggest of the big dogs, the mythic wolf-king of the forest primeval” (p. 61).  In a complicated, convoluted and surreptitious process, The Nature Conservancy works with “the nation’s richest individuals, like Ted Turner, David Letterman, the Rockefellers, and the DuPonts.  Basically, TNC is acting as agent for the wealthiest among us, acquiring enormous tracts of land, using $25 donations from its 1.3-million-strong membership and $100 million in annual government money, and then selling that land at a discount to the very rich, who in effect receive a substantial tax discount as well as an extremely beautiful place in which to establish a country estate” (p. 68).  The good folks living on the land distrust and fear TNC, so it generally “operates through a proliferation of ‘partner’ land trusts, conservancies, and operatives.  TNC’s sending polite, fresh-faced kids into the middle of nowhere to start local actions for waterbirds or watersheds or ancient forests was the trigger that started the landslide collapse in rural America” (p. 72).  Environmentalists have created legions of smaller foundations, now run “by a subset of grim zealots seeking to remake the world” (#96).  The feared “robber barons” of yore have been replaced by equally pitiless celebrities such as Tom Cruise, Teresa Heinz, and Robert Redford!  Readers such as I (for many years a member and admirer of The Nature Conservancy) will never forget Nickson’s meticulous deconstruction of TNC—and by inference the Sierra Club, the Wilderness Society, etc.  

Her personal frustration led to an investigation—an extensive journey throughout the rural West (marked by an in-depth interview with Alston Chase in Montana) as well as much plowing through written materials—that resulted in the publication of Eco-Fascists.  She explored the forests where logging and sawmills once sustained a vibrant culture and the open range backcountry where cattle once grazed.  There she found:  “Deserted lands, mile after mile after mile.  No one on the highways, not even trucks.  One broken little hamlet after another.  . . . .  What I was looking at was death.  Death not just of the little towns but death of millions of acres of rangeland.  . . . it was like driving through Ghost World, with wraiths drifting across the fields whispering of what was once all fecundity and life” (p. 235).  She came to believe that there has been a well-orchestrated war on rural America, where folks earn their living from the land rather than preserving it as sanctuary for either veneration or vacation retreats.  Enamored with their own purity, environmentalists have effectively sequestered nearly 700 million acres of land, 30 percent of the nation’s land base.  “The amount of land classified as wilderness has grown tenfold since ecosystem theory took flight, growing from 9 million acres in 1964 to 110 million acres today” (p. 96).  Amazingly, as a result of crusades to create parks and “wilderness areas,” nearly half (40 percent) of New York state—“almost 14 million acres—is in the process of being rewilded, turned back, in all essentials, to Iroquoia” (p. 40).  Worldwide the same process proceeds apace—as a result of the creation of parks and refuges “more than 10 percent of the developing world’s landmass has been placed under strict conservation—11.75 million square miles, more than the entire continent of Africa” (p. 38).  In the process, “more than 14 million indigenous people have been cleared from their ancestral lands by conservationists” (p. 36).  

With the support of the Clinton Administration in the 1990s and the Obama Administration today, environmental activists have banned logging in millions of acres in the national forests.  However well-intended those bans, the meticulous study by Holly Fretwell, Who Is Minding the Federal Estate?—“the most important analysis of the effects of environmental activism on rural America to date” in Nickson’s judgment (p. 129)—shows “that everything, everything, we have been doing was wrong” (p. 130).  Wildfires vividly illustrate this, for nothing—neither timber harvesting nor road building—“can compare with the damage that wildfires inflict on” the forests (p. 130).  The fires resulting from environmental policies consume vastly more timber than “evil corporations” could possibly have done, and the devastation inflicted on spruce and pine trees by the pine beetle and budworm could have been controlled by rapid cutting had not environmentalists insisted the bugs be allowed to pursue their destructive ways.  

Nickson admits:  “The title of this book is harsh, particularly when used with regard to environmentalists, whom most people view as virtuous at best, foolish at worst.  But I do not use this term lightly, nor as a banner to grab attention.  My father landed on D-day and, at the end of the war, was put in charge of a Nazi camp and told to ‘sort those people out.’”  He was thus highly sensitive to the fact “that man defaults to tyranny over and over again, and while the tyranny of the environmental movement in rural America has not reached what its own policy documents say is its ultimate goal—radical population reduction—we cannot any longer ignore that goal and its implications” (#132).  And she believes there is in fact an answer:  “The Gordian knot of the countryside mess can be solved with one swift blow of the sword.  Property rights must be restored to the individuals who are willing to work their lives away tending that land.  The people, the individuals and families, in other words, who want it.  Confiscation by government, multinationals, and agents of the megarich—the foundations and land trusts—must be reversed.  Otherwise devastation beckons” (p. 314).  

247 How Liberalism Became Our State Religion

Barack Obama’s 2008 presidential campaign and election clearly appealed to and elicited a strongly religious fervor.  Devotees fainted at his rallies, messianic claims were attached to his agenda, and Obama promised a fundamental “transformation” of America.  Celebrating his election, he grandiosely declared that future generations would see that “this was the moment when the rise of the oceans began to slow and the planet began to heal.”  Consequently, actor Jamie Foxx urged fans to “give an honor to God and our lord and savior Barack Obama.”  MSNBC commentator Chris Matthews enthused:  “This is the New Testament” and “I feel this thrill going up my leg.”  Louis Farrakhan, closely aligned with Jeremiah Wright, Obama’s Chicago pastor, told his Nation of Islam disciples:  “When the Messiah speaks, the youth will hear, and the Messiah is absolutely speaking.”  And there’s even The Gospel According to Apostle Barack by Barbara Thompson.  Though previous presidents—notably John F. Kennedy—elicited something of the same enthusiasm, Obama remains a rarity in America.  But he is not at all unusual when placed against the backdrop of human history, where again and again “charismatic” leaders have claimed and been endowed with supernatural powers.  

Thus there is good reason to seriously ponder Benjamin Wiker’s Worshipping the State:  How Liberalism Became Our State Religion (Washington:  Regnery Publishing, Inc., c. 2013).  He prefaces his treatise with a typically prescient statement by G. K. Chesterton:  “‘It is only by believing in God that we can ever criticize the Government.  Once abolish . . . God, and the Government becomes the God.  That fact is written all across human history . . . .  The truth is that Irreligion is the opium of the people.  Wherever the people do not believe in something beyond the world, they will worship the world.  But, above all, they will worship the strongest thing in the world’” (p. 1).   And inasmuch as the State has (during the past five centuries) gradually expanded its powers, there is a natural tendency to worship it.    

Though secular liberals have frequently touted their “tolerance” and commitment to “diversity,” there is a totalitarian shadow—an irreligious dogmatism—evident in their many current anti-Christian endeavors:  the “war on Christmas” with efforts to enshrine alternatives such as “Winter Solstice;” the cleansing of any Christian content from public school curricula (while simultaneously promoting Islam); the dogmatic support of naturalistic evolution rather than any form of intelligent design in the universities; the removal of crosses and nativity scenes from public lands; the desecration of Christian symbols by “artists” of various sorts; the assault on Christian ethics through court decisions (e.g. Roe v. Wade) and programs such as the abortifacient provisions in Obamacare.  Systematically imposed by the federal courts (following the crucial 1947 Everson v. Board of Education Supreme Court decision), “the federal government has acted as an instrument of secularization, that is, of disestablishing Christianity from American culture, and establishing in its place a different worldview” (p. 11).  

Lest we restrict this process to America, however, we must chart some powerful historical developments in Western Civilization that have been unfolding for half a millennium.  To Wiker, the triumph of Liberalism in these centuries enabled growing numbers of folks to liberate themselves from the “curse” of Christianity and replace the Church with an enlightened and nurturing State.  “The founders of liberalism believed that Christianity was a huge historical mistake, and therefore they reached back again to the pagans for help in loosening the Christian hold on the world, and quite often adopted precisely those things in paganism that Christianity had rejected” (p. 22).  Consequently, “Christians today find themselves in a largely secularized society” quite akin to the ancient world with an easy-going sexual ethos; “it is as if Christianity is being erased from history, and things were being turned back to the cultural status quo of two thousand years ago” (p. 37).  

Christianity, of course, emerged within a pagan world wherein the state (Egyptian pharaohs, the Athenian polis, Imperial Rome) had been routinely idolized.  Following Christ’s wonderful prescription—“render unto Caesar the things that are Caesar’s and to God the things that are God’s”—his followers established the “two cities” approach definitively set forth by St Augustine.  Priests and kings are to preside over different, though not totally isolated, realms.  Clearly delineated in the Bible, “The distinction between church and state, religious and political power, is peculiar to Christianity, and the church invented it” (p. 44).  Of ultimate importance to early Christians was doctrinal Truth, an uncompromising insistence on the singular claims of Christ Jesus, the LORD of His heavenly kingdom.  Christians should not deify the state, and no king should defy God’s Law!  Though routinely blurred in practice and often resembling a wrestling match requiring energetic corrections (such as the Cluniac reforms in the 10th and 11th centuries), the separation of church and state provided the key to much that’s distinctive in Western Civilization by preventing the Church from becoming “a department of the state.”  Prescriptively, in 494 A.D. Pope Gelasius wrote a famous and important letter to the Eastern Emperor Anastasius, insisting on a clear separation between “the sacred authority of the priesthood and the royal power.”  (In the East, by contrast, a “Caesaropapism” developed, reducing the Church to an arm of the Byzantine Empire.)  Thenceforth, uniquely in the West, two realms were established with neither dominating the other.  

During the past 500 years, however, this balance slowly shifted and secular powers have imposed their way on the churches.  Wiker describes it as “the rise of liberalism and the re-paganization of the state.”  Fundamental to this progression was Niccolo Machiavelli, who wrote The Prince in 1513 and “invented the absolute separation of church and state that is the hallmark of liberalism.”  The Church had drawn lines between religious and political powers, “but Machiavelli built the first wall between them.  In fact, his primary purpose in inventing the state was to exclude the church from any cultural, moral, or political power or influence—to render it an entirely harmless subordinate instrument of the political sovereign” (p. 104).  An effective ruler—the strong-armed prince—must ignore religious and moral prescriptions, following a “might-makes-right” formula.  Machiavellian secularism now appears in both the “soft liberalism” designed to satisfy our physical needs and the “hard liberalism” of fascist states.  To Machiavelli, the prince should appease the ignorant masses and “‘appear all mercy, all faith, all honesty, all humanity, all religion’” (p. 110).  Working surreptitiously, however, he should promote a “re-paganized” religion and state-controlled educational system.  “The current belief that the church must be separated from the state and walled off in private impotence—leaving, by its subtraction from the public square, the liberal secular state—all that is Machiavelli’s invention.  The playing out of this principle in our courts today is in keeping with his goal of creating a state liberated from the Christian worldview” (p. 119).  Machiavelli’s moral nihilism also fit nicely with newly-empowered nation-states, which followed the cuius regio, eius religio (“whose realm, his religion”) compromise negotiated in 1555 at the Peace of Augsburg and quickly moved to control the churches.  

England’s King Henry VIII—guided by Thomas Cromwell, who had studied Machiavelli’s teachings at the University of Padua—brutally illustrated this trend by establishing the Church of England.  He and his successors placed themselves directly under God, controlling both church and state.  Henry supervised the publication of the 1539 Great Bible, featuring an engraving of himself handing copies of it to both the Archbishop of Canterbury (Thomas Cranmer) and his Lord Chancellor (Cromwell).  A century later, “England gave the world the immensely influential political philosopher Thomas Hobbes, author of the Leviathan, who constructed an entirely secular foundation for Henry’s church, and therefore gave us the first truly secular established church in a major modern state—more exactly, an absolutist, autocratic version” (p. 122).  To accomplish this, he first reduced all reality to the material realm, subtly denying the non-materiality of both God and the soul and eliminating any objective, absolute moral standard.  In Hobbes’ world, lacking both Natural and Divine Law, good and evil are mere labels attached to feelings that either please or displease us.  Thus, Hobbes famously said, in our natural state we are at war with everyone and, consequently, our lives are “‘solitary, poor, nasty, brutish, and short’” (p. 130).  To corral our nastiness, a Leviathan—an all-powerful Government—must rule.  We need an absolute Sovereign to grant and protect our “rights.”  As with Machiavelli, Hobbes knew the masses needed religion, and he simply insisted the Sovereign had the right to prescribe and enforce it through the Church of England.  His “church is entirely a state church, completely under the king’s power” (p. 134).  

Liberalism, similarly, insists the Church must accommodate the state, and “Liberal Christianity is the form that the established religion of the state takes—perhaps not its final form, but its most visible, obvious form” (p. 120).  To accomplish this, liberal thinkers during the Age of Reason determined to destroy the authority of Scripture, and the “demotion of the Bible from revealed truth to mere myth is the result” (p. 58).  To attain this end Benedict Spinoza marshaled his formidable intellect, garnering credit for being the “father” of both “modern liberal democracy” and “modern Scripture scholarship.”  More blatantly materialistic than either Machiavelli or Hobbes (declaring God is Nature), he adumbrated a might-makes-right political philosophy that flowered in Hegel, “who declared that the Prussian state was the fullest manifestation of the immanentized ‘spirit’ of God” (p. 145).  In a state thus deified, of course, Scripture must be displaced, so Spinoza simply denied any supernatural dimension to the written Word.  By definition, miracles—especially miracles such as the Incarnation and Resurrection—cannot occur, so “higher critics” cavalierly dismissed all such accounts.  To the extent the Bible has merit, its message was reduced “to one simple platitude:  ‘obedience toward God consists only in love of neighbor’” (p. 154).  

Within the next two centuries this same secularizing process wormed its way into the churches.  As a result of the “higher criticism” launched by Spinoza, a “secularizing approach to Scripture was deeply entrenched among the intelligentsia [such as David Friedrich Strauss, a disciple of Hegel, who wrote The Life of Jesus Critically Examined] and had made great headway in European universities.  The aim was ‘de-mythologizing,’ removing from the Biblical text (just as Spinoza had dictated) all of the miracles, and hence undermining all the doctrinal claims related to Christ’s divinity, so that readers were left with, at best, Jesus the very admirable moral man who was misunderstood to be divine by his wishful disciples.  Christianity—so the Scripture scholarship seemed to establish—was built upon a case of mistaken identity.  But the moral core could be salvaged” (p. 240).  Still more:  through the evolutionary processes (both natural and societal) we humans can deify ourselves!  We should worship Man rather than God, the creature rather than the Creator!  

Sharing Spinoza’s intolerance for intolerance, John Locke proposed a softer (“classic”) form of liberalism, though he fully supported its secular essence and proposed a “reasonable” rather than traditionally orthodox form of Christianity.  Eschewing doctrine to emphasize morality, Locke promoted a “mild Deism” that proved quite influential in 18th century England and America.  Personally pious—and the author of many biblical commentaries especially popular in America—Locke was (many thought) sufficiently “Christian” to be embraced philosophically.  Concerned to preserve permanent moral standards, he espoused a version of the Natural Law, but he displaced the Church as a mediating institution and left the individual “facing the state alone” (p. 228).  A privatized religious faith is fine, he thought, so long as it makes no claims to shape public policies.  And his “classical” liberalism was (notably in post-WWII America) progressively folded into the more radical forms attuned to Hobbes and Spinoza. 

In today’s churches, Wiker laments, Spinoza’s “materialistic mindset has increasingly taken hold, and the church has become correspondingly anemic.  The church thus weakened by unbelief in the supernatural is what we call the mainline or liberal Christian church.  That church has total faith in materialist science, fully embraces the ‘scientific’ study of Scripture fathered by Spinoza, and professes a completely de-supernaturalized form of Christianity that is entirely at home in this world and only vaguely and non-threateningly associated with the next” (p. 153).  Thus we are confronted, as H. Richard Niebuhr famously said, with theologians teaching that:  “‘A God without wrath brought men without sin into a kingdom without judgment through the ministrations of a Christ without a cross’” (p. 153).  With this Spinoza would be pleased!  “To sum up Spinoza’s kind of Christianity:  You don’t need the Nicene Creed if you’re nice.  People who fight over inconsequential dogmas are not nice.  They’re intolerant” (p. 155).  

Furthering the “liberal” agenda of the European Enlightenment, Jean-Jacques Rousseau envisioned a secular “civil religion” (outlined in his Social Contract) replacing Christianity.  His agenda was implemented by men such as Robespierre (“radical liberals”) in the French Revolution and still exerts enormous influence in our world.  Rousseau propounded his own purely naturalistic version of Genesis, imagining how things were in a pure “state of nature.”  Noble savages, free from the constraints of Judeo-Christian morality, followed their passions and enjoyed the good life.  All were equal and none permanently possessed anything.  To regain that lost estate, a “liberal state” is needed—one that “does not define law in terms of the promotion of virtue and the prohibition of vice, but in terms of the protection and promotion of individuals’ private pleasures[, which]—since all such pleasures are natural—are declared to be rights.  Any limitation of these ‘rights’ is considered unjust; that is, justice is redefined to mean everyone getting as much of whatever he or she wants as long as he or she doesn’t infringe on anyone else’s pursuit of pleasure” (p. 172).  

Having carefully explained the views of secular liberalism’s architects, Wiker shows how Leftists of various sorts implemented it in the centuries following the French Revolution, for “as the first attempt to incarnate the new liberal political order in a great state, the French Revolution is iconic for liberalism” (p. 200).  Importantly, a purely naturalistic worldview must be crafted and imposed.  We must be persuaded that “we live in a purposeless universe, so that each person has just as much right as anyone else to pursue his or her arbitrarily defined goals or ends” (p. 187).  Liberals triumphantly cite the Darwinian doctrine of biological evolution to prove “that the development of life is driven by entirely random, material processes,” that man “is an accident,” and that we are not made in God’s image but the product of a “meandering and mindless” natural process (p. 194).  Each person freely fabricates and follows whatever moral standards he desires, relaxing into a hedonistic utilitarianism calculated to enjoy the greatest good for the greatest number.  In effect, this has led to a resurgence of a pagan ethos at ease with abortion, euthanasia, promiscuity, sodomy and pedophilia.  

To accomplish this, liberals determined to deprive the Christian religion of any real power.  In late 19th century France this became clear as officials swept away crucifixes and saints’ statues in public places, outlawed religious processions, closed religious schools, and renamed city streets after Enlightenment heroes rather than saints.  More importantly, they seized control of the educational system, making it an agency of the state.  Secularists in America sought the same ends.  To Wiker:  “One cannot overestimate how significant it was in France (and is in America) for liberals to have gained complete state control of education, and for that education to be mandatory.”  This precipitated “a top-down revolution wherein a relatively small minority may impose its worldview upon the entire population using state power.  And the education establishment in our own country, as was the case in France, is dominated by radicals and socialists from the Left, from the universities right down to the elementary schools” (p. 216).  

Thus Liberalism came to America’s shores, first in the form of Locke and later under the auspices of “higher critics” and socialists of various hues.  In a sense, Wiker argues:  “America had a kind of Jacobin class bubbling away underneath its Protestant surface, plotting its own version of a radical cultural revolution” (p. 263).  Thomas Paine, one of the more influential publicists during the American Revolution, represents this phenomenon.  The author of Common Sense, which promoted independence from England, he also wrote The Age of Reason, promoting Deism and anti-Christian prejudices.  Thomas Jefferson avidly embraced both Locke (e.g. The Declaration of Independence) and Paine (e.g. The Life and Morals of Jesus of Nazareth), laying the groundwork for the famous “wall of separation between church and state” in a letter he sent to the Danbury Baptist Association in 1802.  Not until after WWII, however, did the Supreme Court enshrine this Jeffersonian comment as a reason to exclude religion from the public square.  Though Jefferson represented only a small minority of America’s Founders, his anti-Christian secularism slowly spread through the body politic as the decades passed.  

To a degree this Jeffersonian secularism prevailed in America, Alexis de Tocqueville said, because on a practical level 19th century Americans were notably materialistic—seeking comfort and prosperity without compunction.  They were thus quite “inclined to follow Locke, both in theory and in practice, and hence already well on our way to allowing the soul and heaven to fade away.  Christianity was often quite fervent in America, but it was subtly reconstructed to be compatible with passionately this-worldly material pursuits.  It was not a Christianity that could produce martyrs or even severe judges of the fallen secular order” (p. 268).  By the end of the century, then, the nation was unfortunately vulnerable to the radicals at the universities who determined to transform it.  Scores of young scholars, following the Civil War, sailed to Europe (especially to German universities) and returned with Spinoza and Rousseau, Darwin and Spencer, Strauss and Marx, entrenched in their minds.  They then either established or controlled the nation’s preeminent universities, which (given the largess of state and federal governments) began to shape the cultural life of America.  New academic disciplines—including sociology and psychology—insisted that trained “experts” do for the people what they could not do on their own.  And the newly-minted law schools, personified by Oliver Wendell Holmes, systematically sought to impose a secular agenda on the land.  Progressive politicians, including Theodore Roosevelt, Woodrow Wilson, and Franklin Roosevelt, heeded their admonitions and implemented their goals.  

The time has come, Wiker argues, to disestablish the secular humanism now ruling America under the guise of “progressivism.”  Like scores of other political ideologies, it’s clearly a religion (ironically, an unbelief established as a belief) with its own dogmas regarding creation, man’s nature and purpose in life, sin and salvation, good and evil, right and wrong, church and state, death, immortality and life everlasting.  “Liberalism once appeared to be about freeing everyone, believers and non-believers alike, from government-imposed religion and morality, but it has shown that that was just a ruse for establishing its own particular worldview, one that is fundamentally antagonistic to Christianity” (p. 312).  To mount the counterrevolution, believers must target the nation’s universities—primarily by teaching truthful history—first stemming and then reversing the currents of liberal orthodoxy.  

# # #

246 Christ the King

For most of his life N.T. Wright—one of the world’s most distinguished biblical scholars as well as an active churchman and bishop in the Church of England—has pondered various questions regarding Jesus and His people.  In Simply Jesus:  A New Vision of Who He Was, What He Did, and Why He Matters (New York:  HarperOne, c. 2011), he sets forth some definitive answers to those questions.  “This book is written,” he declares, “in the belief that the question of Jesus—who he really was, what he really did, what it means, and why it matters—remains hugely important in every area, not only in personal life, but also in political life, not only in ‘religion’ or ‘spirituality,’ but also in such spheres of human endeavor as worldview, culture, justice, beauty, ecology, friendship, scholarship, and sex” (p. 5).  He also endeavors to move beyond the “conservative vs. liberal,” or “personal salvation vs. social gospel,” divisions by subsuming them all beneath his thesis regarding the neglected Truth declaring Christ’s Kingship.  

To find who Jesus really was requires serious historical research, seeking to understand His milieu rather than re-shaping Him to fit ours.  First, that means using the proper sources—primarily the four canonical gospels.  It also means understanding the ancient milieu within which they were written, when a powerful religious movement (labeled a “philosophy” by the Jewish historian Josephus) reflected an expectation of the coming Messiah and insisted “that it was time for God alone to be king” (p. 41).  Over the centuries Israel’s prophets, reflecting on crucial events such as the Exodus and Exile (cf. Ezekiel 34), had envisioned a time when God would fully manifest his royal authority on earth as well as in heaven, working through purified hearts rather than foods and rituals.  Israel’s poets (cf. Psalm 2) expected that YHWH, working through His anointed Son, would “establish his own rule over the rest of the world from his throne in Zion” (p. 50).  Jesus, drawing on such passages from the Psalms and Isaiah, portrayed Himself as the “suffering servant” expected by some first century Jews, but “Nobody, so far as we know, had dreamed of combining these ideas in this way before.  Nor had anyone suggested that when the prophet spoke of ‘the arm of YHWH’ (53:1)—YHWH himself rolling up his sleeves, as it were, to come to the rescue—this personification might actually refer to the same person, to the wounded and bleeding servant” (p. 173).  

That is precisely what happened in Jesus, His disciples insisted!  “Within a few years of his death, the first followers of Jesus of Nazareth were speaking and writing about him, and indeed singing about him, not just as a great teacher and healer, not just as a great spiritual leader and holy man, but as a strange combination:  both the Davidic king and the returning God.  He was, they said, the anointed one, the one who had been empowered and equipped by God’s Spirit to do all kinds of things, which somehow meant that he was God’s anointed, the Messiah, the coming king.  He was the one who had been exalted after his suffering and now occupied the throne beside the throne of God himself” (p. 54).  God’s plan was fulfilled, Luke declared, when Jesus ascended the Cross rather than a throne—“or, rather, as all four gospel writers insist, a cross that is to be seen as a throne.  This, they all say, is how Jesus is enthroned as ‘King of the Jews.’  Jesus’ vocation to be Israel’s Messiah and his vocation to suffer and die belong intimately together” (p. 173).  Jesus and His disciples saw the Cross as “the shocking answer to the prayer that God’s kingdom would come on earth as in heaven” (p. 185).  

Consequently, God Himself is in charge of His Kingdom, ruling through his Son Christ Jesus.  “It was this new world in which God was in charge at last, on earth as in heaven.  God was fixing things, mending things, mending people, making new life happen.  This was the new world in which the promises were coming true, in which new creation was happening, in which a real ‘return from exile’ was taking place in the hearts and minds and lives both of notorious sinners and of people long crippled by disease” (p. 91).  Inevitably this provoked animosity from the principalities and powers determined to replace YHWH!  As is revealed in Jesus’ wilderness temptations, the LORD battles Satan and his earthly satraps—a battle finished on the Cross, where Jesus forever defeated the powers of darkness.  

That Christ is King explains the frequent NT references to Jesus forgiving sins and replacing the Temple (where sins were normally forgiven).  The Temple was YHWH’s dwelling, the sacred site where His Shekinah glory declared His presence.  It was literally the center of the world “where heaven and earth met” (p. 132).  When Jesus dramatically cleansed the Temple He “was staking an implicitly royal claim:  it was kings, real or aspiring, who had authority over the Temple” (p. 127).  By this action Jesus declared “that the Temple was under God’s judgment and would, before too long, be destroyed forever” (p. 129).  Indeed, He Himself would become the Temple!  Still more:  He became the new Sabbath and Jubilee!  Time and space are transformed in the new creation wherein He now rules.  

Emblematic of the new creation is the Passover meal Jesus celebrated with His disciples.  It was a traditional Jewish ceremony, but Jesus made it radically new.  “Like everything else Jesus did,” Wright says, “he filled the old vessels so full that they overflowed.  He transformed the old mosaics into a new, three-dimensional design.  Instead of Passover pointing backward to the great sacrifice by which God had rescued his people from slavery in Egypt, this meal pointed forward to the great sacrifice by which God was to rescue his people from their ultimate slavery, from death itself and all that contributed to it (evil, corruption, and sin).  This would be the real Exodus, the real ‘return from exile.’  This would be the establishment of the ‘new covenant’ spoken of by Jeremiah (31:31).  This would be the means by which ‘sins would be forgiven’—in other words, the means by which God would deal with the sin that had caused Israel’s exile and shame and, beyond that, the sin because of which the whole world was under the power of death.  This would be the great jubilee moment, completing the achievement outlined in Nazareth” (p. 180).  

“The gifts of bread and wine,” Wright continues, “already heavy with symbolic meaning, acquire a new density:  this is how the presence of Jesus is to be known among his followers.  Sacrifice and presence.  This is the new Temple, this strange gathering around a quasi-Passover table.  Think through the Exodus themes once more.  The tyrant is to be defeated:  not Rome, now, but the dark power that stands behind that great, cruel empire.  God’s people are to be liberated:  not Israel as it stands, with its corrupt, money-hungry leaders and its people bent on violence, but the reconstituted Israel for whom the Twelve are the founding symbol” (p. 180).  The Last Supper, of course, set the stage for Jesus’ Crucifixion and Resurrection; thereafter his followers—His reconstituted Israel—quickly spread around the world declaring “Jesus is Lord, and He is risen.”  The Risen Lord unveiled “the beginning of the new world that Israel’s God had always intended to make” (p. 191), and in His post-Resurrection appearances He materialized as “a person who is equally at home ‘on earth’ and ‘in heaven’” (p. 192).  After 40 days, He ascended into heaven.  But His heaven permeates the earth—Jesus is in “heaven” but He is everywhere present on earth as well.  “If Easter is about Jesus as the prototype of the new creation, his ascension is about his enthronement as the one who is now in charge.  Easter tells us that Jesus is himself the first part of new creation; his ascension tells us that he is now running it” (p. 195).  

And in time He will fully assert His rule.  He’s coming again!  “Heaven is God’s space, God’s dimension of present reality, so that to think of Jesus ‘returning’ is actually, as both Paul and John say in the passages just quoted, to think of him presently invisible, but one day reappearing” (p. 202).  The new world envisioned in Romans 8 and Revelation 21-22 will be a place under Christ’s control, “administering God’s just, wise, and healing rule” (p. 202).  “The second coming is all about Jesus as the coming Lord and judge who will transform the entire creation.  And, in between resurrection and ascension, on the one hand, and the second coming, on the other, Jesus is the one who sends the holy Spirit, his own Spirit, into the lives of his followers, so that he himself is powerfully present with them and in them, guiding them, directing them, and above all enabling them to bear witness to him as the world’s true Lord and work to make that sovereign rule a reality” (p. 203).  

We Christians (His Christ-bearers, His followers) are assigned a vital role in the Kingdom, for God ever intended to rule earth through human beings.  Jesus redeemed us on the Cross in order for us to join Him, ruling the world in accord with His design.  “In God’s kingdom, humans get to reflect God at last into the world, in the way they were meant to.  They become more fully what humans were meant to be.  That is how God becomes king.  That is how Jesus goes to work in the present time.  Exactly as he always did” (p. 213).  And He established His Church (His Body), wherein we work to accomplish His ends.  Our work (as concisely outlined in the Beatitudes) is to bear witness to His way in His world.  

* * * * * * * * * * * * * * * * * * * *

In How God Became King:  Getting to the Heart of the Gospels (New York:  HarperOne, c. 2012) Tom (a.k.a. N.T.) Wright continues to develop the thesis earlier enunciated in Simply Jesus.  He thinks we have lost touch with the canonical gospels, using them as props or tools to further our own agendas rather than as sources demanding our prayerful attention and implementation.  He acknowledges that for 20 centuries numerous scholars have devoted much time and intellectual firepower to the task of understanding them, but he thinks they have, by and large, failed to rightly discern and declare their real message.  (There is, of course, more than a little hubris in any declaration such as Wright’s that he alone has at last found The Truth—but that is something of a scholarly virus, an occupational hazard, frequently found in brilliant folks such as he.)  

In Wright’s reading of Church history, orthodox theologians and preachers have (rather narrowly following St Paul or Luther or Calvin) reduced “the gospel” to the historic creeds—i.e. Apostles’ and Nicene—and neglected if not totally bypassed the Gospels.  “The great creeds, when they refer to Jesus, pass directly from his virgin birth to his suffering and death,” whereas the four Gospel writers “tell us a great deal about what Jesus did between the time of his birth and the time of his death.  In particular, they tell us about what we might call his kingdom-inaugurating work:  the deeds and words that declared that God’s kingdom was coming then and there, in some sense or other, on earth as in heaven.  They tell us a great deal about that; but the great creeds don’t” (p. 11).  “The gospels were all about God becoming king, but the creeds are focused on Jesus being God” (p. 20).  The creeds are not wrong, Wright insists, in what they affirm!  But when the Faith is reduced to creedal verities the Jesus Message gets lost.  

The Message got lost early on as misinterpretations came to dominate the Church!  For 1500 years or so Christians have seemed to ignore “the Jewish context of Jesus’ public career” in “theological or pastoral reflection,” and He became “founder” of the faith, “with the type of Christianity varying according to the predilections of the preacher or teacher” (p. 110).  Three centuries ago Enlightenment thinkers, reviving the hedonistic materialism of Epicurus and Lucretius and heeding biblical critics such as H.S. Reimarus and Baruch Spinoza, embarked on a quest for the “historical Jesus” that refused to see Him as the Gospels reveal Him.  As variously portrayed by multitudes of liberal professors and preachers, poets and songsters, Jesus appears as “a revolutionary, hoping to overthrow the Romans by military violence and establish a new Jewish state.  Or he’s a wild-eyed apocalyptic visionary, expecting the end of the world.  Or he’s a mild-mannered teacher of sweet reasonableness, of the fatherhood of God and the brotherhood of ‘man.’  Or perhaps he’s some combination of the above” (p. 26).  Indeed, according to Rudolf Bultmann and his 20th century epigones, details regarding His life have no bearing on much of anything, for the Gospels (in their view) are not bona fide biographies conveying truthful details.  They were fanciful projections, written long after the events described, of an evolving community looking for illustrations to justify their “faith.”  

As a result of skeptical scholarship, “there seems a ready market right across the Western world for books that say that Jesus was just a good Jewish boy who would have been horrified to see a ‘church’ set up in his name, who didn’t think of himself as ‘God’ or even the ‘Son of God’, and who had no intention of dying for anyone’s sins—the church has got it all wrong” (p. 27).  To Wright, such a reading of the Gospels clearly ignores their obvious content.  Markedly deistic, Enlightenment thinkers wanted nothing to do with a God who intervenes on earth, much less actually rules anything.  They rejected both earthly kings and the heavenly King come to earth in Jesus.  “But the whole point of the gospels is to tell the story of how God became king, on earth as in heaven.  They were written to stake the very specific claim towards which the eighteenth-century movements of philosophy and culture, and particularly politics, were reacting with such hostility” (p. 34).  The Deism promoted by Voltaire and Thomas Paine removed God from His world.  “The divine right of kings went out with the guillotine, and the new slogan vox populi vox Dei (‘The voice of the people is the voice of God’) was truncated; God was away with the fairies doing his own thing, and vox pop, by itself, was all that was now needed” (p. 164).  

It’s now time to escape the intellectual shackles of the eighteenth century!  It’s time to read the Gospels with 20-20 vision, taking them as trustworthy sources regarding who Jesus was and what He declared.  For, Wright incessantly repeats, they give us a largely forgotten narrative, “the story of how Israel’s God became king” (p. 38).  As the Messiah—a Jewish Messiah fulfilling the Hebrew Scriptures—Jesus came not so much to provide a pathway to an ethereal heaven removed from the earth as to establish an outpost of heaven on earth.  “Jesus was announcing that a whole new world was being born and he was ‘teaching’ people how to live within that whole new world” (p. 47).  To rightly hear the Gospel we must turn down the volume of skeptical scholars and moralistic reformers and hear the annunciation of Jesus “as the climax of the story of Israel” (p. 65).  As Matthew insists, Jesus consummates the history of Israel initiated by father Abraham and “will save his people from their sins” (Mt 1:21).  But Jesus came to save more than the children of Israel, and “the reason Israel’s story matters is that the creator of the world has chosen and called Israel to be the people through whom he will redeem the world.  The call of Abraham is the answer to the sin of Adam.  Israel’s story is thus the microcosm and beating heart of the world’s story, but also its ultimate saving energy.  What God does for Israel is what God is doing in relation to the whole world.  That is what it meant to be Israel, to be the people who, for better and worse, carried the destiny of the world on their shoulders.  Grasp that, and you have a pathway into the heart of the New Testament” (p. 73).  

Mark’s gospel begins with Jesus’ baptism, where He is anointed with the Spirit and declared God’s Son by the Father.  Thenceforth He selected His 12 disciples, symbolizing the 12 tribes of Israel, and did dramatic things—such as calming the storm on the Sea of Galilee—illustrating that in Him God was rescuing His people.  Toward the end of Mark’s Gospel, we encounter a Roman centurion who declared the crucified Christ truly the Son of God.  Given his background, we assume the centurion didn’t fully understand what transpired on Golgotha.  “For him, the phrase ‘God’s son’ would normally have meant one person and one person only:  Tiberius Caesar, son of the ‘divine’ Augustus” (p. 94).  The centurion tacitly recognized a larger truth:  in Jesus God regained His rightful throne as earth’s real Ruler.  

This too John makes clear in the Prologue to his Gospel, where he “takes us back to the first books of the Bible, to Genesis and Exodus.  He frames his simple, profound opening statement with echoes of the creation story (‘In the beginning . . .’, leading up to the creation of humans in God’s image) and echoes of the climax of the Exodus (‘The Word became flesh, and lived among us,’ 1.14, where the word ‘lived’ is literally ‘tabernacled’, ‘pitched his tent’, as in the construction of the tabernacle for God’s glory in the wilderness).  This, in other words, is where Israel’s history and with it world history reached their moment of destiny” (p. 77).  John’s Jesus “is a combination of the living Word of the Old Testament, the Shekinah of Jewish hope (God’s tabernacling presence in the Temple), and ‘wisdom’, which in some key Jewish writings was the personal self-expression of the creator God, coming to dwell with humans and particularly with Israel (see Wisdom 7; Sirach 24)” (p. 103).  

Climaxing his story with Jesus on the Cross, John portrays Him as “enthroned,” truly the King of Kings.  It was a new kind of Kingdom, one of Love and Truth rather than Power, as He explained to Pilate, and the Roman Procurator acted more presciently than he imagined when he had a sign (a typical public notice called a titulus) in Hebrew, Greek, and Latin—“JESUS OF NAZARETH, THE KING OF THE JEWS”—affixed to the Cross.  “The cross in John, which we already know to be the fullest unveiling of God’s, and Jesus’, love (13:1), is also the moment when God takes his power and reigns over Caesar” (p. 146).  Cross and Kingdom, like hand and glove, go together.  “Jesus, John is saying, is the true king whose kingdom comes in a totally unexpected fashion, folly to the Roman governor and a scandal to the Jewish leaders” (p. 220).  “Part of John’s meaning of the cross, then, is that it is not only what happens, purely pragmatically, when God’s kingdom challenges Caesar’s kingdom.  It is also what has to happen if God’s kingdom, which makes its way (as Jesus insists) by non-violence rather than by violence, is to win the day.  This is the ‘truth’ to which Jesus has come to bear witness, the ‘truth’ for which Pilate’s world-view has no possible space (18:38)” (p. 230).  

Following the Crucifixion, of course, we read of the Resurrection and Ascension, fully affirming Jesus’ mission.  “It is the resurrection that declares that the cross was a victory, not a defeat.  It therefore announces that God has indeed become king on earth as in heaven” (p. 246).  Then comes Pentecost, when the Spirit fully enters Jesus’ disciples, enabling them to “be for the world what Jesus was for Israel” (p. 119).  And just as Jesus battled satanic powers and tackled worldly tyrants, so too His followers continue that work.  The clash of kingdoms foreseen by Daniel and Isaiah and dramatically evident in the life of Jesus continues today.  As with Pilate, the paramount issue is Truth, to which Jesus and His followers bear witness.  This “truth is what happens when humans use words to reflect God’s wise ordering of the world and so shine light into its dark corners, bringing judgment and mercy where it is badly needed” (p. 145).  

Jewish prophets predicted the Messiah would inaugurate a theocracy—the righteous reign of God, ruling through human beings, stewards of His creation.  “Those who are put right with God through the cross are to be putting-right people for the world” (p. 244).  In the Temple—“the fulcrum of ancient Jewish theocracy”—priests and kings had joined to do God’s work in His world, with priests leading worship and kings establishing justice.  Jesus Himself is the new temple, which, “like the wilderness tabernacle, is a temple on the move, as Jesus’ people go out, in the energy of the spirit, to be the dwelling of God in each place, to anticipate that eventual promise by their common and cross-shaped life and work” (p. 239).  

245 Refuting Relativism

While the recently installed Pope Francis urges empathy with the poor, he also laments the spiritual poverty of those in bondage to what Benedict XVI called “the tyranny of relativism.”  He certainly follows St. Francis of Assisi, urging us to be peacemakers—“But there is no true peace without truth!  There cannot be true peace if everyone is his own criterion, if everyone can always claim exclusively his own rights, without at the same time caring for the good of others, of everyone, on the basis of the nature that unites every human being on this earth.”  His papal predecessor, Benedict XVI, had warned:  “We are building a dictatorship of relativism that does not recognize anything as definitive and whose ultimate goal consists solely of one’s own ego and desires.”  While acknowledging that fanatics too easily assert their confidence in various “truths,” Benedict insisted we should not cease discerning and proclaiming with certainty self-evident and trustworthy insights and convictions.  “That is why,” he said, “we must have the courage to dare to say:  ‘Yes, man must seek the truth; he is capable of truth.’”  

Benedict’s admonitions would not have surprised Allan Bloom, who in 1987 wrote The Closing of the American Mind as “a meditation on the state of our souls, particularly those of the young, and their education” (p. 19).  Youngsters need teachers to serve as midwives—above all helping them deal with “the question, ‘What is man?’ in relation to his highest aspirations as opposed to his low and common needs” (p. 21).  University students, Bloom said, were “pleasant, friendly and, if not great-souled, at least not particularly mean-spirited.  Their primary preoccupation is themselves, understood in the narrowest sense” (p. 83)—absorbed in personal feelings and frustrations.  Not “what is man” but “who am I” is the question!  They illustrate “the truth of Tocqueville’s dictum that ‘in democratic societies, each citizen is habitually busy with the contemplation of a very petty object, which is himself’” (p. 86).  

This preoccupation with self-discovery and self-esteem, Bloom believed, flowers easily in today’s relativism, a philosophical dogma espoused by virtually everyone coming to or prowling about the university.  Under the flag of “openness” and “tolerance,” no “truths” are acknowledged and everyone freely follows his own feelings.  So even the brightest of our young people know little about history, literature, or theology, for such knowledge resides in books, which remain largely unread, even in the universities.  Minds shaped by films, rock music and television have little depth, and “the failure to read good books both enfeebles the vision and strengthens our most fatal tendency—the belief that the here and now is all there is” (p. 64).  Deepening his analysis in a section titled “Nihilism, American Style,” Bloom traced the philosophical roots of today’s educational malaise preeminently to Nietzsche, Freud, and Heidegger.  An enormous intellectual earthquake has shaken our culture to its foundations.  It is “the most important and most astonishing phenomenon of our time,” the “attempt to get ‘beyond good and evil’” by substituting “value relativism” for Judeo-Christian absolutism (p. 141).  

* * * * * * * * * * * * * * * * * * * * * * * * *

What concerned Bloom and the popes at the turn of the millennium was perceptively examined half a century earlier by C.S. Lewis in one of his finest books, The Abolition of Man (New York:  Macmillan, 1947).  First presented during WWII as a series of lectures, the book begins with a careful examination of an elementary English textbook Lewis dubbed The Green Book.  While allegedly designed to help students read literature, the text was inadvertently a philosophical justification for relativism, promoting the notion that all values, whether aesthetic or ethical, are subjective and ultimately indefensible.  However, Lewis said:  “Until quite modern times all teachers and even all men believed the universe to be such that certain emotional reactions on our part could be either congruous or incongruous to it—believed, in fact, that objects did not merely receive, but could merit, our approval or disapproval, our reverence or our contempt.  The reason why Coleridge agreed with the tourist who called the cataract sublime and disagreed with the one who called it pretty was of course that he believed inanimate nature to be such that certain responses could be more ‘just’ or ‘ordinate’ or ‘appropriate’ to it than others.  And he believed (correctly) that the tourists thought the same.  The man who called the cataract sublime was not intending simply to describe his own emotions about it:  he was also claiming that the object was one which merited those emotions” (#148 in Kindle).  

Coleridge and others who believed in objective Truth (and truths) generally appealed to what Chinese thinkers referred to as “the Tao.  It is the reality beyond all predicates, the abyss that was before the Creator Himself.  It is Nature, it is the Way, the Road.  It is the Way in which the universe goes on, the Way in which things everlastingly emerge, stilly and tranquilly, into space and time.  It is also the Way which every man should tread in imitation of that cosmic and supercosmic progression, conforming all activities to that great exemplar” (#107).  We instantly recognize—through theoretical reason—certain laws of thought (e.g. the law of non-contradiction) or geometry (e.g. a line is the shortest distance between two points); we also know—through practical reason—certain permanent moral maxims (e.g. murder is wrong).  Any effort to reduce universal values to personal feelings inevitably founders in nihilistic confusion.  

In truth:  “All the practical principles behind the Innovator’s case for posterity, or society, or the species, are there from time immemorial in the Tao.  But they are nowhere else.  Unless you accept these without question as being to the world of action what axioms are to the world of theory, you can have no practical principles whatever” (#358).  “The human mind has no more power of inventing a new value than of imagining a new primary colour, or, indeed, of creating a new sun and a new sky for it to move in” (#398).  By disregarding the Tao, advocates of any new morality sink into a “void” without a  pattern to follow, a nihilistic abyss promoting “the abolition of Man” (#556).  

* * * * * * * * * * * * * * * * * * * * * * * * 

In The Book of Absolutes:  A Critique of Relativism and a Defense of Universals (Montreal:  McGill-Queen’s University Press, c. 2008), William D. Gairdner updates and amplifies an ancient and perennial proposition.  A distinguished Canadian Olympic athlete with degrees from Stanford University, Gairdner has effectively influenced the resurgence of conservatism in his native land.  Though he acknowledges the present power and pervasiveness of relativism, he finds it “a confused and false conception of reality that produces a great deal of unnecessary anxiety and uncertainty, both for individuals and for society as a whole” (#71).  To rectify this problem he wrote “a book to restore human confidence by presenting the truth about the permanent things of this world and of human existence” (#74).  

The current notion—that truth varies from person to person (or group to group), that all perspectives must be tolerated, that moral judgments must be suspended—has an ancient history which Gairdner explores, noting that earlier generations generally judged it “preposterous.  The ancient Greeks actually coined the word idiotes (one we now apply to crazy people) to describe anyone who insisted on seeing the world in a purely personal and private way” (#88).  There were, of course, Greek Sophists such as Protagoras who declared:  “As each thing appears to me, so it is for me, and as it appears to you, so it is for you.”  But their views withered under the relentless refutations of Socrates, Plato, and Aristotle—all defending objective truth and perennial principles—followed by Medieval philosophers such as Thomas Aquinas and Enlightenment scientists such as Isaac Newton.  

Dissenting from the traditional, absolutist position were thinkers such as Thomas Hobbes, who rejected any rooting of moral principles in a higher law, declaring in The Leviathan that we label “good” whatever pleases us.  Indeed, the words good and evil “are ever used with relation to the person that useth them:  there being nothing simply and absolutely so.”  A century later Hobbes’ subjectivism would be enshrined by a thinker markedly different from him in many respects, Immanuel Kant, “the most coolly influential modern philosopher to have pushed us toward all sorts of relativist conclusions” (p. 14).  Building on Kant’s position, Friedrich Nietzsche formulated the relativist slogan:  “there are no facts, only interpretations.”  American pragmatists and cultural anthropologists, European existentialists and deconstructionists took up the catchphrase, and today we live in a postmodern culture deeply shaped by epistemological skepticism and moral relativism.  

Sadly, this intellectual shift was bolstered by a profound popular misunderstanding and misrepresentation of one of the monumental scientific discoveries of all time, Einstein’s “Special Theory of Relativity.”  As historian Paul Johnson explains, “‘the belief began to circulate, for the first time at a popular level, that there were no longer any absolutes:   of time and space, of good and evil, of knowledge, and above all of value.  Mistakenly but perhaps inevitably, relativity became confused with relativism.’  And, he adds, ‘no one was more distressed than Einstein by this public misapprehension’” (p. 17).  Nevertheless, it served as a scalpel that helped “‘cut society adrift from its traditional moorings in the faith and morals of Judeo-Christian culture’” (p. 18).   

After explaining various forms of relativism—noting that its moral and cultural forms are most prevalent and pernicious—Gairdner registers some objections to it.  It is, importantly, “self-refuting,” basing its entire case upon the absolute assertion that there are no absolutes.  Thus it cannot withstand Aristotle’s powerful argument, set forth in his Metaphysics, showing how it violates the law of non-contradiction.  That various persons or cultures claim different “truths” hardly affects the fact that a great many beliefs are manifestly wrong, whereas others (e.g. the earth is spherical) are demonstrably right.  The fact that some groups of people (“sick societies”) have condoned human sacrifice or infanticide hardly justifies these practices.  Admittedly, some truths—both scientific (a heliocentric solar system) and moral (slavery is wrong)—become clear only after considerable time or laborious investigation, but that only strengthens their certainty.  Thus:  “Neither believing nor doing makes a thing right or wrong” (p. 39).  

Challenging relativism, Gairdner invokes scholars such as Donald E. Brown, a professor of anthropology, whose 1991 publication, Human Universals, effectively refutes many sophomoric relativistic mantras by listing more than 300 human universals.  For example:  “All humans use language as their principal medium of communication, and all human languages have the same underlying architecture, built of the same basic units and arranged according to implicit rules of grammar and other common features.  All people classify themselves in terms of status, class, roles, and kinship, and all practice division of labour by age and gender.  . . . .  All have poetry, figurative speech, symbolic speech, ceremonial speech, metaphor, and the like.  All use logic, reckon time, distinguish past, present, and future, think in causal terms, recognize the concept and possibility of cheating and lying, and strive to protect themselves from the same” (p. 64).  Everywhere and at all times we humans have acknowledged such universal realities.  

There are indubitable, demonstrable constants (laws) throughout the natural world—notably the law of gravity and Einstein’s famous theorem, E=mc².  Material things, moving through space, continually change; but laws remain the same.  As David Berlinski puts it:  “‘the laws of nature by which nature is explained are not themselves a part of nature.  No physical theory predicts their existence nor explains their power.  They exist beyond space and time, they gain purchase by an act of the imagination and not observation, they are the tantalizing traces in matter of an intelligence that has so far hidden itself in symbols’” (p. 76).  Still more, says Berlinski:  “‘We are acquainted with gravity through its effects; we understand gravity by means of its mathematical form.  Beyond this, we understand nothing’” (p. 78).  

Analogously, as human beings we are, by nature, “hardwired” with important “universals.”  The “blank-slate” notion, promulgated by empiricists such as John Locke and B.F. Skinner, cannot withstand the accumulating genetic and cognitive evidence concerning our species.  Beyond sharing the nucleic acids and DNA basic to all living organisms, we do a great number of remarkable things:  negotiating contracts; acting altruistically and even sacrificially; establishing elaborate kinship ties; walking erectly; engaging in year-round sexual encounters; recognizing ineradicable male-female differences; manifesting an inexplicably miraculous intelligence, reason, and free will.  Gairdner patiently compiles and evaluates the massive evidence available to show that a multitude of “universals” do in fact define us as human beings.  “Contrary to the claims of relativists everywhere that nothing of an essential or innate human nature exists, we find that there is indeed a basic universal, biologically rooted human nature, in which we all share.  This is so from DNA to the smiling instinct.  It is a human nature that runs broad and deep, and nothing about it is socially constructed or invented” (p. 162).  This is markedly evident in the “number of universals of human language” (p. 217) discovered by scholarly linguists such as Noam Chomsky, who insists “‘there is only one human language,’ which he and his followers later labeled ‘UG,’ or ‘Universal Grammar’” (p. 229).  As an English professor, Gairdner devotes many pages, in several chapters, to an explanation and analysis of recent developments in the study of language, truly one of the defining human characteristics.  Importantly, language can “be seen as a mirror of the internal workings of the mind rather than as a mirror of the external workings of culture or society” (p. 291).  

Embedded within this human nature we find a natural law prescribing moral norms.  “The traditional natural law is therefore based on four assumptions:  ‘1.  There are universal and eternally valid criteria and principles on the basis of which ordinary human law can be justified (or criticized).  2.  These principles are grounded both in nature (all beings share certain qualities and circumstances) and in human nature.  3.   Human beings can discover these criteria and principles by the use of right reason.  4.  Human law is morally binding only if it satisfies these criteria and principles’” (p. 164).  It is a hallmark of the philosophia perennis articulated by classical (Plato; Aristotle; Cicero) and Christian (Aquinas; Leibniz; C.S. Lewis) thinkers and noted for its common sense notions regarding God, man, and virtuous behavior.  

Espoused by some of the great architects of Western Civilization—including Aristotle and Cicero,  Aquinas and Calvin, Sir William Blackstone and the American Founders, the Nuremberg judges and Martin Luther King—the natural law tradition has provided the foundation for “the rule of law” so basic to all our rights and liberties.   As defined by Cicero:  “‘true law is right reason in agreement with nature, universal, consistent, everlasting, whose nature is to advocate duty by prescription and to deter wrongdoing by prohibition.’  He further stated that ‘we do not need to look outside ourselves for an expounder or interpreter of this law.’  God, he said, is the author and promulgator and enforcer of this law, and whoever tries to escape it ‘is trying to escape himself and his nature as a human being’” (p. 183).  

So, Gairdner explains:  “The precepts of natural law for rational human creatures are, then, rational directives of logic and morality aimed at the common good for humanity and at avoidance of everything destructive of the good.  This means that human rational fulfillment may be found in such things as preserving the existence of ourselves and others by begetting and protecting children, by avoiding dangers to life, by defending ourselves and our loved ones, by hewing to family and friends, and of course, by hewing to reason itself.  We know many such standards in religion as commandments.  In daily life we know them as natural commands and prohibitions:  love others, do unto them as you would have them do unto you, be fair, do not steal, do not lie, uphold justice, respect property, and so on” (p. 189).  Such precepts, as Aquinas insisted, are intuitively known, per se nota; they are as self-evident as the geometric axioms of Euclid or the North Star’s fixed location in the night sky.  Thus murder and lying and theft and rape are rightly recognized as intrinsically evil.  Honoring one’s parents, respecting the dead, valuing knowledge, and acting courageously are rightly deemed good.  

During the past century, as relativism has flourished, the natural law tradition was widely attacked and abandoned.  Strangely enough, most relativists grant the existence of “an extremely mysterious law of gravity that controls all matter but that is not itself a part of matter, but they will not consent to other types of laws that govern or guide human behaviour, such as a natural moral law” (p. 210).    Apparently, Gairdner says, one of the reasons “for the decline of natural law during the rise of the modern state is that just about every law school, every granting institution, every legal journal, and every public court and tribunal is largely funded by the state.  And no modern state wants to be told by ordinary citizens that any of its laws are not morally binding.  That is why Lord Acton referred to natural law as ‘a revolution in permanence.’  He meant a revolution by those who cherish a traditional society and a morality rooted in human nature against all those who attempt to uproot, reorder, and deny or replace these realities” (p. 182).  

The modern repudiation of absolutes followed developments in 19th and 20th century German philosophy, evident in Hegel, Nietzsche and Heidegger, reaching their apex in Nazi Germany.  Classical and Christian advocates of transcendent metaphysical principles, such as Plato and Aquinas, were discarded by a corps of “existentialists” determined to move “beyond good and evil” and devise a purely secular, humanistic ethos.  French intellectuals, following the lead of Jacques Derrida and Michel Foucault, imported Nietzsche and Heidegger, setting forth the currents of “deconstruction” and “postmodernism” so triumphant in contemporary universities and media centers.  “It was all an echo of Nietzsche’s ultra-relativist claim (later elaborated by Heidegger) that ‘there are no facts, only interpretations’” (p. 252).  

Ironically, Derrida himself, toward the end of his life, announced an important “Turn” in his thought.  He acknowledged “the logical as well as spiritual need for a foundation of some kind.”  And out it came, as quite a shock to his adamantly relativist followers:  “‘I believe in justice.  Justice itself, if there is any, is not deconstructible’” (p. 266).  Derrida simply illustrates the perennial power of the natural law—there are some things we just can’t not know!  Derrida’s “Turn” underscores what Gairdner endeavors to do in this book:  “to expose the intellectual weakness of the relativism that pervades modern—especially postmodern—thought and also to offer a basic overview of the absolutes, constants, and universals that constitute the substance of the many fields explored here.  We have seen them at work in culture through human universals, in physics via the constants of nature, in moral and legal thought via the natural law, and in the human body via our hardwired biology.  And not least, of course, in view of its close approximation to human thought itself, we have looked at the constants and universals of human language” (p. 308).  He persuasively demonstrates that “we do not live in a foundationless or relativistic world in which reality and meaning, or what is true and false, are simply made up as we go along and according to personal perceptions.  On the contrary, we live in a world in which every serious field of human thought and activity is permeated by fundamentals of one kind or another, by absolutes, constants, and universals, as the case may be, of nature and of human nature” (p. 308).  

# # # 

244 Fewer . . . and Fewer of Us

  Among the handful of must-read 20th century dystopian novels—Aldous Huxley’s Brave New World; George Orwell’s 1984; C.S. Lewis’s That Hideous Strength—is P.D. James’s The Children of Men (New York:  Penguin Books, c. 1992), which prods both our imagination and reason by envisioning the potential consequences of demographic trends.  James, a distinguished novelist known mainly for her riveting (and philosophically nuanced) mystery stories, portrays the world in 2021, twenty-six years after the last baby was born, dooming the race to extinction.  She challenged, in a powerful artistic way, one of the prevailing orthodoxies of our day—the threat of overpopulation.  The Children of Men boldly countered the message of Paul Ehrlich’s best-selling 1968 The Population Bomb (one of the most egregiously misguided books published during that pivotal decade), which fueled the mounting fears of ecological catastrophe then gripping the environmental community.  Because earth’s resources are finite, Ehrlich predicted:  “In the 1970s the world will undergo famines—hundreds of millions of people are going to starve to death.”  He was duly lauded by the academic community (after all, he was a certified member of the elite, a professor at Stanford University with an enviable reputation as an entomologist) and courted by the complacent media (Johnny Carson gushing over him for his prescience).  

One of the few journalists willing to risk censure by differing with Ehrlich was Ben J. Wattenberg, who warned of an actual population implosion rather than an explosion.  Two decades later he wrote The Birth Dearth, examining the “Total Fertility Rate” then falling around the globe.  He vainly hoped to stimulate a national conversation on the subject, but few (alas) recognized the reality of the birth dearth.  Returning to his concern in Fewer:  How the New Demography of Depopulation Will Shape Our Future (Chicago:  Ivan R. Dee, c. 2004), he argued that “never have birth and fertility rates fallen so far, so fast, so low, for so long, in so many places, so surprisingly” (p. 5).  “For at least 650 years,” he says, “the total number of people on earth has headed in only one direction:  up.  But soon—probably within a few decades—global population will level off and then likely fall for a protracted period of time” (p. 5).  

European, Russian and Japanese populations are virtually in free fall, with Total Fertility Rates (TFR) significantly below the replacement level of 2.1 children per woman.  Europe’s population will likely shrink from 728 million in 2000 to 632 million in 2050.  To replace lost babies, Europe would need to take in nearly two million (rather than the current 376,000) immigrants each year.  Russia has a TFR of 1.14 and will lose 30 percent of its population by mid-century.  Not only are folks having fewer children—they want fewer!  And lest we think this is true only of prosperous, highly industrialized nations, it also applies to Less Developed Countries, where a dramatic reduction in population growth has occurred within the past few decades.  China, for example, had a TFR of 6.06 forty years ago; after instituting a “one child only” policy, by the beginning of the millennium its TFR had fallen to 1.8!  Similarly, South Korea’s rate fell to 1.17 by 2005.  Brazil and Mexico reveal the same depopulating trajectory.  In fact, few nations are repopulating themselves.  America, however, has proved somewhat exceptional, sustaining a replacement-level fertility rate—in part through welcoming immigrants who frequently have large families.  

To explain this phenomenon, Wattenberg points first to increased urbanization, where children are something of a liability rather than an asset.  Secondly, as women pursue higher education and careers—and as couples earn more money—they have proportionally fewer children.  “More education, more work, lower fertility” (p. 96).  Thirdly, abortion disposes of 45 million developing children every year.  Fourthly, divorce lowers fertility as single women welcome fewer children than their married counterparts.  Fifthly, contraception (exploding since the ‘60s) empowers couples to enjoy sexual pleasure without undesired consequences.  And sixthly, since couples marry later in life they inevitably have fewer offspring.  The ominous consequences of this depopulation cannot be ignored, because the welfare states established in virtually all modern countries simply cannot support growing numbers of elderly retirees funded by dwindling numbers of younger workers.  Successful businesses thrive by employing creative young workers and selling goods to expanding numbers of consumers—essential factors inevitably absent as populations decline.  Nations—and civilizations—will lose power and influence as their numbers decline.  Less free, less enlightened dictatorial successors may very well replace them.  The world, quite simply, will be a radically different place within a century.  

* * * * * * * * * * * * * * * * * * * * * * 

In What to Expect When No One’s Expecting:  America’s Coming Demographic Disaster (New York:  Encounter Books, c. 2013) Jonathan V. Last details the latest data regarding population prospects.  The book’s title reveals its thesis:  no one’s expecting these days—and paradoxically, as P.J. O’Rourke quipped, “the only thing worse than having children is not having them.”  Failing to heed the Bible’s first injunction—“be fruitful, and multiply, and replenish the earth”—modern man faces an uncertain prospect bereft of children, coddling pets as their “fuzzy, low-maintenance replacements” (p. 3).  The earth, it seems, will grow emptier.  Indeed, “only 3 percent of the world’s population lives in a country whose fertility rate is not declining” (p. 92).  We are moving from the “First Demographic Transition,” wherein children took center-stage and politicians built careers on looking out for them, to the “Second Demographic Transition,” wherein individual adults shun both families and children to pursue their own careers and pleasures.  “Middle-class Americans don’t have very many babies these days.  In case you’re wondering, the American fertility rate currently sits at 1.93,” significantly below the requisite replacement level (p. 4).  At the moment, the deficit is rectified by Hispanic women, who average 2.35 babies, but they too are rapidly choosing to have fewer and fewer.  For example, the once-plenteous supply of Puerto Rican immigrants to New York has collapsed as the island’s birthrate shrank in 50 years from 4.97 to 1.64.  “Some day,” Last says, “all of Latin America will probably have a fertility rate like Puerto Rico’s.  And that day is coming sooner than you think” (p. 113).  Labor shortages in Latin countries will eliminate the need to emigrate, and the U.S. population picture will quickly resemble Europe’s.  

Glancing abroad, by 2050 Greece may lose 20 percent of its people; Latvia has “since 1989 lost 13 percent” of its population, and “Germany is shedding 100,000 people a year” (p. 25).  Spain registers barely one baby per woman.  Japan’s population is shrinking, and abandoned rural villages bear witness to the trend.  It’s the same in Russia:  “In 1995, Russia had 149.6 million people.  Today, Russia is home to 138 million.  By 2050, its population will be nearly a third smaller than it is today” (p. 25).  Consequently, Vladimir Putin has zealously promoted a variety of failing schemes designed to encourage women to have more children.  But they choose not to!  Other things seem more important.  “Divorce has skyrocketed—Russia has the world’s highest divorce rate.  Abortion is rampant, with 13 abortions performed for every 10 live births.  Consider that for a moment:  Russians are so despondent about the future that they have 30 percent more abortions than births.  This might be the most grisly statistic the world has ever seen.  It suggests a society that no longer has a will to live” (p. 137).  

Portents of things to come stand revealed in Hoyerswerda, a German city near the Polish border.  In 1980 the town had 75,000 residents and “the highest birth rate in East Germany” (p. 98).  With the collapse of the Soviet Union, the folks there (and, more broadly, throughout the former East Germany) simply stopped procreating.  The fertility rate abruptly plunged to 0.8, and within three decades the town lost half of its residents.  Hoyerswerda “began to close up shop” (p. 98).  Buildings, businesses, and homes stood vacant.  Similar developments across the country dictated a significant shift from “urban planning” aimed at expanding infrastructures and suburbs to devising ways “to shrink cities” (p. 98).  Parks now proliferate, replacing factories and schools.  The wolf population is actually resurgent, with wolf-packs prowling around dwindling settlements.  

Mirroring these European trends is Old Town Alexandria (the D.C. suburb where Last and his wife lived for a while)—a “glorious preserve of eco-conscious yoga and free range coffee.  My neighbors had wonderfully comfortable lives in part because they didn’t take on the expense of raising children” (p. 25).  As a portent of things to come, white, college-educated women, shopping in Alexandria’s stylish boutiques and devotedly determined to pursue a career, have a fertility rate of 1.6—barely more than that of Chinese women after decades of that nation’s recently-suspended “one-child” policy.  In 1970 the average Chinese woman bore six children, and the Communist regime envisioned multiple problems with the ticking population bomb.  Energetic policies were implemented until quite recently, when the rulers realized the implications of population implosion.  But a trajectory has been set, and within forty years “the age structure in China will be such that there are only two workers to support each retiree” (p. 13).  

Looking to explain this world-wide pattern, the author lists a variety of “factors, operating independently, with both foreseeable and unintended consequences.  From big things (like the decline in church attendance and the increase of women in the workforce) to little things (like a law mandating car seats in Tennessee or the reform of divorce statutes in California), our modern world has evolved in such a way as to subtly discourage childbearing” (p. 16).  Certainly there are good reasons not to procreate.  Heading the list is money.  Rearing a child may very well cost parents a million dollars!  Financially, it’s the worst investment possible!  “It is commonly said that buying a house is the biggest purchase most Americans will ever make.  Well, having a baby is like buying six houses, all at once.  Except you can’t sell your children, they never appreciate in value, and there’s a good chance that, somewhere around age 16, they’ll announce:  ‘I hate you’” (p. 43).  

Complicating this are welfare state structures such as Medicare and Social Security that “actually shift economic incentives away from having children” (p. 46).  Though Texas Governor Rick Perry was ridiculed for suggesting it, Social Security really is a “Ponzi scheme” that will only work “so long as the intake of new participants continues to increase” (p. 107).  In 1950 three million folks were getting Social Security checks; thirty years later there were 35 million retirees expecting monthly payments; by 2005 nearly 50 million were on the rolls, taking $546 billion a year from taxpayers still working.  In its initial (New Deal) phase, Social Security exacted only one percent of a worker’s paycheck; 30 years later (under LBJ’s Great Society) the rate inched up to three percent; by 1971 it jumped to 4.6 percent; and today (shielded from any adjustments by Barack Obama) it amounts to 6.2 percent.  The sky, you might say, is the limit as an endless line of elders looks to their shrinking numbers of children to pay the bills.  The same goes for Medicare—except the prognosis is worse by far!  It simply cannot survive in its present form, given the realities of a shrinking population.  Both programs “were conceived in an era of high fertility.  It was only after our fertility rate collapsed that the economics of the programs became dysfunctional” (p. 109).  

Yet looming above all else is “the exodus of religion from the public square” (p. 84).  Devout Catholics and Protestants have more kids.  They shun the behaviors facilitating population decline—contraception, abortion, cohabitation, delayed marriage, divorce—and church-going couples fully enjoy marriage in ways unavailable to their secular counterparts.  Practicing Protestants increasingly resemble practicing Catholics, procreating more than enough youngsters to sustain population growth.  But non-religious women, according to a 2002 survey, had a fertility rate of only 1.8, and women who rated religion as “not very important” clocked in at 1.3.  Ultimately “there’s only one good reason to go through the trouble” of rearing children:  “Because you believe, in some sense, that God wants you to” (p. 170).  For this reason our government especially should craft family-friendly, child-friendly policies—repudiating the feminist and homosexual ideologies shaping the laws and judicial decrees of the past half-century.  

* * * * * * * * * * * * * * * * * * * * * * 

Columnist Mark Steyn, whether writing or speaking, is justly renowned for his infectious humor and verbal dexterity, bringing to his discussions of serious subjects a sustained note of good cheer.  There is little to cheer about, however, in Steyn’s America Alone:  The End of the World as We Know It (Washington:  Regnery Publishing, Inc., c. 2006), wherein he casts a gloomy look at demographic realities and predicts that “the Western world will not survive the twenty-first century, and much of it will effectively disappear within our lifetimes, including many if not most European countries” (p. xiii).  Within 40 years “60 percent of Italians [once lionized for their large and boisterous families] will have no brothers, no sisters, no cousins, no aunts, no uncles” (p. xvii).  Declining populations will leave welfare states unsustainable and civilization unfeasible.  “Civilizations,” said Arnold J. Toynbee in A Study of History, “die from suicide, not murder,” and Western Civilization is in the midst of inflicting mortal wounds upon itself.  “We are,” Steyn says, “living through a rare moment:  the self-extinction of the civilization which, for good or ill, shaped the age we live in” (p. 3).  

Though we’re tempted to think such things have never happened before, Steyn jolts us with a quotation from Polybius (c. 150 B.C.), one of the greatest ancient historians, who said:  “In our own time the whole of Greece has been subject to a low birth rate and a general decrease of the population, owing to which cities have become deserted and the land has ceased to yield fruit, although there have neither been continuous wars nor epidemics. . . .  For as men had fallen into such a state of pretentiousness, avarice, and indolence that they did not wish to marry, or if they married to rear the children born to them, or at the most as a rule but one or two of them, so as to leave these in affluence and bring them up to waste their substance, the evils rapidly and insensibly grew” (The Histories, XXXVI).  

Basic to demographic decay, as both Polybius and Steyn argue, is an apparently irreversible moral and spiritual decay that leaves increasing numbers of people listless.  Irreligious folks inevitably lose faith not only in an invisible God but in equally invisible ethical principles and reasons to live hopefully for the future.  Thus Europe’s population has plunged like a raft going over a waterfall in the wake of the de-Christianization of the continent.  Childless places like Japan and Singapore and Albania have little religious vitality.  Standing alone in the midst of all this is the United States, which still enjoys a modest population growth.  True to form, the U.S. is the one extraordinary Western nation still featuring robust religious activity.  Unfortunately this may not long persist, since “most mainline Protestant churches are as wedded to the platitudes du jour as the laziest politician” (p. 98).  They “are to one degree or another, post-Christian.  If they no longer seem disposed to converting the unbelieving to Christ, they can at least convert them to the boggiest of soft-left political clichés, on the grounds that if Jesus were alive today he’d most likely be a gay Anglican bishop in a committed relationship driving around in an environmentally friendly car with an ‘Arms Are for Hugging’ sticker on the way to an interfaith dialogue with a Wiccan and a couple of Wahhabi imams” (p. 100).  Without a resurgence of orthodox, muscular Christianity, Steyn thinks, America too will soon choose the childless path to historical oblivion.  

In addition to population implosion, Steyn devotes much attention in America Alone to the threat of Islamic imperialism, facilitated by the growing passivity—the unwillingness to resist terrorism—throughout much of what was once labeled “Western Civilization.”  Indicative of the trend was “the decision of the Catholic high school in San Juan Capistrano to change the name of its football team from the Crusaders to the less culturally insensitive Lions” (p. 158).  (Simultaneously, 75 miles to the south, lock-stepping with the culture, Point Loma Nazarene University—while I was on the faculty—similarly changed its mascot from Crusaders to Sea Lions.)  This loss of a masculine will-to-fight, as well as the will-to-procreate, signifies a collapsing culture.  Indeed, the chief characteristic of our age is “‘deferred adulthood’” (p. 191).  And it takes strong adults to create and sustain a culture.  

* * * * * * * * * * * * * * * * * * * 

However realistically we appraise the threat of Islamic Jihadism, demographic realities foretell coming calamities in Muslim lands during the next half-century.  This prospect becomes clear in David P. Goldman’s How Civilizations Die (And Why Islam Is Dying Too) (Washington:  Regnery Publishing, Inc., c. 2011).  Growing numbers of us are aware of the “birth dearth” haunting much of the world, but because it’s underreported, few know that within four decades “the belt of Muslim countries from Morocco to Iran will become as gray as depopulating Europe” (p. x).  For example, females in Iran, though now surrounded by half a dozen siblings, will themselves “bear only one or two children during their lifetimes” (p. x).  “The fastest demographic decline ever registered in recorded history is taking place today in Muslim countries” (p. xv).  

Along with his description of demographic patterns in Muslim nations, Goldman’s discussion of “four great extinctions” makes his book worth pondering.  The first extinction took place a millennium before Christ, with the passing of the Bronze Age and the disappearance of cities such as Troy, Mycenae, and Jericho.  The second extinction, two hundred years before Christ, enervated the Hellenic civilization once centered in cities such as Athens and Sparta.  Aristotle says Sparta shrank within a century from 10,000 to 1,000 citizens.  Increasingly large landed estates, run for the benefit of an ever-diminishing aristocracy less and less concerned with rearing children and increasingly indulgent of sexual perversions such as pederasty, left Sparta bereft of people and militarily listless.  The city was, he said, “ruined for want of men” (Politics, II, ix).  

The third extinction marked the end of Rome’s power and grandeur in the fourth and fifth centuries A.D.  Even in the glory days of the Empire, when Augustus Caesar reigned, “there was probably a decline” in the empire’s population due to “the deliberate control of family numbers through contraception, infanticide and child exposure” (p. 131).  Augustus himself decreed punishments for “childlessness, divorce, and adultery among the Roman nobility” (p. 131), but nothing worked, and the empire’s needed laborers and soldiers were necessarily drawn from defeated (or volunteering) barbarians from the North.  We are now in the midst of the fourth extinction, when civilizations (East and West) are beginning to show symptoms of rigor mortis.  This extinction began in many ways with the French Revolution in 1789, the “world’s first attempt to found a society upon reason rather than religion” (p. 134), followed by subsequent waves of revolutionary activity that transformed Europe into a bastion of atheistic and anti-natal secularism.  

Though Islam seems to be a vibrant religion, currently regaining its virility through movements such as the Muslim Brotherhood, Goldman thinks it is in fact violently (and vainly) reacting to the global secularism fully evident in the dramatic decline of population throughout the Islamic world.  Joining “Western Civilization,” Islam is another dying culture!  So fewer and fewer of us will inherit the earth.