250 The Roots of Radical Islam

To understand radical Islam’s emergence during the last half of the 20th century, Lawrence Wright’s The Looming Tower:  Al-Qaeda and the Road to 9/11 (NY:  Alfred A. Knopf, 2006) remains one of the best-researched, most readable surveys.  He tells how a small cadre of religious zealots—most notably Sayyid Qutb, Ayman al-Zawahiri, and Osama bin Laden—deliberately upended our world.  And they did so, in part, because the United States routinely failed to understand, withstand, and respond to their assaults.  Amazingly, despite recurrent warnings and violent episodes, almost no one in America took them seriously.  “It was too bizarre, too primitive and exotic.  Up against the confidence that Americans placed in modernity and technology and their own ideals to protect them from the savage pageant of history, the defiant gestures of bin Laden and his followers seemed absurd and even pathetic” (p. 6).  

Wright begins his account by portraying Sayyid Qutb, an Egyptian schoolteacher who came to the United States in 1949.  Filled with hatred for the new nation of Israel and shocked by its triumph over Arab armies, he found in America added fuel for the Islamic zeal consuming his soul.  The shame and shock at the establishment of Israel “would shape the Arab intellectual universe more profoundly than any other event in modern history.  ‘I hate those Westerners and despise them!’ Qutb wrote after President Harry Truman endorsed the transfer of a hundred thousand Jewish refugees into Palestine.  ‘All of them, without any exception:  the English, the French, the Dutch, and finally the Americans, who have been trusted by many’” (p. 9).  His hatred, interestingly enough, didn’t deter him from coming to study in America! 

Though generally well-treated by the ordinary folks in Washington, D.C., and Greeley, Colorado, where Qutb briefly studied and continued his writing projects, he looked for and found proof of America’s degeneracy in such things as a church dance, the freedom enjoyed by women, and publications such as the spurious Kinsey Report.  Hostility to America meshed easily with hostility to Israel to form the core of his world view, and when he returned to Egypt he believed:  “Modern values—secularism, rationality, democracy, subjectivity, individualism, mixing of the sexes, tolerance, materialism—had infected Islam through the agency of Western colonialism.  America now stood for all that” and he was persuaded “that Islam and modernity were completely incompatible” (p. 24).  Ultimately imprisoned by Gamal Abdel Nasser—the first truly native-born Egyptian to rule Egypt in 2500 years—he wrote Milestones, an enormously influential treatise, to recall Muslims to the pristine purity of their 7th century origins.  For radical Muslims, Qutb’s Milestones resembles Hitler’s Mein Kampf or Lenin’s What Is to Be Done?  

The second significant Islamist was Ayman al-Zawahiri, a medical doctor from a prominent family who was reared in an upscale Cairo suburb.  During his student years he absorbed and quickly promoted Qutb’s version of Islam.  He too was distressed by the mere existence of Israel and felt humiliated by Egypt’s collapse in the 1967 war—a decisive “psychological turning point in the history of the modern Middle East.  The speed and decisiveness of the Israeli victory in the Six Day War humiliated many Muslims who had believed until then that God favored their cause.  . . . The profound appeal of Islamic fundamentalism in Egypt and elsewhere was born in this shocking debacle” (p. 38).  Fiercely nationalistic, Zawahiri envisioned reestablishing the Muslim Caliphate centered in Egypt, enabling Islam to flourish authentically and dominate the earth.  He launched an underground movement (al-Jihad) designed to overthrow the secular regime in his country.  Accused of involvement in the assassination of President Anwar Sadat, Zawahiri was imprisoned for three years and soon emerged as the spokesman for the defendants.  

The third and most infamous protagonist in Wright’s story is “The Founder,” Osama bin Laden, one of the many sons of Mohammed bin Laden, one of Saudi Arabia’s most prosperous businessmen.  The elder bin Laden was especially close to King Abdul Aziz and did much of the construction work on the renovation of the Grand Mosque in Mecca, which can hold a million worshippers.  Though expected to take his place in his father’s extensive business empire, young Osama bin Laden joined the Muslim Brothers while in high school and began to show less interest in making money than in establishing Islamic states.  While studying at King Abdul Aziz University in Jeddah he turned increasingly religious, taking the Salafist position that declares heretical any version of Islam other than that espoused by Saudi Arabia’s Wahhabis.  Like Zawahiri he was deeply moved by the writing of Sayyid Qutb and embraced his anti-American agenda.  “Bin Laden would later say that the United States had always been his enemy.  He dated his hatred for America to 1982, ‘when America permitted the Israelis to invade Lebanon and the American Sixth Fleet helped them’” (p. 151).  

When Soviet troops invaded Afghanistan in 1979, radical Muslims rallied to defend Islam, so both Zawahiri and bin Laden made their way to the fields of conflict.  Much of their activity, however, took place in nearby Pakistan, where bin Laden proved especially useful in fundraising.  Here they and their followers engaged in endless discussions regarding jihadist strategies and sought to train young warriors to give their lives to the cause.  The Afghans fought and won the war against Russia, whereas the Arabs recruited by Zawahiri and bin Laden mainly looked for opportunities to die as martyrs for Islam.  Thus forged amidst the Afghan War, the ideology and methodology of Al Qaeda were basically in place by 1988.  In particular, Islamic rationalizations for suicide missions and terrorist attacks on innocent civilians coalesced within the principle of takfir—a license for true believers “to kill practically anyone and everyone who stood in their way; indeed, they saw it as a divine duty” (p. 125).  

As the war in Afghanistan wound down, bin Laden returned to Jeddah, Saudi Arabia, where he enjoyed celebrity status for his “divine mission” in Afghanistan.  In his native land the Wahhabi version of Islam had gained strength:  theaters were closed, music (“the flute of the devil,” bin Laden said) virtually disappeared, and women’s activities were seriously circumscribed.  But for radicals like bin Laden even this was not sufficient, and his activities increasingly irritated King Fahd and the princes ruling the Kingdom.  When, for example, Iraq conquered Kuwait and threatened Saudi Arabia, bin Laden objected to allowing American troops to defend the kingdom.  He and his jihadists, he declared, could (with Allah’s aid) repel any invasion of the Arabian peninsula’s sacred soil.  But King Fahd, trusting in tanks rather than jihadists, invited the Americans to establish bases and successfully overturn Saddam Hussein’s conquests.  

At odds with Saudi rulers (who ultimately revoked his citizenship), bin Laden then moved to Sudan, where he bought land near Khartoum and tried both to farm and to launch various business enterprises.  His very presence considerably boosted Sudan’s finances, and he seemed momentarily content.  But he soon fell in with a radical imam (Abu Hajer) who encouraged him to attack the United States, “the last remaining superpower” threatening Islam.  He and al-Qaeda would henceforth target American troops and murder innocents—concentrating “not on fighting armies but on killing civilians” (p. 175).  By this time he had come to despise the United States as “weak and cowardly,” urging his followers to remember Vietnam and Lebanon.  When a few of their soldiers die, he said, Americans retreat!  “For all its wealth and resources, America lacks convictions.  It cannot stand against warriors of faith who do not fear death” (p. 187).  President Bill Clinton’s cowardly withdrawal from Somalia in 1993 had further confirmed bin Laden’s growing contempt for the USA.  

Amidst deteriorating conditions, bin Laden left Sudan in 1996 financially ruined, his family scattered, and his organization broken.  “He held America responsible for the crushing reversal that had led him to this state” (p. 223).  On August 23, 1996, in his “Declaration of War Against the Americans Occupying the Land of the Two Holy Places,” he said:  “You are not unaware of the injustice, repression, and aggression that have befallen Muslims through the alliance of Jews, Christians, and their agents, so much so that Muslims’ blood has become the cheapest blood and their money and wealth are plundered by the enemies” (p. 234).  Barred from returning to Saudi Arabia, he settled in Afghanistan, now controlled by Mullah Mohammed Omar and the Taliban.  Joined by a group of Egyptians following Zawahiri, he began training terrorists such as Mohammed Atta to take down America.  In 1998, Zawahiri drafted a document calling on “all of the different mujahideen groups that had gathered in Afghanistan” to launch “a global Islamic jihad against America” (p. 259).  

This fatwa, signed by bin Laden as well as Zawahiri, declared that the killing of “Americans and their allies—civilian and military—is an individual duty for every Muslim who can do it in any country in which it is possible to do it” (p. 260).  Soon thereafter the jihadists orchestrated the nearly simultaneous bombings of American embassies in Kenya (killing 213 and injuring thousands) and Tanzania (killing 11 and wounding 85).  Two years later the USS Cole was nearly sunk by a suicide attack in Aden, Yemen’s deep water port, killing 17 sailors.  To bin Laden:  “The destroyer represented the capital of the West, and the small boat represented Mohammed.”  

But other than haphazardly launching a few missiles and issuing threats, Bill Clinton and his administration did nothing.  In the waning days of his presidency he tried “to burnish his legacy by securing a peace agreement between Israel and Palestine” (p. 331).  Within a year, however, culminating the jihadist offensive, came September 11, 2001, and with the collapsing New York towers the world woke up to al-Qaeda, bin Laden, and the threat posed by radical Islam!  

* * * * * * * * * * * * * * * * * * * 

In Nazi Propaganda for the Arab World (New Haven:  Yale University Press, c. 2009), Professor Jeffrey Herf “documents and interprets Nazi Germany’s propaganda efforts aimed at Arabs and Muslims in the Middle East and North Africa” (p. 1).  From 1939 to 1945 a steady stream of anti-Semitic, anti-Allies propaganda reached millions of Muslims via shortwave radio.  These broadcasts both “attributed enormous power and enormous evil to the Jews” (p. 2) and promised that an Axis victory would free “the countries of the Middle East from the English yoke and thus realize their right to self-determination” (p. 3).  In time, of course, the Allies won WWII and little came of the Nazi endeavor to establish a foothold in the Islamic world.  But the broadcasts’ rhetoric, it can be argued, helped shape the mindset of today’s radical Muslims, for the same anti-Semitic, anti-Western message routinely circulates throughout their world.

Central to the story is Haj Amin el-Husseini, the Grand Mufti of Jerusalem, who resided in Berlin during WWII and was “the most important public face and voice of Nazi Germany’s Arabic-language propaganda” (p. 8).  He and his family were influential and he “led opposition to the Balfour Declaration and to Jewish immigration to Palestine” (p. 8).  In Berlin, he met and associated with Adolf Hitler, Heinrich Himmler, and other important Nazis.  He assured Hitler that “the Fuhrer was ‘admired by the entire Arab world.’  He thanked him for the sympathy he had shown to the Arab and especially the Palestinian cause” (p. 76).  Hitler responded by assuring Husseini that Arabs would be liberated from English domination, Jews in North Africa and the Middle East would be destroyed, and “the Mufti would be the most authoritative spokesman of the Arab world” (p. 78).  “Husseini was a key figure in finding common ideological ground between National Socialism, on the one hand, and the doctrines of Arab nationalism and militant Islam, on the other” (p. 8).  Following the war he mysteriously “escaped” and found shelter in Cairo, where he was protected and lauded for the remainder of his life, ever promoting an anti-Jewish, anti-American agenda.  

From one perspective, this book is a chronological record of what was said by the Nazi propaganda machine.  Chapter by chapter, Herf describes the shifting nature of the broadcasts, reflecting the course of WWII.  As the war began, the Nazis sought to enlist Arab support in the Middle East, where England especially controlled considerable territory.  Thus similarities between Islam and National Socialism were stressed.  As General Rommel seemed on the verge of victory in North Africa, the broadcasts promised both the extermination of the Jews (primarily in Palestine) and freedom from British rule.  When the Allies began to turn back the German advance, the broadcasts shifted to emphasize the potential harm Muslims would suffer should the British, American, and Soviet armies succeed.  As the Third Reich collapsed, the broadcasts turned to conspiracies afoot in Islamic lands, blaming Jews and their supporters (especially America) for various evils.

From another perspective, however, there was a constancy to the broadcasts:  hostility to Jews and their allies.  “Radical anti-Semitism was a central component throughout the broadcasts” (p. 11).  No labels were too vicious, no rumors too unfounded, no accusations too malicious for assertion on the radio!  Arabs in North Africa and the Middle East were urged to kill Jews, following the Nazi example, aiming at the “final solution.”  They were reminded that the prophet Mohammed expelled the Jews from Arab lands and then urged to follow his example.  A broadcast in 1942 was titled “Kill the Jews before They Kill You.”  Egyptians were urged to do their duty “to annihilate the Jews and to destroy their property’” (p. 125).  Husseini always made it clear that “his hatred of the Jews was ineradicably bound to his Muslim faith and to his reading of the Koran” (p. 154).  He charged that “‘they lived like a sponge among peoples, sucked their blood, seized their property, undermined their morals yet still demand the rights of local inhabitants’” (p. 185).  Still more, he cried out:  “‘Arabs!  Rise as one and fight for your sacred rights.  Kill the Jews wherever you find them.  This pleases God, history and religion.  This serves your honor.  God is with you’” (p. 213).  

As was evident in the Grand Mufti’s messages, the Koran was the great authority invoked to appeal to Arab listeners.  Neither Hitler’s Mein Kampf nor The Protocols of the Elders of Zion was much discussed, nor were the speeches of Hitler or Himmler invoked.  Rather, texts from the Koran were continually cited to justify Nazi propaganda.  “Nazism thus stood with the ‘faithful’ and ‘noble’ Muslims against traitors who deviated from the path laid down in the Koran” (p. 197).  Himmler even urged his German scholars to link the Shi’ite hope for the coming of the Twelfth Imam to Hitler, suggesting that “‘the Koran predicts and assigns to the Fuhrer the mission of completing the Prophet’s work’” (p. 199).  Hitler could be portrayed, Himmler said, “‘as Jesus (Isa) who the Koran predicts will return and, as a knight . . . defeats giants and the king of the Jews who appear at the end of the world’” (p. 199).  

Hostility to the Jews was conjoined with hostility to the Allies (preeminently England and America) in the broadcasts.  Despite the fact that the British restricted Jewish immigration to Palestine and the Americans equivocated regarding the establishment of a Jewish state, both nations were accused of actively promoting such activities.  Egyptians particularly were portrayed as victims of British oppression and urged to drive out the foreigners.  As American troops increasingly played a role in the war, the broadcasts besmirched the USA and President Franklin D. Roosevelt in particular.  He was declared to be not only a tool in the hands of conniving Jews (such as Henry Morgenthau and Bernard Baruch) but a Jew himself!  In one of his broadcasts Husseini “stated that the ‘wicked American intentions toward the Arabs are now clearer, and there remain no doubts that they are endeavoring to establish a Jewish empire in the Arab world.  More than 400,000,000 Arabs oppose this criminal American movement’” (p. 213).  

Though Herf focuses almost exclusively on the historical details, it takes little imagination to apply his insights to current affairs.  Virtually the same rhetoric employed by the Nazis is evident throughout Islamic lands.  The link is quite clear in the Muslim Brotherhood, which was founded by Hassan al-Banna (a graduate of the most prestigious Islamic university, Al-Azhar in Cairo), who had “‘made a careful study of the Nazi and Fascist organizations’” (p. 225).  “The Brotherhood wanted to establish a government based on pure Koranic principles and sought to counter reliance on Western culture, which it regarded as having brought about an ‘abasement of morals, conduct and character, for having increased the complexity of society and for having exposed the people to poverty and misery’” (p. 225).  

Picking up on Muslim Brotherhood themes following WWII, Sayyid Qutb furthered the fanatical message of radical Islam, writing Our Struggle with the Jews.  “The title itself,” notes Professor Herf, “evokes disconcerting comparisons to Hitler’s Mein Kampf (My Struggle).  Most important, in its views of the Jews and in its conspiratorial mode of analysis the book displayed a striking continuity with the themes of Nazism’s wartime broadcasts, with the important difference that it was far more embedded in the Koran and Islamic commentaries” (p. 255).  Qutb asserted that the Koran “‘spoke much about the Jews and elucidated their evil psychology’” (p. 257).  This alone authorized “war against the Jews in Israel” (p. 258).  Qutb probably “listened to Nazi broadcasts and traveled in the pro-Axis intellectual milieu of the radical Islamists in and around Al Azhar University.”  Thus, Herf reasons:  “Just as the Nazis had threatened the Jews with ‘punishment’ for alleged past misdeeds, so Qutb offered a religious justification for yet another attempt to ‘mete out the worst kind of punishment’ to the Jews then in Israel.  In terms that his audience understood, Our Struggle with the Jews was a call to massacre the Jews living in Israel” (p. 259).  Executed in Egypt in 1966, Qutb “became both a martyr and an ideological inspiration for such radical Islamist groups as Al Qaeda, Hezbollah, and Hamas” (p. 255).  His influence clearly permeates the thought and action of radical Muslim terrorists, including Osama bin Laden.  The vitriol regarding Jews, the anger at America, the dishonest renditions of history, the constant complaints of victimization—nothing much has changed in more than half a century!  

Professor Herf draws upon previously untapped documentary sources, especially a cache of materials—transcriptions of the broadcasts made in Cairo by an American ambassador and sent to Washington—that provide extensive evidence for his case.  A professor of history at the University of Maryland, he has written extensively about the Third Reich’s animosity towards the Jews.  His books include:  Reactionary Modernism:  Technology, Culture, and Politics in Weimar and the Third Reich; The Jewish Enemy:  Nazi Propaganda During World War II and the Holocaust; and Divided Memory:  The Nazi Past in the Two Germanys.  He writes as a scholar for scholars, an historian for historians, meticulously footnoting every assertion.  Above all he wants to fully, conclusively document his argument.  Consequently, as he demonstrates the recurrent message, year after year, there is an unavoidable redundancy to the presentation that taxes the reader’s patience.  But his treatise provides valuable evidence that enables us to understand important aspects of Islam, then and now—from Mohammed onwards, Muslims have distrusted and detested Jews and anyone else disinclined to submit to Allah.  This, rightly understood, is part and parcel of Islam, which means “surrender to God’s will” as manifest in the Prophet’s followers.  

249 “Mind and Cosmos”

Academic philosophers rarely grace the covers of newsmagazines, but the March 25, 2013, issue of The Weekly Standard portrayed Professor Thomas Nagel, bound with ropes, surrounded by demonic monks, roasting in a fire, in an article titled “The Heretic—professor, philosopher apostate.”  The reason for such attention was the recent publication of Nagel’s Mind and Cosmos:  Why the Materialist Neo-Darwinian Conception of Nature Is Almost Certainly False (New York:  Oxford University Press, c. 2012).  A professor at New York University, he enjoys an eminent position within the elite galaxy of revered intellectuals.  Before publishing this treatise he had refrained from openly questioning the entrenched naturalistic Weltanschauung of his peers so starkly set forth by Francis Crick:  “You, your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules.  Who you are is nothing but a pack of neurons.”  Taking issue with such reductive materialism and publishing a slender treatise questioning its guiding assumptions has elicited outrage and abuse from his erstwhile colleagues, but in doing so Nagel did the real work of a philosopher—following the evidence and seeking the truth rather than tacking to the winds of opinion.  

He admits that “for a long time I have found the materialist account [given “canonical exposition” in Richard Dawkins’ The Blind Watchmaker] of how we and our fellow organisms came to exist hard to believe, including the standard version of how the evolutionary process works.  The more details we learn about the chemical basis of life and the intricacy of the genetic code, the more unbelievable the standard historical account becomes” (p. 5).  The more we understand about life and the cosmos the less adequately Neo-Darwinism explains things.  Though personally a humanist atheist, he finds common ground with the advocates of Intelligent Design such as Michael Behe and Stephen Meyer, persuasive critics of the dominant paradigm.  He’s come to seriously consider the possibility that mind, rather than matter, shapes Reality.  Consequently:  “My guiding conviction is that mind is not just an afterthought or an accident or an add-on, but a basic aspect of nature” (p. 16).  This is particularly evident when we turn our attention to what we know best—ourselves!  “Something more is needed to explain how there can be conscious, thinking creatures whose bodies and brains are composed of those elements.  If we want to try to understand the world as a whole, we must start with an adequate range of data, and those data must include the evident facts about ourselves” (p. 20).  Unfortunately:  “Evolutionary naturalism implies that we shouldn’t take any of our convictions seriously, including the scientific world picture on which evolutionary naturalism itself depends” (p. 18).  

The mysterious and absolutely indubitable reality of human consciousness highlights the inadequacy of evolutionary naturalism.  “Organisms such as ourselves do not just happen to be conscious; therefore no explanation even of the physical character of those organisms can be adequate which is not also an explanation of their mental character.  In other words, materialism is incomplete even as a theory of the physical world, since the physical world includes conscious organisms among its most striking occupants” (p. 45).  Scholars like Dawkins and Crick, who reduce consciousness to material entities, fail to properly distinguish between description and explanation; observing neurons firing in the brain does not begin to adequately explain the phenomenon of consciousness.  Far better, Nagel says, is the ancient Aristotelian conception of “teleological laws” guiding natural processes.  In addition to matter-in-motion, there may well be “something else, namely a cosmic predisposition to the formation of life, consciousness, and the value that is inseparable from them” (p. 123).  Old Aristotle may well have erred, but he now appears wiser than his modern antagonists!  As for theists, a creative God certainly provides a satisfactory explanation.  No final explanation for consciousness fully persuades Nagel, but he knows that the Neo-Darwinian answer lacks cogency.  What we must seek, he argues, is “a form of understanding that enables us to see ourselves and other conscious organisms as specific expressions simultaneously of the physical and mental character of the universe” (p. 69).  

What’s true for consciousness is even truer for cognition—our incredible ability to reason.  We are not only aware of ourselves as thinking beings but we can transcend our personal perspectives and objectively discover momentous realities such as the law of gravity.  Evolutionary naturalism fails, abysmally, to explain the existence and unique mental powers of our species, so properly labeled homo sapiens.  “Rationality, even more than consciousness, seems necessarily a feature of the functioning of the whole conscious subject, and cannot be conceived of, even speculatively, as composed of countless atoms of miniature rationality” (p. 87).  

Then add to cognition conscience!  Add to speculative reason practical reason.  We do, countless times a day, evaluate things, judging them good and evil, right and wrong.  And such judgments range far beyond our individual feelings or interests.  I may very well be more outraged by the former San Diego Mayor Bob Filner’s abusive behavior than by an undeserved personal insult.  I may very well be more concerned with the national debt’s impact on future generations than by the sharp increase of my electric bill, though both result from irresponsible politicians’ decisions.  To Nagel, only the “moral realism” expounded by traditional thinkers such as Aristotle and C.S. Lewis enables us to craft ethical principles and render moral judgments; and “since moral realism is true, a Darwinian account of the motives underlying moral judgment must be false, in spite of the scientific consensus in its favor” (p. 105).  

Inasmuch as consciousness, rationality and morality define us as human beings—and inasmuch as evolutionary naturalism cannot explain these fundamental realities—we must, Nagel says, open our minds to better ways of thinking and understanding the universe, taking “the appearance and evolution of life as something more than a history of the development of self-reproducing organisms, as it is in the Darwinian version” (p. 122).  A better version is wanted!  For, Nagel concludes:  “I would be willing to bet that the present right-thinking consensus will come to seem laughable in a generation or two” (p. 128).  No wonder “the present right-thinking” guardians of secular orthodoxy turned venomous when confronted with Nagel’s intellectual rigor and incisive logic!  

* * * * * * * * * * * * * * * * * *

In many of his writings C.S. Lewis trenchantly critiqued the philosophical naturalism masquerading as “science” in the modern world.  This he labeled “scientism,” carefully differentiating it from authentic “science,” with its rigorous methodology and tentative hypotheses.  The intrinsic nihilism and potential brutality of “scientism” was philosophically exposed in Lewis’s The Abolition of Man and memorably portrayed in his That Hideous Strength, one of the great dystopias of the 20th century.  The same message is manifest (though without Lewis’s theistic foundation) in Raymond Tallis’ recent Aping Mankind:  Neuromania, Darwinitis and the Misrepresentation of Humanity (Durham, U.K.:  Acumen Publishing Limited, c. 2011).   As a medical doctor (and “atheistic humanist”) who taught for many years at the University of Manchester, devoting himself to brain science, he is thoroughly aware of neuroscience and its implications for understanding human nature.  But he has become increasingly distressed by the unwarranted supposition (what he dubs “neuromania”) that we are no more than our brains, ignoring the importance of common sense, consciousness and culture, art and religion.  As widely propounded in both scholarly and popular circles:  “The neurophysiological self is at best the locus of ‘one damn thing after another’, which hardly comes near to the self of a human being who leads her life, who is a person reaching into a structured future with anticipations, aims and ambitions, that are themselves rooted in an almost infinitely complex accessible past that makes sense of them” (p. 135).   

Even on a purely material level man’s brain eludes easy analysis.  Though specific neurological sections clearly do specific things (e.g., seeing and hearing), they are capable of alternative and adaptive roles.  Rather than being “hard-wired” like a computer, the brain has a beguiling “plasticity” enabling it to reorganize under certain conditions.  The brain is clearly necessary for us to think—but it is not necessarily a sufficient explanation of our thinking.  Neurologists may chart correlations between neurons firing and mental activity, but as elementary logic reminds us, a correlation must never be equated with causation.  “The errors of muddling correlation with causation, necessary condition with sufficient causation, and sufficient causation with identity lie at the heart of the neuromaniac’s basic assumption that consciousness and nerve impulses are one and the same, and that . . . ‘the mind is a creation of the brain’” (p. 95).  Quite the contrary, Tallis argues:  “mental events are not physical events in the brain” (p. 133).  

Undergirding the notion that the mind is the creation of the brain is the evolutionary assumption that we are nothing but the clever animals Daniel Dennett declares as part and parcel of  “Darwin’s Dangerous Idea,” the “universal acid” that cuts away all confidence in what philosophers call qualia—intentionality and meaning,  morality and justice, freedom and responsibility, beauty and love.  To Tallis, any theory that discounts such qualia (intensely felt personal realities basic to human experience) demands disbelief!  Obviously “nerve impulses are not at all like qualia” (p. 95) and any attempt to explain away the latter by describing the former cannot but miscarry.  Indeed, “we shall find, again and again, that we cannot make sense of what the brain is supposed to do—in particular postulating an intelligible world in which it is located—without appealing to talk about people who are not identical with their brains or with material processes in those brains” (p. 111).  

The “Darwinitis” infecting “neuromaniacs” is similarly suspect to Tallis.  “If they only looked at what was in front of their noses they would not have to be told that there are differences between organisms and people:  that a great gulf separates us from even our nearest animal kin” (p. 147).  In almost every significant way we differ from other animals!  “Many of our strongest appetites—for example, for abstract knowledge and understanding—are unique to us” (p. 151).  Our finest endeavors—writing and reading books, composing and listening to symphonies—have no parallel in the animal kingdom.  Importantly, to Tallis, embracing Darwinism as an explanation for human origins does not necessarily entail accepting “Darwinitis (which purports to explain everything about people in terms of biological evolution)” (p. 153).  Especially problematic is any Darwinian explanation of human consciousness, the fundamental reality known to us.  “In short, if it is difficult (although not in principle impossible) to see how living creatures emerged out of the operation of the laws of physics on lifeless matter, it is even less clear how consciousness emerged or why it should be of benefit to those creatures that have it.  Or, more precisely, why evolution should have thrown up species with a disabling requirement to do things deliberately and make judgments” (p. 179).  

Whether humanizing animals or animalizing man, Darwinitis demands its disciples deny non-material realities of any sort.  Consequently they remain “bewitched” by figures of speech comparing us with computers or machines, dolphins or chimps.  Unlike computers, however, we think; indeed, we skillfully program computers, which are not conscious and cannot reason.  Even the most sophisticated supercomputers “are as zombie-like as pocket calculators” (p. 195).  We, conversely, uniquely use languages that are not at all computational!  In our languaging we reveal our freedom and dignity (realities necessarily denied by neuromaniacs) as human beings, and in our literature we revel in our uniquely human creativity.  

Sadly enough, Tallis says, even our current humanities (history, philosophy, art and music) have fallen captive to Darwinitis and neuromania, reducing literally all our activities to “animalities,” i.e. matter-in-motion.  Thus we find Shakespearean scholars studying Macbeth’s grasping for an imaginary dagger and declaring:  “‘when moving his right hand, an actor playing Macbeth would activate the right cerebellar hemisphere and the left primary cortex’” (p. 294)!   Such “scholarship,” relentlessly marching through academia, should give us pause, Tallis says, because it ruthlessly destroys all that grants grandeur to our literary treasures.  Similarly, we must be alerted to the flourishing academic discipline of “neuro-evolutionary ethics” espoused by thinkers such as Patricia Churchland, who insists that “‘it is increasingly evident that moral standards, practices and policies reside in our neurobiology’” (p. 317). Thus, as Albert Einstein asserted in 1932, in our “thinking, feeling, and acting” we do nothing freely “‘but are just as causally bound as the stars in their motion’” (p. 312).  

Such views, coming to the foreground in our world, lead Tallis to warn:  “Be afraid, be very afraid.”  Indeed, Tallis is sufficiently afraid to look favorably on “at least the idea” of God (p. 325).  The consequences of the atheism he embraces embarrass him!  Though irreligious himself, he finds the traditional notion of God preferable to the simplistic “biologism” espoused by prominent atheists such as Richard Dawkins, whose “devastating reductionism . . . disgusts even an atheist like me.  In defending the humanities, the arts, the law, ethics, economics, politics and even religious belief against neuro-evolutionary reductionism, atheist humanists and theists have a common cause and, in reductive naturalism, a common adversary:  scientism” (p. 336).  

* * * * * * * * * * * * * * * * * 

For many years Alvin Plantinga has effectively represented the Christian perspective among academic philosophers.  Illustrating his prestige among his peers, he was invited to deliver the Gifford Lectures in 2005.  In print the lectures are titled:  Where the Conflict Really Lies:  Science, Religion, and Naturalism (New York:  Oxford University Press, c. 2011).  Unlike some Gifford lecturers (e.g. William James, in The Varieties of Religious Experience), Plantinga writes almost exclusively for his peers, and this treatise will be accessible only to folks with ample background in the denser realms of science, philosophy and theology.  His thesis, in short, claims:  “there is superficial conflict but deep concord between science and theistic religion, but superficial concord and deep conflict between science and naturalism” (#89 in Kindle ed.).  Theists such as himself need not deny evolutionary evidence, but they cannot abide a naturalistic “add-on to the scientific doctrine of evolution:  the claim that evolution is undirected, unguided, unorchestrated by God (or anyone else)” (#142).  Though vociferously denied by its secular proponents, scientific “naturalism” assumes a religious role in their thinking and may be understood as a “quasi-religion” whose presumptions clearly conflict with the data of consciousness and cognition.  Unfortunately, as Paul Feyerabend wisely noted years ago:  “Scientists are not content with running their own playpens in accordance with what they regard as the rules of the scientific method; they want to universalize those rules, they want them to become part of society at large” (Against Method, p. 220).  

We Christians especially should take seriously the scientific discoveries and insights of our time.  Created in God’s image, we are uniquely equipped “to know and understand something of ourselves, our world, and God himself” (p. 4).  All truth is God’s truth and we can (in part, looking through a dark glass) know it.  Unfortunately, all too many modern “scientists” abandon their circumscribed vocation and become amateur philosophers when promoting their naturalistic (and generally atheistic) convictions.  By carefully reading Richard Dawkins’ The Blind Watchmaker and The God Delusion—and demanding such necessities as demonstrable evidence and cogent explanation, logical rigor and unambiguous definitions— Plantinga effectively illustrates Dawkins’ sophomoric shortcomings.  Dispatching Dawkins, he then dissects Daniel Dennett’s Darwin’s Dangerous Idea—“a paradigm example of naturalism” (p. 36).  Though by profession a philosopher (whereas Dawkins is a biologist dispensing philosophy), Dennett apparently has failed to do the honest intellectual toil necessary to actually engage the great theists of the past!  Consequently, many of his arguments (like Dawkins’) prove jejune to a first-rate thinker such as Plantinga—“about as bad as philosophy (well, apart from the blogosphere) gets” (p. 45).  

Much the same can be said of “evolutionary psychologists” who “explain distinctive human traits—our art, humor, play, love, sexual behavior, poetry, sense of adventure, love of stories, our music, our morality, and our religion itself—in terms of adaptive advantages accruing to our hunter-gatherer ancestors” (p. 131).  Thus Harvard’s Steven Pinker devoted only eleven pages of his 660-page How the Mind Works to music; he explained that music “was useless” in terms of human evolution and development and should be regarded as “auditory cheesecake,” a trivial amusement that “just happens to tickle several important parts of the brain in a highly pleasurable way, as cheesecake tickles the palate” (p. 132).  Such Pinkerian statements simply illustrate the intellectual vacuity of celebrated academics!  

On a more constructive level, after meticulously answering objections to the possibility of God’s intervention in the world, Plantinga suggests that God could easily work through both the “macroscopic” and “microscopic” realms, exercising “providential guidance over both ‘cosmic’ and ‘evolutionary’ history” and doing so “without in any way ‘violating’ the created natures of the things he has created” (p. 116).  Still more, Plantinga confesses:  “perhaps God is more like a romantic artist; perhaps he revels in glorious variety, riotous creativity, overflowing fecundity, uproarious activity. . . .  Perhaps he is also very much a hands-on God, constantly active in history, leading, guiding, persuading and redeeming his people, blessing them with ‘the Internal Witness of the Holy Spirit’ (Calvin) or ‘the Internal Instigation of the Holy Spirit’ (Aquinas) and conferring upon them the gift of faith.  No doubt he is active in still other ways.  None of this so much as begins to compromise his greatness and majesty, his august and unsurpassable character” (p. 107).  Equally possible, God may very well have created “a theater, a setting for free actions on the part of human beings and other persons”—“a world of regularity and predictability” wherein we function in accord with our imago dei status (p. 119).  

Thus good science poses no “defeaters” for Christian faith.  There is in fact deep concord between them.  As Sir Isaac Newton said, in Principia Mathematica:  “This most beautiful system of the sun, planets and comets, could only proceed from the counsel and dominion of an intelligent and powerful being. . . .  This Being governs all things, not as the soul of the world, but as Lord over all” (p.).  Today’s cosmologists often reflect on the apparent “fine tuning” of the universe that suggests a profound teleological process culminating in a world “just right” for us human beings.   Current advocates of “Intelligent Design” such as Michael Behe have persuasively detailed evidence and argued that “irreducibly complex” structures cannot be adequately explained by Neo-Darwinians who insist the evolutionary process is absolutely unguided.  Plantinga effectively demonstrates the thoroughly rational and philosophically defensible position that there is every reason to believe in a Mind behind the Cosmos.   

248 “The Rising Tyranny of Ecology”

In 1973, Alston Chase abandoned his academic career—as a tenured philosophy professor with degrees from Harvard, Oxford and Princeton—and “returned to nature” in Montana’s mountains.  At the time he considered himself an “environmentalist” and sought to live accordingly.  Successfully relocated, he undertook a writing project to document the development of Yellowstone National Park under the reigning “ecosystems management” principle adopted by park managers.  What he discovered—and detailed in Playing God in Yellowstone (Boston:  The Atlantic Monthly Press, c. 1986)—was the destructiveness of misguided good intentions, leaving the park significantly degraded and endangered.  Particularly informative is his philosophically-nuanced treatment of “the environmentalists” who naively assumed and promoted “the subverted science” articulated by Rachel Carson, whose deeply flawed Silent Spring (labeled “the Uncle Tom’s Cabin of modern environmentalism”) recruited so many of them to the movement.  Such environmentalists include “the new pantheists” who believe, with Thoreau, that “in wildness is the preservation of the world.”  They’re often enamored with “California Cosmologists” such as Theodore Roszak, Alan Watts, Fritjof Capra, and assorted Native American shamans, whose thoroughly Gnostic notions (e.g. panpsychism) promise a mystical union with Nature conducive to the inner bliss of self-realization.  And they follow an assortment of “hubris commandos”—well-heeled urban elites with political connections who want to reserve the wilderness for backpackers.  

His work on Yellowstone piqued Chase’s concern for broader environmental policies impacting the nation, so he researched and wrote In A Dark Wood:  The Fight over Forests and the Rising Tyranny of Ecology (New York:  Houghton Mifflin Company, c. 1995), one of the most probing and important ecological studies in print.  Combining a detailed narrative of events with a penetrating analysis of deeper issues, In A Dark Wood effectively exposes the heart of America’s confusion regarding how to live rightly with the natural world.  Anyone concerned with the health of the natural world—and of the cultural and political world—should read and reflect on Chase’s work.  Evaluating his conclusions at the beginning of his treatise, Chase confesses they “were far more disturbing than I had anticipated.  An ancient political and philosophical notion, ecosystems ecology, masquerades as a modern scientific theory.  Embraced by a generation of college students during the campus revolutions of the 1960s, it had become a cultural icon by the 1980s.  Today, not only does it infuse all environmental law and policy, but its influence is also quietly changing the very character of government.  Yet, as I shall show, it is false, and its implementation has been a calamity for nature and society” (p. xiii).  Those collegians—drinking deeply from Rachel Carson’s Silent Spring and adopting the pantheism of Emerson and John Muir—began an effective long march through the nation’s institutions and transformed environmentalism into a religious faith with a radical political agenda.  

That process stands revealed in their determination to preserve the forests of Washington, Oregon, and California.  Rejecting the traditional notion that loggers with their sawmills could wisely manage the forests and provide for their sustained rejuvenation, the activists demanded the forests be quarantined in what was imagined to be a primitive paradise and allowed to flourish free from human contamination.  Chase shows how timber “harvests soared during the postwar building boom” and “trees grew faster than they were being cut” (p. 73).  This was especially true of California’s redwood forests.  The trees were healthy and provided a healthy income for thousands of folks throughout the Pacific Northwest.  Admittedly, “old growth” forests were declining, but they had been effectively replaced by faster growing, younger, healthier trees.  Still more:  those “old growth” forests were largely a figment of the activists’ imagination!  The best historical evidence shows that pre-Columbian Indians had carefully set and controlled fires, and the Pacific Northwest forests two centuries ago were “‘more open than they are now,’” containing “‘islands of even-aged conifers, bounded by prairies, savannas, groves of oak, meadows, ponds, thickets and berry patches.’”  Largely due to broadcast Indian burning, they “‘were virtually free of underbrush and coarse woody debris that has been commonplace in forests for most of this century’” (p. 404).  

But the defenders of the mythical “old growth” forests, draped in the mantle of “ecology” and taking “the balance of nature” as axiomatic, believe “nature knows best” and requires us to promote her sustainability.  This position is less a scientific schema than an ancient ideological stance shaped by evolutionists such as Ernst Haeckel (who coined the word ecology in 1866) and Aldo Leopold (whose Sand County Almanac became an instructional manual for environmental activists).  It is less an agenda fueled by evidence than a faith founded in improbable historical and metaphysical assumptions.  The Ecosystem became God!  Enamored of “deep ecology,” environmentalists “unwittingly embraced ideas that synthesized an American religion of nature with German metaphysics:  a holism that supposed individuals were only parts of a larger system and had no independent standing; antipathy to Judaic and Christian values of humanism, capitalism, materialism, private property, technology, consumerism, and urban living; reverence for nature; belief in the spiritual superiority of primitive culture; a desire to go ‘back to the land’; support for animal rights; faith in organic farming; and a program to create nature reserves” (p. 129).  Fortuitously for them, the Endangered Species Act both enacted their aspirations and opened legal portals whereby they could effectively attain their goal of transforming America.  

The ecological activists, looking for an opportunity to kill the logging industry with its “clear-cutting” and access roads, chanced on an “endangered species” in the Pacific Northwest—the spotted owl.  Only a few birds (14 in the first important study) were found, and they appeared to prefer “old growth” forests.  To preserve these owls’ “ecosystem” a massive effort was almost immediately launched to halt all activities in the forests that might endanger it, though in fact “spotted owl policy would be built on the thin air of uneducated guesswork” (p. 251).  Well-funded by environmental organizations such as the Nature Conservancy and the Sierra Club, and by wealthy eastern aristocrats such as Teresa Heinz Kerry who finance foundations (e.g. Pew, Rockefeller, Heinz and the Tides), the activists successfully manipulated the media, academia, and the judiciary to preserve the extensive lands allegedly needed for the spotted owl to flourish.  They deliberately ignored accumulating scientific studies finding ever-more spotted owls thriving throughout the region, especially in recently harvested private timber properties.  “By 1993 six thousand to nine thousand would be estimated to live in northern California alone, and perhaps an equal number in Oregon and Washington.  Yet federal demographic studies continued to claim that the species remained in deep decline” (p. 365).  True believers cannot be deterred by the facts!  

So they successfully pursued their agenda, primarily through the courts, and managed to earmark large sections of the Pacific Northwest as “old growth” forests, forever inviolable and off-limits to cutting of any sort.  Timber harvests in California dropped by 40% within two decades.  Loggers lost their jobs, sawmills closed, and small towns shriveled.  Unlike the urban environmentalists (burnishing their self-esteem by supporting the Sierra Club, which paid skilled lawyers to pursue their agenda through the courts), the working folks in the forests lacked both the money and organizational skills with which to resist the lock-down of their region.  “Saving the owl had effectively shut down an area larger than Massachusetts, Vermont, New Hampshire, and Connecticut combined, costing the economy tens of billions of dollars and casting tens of thousands out of work” (p. 398).  When grass-roots groups (identified as the Wise Use Movement) tried to rally and defend themselves and their livelihood, environmentalists invested millions of dollars to discredit them, calling them “a mob” bankrolled by the evil timber industry.  Environmentalists orchestrated meetings with President Bill Clinton and his Vice President Al Gore, who then packed the President’s Council on Sustainable Development with leaders of various well-heeled environmental organizations.  Remarkably, within three decades the ecological “movement became a war launched by the haves against the have-nots.  It is a situation analogous to what the late Christopher Lasch has called ‘the revolt of the elites’ whereby ‘upper-middle-class liberals have mounted a crusade to sanitize American society.’  Indeed, Lasch could have been thinking of environmentalists when he added that ‘when confronted with resistance, members of today’s elite betray the venomous hatred that lies not far beneath the smiling face of upper-middle-class benevolence’” (p. 415).  

Tragically, the natural world would suffer harm along with the loggers and the small town economies they sustained.  “The great effort to save old growth would eventually destroy the very landscapes it was intended to preserve.  For it demonstrated an important principle:  that seeking to halt change merely accelerates it.  Nothing more clearly revealed this truth than the rising threat of wildfire” (p. 400).  Allegedly “saving” forests and wildlife, the preservationists paved the way for the fires we now witness throughout the West.  Trees that could have been logged and provided a living for thousands of people now burn, for wherever old trees die and underbrush thrives the potential for massive fires increases.  For instance:  “Officials in southern California, following the 1993 firestorm, attributed the lack of prescribed burning that could have reduced or eliminated much of the destruction to public opposition, some of which was based on concern for the habitat of the Stephens kangaroo rat and the gnatcatcher” (p. 401).  The raging fires should awaken us to the truth of Dante’s words that provide Chase the title for his book.  In The Divine Comedy the great poet said:  “I went astray / from the straight road and woke to find myself / alone in a dark wood.  How shall I say / what wood that was!  I never saw so drear, / so rank, so arduous a wilderness.”  Today we’re in a “dark wood” that results from our captivity to an ancient philosophical error:  identifying the good with what is “natural,” imagining the “state of nature” as ideal for man, formulating “new values based on systems ecology, which from the beginning was less a preservation science than a program for social control.  Supposing that protecting ecosystems was the highest imperative for government, it increasingly viewed the exercise of individual liberty as a threat” (p. 413).  

* * * * * * * * * * * * * * * * * *

Nothing better illustrates the “rising tyranny of ecology” than Elizabeth Nickson’s Eco-Fascists:  How Radical Conservationists Are Destroying Our Natural Heritage (New York:  HarperCollins Publishers/Broadside Books, c. 2012).  She claims to “walk the green walk more than anyone I’ve met” and lives on 16 acres immersed in older-growth forest in a geothermal-heated house on a Canadian island (Salt Spring) off the coast of British Columbia.  There she witnessed and recorded, in fascinating detail, how we have been misled by “a corrupt idea—that an ecosystem has to be in balance, with all its members present in the correct proportion, to be considered healthy” (p. 6).  According to the litany:  “Nature knows best.  Man is a virus and a despoiler and must be controlled” (p. 18).  Though touted as “conservation biology” it is a demonstrably “bad science” that is doing great harm.  “Evil may be too strong a word for us modernists to use comfortably, but what else do you call an idea that ruins everything it touches?” (p. 173).  “In just thirty-five years, conservation biology has created one disaster after another, in something that observers are now calling an error cascade.  Tens of millions have been removed from their beloved lands.  Immensely valuable natural resources have been declared off-limits to the most desperate in the developing world” (p. 200).  Consequently:  “Range, forest, and farm are dying; water systems have been destroyed.  Conservation biology has created desert and triggered the dying of entire cultures” (p. 200).  

A seasoned journalist, working in various parts of the world as a reporter for Time magazine, Nickson went home to care for her dying father and remained on the land because she learned to love it.  She “built a cottage at the top of my hill” and “resolved to stay” (p. 12).  In the process, however, whenever she tried to do literally anything on the land she owned and sought to improve, she encountered the front line of a totalitarian movement—“an uncompromising and rigid bureaucratic command-and-control structure, which is creating yet another hybrid of the totalitarian state” (Kindle #82)—that saddled her with a series of irrational and onerous restrictions, leaving her not only angry at the fanatical environmentalists on her island and their senseless bureaucratic restrictions but concerned for the future of our world.  In the process she effectively “lost all but 4 acres of the original 28.  I still pay taxes on the 16.5 acres I supposedly have left, but I’m lucky I am allowed to walk on it” (p. 177).  She had to deal with “grim zealots [many of them angry, wealthy, divorced women] seeking to remake the world” in accord with their mantra of “sustainability” and who find allies in affluent NGOs and “fervent true believers in federal and state agencies” such as the EPA (#98).  She discovered a “labyrinthine public planning process” aptly described by historian David Clary as “the eternal generation of turgid documents to be reviewed and revised forever.”  

True to their Leftist principles, environmental zealots follow the utopian visions formulated by Rousseau and Marx, validating the oft-uttered generalization that “when the Iron Curtain fell, fellow travelers migrated to the environmental movement.  And when they arrived to transform the rural world—a world few of us visit except on vacation, when no one is paying attention—they brought their planning with them” (p. 45).  Consequently:  “There is no starker way to describe what is taking place right now in the country than as the full flourishing of the bureaucratic state.  Private property rights have been largely removed, the culture is dying, but the state, consisting of federal, state, and local ministries and departments, has bulked out so that a giant superstructure of bureaucrats with rulebooks piled high around their desks flourishes, grows, and feeds on ever-diminishing wealth” (p. 47).  

Facilitating this process (and feathering their own nests while granting rare privileges to their wealthy political friends such as Harry Reid) are powerful organizations such as the Nature Conservancy (TNC), the world’s 10th largest NGO, “the biggest of the big dogs, the mythic wolf-king of the forest primeval” (p. 61).  In a complicated, convoluted and surreptitious process, The Nature Conservancy works with “the nation’s richest individuals, like Ted Turner, David Letterman, the Rockefellers, and the DuPonts.  Basically, TNC is acting as agent for the wealthiest among us, acquiring enormous tracts of land, using $25 donations from its 1.3-million-strong membership and $100 million in annual government money, and then selling that land at a discount to the very rich, who in effect receive a substantial tax discount as well as an extremely beautiful place in which to establish a country estate” (p. 68).  The good folks living on the land distrust and fear TNC, so it generally “operates through a proliferation of ‘partner’ land trusts, conservancies, and operatives.  TNC’s sending polite, fresh-faced kids into the middle of nowhere to start local actions for waterbirds or watersheds or ancient forests was the trigger that started the landslide collapse in rural America” (p. 72).  Environmentalists have created legions of smaller foundations, now run “by a subset of grim zealots seeking to remake the world” (#96).  The feared “robber barons” of yore have been replaced by equally pitiless celebrities such as Tom Cruise, Teresa Heinz, and Robert Redford!  Readers such as I (for many years a member and admirer of The Nature Conservancy) will never forget Nickson’s meticulous deconstruction of TNC—and by inference the Sierra Club, the Wilderness Society, etc.  

Her personal frustration led to an investigation—including an extensive journey throughout the rural West (marked by an in-depth interview with Alston Chase in Montana) as well as plowing through written materials—that resulted in the publication of Eco-Fascists.  She explored the forests where logging and sawmills once sustained a vibrant culture and the open range backcountry where cattle once ranged.  There she found:  “Deserted lands, mile after mile after mile.  No one on the highways, not even trucks.  One broken little hamlet after another. . . .  What I was looking at was death.  Death not just of the little towns but death of millions of acres of rangeland. . . . it was like driving through Ghost World, with wraiths drifting across the fields whispering of what was once all fecundity and life” (p. 235).  She came to believe that there has been a well-orchestrated war on rural America, where folks earn their living from the land rather than preserving it as sanctuary for either veneration or vacation retreats.  Enamored with their own purity, environmentalists have effectively sequestered nearly 700 million acres of land, 30 percent of the nation’s land base.  “The amount of land classified as wilderness has grown tenfold since ecosystem theory took flight, growing from 9 million acres in 1964 to 110 million acres today” (p. 96).  Amazingly, as a result of crusades to create parks and “wilderness areas,” nearly half (40%) of New York state, “almost 14 million acres—is in the process of being rewilded, turned back, in all essentials, to Iroquoia” (p. 40).  Worldwide the same process proceeds apace—as a result of the creation of parks and refuges “more than 10 percent of the developing world’s landmass has been placed under strict conservation—11.75 million square miles, more than the entire continent of Africa” (p. 38).  In the process, “more than 14 million indigenous people have been cleared from their ancestral lands by conservationists” (p. 36). 
 

With the support of the Clinton Administration in the 1990s and the Obama Administration today, environmental activists have banned logging in millions of acres in the national forests.  However well-intended those bans, the meticulous study of Holly Fretwell, Who Is Minding the Federal Estate?—“the most important analysis of the effects of environmental activism on rural America to date” in Nickson’s judgment (p. 129)—shows “that everything, everything, we have been doing was wrong” (p. 130).  Wildfires vividly illustrate this, for nothing—neither timber harvesting nor road building—“can compare with the damage that wildfires inflict on” the forests (p. 130).  The fires resulting from environmental policies consume vastly more timber than “evil corporations” could possibly have done, and the devastation inflicted on spruce and pine trees by the pine beetle and budworm could have been controlled by rapid cutting had not environmentalists insisted the bugs be allowed to pursue their destructive ways.    

  Nickson admits:  “The title of this book is harsh, particularly when used with regard to environmentalists, whom most people view as virtuous at best, foolish at worst.  But I do not use this term lightly, nor as a banner to grab attention.  My father landed on D-day and, at the end of the war, was put in charge of a Nazi camp and told to ‘sort those people out.’”  He was thus highly sensitive to the fact “that man defaults to tyranny over and over again, and while the tyranny of the environmental movement in rural America has not reached what its own policy documents say is its ultimate goal—radical population reduction—we cannot any longer ignore that goal and its implications” (#132).  And she believes there is in fact an answer:  “The Gordian knot of the countryside mess can be solved with one swift blow of the sword.  Property rights must be restored to the individuals who are willing to work their lives away tending that land.  The people, the individuals and families, in other words, who want it.  Confiscation by government, multinationals, and agents of the megarich—the foundations and land trusts—must be reversed.  Otherwise devastation beckons” (p. 314).  

247 How Liberalism Became Our State Religion

Barack Obama’s 2008 presidential campaign and election clearly appealed to and elicited a strongly religious fervor.  Devotees fainted at his rallies, messianic claims were attached to his agenda, and Obama promised a fundamental “transformation” of America.  Celebrating his election, he grandiosely declared that peoples henceforth would see that “this was the moment when the rise of the oceans began to slow and the planet began to heal.”  Consequently, actor Jamie Foxx urged fans to “give an honor to God and our lord and savior Barack Obama.”  MSNBC commentator Chris Matthews enthused:  “This is the New Testament” and “I feel this thrill going up my leg.”  Louis Farrakhan, closely aligned with Jeremiah Wright, Obama’s Chicago pastor, told his Nation of Islam disciples:  “When the Messiah speaks, the youth will hear, and the Messiah is absolutely speaking.”  And there’s even The Gospel According to Apostle Barack by Barbara Thompson.  Though previous presidents—notably John F. Kennedy—elicited something of the same enthusiasm, Obama is nearly unique in America.  But he is not at all unusual when placed against the backdrop of human history, when again and again “charismatic” leaders have claimed and been endowed with supernatural powers.  

Thus there is good reason to seriously ponder Benjamin Wiker’s Worshipping the State:  How Liberalism Became Our State Religion (Washington:  Regnery Publishing, Inc., c. 2013).  He prefaces his treatise with a typically prescient statement by G. K. Chesterton:  “‘It is only by believing in God that we can ever criticize the Government.  Once abolish . . . God, and the Government becomes the God.  That fact is written all across human history . . . .  The truth is that Irreligion is the opium of the people.  Wherever the people do not believe in something beyond the world, they will worship the world.  But, above all, they will worship the strongest thing in the world’” (p. 1).   And inasmuch as the State has (during the past five centuries) gradually expanded its powers, there is a natural tendency to worship it.    

Though secular liberals have frequently touted their “tolerance” and commitment to “diversity,” there is a totalitarian shadow—an irreligious dogmatism—evident in their many current anti-Christian endeavors:  the “war on Christmas” with efforts to enshrine alternatives such as “Winter Solstice;” the cleansing of any Christian content from public school curricula (while simultaneously promoting Islam); the dogmatic support of naturalistic evolution rather than any form of intelligent design in the universities; the removal of crosses or nativity scenes from public lands; the desecration of Christian symbols by “artists” of various sorts; the assault on Christian ethics through court decisions (e.g. Roe v. Wade) and programs such as the abortifacient provisions in Obamacare.  Systematically imposed by the federal courts (following the crucial 1947 Everson v. Board of Education Supreme Court decision), “the federal government has acted as an instrument of secularization, that is, of disestablishing Christianity from American culture, and establishing in its place a different worldview” (p. 11).  

Lest we restrict this process to America, however, we must chart some powerful historical developments in Western Civilization that have been unfolding for half-a-millennium.  To Wiker, the triumph of Liberalism in these centuries enabled growing numbers of folks to liberate themselves from the “curse” of Christianity and replace the Church with an enlightened and nurturing State.  “The founders of liberalism believed that Christianity was a huge historical mistake, and therefore they reached back again to the pagans for help in loosening the Christian hold on the world, and quite often adopted precisely those things in paganism that Christianity had rejected” (p. 22).  Consequently, “Christians today find themselves in a largely secularized society” quite akin to the ancient world with an easy-going sexual ethos; “it is as if Christianity is being erased from history, and things were being turned back to the cultural status quo of two thousand years ago” (p. 37).  

Christianity, of course, emerged within a pagan world wherein the state (Egyptian pharaohs, the Athenian polis, Imperial Rome) had been routinely idolized.  Following Christ’s wonderful prescription—“render unto Caesar the things that are Caesar’s and to God the things that are God’s”—his followers established the “two cities” approach definitively set forth by St Augustine.  Priests and kings are to preside over different, though not totally isolated, realms.  Clearly delineated in the Bible, “The distinction between church and state, religious and political power, is peculiar to Christianity, and the church invented it” (p. 44).  Of ultimate importance to early Christians was doctrinal Truth, an uncompromising insistence on the singular claims of Christ Jesus, the LORD of His heavenly kingdom.  Christians should not deify the state, and no king should defy God’s Law!  Though routinely blurred in practice and often resembling a wrestling match requiring energetic corrections (such as the Cluniac reforms in the 10th and 11th centuries), the separation of church and state provided the key to much that’s distinctive in Western Civilization by preventing the “Church from becoming a department of the state.”  Prescriptively, in 494 A.D. Pope Gelasius wrote a famously important letter to the Eastern Emperor Anastasius, insisting on a clear separation between “the sacred authority of the priesthood and the royal power.”  (In the East, by contrast, a “Caesaropapism” developed, reducing the Church to an arm of the Byzantine Empire.)  Thenceforth, uniquely in the West, two realms were established with neither dominating the other.  

During the past 500 years, however, this balance slowly shifted and secular powers have imposed their way on the churches.  Wiker describes it as “the rise of liberalism and the re-paganization of the state.”  Fundamental to this progression was Niccolo Machiavelli, who wrote The Prince around 1513 and “invented the absolute separation of church and state that is the hallmark of liberalism.”  The Church had drawn lines between religious and political powers, “but Machiavelli built the first wall between them.  In fact, his primary purpose in inventing the state was to exclude the church from any cultural, moral, or political power or influence—to render it an entirely harmless subordinate instrument of the political sovereign” (p. 104).  An effective ruler—the strong-armed prince—must ignore religious and moral prescriptions, following a “might-makes-right” formula.  Machiavellian secularism now appears in both the “soft liberalism” designed to satisfy our physical needs and the “hard liberalism” of fascist states.  To Machiavelli, the prince should appease the ignorant masses and “‘appear all mercy, all faith, all honesty, all humanity, all religion’” (p. 110).  Working surreptitiously, however, he should promote a “re-paganized” religion and state-controlled educational system.  “The current belief that the church must be separated from the state and walled off in private impotence—leaving, by its subtraction from the public square, the liberal secular state—all that is Machiavelli’s invention.  The playing out of this principle in our courts today is in keeping with his goal of creating a state liberated from the Christian worldview” (p. 119).  Machiavelli’s moral nihilism also fit nicely with newly-empowered nation-states which followed the cuius regio, eius religio (“whose realm, his religion”) compromise negotiated in 1555 at the Peace of Augsburg and quickly moved to control the churches.  

England’s King Henry VIII—guided by Thomas Cromwell, who had studied Machiavelli’s teachings at the University of Padua—brutally illustrated this trend by establishing the Church of England.  He and his successors placed themselves directly under God, controlling both church and state.  Henry supervised the publication of the 1539 Great Bible, featuring an engraving of himself handing copies of it to both the Archbishop of Canterbury (Thomas Cranmer) and his Lord Chancellor (Cromwell).  A century later, “England gave the world the immensely influential political philosopher Thomas Hobbes, author of the Leviathan, who constructed an entirely secular foundation for Henry’s church, and therefore gave us the first truly secular established church in a major modern state—more exactly, an absolutist, autocratic version” (p. 122).  To accomplish this, he first reduced all reality to the material realm, subtly denying the non-materiality of both God and the soul and eliminating any objective, absolute moral standard.  In Hobbes’ world, lacking both Natural and Divine Law, good and evil are mere labels attached to feelings that either please or displease us.  Thus, Hobbes famously said, in our natural state we are at war with everyone and, consequently, our lives are “‘solitary, poor, nasty, brutish, and short’” (p. 130).  To corral our nastiness, a Leviathan—an all-powerful Government—must rule.  We need an absolute Sovereign to grant and protect our “rights.”  As with Machiavelli, Hobbes knew the masses needed religion, and he simply insisted the Sovereign had the right to prescribe and enforce it through the Church of England.  His “church is entirely a state church, completely under the king’s power” (p. 134).  

Liberalism, similarly, insists the Church must accommodate the state, and “Liberal Christianity is the form that the established religion of the state takes—perhaps not its final form, but its most visible, obvious form” (p. 120).  To accomplish this, liberal thinkers during the Age of Reason determined to destroy the authority of Scripture, and the “demotion of the Bible from revealed truth to mere myth is the result” (p. 58).  To attain this end Benedict Spinoza marshaled his formidable intellect, garnering credit for being the “father” of both “modern liberal democracy” and “modern Scripture scholarship.”  More blatantly materialistic than either Machiavelli or Hobbes (declaring God is Nature), he adumbrated a might-makes-right political philosophy that flowered in Hegel, “who declared that the Prussian state was the fullest manifestation of the immanentized ‘spirit’ of God” (p. 145).  In a state thus deified, of course, Scripture must be displaced, so Spinoza simply denied any supernatural dimension to the written Word.  By definition, miracles—especially miracles such as the Incarnation and Resurrection—cannot occur, so “higher critics” cavalierly dismissed all such accounts.  To the extent the Bible has merit, its message was reduced “to one simple platitude:  ‘obedience toward God consists only in love of neighbor’” (p. 154).  

Within the next two centuries this same secularizing process wormed its way into the churches.  As a result of the “higher criticism” launched by Spinoza, a “secularizing approach to Scripture was deeply entrenched among the intelligentsia [such as David Friedrich Strauss, a disciple of Hegel, who wrote The Life of Jesus Critically Examined] and had made great headway in European universities.  The aim was ‘de-mythologizing,’ removing from the Biblical text (just as Spinoza had dictated) all of the miracles, and hence undermining all the doctrinal claims related to Christ’s divinity, so that readers were left with, at best, Jesus the very admirable moral man who was misunderstood to be divine by his wishful disciples.  Christianity—so the Scripture scholarship seemed to establish—was built upon a case of mistaken identity.  But the moral core could be salvaged” (p. 240).  Still more:  through the evolutionary processes (both natural and societal) we humans can deify ourselves!  We should worship Man rather than God, the creature rather than the Creator!  

Sharing Spinoza’s intolerance for intolerance, John Locke proposed a softer (“classic”) form of liberalism, though he fully supported its secular essence and proposed a “reasonable” rather than traditionally orthodox form of Christianity.  Eschewing doctrine to emphasize morality, Locke promoted a “mild Deism” that proved quite influential in 18th century England and America.  Personally pious—and the author of many biblical commentaries especially popular in America—Locke was (many thought) sufficiently “Christian” to embrace philosophically.  Concerned to preserve permanent moral standards, he espoused a version of the Natural Law, but he displaced the Church as a mediating institution and left the individual “facing the state alone” (p. 228).  A privatized religious faith is fine, he thought, so long as it makes no claims to shape public policies.   And his “classical” liberalism was (notably in post-WWII America) progressively folded into the more radical forms attuned to Hobbes and Spinoza. 

In today’s churches, Wiker laments, Spinoza’s “materialistic mindset has increasingly taken hold, and the church has become correspondingly anemic.  The church thus weakened by unbelief in the supernatural is what we call the mainline or liberal Christian church.  That church has total faith in materialist science, fully embraces the ‘scientific’ study of Scripture fathered by Spinoza, and professes a completely de-supernaturalized form of Christianity that is entirely at home in this world and only vaguely and non-threateningly associated with the next” (p. 153).  Thus we are confronted, as H. Richard Niebuhr famously said, with theologians teaching that:  “‘A God without wrath brought men without sin into a kingdom without judgment through the ministrations of a Christ without a cross’” (p. 153).  With this Spinoza would be pleased!  “To sum up Spinoza’s kind of Christianity:  You don’t need the Nicene Creed if you’re nice.  People who fight over inconsequential dogmas are not nice.  They’re intolerant” (p. 155).  

Furthering the “liberal” agenda of the European Enlightenment, Jean-Jacques Rousseau envisioned a secular “civil religion” (outlined in his Social Contract) replacing Christianity.  His agenda was implemented by men such as Robespierre (“radical liberals”) in the French Revolution and still exerts enormous influence in our world.  Rousseau propounded his own purely naturalistic version of Genesis, imagining how things were in a pure “state of nature.”  Noble savages, free from the constraints of Judeo-Christian morality, followed their passions and enjoyed the good life.  All were equal and none permanently possessed anything.  To regain that lost estate, a “liberal state” is needed—one that “does not define law in terms of the promotion of virtue and the prohibition of vice, but in terms of the protection and promotion of individuals’ private pleasures—since all such pleasures are natural—are declared to be rights.  Any limitation of these ‘rights’ is considered unjust; that is, justice is redefined to mean everyone getting as much of whatever he or she wants as long as he or she doesn’t infringe on anyone else’s pursuit of pleasure” (p. 172).  

Having carefully explained the views of secular liberalism’s architects, Wiker shows how Leftists of various sorts implemented it in the centuries following the French Revolution, for “as the first attempt to incarnate the new liberal political order in a great state, the French Revolution is iconic for liberalism” (p. 200).  Importantly, a purely naturalistic worldview must be crafted and imposed.  We must be persuaded that “we live in a purposeless universe, so that each person has just as much right as anyone else to pursue his or her arbitrarily defined goals or ends” (p. 187).  Liberals triumphantly cite the Darwinian doctrine of biological evolution to prove “that the development of life is driven by entirely random, material processes,” that man “is an accident,” and that we are not made in God’s image but the product of a “meandering and mindless” natural process (p. 194).  Each person freely fabricates and follows whatever moral standards he desires, relaxing into a hedonistic utilitarianism calculated to enjoy the greatest good for the greatest number.  In effect, this has led to a resurgence of a pagan ethos at ease with abortion, euthanasia, promiscuity, sodomy and pedophilia.  

To accomplish this, liberals determined to deprive the Christian religion of any real power.  In late 19th century France this became clear as officials swept away crucifixes and saints’ statues in public places, outlawed religious processions, closed religious schools, and renamed city streets after Enlightenment heroes rather than saints.  More importantly, they seized control of the educational system, making it an agency of the state.  Secularists in America sought the same ends.  To Wiker:  “One cannot overestimate how significant it was in France (and is in America) for liberals to have gained complete state control of education, and for that education to be mandatory.”  This precipitated “a top-down revolution wherein a relatively small minority may impose its worldview upon the entire population using state power.  And the education establishment in our own country, as was the case in France, is dominated by radicals and socialists from the Left, from the universities right down to the elementary schools” (p. 216).  

Thus Liberalism came to America’s shores, first in the form of Locke and later under the auspices of “higher critics” and socialists of various hues.  In a sense, Wiker argues:  “America had a kind of Jacobin class bubbling away underneath its Protestant surface, plotting its own version of a radical cultural revolution” (p. 263).  Thomas Paine, one of the more influential publicists during the American Revolution, represents this phenomenon.  The author of Common Sense, promoting independence from England, he also wrote The Age of Reason, promoting Deism and anti-Christian prejudices.  Thomas Jefferson avidly embraced both Locke (e.g. The Declaration of Independence) and Paine (e.g. The Life and Morals of Jesus of Nazareth), laying the groundwork for the famous “wall of separation between church and state” in a letter he sent to the Danbury Baptist Association in 1802.  Not until after WWII, however, did the Supreme Court enshrine this Jeffersonian comment as a reason to exclude religion from the public square.  Though Jefferson represented only a small minority of America’s Founders, his anti-Christian secularism slowly spread through the body politic as the decades passed.  

To a degree this Jeffersonian secularism prevailed in America, Alexis de Tocqueville said, because on a practical level 19th century Americans were notably materialistic—seeking comfort and prosperity without compunction.  They were thus quite “inclined to follow Locke, both in theory and in practice, and hence already well on our way to allowing the soul and heaven to fade away.  Christianity was often quite fervent in America, but it was subtly reconstructed to be compatible with passionately this-worldly material pursuits.  It was not a Christianity that could produce martyrs or even severe judges of the fallen secular order” (p. 268).  By the end of the century, then, the nation was unfortunately vulnerable to the radicals at the universities who determined to transform the nation.  Scores of young scholars, following the Civil War, sailed to Europe (especially German universities) and returned with Spinoza and Rousseau, Darwin and Spencer, Strauss and Marx, entrenched in their minds.  They then either established or controlled the nation’s preeminent universities which (given the largess of state and federal governments) began to shape the cultural life of America.  New academic disciplines—including sociology and psychology—insisted that trained “experts” do for the people what they could not do on their own.  And the newly-minted law schools, personified by Oliver Wendell Holmes, systematically sought to impose a secular agenda on the land.  Progressive politicians, including Theodore Roosevelt, Woodrow Wilson, and Franklin Roosevelt, heeded their admonitions and implemented their goals.  

The time has come, Wiker argues, to disestablish the secular humanism now ruling America under the guise of “progressivism.”  Like scores of other political ideologies, it is clearly a religion (ironically, an unbelief established as a belief) with its own dogmas regarding creation, man’s nature and purpose in life, sin and salvation, good and evil, right and wrong, church and state, death, immortality and life everlasting.  “Liberalism once appeared to be about freeing everyone, believers and non-believers alike, from government-imposed religion and morality, but it has shown that that was just a ruse for establishing its own particular worldview, one that is fundamentally antagonistic to Christianity” (p. 312).  To mount the counterrevolution, believers must first target the nation’s universities—primarily by teaching truthful history—stemming and then reversing the currents of liberal orthodoxy.  

# # #

246 Christ the King

 For most of his life N.T. Wright—one of the world’s most distinguished biblical scholars as well as an active churchman and bishop in the Church of England—has pondered various questions regarding Jesus and His people.  In Simply Jesus:  A New Vision of Who He Was, What He Did, and Why He Matters (New York:  HarperOne, c. 2011), he sets forth some definitive answers to his quest.  “This book is written,” he declares, “in the belief that the question of Jesus—who he really was, what he really did, what it means, and why it matters—remains hugely important in every area, not only in personal life, but also in political life, not only in ‘religion’ or ‘spirituality,’ but also in such spheres of human endeavor as worldview, culture, justice, beauty, ecology, friendship, scholarship, and sex” (p. 5).  He also endeavors to move beyond the “conservative vs. liberal,” or “personal salvation vs. social gospel,” divisions by subsuming them all beneath his thesis regarding the neglected Truth declaring Christ’s Kingship.  

To find who Jesus really was requires serious historical research, seeking to understand His milieu rather than re-shaping Him to fit ours.  First, that means using the proper sources—primarily the four canonical gospels.  It also means understanding the ancient milieu within which they were written, when a powerful religious movement (labeled a “philosophy” by the Jewish historian Josephus) reflected an expectation of the coming Messiah and insisted “that it was time for God alone to be king” (p. 41).  Over the centuries Israel’s prophets, reflecting on crucial events such as the Exodus and Exile (cf. Ezekiel 34), had envisioned a time when God fully manifested his royal authority on earth as well as in heaven, working through purified hearts rather than foods and rituals.  Israel’s poets (cf. Psalm 2) expected that YHWH, working through His anointed Son, would “establish his own rule over the rest of the world from his throne in Zion” (p. 50).  Jesus, drawing on such passages from the Psalms and Isaiah, portrayed Himself as the “suffering servant” expected by some first century Jews, but “Nobody, so far as we know, had dreamed of combining these ideas in this way before.  Nor had anyone suggested that when the prophet spoke of ‘the arm of YHWH’ (53:1)—YHWH himself rolling up his sleeves, as it were, to come to the rescue—this personification might actually refer to the same person, to the wounded and bleeding servant” (p. 173).  

That is precisely what happened in Jesus, his disciples insisted!  “Within a few years of his death, the first followers of Jesus of Nazareth were speaking and writing about him, and indeed singing about him, not just as a great teacher and healer, not just as a great spiritual leader and holy man, but as a strange combination:  both the Davidic king and the returning God.  He was, they said, the anointed one, the one who had been empowered and equipped by God’s Spirit to do all kinds of things, which somehow meant that he was God’s anointed, the Messiah, the coming king.  He was the one who had been exalted after his suffering and now occupied the throne beside the throne of God himself” (p. 54).  God’s plan was fulfilled, Luke declared, when Jesus ascended the Cross rather than a throne—“or, rather, as all four gospel writers insist, a cross that is to be seen as a throne.  This, they all say, is how Jesus is enthroned as ‘King of the Jews.’  Jesus’ vocation to be Israel’s Messiah and his vocation to suffer and die belong intimately together” (p. 173).  Jesus and His disciples saw the Cross as “the shocking answer to the prayer that God’s kingdom would come on earth as in heaven” (p. 185).  

Consequently, God Himself is in charge of His Kingdom, ruling through his Son Christ Jesus.  “It was this new world in which God was in charge at last, on earth as in heaven.  God was fixing things, mending things, mending people, making new life happen.  This was the new world in which the promises were coming true, in which new creation was happening, in which a real ‘return from exile’ was taking place in the hearts and minds and lives both of notorious sinners and of people long crippled by disease” (p. 91).  Inevitably this provoked animosity from the principalities and powers determined to replace YHWH!  As is revealed in Jesus’ wilderness temptations, the LORD battles Satan and his earthly satraps—a battle finished on the Cross, where Jesus forever defeated the powers of darkness.  

That Christ is King explains the frequent NT references to Jesus forgiving sins and replacing the Temple (where sins were normally forgiven).  The Temple was YHWH’s dwelling, the sacred site where His Shekinah glory declared His presence.  It was literally the center of the world “where heaven and earth met” (p. 132).  When Jesus dramatically cleansed the Temple He “was staking an implicitly royal claim:  it was kings, real or aspiring, who had authority over the Temple” (p. 127).  By this action Jesus declared “that the Temple was under God’s judgment and would, before too long, be destroyed forever” (p. 129).  Indeed, He Himself would become the Temple!  Still more:  He became the new Sabbath and Jubilee!  Time and space are transformed in the new creation wherein He now rules.  

Emblematic of the new creation is the Passover meal Jesus celebrated with His disciples.   It was a traditional Jewish ceremony, but it was radically new.  “Like everything else Jesus did,” Wright says, “he filled the old vessels so full that they overflowed.  He transformed the old mosaics into a new, three-dimensional design.  Instead of Passover pointing backward to the great sacrifice by which God had rescued his people from slavery in Egypt, this meal pointed forward to the great sacrifice by which God was to rescue his people from their ultimate slavery, from death itself and all that contributed to it (evil, corruption, and sin).  This would be the real Exodus, the real ‘return from exile.’  This would be the establishment of the ‘new covenant’ spoken of by Jeremiah (31:31).  This would be the means by which ‘sins would be forgiven’—in other words, the means by which God would deal with the sin that had caused Israel’s exile and shame and, beyond that, the sin because of which the whole world was under the power of death.  This would be the great jubilee moment, completing the achievement outlined in Nazareth” (p. 180).  

“The gifts of bread and wine,” Wright continues, “already heavy with symbolic meaning, acquire a new density:  this is how the presence of Jesus is to be known among his followers.  Sacrifice and presence.  This is the new Temple, this strange gathering around a quasi-Passover table.  Think through the Exodus themes once more.  The tyrant is to be defeated:  not Rome, now, but the dark power that stands behind that great, cruel empire.  God’s people are to be liberated:  not Israel as it stands, with its corrupt, money-hungry leaders and its people bent on violence, but the reconstituted Israel for whom the Twelve are the founding symbol” (p. 180).  The Last Supper, of course, set the stage for Jesus’s crucifixion and Resurrection; thereafter his followers—His reconstituted Israel—quickly spread around the world declaring “Jesus is Lord, and He is risen.”  The Risen Lord unveiled “the beginning of the new world that Israel’s God had always intended to make” (p. 191), and in His post-Resurrection appearances He materialized as “a person who is equally at home ‘on earth’ and ‘in heaven’” (p. 192).  After 40 days, He ascended into heaven.  But His heaven permeates the earth—Jesus is in “heaven” but He is everywhere present on earth as well.  “If Easter is about Jesus as the prototype of the new creation, his ascension is about his enthronement as the one who is now in charge.  Easter tells us that Jesus is himself the first part of new creation; his ascension tells us that he is now running it” (p. 195).  

And in time He will fully assert His rule.  He’s coming again!  “Heaven is God’s space, God’s dimension of present reality, so that to think of Jesus ‘returning’ is actually, as both Paul and John say in the passages just quoted, to think of him presently invisible, but one day reappearing” (p. 202).  The new world envisioned in Romans 8 and Revelation 21-22 will be a place under Christ’s control, “administering God’s just, wise, and healing rule” (p. 202).  “The second coming is all about Jesus as the coming Lord and judge who will transform the entire creation.  And, in between resurrection and ascension, on the one hand, and the second coming, on the other, Jesus is the one who sends the holy Spirit, his own Spirit, into the lives of his followers, so that he himself is powerfully present with them and in them, guiding them, directing them, and above all enabling them to bear witness to him as the world’s true Lord and work to make that sovereign rule a reality” (p. 203).  

We Christians (his Christ-bearers, His followers) are assigned a vital role in the Kingdom, for God ever intended to rule earth through human beings.  Jesus redeemed us on the Cross in order for us to join him, ruling the world in accord with His design.  “In God’s kingdom, humans get to reflect God at last into the world, in the way they were meant to.  They become more fully what humans were meant to be.  That is how God becomes king.  That is how Jesus goes to work in the present time.  Exactly as he always did” (p. 213).  And He established His Church (His Body), wherein we work to accomplish His ends.  Our work (as  concisely outlined in the Beatitudes) is to bear witness to His way in His world.  

* * * * * * * * * * * * * * * * * * * *

In How God Became King:  Getting to the Heart of the Gospels (New York:  HarperOne, c. 2012) Tom (a.k.a. N.T.) Wright continues to develop the thesis earlier enunciated in Simply Jesus.  He thinks we have lost touch with the canonical gospels, using them as props or tools to further our own agendas rather than as sources demanding our prayerful attention and implementation.  He acknowledges that for 20 centuries numerous scholars have devoted much time and intellectual firepower to the task of understanding them, but he thinks they have, by and large, failed to rightly discern and declare their real message.  (There is, of course, more than a little hubris in any declaration such as Wright’s that he alone has at last found The Truth—but that is something of a scholarly virus, an occupational hazard, frequently found in brilliant folks such as he.)  

In Wright’s reading of Church history, orthodox theologians and preachers have (rather narrowly  following St Paul or Luther or Calvin) reduced “the gospel” to the historic creeds—i.e. Apostles’ and Nicene—and neglected if not totally bypassed the Gospels.  “The great creeds, when they refer to Jesus, pass directly from his virgin birth to his suffering and death,” whereas the four Gospel writers “tell us a great deal about what Jesus did between the time of his birth and the time of his death.  In particular, they tell us about what we might call his kingdom-inaugurating work:  the deeds and words that declared that God’s kingdom was coming then and there, in some sense or other, on earth as in heaven.  They tell us a great deal about that; but the great creeds don’t” (p. 11).  “The gospels were all about God becoming king, but the creeds are focused on Jesus being God” (p. 20).  The creeds are not wrong, Wright insists, in what they affirm!  But when the Faith is reduced to creedal verities the Jesus Message gets lost.  

The Message got lost early on as misinterpretations came to dominate the Church!  For 1500 years or so Christians have seemed to ignore the fact “that the Jewish context of Jesus’ public career was playing any role in theological or pastoral reflection,” and He became “founder” of the faith, “with the type of Christianity varying according to the predilections of the preacher or teacher” (p. 110).  Three centuries ago Enlightenment thinkers, reviving the hedonistic materialism of Epicurus and Lucretius and heeding biblical critics such as H.S. Reimarus and Baruch Spinoza, embarked on a quest for the “historical Jesus” that refused to see Him as the Gospels reveal Him.  As variously portrayed by multitudes of liberal professors and preachers, poets and songsters, Jesus appears as “a revolutionary, hoping to overthrow the Romans by military violence and establish a new Jewish state.  Or he’s a wild-eyed apocalyptic visionary, expecting the end of the world.  Or he’s a mild-mannered teacher of sweet reasonableness, of the fatherhood of God and the brotherhood of ‘man.’  Or perhaps he’s some combination of the above” (p. 26).  Indeed, according to Rudolf Bultmann and his 20th century epigones, details regarding His life have no bearing on much of anything, for the Gospels (in their view) are not bona fide biographies conveying truthful details.  They were fanciful projections, written long after the events described, of an evolving community looking for illustrations to justify their “faith.”  

As a result of skeptical scholarship, “there seems a ready market right across the Western world for books that say that Jesus was just a good Jewish boy who would have been horrified to see a ‘church’ set up in his name, who didn’t think of himself as ‘God’ or even the ‘Son of God’, and who had no intention of dying for anyone’s sins—the church has got it all wrong” (p. 27).  To Wright, such a reading of the Gospels clearly ignores their obvious content.  Markedly deistic, Enlightenment thinkers wanted nothing to do with a God who intervenes on earth, much less actually rules anything.  They rejected both earthly kings and the heavenly King come to earth in Jesus.  “But the whole point of the gospels is to tell the story of how God became king, on earth as in heaven.  They were written to stake the very specific claim towards which the eighteenth-century movements of philosophy and culture, and particularly politics, were reacting with such hostility” (p. 34).  The Deism promoted by Voltaire and Thomas Paine removed God from His world.  “The divine right of kings went out with the guillotine, and the new slogan vox populi vox Dei (‘The voice of the people is the voice of God’) was truncated; God was away with the fairies doing his own thing, and vox pop, by itself, was all that was now needed” (p. 164).  

It’s now time to escape the intellectual shackles of the eighteenth century!  It’s time to read the Gospels with 20-20 vision, taking them as trustworthy sources regarding who Jesus was and what He declared.  For, Wright incessantly repeats, they give us a largely forgotten narrative, “the story of how Israel’s God became king” (p. 38).  As the Messiah—a Jewish Messiah fulfilling the Hebrew Scriptures—Jesus came not so much to provide a pathway to an ethereal heaven removed from the earth as to establish an outpost of heaven on earth.  “Jesus was announcing that a whole new world was being born and he was ‘teaching’ people how to live within that whole new world” (p. 47).  To rightly hear the Gospel we must turn down the volume of skeptical scholars and moralistic reformers and hear the annunciation of Jesus “as the climax of the story of Israel” (p. 65).  As Matthew insists, Jesus consummates the history of Israel initiated by father Abraham and “will save his people from their sins” (Mt 1:21).  But Jesus came to save more than the children of Israel, and “the reason Israel’s story matters is that the creator of the world has chosen and called Israel to be the people through whom he will redeem the world.  The call of Abraham is the answer to the sin of Adam.  Israel’s story is thus the microcosm and beating heart of the world’s story, but also its ultimate saving energy.  What God does for Israel is what God is doing in relation to the whole world.  That is what it meant to be Israel, to be the people who, for better and worse, carried the destiny of the world on their shoulders.  Grasp that, and you have a pathway into the heart of the New Testament” (p. 73).  

Mark’s gospel begins with Jesus’ baptism, where He is anointed with the Spirit and declared God’s Son by the Father.  He then selected His 12 disciples, symbolizing the 12 tribes of Israel, and did dramatic things, such as calming the storm on the Sea of Galilee, illustrating that in Him God was rescuing His people.  Toward the end of Mark’s Gospel, we encounter a Roman centurion who declared the crucified Christ truly the Son of God.  Given his background, we assume the centurion didn’t fully understand what transpired on Golgotha.  “For him, the phrase ‘God’s son’ would normally have meant one person and one person only:  Tiberius Caesar, son of the ‘divine’ Augustus” (p. 94).  Yet the centurion tacitly recognized a larger truth:  in Jesus God regained His rightful throne as earth’s real Ruler.  

This too John makes clear in the Prologue to his Gospel, where he “takes us back to the first books of the Bible, to Genesis and Exodus.  He frames his simple, profound opening statement with echoes of the creation story (‘In the beginning . . .’, leading up to the creation of humans in God’s image) and echoes of the climax of the Exodus (‘The Word became flesh, and lived among us,’ 1.14, where the word ‘lived’ is literally ‘tabernacled’, ‘pitched his tent’, as in the construction of the tabernacle for God’s glory in the wilderness).  This, in other words, is where Israel’s history and with it world history reached their moment of destiny” (p. 77).  John’s Jesus “is a combination of the living Word of the Old Testament, the Shekinah of Jewish hope (God’s tabernacling presence in the Temple), and ‘wisdom’, which in some key Jewish writings was the personal self-expression of the creator God, coming to dwell with humans and particularly with Israel (see Wisdom 7; Sirach 24)” (p. 103).  

Climaxing his story with Jesus on the Cross, John portrays Him as “enthroned,” truly the King of Kings.   It was a new kind of Kingdom, one of Love and Truth rather than Power, as He explained to Pilate, and the Roman Procurator acted more presciently than he imagined when he had a sign (a typical public notice called a titulus) in Hebrew, Greek, and Latin—“JESUS OF NAZARETH, THE KING OF THE JEWS”—affixed to the Cross.  “The cross in John, which we already know to be the fullest unveiling of God’s and Jesus’ love (13:1), is also the moment when God takes his power and reigns over Caesar” (p. 146).  Cross and Kingdom, like hand and glove, go together.  “Jesus, John is saying, is the true king whose kingdom comes in a totally unexpected fashion, folly to the Roman governor and a scandal to the Jewish leaders” (p. 220).  “Part of John’s meaning of the cross, then, is that it is not only what happens, purely pragmatically, when God’s kingdom challenges Caesar’s kingdom.  It is also what has to happen if God’s kingdom, which makes its way (as Jesus insists) by non-violence rather than by violence, is to win the day.  This is the ‘truth’ to which Jesus has come to bear witness, the ‘truth’ for which Pilate’s world-view has no possible space (18:38)” (p. 230).  

Following the Crucifixion, of course, we read of the Resurrection and Ascension, fully affirming Jesus’ mission.  “It is the resurrection that declares that the cross was a victory, not a defeat.  It therefore announces that God has indeed become king on earth as in heaven” (p. 246).  Then comes Pentecost, when  the Spirit fully enters Jesus’ disciples, enabling them to “be for the world what Jesus was for Israel” (p. 119).  And just as Jesus battled satanic powers and tackled worldly tyrants, so too His followers continue that work.  The clash of kingdoms foreseen by Daniel and Isaiah and dramatically evident in the life of Jesus continues today.   As with Pilate, the paramount issue is Truth, to which Jesus and His followers bear witness.  This “truth is what happens when humans use words to reflect God’s wise ordering of the world and so shine light into its dark corners, bringing judgment and mercy where it is badly needed” (p. 145).  

Jewish prophets predicted the Messiah would inaugurate a theocracy—the righteous reign of God, ruling through human beings, stewards of His creation.  “Those who are put right with God through the cross are to be putting-right people for the world” (p. 244).  In the Temple—“the fulcrum of ancient Jewish theocracy”—priests and kings had joined to do God’s work in His world, with priests leading worship and kings establishing justice.  Now Jesus Himself is the new temple, which, “like the wilderness tabernacle, is a temple on the move, as Jesus’ people go out, in the energy of the spirit, to be the dwelling of God in each place, to anticipate that eventual promise by their common and cross-shaped life and work” (p. 239).  

245 Refuting Relativism

While the recently installed Pope Francis urges empathy with the poor, he also laments the spiritual poverty of those in bondage to what Benedict XVI called “the tyranny of relativism.”  He certainly follows St. Francis of Assisi in urging us to be peacemakers—“But there is no true peace without truth!  There cannot be true peace if everyone is his own criterion, if everyone can always claim exclusively his own rights, without at the same time caring for the good of others, of everyone, on the basis of the nature that unites every human being on this earth.”  His papal predecessor had warned:   “We are building a dictatorship of relativism that does not recognize anything as definitive and whose ultimate goal consists solely of one’s own ego and desires.”  While acknowledging that fanatics too easily assert their confidence in various “truths,” we should not cease discerning and proclaiming with certainty self-evident and trustworthy insights and convictions.  “That is why,” Benedict said, “we must have the courage to dare to say:  ‘Yes, man must seek the truth; he is capable of truth.’”   

Benedict’s admonitions would not have surprised Allan Bloom, who in 1987 wrote The Closing of the American Mind as “a meditation on the state of our souls, particularly those of the young, and their education” (p. 19).  Youngsters need teachers to serve as midwives—above all helping them deal with “the question, ‘What is man?’ in relation to his highest aspirations as opposed to his low and common needs” (p. 21).   University students, Bloom said, were “pleasant, friendly and, if not great-souled, at least not particularly mean-spirited.  Their primary preoccupation is themselves, understood in the narrowest sense” (p. 83), absorbed in personal feelings and frustrations.  Not “what is man” but “who am I” is the question!  They illustrate “the truth of Tocqueville’s dictum that ‘in democratic societies, each citizen is habitually busy with the contemplation of a very petty object, which is himself’” (p. 86).  

This preoccupation with self-discovery and self-esteem, Bloom believed, flowers easily in today’s relativism, a philosophical dogma espoused by virtually everyone coming to or prowling about the university.  Under the flag of “openness” and “tolerance,” no “truths” are acknowledged and everyone freely follows his own feelings.  So even the brightest of our young people know little about history, literature, or theology, for such knowledge resides in books, which remain largely unread, even in the universities.  Minds shaped by films, rock music and television have little depth, and “the failure to read good books both enfeebles the vision and strengthens our most fatal tendency—the belief that the here and now is all there is” (p. 64).  Deepening his analysis in a section titled “Nihilism, American Style,” Bloom diagnosed the philosophical roots of today’s educational malaise as preeminently rooted in Nietzsche, Freud, and Heidegger.  An enormous intellectual earthquake has shaken our culture to its foundations.  It is “the most important and most astonishing phenomenon of our time,” the “attempt to get ‘beyond good and evil’” by substituting “value relativism” for Judeo-Christian absolutism (p. 141).  

* * * * * * * * * * * * * * * * * * * * * * * * *

What concerned Bloom and the popes at the turn of the millennium was perceptively examined half a century earlier by C.S. Lewis in one of his finest books, The Abolition of Man (New York:  Macmillan, 1947).  First presented during WWII as a series of lectures, the book begins with Lewis carefully examining an elementary English textbook he dubbed The Green Book.   While allegedly designed to help students read literature, the text was inadvertently a philosophical justification for relativism, promoting the notion that all values, whether aesthetic or ethical, are subjective and ultimately indefensible.  However, Lewis said:  “Until quite modern times all teachers and even all men believed the universe to be such that certain emotional reactions on our part could be either congruous or incongruous to it—believed, in fact, that objects did not merely receive, but could merit, our approval or disapproval, our reverence or our contempt.  The reason why Coleridge agreed with the tourist who called the cataract sublime and disagreed with the one who called it pretty was of course that he believed inanimate nature to be such that certain responses could be more ‘just’ or ‘ordinate’ or ‘appropriate’ to it than others.  And he believed (correctly) that the tourists thought the same.  The man who called the cataract sublime was not intending simply to describe his own emotions about it:   he was also claiming that the object was one which merited those emotions” (#148 in Kindle).  

Coleridge and others who believed in objective Truth (and truths) generally appealed to what Chinese thinkers referred to as “the Tao.  It is the reality beyond all predicates, the abyss that was before the Creator Himself.  It is Nature, it is the Way, the Road.  It is the Way in which the universe goes on, the Way in which things everlastingly emerge, stilly and tranquilly, into space and time.  It is also the Way which every man should tread in imitation of that cosmic and supercosmic progression, conforming all activities to that great exemplar” (#107).  We instantly recognize—through theoretical reason—certain laws of thought (e.g. the law of non-contradiction) or geometry (e.g. a line is the shortest distance between two points); we also know—through practical reason—certain permanent moral maxims (e.g. murder is wrong).  Any effort to reduce universal values to personal feelings inevitably founders in nihilistic confusion.    

In truth:  “All the practical principles behind the Innovator’s case for posterity, or society, or the species, are there from time immemorial in the Tao.  But they are nowhere else.  Unless you accept these without question as being to the world of action what axioms are to the world of theory, you can have no practical principles whatever” (#358).  “The human mind has no more power of inventing a new value than of imagining a new primary colour, or, indeed, of creating a new sun and a new sky for it to move in” (#398).  By disregarding the Tao, advocates of any new morality sink into a “void” without a  pattern to follow, a nihilistic abyss promoting “the abolition of Man” (#556).  

* * * * * * * * * * * * * * * * * * * * * * * * 

In The Book of Absolutes:  A Critique of Relativism and a Defense of Universals (Montreal:  McGill-Queen’s University Press, c. 2008), William D. Gairdner updates and amplifies an ancient and perennial proposition.  A distinguished Canadian Olympic athlete with degrees from Stanford University, Gairdner has effectively influenced the resurgence of conservatism in his native land.  Though he acknowledges the present power and pervasiveness of relativism, he finds it “a confused and false conception of reality that produces a great deal of unnecessary anxiety and uncertainty, both for individuals and for society as a whole” (#71).  To rectify this problem he wrote “a book to restore human confidence by presenting the truth about the permanent things of this world and of human existence” (#74).  

The current notion—that truth varies from person to person (or group to group), that all perspectives must be tolerated, that moral judgments must be suspended—has an ancient history which Gairdner explores, noting that earlier generations generally judged it “preposterous.  The ancient Greeks actually coined the word idiotes (one we now apply to crazy people) to describe anyone who insisted on seeing the world in a purely personal and private way” (#88).  There were, of course, Greek Sophists such as Protagoras who declared:  “As each thing appears to me, so it is for me, and as it appears to you, so it is for you.”  But their views withered under the relentless refutations of Socrates, Plato, and Aristotle—all defending objective truth and perennial principles—followed by Medieval philosophers such as Thomas Aquinas and Enlightenment scientists such as Isaac Newton.  

Dissenting from the traditional, absolutist position were thinkers such as Thomas Hobbes, who rejected any rooting of moral principles in a higher law, declaring in Leviathan that we label “good” whatever pleases us.  Indeed, the words good and evil “are ever used with relation to the person that useth them:  there being nothing simply and absolutely so.”  A century later Hobbes’ subjectivism would be enshrined by a thinker markedly different from him in many respects, Immanuel Kant, “the most coolly influential modern philosopher to have pushed us toward all sorts of relativist conclusions” (p. 14).   Building on Kant’s position, Friedrich Nietzsche formulated the relativist slogan:  “there are no facts, only interpretations.”  American pragmatists and cultural anthropologists, European existentialists and deconstructionists took up the catchphrase, and today we live in a postmodern culture deeply shaped by epistemological skepticism and moral relativism.  

Sadly, this intellectual shift was bolstered by a profound popular misunderstanding and misrepresentation of one of the monumental scientific discoveries of all time, Einstein’s “Special Theory of Relativity.”  As historian Paul Johnson explains, “‘the belief began to circulate, for the first time at a popular level, that there were no longer any absolutes:   of time and space, of good and evil, of knowledge, and above all of value.  Mistakenly but perhaps inevitably, relativity became confused with relativism.’  And, he adds, ‘no one was more distressed than Einstein by this public misapprehension’” (p. 17).  Nevertheless, it served as a scalpel that helped “‘cut society adrift from its traditional moorings in the faith and morals of Judeo-Christian culture’” (p. 18).   

After explaining various forms of relativism—noting that its moral and cultural forms are most prevalent and pernicious—Gairdner registers some objections to it.  It is, importantly, “self-refuting,” basing its entire case upon the absolute assertion that there are no absolutes.  Thus it cannot withstand Aristotle’s powerful argument, set forth in his Metaphysics, showing how it violates the law of non-contradiction.  That various persons or cultures claim different “truths” hardly affects the fact that a great many beliefs are manifestly wrong, whereas others (e.g. the earth is spherical) are demonstrably right.  The fact that some groups of people (“sick societies”) have condoned human sacrifice or infanticide hardly justifies these practices.  Admittedly, some truths—both scientific (a heliocentric solar system) and moral (slavery is wrong)—become clear only after considerable time or laborious investigation, but that only strengthens their certainty.  Thus:  “Neither believing nor doing makes a thing right or wrong” (p. 39).  

Challenging relativism, Gairdner invokes scholars such as Donald E. Brown, a professor of anthropology, whose 1991 publication, Human Universals, effectively refutes many sophomoric relativistic mantras by listing more than 300 human universals.  For example:  “‘All humans use language as their principal medium of communication, and all human languages have the same underlying architecture, built of the same basic units and arranged according to implicit rules of grammar and other common features.  All people classify themselves in terms of status, class, roles, and kinship, and all practice division of labour by age and gender.  . . . .  All have poetry, figurative speech, symbolic speech, ceremonial speech, metaphor, and the like.  All use logic, reckon time, distinguish past, present, and future, think in causal terms, recognize the concept and possibility of cheating and lying, and strive to protect themselves from the same’” (p. 64).  Everywhere and at all times we humans have acknowledged such universal realities.  

There are indubitable, demonstrable constants (laws) throughout the natural world—notably the law of gravity and Einstein’s famous theorem, E=mc².  Material things, moving through space, continually change; but laws remain the same.  As David Berlinski puts it, “‘the laws of nature by which nature is explained are not themselves a part of nature.  No physical theory predicts their existence nor explains their power.  They exist beyond space and time, they gain purchase by an act of the imagination and not observation, they are the tantalizing traces in matter of an intelligence that has so far hidden itself in symbols’” (p. 76).  Still more, says Berlinski:  “‘We are acquainted with gravity through its effects; we understand gravity by means of its mathematical form.  Beyond this, we understand nothing’” (p. 78).  

Analogously, as human beings we are, by nature, “hardwired” with important “universals.”  The “blank-slate” notion, promulgated by empiricists such as John Locke and B.F. Skinner, cannot withstand the accumulating genetic and cognitive evidence concerning our species.  Beyond using the same nucleic acid and DNA basic to all living organisms, we do a great number of remarkable things:  negotiating contracts; acting altruistically and even sacrificially; establishing elaborate kinship ties; walking erectly; engaging in year-round sexual encounters; recognizing ineradicable male-female differences; manifesting an inexplicably miraculous intelligence, reason, and free will.  Gairdner patiently compiles and evaluates the massive evidence available to show that a multitude of “universals” do in fact define us as human beings.  “Contrary to the claims of relativists everywhere that nothing of an essential or innate human nature exists, we find that there is indeed a basic universal, biologically rooted human nature, in which we all share.  This is so from DNA to the smiling instinct.  It is a human nature that runs broad and deep, and nothing about it is socially constructed or invented” (p. 162).  This is markedly evident in the “number of universals of human language” (p. 217) discovered by scholarly linguists such as Noam Chomsky, who insists “‘there is only one human language,’ which he and his followers later labeled ‘UG,’ or ‘Universal Grammar’” (p. 229).  As an English professor, Gairdner devotes many pages, in several chapters, to an explanation and analysis of recent developments in the study of language, truly one of the defining human characteristics.  Importantly, it can “be seen as a mirror of the internal workings of the mind rather than as a mirror of the external workings of culture or society” (p. 291).  

Embedded within this human nature we find a natural law prescribing moral norms.  “The traditional natural law is therefore based on four assumptions:  ‘1.  There are universal and eternally valid criteria and principles on the basis of which ordinary human law can be justified (or criticized).  2.  These principles are grounded both in nature (all beings share certain qualities and circumstances) and in human nature.  3.   Human beings can discover these criteria and principles by the use of right reason.  4.  Human law is morally binding only if it satisfies these criteria and principles’” (p. 164).  It is a hallmark of the philosophia perennis articulated by classical (Plato; Aristotle; Cicero) and Christian (Aquinas; Leibniz; C.S. Lewis) thinkers and noted for its common sense notions regarding God, man, and virtuous behavior.  

Espoused by some of the great architects of Western Civilization—including Aristotle and Cicero,  Aquinas and Calvin, Sir William Blackstone and the American Founders, the Nuremberg judges and Martin Luther King—the natural law tradition has provided the foundation for “the rule of law” so basic to all our rights and liberties.   As defined by Cicero:  “‘true law is right reason in agreement with nature, universal, consistent, everlasting, whose nature is to advocate duty by prescription and to deter wrongdoing by prohibition.’  He further stated that ‘we do not need to look outside ourselves for an expounder or interpreter of this law.’  God, he said, is the author and promulgator and enforcer of this law, and whoever tries to escape it ‘is trying to escape himself and his nature as a human being’” (p. 183).  

So, Gairdner explains:  “The precepts of natural law for rational human creatures are, then, rational directives of logic and morality aimed at the common good for humanity and at avoidance of everything destructive of the good.  This means that human rational fulfillment may be found in such things as preserving the existence of ourselves and others by begetting and protecting children, by avoiding dangers to life, by defending ourselves and our loved ones, by hewing to family and friends, and of course, by hewing to reason itself.  We know many such standards in religion as commandments.  In daily life we know them as natural commands and prohibitions:  love others, do unto them as you would have them do unto you, be fair, do not steal, do not lie, uphold justice, respect property, and so on” (p. 189).  Such precepts, as Aquinas insisted, are intuitively known, per se nota; they are as self-evident as the geometric axioms of Euclid or the North Star’s fixed location in the night sky.  Thus murder and lying and theft and rape are rightly recognized as intrinsically evil.  Honoring one’s parents, respecting the dead, valuing knowledge, and acting courageously are rightly deemed good.  

During the past century, as relativism has flourished, the natural law tradition was widely attacked and abandoned.  Strangely enough, most relativists grant the existence of “an extremely mysterious law of gravity that controls all matter but that is not itself a part of matter, but they will not consent to other types of laws that govern or guide human behaviour, such as a natural moral law” (p. 210).    Apparently, Gairdner says, one of the reasons “for the decline of natural law during the rise of the modern state is that just about every law school, every granting institution, every legal journal, and every public court and tribunal is largely funded by the state.  And no modern state wants to be told by ordinary citizens that any of its laws are not morally binding.  That is why Lord Acton referred to natural law as ‘a revolution in permanence.’  He meant a revolution by those who cherish a traditional society and a morality rooted in human nature against all those who attempt to uproot, reorder, and deny or replace these realities” (p. 182).  

The modern repudiation of absolutes followed developments in 19th and 20th century German philosophy, evident in Hegel, Nietzsche and Heidegger, reaching their apex in Nazi Germany.  Classical and Christian advocates of transcendent metaphysical principles, such as Plato and Aquinas, were discarded by a corps of “existentialists” determined to move “beyond good and evil” and devise a purely secular, humanistic ethos.  French intellectuals, following the lead of Jacques Derrida and Michel Foucault, imported Nietzsche and Heidegger, setting forth the currents of “deconstruction” and “postmodernism” so triumphant in contemporary universities and media centers.  “It was all an echo of Nietzsche’s ultra-relativist claim (later elaborated by Heidegger) that ‘there are no facts, only interpretations’” (p. 252).  

Ironically, Derrida himself, toward the end of his life, announced an important “Turn” in his thought.  He acknowledged “the logical as well as spiritual need for a foundation of some kind.”  And out it came, as quite a shock to his adamantly relativist followers:  “‘I believe in justice.  Justice itself, if there is any, is not deconstructible’” (p. 266).  Derrida simply illustrates the perennial power of the natural law—there are some things we just can’t not know!    Derrida’s “Turn” underscores what Gairdner endeavors to do in this book:  “to expose the intellectual weakness of the relativism that pervades modern—especially postmodern—thought and also to offer a basic overview of the absolutes, constants, and universals that constitute the substance of the many fields explored here.  We have seen them at work in culture through human universals, in physics via the constants of nature, in moral and legal thought via the natural law, and in the human body via our hardwired biology.  And not least, of course, in view of its close approximation to human thought itself, we have looked at the constants and universals of human language” (p. 308).  He persuasively demonstrates that “we do not live in a foundationless or relativistic world in which reality and meaning, or what is true and false, are simply made up as we go along and according to personal perceptions.  On the contrary, we live in a world in which every serious field of human thought and activity is permeated by fundamentals of one kind or another, by absolutes, constants, and universals, as the case may be, of nature and of human nature” (p. 308).  

# # # 

244 Fewer . . . and Fewer of Us

Among the handful of must-read 20th century dystopian novels—Aldous Huxley’s Brave New World; George Orwell’s 1984; C.S. Lewis’s That Hideous Strength—is P.D. James’s The Children of Men (New York:  Penguin Books, c. 1992), which prods both our imagination and reason by envisioning the potential consequences of demographic trends.  James, a distinguished novelist known mainly for her riveting (and philosophically nuanced) mystery stories, portrays the world in 2021, twenty-six years after the last baby was born, dooming the race to extinction.  She challenged, in a powerful artistic way, one of the prevailing orthodoxies of our day—the threat of overpopulation.  The Children of Men boldly countered the message of Paul Ehrlich’s 1968 best-selling The Population Bomb (one of the most egregiously misguided books published during that pivotal decade), which fueled the mounting fears of ecological catastrophe then gripping the environmental community.  Because earth’s resources are finite, he predicted:  “In the 1970s the world will undergo famines—hundreds of millions of people are going to starve to death.”  Ehrlich was duly lauded by the academic community (after all, he was a certified member of the elite, a professor at Stanford University with an enviable reputation as an entomologist) and courted by the complacent media (Johnny Carson gushing over him for his prescience).  

One of the few journalists willing to risk censure by differing with Ehrlich was Ben J. Wattenberg, who warned of an actual population implosion rather than an explosion.  Two decades later he wrote The Birth Dearth, examining the “Total Fertility Rate,” which was falling around the globe.  He vainly hoped to stimulate a national conversation on the subject, but few (alas) recognized the reality of the birth dearth.  Returning to his concern in Fewer:  How the New Demography of Depopulation Will Shape Our Future (Chicago:  Ivan R. Dee, c. 2004), he argued that “never have birth and fertility rates fallen so far, so fast, so low, for so long, in so many places, so surprisingly” (p. 5).  “For at least 650 years,” he says, “the total number of people on earth has headed in only one direction:  up.  But soon—probably within a few decades—global population will level off and then likely fall for a protracted period of time” (p. 5).  

European, Russian and Japanese populations are virtually in free fall, with a Total Fertility Rate (TFR) significantly below the requisite replacement level (2.1 per woman).  Europe’s population will likely shrink from 728 million in 2000 to 632 million in 2050.  To replace lost babies, Europe would need to take in nearly two million (rather than the current 376,000) immigrants each year.  Russia has a TFR of 1.14 and will lose 30 percent of its population by mid-century.  Not only are folks having fewer children—they want fewer!  And lest we think this is true only of prosperous, highly industrialized nations, it also applies to Less Developed Countries, where a dramatic reduction in population growth has occurred within the past few decades.  China, for example, had a TFR of 6.06 forty years ago; after instituting a “one child only” policy, by the beginning of the millennium the TFR fell to 1.8!  Similarly, South Korea’s 2005 rate fell to 1.17.  Brazil and Mexico reveal the same depopulating trajectory.  In fact, few nations are repopulating themselves.  America, however, has proved somewhat exceptional, sustaining a replacement level fertility rate—in part through welcoming immigrants who frequently have large families.  

To explain this phenomenon, Wattenberg points first to increased urbanization, where children are something of a liability rather than an asset.  Secondly, as women pursue higher education and careers—and as couples earn more money—they have proportionally fewer children.  “More education, more work, lower fertility” (p. 96).  Thirdly, abortion disposes of 45 million developing children every year.  Fourthly, divorce lowers fertility as single women welcome fewer children than their married counterparts.  Fifthly, contraception (exploding since the ‘60s) empowers couples to enjoy sexual pleasure without undesired consequences.  And sixthly, since couples marry later in life they inevitably have fewer offspring.  The ominous consequences of this depopulation cannot be ignored, because the welfare states established in virtually all modern countries simply cannot support growing numbers of elderly retirees funded by dwindling numbers of younger workers.  Successful businesses thrive by employing creative young workers and selling goods to expanding numbers of consumers—essential factors inevitably absent as populations decline.  Nations—and civilizations—will lose power and influence as their numbers decline.  Less free, less enlightened dictatorial successors may very well replace them.  The world, quite simply, will be a radically different place within a century.  

* * * * * * * * * * * * * * * * * * * * * * 

In What to Expect When No One’s Expecting:  America’s Coming Demographic Disaster (New York:  Encounter Books, c. 2013) Jonathan V. Last details the latest data regarding population prospects.  The book’s title reveals its thesis:  no one’s expecting these days—and paradoxically, as P.J. O’Rourke quipped, “the only thing worse than having children is not having them.”  Failing to heed the Bible’s first injunction—“be fruitful, and multiply, and replenish the earth”—modern man faces an uncertain prospect bereft of children, coddling pets as their “fuzzy, low-maintenance replacements” (p. 3).  The earth, it seems, will grow emptier.  Indeed, “only 3 percent of the world’s population lives in a country whose fertility rate is not declining” (p. 92).  We are moving from the “First Demographic Transition,” wherein children took center-stage and politicians built careers on looking out for them, to the “Second Demographic Transition,” wherein individual adults shun both families and children to pursue their own careers and pleasures.  “Middle-class Americans don’t have very many babies these days.  In case you’re wondering, the American fertility rate currently sits at 1.93,” significantly below the requisite replacement level (p. 4).  At the moment, the deficit is rectified by Hispanic women, who average 2.35 babies, but they too are rapidly choosing to have fewer and fewer.  For example, the once-plenteous supply of Puerto Rican immigrants to New York has collapsed as the island’s birthrate shrank in 50 years from 4.97 to 1.64.  “Some day,” Last says, “all of Latin America will probably have a fertility rate like Puerto Rico’s.  And that day is coming sooner than you think” (p. 113).  Labor shortages in Latin countries will eliminate the need to emigrate and the U.S. population picture will quickly resemble Europe’s.    

Glancing abroad, by 2050 Greece may lose 20 percent of its people; Latvia has, since “1989 lost 13 percent” and “Germany is shedding 100,000 people a year” (p. 25).  Spain registers barely one baby per woman.  Japan’s population is shrinking and abandoned rural villages bear witness to the trend.  It’s the same in Russia:  “In 1995, Russia had 149.6 million people.  Today, Russia is home to 138 million.  By 2050, its population will be nearly a third smaller than it is today” (p. 25).  Consequently, Vladimir Putin has zealously promoted a variety of failing schemes designed to encourage women to have more children.  But they choose not to!  Other things seem more important.  “Divorce has skyrocketed—Russia has the world’s highest divorce rate.  Abortion is rampant, with 13 abortions performed for every 10 live births.  Consider that for a moment:  Russians are so despondent about the future that they have 30 percent more abortions than births.  This might be the most grisly statistic the world has ever seen.  It suggests a society that no longer has a will to live” (p. 137).  

Portents of things to come stand revealed in Hoyerswerda, a German city near the Polish border.  In 1980 the town had 75,000 residents and “the highest birth rate in East Germany” (p. 98).  With the collapse of the Soviet Union, the folks there (and, more broadly, throughout the former East Germany) simply stopped procreating.  The fertility rate abruptly plunged to 0.8 and within three decades the town lost half of its residents.  Hoyerswerda “began to close up shop” (p. 98).  Buildings, businesses, and homes stood vacant.  Similar developments across the country dictated a significant shift from “urban planning” to expand and develop infrastructures and suburbs to devising ways “to shrink cities” (p. 98).  Parks now proliferate, replacing factories and schools.  The wolf population is actually resurgent, with wolf-packs prowling around dwindling settlements.  

Mirroring these European trends is Old Town Alexandria (the D.C. suburb where Last and his wife lived for a while)—a “glorious preserve of eco-conscious yoga and free range coffee.  My neighbors had wonderfully comfortable lives in part because they didn’t take on the expense of raising children” (p. 25).  As a portent of things to come, white, college-educated women, shopping in Alexandria’s stylish boutiques and devotedly determined to pursue a career, have a fertility rate of 1.6—barely more than Chinese women after decades of that nation’s recently-suspended “one-child” policy.  In 1970 the average Chinese woman bore six children and the Communist regime envisioned multiple problems with the ticking population bomb.  Energetic policies were implemented until quite recently, when the rulers realized the implications of population implosion.  But a trajectory has been set and within forty years “the age structure in China will be such that there are only two workers to support each retiree” (p. 13).  

Looking to explain this world-wide pattern, the author lists a variety of “factors, operating independently, with both foreseeable and unintended consequences.  From big things (like the decline in church attendance and the increase of women in the workforce) to little things (like a law mandating car seats in Tennessee or the reform of divorce statutes in California), our modern world has evolved in such a way as to subtly discourage childbearing” (p. 16).  Certainly there are good reasons not to procreate.  Heading the list is money.  Rearing a child may very well cost parents a million dollars!  Financially, it’s the worst investment possible!  “It is commonly said that buying a house is the biggest purchase most Americans will ever make.  Well, having a baby is like buying six houses, all at once.  Except you can’t sell your children, they never appreciate in value, and there’s a good chance that, somewhere around age 16, they’ll announce:  ‘I hate you’” (p. 43).  

Complicating this are welfare state structures such as Medicare and Social Security that “actually shift economic incentives away from having children” (p. 46).  Though Texas Governor Rick Perry was ridiculed for suggesting it, Social Security really is a “Ponzi scheme” that will only work “so long as the intake of new participants continues to increase” (p. 107).  In 1950 three million folks were getting Social Security checks; thirty years later there were 35 million retirees expecting monthly payments; by 2005 nearly 50 million were on the rolls, taking $546 billion a year from taxpayers still working.  In its initial (New Deal) phase, Social Security only exacted one percent of a worker’s paycheck; 30 years later (under LBJ’s Great Society) the rate inched up to three percent; by 1971 it jumped to 4.6 percent; and today (shielded from any adjustments by Barack Obama) it amounts to 6.2 percent.  The sky, you might say, is the limit as an endless line of elders look to their shrinking numbers of children to pay the bills.  The same goes for Medicare—except the prognosis is worse by far!  It simply cannot survive in its present form, given the realities of a shrinking population.  Both programs “were conceived in an era of high fertility.  It was only after our fertility rate collapsed that the economics of the programs became dysfunctional” (p. 109).  

Yet looming above all else is “the exodus of religion from the public square” (p. 84).  Devout Catholics and Protestants have more kids.  They shun the behaviors facilitating population decline—contraception, abortion, cohabitation, delayed marriage, divorce—and church-going couples fully enjoy marriage in ways unavailable to their secular counterparts.  Practicing Protestants increasingly resemble practicing Catholics, procreating more than enough youngsters to support population growth.  But non-religious women, according to a 2002 survey, had a fertility rate of only 1.8, whereas women who rated religion as “not very important” clocked in at 1.3.  Ultimately “there’s only one good reason to go through the trouble” of rearing children:  “Because you believe, in some sense, that God wants you to” (p. 170).  For this reason our government especially should craft family-friendly, child-friendly policies—repudiating the feminist and homosexual ideologies shaping the laws and judicial decrees of the past half-century.   

* * * * * * * * * * * * * * * * * * * * * * 

Columnist Mark Steyn, whether writing or speaking, is justly renowned for his infectious humor and verbal dexterity, bringing to his discussions of serious subjects a sustained note of good cheer.  There is little to cheer about, however, in Steyn’s America Alone:  The End of the World as We Know It (Washington:  Regnery Publishing, Inc., c. 2006), wherein he casts a gloomy look at demographic realities and predicts that “the Western world will not survive the twenty-first century, and much of it will effectively disappear within our lifetimes, including many if not most European countries” (p. xiii).  Within 40 years “60 percent of Italians [once lionized for their large and boisterous families] will have no brothers, no sisters, no cousins, no aunts, no uncles” (p. xvii).  Declining populations will leave welfare states unsustainable and civilization unfeasible.  “Civilizations,” said Arnold J. Toynbee, in A Study of History, “die from suicide, not murder,” and Western Civilization is in the midst of self-inflicted mortal wounds.  “We are,” Steyn says, “living through a rare moment:  the self-extinction of the civilization which, for good or ill, shaped the age we live in” (p. 3).  

Though we’re tempted to think such things have never happened before, Steyn jolts us with a quotation from Polybius (c. 150 B.C.), one of the greatest ancient historians, who said:  “In our own time the whole of Greece has been subject to a low birth rate and a general decrease of the population, owing to which cities have become deserted and the land has ceased to yield fruit, although there have neither been continuous wars nor epidemics. . . .  For as men had fallen into such a state of pretentiousness, avarice, and indolence that they did not wish to marry, or if they married to rear the children born to them, or at the most as a rule but one or two of them, so as to leave these in affluence and bring them up to waste their substance, the evils rapidly and insensibly grew” (The Histories, XXXVI).    

Basic to demographic decay, as both Polybius and Steyn argue, is an apparently irreversible moral and spiritual decay that leaves increasing numbers of people listless.  Irreligious folks inevitably lose faith not only in an invisible God but in equally invisible ethical principles and reasons to live hopefully for the future.  Thus Europe’s population has plunged like a raft going over a waterfall in the wake of the de-Christianization of the continent.  Childless places like Japan and Singapore and Albania have little religious vitality.  Standing alone in the midst of all this is the United States, which still enjoys a modest population growth.  True to form, the U.S. is the extraordinary Western nation still featuring robust religious activity.  Unfortunately this may not long persist since “most mainline Protestant churches are as wedded to the platitudes du jour as the laziest politician.”  They “are to one degree or another, post-Christian.  If they no longer seem disposed to converting the unbelieving to Christ, they can at least convert them to the boggiest of soft-left political clichés, on the grounds that if Jesus were alive today he’d most likely be a gay Anglican bishop in a committed relationship driving around in an environmentally friendly car with an ‘Arms Are for Hugging’ sticker on the way to an interfaith dialogue with a Wiccan and a couple of Wahhabi imams” (pp. 98, 100).  Without a resurgence of orthodox, muscular Christianity, Steyn thinks, America too will soon choose the childless path to historical oblivion.  

In addition to population implosion, Steyn devotes much attention in America Alone to the threat of Islamic imperialism, facilitated by the growing passivity—the unwillingness to resist terrorism—throughout much of what was once labeled “Western Civilization.”  Indicative of the trend was “the decision of the Catholic high school in San Juan Capistrano to change the name of its football team from the Crusaders to the less culturally insensitive Lions” (p. 158).  (Simultaneously, 75 miles to the south, lock-stepping with the culture, Point Loma Nazarene University—while I was on the faculty—similarly changed its mascot from Crusaders to Sea Lions.)  This loss of a masculine will-to-fight, as well as the will-to-procreate, signifies a collapsing culture.  Indeed, the chief characteristic of our age is “deferred adulthood” (p. 191).  And it takes strong adults to create and sustain a culture.  

* * * * * * * * * * * * * * * * * * * 

However realistically we appraise the threat of Islamic Jihadism, demographic realities foretell coming calamities in Muslim lands during the next half-century.  This prospect becomes clear in David P. Goldman’s How Civilizations Die (And Why Islam Is Dying Too) (Washington:  Regnery Publishing, Inc., c. 2011).  Growing numbers of us are aware of the “birth dearth” haunting much of the world, but because it’s underreported, few know that within four decades “the belt of Muslim countries from Morocco to Iran will become as gray as depopulating Europe” (p. x).  For example, females in Iran, though now surrounded by half a dozen siblings, will themselves “bear only one or two children during their lifetimes” (p. x).  “The fastest demographic decline ever registered in recorded history is taking place today in Muslim countries” (p. xv).  

Along with his description of demographic patterns in Muslim nations, Goldman’s discussion of “four great extinctions” makes his book worth pondering.  The first extinction took place a millennium before Christ, with the passing of the Bronze Age and the disappearance of cities such as Troy, Mycenae, and Jericho.  The second extinction, two hundred years before Christ, enervated the Hellenic civilization once centered in cities such as Athens and Sparta.  Aristotle says Sparta shrank within a century from 10,000 to 1,000 citizens.  Increasingly large landed estates, run for the benefit of an ever-diminishing aristocracy that was less and less concerned with rearing children and indulged itself in sexual perversions such as pederasty, left Sparta bereft of people and militarily listless.  The city was, he said, “ruined for want of men” (Politics, II, ix).   

The third extinction marked the end of Rome’s power and grandeur in the fourth and fifth centuries A.D.  Even in the glory days of the Empire, when Augustus Caesar reigned, “there was probably a decline” in the empire’s population due to “the deliberate control of family numbers through contraception, infanticide and child exposure” (p. 131).  Augustus himself decreed punishments for “childlessness, divorce, and adultery among the Roman nobility” (p. 131), but nothing worked, and the empire’s needed laborers and soldiers were necessarily drawn from defeated (or volunteering) barbarians from the North.  We are now in the midst of the fourth extinction, when civilizations (East and West) are beginning to show symptoms of rigor mortis.  This extinction began in many ways with the French Revolution in 1789, the “world’s first attempt to found a society upon reason rather than religion” (p. 134), followed by subsequent waves of revolutionary activity that transformed Europe into a bastion of atheistic and anti-natal secularism.  

Though Islam seems to be a vibrant religion, currently regaining its virility through movements such as the Muslim Brotherhood, Goldman thinks it is in fact violently (and vainly) reacting to the global secularism fully evident in the dramatic decline of population throughout the Islamic world.  Joining “Western Civilization,” Islam is another dying culture!  So fewer and fewer of us will inherit the earth.  

243 Scared to Death

 Though I routinely recommend various books, wishing them widely read, I occasionally finish one wishing everyone fully knew its contents, for, as the prophet Hosea said, “My people are destroyed for lack of knowledge” (4:6).   Scared to Death:  From BSE to Global Warming:  Why Scares Are Costing Us the Earth (New York:  Continuum, c. 2007; 2009 reprint), by two British journalists, Christopher Booker and Richard North, is one of those books.  In brief, they document Shakespeare’s insight in A Midsummer Night’s Dream (“In the night, imagining some fear, how easy is a bush supposed a bear”), showing how a succession of unfounded fears have panicked and harmed millions of people.  Each panic followed a “common pattern,” beginning with alleged scientific data portending a catastrophe in the making.  “Each has inspired obsessive coverage in the media.  Each has then provoked a massive response from politicians and officials, imposing new laws that inflicted enormous economic and social damage.  But eventually the scientific reasoning on which the panic was based has been found to be fundamentally flawed” (p. ix).  Though differing in details, they all resemble the “millennium bug” that so exercised millions of folks as January 2000 approached.  Eminent authorities warned of “potentially disastrous global consequences to both business and government” as computers were predicted to malfunction.  Scores of institutions invested millions of dollars preparing for the crisis.  But absolutely nothing happened!  

The first part of the book delves into a litany of “food scares” that profoundly affected Great Britain.  Beginning in 1985, a few cattle died as a result of brain infection—known as “cattle scrapie” and ultimately dubbed “Mad Cow Disease.”  At the same time scattered salmonella and listeria outbreaks led anxious experts to blame eggs and cheese as the culprits.  Government scientists and bureaucrats leapt into action, persuaded they needed to protect the public, decreeing the slaughter of herds and flocks.  Flooded with sensational statements in the newspapers and on TV, people around the world suddenly shunned British beef and eggs, bankrupting scores of small farmers.  Hygiene became a pressing and paramount issue, though food poisoning incidents “remained curiously stable” (p. 76).  Absolutely no evidence existed linking brain encephalopathies in livestock to human beings, yet nothing deterred government spokesmen and journalists from hyping the threat.  When the “Mad Cow disease” was finally  laid to rest, more than 8,000,000 cattle and sheep had been destroyed with a total cost of more than three billion pounds.  Comprehensively calculated, the panic cost twice that.  “Without question it was the most expensive food scare the world has ever seen” (p. 126).

Having examined, in detail, health-related scares in Britain, Booker and North devote the second part of Scared to Death to “general scares” that duplicate the same pattern.  “In many ways the first truly modern ‘scare’ was one that began in America” following the publication of Rachel Carson’s Silent Spring in 1962 (p. 167).  She blamed DDT, a powerful insecticide widely used following WWII, for poisoning the environment and causing cancer.  Though it had been remarkably successful—reducing malaria mortality rates by 95 percent—fervent environmentalists quickly crusaded to ban DDT.  Greenpeace and the World Wildlife Fund effectively pushed for a world-wide ban on the substance, despite the fact that, as Michael Crichton said:  “‘Since the ban two million people a year have died unnecessarily from malaria, mostly children.  All together, the ban has caused more than fifty million needless deaths.  Banning DDT killed more people than Hitler’” (p. 170).  No solid studies have found DDT remotely responsible for cancer in human beings.  Indeed its worst consequence seems to be the thinning of eggshells for birds of prey.  

An examination of “The Modern Witch Craze” documents the incredible claims of Satanic ritualistic abuse of children enkindled in the 1980s.  Beginning with allegations brought by a California mother who believed her son had been abused in the McMartin Infant School, and sustained by a corps of social workers and counselors who insisted children’s stories could not be doubted, dozens of innocent people were brought to trial (in Britain as well as America) and imprisoned before mounting evidence demonstrated the folly of it all.  Some of the accused committed suicide.  We now know that social workers (armed with state authority) separated the children from their parents for weeks or even months at a time to interview them.  The children “were repeatedly plied with leading questions of a type which would never have been allowed in a courtroom” (p. 191).  Their outlandish stories, venturing into the fantastical, were taken literally by the psychological “experts” (often claiming to help children recover repressed memories) and all too frequently trusted by prosecutors.  In time most of the adult “culprits” were vindicated and we now know how untrustworthy both children’s stories and social workers’ constructions can actually be.  But the actual pain and suffering resulting from the witch craze can hardly be calculated.  

Few of us filling our gas tanks with unleaded fuel realize the high price we pay results, in part, from the billions of dollars wasted through the “lead scare” that mandated it.  Concentrated doses of lead (e.g. in ancient Roman water pipes) can certainly be toxic and its presence in gasoline helped pollute the air.  But lead is a “miraculous” additive to gasoline, significantly improving engine efficiency, and there was absolutely no evidence that leaded gas residue was any threat to public health.  However, a single, scientifically dubious study (by Robert Needleman, a child psychologist from the University of Pittsburgh) alleging harmful effects on children’s intellectual development was manipulated by politicians and the Environmental Protection Agency to mandate unleaded gasoline and justify a massive social change.  Yet Needleman was acclaimed for his work, receiving the Rachel Carson Award for Integrity in Science in 2004.  According to EPA administrator Carol Browner:  “‘The elimination of lead from gas is one of the great environmental achievements of all time’” (p. 234).   If so, one must wonder precisely what was actually achieved apart from fuzzy feelings about helping the children!  

While no one today doubts the lethal effects of smoking cigarettes, the threat of “passive smoking” can hardly be proven.  Smokers harm themselves but not “innocent” bystanders.  Yet during the past several decades activists have successfully campaigned to require, at considerable cost, a “smoke-free” environment virtually everywhere.  Thus for 20 years it has been illegal to smoke in California “workplaces, bars and restaurants, but also within a yard and a half of any public building and on its famous beaches” (p. 254).  Allegations that thousands of people die each year due to “passive smoking” quite simply lack any statistical or factual basis.  Non-smokers may be offended by the smell of tobacco smoke, but they suffer no real harm.  A massive research project, commissioned by the World Health Organization and conducted by 27 esteemed epidemiologists and cancer specialists, demonstrated this.   “Across the board and in all seven countries, their conclusions were consistent.  They found no evidence that there was any ‘statistically significant’ additional risk from passive exposure to smoke, either at home or in the workplace” (p. 256).  

Another study, “the longest and most comprehensive scientific study ever carried out into the effects of passive smoking anywhere in the world,” commissioned by the American Cancer Society, similarly concluded that “there was no ‘causal relation between environmental tobacco smoke and tobacco-related mortality’” (p. 261).  One would think such evidence would lead to retractions and shifts in public pronouncements and policy.  Wrong!  Anti-tobacco fanatics facilely disregarded the evidence, sought to suppress scholarly papers and silence dissenters, linking arms with politicians such as New York’s Mayor Bloomberg on their mission to purify the air of all taints of the hated weed!  Booker and North conclude:  “The triumph of the campaign against passive smoking had provided one of the most dramatic examples in history of how science can be bent and distorted for ideological reasons, to come up with findings that the evidence did not support, and which were in many ways the reverse of the truth.  In this respect, it provided one of the most vivid examples in modern times of the psychological power of the scare” (p. 270).  

Add to the fear of tobacco smoke the fear of asbestos, one of the world’s most wonderful fire-resistant minerals, widely used in water pipes, brake linings, and building materials.  It can be woven into cloth-like products or mixed with plaster and cement as a reinforcement stronger than steel.  As with tobacco, however, some forms of asbestos can prove deadly when inhaled and absorbed by the lungs.  This is true, however, of only one kind of it!  The more common “white asbestos” is “by far the most widely used” and “poses no measurable risk to human health” (p. 276).  Fully 90 percent of the asbestos found in America’s buildings was benign.  But the limited numbers of workers dying of cancer due to intensive exposures to the deadly form of  asbestos enabled scaremongering lawyers and contractors, buoyed by EPA edicts, to pounce on people’s ill-informed fear of any exposure to any kind of it.   Buildings of all sorts (churches, schools, factories, homes) must be cleansed!  Companies must be punished through lawsuits—and, in time, great corporations such as Johns-Manville were destroyed and even Lloyds of London nearly collapsed.  Cunning lawyers extracted billions of dollars from beleaguered asbestos suppliers.  Legislative efforts to curtail the proliferating lawsuits were blocked “by a caucus of Democrat senators [e.g. Joe Biden; Edward Kennedy; John Kerry; Hillary Clinton; John Edwards] who had each received huge sums in campaign funding from law firms” (p. 322).  Ultimately, says Professor Lester Brickman:  “Asbestos litigation has come to consist, mainly, of non-sick people . . . claiming compensation for non-existent injuries, often testifying according to prepared scripts with perjurious contents, and often supported by specious medical evidence . . . it is . . . a massively fraudulent enterprise that can rightly take its place among the pantheon of . . . great American swindles’” (p. 273).  

Even more devastating is the irrational fear of global warming, “the new secular religion,” which is now fraudulently branded “climate change” since the evidence for actual warming has faded.  Objective historians have long noted significant climate changes—a “pre-Roman Cold” period (700-400 B.C.), a “Roman Warming” time (200 B.C.-500 A.D.), a cold era during the “Dark Ages” (500-900 A.D.), a “Medieval Warming” time (900-1300 A.D.), a “Little Ice Age” (1300-1800), and the “Modern Warming” era we’re now in.  It has been much warmer, and much colder, in the past two millennia.  But hugely influential and well-funded scientists such as Michael Mann have distorted the record with sensational “evidence” including his spurious “hockey stick” graph that flattened out both the Medieval Warming and Little Ice Age.  Alarmists such as Mann seek to demonstrate temperature change with data from a single “1993 study of one group of trees in one untypical corner of the USA” (p. 359) and an “unqualified acceptance of the recent temperature readings given by hundreds of weather stations across the earth’s surface” (p. 359).  Ignored is evidence from weather satellites or the probable contamination of weather stations near urban centers.  

Bolstered by suspicious scientific pronouncements, activists such as Al Gore and Barack Obama have ignited widespread fears and orchestrated policies designed to fundamentally alter human behavior on earth through such things as the 1997 Kyoto Protocol.  Though Gore’s documentary—“An Inconvenient Truth”—won awards in various quarters, it was perceptively labeled by an Australian paleoclimatologist as “a propaganda crusade” largely “based on junk science”:  “His arguments are so weak that they are pathetic.  It is incredible that they and his film are commanding public attention” (p. 378).   Calling for a curtailment on burning fossil fuels rather than developing nuclear energy (by far the best solution to the problem of greenhouse gases), Gore and his green corps demand the development of various forms of “green energy.”  Interestingly enough, environmentalist rhetoric subtly shifted from warnings regarding “global warming” to admonitions for “green energy”!  Yet to this point highly-touted “green alternatives” such as wind turbines make little dent in the production of carbon dioxide—e.g. the 1200 turbines built in Britain produce only one-eighth of the electricity supplied by one coal-fired plant in Yorkshire!  

In fact, “climate change” is most likely driven by solar activity and clouds, with only minimal impact attributable to human activities.  “In many respects, however, the alarm over global warming was only the most extreme example of all the scares described in this book.  Yet again it had followed the same familiar pattern:  the conjuring up of some great threat to human welfare, which had then been exaggerated far beyond the scientific evidence; the collaboration of the media in egging on the scare; the tipping point when the politicians marshaled all the machinery of government in support of the scare; and finally the wholly disproportionate regulatory response inflicting immense economic and social damage for a highly questionable benefit” (p. 403).  

* * * * * * * * * * * * * * * * * * * * * * * * * * *

Oklahoma Senator James Inhofe’s The Greatest Hoax:  How the Global Warming Conspiracy Threatens Your Future (Los Angeles:  WND Books, c. 2012) seeks to counter the positions promoted by Al Gore and environmental alarmists.  Throughout the book Senator Inhofe pillories Gore, oft-times portrayed by the media as a “climate prophet” or “Goricle.”  Indeed, “Katie Couric famously said that Gore was a ‘Secular Saint,’ and Oprah Winfrey said that he was the ‘Noah’ of our time” (Kindle #1182).   Obviously Gore and environmentalists have embraced, and promote, a religion rather than a scientific position.  Thus dissenters from the environmentalist dogma like Inhofe are treated as heretics akin to “Holocaust deniers”!  To Robert F. Kennedy Jr., those who dare differ with Gore are traitors!  To deal with them, one journalist “called for Nuremberg-style trials for climate skeptics” (#1372)!   Their research must be proscribed, their publications censored!  

Folks such as Couric and Kennedy are, manifestly, full-fledged true believers who revel in hysterical rhetoric.  Folks like Senator Inhofe, in opposition, join a distinguished minority of highly-informed people who question the devotees of “climate change.”  Thus they cite credible scientists such as Dr. Claude Allegre, a noted French geophysicist, “a former French Socialist Party leader, a member of both the French and U.S. Academies of Science, and one of the first scientists to sound the global warming alarm—who changed around 2006 from being a believer to a skeptic” (#1903).  Joining Allegre, Dr. David Bellamy, a highly acclaimed figure in the UK, was “also converted into a skeptic after reviewing the science.”  Bellamy said that “‘global warming is largely a natural phenomenon’” and dismissed catastrophic fears as “‘poppycock.’”  He added:  “‘The world is wasting stupendous amounts of money on trying to fix something that can’t be fixed,’” and “‘climate-change people have no proof for their claims.  They have computer models which do not prove anything’” (#1919).  

Sitting on the Senate Environment and Public Works Committee, Inhofe has political acumen and access to substantive scientific studies.  Consequently he played a critical role in defeating President Obama’s “cap and trade” proposals.  (He was, importantly, working at the same time to pass the Clear Skies Act, designed to improve air quality, so he can hardly be dismissed as an enemy of environmental health.)  He proudly labels himself “a one man truth squad” on the global warming issue and includes a great deal of personal detail regarding his background and his concerns regarding the state of the American Union.  Consequently:  “This book constitutes the wake-up call for America—the first and only complete history of the Greatest Hoax, who is behind it, the true motives, how we can defeat it—and what will happen if we don’t” (#88).  He knows, for example, according to the testimony of EPA Administrator Lisa Jackson, that even if the U.S. enacted the most stringent policies designed to reduce carbon levels in the atmosphere “it would only reduce global temperatures by 0.06 degrees Celsius by 2050.  Such a small amount is hardly even measurable” (#140).  Still more:  “‘No study to date has positively attributed all or part [of the climate change observed] to anthropogenic causes’” (#706).  

So what’s actually taking place within the global warming scaremongering?  “Looking back, it is clear that the global warming debate was never really about saving the world; it was about controlling the lives of every American.  MIT climate scientist Richard Lindzen summed it up perfectly in March 2007 when he said ‘Controlling carbon is a bureaucrat’s dream.  If you control carbon, you control life’” (#440).  There’s no question that “progressives” from Woodrow Wilson to Barack Obama have striven to take control of our lives, purportedly to maximize pleasure and minimize pain for the public.  More broadly, to Vaclav Klaus, President of the Czech Republic:  “‘The global warming religion is an aggressive attempt to use the climate for suppressing our freedom and diminishing our prosperity.’”  It is a “totally erroneous doctrine which has no solid relation to the science of climatology but is about power in the hands of unelected global warming activists” (#19).  Klaus writes with an understanding of what European leaders such as French President Jacques Chirac envision when they tout the Kyoto treaty as “‘the first component of an authentic global governance’” (#553).  Equally perceptive, Canada’s Prime Minister Stephen Harper “called Kyoto a ‘socialist scheme’” (#561).  Consequently, Inhofe concludes:  “it is crystal clear that this debate was never about saving the world from man-made global warming; it was always about how we live our lives.  It was about whether we wanted the United Nations to ‘level the playing field worldwide’ and ‘redistribute the wealth.’  It was about government deciding what forms of energy we could use” (#3280).  

Senator Inhofe’s book takes its title from his “Greatest Hoax” Senate speech, and he is deeply convinced that “global warming” or “climate change” is indeed a bogus scenario manufactured by liberal elites who “truly believe that they know how to run things better than any individual country ever could.  In this way they are like ‘super-liberals’ on an international scale.  On one of its websites, the UN even claims that its ‘moral authority’ is one of its ‘best properties’” (#653).  This moral authority apparently resides in the UN’s self-righteous commitment “to the utopian ideals of global socialism” (#653), frequently promoted as necessary for “sustainable development.”  The spurious nature of this Hoax became clear when “Climategate, the greatest scientific scandal of our time,” broke (#2319).  A careful reading of the emails between scientists in the UK and US (reprinted in considerable detail as an appendix to this treatise) reveals, in the words of Clive Crook:  “‘The closed-mindedness of these supposed men of science, their willingness to go to any lengths to defend a preconceived message, is surprising even to me.  The stink of intellectual corruption is overpowering’” (#2359).  

The Greatest Hoax is important primarily because of its author’s position in government.  Inhofe has, to the degree possible for a busy politician, studied the evidence, assembled the data, and come to a reasoned conclusion regarding one of the most momentous issues of our day.  If the global warming alarmists are wrong, following their admonitions could irreparably harm not only this nation but the world, plunging us into a cataclysmic economic and social black hole.

# # # 

242 Nancy Pearcey’s Apologetics

As a restless, questing college student immersed in the relativism and subjectivism of her milieu, Nancy Pearcey found her way to Francis Schaeffer’s L’Abri Fellowship in Switzerland in 1971.  Here, for the first time, she found Christians (many of them long-haired hippies) seriously discussing and providing answers to the “Big Questions” she was asking.  Though reared in a Christian home and nurtured in a Lutheran church, she lacked the coherent, in-depth understanding of the faith Schaeffer set forth.  In subsequent decades, through advanced academic work, personal study and reflection, she established herself through a variety of publications as a thoughtful exponent of an orthodox evangelical worldview, writing for a popular audience.  She skillfully laces her discussions with quotations and illustrations, both personal and historical, making them accessible to all thoughtful readers.  Though at times overly simplistic (too easily reducing all issues to a “two-story” graphic) and superficial (sharing Schaeffer’s distaste for significant Catholic thinkers), she still provides helpful guidance in charting a meaningful framework for understanding and responding to our world.  Her most recent treatise, Saving Leonardo:  A Call to Resist the Secular Assault on Mind, Morals, & Meaning (Nashville:  B&H Publishing Group, c. 2010), continues her helpful endeavor to engage the culture from a Christian perspective.  

She begins by evaluating the everywhere-evident “threat of global secularism,” a massive cultural current transforming our world, primarily through our educational and artistic milieu.  Though often oblivious to its subtlety and power, we Christians must awaken to its threat.  Following the example of Early Church thinkers, we must “address, critique, adapt, and overcome the dominant ideologies of our day” (p. 14), bearing in mind J. Gresham Machen’s maxim:  “‘False ideas are the greatest obstacle to the reception of the gospel’” (p. 15).  To Pearcey—as to John Henry Newman—the false idea of modern secularism is its reduction of all truth (including metaphysical, theological and moral truth) to personal preference.  “Whatever works for you” goes the modern mantra!  To which Christians must respond:  Absolutely Not!  Rightly grasped, Christianity is not primarily a personal perspective nor an inner feeling of peace and optimism, but a trustworthy knowledge of what really IS.  This means we cannot accept the fact/value distinction that frequently dominates worldview discussions.  Many modern thinkers insist that whereas they deal objectively with scientific “facts,” all ethical “values” are subjective.  Unfortunately numbers of “Christian” thinkers embrace this disjunction.  Following Schaeffer’s lead, however, Pearcey insists this two-story view cannot but fail anyone seeking an integrated philosophy.  What must be recovered, she says, is a pre-Enlightenment perspective, seeing an interrelated symbiosis of natural and spiritual realities equally authored by an all-wise Creator.  

Having alerted us to the secularist threat, Pearcey gives us a “crash course on art and worldview”—nicely (if not lavishly) illustrated with scores of reprints in this well-appointed volume—that helps explain how secularism emerged during the past two centuries.  Throughout the ancient, medieval, and Renaissance-Reformation eras, styles changed but the underlying purpose endured:  highlighting beauty that reveals truth about God, man and nature, both visible and invisible.  As Walker Percy, one of America’s finest 20th century novelists, declared, “art is ‘a serious instrument for the exploration of reality.’  It is ‘as scientific and cognitive as, say, Galileo’s telescope or Wilson’s cloud chamber’” (p. 99).  

Enlightenment intellectuals, however, restricted “truth” to natural science.  Consequently, “art is merely decorative.  Ornamental.  Entertaining.  Isaac Newton called poetry ‘ingenious nonsense.’ . . .  Hume denounced poets as ‘liars by profession.’  Philosopher Jeremy Bentham agreed:  ‘All poetry is misrepresentation’” (p. 98).  It might soothe one’s inner turmoil or exalt one’s expectations, but it reveals nothing about anything ultimately real.  So too, many thought, for religion.  But revolutionary 18th century developments sparked not only political upheavals such as the French Revolution but artistic celebrations of highly individualistic and Romantic perspectives.  While scientists may well weigh and measure the external world of nature (how things are), artists insisted on freely imagining how things might or ought to be.  Thus, by the end of the 19th century, movements such as “impressionism” and “cubism” flourished as monuments to this disconnect between art and objective reality.  Rather than representing Reality like a photograph, Romantic art serves as a film projector in a theater, casting images on the screen, and throughout the 20th century, as Pearcey persuasively illustrates, this conviction intensified.  

As an antidote to these developments, Pearcey recommends a recovery of great Christian artistic works—including the music of Bach, the “fifth gospel,” which is, amazingly, quite popular in Japan.  Through the work of Masaaki Suzuki, a famous conductor, thousands of Japanese have learned to play and appreciate the work of the gifted Baroque composer.  “‘Bach works as a missionary among our people,’ Suzuki said in an interview.  ‘After each concert people crowd the podium wishing to talk to me about topics that are normally taboo in our society—death, for example.  Then they inevitably ask me what “hope” means to Christians.’  He concluded:  ‘I believe that Bach has already converted tens of thousands of Japanese to the Christian faith’” (p. 267).  And along with recovering great art we need to cultivate the high-quality art largely absent from popular culture.  In our churches and homes we need to powerfully represent the Gospel story, shunning the “spiritual junk food” and “sentimentalism” that so frequently masquerade as Christian “music” and “art.”  

To R. Albert Mohler, Jr., President of the Southern Baptist Theological Seminary, “Nancy Pearcey helps a new generation of evangelicals to understand the worldview challenges we now face and to develop an intelligent and articulate Christian understanding . . . Saving Leonardo should be put in the hands of all those who should always be ready to give an answer—and that means all of us.”  

* * * * * * * * * * * * * * * * * * *

During the past century, the cultural consequences of taking natural science as the sole guide to truth have been increasingly (indeed alarmingly) evident.  Illustrating this trend, Eric Temple Bell, a professor at the California Institute of Technology and former president of the Mathematical Association of America, declared that modern (i.e. non-Euclidean) geometry makes mathematics and logic, as well as metaphysics and ethics, endlessly tentative, asserting, in The Search for Truth, that there is no such thing as “Truth.”  Trashing Euclid and Plato, Aristotle and Aquinas—all of whom “forged the chains with which human reason was bound for 2,300 years”—Bell celebrated the brave new world of modernity freed from any illusions regarding absolutes of any sort.  Consequently, as Pope Benedict XVI noted in his inaugural message, we struggle with the “dictatorship of relativism” that renders all certainties suspect.  

Rightly responding to such “modern” views, Nancy Pearcey supported, in Total Truth:  Liberating Christianity from Its Cultural Captivity (Wheaton:  Crossway Books, 2003), Francis Schaeffer’s position as articulated in his 1981 address at the University of Notre Dame:  “‘Christianity is not a series of truths in the plural, but rather truth spelled with a capital “T.”  Truth about the total reality, not just about religious things.  Biblical Christianity is Truth concerning total reality—and the intellectual holding of that total Truth and then living in the light of that Truth’” (p. 15).  Most needed, Schaeffer and Pearcey insist, is a Christian “worldview” that fundamentally shapes the lives of millions of ordinary believers, thus transforming their culture.  “A worldview is like a mental map that tells us how to navigate the world effectively,” Pearcey says.  “It is the imprint of God’s objective truth on our inner life” (p. 23).  

Unfortunately, too many Christians (Evangelicals included) have reduced their faith to a purely internal, “spiritual” realm disconnected from the physical and social worlds.  In the opinion of Sidney Mead, a distinguished historian:  “‘This internalization or privatization of religion is one of the most momentous changes that has ever taken place in Christendom’” (p. 35).  Moreover, as Charles Malik noted, we need “‘not only to win souls but to save minds.  If you win the whole world and lose the mind of the world, you will soon discover you have not won the world’” (p. 63).  Thus pastors and teachers should do apologetics as well as preach salvation.  “Every time a minister introduces a biblical teaching, he should also instruct the congregation in ways to defend it against the major objections they are likely to encounter.  A religion that avoids the intellectual task and retreats to the therapeutic realm of personal relationships and feelings will not survive in today’s spiritual battlefield” (p. 127).  

Despite their many positive accomplishments—and Pearcey generously notes them—American Evangelicals may not survive today’s battles unless they take ideas seriously.  Though there was a scholarly (largely Calvinistic) dimension to 19th century Evangelicalism, the movement became, in revivalists’ hands, inordinately populist, more concerned with converting the masses than cultivating their minds.  Believers were, then, unprepared to resist and refute the powerful anti-Christian ideas propounded by Darwin, Marx, Freud, et al.  “The overall pattern of evangelicalism’s history is summarized brilliantly by Richard Hofstadter in a single sentence.  To a large extent, he writes, ‘the churches withdrew from intellectual encounters with the secular world, gave up the idea that religion is a part of the whole life of intellectual experience, and often abandoned the field of rational studies on the assumption that they were the natural province of science alone’” (p. 323).  

So it’s now time, Pearcey declares, to regain lost ground, to reestablish a Christian perspective, to redeem minds as well as hearts.  To do so, Christians need to structure their worldview in accord with three guiding certainties:  creation; fall; redemption.  An originally good creation has been scarred by sin, but God’s gracious redemptive work in Christ has (to a degree) restored His original intent.  Every worldview requires a creation story.  Materialists, ancient and modern, explain the world in terms of mindless matter-in-motion, and pantheists, whether Stoics or “process” thinkers, endow Nature with divine attributes.  Every worldview includes an explanation for the evil surrounding us—inadequately evolved species or demonic social institutions or malignant genes.  And every worldview promises redemption, whether through scientific breakthroughs or political revolutions or inner enlightenment.  Thus, as John Milton wrote, “the goal of learning ‘is to repair the ruins of our first parents’” (p. 129).  

To make a Christian case for Creation, Pearcey says we must deal cogently with Darwinism, stressing that it is, as Huston Smith said, “‘supported more by atheistic philosophical assumptions than by scientific evidence’” (p. 153).  Excluding any possibility of God, as Carl Sagan declared, “Nature . . . is all that IS, or WAS, or EVER WILL BE!”  Adamantly upholding this assumption, naturalistic Darwinism has become a “universal acid” eating away many of the most fundamental cultural certainties basic to Western Civilization.  “Half a century ago G.K. Chesterton was already warning that scientific materialism had become the dominant ‘creed’ in Western culture—one that ‘began with Evolution and has ended in Eugenics.’  Far from being merely a scientific theory, he noted, materialism ‘is really our established Church’” (p. 157).  Consequently:  “‘The so-called warfare between science and religion,’ wrote historian Jacques Barzun, should really ‘be seen as the warfare between two philosophies and perhaps two faiths.’  The battle over evolution is merely one incident ‘in the dispute between the believers in consciousness and the believers in mechanical action; the believers in purpose and the believers in pure chance’” (p. 173).  

The deleterious and far-reaching cultural impact of Darwinism stands illustrated by Joseph Stalin who, as a seminary student, lost his faith in God after encountering Darwin’s theory.  Subsequently he imposed his murderous form of atheism upon a large swathe of the world.  Less murderously, American thinkers—particularly pragmatists such as John Dewey and Oliver Wendell Holmes, Jr.—launched an effective assault on many of the traditions vital to this nation.  To them, all “truths” evolve in accord with naturalistic evolution and thus no permanent standards of right and wrong, in any area of life, actually exist.   Everyone “constructs” his own reality and writes his own rules.  Darwin himself realized, and feared, this logical outgrowth of his theory, confessing, in a letter:  “‘With me, the horrid doubt always arises whether the convictions of man’s mind, which has been developed from the mind of the lower animals, are of any value or at all trustworthy’” (p. 243).  

To refute Darwinism, Pearcey follows the lead of Philip Johnson, the U.C. Berkeley law professor who wrote Darwin on Trial and Reason in the Balance and helped launch the “intelligent design” movement.  Rather than bog down in secondary details regarding the age of the earth or the reality of microevolution, Christians need to “focus on the crucial point of whether there is evidence for Intelligent Design in the universe” (p. 174).  Importantly, we must grasp the significance of recent scientific insights, summed up by John Wheeler, a noted physicist, who said:  “‘When I first started studying, I saw the world as composed of particles.  Looking more deeply I discovered waves.  Now after a lifetime of study, it appears that all existence is the expression of information’” (p. 179).  The cosmos, it seems, is not ultimately mindless matter-in-motion; it is, rather, imprinted with an immaterial pattern, bearing information, or (as Christians have always believed) a Logos responsible for the design everywhere evident.  Just ponder for a moment the widely-heralded fact that every cell in your body contains as much information as 30 volumes of the Encyclopedia Britannica!  This Logos-structured world is increasingly evident as we begin to grasp the majesty of DNA, aptly defined by President Bill Clinton as “‘the language in which God created life’” (p. 1919).   

Darwin himself recognized this manifest design but sought to discount it as only “apparent,” not real.  Similarly, his modern apostle, Richard Dawkins, admits (in The Blind Watchmaker) that “‘Biology is the study of complicated things that give the appearance of having been designed for a purpose’” (p. 183).  But be not deceived, he insists:  it’s all an illusion spun by the random machinations of natural selection.  Such statements, Pearcey shows, pervade evolutionary literature, imploring us all to ignore common sense and believe the sacrosanct theory launched by Darwin.  Outside the faithful community of naturalistic evolution, however, we find alternatives expressed by thinkers such as Nobel Prize-winner Arno Penzias, who says:  “‘Astronomy leads us to a unique event, a universe which was created out of nothing, one with the very delicate balance needed to provide exactly the conditions required to permit life, and one which has an underlying (one might say “supernatural”) plan.’  In fact, he says, ‘The best data we have are exactly what I would have predicted, had I nothing to go on but the five books of Moses, the Psalms, the Bible as a whole’” (p. 189).  

Thus, Pearcey argues, the Bible contains the necessary ingredients for a coherent worldview.  Taking it seriously, living in accord with its precepts, gives us the basis for cultural activities.  Interested readers may begin this endeavor by consulting the annotated “recommended reading” list she supplies.  

                                                           * * * * * * * * * * * * * * 

For several years Nancy Pearcey worked with the late Chuck Colson, doing much of the research underlying his BreakPoint radio program and coauthoring his monthly column in Christianity Today.  Re-examining their coauthored How Now Shall We Live (Wheaton:  Tyndale House Publishers, Inc., c. 1999), one suspects that Pearcey did the bulk of the research and preliminary writing, with Colson adding his personal touch (scores of personal anecdotes, mostly taken from his prison ministry) and imprimatur.  This is especially evident in the book’s structure (basically duplicated, with added scholarly references, in Pearcey’s subsequent Total Truth), stressing four themes:  Creation (where did I come from?); Fall (what’s wrong with me and the world?); Redemption (is there any hope?); and Restoration (how can we help repair what’s wrong?).  A probing query from Ezekiel, struggling in Babylon during the exile, sets the stage:  “How should we then live?” (33:10, KJV).  

We should live, Colson and Pearcey answer, by crafting and following a biblical worldview, empowered by the realization, as St. Hippolytus said in the third century, that when Jesus ascended “‘His divine spirit gave life and strength to the tottering world, and the whole universe became stable once more, as if the stretching out, the agony of the Cross, had in some way gotten into everything’” (p. 13).  Thus everything, rightly understood, points to and leads to the Christ.  All truth is God’s truth!  As the great astronomer Johannes Kepler declared:  “‘The chief aim of all investigations of the external world should be to discover the rational order and harmony which has been imposed on it by God’” (p. 51).  

Our challenge, as Christians, is to both discover and proclaim this truth in a world increasingly skeptical of all truth claims, frequently claiming to find neither rational order nor harmony anywhere.  Such skepticism grows logically from the philosophical naturalism dominant in our culture and on display in school textbooks, PBS programs, and the EPCOT Center in Disney World.  In accord with the “just so” naturalistic story, for billions of years things just happened without design or purpose.  Lifeless chemicals somehow ignited biological beings (proteins and molecules) mysteriously structured by DNA strands.  And we human beings are nothing more than enlivened chemicals, inexplicably endowed with consciousness, self-awareness, and conscience.  This is the materialist creed; consequently, as C.S. Lewis said:  “‘The Christian and the materialist hold different beliefs about the universe.  They can’t both be right.  The one who is wrong will act in a way which simply doesn’t fit the real universe’” (p. 307).  

To Colson and Pearcey, the Christian worldview enables us to live wisely and well in the real universe, for “Christianity gives the only viable foundation for intellectual understanding and practical living” (p. 489).  Whatever our gifts, whatever our vocation, we may play a vital role in God’s work so long as we do it Soli Deo Gloria—only to the glory of God!  We especially need to give attention to our homes, churches, neighborhoods, and schools, bearing in mind the words of the tempter in C.S. Lewis’s The Screwtape Letters:  “‘What I want to fix your attention on is the vast overall movement towards the discrediting, and finally the elimination of every kind of human excellence—moral, cultural, social, or intellectual’” (p. 331).  Our children and friends, as well as the world, need godly (i.e. good) music and literature, art and philosophy, films and TV.  Rather than bring Rock & Roll into the sanctuary we need to take Bach into the marketplace!  Rather than giving children video games, offer them Lewis’s Narnian chronicles and J.R.R. Tolkien’s Lord of the Rings trilogy.  

People also need to live in neighborhoods where broken windows are repaired and playgrounds are safe, where laws are enforced and elders respected.  So well-informed political action (e.g. voting!), particularly on a local level, should be part of the Christian vocation.  Rather than trying to ape (and somehow Christianize) the world we should seek to transform it with divinely-rooted truths.  Thereby we will implement the vision of Francis Schaeffer, to whom this treatise is dedicated, volunteering for service in Christ’s corps of intellectual warriors, contending for the Faith once delivered to the saints.  

241 “Dupes” and “The Communist”

Writing Witness, Whittaker Chambers—one of the most celebrated 20th century repentant Communists, who helped expose Alger Hiss and other Soviet spies—reflected on his years working within the Party:  “While Communists make full use of liberals and their solicitudes, and sometimes flatter them to their faces, in private they treat them with the sneering contempt that the strong and predatory almost invariably feel for the victims who volunteer to help in their own victimization.”  Such malleable liberals are analyzed in depth by Cold War historian Paul Kengor in Dupes:  How America’s Adversaries Have Manipulated Progressives for a Century (Wilmington:  ISI Books, c. 2010).  He skillfully documents the prescience of Norman Thomas, the perennially nominated presidential candidate of the Socialist Party, who said Americans could not be persuaded to candidly establish “socialism” but under the aegis of “liberalism” would gradually put it in place “‘without ever knowing how it happened’” (p. 479).  

Due to the collapse of the USSR, historians such as Kengor have profited from “the massive declassification of once-closed Cold War archives, from Moscow to Eastern Europe to the United States” (p. 3), and the archival materials depicting the Communist Party USA are especially illuminating.  Though American Progressives were not Communists, they fully supported the Communist wish list:  “workers’ rights, the redistribution of wealth, an expansive federal government, a favoring of the public sector over the private sector, class-based rhetoric (often demagoguery) toward the wealthy, progressively high tax rates, and a cynicism toward business and capitalism, to name a few.  The differences were typically matters of degree rather than principle” (p. 4).  To Kengor, however, such degrees of difference really matter, for by assisting the Communist movement they “knowingly or unknowingly contributed to the most destructive ideology in the history of humanity.  This is no small malfeasance” (p. 11).  

Following the Bolsheviks’ triumph in Russia, Communists organized two Chicago subsidiaries which soon merged into the Communist Party USA (CPUSA), which for 50 years dutifully followed edicts from Moscow.  To bring about revolution in America, however, Communists needed to strategically misrepresent themselves and subtly subvert the nation’s social and economic structures.  Thus they encouraged and showcased “Potemkin Progressives” such as Corliss Lamont—an atheistic “humanist” professor at Columbia University who helped lead the Friends of the Soviet Union and in 1933 wrote Russia Day by Day to celebrate the glories of the Soviet Union.  They also promoted the agenda of another Columbia professor, John Dewey—the renowned pragmatist philosopher who largely shaped the progressive educational agenda that has dominated America’s public schools for nearly a century.  His educational ideas incorporated significant swathes of Marxism and were actually implemented in Russia in the 1920s before being adopted by American schools.  

Invited to visit the USSR in 1928, Dewey was given the standard Potemkin village tour touting the grandeurs of Communism.  Returning home, he wrote glowing reports of what he’d seen, affirming that the Bolshevik agenda was “a great success.”  He especially endorsed the Soviet schools, the “ideological arm of the Revolution,” which would lead to the success of “The Great Experiment and the Future” in Russia (pp. 90-99).  Dewey was not uninformed of the brutalities accompanying this great experiment and acknowledged its “secret police, inquisitions, arrests and deportations” (p. 100).  But he glibly rationalized them as necessary for the regime to prosper.  He would thus head a corps of influential intellectuals urging President Franklin D. Roosevelt to formally recognize and establish diplomatic relations with the USSR.  Soon thereafter, however, faced with mounting evidence regarding Stalin’s Great Purge and its massive bloodletting, Dewey bravely retracted his endorsement of the Stalinist regime.  He was especially distressed by Stalin’s attack on Leon Trotsky and joined a commission that traveled to Mexico to defend him.  For his apostasy the once-acclaimed philosopher was vilified and branded by Stalinists “an enemy of the people.”  

Though Stalinists distrusted and denounced President Roosevelt, they diligently sought to infiltrate his administration in the 1930s.  Because of his prominence and influence, Harry Hopkins—appointed by FDR to head the WPA, serving as his “right-hand man during World War II,” living in the White House and accompanying Roosevelt to the major WWII conferences—doubtless “stands as the most sensational case among the potential Soviet agents” (p. 124).  Newly available archival evidence demonstrates that Hopkins was in fact a Soviet spy who effectively duped the president on behalf of his “buddy” Uncle Joe, and manipulated programs such as Lend-Lease to benefit Russia.  Concurrently, FDR rejected warnings regarding Stalin and followed his “hunch” that the despot could be trusted.  Relying on Hopkins’ advice toward the end of the war, he believed Stalin wanted nothing “‘but security for his country, and I think that if I give him everything I possibly can and ask nothing from him in return, noblesse oblige, he won’t try to annex anything and will work with me for a world of democracy and peace’” (p. 165).  We now know, of course, that Stalin rapidly occupied Eastern Europe after WWII and provoked a Cold War that endured for 40 years, enabling the slaughter of millions of people and devastating the economies of dozens of countries.  FDR died ignorant of the havoc resulting from the fact that “from the start to the finish of his administration, the great New Dealer was greatly trashed, hated, and duped by Communists at home and abroad” (p. 181).  

Communists always recognized the power of propaganda, so using the arts—and especially the cinema, as Lenin stressed—was imperative.  Accordingly, the shaping of Hollywood became a central objective for Soviet agents in America, and they found many dozens of easily duped liberal “stars.”  Playwrights such as Lillian Hellman, Dashiell Hammett, and Arthur Miller (whose The Crucible amply illustrates the process) provided the scripts.  Singers including Paul Robeson and actors such as Katharine Hepburn, Lucille Ball, Gregory Peck and Humphrey Bogart were easily enlisted in the “progressive” (i.e. communist) cause.  Consequently, in 1947 the House Committee on Un-American Activities summoned a number of Hollywood celebrities to testify.  Some, including Ronald Reagan (head of the Screen Actors Guild and “hero” of the hearings) and Gary Cooper, were “friendly witnesses” who documented and denounced Communist activities.  Unfriendly witnesses—most notably the “Hollywood Ten,” four of whom we now know were dedicated Communists—parroted the Party line, labeling their critics as fascists on a witch-hunt and insisting there wasn’t a trace of communism in Hollywood.  Benefiting from the support of the press and powerful Democrat politicians (e.g. Claude “Red” Pepper of Florida), the propagandized public soon believed that it was the House Committee on Un-American Activities, rather than Communists in Hollywood, which was the real threat to the nation!

When the Korean War erupted, Corliss Lamont and Lillian Hellman defended the North Koreans.  When the United States joined the conflict in Vietnam, protesters such as Tom Hayden (perennially elected from his Santa Monica base to assorted offices in California), Dr. Benjamin Spock (author of a fabulously successful book on child-rearing), Arthur “Pinch” Sulzberger Jr. (in due time publisher of the New York Times), and John Kerry (the Democrat Senator and nominee for President in 2004) easily absorbed and articulated the Communist agenda.  Violent revolutionaries such as Bill Ayers worked for the defeat of the “American Empire” and the total transformation of America.  Many of these anti-war radicals of the ‘60s ultimately infiltrated and significantly shaped the Democrat Party, where they still exert influence. 

Resisting such radicals stood Ronald Reagan, fully aware of Communist strategies since his days as head of the Screen Actors Guild.   “For years as a private citizen and candidate, Reagan had fiercely opposed the accommodationist policy of detente and spoken frankly about the true nature of Soviet Communism” (p. 366).  To him, the USSR was an “evil empire” to be confronted and defeated.  Consequently, scores of “dupes” sought to deride and destroy him.  For example, Henry Steele Commager, an influential historian, called the President’s “evil empire” speech “‘the worst presidential speech in American history, and I’ve read them all.’  Why?  Because, said Commager, of “Reagan’s ‘gross appeal to religious prejudice’” (p. 393).  Senator Ted Kennedy chastised Reagan “for ‘misleading Red-scare tactics and reckless Star Wars schemes’” (p. 403).  Yet, says Kengor:  “As we now know from a highly sensitive KGB document, the liberal icon [Kennedy], arguably the most important Democrat in the country at the time, so opposed Ronald Reagan and his policies that the senator approached Soviet dictator Yuri Andropov, proposing to work together to undercut the American president” (p. 407).  

With the demise of the USSR and the emergence of radical Islam as the great threat to America, progressive “dupes” shifted gears while preserving their fundamental ideology.  Thus they opposed President George W. Bush’s Iraq policies.  Senate Democratic leader Harry Reid called him a “loser” and prominent politicians routinely labeled him a “liar.”  “Leftists in media and academia joined politicians like [Edward] Kennedy in attacking the White House” (p. 432).  Eric Foner, a Columbia University historian—an avowed socialist and former president of the American Historical Association—declared:  “‘I’m not sure which is more frightening:  the horror that engulfed New York City or the apocalyptic rhetoric emanating daily from the White House’” (p. 432).  

The unexpected emergence of Barack Obama was quickly promoted by “Progressives for Obama,” spearheaded by Tom Hayden, the leader of the Students for a Democratic Society in the ‘60s who had recruited a number of fellow-travelers such as Daniel Ellsberg and Jane Fonda (whom he married).  Admittedly, Obama is a “progressive liberal” rather than an SDS-style Marxist.  Yet “Hayden saw in Obama a long-awaited vehicle for ‘economic democracy,’ an instrument to channel an equal distribution of wealth—‘economic justice,’ or ‘redistributive change,’ as Obama himself once put it.  Hayden said that, ‘win or lose, the Obama movement will shape progressive politics . . . for a generation to come’” (p. 469).  Though Hayden has successfully operated within the Democratic Party in California, other ‘60s radicals (Bill Ayers and his wife, Bernardine Dohrn, Mark Rudd and Michael Klonsky) promote the cause within higher education.  Education, Ayers says, “‘is the motor-force of revolution’” (p. 475).  Ayers and Barack Obama worked together in Chicago to funnel money into the city’s schools so as to advance the cause of “social justice.”  Klonsky and Ayers co-authored an article that “raved about Arne Duncan, longtime head of Chicago public schools, whom the pair described as ‘the brightest and most dedicated schools leader Chicago has had in memory.’  Today Duncan is President Obama’s secretary of education” (p. 472).  

One of the aging radicals most thrilled with Obama’s election was a physician, Quentin Young, a long-term advocate of socialized medicine who had helped launch Obama’s career “in the living room of Bill Ayers and Bernardine Dohrn” (p. 477).  “Young noted that Obama, as a state senator in Illinois, had supported a ‘single-payer universal healthcare system’” that could be implemented when Democrats took complete control of Congress and the White House (p. 477).  Evaluating the 2008 election, Pravda declared “‘that like the breaking of a great dam, the American descent into Marxism is happening with breathtaking speed, against the backdrop of a passive, hapless sheep.’  That ‘final collapse,’ said the pages of the chief party organ of the former USSR, ‘has come with the election of Barack Obama’” (p. 478).  

For nearly a century communists have patiently worked behind the scenes, promoting their cause through progressive dupes.  Now, amazingly enough, in 2008 “Americans had voted CPUSA’s way:  the party could not contain its excitement over Obama’s victory.  The election of Barack Obama was the chance for a wish list to come true—a potential host of nationalizations, from the auto industry to financial services to health care, beginning with more modest steps like establishing the ‘public option’ in health-care reform, plus massive government ‘stimulus’ packages, more public-sector unionization and control, more redistribution of wealth, more collectivization.  ‘All these—and many other things—are within our reach now!’ celebrated Sam Webb in his keynote speech for the New York City banquet of People’s Weekly World, the official newspaper of CPUSA, which reprinted the speech under the headline ‘A New Era Begins.’  With the election of Obama, said Webb, the ‘impossible’ had become ‘possible’” (p. 478).  A “century of dupery” has succeeded!  

* * * * * * * * * * * * * * * * * * * 

In The Communist:  Frank Marshall Davis:  The Untold Story of Barack Obama’s Mentor (New York:  Threshold Editions, c. 2012), Paul Kengor moves from the general story told in Dupes to a very particular case of a little-known, card-carrying Communist who significantly influenced our nation by helping shape the young Barack Obama while he was growing up in Hawaii.  Kengor’s purpose, however, is not to question Obama’s ideology or agenda.  “My purpose is to show that Frank Marshall Davis—who clearly influenced Obama—was a communist and closet member of CPUSA with private loyalties to Mother Russia” (p. 18).  The story can now be more clearly told thanks to the treasure trove of documents made available following the collapse of the USSR.  Though Davis’s associates and pro-Soviet journalistic pieces elicited the attention of the House Committee on Un-American Activities following WWII, he and his defenders always denied he was actually a Communist.  But the propriety of the Committee’s concern has been validated by Davis’s recently revealed admission that sometime during the war he had “‘joined the Communist party’” (p. 92).  

Kengor tells the story of Davis, from his Arkansas City, Kansas, roots to his Chicago involvement (as a journalist) in Communist causes to his final days in Hawaii.  Three pivotal years in Atlanta in the early ‘30s, witnessing the notorious Scottsboro Boys trial (nine black boys were accused of raping two white girls), exacerbated his anger at racism and heightened his receptivity to the CPUSA’s lavishly funded Scottsboro propaganda campaign.  Returning to Chicago and initially working for the Associated Negro Press, he dove quickly into the intellectual waters colored by the views of “dupes” such as John Dewey and Margaret Sanger.  He also interacted with both secret members of the Party (such as the singer Paul Robeson) and more open devotees, including the celebrated writers Langston Hughes and Richard Wright (who later resigned from and lamented his support for the Party).  Following the CPUSA line, they supported movements such as the American Peace Mobilization and promoted “progressive” causes of various hues.  (Though self-consciously communists, they invariably insisted on using the term “progressive” to define both themselves and their “social justice” objectives.)  Davis also worked with prominent “progressive” black leaders in Chicago, including Robert Taylor and Vernon Jarrett (one the maternal grandfather, the other the father-in-law of Valerie Jarrett, widely considered President Obama’s closest friend and adviser).  And he joined Harry Canter (subsequent to his years in Moscow) and his son, David, working with newspapers to advance workers’ unions; in due time the younger Canter mentored David Axelrod, who became Barack Obama’s political guru.  

In Chicago, “Frank Marshall Davis was increasingly involved in events sponsored or covertly organized by the communist left” (p. 88), teaching a History of Jazz for the Abraham Lincoln School (widely labeled the “little Red school house”) and joining assorted communist fronts.  Fortuitously, he was able to freely “uncork his opinions” in the pages of “a full-blown pro-CPUSA newspaper of his own lead and editorship:  the Chicago Star” (p. 104).  He especially vilified anti-Communist statesmen such as Winston Churchill and Harry Truman, closely following the “Party line, not questioning Stalin” (p. 103).  Recruited to write for the Star were communist luminaries such as Howard Fast, the Hollywood writer who was awarded the Stalin Prize in 1953.  Florida’s leftist Senator Claude “Red” Pepper also graced the paper’s pages, promoting his favorite cause:  socialized medicine.  Pepper’s chief-of-staff, Charles Kramer, whom we now know was a Soviet agent, both “handed over important information to the USSR” and wrote a bill to create a National Health Program (p. 121).  And Lee Pressman, another Soviet agent and “close colleague” of both Kramer and Alger Hiss, added his weight to the Star’s roster of writers.  These writers, of course, “co-opted the ‘progressive’ label, claiming to be merry liberals” simply devoted to fulfilling the American dream of “social justice” (p. 125). 

Then Davis abruptly left his beloved Chicago in 1948, moving to Hawaii, where he wrote a weekly column for the Honolulu Record, a Communist paper, and worked closely with Harry Bridges, the “progressive” leader of the International Longshoremen’s and Warehousemen’s Union (ILWU).  Though oft-celebrated by the likes of Nancy Pelosi, the ILWU was “one of the most communist-penetrated and -controlled unions of the time” (p. 145).  While Davis claimed to receive no money from the Record, there is every reason to believe he was generously subsidized by the CPUSA in Hawaii, where it was hoped a “mass revolutionary movement” would establish “a satellite in the Soviet orbit” (p. 150).  With his pen Davis was strategically placed to assume a pivotal role in Stalin’s strategy in the Pacific.  So he consistently attacked President Truman, the Marshall Plan, and America’s military excursions in Asia (Korea, Vietnam).  A recently opened 600-page FBI file on Davis reveals that he also took numerous telephoto pictures of Hawaii’s shoreline.  Consequently, he was listed as a security threat on the government’s Security Index, joining a select group of people deemed highly dangerous to the nation.  

Little actually came of the CPUSA plan for Hawaii, and an aging Frank Davis slipped into the obscurity of retirement.  Yet though he accomplished little as a journalist, he left a larger imprint on the world through his acquaintance with Stanley Dunham, Barack Obama’s grandfather, with whom he enjoyed drinking and playing poker.  As is clear in Dreams from My Father, wherein “Frank” frequently appears, young Barack Obama (desperate for male guidance) easily slipped within Davis’s sphere of influence as he sought to define himself.  “‘Away from my mother, away from my grandparents, I was engaged in a fitful struggle.  I was trying to raise myself to be a black man in America, and beyond the given of my appearance, no one seemed to know exactly what that means’” (p. 233).  But Frank Davis provided some clues—and a reading list of radicals such as Frantz Fanon.  Consequently, Kengor says:  “Frank remained a thread in the life and mind of Obama” (p. 237).  Thus, when he arrived in California as a college student at Occidental, he was considered a “fellow believer” by one of his then-Marxist friends, John Drew, who in a recent interview with Kengor recalled:  “‘Obama was definitely a Marxist and that it was very unusual for a sophomore at Occidental to be as radical or as ideologically attuned as young Barack Obama was’” (p. 251).  

In sum:  “The people who influence our presidents matter” (p. 298).  To understand President Obama we need to weigh the role of his “mentor,” Frank Marshall Davis, in his formation.  The Communist thus provides essential information in evaluating him.