163 Churchill & Reagan Speak

Few historians today question the significance of Winston Churchill and Ronald Reagan.  Both men led their nations through challenging times, maintained a singular commitment to their core values, and helped shape the contours of the 20th century.  Both were gifted orators, and their recorded speeches and archived papers increasingly reveal the quality of their thought.  Furthermore, both illustrate how statesmen rightly respond to crises and conflicts such as America’s current war with Islamic terrorists.

Winston Churchill’s grandson, Winston S. Churchill, has collected and published Never Give In!  The Best of Winston Churchill’s Speeches (New York: Hyperion, c. 2003), a 500-page treasury that documents his views from 1897 to 1963.  In his Preface, the editor expresses admiration for his grandfather and indicates his rationale for publishing the collection–a small portion of the five million words in Churchill’s complete speeches.  Amazingly, Churchill never used a speech writer!  His words are truly his words!  And he worked hard to craft them well.  One of his wartime secretaries, Sir John Colville, told the editor: “In the case of his great wartime speeches, delivered in the House of Commons or broadcast to the nation, your grandfather would invest approximately one hour of preparation for every minute of delivery” (p. xxv).  The speeches are presented in chronological order and divided into five periods, though several themes characterized Churchill throughout: 1) the virtue and necessity of courage, both political and military; 2) opposition to Socialism, both Bolshevism in Russia and the softer version of Britain’s Labor Party; 3) adamant opposition to Nazism, demanding armed response to Hitler’s aggression; 4) the goodness of the “property-owning democracy” that explained England’s greatness; 5) the correctness of Conservatism, as he defined it, upholding the grandeur of the Christian tradition and of Western Civilization in general.

The first section, entitled “Young Statesman, 1897-1915,” introduces the reader to a young politician finding himself and his political principles.  Churchill launched his political career in 1900 at the age of 25, and would serve in Parliament (with one brief absence) until 1964.  Elected as a Conservative in 1900, he broke with his party in 1904, “crossed the floor” and joined the Liberals, primarily because of his commitment to free trade.  Subsequently he rose rapidly through the ranks of the British government, becoming First Lord of the Admiralty in 1911, just as the early tremors of WWI rippled across Europe.  When the war began he declared, with words he would repeat 25 years later: “We did not enter upon the war with the hope of easy victory; we did not enter upon it in any desire to extend our territory, or to advance and increase our position in the world” (p. 59).  Unlike many, he believed: “The war will be long and somber” (p. 59), and it would prove difficult for Churchill himself, for he was forced to resign his cabinet position when his plan to attack Germany from the east, through the Dardanelles, misfired.

This led to the second period of his career, “Oblivion and Redemption, 1916-29.”  Following his Dardanelles disgrace, Churchill left the House of Commons and served as a soldier on the front lines in Flanders.  But he returned to the House when David Lloyd George asked him to serve in the cabinet, and in 1924 he would serve as Chancellor of the Exchequer under the newly elected Conservative Prime Minister.  When the Bolsheviks seized control of Russia in 1917, Churchill immediately declared: “Tyranny presents itself in many forms.  The British nation is the foe of tyranny in every form.  That is why we fought Kaiserism and that is why we would fight it again.  That is why we are opposing Bolshevism.  Of all tyrannies in history the Bolshevist tyranny is the worst, the most destructive, and the most degrading.  It is sheer humbug to pretend that it is not far worse than German militarism” (p. 77).  By 1921, he recognized that:  “The lesson from Russia, writ in glaring letters, is the utter failure of this Socialistic and Communistic theory, and the ruin which it brings to those subjected to its cruel yoke” (p. 81).  Churchill also opposed those in Britain’s Labor Party who wanted to install Socialism, asserting that they were “corrupting and perverting great masses of our fellow-countrymen with their absurd foreign-imported doctrines” (p. 89).  “They borrow all their ideas from Russia and Germany,” he said.  “They always sit adulating every foreign rascal and assassin who springs up for the moment.  All their economics are taken from Karl Marx and all their politics from the actions of Lenin” (p. 89).

From 1930-1939 Churchill endured his “Wilderness Years,” lonely and ridiculed as he opposed Hitler and those who appeased him.  England’s “difficulties,” he said, “come from the mood of unwarrantable self-abasement into which we have been cast by a powerful section of our own intellectuals” (p. 104).  Politicians joined them in offering “a vague internationalism, a squalid materialism, and the promise of impossible Utopias” (p. 104).  Clergymen were particularly reprehensible insofar as they sought “to dissuade the youth of this country from joining its defensive forces, and seek to impede and discourage the military preparations which the state of the world forces upon us” (p. 155).  While pacifists talked peace Hitler armed for war.  In the midst of WWII, Churchill remembered:  “For the best part of twenty years the youth of Britain and America have been taught that war is evil, which is true, and that it would never come again, which has been proved false.”  During that time dictators armed their regimes. “We have performed the duties and tasks of peace.  They have plotted and planned for war” (p. 318).  They illustrated the fact that “The whole history of the world is summed up in the fact that when nations are strong they are not always just, and when they wish to be just they are often no longer strong” (pp. 132-133).  To be both strong and just was Churchill’s goal.  That required military strength and the willingness to use it to prevent Hitler’s ambitions.  Rifles and battleships, not rhetoric and resolutions, could deter war.

“For five years,” he said in 1938, “I have talked to the House on these matters–not with very great success.  I have watched this famous island descending incontinently, fecklessly, the stairway which leads to a dark gulf.  It is a fine broad stairway at the beginning, but after a bit the carpet ends.  A little farther on there are only flagstones, and a little farther on still these break beneath your feet” (p. 166).  To Churchill, Prime Minister Chamberlain’s 1938 pact with Hitler in Munich was a “total and unmitigated defeat” (p. 172).  Churchill feared that it would prove to be “only the first sip, the first foretaste of a bitter cup which will be proffered to us year by year unless, by a supreme recovery of moral health and martial vigour, we arise again and take our stand for freedom as in the olden time” (p. 182).

Few heeded Churchill’s words until Hitler actually moved, invading Poland in 1939 and attacking France the following spring.  Then Churchill was called upon to lead his nation through the throes of WWII.  These were, his grandson says, “The Glory Years, 1939-45.”  The war was not simply against Germany, he insisted:  “We are fighting to save the whole world from the pestilence of Nazi tyranny and in defense of all that is most sacred to man” (p. 198).  It was a war to restore “the rights of the individual, and it is a war to establish and revive the stature of man” (p. 198).  It was a war of words–and Churchill empowered his people with words.  His memorable speeches during these years, offering nothing “but blood, toil, tears and sweat,” elicited courageous resolve.  His policy, he said in 1940, was “to wage war, by sea, land and air, with all our might and with all the strength that God can give us; to wage war against a monstrous tyranny, never surpassed in the dark, lamentable catalogue of human crime” (p. 206).  And there was only one goal: victory!  Still more, addressing his alma mater, Harrow School, in 1941, he said: “surely from this period of ten months this is the lesson: never give in, never give in, never, never, never, never–in nothing great or small, large or petty–never give in except to convictions of honour and good sense.  Never yield to force; never yield to the apparently overwhelming might of the enemy” (p. 307).

As the Battle of Britain began, Churchill declared:  “Hitler knows that he will have to break us in this Island or lose the war.  If we can stand up to him, all Europe may be free” and the world saved.  “But if we fail, then the whole world, including the United States . . . will sink into the abyss of a new Dark Age . . .  Let us therefore brace ourselves to our duties, and so bear ourselves that, if the British Empire and its Commonwealth last for a thousand years, men will still say, ‘This was their finest hour'” (p. 229).  And, indeed, it was.  The RAF defeated the Luftwaffe in the skies over England, and “Never in the field of human conflict was so much owed by so many to so few” (p. 245).  However dark the prospects, Churchill ever insisted that “these are great days” (p. 308) and that the courageous would prevail.

In due time, with the help of the United States and the Soviet Union, the war was won.  Churchill urged his colleagues “to offer thanks to Almighty God, to the Great Power which seems to shape and design the fortunes of nations and the destiny of man” (p. 390).  Addressing the nation on 8 May 1945, the day the war in Europe ended, he said “that in the long years to come not only will the people of this island but of the world, whenever the bird of freedom chirps in human hearts, look back to what we’ve done and they will say ‘do not despair, do not yield to violence and tyranny, march straight forward and die if need be–unconquered'” (p. 391).

He fully intended to finish the war against Japan, but England’s Socialists (the Labor Party) turned against him as soon as the war in Europe ceased.  They demanded a general election in July 1945, and Churchill found himself battling to maintain his position as Prime Minister.  Feeling betrayed, he strongly denounced Socialism as “abhorrent to the British ideas of freedom.”  Though Labor Party leaders portrayed their positions as indigenously English, “there can be no doubt that Socialism is inseparably interwoven with Totalitarianism and the abject worship of the State.  It is not alone that property, in all its forms is struck at, but that liberty, in all its forms, is challenged by the fundamental conceptions of Socialism” (p. 396).  Desiring to control every aspect of life, Socialists sought to establish “one State to which all are to be obedient in every act of their lives.  This State is to be the arch-employer, the arch-planner, the arch-administrator and ruler, and the arch-caucus-boss” (p. 397).  But the English voters apparently wanted such a system–as well as to escape the burdens of war–and Churchill’s Conservative Party lost the election.

The next era, “The Sunset Years, 1945-63,” witnessed Churchill leading the opposition to the Labor Party of Clement Attlee, speaking out against Russia’s aggression, returning to power in 1951, and retiring in 1955, soon after his 80th birthday.  He still spoke prophetically, especially at Westminster College in Fulton, Missouri, in 1946, where, in the presence of President Harry Truman, he warned that we must ever oppose “war and tyranny.”  To do so “we must never cease to proclaim in fearless tones the great principles of freedom and the rights of man which are the joint inheritance of the English-speaking world and which through Magna Carta, the Bill of Rights, the Habeas Corpus, trial by jury, and the English common law find their most famous expression in the American Declaration of Independence” (p. 417).  Such goods were endangered, however, because “From Stettin in the Baltic to Trieste in the Adriatic, an iron curtain has descended across the Continent” (p. 420).  This was not the “Liberated Europe we fought to build up” (p. 421).  Soviet aggression threatened the world’s peace, and Stalin was as much a threat in the late ’40s as Hitler had been in the late ’30s.  Thus began the “cold war.”

Elected Prime Minister again in 1951, he sought to reverse the corrosive Socialist policies established by the Labor Government.  He stood for “a property-owning democracy” and detested “the philosophy of failure, the creed of ignorance and the gospel of envy” basic to Socialism.  High taxes and petty regulations had betrayed the traditions of the nation, he believed.  “In our view the strong should help the weak.  In the Socialist view the strong should be kept down to the level of the weak in order to have equal shares for all.  How small the share is does not matter so much, in their opinion, so long as it is equal” (p. 457).  “Socialists pretend they give the lower income groups, and all others in the country, cheaper food through their system of rationing and food subsidies.  To do it they have to take the money from their pockets first and circulate it back to them after heavy charges for great numbers of officials administering the system of rationing” (p. 460).

Few single volumes better illustrate the great issues of the 20th century, for no one I can think of stood for so long at the center of world events.  Churchill’s wisdom and courage, anchored to the realities of the political world, still command respect and (far better than many theoretical treatises) provide direction for making decisions today.

* * * * * * * * * * * * * * * * * *

Three Ronald Reagan scholars–Kiron K. Skinner, Annelise Anderson, and Martin Anderson–have edited Reagan In His Own Hand: The Writings of Ronald Reagan That Reveal His Revolutionary Vision for America (New York: The Free Press, c. 2001).  During the late 1970s, Reagan broadcast some 1,000 weekly radio talks.  His wife, Nancy, says he wrote these messages at his desk at home, rooting his ideas in “voracious” reading.  He rarely watched TV but devoted himself to reading, thinking, and speaking.  Though he often appeared to speak spontaneously, in fact he carefully prepared and followed his written texts.  In his Foreword, George Shultz remembers his close association with Reagan: “I was always struck by his ability to work an issue in his mind and to find its essence, and by his depth of conviction. . . . [and] his intense interest and fondness for the spoken word, for caring very deeply about how to convey his thoughts and ideas to people” (p. ix).

The editors have collected Reagan’s radio talks into four categories, meticulously retaining the spelling and revisions in his handwritten texts.  In the first section, “Reagan’s Philosophy,” the reader discovers the bedrock principles that guided him.  Looking back, in 1989, he noted: “We meant to change a nation, and instead, we changed a world” (p. 4).  This resulted, in part, from the lessons he learned from WWII–lessons Churchill tried unsuccessfully to teach his countrymen in the 1930s.  Military weakness encourages aggression.  One month before Pearl Harbor, Reagan noted, the U.S. “Congress came within a single vote of abolishing the draft & sending the bulk of our army home” (p. 8).  The Japanese doubted America’s resolve and thus dared attack her.  We learned, Reagan said, citing an academic study, “that ‘to abdicate power is to abdicate the right to maintain peace'” (p. 8).  This extended to opposing the USSR and Communism in the Post-WWII era.  To those who argued “better red than dead,” he replied:  “Back in the ’20s, Will Rogers had an answer for those who believed that strength invited war.  He said, ‘I’ve never seen anyone insult Jack Dempsey [then the heavyweight boxing champion]'” (p. 480).

Reagan also believed, in accord with Churchill, that the subtly socialistic views of American liberals threatened disaster for the nation.  America’s strength resided in her confidence in “the individual genius of man” (p. 13).  Liberating the individual from government was a major Reagan theme, one he tirelessly repeated.  Summing up his position, he said:  “The choice we face between continuing the policies of the last 40 yrs. that have led to bigger & bigger govt, less & less liberty, redistribution of earnings through confiscatory taxation or trying to get back on the original course set for us by the Founding Fathers.  Will we choose fiscal responsibility, limited govt, and freedom of choice for all our people?  Or will we let an irresponsible Congress set us on the road our English cousins have already taken?  The road to ec. ruin and state control of our very lives?” (p. 10).

Thus he applauded the stance of Britain’s Margaret Thatcher.  Having visited her before she became Prime Minister in 1979, he predicted she would “do some moving & shaking of Englands once proud industrial capacity UNDER WHICH THE LABOR PARTY has been running downhill for a long time.  Productivity levels in some industrial fields are lower than they were 40 yrs. ago.  Output per man hour in many trades is only a third of what it was in the 1930’s.  Bricklayers for example laid 1000 bricks a day in 1937–today they lay 300.  I think ‘Maggie’–bless her soul, will do something about that” (p. 47).  Indeed she did!  And she and Reagan became the staunchest of allies once he became President in 1981.

A commitment to freedom stood rooted in a belief in God, who had providentially guided this nation.  Reagan shared and often repeated Thomas Jefferson’s view that: “‘The God who gave us life gave us liberty–can the liberties of a Nat. be secure when we have removed a conviction that these liberties are the gift of God'” (p. 14). Still more:  America has a mission to spread the blessings of freedom.  Indeed, as Pope Pius XII said soon after the end of WWII:   “‘America has a genius for great and unselfish deeds.  Into the hands of Am. God has placed the destiny of an afflicted mankind.’  I don’t think God has given us a job we cant handle” (p. 16).

Such convictions shaped Reagan’s “Foreign Policy,” the second section in the book, summed up by these words: “We want to avoid a war and that is better achieved by being so strong that a potential enemy is not tempted to go adventuring” (p. 21).  Since these radio talks were given while Jimmy Carter was President, there was much for Reagan to criticize.  He discussed, insightfully, developments in Cambodia, Vietnam, Taiwan, Korea, Chile, Panama, Palestine, the USSR and Cuba (rebuking the tyrant Castro as a “liar”).  By contrast, Senator George McGovern, visiting Castro, “found the Cuban dictator to be a charming, friendly WELL INFORMED fellow.  It sort of reminds you of how we discovered Joseph Stalin was good old Uncle Joe, shortly before he stole among other things our nuclear secrets” (p. 183).  Everywhere, he argued, the U.S. should support freedom, especially regarding religious expression and private property rights, and he endorsed “Somerset Maughams admonition: ‘If a nation values anything more than freedom, it will lose it’s freedom; and the irony of it is, that if it’s comfort of money THAT it values more, it will lose that too'” (p. 85).

Part Three of the book, “Domestic and Economic Policy,” delineates what came to be called “Reaganism” in the ’80s.  Personal freedom, limited government, and minimal taxation anchored Reagan’s positions.  To reduce the rightful role of government to a brief sentence, he said: “Govt. exists to protect us from each other” (p. 288).   He favorably cited the words of an English editorialist:  “‘What the world needs now is more Americans.  The U.S. is the 1st nation on earth deliberately dedicated to letting people choose what they want & giving them a chance to get it.  For all it’s terrible faults, in one sense Am. still is the last, best hope of mankind, because it spells out so vividly the kind of happiness which most people actually want, regardless of what they are told they ought to want'” (p. 227).

Reading Reagan, in his own hand, reveals a thoughtful man cruelly maligned by his critics as an ignorant actor.  He routinely refers to the books and articles he was reading, carefully credits quotations, and blends (with that justly renowned Reaganesque touch) human interest stories into his talks.

162 Global Warming Forest

                My father worked as a meteorologist for the United States Weather Bureau.  He occasionally joked that it helped, now and then, when compiling a weather report, to look out the window rather than stay buried in the papers cranked out by various machines.  Somewhat the same goes for the current concern for global warming.  That the globe is dramatically warming is an article of faith for most environmentalists and many politicians.  But scores of those who best understand what’s actually happening–looking at the evidence rather than computer projections–urge us to disregard TV snippets or Greenpeace press releases and study the facts.  So argues Patrick J. Michaels, research professor of environmental sciences at the University of Virginia and past president of the American Association of State Climatologists, in Meltdown:  The Predictable Distortion of Global Warming by Scientists, Politicians, and the Media (Washington, D.C.:  Cato Institute, c. 2004).  We can either rely on computer projections or factual observations, simulated scenarios or substantiated facts. 

            Truly the globe has warmed slightly during the past few decades.  But it has, in the more distant past, been considerably warmer, and there is no reason to think the current warming is caused by human activity.  Such warming will change some things, but the changes will be modest–nothing remotely like that described by alarmists who control the major media.  For example:  an article in Nature magazine  (one of the most prestigious scientific journals) recently predicted that a single degree (C) increase would destroy 15 percent of all species on earth.  But the earth warmed by more than a degree (C) a century ago and the planet’s species fared quite nicely!  Summarizing his position, Michaels insists:  “Global warming is real, and human beings have something to do with it.  We don’t have everything to do with it; but we can’t stop it, and we couldn’t even slow it down enough to measure our efforts if we tried” (p. 9). 

            NASA’s James Hansen, whose 1988 congressional testimony launched public concerns for global warming, recently (2001) noted that we can now more accurately assess the threat, and in the next 50 years the earth will probably warm up by less than one degree (C).  Hansen’s projection “is about four times less than the lurid top figure widely trumpeted by the United Nations in its 2001 compendium on climate change and repeated ad infinitum in the press” (p. 20).  In part this is because only one-third of the U.N.’s Intergovernmental Panel on Climate Change (IPCC) are climate scientists.  Even worse, its publications are not peer reviewed.  The IPCC’s influential 1996 Assessment relied on ground-measured temperatures, but utterly ignored the highly significant satellite data (which indicate absolutely no global warming of the atmosphere!).  Many ground temperature measurements are taken in areas that have experienced dramatic urban sprawl during the past century.  Thus, for example, Washington D.C. is significantly warmer than it was 50 years ago, but a measuring station in rural Virginia shows no increase at all.  It’s quite possible that much “warming” is simply the warming of areas adjacent to large cities, whose artificial environment (heat absorbing asphalt and heat producing factories and homes) falsifies the picture.  It’s even possible that the surface-warming trend will be reversed within a few decades.  

            The public knows little of this truth because the press is uncritical (or scientifically illiterate) and many scientists are locked into a funding network that discourages dissent.  Consider, for example, “the truth about icecaps.”  In 2001 a Washington Post headline screamed:  “The End is Near.”  Rising water would soon flood seaside cabins on the Chesapeake Bay, the story declared.  Senator Joseph Lieberman, in that year, repeated the alarmist mantra that melting polar icecaps would raise sea levels by “35 feet, submerging millions of homes under our present-day oceans” (p. 33).  If he wasn’t reading the Post, he could easily have taken this scenario from a similar article in the New York Times based upon “the observations of two passengers on a Russian cruise ship” that sailed through the Arctic Ocean.  The passengers took pictures of the ice-free water and wondered if “‘anybody in history ever got to 90 degrees north to be greeted by water, not ice'” (p. 43).  Though a bit of fact checking by the Times would have demonstrated the normality of this, headlines favor the abnormal, and the story fueled the political agenda favored and financed by environmentalists.  But the truth is, as Michaels shows–with numerous graphs and scholarly citations–Greenland’s icecap is growing and there’s no cause for alarm.  And there’s especially no cause for alarm regarding the higher ocean levels predicted by Senator Lieberman!  “In fact, the North Polar icecap is a floating mass, and melting that will have absolutely no effect on sea level; a glass of ice water does not rise when the cubes have melted.  With regard to that other polar ice–Antarctica–most climate models predict little or no change” (p. 203).

            Michaels applies the same scrutiny to allegations of species extinction.  Alarmist studies of butterflies (one of which was the foundation for the Kyoto Accord so sacred to politicians like Al Gore), toads, penguins, and polar bears simply do not survive careful scrutiny.  Though earth’s surface temperatures have slightly increased, there is no evidence that such warming has led to species’ extinctions.  So too hurricanes and tornadoes, droughts and floods, disease and death, though often attributed to global warming by impulsive journalists such as Dan Rather, simply cannot have been caused by it.  A case study for alarmism is the island of Tuvalu.  For years environmentalists like Lester Brown and local officials of this tiny Pacific island nation have been issuing warnings, asking for “environmental refugee status in New Zealand” for its 11,000 people (p. 203).  Tuvalu’s prime minister declared (a decade ago) that “the greenhouse effect and sea-level rise threaten the very heart of our existence” (p. 204).  London’s Guardian, just in time for an important United Nations conference, certified such fears.  “In fact,” Michaels says, the “sea level in Tuvalu has been falling–and precipitously so–for decades” (p. 204).  Not to be bothered by the facts, however, the Washington Post irresponsibly spread the word that melting ice caps in polar regions would soon engulf the island nation!

            Perhaps more distressing than Michaels’s factual presentation is his critique of the scientific community responsible for promoting the myth of impending doom.  Just in time for the 2000 election, the U.S. National Assessment of the Potential Consequences of Climate Variability and Change was published.  Guided to publication by President Clinton’s Assistant for Science and Technology, John Gibbons, the report was carefully wrapped with all the ribbons of solid science.  “Gibbons was a popular speaker on the university circuit, lecturing on the evils of rapid population growth, resource depletion, environmental degradation and, of course, global warming.  His visual aids included outdated population and resource projections from Paul Ehrlich in which ‘affluence’ was presented as the cause of environmental degradation, a notion that has been discredited for decades” (p. 207).  Equally dated were his data on climate change!

            Gibbons guided the various bureaucratic committees that led to the publication of the influential National Assessment.  These committees, “larded with political appointees,” were designed to deliver a document satisfactory to Vice President Al Gore, gearing up for his presidential campaign.  “The resultant document was so subject to political pressure that it broke the cardinal ethic of science:  that hypotheses must be consistent with facts” (p. 208).  The National Assessment embraced the most extreme computer projections regarding global warming–one of which would have erred by 300 percent if applied to the past century!  It ignored the fact that most of the past century’s warming took place in the U.S. before there was any significant accumulation of greenhouse gasses in the atmosphere.  It even endorsed a Canadian study that predicted temperatures in the Southeast would soar to 120 degrees (F) by the year 2100–a totally ludicrous notion that could only occur if the Gulf of Mexico (and its moderating influence) evaporated!

            Michaels’s final chapter, “The Predictable Distortion of Global Warming,” alerts us to the insidious role played by popular theoretical paradigms and the lure of federal funding in shaping contemporary science.  Today’s climatological paradigm reigns in powerful centers and encourages alarmist studies.  It’s the paradigm underlying various laws, for legislators quickly trumpet what’s taken to be conventional wisdom and public concern.  It’s also responsible for the fact that scientific journals rarely consider the possibility that “global warming is exaggerated” (p. 228).  Add dollars to the equation–a grand total of $20 billion granted scientists since 1990 to “research” global warming–and you begin to understand why it’s promoted!  The most prestigious journals–Science, Scientific American, Nature–simply will not tell the “obvious truth” that only minimal global warming is at all possible during the century to come!  Sad to say, money talks in the allegedly “objective” scientific community as surely as it does in politics!

            What Michaels hopes, writing this book, is that a small but courageous coterie of scientists and journalists will begin to challenge the dominant paradigm.  And certainly this book takes a step in that direction.

* * * * * * * * * * * * * * * * * *

            For a fictional version of Meltdown, pick up a copy of Michael Crichton’s recent thriller, State of Fear (New York:  HarperCollins, 2004).  I’ve never before read a novel that has scores of graphs, footnotes to scholarly articles, and a 10-page annotated bibliography at the end!  But they’re here, and it’s obvious he wanted to write more than a popular novel–something he’s certainly done many times before!  The novel pits a handful of dedicated, scientifically informed heroes, struggling to save the earth, against the machinations of fanatical environmentalists–nominally led by a Hollywood actor but manipulated by professionals more concerned with money and power than with environmental integrity.  The environmentalists plan to trigger various global disasters and attribute them to global warming, coldly indifferent to their catastrophic results.  There’s riveting action and snatches of romance.  The plot’s suspenseful, and the pages turn quickly as one sinks into the story.  And along with the dialogue and adventure, there’s the message!

            So read the story and enjoy it.  Then think about the book’s message, which was summed up by Crichton in a speech he gave in San Francisco in 2004 wherein he decried “the disinformation age” that results from “a secular society in which many people–the best people, the most enlightened people–do not believe in any religion” and embrace environmentalism.  They cite their scriptures (environmental classics by Aldo Leopold, Rachel Carson and Paul Ehrlich), recite their creeds (mantras regarding the plight of the planet and the evils of capitalism), join their cults (Sierra Club, Greenpeace, Earth First!), and denounce any challenges to their faith, especially including questions concerning global warming.   But they are–like the Hollywood character in State of Fear–misinformed at best and Machiavellian at worst. 

            In the “author’s message” at the end of the book, Crichton says he spent three years reading environmental texts before writing the novel.  What astonished him was how little we actually know about the state of the world.  Some things are obvious:  carbon dioxide in the atmosphere and surface temperatures have both increased.  But, “Nobody knows how much of the present warming trend might be man-made” (p. 569).  The computer models generally cited in global warming scenarios vary enormously, and the best estimates suggest temperatures will rise only about one degree centigrade over the next 100 years.  He believes things will be much better for earth’s inhabitants in 2100, and he thinks “that most environmental ‘principles’ (such as sustainable development . . . ) have the effect of preserving the economic advantages of the West and thus constitute modern imperialism towards the developing world” (p. 571).

            Environmental activists–Sierra Club and Environmental Defense League types–generally promote the antiquated scientific views of their youth.  Dramatic breakthroughs, such as nonlinear dynamics, chaos theory, and catastrophe theory, have fundamentally changed science without penetrating “the thinking of environmental activists” (p. 571).  He finds the ideas of “wilderness advocates” routinely spurious, declaring them no better than those propounded by “developers and strip miners” (p. 572).  What’s desperately needed is “more people working in the field . . . and fewer people behind computer screens” (p. 572).  And we need honest, independent scientists whose research isn’t funded by special interests, especially environmental organizations and bureaucracies such as the EPA!

            Read in conjunction with Meltdown, Crichton’s novel effectively quiets (or at least permits questioning) some of the fears fanned by fanatical environmentalists.

* * * * * * * * * * * * * * * * * * * *

            In the summer of 2002, the Hayman Fire, the largest in Colorado history, started about 10 miles west of my summer home in the mountains.  Subsequently it crept five miles closer.  We were evacuated from our place for two weeks, and suddenly “forest fire” took on a whole different meaning!  The Hayman blaze was started by a Forest Service employee (now in prison) who apparently wanted to gain fame by first reporting and then extinguishing it!  The fire burned so voraciously, many analysts believed, because of forest service policies that make such fires inevitable.  Thus I was drawn to read Robert H. Nelson’s A Burning Issue:  A Case for Abolishing the U.S. Forest Service (Boulder:  Rowman & Littlefield Publishers, Inc., c. 2000).  Nelson worked in the Department of the Interior for 15 years and knows the way Washington works!  He is now professor of environmental policy in the School of Public Affairs at the University of Maryland and earlier wrote Public Lands and Private Rights:  The Failure of Scientific Management.

            The Forest Service has evolved in accord with the various political agendas that shaped it.  During the first half of the 20th century, it mainly worked with timber companies to extract lumber from the nation’s forests.  This fit the philosophy of Gifford Pinchot and the Progressives who constructed the “administrative state” (p. 2).  The service also worked to suppress fires, to save the trees for harvesting.  Lumber companies cut roads and cleared sections of the forest, helping to limit the expanse of fires that erupted.  Cutting trees for lumber saved trees from burning.  More recently, especially following the Wilderness Act of 1964, recreation has assumed a major role in shaping forest policy, and “preserving wilderness areas” has been promoted.  There are now some 100 million acres reserved as national wilderness, and various kinds of preserved lands have expanded “from 51 million acres in 1964 to 271 million acres in 1993” (p. 9).  To keep increasingly large areas “untrammeled by man” has become the objective of powerful interests, and there have, consequently, been “sharp declines in timber harvesting, mining, and other traditional uses of the national forests.  The Clinton administration has actively sought to instill this ethos as the new core value defining the institutional culture of the Forest Service” (p. xiv).  Millions of acres, unless mechanically harvested, will vanish when “catastrophic fires” ignite them. 

In the midst of these policy shifts, the Forest Service steadfastly fought fires and allowed “the buildup of brush and dense thickets of smaller trees in many forests” that became powder kegs awaiting a spark to explode (p. 6).  State and private forests have not suffered similarly, for they “have in general been more intensively managed, involving higher timber harvest levels per acre and greater application of labor and capital for thinning, disease control, reforestation, and other purposes.  Yet, contrary to a common public impression, the more intensively managed state and private forests ‘appear to be healthier than [the] unmanaged forests,’ mostly in the national forest system” (p. 19). 

            The national forests, however, were increasingly “preserved.”  By 1996, the Sierra Club had moved to a radical position, pushing for a ban on all timber harvesting in national forests, even if it removed excess fuels in an effort to reduce forest fires!  Thus the radical reduction in timber harvest results not from a wood shortage but from “changing environmental values and shifting government policies” dictated by fervent environmentalists (p. 58).  These values, Nelson argues, are primarily religious.  “Environmentalism now claims in effect a unique access to the mind of God, reflecting the common substitution in the modern age of scientific truth for the earlier role of Judeo-Christian religion” (p. 131).  President Clinton’s Interior Secretary, Bruce Babbitt, said “‘we need not sacrifice the integrity of God’s creation on the altar of commercial timber production’” (p. 67).  He follows the lead of secular prophets, a long list headed by Gifford Pinchot, who envisioned conservation as a means of realizing “‘the Kingdom of God on earth’” (p. 69).  Today’s faithful tend to see preservation as a means of regaining the Garden of Eden!  Wilderness areas are frequently referred to as “‘cathedrals,’ ‘sacred places,’ and other religious terms” (p. 73).  Thus the Wilderness Society motto is a Thoreau declaration:  “in Wildness is the preservation of the world.”  Consequently, Thomas Bonnicksen, a respected professor at Texas A&M University, perceives that “‘zealots within the agencies, encouraged by some preservationist groups and ideologues in universities, have taken over our National Park and Wilderness areas and converted them into their own quasi-religious temples.’  They have renounced the ‘original purpose of providing for “the enjoyment of the people”’ and instead are now aiming to ‘satisfy the spiritual needs of a small but influential subculture’” (p. 129).

            To deliver the nation’s forests from this influential subculture, Nelson suggests, we should abolish the U.S. Forest Service and decentralize its functions.  To allow states and smaller communities to decide what to do with forested lands would lead, he thinks, to better management, restored lumber harvests, healthier trees, and less destructive fires.  It would also diminish the influence of powerful environmental groups, with their Washington, D.C. headquarters, that lobby politicians and finance “scientific studies” to sustain their power.

161 War Against the Weak

Terri Schiavo’s recent death illustrates the continuation of a process detailed in Edwin Black’s War Against the Weak:  Eugenics and America’s Campaign to Create a Master Race (New York:  Four Walls Eight Windows, c. 2003).  The author, assisted by some 50 researchers combing various archives, links eugenics enthusiasts in the United States a century ago (who were primarily concerned with sterilizing the “unfit” and breeding a better species) with the Nazis who vigorously implemented their ideas a generation later.  “National Socialism,” Black says, “transduced America’s quest for a ‘superior Nordic race’ into Hitler’s drive for an ‘Aryan master race.’  The Nazis were fond of saying ‘National Socialism is nothing but applied biology,’ and in 1934 the Richmond Times-Dispatch quoted a prominent American eugenicist as saying, ‘The Germans are beating us at our own game’” (pp. xvi-xvii).

To accomplish this, eugenicists in both Germany and America had to defy and destroy a deeply-engrained principle in Western Civilization:  the sanctity of life.  As President George W. Bush recently said, the day Terri Schiavo died in Florida, a good civilization is distinguished by its care for its weakest, most vulnerable persons.  But challenging that position is a worldview rooted in Darwinian biology that insists a species evolves (and thus improves) as its fittest individuals survive.  Charles Darwin’s cousin, Francis Galton, wrote Hereditary Genius within a decade of the 1859 publication of On the Origin of Species and coined the word “eugenics” two decades later.  Galton is widely considered the “father” of that alleged scientific discipline.

Eugenic ideas quickly gained a favorable hearing in the United States.  In 1865, the year Galton first published his hereditarian speculations, “the utopian Oneida Community in upstate New York [best known for its “free love” experiments under the guidance of John Humphrey Noyes] declared in its newspaper that, ‘Human breeding should be one of the foremost questions of the age. . . .’  A few years later, with freshly expounded Galtonian notions crossing the Atlantic, the Oneida commune began its first selective human breeding experiment with fifty-three female and thirty-eight male volunteers” (p. 21).  Though Galton himself disavowed such breeding endeavors, a small cadre of Americans envisioned infinite progress through racial purification.  One of the trustees of the American Museum of Natural History, Madison Grant, the author of The Passing of the Great Race, made the goal clear, “writing that Nordics ‘were the white man par excellence’” (p. 29).  Grant opposed all interracial unions, asserting that the inferior race degraded the “superior” race.  Thus the children of an Iroquois and a German were Iroquois.  Somehow the blood of “inferior” races dominates the genetic development of mixed-blood offspring!  To some eugenicists, one drop of a “mongrel’s” blood makes one a mongrel!

Genetic research in America thrived in large part because wealthy individuals (Mary Harriman) and foundations (Rockefeller, Carnegie) supported it.  In 1904 the Carnegie Institution’s Station for Experimental Evolution opened at Cold Spring Harbor, on Long Island, NY.  The station’s director, Charles Davenport, thenceforth played a crucial role, for nearly 40 years, as the eugenicists’ prime proponent.  Davenport began by gathering a sizeable research library and lab animals.  He effectively enlisted professors from the nation’s most prestigious universities as “associates” at Cold Spring Harbor.  From its inception, Black says:  “Eugenics was nothing less than an alliance between biological racism and mighty American power, position and wealth against the most vulnerable, the most marginal and least empowered in the nation” (p. 57).

One of the powerful men aligned with eugenics was Oliver Wendell Holmes, a justice of the United States Supreme Court, who wrote nearly 1,000 opinions in 30 years on the bench.  He interpreted the country’s Constitution as a “living” document, changing with the decades in accord with new experiences and convictions.  Wounded at Chancellorsville in the Civil War, he read Herbert Spencer’s Social Statics while recovering.  Converted to Spencer’s version of Social Darwinism, Holmes subsequently rejected, as “inherently absurd,” the notion that all men “are endowed by their Creator with certain inalienable rights.”  Truth to tell, he decided:  might makes right.  “‘Truth,’ he declared, ‘is the majority vote of that nation that could lick all others'” (p. 119).  Humanitarians, do-gooders, religionists, sentimentalists–all equally evoked Holmes’ disdain.

Before Justice Holmes’ court came the case of Carrie Buck, a Virginia woman declared “unfit” to bear children.  Under the laws of her state, crafted by eugenicists citing the “research” of Davenport and his colleagues at Cold Spring Harbor, defectives should be sterilized in order to improve the biological basis of society.  Since Carrie Buck was anything but demonstrably defective, her case wound its way to the nation’s highest court.  Writing for the majority of his colleagues in 1927, Holmes declared that Carrie Buck was a “feeble minded” woman who should be sterilized.  “‘It is better for all the world, if instead of waiting to execute degenerate offspring for crime, or to let them starve for their imbecility, society can prevent those who are manifestly unfit from continuing their kind.’”  The state had the necessary power.  And, he concluded:  “‘Three generations of imbeciles are enough’” (p. 121).

Having demonstrated the power of eugenics in America, Black turns to its demonic results abroad.  Though folks like H.G. Wells in England insisted that the movement focus upon restricting the births of undesirables, growing numbers of scientists in northwest Europe, and especially in Germany, subtly advocated a more active approach:  “eugenicide.”  A report from the American Breeders Association toyed with the notion of “‘painless killing’ of people deemed unworthy of life.  The method most whispered about, and publicly denied, but never out of mind, was a ‘lethal chamber’” (p. 247).  Lethal chambers were widely used to eliminate unwanted pets in England, and a few eugenicists suggested that “idiots” and “imbeciles” could be similarly destroyed.  Unfit children, the American urologist William Robinson declared, should be chloroformed or given potassium cyanide.  And Madison Grant, in The Passing of the Great Race, summed it up:  “Mistaken regard for what are believed to be divine laws and a sentimental belief in the sanctity of human life tend to prevent both the elimination of defective infants and the sterilization of such adults as are themselves of no value to the community.  The laws of nature require the obliteration of the unfit and human life is valuable only when it is of use to the community or race” (p. 259).

Grant received a letter commending his book from an obscure German politician who referred to it as “his Bible.”  The politician was, of course, Adolf Hitler, whose “eugenic writings resembled passages from Grant’s The Passing of the Great Race” (p. 274).  And he would, in time, implement the eugenic policies so favored by Americans such as Grant.  Still more:  the Nazi legislation followed precedents already laid down by laws in America.  Naturally, Hitler derived his views from German eugenicists as well.  And during the 1920s there were vigorous proponents of race purification and perfection whose ideas bolstered his visions of a “master race” imposing its will upon the world.  To get such a race, “breeding facilities” were needed to “mass-produce perfect Aryan babies.”  They would fulfill Nietzsche’s aspiration for Ubermenschen–”taller, stronger and in many ways disease-resistant” (p. 367).  As defined by Hitler’s influential associate, Rudolf Hess, “National Socialism is nothing but applied biology” (p. 270).

To do this, professors, such as Ernst Rudin, were recruited and scholarly institutes, preeminently the Kaiser Wilhelm Institute, established.  Maps were drawn to indicate concentrations of “defective” and “half-breed” populations.  Such folks were dismissed as “worthless eaters” who led a “life unworthy of life.”  Fortunately for the professors, money flowed freely from the Rockefeller Foundation to various “researchers” in Europe.  A new punch card system, perfected and exported by IBM, facilitated such research.  Consequently, in the ’40s, “thousands of Germans taken from old age homes, mental institutions and other custodial facilities were systematically gassed.  Between 50,000 and 100,000 were eventually killed.  Psychiatrists, steeped in eugenics, selected the victims after a momentary review of their records, jotted their destinies with a pen stroke, and then personally supervised their exterminations” (p. 317).

More horrendous developments followed:  Buchenwald; Auschwitz; the Holocaust.  At Buchenwald, Dr. Edwin Katzen-Ellenbogen, who had spent many years in America and “served as chief eugenicist of New Jersey under then-Governor Woodrow Wilson” (p. 320), carried out experiments and supervised the killing of thousands of inmates.  He was, ironically, a Jew who was arrested by the Nazis in 1940.  At Auschwitz, another eugenicist, Josef Mengele–the “Angel of Death”–conducted scientific experiments and orchestrated the killing of thousands.  He was particularly interested in the study of twins, following the lead of Francis Galton, to determine precisely how genetics affected a person’s response to various experiments.  “Twins were the perfect control group for experimentation” (p. 348).

With the collapse of the Third Reich and the world’s horrified reaction to its eugenic aspirations, following WWII the movement’s leaders re-labeled it and re-furbished themselves.  Much of the alleged “scientific evidence” accumulated by eugenicists was discarded as spurious.  But the basic commitment remained.  The American Society of Human Genetics was established and elected as its first president an American, Hermann J. Muller, who had worked at the Kaiser Wilhelm Institute in the ’30s.  German eugenicists who were not implicated in Hitler’s policies settled quickly into research positions in America and Germany.  James Watson, famed for co-discovering the structure of DNA, was deeply involved with the Cold Spring Harbor Laboratory for 40 years.  The term “eugenics” was replaced with a more respectable term, “genetics,” and given fresh energy by environmentalists who focused upon the dangers associated with the world’s “population explosion.”

And we have, today, thousands of scientists clamoring for what Black labels “newgenics.”  To eliminate birth defects, to design a perfect baby, to make ourselves masters of our racial development, are goals widely embraced by many geneticists.  And some propose more radical steps.  “Mass social engineering is still being advocated by eminent voices in the genetics community” such as James Watson, who in 2003 labeled the “lower 10 percent” of the population “stupid” and confessed:  “So I’d like to get rid of that, to help the lower 10 per cent” (p. 442).  Others propose cloning as a means whereby we can perfect our species.  Brave New World approaches!

Black’s book is packed with carefully documented information–60 pages of dense, double-column footnotes.  His research, in various languages, clearly demonstrates the close (if unintended) ties between American eugenics and Nazi genocide.  Though the links may be more correlative than causative, they certainly indicate influence.  There are, as one might expect, certain blind spots in Black’s vision.  Strangely enough, he indignantly condemns Margaret Sanger for supporting sterilization for eugenic purposes while endorsing her stance on abortion.  To oppose the sterilization of a retarded person while allowing the killing of unborn children seems morally confused at best!  But the very title of the book, War Against the Weak, rightly alerts us to the unending struggle at the heart of our culture.

As a companion volume, Richard Weikart’s From Darwin to Hitler:  Evolutionary Ethics, Eugenics, and Racism in Germany (New York:  Palgrave Macmillan, c. 2004) deserves careful study.  Weikart is an associate professor at California State University, Stanislaus, who previously published Socialist Darwinism:  Evolution in German Socialist Thought from Marx to Bernstein.  He does meticulous research and reaches cautious conclusions.  But his message is akin to Black’s:  certain ideas, rooted in Darwinian biology, were brutally implemented in Hitler’s Germany.

Darwin himself, in his Autobiography, noted that a person such as himself, having discounted if not rejected the reality of God and immortality, “can have for his rule of life, as far as I can see, only to follow those impulses and instincts which are the strongest or which seem to him the best one” (p. 21).  Herbert Spencer and Leslie Stephen, almost immediately, developed this position into a purely naturalistic, evolutionary ethics frequently labeled Social Darwinism.  T.H. Huxley–”Darwin’s bulldog”–interpreted Darwin to conclude that “only from death on a genocidal scale could the few progress” (p. 74).  David Friedrich Strauss, famed for his radical portrayal of the purely human “historical Jesus” in The Life of Jesus, published an equally shocking book, The Old Faith and the New in 1872, urging that the old Christian faith be replaced by a naturalistic “worldview containing large doses of Darwinian science” (p. 33).  Though Darwinians today generally demand that the biological theory be separated from its social implications, Spencer and Huxley rightly recognized the inescapable link between them.

The Nazis, six decades later, stepped through Darwin’s door and openly rejected Judeo-Christian morality, seeking to establish a new ethic rooted in both Darwin and Friedrich Nietzsche, whose philosophy was a “direct response to evolutionary ethics” (p. 46).  Nietzsche, for example, encouraged suicide and euthanasia, even the elimination of disabled children and incurably ill adults.  Many Nazis drank deeply of Nietzsche, and they took seriously the call to revalue all values, to construct a new morality more attuned to natural science.  Indeed, Professor Weikart’s research shows that certain unintended consequences seem inevitable when certain ideas are enthroned in ideological movements.  No one doubts that Darwin would have personally detested the Nazis’ concentration camps, but they are perhaps inevitable consequences of the philosophy he advocated.

Weikart discovered and documents the fact “that many Darwinists believed that Darwinism had revolutionary implications for ethics and morality, providing a new foundation for ethics and overturning traditional moral codes” (p. ix).  Eminent thinkers, such as Darwin’s Cambridge mentor, Adam Sedgwick, immediately recognized this in 1859.  Writing his former student, Sedgwick protested that Darwin ignored the moral or metaphysical aspects of human nature.  Doing so would gravely harm mankind, reducing it “into a lower grade of degradation than” ever recorded by historians (p. 1).  Before Darwin, Weikart repeatedly emphasizes, the “sanctity of life” was an intact, governing principle throughout Western Civilization.  Murder, abortion, infanticide, suicide, and euthanasia were both condemned and illegal.  Though the “sanctity of life” ethic was deeply rooted in Christianity, even the manifestly humanistic ideals of the French Revolution–Liberty, Equality, Fraternity–were repudiated by Darwinists so as to celebrate “determinism, inequality, and selection” (p. 89).

A tight nucleus of positions and policies appeared wherever Darwinian evolution gained currency.  Ernst Haeckel, Darwin’s most aggressive and influential disciple in Germany, envisioned a radically different world, reconfigured by Darwin’s theory of natural selection.  Freed from centuries of Judeo-Christian tradition, Haeckel and his associates “denied any possibility of divine intervention, heaped scorn on mind-body dualism, and rejected free will in favor of complete determinism” (p. 13).  Darwinism, Haeckel argued, has inescapable ethical implications:  “(1) Darwinism undermines mind-body dualism and renders superfluous the idea of a human soul distinct from the physical body.  (2) Darwinism implies determinism, since it explains human psychology entirely in terms of the laws of nature.  (3) Darwinism implies moral relativism, since morality changes over time and a variety of moral standards exist even within the human species.  (4) Human behavior and thus moral character are, at least in part, hereditary.  (5) Natural selection (in particular, group selection) is the driving force producing altruism and morality” (p. 25).

Consequently, Haeckel advocated killing the “unfit” through abortion and infanticide and euthanizing the mentally ill as well as incurable cancer patients and lepers.  All such steps were a “logical consequence of his Darwinian monistic worldview” (p. 146).   Evaluating Haeckel, Weikart notes that “it is striking that the vast majority of those who did press for abortion, infanticide and euthanasia in the late nineteenth and early twentieth centuries were fervent proponents of a naturalistic Darwinian worldview.  Some did not overtly link their views on killing the feeble to Darwinism, though many did” (p. 149).  Anticipating the chorus of criticism, always issued by Darwinists who claim that biology has no social aspects, Weikart sums up his argument:  “First, before the rise of Darwinism, there was no debate on these issues, as there was almost universal agreement in Europe that human life is sacred and that all innocent human lives should be protected.  Second, the earliest advocates of involuntary euthanasia, infanticide, and abortion in Germany were devoted to a Darwinian worldview.  Third, Haeckel, the most famous Darwinist in Germany, promoted these ideas in some of his best-selling books, so these ideas reached a wide audience, especially among those receptive to Darwinism.  Finally, Haeckel and other Darwinists and eugenicists grounded their views on death and killing in their naturalist interpretation of Darwinism” (pp. 160-161).

This was also evident in the militarism so pronounced in WWI.  Darwinists frequently celebrated war as an effective means of natural selection.  The evolution of the human species advanced as “inferior races” were eliminated in combat.  The strongest rightly destroy the weak, and racial progress follows, as Darwin himself said.  In his Descent of Man, for example, Darwin predicted that “savage races” would be replaced by “civilized races,”  as was then taking place in New Zealand.  As H.G. Wells (whose simplistic evolutionary historical works were widely read) declaimed:  “‘there is only one sane and logical thing to be done with a really inferior race, and that is to exterminate it'” (p. 85).  To Franz Conrad von Hotzendorf, the Austrian chief of the general staff, naturalistic evolution explained everything.  He had read Darwin as a youngster, and thenceforth considered “‘the struggle for existence as the fundamental principle of all earthly events [and] the only real and rational foundation of any policy.’  History, he thought, was a continual ‘rape of the weak by the strong,’ a violent contest decided by bloodshed” (p. 173).  To the extent he had a moral code it was that of the ancient Sophist, Thrasymachus:  “‘”Right” is what the stronger wills'” (p. 173).

Weikart persuasively documents the degree to which such views permeated German society in the first three decades of the 20th century.  Such positions were by no means espoused by the majority of the people, but highly aggressive and influential people did espouse them.  Consequently, when Hitler seized power there were significant numbers of people prepared to support his racial and eugenic notions.  Most of them took their beliefs from popular writers, but those writers were generally disseminating positions espoused by eminent scientists like Haeckel.  With Haeckel–and then Hitler–they had rejected Judeo-Christian morality as antiquated and unscientific.  With them, they believed the “Aryan race” was the “master race,” entitled to rule “inferior” peoples.  When necessary, certain “inferior” people should simply be eliminated.  Old-fashioned advocates of the sanctity of life could simply be ignored because Darwinism more accurately described the true reality of things.  “In Hitler’s mind Darwinism provided the moral justification for infanticide, euthanasia, genocide, and other policies that had been (and thankfully still are) considered immoral by more conventional moral standards.  Evolution provided the ultimate goals of his policy:  the biological improvement of the human species” (p. 215).

# # #

160 Schroeder’s Cosmology


                Tremors cascaded through the philosophical community when Antony Flew, perhaps the world’s most famous atheist, recently announced that accumulating evidence pointing to a deistic Designer had persuaded him to believe in a Creator-God.  Flew has certainly not embraced theism, but he now seems aligned with thinkers such as Thomas Jefferson.  Explaining his new position, he named Gerald L. Schroeder as one of two thinkers who had most influenced him.  (The second man mentioned is Roy Varghese, the author of The Wonder of the World:  A Journey from Modern Science to the Mind of God.)  Following his undergraduate and graduate studies at the Massachusetts Institute of Technology and some years in America’s atomic energy establishment, Schroeder has worked at several research institutes in Israel.  In the midst of his scientific work, he also reared children, and their questions regarding Scripture and science prodded him to write three engaging and persuasive books:  Genesis and the Big Bang:  The Discovery of Harmony Between Modern Science and the Bible (New York:  Bantam Books, c. 1990); The Science of God:  The Convergence of Scientific and Biblical Wisdom (New York:  Broadway Books, c. 1997); and The Hidden Face of God:  Science Reveals the Ultimate Truth (New York:  Simon & Schuster, c. 2001).

            Since the same basic themes weave their way through all three books, I’ll focus on seven of what I take to be Schroeder’s central theses rather than dealing with each book independently.  Firstly, as a physicist seeking wisdom, he acknowledges the ultimate importance of metaphysics.  Aristotle, in his great treatise, Metaphysics, noted that “all men suppose what is called Wisdom to deal with the first causes and the principles of all things.”  We cannot but wonder why there is anything rather than nothing.  “Why is there an ‘is’?” (Face, 1).  This is the truly amazing question underlying all Schroeder’s books.  For to understand what Aristotle called the universal “being qua being” is the greatest of all intellectual challenges, and Schroeder believes the evidence (both ancient and modern) points to “a metaphysical Force that brought the physical universe into being.  The universe is the physical expression of the metaphysical” (Face, 1).

Summing up this conviction in his latest book, he says:  “The physical system we refer to as our universe is not closed to the nonphysical.  It cannot be closed.  Its total beginning required a nonphysical act.  Call it the big bang.  Call it creation.  Let the creating force be a potential field if the idea of God is bothersome to you, but realize the fact that the nonphysical gave rise to the physical.  Unless the vast amounts of scientific data and conclusions drawn by atheistic as well as devout scientists are in extreme error, our universe had a metaphysical beginning.  The existence–if the word existence applies to that which precedes our universe–of the eternal metaphysical is a scientific reality.  That single exotic fact changes the rules of the game.  In fact, it establishes the rules of the game.  The metaphysical has at least once interacted with the physical.  Our universe is not a closed system” (Face, 186). 

Given the importance of metaphysics, Schroeder’s second theme is the intricate relationship between mind, energy, and matter.  The universe, in its most ultimate and important aspect, is mental rather than physical.  The scientific materialism that has so dominated modern science–as well as certain currents of Greco-Roman philosophy–has dramatically lost its allure for 20th century physicists.  They virtually all now agree that “energy is the basis of matter” (Science, xii).  What’s increasingly clear, Schroeder argues, is that “wisdom and knowledge are the basis of, and can actually create, energy which in turn creates matter” (Science, xii).  This view is as controversial today as were Einstein’s theories a century ago, but most everything points to that conclusion.  John Archibald Wheeler, a renowned Princeton University professor of physics, has “likened what underlies all existence to an idea, the ‘bit’ (the binary digit) of information that gives rise to the ‘it,’ the substance of matter” (Face, 8).  This is a truly “profound” position, for it means “that information is the actual basis from which all energy is formed and all matter constructed” (Face, 154).  What’s really Real, as Plato discerned, are the ideas, the forms, that shape matter.  However offensive this may be to philosophical materialists, it is no more strange than the widely-accepted notion that “the massless, zero-weight photon,” the most elementary of elements, “gives rise to the massive weight of the universe” (Face, 154).

What scientists like Wheeler now recognize as information or ideas the ancient Hebrew Bible calls wisdom.  This leads to Schroeder’s third main emphasis:  the harmony between science and Scripture.  Today’s scientists, who have discovered the “underlying unity of the physical world,” are “on the brink of discovering an even more sensational reality, one predicted almost three thousand years ago, that wisdom is the basis of all existence.  ‘With the word of God the heavens were made’ (Ps. 33:6).  ‘With wisdom God founded the earth’ (Prov. 3:19)” (Face, 88).  The Bible (rightly read and interpreted by great exegetes like Maimonides and Nahmanides, allowing for both the literal and symbolic aspects of Revelation) illuminates and harmonizes the most recent scientific discoveries.  Only the Bible, of all the ancient religious texts, generates serious interest in sophisticated scientific circles.  “It alone records a sequence of events that approaches the scientific account of our cosmic origins” (Science, 80).  Indeed, “The parallel between the opinion of present-day cosmological theory and the biblical tradition that predates it by over a thousand years is striking, almost unnerving” (Genesis, 67).  The ancient words of “Divine revelation” accurately portray the creative process.  But “the words are only a part of the message.  The other part is placed within nature, the wisdoms inherent in the Creation.  Only when we understand those hidden wisdoms will we be able to read between the prophetic lines and fully understand the message.  With the help of science, we are learning to read between the lines” (Face, 173). 

            Fourthly, we now read creation in the awesome light of the Big Bang.  Fifty years ago, many physicists, such as Fred Hoyle, still held to the “steady state” cosmos, an eternally existent material world.  Christian thinkers (such as Athanasius in the fourth century and Thomas Aquinas in the 13th) who insisted on creation ex nihilo did so purely on the basis of Revelation, understanding that the word barah “is the only word in the Hebrew language that means the creation of something from nothing” (Genesis, 62).   It cannot be emphasized too strongly that creation ex nihilo “is at the root of biblical faith” (Genesis, 62).  Amazingly, like a relentless blizzard, recent cosmological data regarding the formation of the universe points to precisely such a singular event–an explosion of being–some 15 billion years ago, when literally everything was a super-concentrated bundle of energy “the size of a speck of dust.  It would have taken a microscope to study it” (Genesis, 65).  With incredible power, this concentrated bit of energy expanded and quickly transitioned into matter, taking form in accord with the four basic laws of the universe.  Though these laws largely explain the ways the universe thenceforward developed, they do not, in any way, explain why it came into being.  Neither physics nor any other branch of science can explain being.  So metaphysics, marvelously revealed in the Bible, is needed. 

            Granting the reality of the Big Bang and its explicit confirmation of Einstein’s theories, we move to Schroeder’s fifth point:  time is relative to the speed of light and is thus, like space and matter, hardly constant.  “It is highly significant that light was the first creation of the universe.  Light, existing outside of time and space, is the metaphysical link between the timeless eternity that preceded our universe and the world of time and space and matter within which we live” (Science, 165).  While we cannot fully fathom it, Einstein’s equations indicate that “tomorrow and next year can exist simultaneously with today and yesterday.  But at the speed of light they actually and rigorously do.  Time does not pass” (Science, 164).  A Rolex watch on the moon runs more rapidly than one on earth because it’s subject to less gravity.  Accelerated so as to approach the speed of light, it would virtually stop ticking but still work perfectly.  Depending on one’s reference point, “when a single event is viewed from two frames of reference, a thousand or even a billion years in one can indeed pass for days in the other” (Genesis, 34).  It is, thus, quite correct to insist that the Bible’s six-day creation account and a 15-billion-year-old cosmos are identical:  “Deep within Psalm 90, there is the truth of a physical reality:  the six days of Genesis actually did contain the billions of years of the cosmos even while the days remained twenty-four-hour days” (Science, 43).  However illogical it may seem, Schroeder insists that this is literally true:  “Six 24-hour days elapsed between ‘the beginning,’ that speck of time at the start of the Big Bang, and the appearance of mankind, and simultaneously, it took some 15 billion, 365-day years to get from ‘the beginning,’ or the Big Bang as astrophysicists call it, to mankind” (Genesis, 29).  From God’s standpoint–and that’s what’s recorded in the Genesis account–creation took six days.  
But His days are the same as 15 billion years from our perspective, as we weigh the scientific data. 

            To cite Schroeder more fully on this critical point, he says:  “To measure the age of the universe, we look back in time.  From our perspective using Earth-based clocks running at a rate determined by the conditions of today’s Earth, we measure a fifteen-billion-year age.  And that is correct for our local view.  The Bible adopts this Earthly perspective, but only for times after Adam.  The Bible’s clock before Adam is not a clock tied to any one location.  It is a clock that looks forward in time from the creation, encompassing the entire universe, a universal clock tuned to the cosmic radiation at the moment when matter was formed.  That cosmic timepiece, as observed today, ticks a million million times more slowly than at its inception” (Science, 58).  Amazingly, “This cosmic clock records the passage of one minute while we on Earth experience a million million minutes.  The dinosaurs ruled the Earth for 120 million years, as measured by our perception of time.  These clocks are set by the decay of radioactive nuclides here on Earth and they are correct for our earthly system.  But to know the cosmic time we must divide earth time by a million million.  At this million-million-to-one ratio those 120 million Earth years lasted a mere hour” (Science, 58).  Summing up what this all means, Schroeder says:  “In terms of days and years and millennia, this stretching of the cosmic perception of time by a factor of a million million, the division of fifteen billion years by a million million reduces those fifteen billion years to six days!” (Science, 58).  Though such statements may utterly perplex those of us mystified by the mysteries of modern physics, Schroeder’s tantalizing suggestions certainly open one’s mind to time’s dilation and the implications it has for understanding the “days” of creation.
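Schroeder’s arithmetic here is simple enough to check directly.  The sketch below (plain Python; the variable names are mine, and the million-million ratio is Schroeder’s figure from the passage just quoted, not an independently derived physical constant) performs the division he describes:

```python
# Checking Schroeder's million-million time-dilation arithmetic (Science, 58).
# The ratio and durations are his figures as quoted in the review above.

RATIO = 10**12          # cosmic clock ticks 10^12 times more slowly than Earth clocks
DAYS_PER_YEAR = 365
HOURS_PER_DAY = 24

def earth_to_cosmic_days(earth_years: float) -> float:
    """Convert an Earth-frame duration (in years) to cosmic-clock days."""
    return earth_years / RATIO * DAYS_PER_YEAR

# 15 billion Earth years shrink to about five and a half cosmic days:
cosmos_age = earth_to_cosmic_days(15e9)                    # ~5.5 days

# 120 million years of dinosaurs shrink to about an hour:
dino_reign = earth_to_cosmic_days(120e6) * HOURS_PER_DAY   # ~1.05 hours

print(f"Cosmic age: {cosmos_age:.2f} days; dinosaur era: {dino_reign:.2f} hours")
```

The straight division yields roughly five and a half days, which Schroeder equates with the six days of Genesis, and the dinosaurs’ 120 million years reduce to about an hour, matching his “mere hour.”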

            Periodically in his presentations Schroeder points out some of the glaring flaws of evolutionary theory, especially regarding the beginning of the universe and the origin of living organisms, what seems to me his sixth distinct emphasis.  His criticism comes in three categories:  1) the mathematical improbability of aimless evolution; 2) the glaring gaps in the fossil record; and 3) the willful deceptions advanced by some of evolutionary science’s premier advocates. 

Schroeder, like most physicists (and unlike many biologists), finds the cosmos’ structures deeply, indeed astoundingly, mathematical.  But, strangely enough, “many of the texts on evolution eschew any semblance of a mathematical analysis of the theories that random reactions produced this ordered, information-rich complexity” (Face, 120).  When they do, they encounter difficulties.  For example, two evolutionary biologists invited a renowned chemist, Henry Schaffer, to check the mathematical probabilities in their presentation, Population Genetics and Evolution, only to find their case undercut:  “evolution via random mutations has a very weak chance of producing significant changes in morphology” (Face, 120).  Most biologists just ignore the mathematical problems entailed in their presentations.  Yet to Schroeder, alleged “scientists” who blithely dismiss the mathematical improbabilities of evolution betray the fundamental nature of the reality they claim to explain.  He cites favorably the 1968 work of a Yale University physicist, Harold Morowitz, who carefully calculated the probability of earthly life evolving by random selection (as most evolutionists insist).  Five billion years, he showed, affords too little time “for random chemical reactions to form a bacterium–not an organism as complex as a human, not even a flower, just a simple, single-celled bacterium.  Basing his calculations on optimistically rapid rates of reactions, the calculated time for the bacterium to form exceeds not only the 4.5-billion-year age of the Earth, but also the entire 15-billion-year age of the universe” (Genesis, 111).  “In short, life could not have started by chance” (Science, 85). 

            Nor does the fossil record validate the theory of gradual, mindless, random evolution.  “Macro-evolution, the evolution of one body plan into another–a worm or insect or mollusk evolving into a fish, for example–finds no support in the fossil record, in the lab, or in the Bible” (Science, 16).  More than a century after Darwin, Niles Eldredge, one of the world’s most distinguished paleontologists, admitted that the evidence for the gradual evolution of living creatures was still lacking.  “The fossil record of the late 1900s,” says Schroeder, “is as discontinuous as that of Darwin’s (and Wallace’s) time” (Genesis, 134).  Species–and intricate things like eyes and gills–simply appear in the fossil record and “stasis, not change, is the trend with all species yet formed” (Genesis, 135).  During the five-million-year Cambrian explosion, all the major extant phyla suddenly appeared.  Neither before nor since has anything remotely similar transpired.  For example, five phyla appeared in the Cambrian Era with various kinds of visual systems.  But there is no “common ancestor” for these seeing creatures.  Indeed, “there is no animal, let alone an animal with a primitive eye, prior to these eye-bearing fossils.  Random reactions could never have reproduced this complex physiological gene twice over, let alone five times independently.  Somehow it was preprogrammed” (Face, 121).  This was definitively demonstrated by Harvard’s Elso Barghoorn, who studied the oldest fossil-bearing rocks available.  He discovered “fully developed bacteria” in rocks some 3.6 billion years old.  Living cells were, in fact, present at virtually the same time “liquid water first formed on Earth” (Face, 51).  “Overnight, the fantasy of billions of years of random reactions in warm little ponds brimming with fecund chemicals leading to life evaporated.  Elso Barghoorn had discovered a most perplexing fact:  life, the most complexly organized system of atoms known in the universe, popped into being in the blink of a geological eye” (Face, 51). 

            Finally, Schroeder condemns some eminent evolutionists for their calculating deceits.  To win their case in the court of public opinion, learned scientists have fudged the evidence.  Darwin himself–seven times in the Origin of Species–“implored his readers to ignore the evidence of the fossil record as a refutation of his concept of evolution or to ‘use imagination to fill in its gaps’” (Science, 31).  Shortly thereafter, Charles D. Walcott, the director of the Smithsonian Institution, collected 60,000 Cambrian fossils from Canada’s Burgess Pass.  Hauling them back to Washington, D.C., he stacked them away in laboratory drawers.  Perhaps the richest fossil collection dealing with the appearance of life on earth was deliberately shelved.  Walcott devoutly believed in Darwinian evolution, and the fossils he collected challenged that belief.  If the evidence challenges the theory, hide the evidence!  So Walcott simply kept the fossils out of sight.  Rediscovered in the 1980s, these fossils have played a major role in making clear “Evolution’s Big Bang” in the Cambrian Era. 

More recently, Harvard’s Nobel Prize-winning Professor George Wald declared (in Scientific American) that life had necessarily arisen from random chemical reactions.  To Schroeder, Wald’s remarks illustrate the fact that such views were “often based on poorly researched science presented as fact by one or a few noted personalities” (p. 110).  Wald was in fact so wrong that the magazine printed a retraction of his earlier declaration–an unheard-of correction of a Nobel laureate.  Further illustrating the deceitful strategies of eminent evolutionists, Schroeder shows how another Harvard biologist, Stephen Jay Gould, concluded an essay with a quotation from Darwin’s Origin of Species that (in Darwin’s text) left open the possibility that God might have played a role in creation.  Gould, however, deleted some of Darwin’s words and then capitalized a word in the middle of one of Darwin’s sentences to suggest the beginning of a new one, deftly altering the text’s original message.  It was a subtle but significant move, showing how commitment to a theory can lead to deliberate distortion of the truth.  Still more:  prestigious museum displays often maximize deception.  The London Museum of Natural History, setting forth a massive “demonstration” of evolution, managed only to portray “pink daisies evolving into blue daisies, little dogs evolving into big dogs, a few dozen species of cichlid fish evolving into hundreds of species of–you guessed it–cichlid fish.  They could not come up with a single major morphological change clearly recorded in the fossil record” (Face, 91).  How revealing it is that evolutionists routinely cite the “evolution” of fruit flies, following intensive genetic manipulation by scientists in laboratories, resulting in bewildering varieties of fruit flies! 

Turning to the computer-generated models that allegedly prove evolution, Schroeder scoffs at the poor mathematical work of celebrated biologists like Richard Dawkins, whose work “proves only that his computer is working correctly!” (Science, 108).   Quite simply:  “As a way of supporting arguments for evolution, computer programs blithely show the transition of outer body forms from amoeba to fish to amphibian to reptile to mammal and human.  The electronic displays deliberately ignore the intricacy of the molecular functions of each cell and serve only to repress the impossibility of randomness as the driving force behind these life processes.  They are in fact an exercise in deception, an insult to adult intelligence” (Science, 189). 

Seventhly, Schroeder regularly discusses human beings as the crowning work of creation, freely responding to the Creator throughout history.  We’re uniquely conscious of ourselves and God.  Amazingly, “If the universe is indeed the expression of an idea, the brain may be the sole antenna with circuitry tuned to pick up the signal of that idea” (Face, 105).  Designed with the ability to discern God’s work and presence in His world, with eyes that are external extensions of the brain, we have, since Adam, been making civilization, preeminently through the use of language, oral and written.  With a brain capable of housing “the information contained in a fifty-million-volume encyclopedia, we ought to be sufficiently wise to succeed at the task” (Science, 170), though we have no idea precisely how non-material words get embedded in and then recalled from a material brain.  Indeed, “the brain is amazing.  The mind is even more so” (Face, 147).  Just as a radio pulls in music from radio waves, so the mind gleans data from the brain.  Without a radio there’s no music; without a brain there’s no thinking.  But a dead radio no more destroys the music than a damaged brain destroys the mind. 

“‘With wisdom’ God created the heavens and the earth” is one way of translating the familiar first verse of Genesis.  Wisely thinking about it all, Schroeder says:  “Life, and certainly conscious life, is no more apparent in a slurry of rocks and water, or in the primordial ball of energy produced in the creation, than are the words of Shakespeare apparent in a jumble of letters shaken in a bag.  The information stored in the genetic code common to all of life, DNA, is not implied by the biological building blocks of DNA, neither in the nucleotide letters nor in the phospho-diester bonds along which those letters are strung.  Nor is consciousness implied in the structure of the brain.  All three imply a wisdom that precedes matter and energy” (Face, 178).

To gain such wisdom, fully informed by the best understanding of Bible and science, is our privilege. 

159 Secularizing America


            In the 1960s, there was significant cultural ferment as Pope John XXIII “opened the windows of the church” to the modern world and orchestrated the Second Vatican Council, alleged devotees of Dietrich Bonhoeffer (no doubt misrepresenting him) called for a “religionless” Christianity fully attuned to the needs of the world, and Harvey Cox celebrated the “Secular City” as the only forum for a vibrant church.  Perhaps unconsciously, these religionists joined hands with various secularists who, for a century, had emerged as the architects and arbiters of culture.  Traditional, orthodox Christianity, especially, was portrayed as a remnant of the Middle Ages, when faith, not reason, prevailed, and it was widely assumed that truly enlightened, reasonable people would become secular humanists. 

That picture must be re-evaluated in accord with a recent publication by Christian Smith, a sociology professor at the University of North Carolina, who has edited a valuable collection of essays, The Secular Revolution:  Power, Interests, and Conflict in the Secularization of American Public Life (Los Angeles:  University of California Press, c. 2003), that illuminates one of the singular developments of the past century.  Secularism has indeed triumphed in many sectors.  But we’ve been misled, says Smith, about the process that facilitated this triumph.  Secularism displaced religion–helped along by some within the evangelical establishment–as the entrenched worldview of the nation’s elite institutions, not because it is a necessary component of a technologically sophisticated and progressive society.  Rather, it was brought about by a committed band of skeptical, agnostic, liberal rebels, who plotted to re-make America in their own image.  They gained control of the “knowledge-production occupations” and promoted “materialism, naturalism, positivism, and the privatization or extinction of religion” (p. 1). 

            Until the last quarter of the 19th century, Scriptural Protestantism, rooted in a “Scottish Common Sense Realist epistemology and Baconian philosophy of science” (p. 25), largely shaped America.  “On the broadest scale, the Protestant establishment maintained that Christian virtue, free market capitalism, and civic republicanism were working together to beget a civilization higher than humans had ever known–begetting perhaps the kingdom of God” (p. 26).  By 1880, however, a new view had gained a foothold.  Generally “progressive” and “secular” in nature, champions of this position wielded “science” as a club with which to drive religion into carefully circumscribed ghettos.  By the 1930s, “the Protestant establishment was in shambles” (p. 28).  Ordinary believers still populated church pews.  But the universities, newspapers, and judiciary had largely turned secular. 

            This took place, Smith argues, because a self-anointed elite seized control of the nation’s power centers.  William James, speaking to the alumnae of Radcliffe College in 1907, summed up the attitude:  “‘We alumni and alumnae of the colleges are the only permanent presence [in America] that corresponds to the aristocracy in older countries.  We have continuous traditions, as they have; our motto, too, is noblesse oblige; and, unlike them, we stand for ideal interests solely, for we have no corporate selfishness and wield no powers of corruption. We ought to have our own class-consciousness.  “Les Intellectuels!”  What a prouder clubname could there be than this one?’” (p. 41).  William James and John Dewey, with their respective versions of pragmatism, hugely influenced 20th-century America–James reducing religion to psychology and Dewey driving it from the schools. 

            The secular elite embraced Kant’s revolutionary Enlightenment dictum:  “Dare to think!”  By which they meant:  think for yourself without restraint.  They particularly embraced the positivism of Auguste Comte, with his “religion of humanity” and sociological “science.”  Herbert Spencer’s naturalistic Social Darwinism also attracted Americans such as William Graham Sumner and Andrew Carnegie, whose philanthropy often included blatantly anti-Christian provisos.  Indeed, the more pernicious aspects of evolution were social rather than biological in nature.  Scores of young Americans studied in German universities, absorbing their “historicism, idealism, theological liberalism, higher biblical criticism, and the ideal of academic freedom as autonomous rationality” (p. 57).  Introducing these ideas into America, establishing their dominance everywhere, became the driving motivation of the secularists–Les Intellectuels!

            Smith traces this story in his essay, “Secularizing American Higher Education.”  For roughly 700 years higher education had flourished in the West under the auspices of Christianity.  America’s colleges, prior to the Civil War, clearly continued this tradition, providing a solid education for their constituents.  Radicals intent on transforming them infiltrated the colleges, however, and by 1900 a new zeitgeist reigned.  Rather than respecting the religious commitments responsible for founding the colleges, the new class sought to remove religion from the curriculum and relegate it to increasingly rare chapel services.  Prestigious presidents–Charles Eliot at Harvard, Andrew Dickson White at Cornell–pushed the levers of power to secularize their institutions.  A new academic discipline, Smith’s own sociology, was particularly committed to this process, with virtually all the leading professors quite hostile to religion. 

            Generally speaking, these secularists worked subtly within religious institutions to slowly subvert them.  As Smith shows, they camouflaged their endeavors, seeking to beguile the religious folk who supported the colleges.  What they said in public was often quite different from what they said in their “scholarly” writings, lectures, and work within the schools.  As one of the leaders of the discipline of sociology, Edward Ross, said, “The secret order is not to be bawled from every housetop.  The wise sociologist will show religion a consideration . . . [and] will venerate a moral system too much to uncover its nakedness” (p. 125).  Yet of the 170 general sociology textbooks published between 1880 and 1930, an overwhelming majority clearly sought to destroy “one moral and epistemological order” (p. 115), Christianity, and establish a secular counterpart.  The texts routinely embraced the views of Comte and Spencer, Darwin and Nietzsche.  Religion, as a purely personal emotional endeavor, might be tolerated.  But it could make no claims to knowing objective truth about anything important. 

            Kraig Beyerlein focuses on the role of the National Education Association in his fine essay, “Educational Elites and the Movement to Secularize Public Education.”  We who routinely observe the NEA in action at the Democratic National Conventions, as well as in local electoral campaigns, should not be surprised at its strongly secularist stance.  Established early (as the National Teachers Association in 1857) to promote “common Christianity” in the schools, the NEA had changed by 1880, moving within a decade to prevent the schools from conducting religious activities.  This turn followed an intense power struggle within the association, as those responsible for its founding sought to maintain its pro-Christian stance.  Beyerlein quotes eminent educators–many of them clergymen–in the 1870s and ’80s to show how thoroughly Christian were the nation’s public schools.  But as these men aged and were replaced by younger secularists, the NEA changed rapidly and a “new orthodoxy” was installed.  “Only vestiges of the old religious system remained” (p. 193). 

            Eva Marie Garroutte has a short essay, “The Positivist Attack on Baconian Science and Religious Knowledge in the 1870s.”  She carefully studied Popular Science Monthly and Scientific American, journals for the intelligentsia in that era.  Before 1880, America’s leading scientists, such as Yale’s Benjamin Silliman, were frequently devout Christians who approached their studies from a Baconian and Scottish Common Sense perspective, envisioning their work as a “sacred science,” marvelously compatible with the Scriptures.  They worked inductively, eschewing unwarranted hypotheses, committed to drawing universal truths from particular facts.  Ultimately, there were “laws discovered by induction [that] were understood teleologically as descriptions of the mediate intervention of the divine in the world” (p. 198).  The Bible too, said Charles Hodge, was a compendium of facts that, inductively studied, led to an accurate understanding of God.

            This approach to science was overwhelmed by the “positivist attack” Garroutte describes.  Promoting Spencer, Darwin, and Huxley, the new class of intellectuals worked to spread “the gospel of naturalism.”  Cornell University’s Andrew Dickson White helped lead the assault, with his famous (if deeply flawed) History of the Warfare of Science with Theology in Christendom.  The positivists shrewdly attacked language, insisting (in the nominalist mode) that there is no real bond between words and the realities they describe.  Horace Bushnell had earlier anticipated this cleavage, arguing that religious language could never depict religious truths with any precision.  One could never take anything in the Bible “literally.”  Thus as the Darwinian controversy escalated, the positivists insisted that the Bible’s version of creation was purely poetic.  Consequently, the “religion-friendly science” of the 19th century was replaced by the religion-hostile science of the 20th. 

            Reducing religion to psychology, a la William James, was a powerful part of the secularization process.  As the astute Yale historian H. Richard Niebuhr noted, reducing Christianity to psychology was a “‘sterile union’ resulting from the revolution introduced by William James and his followers” (p. 270).  In Keith Meador’s judgment, James’s religious teaching “is finally a threadbare version of pious Christianity, for James was, as he aptly characterized himself to a friend, ‘a Methodist minus a Savior’” (p. 292).  Though James’s Varieties of Religious Experience is warmly supportive of the phenomenon, he basically espoused a pious humanitarianism devoid of any doctrinal content.  The reality of sin never troubled him.  We are, by nature, in need of healing, not forgiveness.  To the extent preachers assure us that “‘God is well, and so are you’” (p. 292), all is well!  To move from James to the “I’m OK, You’re OK” pabulum of the ’60s–or the “self-esteem” educators’ frenzy of the past two decades–easily illustrates the power of his thought. 

One of the men supporting this move was Charles Clayton Morrison, long-term owner and editor of The Christian Century.  His story is evaluated by Keith G. Meador in “‘My Own Salvation’:  The Christian Century and Psychology’s Secularizing of American Protestantism.”  Sadly enough, by 1939 Morrison had come to lament this process, writing an essay, “How My Mind Has Changed,” and declaring:  “I had baptized the whole Christian tradition in the waters of psychological empiricism, and was vaguely awakening to the fact that, after this procedure, what I had left was hardly more than a moralistic ghost of the distinctive Christian Reality.  It was as if the baptismal waters of the empirical stream had been mixed with some acid which ate away the historical significance, the objectivity and the particularity of the Christian revelation, and left me in complete subjectivity to work out my own salvation in terms of social service and an ‘integrated personality‘” (p. 269).  Few statements that I’ve read more powerfully depict the trajectory of mainline Protestantism in the 20th century!

            Morrison was reared in the revivalistic atmosphere of his father’s preaching, within the Disciples of Christ denomination.  After alternately pastoring and attending college, Morrison graduated from Drake University in 1898 and headed for the University of Chicago to do graduate studies in philosophy under John Dewey and George Herbert Mead.  Along the way he embraced higher criticism, evolution, and the Social Gospel, which proved to be central tenets of his creed.  Darwin, he thought, was the 19th century mental giant who had freed man from ignorance.  Thus he wrote, in 1910:  “If there is any word which is especially dear to all modern liberals it is the word evolution.  We have never seen a liberal who is not an evolutionist” (p. 280).  This is important, because “it is Darwin’s theory of evolution that undergirds almost every psychological theory since the nineteenth century” (p. 284).  G. Stanley Hall, along with his teacher, William James, one of the most influential psychologists of that era, noted that as a student he discovered Darwin’s theory and rejoiced to know that “‘all worlds and all in them had developed very gradually and by an inner and unremitting impulsion from cosmic mist and nebulae . . . while all religions, gods, heavens, immortalities, were made by mansoul’” (p. 284). 

So theology must adjust to the truth of evolution, which for Morrison meant discarding anything supernatural in its claims.  Thus it is Jesus the very human teacher, not the eternally-begotten Incarnate Son of God, that matters.  Religious dogma must be tossed in the trash barrel of history.  Only empirically based, scientific studies–especially in the realm of psychology–can be trusted as authorities for living well.  As James declared in his influential The Principles of Psychology, “Psychology is the science of the mental life” (p. 283).  Self-realization, mental health, social service and genial good cheer should be the true Christian traits.  G. Stanley Hall wrote a two volume study of Jesus, titled Jesus, The Christ, in the Light of Psychology, wherein he basically reworked “Ernest Renan’s The Life of Jesus, so that Jesus is not the one in whom Christians believe, but the one with whom Christians believe” (p. 288).  In accord with F.D.E. Schleiermacher’s Liberalism, Hall adjusted attractive elements of the Christian message to his psychological world view. 

Now enters John Dewey!  Influenced by Hall and James, Dewey deeply influenced America’s educators–including religious educators!  Pastoral care in the 20th century, ironically, was deeply shaped by Dewey’s 1916 classic, Democracy and Education.  No longer, said Dewey’s acolytes within the seminaries and churches, should we worry ourselves with questions such as the existence of God.  What matters is how our idea of God affects our lives.  If our idea of God enables us to live more productively, it’s true enough for us.  Indeed “‘that conception of God is truest which aids most in guiding, ennobling, comforting, and strengthening man in his devotion to moral ends’” (p. 295).  Theology becomes psychology, preaching becomes therapy, pastoral care becomes counseling. 

All of these trends were celebrated by C.C. Morrison in The Christian Century.  Psychological texts, the works of James, Hall, Dewey, et al., were regularly reviewed and promoted in its pages.  In the midst of it all, Morrison seemed strangely oblivious as to what was really happening.  The allegedly Christian publication seemed remarkably akin to various secular magazines.  By 1939 he realized that much that constitutes Christianity had been deleted from his journal’s pages.  The Social Gospel he’d energetically promoted between WWI and WWII–espoused by the likes of Jane Addams and Harry Emerson Fosdick, featuring the routine support of labor unions, women’s suffrage, prohibition, and pacifism–seemed increasingly vacuous.  A decade later, Morrison declared that the secularization process he’d promoted throughout most of his life had robbed the Church of her riches.  Embracing the “modern culture” had been the “undoing of Protestantism.”  Rather than energetically critiquing the secular society, liberalism had critiqued Christianity!  Morrison concluded, sadly enough, that by embracing “Dewey’s language of ‘adjustment,’ the Darwin-inspired language of ‘progress,’” Protestant Liberalism had resulted in “an accommodation to psychology whose final result was the secularization of American public life” (p. 303). 

            The law, too, was deliberately secularized, as David Sikkink makes clear in “From Christian Civilization to Individual Liberties:  Framing Religion in the Legal Field.”  Following the Civil War, under the influence of law school professors, the legal profession moved away from its traditional grounding in the natural law.  Earlier judges had freely utilized biblical sources and moral reasoning in deciding cases.  A new elite insisted on a “case law” approach more “scientific” in nature, and religion was often labeled “sectarian” and thus somewhat irrelevant in the courtroom.  Then, as the 20th century dawned, this approach faded and there was a rapid shift “from general religion to civil liberties.”   Judges like Felix Frankfurter and Oliver Wendell Holmes increasingly insisted on “making” rather than merely “interpreting” the law. 

            Holmes was especially influential in this transition.  While a student at Harvard, he embraced science (without any God) as his religion, and Darwin’s theory especially affected him.  He also served valiantly as a soldier and saw some of the worst fighting in the Civil War, becoming quite cynical regarding mankind.  Back in Boston, he joined an elite intellectual circle–the “Metaphysical Club”–that included Charles Peirce and William James.  Embracing their pragmatism, Holmes published The Common Law in 1881 and quickly moved up through the judicial system, ultimately becoming a United States Supreme Court justice.  Holding that the law is whatever the courts decide it to be, without any higher grounding, he rigorously abolished many of the legal traditions that had guided jurists for more than a century.  The popular notion–often labeled “sociological jurisprudence”–that the Constitution is a “living document,” constantly changing as judges apply it to new situations, comes directly from Holmes.  

            In these essays, and others that I’ve not discussed, there is great attention to detail.  Hundreds of footnotes and bibliographic citations make this an eminently scholarly collection.  Basically written for sociologists–and occasionally getting overly absorbed in intra-mural quibbling about various theories of secularization–the book nevertheless rewards careful reading.  What’s clear is that a self-anointed elite has waged an intense war for more than a century, determined to take charge of the most powerful cultural institutions in America.  By changing the very nature of education, religion, science, and law, they have greatly changed American society.  But, these essays make clear, they won their victories neither through evidence nor argument.  They basically worked within various institutions until they had the power to impose their views.  And thus they have secularized America. 

# # #

158 Higher Education Woes

Anyone seriously concerned about the souls of our youth should ponder Vigen Guroian’s recent essay, “Dorm Brothel:  The new debauchery, and the colleges that let it happen,” in Christianity Today (February 2005), 45-51.  A professor of theology at Loyola College in Baltimore, Guroian confirms what Tom Wolfe describes in his widely discussed novel, I Am Charlotte Simmons, a searing portrait of the corruption of a young woman who attends an elite university such as Stanford or Yale.  Wolfe believes that colleges and universities have replaced churches as sources of moral authority and abandoned students to the anarchical nihilism described in his novel.   In her favorable review of I Am Charlotte Simmons, Professor Mary Ann Glendon, of Harvard Law School, laments that its descriptive passages of “binge drinking, foul language, academic dishonesty, and predatory sex” portray “a parent’s worst nightmare” (First Things, February 2005, p. 41).  Sending your children to war in Iraq may well threaten their character less than sending them to Harvard or MIT!

The collapse of standards on university campuses was anticipated in William F. Buckley’s 1951 treatise, God and Man at Yale:  The Superstitions of “Academic Freedom,” written within months of his 1950 graduation from his alma mater.   A 50th anniversary paperback edition has recently appeared (Washington, D.C.:  Regnery Publishing, Inc., 2002) and provides a helpful starting point from which to chart the course of higher education since WWII.  Buckley wrote the book to reflect his concern for “God, for country, and for Yale . . . in that order.”  Though an allegedly “Christian” university, Yale was clearly slipping away from even a passing commitment to God and His reality.  Arriving in 1946, after serving two years in the Army, Buckley (a very traditional Roman Catholic)  believed “that an active faith in God and a rigid adherence to Christian principles are the most powerful influences toward the good life” (p. lxiii).  Though Yale officially claimed, in those days, to be solidly Christian and freely took money from donors who believed it, Buckley’s assumption that the university would encourage such faith dissipated almost immediately. Undermining the university president’s pious pronouncements was a critical agnosticism that typified the faculty and infected many students.

For example, the most popular “religion” class was “entitled the Historical and Literary Aspects of the Old Testament” (p. 5).  Taught by the warmly winsome college chaplain, it was popular mainly because good grades were readily available for minimal effort.  A Philosophy of Religion class was taught by a professor who claimed to be a “nondogmatic” Christian who was, in fact, equally “open” to all religious positions, endorsing the “pluralism” that now characterizes many religion departments.  Another professor, a former Congregational minister, described himself as “80 percent atheist and 20 percent agnostic” (p. 8).  Yale’s philosophy department featured Paul Weiss, an agnostic, who delighted to “debunk” the Christian religion, along with various professors who immersed their students in the works of Bertrand Russell, David Hume, and assorted skeptics.  By assiduously sorting through one’s options, students could find a defender of orthodox Christianity, but the university’s instructors generally promoted the relativistic, instrumental views of John Dewey.

Similarly, most professors advocated varieties of socialism, condemning in the process the traditional American way of life.  They were especially critical of America’s individualism and free enterprise capitalism.  Commencement speeches, presidential proclamations, and catalogue declarations notwithstanding, Yale’s professors consistently espoused versions of collectivism.  Only a stalwart few openly sympathized with Karl Marx, but Marx himself had urged two routes to utopia:  1) violent revolution or, 2) “a slow increase of state power, through extended social services, taxation, and regulation, to a point where a smooth transition could be effected from an individualist to a collectivist society” (p. 42).  The second, more gradual approach obviously appealed to many of Yale’s professors who claimed to support the American way but consistently celebrated the demise of “rugged individualism” and endorsed the equalization of income, the “just” redistribution of wealth, steeply progressive death and income taxes, and the apparently infinite expansion of the welfare state.

Having distanced itself from God and country, Buckley argued, Yale had lost its raison d’etre.  To a large extent, he thought, this was because professors recklessly indulged themselves, rationalizing “academic freedom” as a license to promote personal ideologies rather than carrying out the mission of the university.  On some issues, of course, Yale was intensely dogmatic, says Buckley.  A racist would be quickly fired from the faculty.  Yet an atheist promoting his agenda in this “Christian” institution would almost certainly gain promotion and tenure!  Sadly, Buckley says, Yale’s professors failed to take seriously their calling, which is to guard the treasures of civilization and rightly shape students.  They had the power to affect “the destiny of the world” (p. 172).  But they were promoting an anti-Christian and pro-collectivist agenda that would destroy it.

Michael L. Budde and my colleague, John Wright, have edited Conflicting Allegiances:  The Church-Based University in a Liberal Democratic Society (Grand Rapids:  Brazos Press, c. 2004).  Inevitably, the essays are uneven in quality, but the general thesis of the book, summed up in Professor Wright’s Introduction:  “How Many Masters?  From the Church-Related to an Ecclesially Based University,” deserves careful consideration.  Christian institutions–by definition it would seem–should promote orthodox theology and encourage virtuous character.  They should, in short, be Christian!  But all too many of them–as Buckley’s portrait of Yale illustrated–claim to be what they’re not and largely leave students to their own devices.  Consequently, Wright insists, they will be shaped by the surrounding non-Christian culture rather than the Church.  To correct this, he argues, the Christian university should reconfigure itself as an “ecclesially based” institution that openly and fervently seeks to “initiate and socialize” both professors and students “into the polity and practices of the church” (p. 26).  Rather than prepare folks to work in the world, they should be ready to die for their faith!

Fundamental to Wright’s project is the radical overhaul–if not displacement–of the “liberal arts Christian college.”  For two centuries churches have conducted a great experiment:  inviting students to blend “faith and learning” in their colleges.  The faith was understood to be the faith of the Fathers, the traditional doctrinal and ecclesial positions of the community of faith.  The “learning,” however, was to be that of the broader world.  Thus the curricula and textbooks of “Christian colleges” almost always mirrored those of the secular world.  In time, Wright argues, the “Christian” aspect of the college could not but fade away.  Consequently, only a more militantly committed community–the “ecclesially based university”–can maintain its integrity.

Sharing Wright’s position, William T. Cavanaugh, in “Sailing Under True Colors:  Academic Freedom and the Ecclesially Based University,” argues that the liberal version of freedom enshrined in secular universities–a commitment to unlimited professorial autonomy–slowly erodes what must be central to a Christian institution:  the authority of Scripture and Tradition.  To Cavanaugh, “academic freedom” should be corporately understood, and any given university should be free to espouse its peculiar ideals.  But within a given university, consensus and commitment to the mission should preempt personal professorial preferences.  So a Christian college should freely and openly indoctrinate rather than offer options to students.

Wes Avram, having served as Chaplain at Bates College (once an American Baptist, now an unaffiliated “Christian” liberal arts college), writes perceptively about his role in “With Friends Like These:  Pathetic Chaplaincy and the Ethos of the Ecclesial College.”   In the 1990s, Bates’ vaguely Christian identity was, sadly enough, probably best evident in its buildings–ornate tombstones bearing witness to an earlier epoch.  Egalitarian and politically correct rhetoric had replaced theological pronouncements, and the college’s Baptist founders were unfailingly praised as social reformers rather than clergymen.  Spiritual concerns–sin, salvation, holiness and hope–had been shifted to health and counseling offices, and humanitarian service projects had replaced worship and personal discipleship.  Any form of evangelism was now frowned upon as fervently as inclusive, inter-faith services were praised.  Virtually nothing but “fundamentalism” could be safely denounced.  Working within this environment without embracing its ethos was, to say the least, challenging for Avram.  But his essay both illustrates the dead-end of the secularizing process and provides helpful hints as to how one functions within such a secularized “Christian” college.

John Milbank, as one would expect from an eminent advocate of “radical orthodoxy,” urges readers to recover a more ancient and worthy perspective in “The Last of the Last:  Theology in the Church.”  Rather than follow the “critical” pathways of academicians, he wants to get back to the pre-modern world where the Church was stronger and theology better.  He especially insists that “Scripture, tradition, and reason were not seen as separate sources prior to 1300” and should not be today (p. 240).  In particular, he urges us to recover the richness of Thomas Aquinas, rooting ourselves in a worldview prior to the devastating incursions of nominalism (ironically rooted in an essentially Islamic perspective) and restore a solidly Christian theology.

While the essays I’ve mentioned prove valuable, other essays in the volume illustrate some of the problems of postmodernism.  Amy Laura Hall’s disquisition on Edith Wharton, making a “case for women’s studies” that encourages students to be “strategically rude,” is a highly improbable (if predictably postmodern) interpretation of Wharton.  M. Therese Lysaught, in “Love Your Enemies,” labors to establish an improbable link between “the contemporary practice of the life sciences and the infrastructure of violence of the liberal democratic state.”  To her, the recent Human Genome Project is the “Manhattan Project for biology” (p. 113) and is yet another bastard son of  America’s evil war machine.

Lysaught’s animosity towards America, unfortunately, taints many of the essays in this volume.  That Michael L. Budde, one of the editors, favorably cites Noam Chomsky (one of the most irresponsible and hateful writers in America) indicates something of the ideological bias of the collection.  Generally speaking, the essays attack America’s “liberal democratic society” as an almost demonic entity that must be rejected if not destroyed.  But the real problem with maintaining Christian colleges, in my judgment, is not “liberal democracy.”  Christian institutions struggled to retain their identity long before liberalism or democracy flourished in the modern era.  It’s “the world, the flesh and the devil,” as St. John so perceptively warned long ago–the deadly sins of lust, pride, envy, sloth, and intemperance–that seduce and destroy such institutions far more than any pernicious political structures do.


Jim Nelson Black, former executive director of the Wilberforce Forum, has conducted extensive interviews with students and professors, as well as extensive research, for Freefall of the American University:  How Our Colleges Are Corrupting the Minds and Morals of the Next Generation (Nashville:  WND Books, c. 2004).  For our children’s sake, he argues, higher education in this nation must be overhauled, for the radicals of the ’60s now control the universities, making them centers of leftist activism, aiming “to transform the United States into a socialist utopia” (p. 4) rather than loci of higher learning.

Disciples of Antonio Gramsci and Herbert Marcuse–Bill and Hillary Clinton’s admitted “main academic influence” (p. 221) in their student days–have “marched through American institutions” and established their Marxist ideology.  Today’s professors, says USC’s Dallas Willard, more routinely cite Jacques Derrida and Richard Rorty, who are fully devoted to the “Marxist idea . . . that rhetoric is everything, and everything is political” (p. 256).  They fulfill Norman Thomas’s prediction:  “‘The American people will never knowingly adopt socialism, but under the name of liberalism they will adopt every fragment of the socialist program until one day America will be a socialist nation without ever knowing how it happened’” (p. 247).  This was clarified by Rorty himself, one of the luminaries of the current academy, who declared:  “The power base of the Left in America is now in the universities,” and it especially thrives in ethnic studies programs, “Women’s Studies programs, and Gay and Lesbian Studies programs” (p. 11).  Social Justice Centers almost always provide avenues for decrying racism, sexism, ageism, global capitalism, etc.  Displacing the despised Western Christian Culture of traditional curricula, the Left celebrates its adversary stance toward both Christianity and America.  Destroying traditional sexual standards–so evident in Tom Wolfe’s novel–is but part of a larger assault on all social norms.

Academic standards have softened.  “College seniors of today have no better grasp of general knowledge than high-schoolers a half century ago” (p. 29).   They don’t study Shakespeare, because courses on “the bard” are rarely offered and never required, even of literature majors.  They are generally required to take courses in non-Eurocentric studies, but almost never in American history.  To the degree they learn about this country, they hear about the abuses of slavery and patriarchy, of American imperialism and Puritanical prudery.  But they know little about the nation’s presidents or generals or founding documents.  Thus they’re primed to believe the fantasies of Oliver Stone and Michael Moore, taking as “historical” the rants of Howard Zinn and Noam Chomsky.  Consequently, Black says, “There is perhaps no more troubling aspect of the radicalization of the university campus than the revisionism that passes for history today” (p. 163).  Even at Harvard, says Professor Harvey Mansfield, too few students think critically, discerning the difference between coherent argumentation and emotional venting.

Moral standards are in “freefall,” according to Black.  Promiscuous and virtually anonymous sex–“hooking up” in today’s parlance–almost defines campus life.  Homosexual activity has especially gained approval, with “coming out” festivals and campus organizations vigorously promoting the gay lifestyle.  Professors in English departments now deal with subjects like “lesbian sadomasochism” and “the queer child” rather than Melville and Dickens.  “Transgendered scholars” now grace elite academic departments.  Pornography is “seriously” studied and various kinds of experimentation encouraged.  Consequently, STDs proliferate.  According to Meg Meeker, 20 percent of our teens have incurable herpes.  “Every third girl has the human papilloma virus (HPV).  HPV causes 99.7 percent of cervical cancer cases that kill over five thousand women each year.  One out of ten has chlamydia” (p. 203). 

Black’s data, sadly enough, confirm Tom Wolfe’s poignant portrait of the destruction of young people’s souls by the very institutions that should protect and nourish them.

In Going Broke by Degree:  Why College Costs Too Much (Washington, D.C.:  The AEI Press, c. 2004), Richard Vedder, for many years a professor of economics at Ohio University, drafts a sobering portrait of the serious financial failings of today’s colleges and universities.  Though the book contains ample data and sophisticated quantitative analysis, the thesis is clear:  in the non-competitive environment of higher education, student fees and tax monies can be endlessly increased in order to provide an increasingly comfortable life for increasingly unproductive professors.

Though costs have soared, there’s no evidence that students are better educated than they were 50 years ago.  Student grade point averages have soared while their Graduate Record Exam scores have declined.  Students are taught by graduate students–or adjunct instructors–many of whom barely know more than their charges.  Monies once devoted to instruction have been diverted to administrative tasks–jumping from 20 percent to 50 percent in 70 years.  Professors’ salaries have boomed while their teaching loads have decreased.  Smaller classes have been mandated, but though they make the professor’s life easier there’s no evidence that they improve student learning.  Enormous investments are made in “research,” but very little of it has value beyond the expansion of professors’ curricula vitae.

To bring sobriety and financial health to the nation’s colleges, Vedder proposes some radical changes, making them competitive and accountable to the broader public.  Privatizing public institutions, requiring the kinds of strategies and ethics that prevail in the private sector, and making educational institutions truly educational rather than “student service” centers would bring some sanity to higher education.

As a student at UCLA, Ben Shapiro openly challenged his leftist professors.  He kept a record of his experiences and then expanded it through research, writing Brainwashed:  How Universities Indoctrinate America’s Youth (Nashville:  WND, c. 2004).   He endured professors who praised Mao Tse-Tung and Islamic radicals.  He discovered that the “Democratic Socialists of America, the largest socialist organization in the US, is riddled with university faculty” (p. 35).  Thus Eric Foner, a celebrated Columbia University historian and fervent Marxist, responded to 9/11 with the declaration:  “‘I’m not sure which is more frightening:  the horror that engulfed New York City or the apocalyptic rhetoric emanating daily from the White House’” (p. 111).

Many of today’s professors, Shapiro says, are only minimally interested in scholarship, for they want to change the world.  Environmentalism, especially, has become a fervent faith for many professors.  To save the earth is a sacred mission.  So they attack SUVs, capitalism, free trade, and even the green revolution that has largely eliminated starvation in China and India.  One environmentalist, David Ehrenfeld, even argues that “the smallpox virus should not be destroyed since it kills only human beings” (p. 83).  And that, says another professor, CUNY’s Paul Taylor, would be “‘Good Riddance'” (p. 83).

This is obviously the book of a bright and very young writer.  Strong on anecdotes and alarming quotations, short on balance and analysis!  But it does reflect the reaction of one perceptive student.


Something of a companion to Shapiro’s is Mike S. Adams’ Welcome to the Ivory Tower of Babel:  Confessions of a Conservative College Professor (Augusta, GA:  Harbor House, c. 2004).  He particularly addresses the problems of Political Correctness–the straitjacket of thought imposed on teachers and students alike on many university campuses.  He awakened to the problem not long after he was hired to teach criminal justice at UNC-Wilmington in 1993.  At that time he was an atheist solidly committed to the Democratic Party.  In 1996, however, he traveled to Quito, Ecuador, where he interviewed a Catholic prisoner on death row who seemed to have a better perspective on life than he did.  Three years later he interviewed a “mentally retarded inmate on Texas’ death row” who quoted John 3:16.  Those encounters led Adams to buy a Bible and, before finishing it, embrace the Christian faith.  He also became a Republican.  Since he openly aired his views, he soon became a controversial professor, routinely attacked by colleagues.  Written as a series of letters, the book gives insight into Adams and his battles, illustrating the general conclusions of the more comprehensive studies discussed earlier.

# # #

157 Taking Sex and Home Seriously

“Facts do not cease to exist because they are ignored,” Aldous Huxley wisely noted, and the facts differentiating the sexes must no longer be ignored, says Stephen Rhoads in Taking Sex Differences Seriously (San Francisco:  Encounter Books, 2004).  Rhoads, a professor at the University of Virginia, takes sex differences seriously because he takes research seriously and finds overwhelming evidence “that sex differences are large, deeply rooted and consequential” (p. 4).   From preferred forms of humor to risk-taking, from physical strength to social needs, from competitive athletics to social support groups, the two sexes radically differ.  Yet for nearly half-a-century, as a pronounced part of the “sexual revolution,” influential activists have sought to minimize, if not erase, the differences between the sexes, talking learnedly about “genders” as “culturally determined” or mere matters of personal choice.

This is perhaps most clearly evident in athletics, which are far more important to boys than girls.  “One study of fourth and sixth graders showed that during free play, boys are competing with other boys 50 percent of the time whereas girls compete against each other only 1 percent of the time” (p. 168).   In the nation’s high schools and colleges, “girls outnumber boys in almost every extracurricular activity” except sports (p. 186).  Sports help society, for they are one of the few activities that help harness boys’ innately aggressive tendencies.  Boys socialize mainly through athletics, and they need “sports more than girls do because boys have more difficulty than girls in making friends” (p. 183).

But since 1972 the heavy hand of the federal government has sought to level the playing field for both sexes.  Consequently, over 20,000 “spots for male athletes disappeared” in university programs, and more than 350 men’s teams were jettisoned to make way for female athletes.  Thousands of men would like to voluntarily participate in varsity athletics, whereas women must often be enticed (through scholarships) to join a team.  Women are markedly less interested in competitive athletics, and fans decidedly prefer to watch men’s teams.  At the University of Virginia, where Rhoads teaches, “97 percent of total ticket revenues come from sales for men’s games, and 3 percent from sales for women’s games” (p. 164).

Nevertheless, “gender equality” in athletics (though not, one must note, in music or drama departments, much less in scholarships for women’s studies programs!) has been part of the “sexual revolution” for 40 years.  Rhoads attributes the success of the sexual revolution to three things:  1) the birth control pill, which made sex primarily “recreational” rather than “procreative”; 2) the counterculture of the ’60s, with its mantras of “if it feels good, do it” and “make love, not war”; and 3) the successful feminist movement.  Whatever its intent, however, Sally Cline says it is better labeled “the Genital Appropriation Era,” and what it “‘actually permitted was more access to women’s bodies by more men; what it actually achieved was not a great deal of liberation for women but a great deal of legitimacy for male promiscuity; what it actually passed on to women was the male fragmentation of emotion from body’” (p. 97).

Though a majority of women still hope to find a husband in college, “hooking up” has replaced dating on campus, and even dating rarely leads to marriage.  Though widely practiced, however, “hooking up” inevitably harms women.  It’s not an equal opportunity activity!  “The most sexually active women were just as likely as other women to think about love, commitment and marriage with the men they slept with.  Sexually active men thought less about love, commitment and marriage as they had more casual sex.  The men’s feelings about casual sex were often very positive.  The women’s were more often negative” (p. 106).  The men who enjoy promiscuous sex, however, generally disdain promiscuous women as possible wives!  “Of single men age 25 to 33, 74 percent agree that if they meet someone they want a long-term relationship with, they try to postpone sex” (p. 122).  All too many unmarried women, still cohabiting in their 30s, find themselves “‘acting like a wife’ while their partners are ‘acting like a boyfriend’” (p. 119).  It’s hardly surprising, then, that “since the sexual revolution began, women have been thinking worse of men” (p. 118).  Indeed, there’s lots of “rage” toward them.  Men, for many women, are “jerks.”

The sexual revolution also ignited the egalitarian dogma of culturally constructed “genders,” equally capable of rearing children.  Ignoring sexual differences, as do those who promote “androgynous parenting,” involves little more than believing one’s fantasies.  Staunch feminists, such as Joyce Carol Oates and Kate Millet, determined to abolish patriarchy, have insisted that “gender” is a human construct and “gender differences” are quite superficial.  Supreme Court justice Ruth Bader Ginsburg declared:  “Motherly love ain’t everything it has been cracked up to be” (p. 17).  Men pretend that women are better mothers, she says, simply because they want to escape the demanding work of child care.  More radically, Susan Okin, reflecting the typical academic feminist’s longing for a “just future,” envisions a world wherein “one’s sex would have no more relevance than one’s eye color or the length of one’s toes.”  Male and female distinctions would disappear, and both men and women would do domestic chores equally well.

Such views, though advanced by “researchers” in the ’70s, can no longer be honestly maintained.  Women and men really are different.  Women, for example, actually enjoy being with babies and toddlers more than men.  “Cross-nationally, girls show more interest in babies and are preferred as babysitters.  Neither Israeli kibbutzim nor U.S. communes have had any success in abolishing such sex roles, although many have made doing so their highest priority” (p. 26).  And babies–by a margin of “fourteen to one”–prefer being with their mothers rather than their fathers (p. 11).  “Young children, moreover, are quick to self-segregate by sex” (p. 25).  Like water flowing through the Grand Canyon, the sexes simply conform to the “patriarchal” stereotypes disdained by feminists.  In truth, says Alice Eagly, women really do “tend to manifest behaviors that can be described as socially sensitive, friendly, and concerned with others’ welfare” (p. 18).  They “find special pleasure in small groups of women–a preference that gives them practice at establishing intimate friendships” (p. 204).

Men, conversely, “tend to manifest behaviors that can be described as dominant, controlling and independent” (p. 18).  Men everywhere “want to ‘drive the car, pick the topic, run the war’” (p. 151).  Boys forever fight and refuse to listen to girls (p. 154).  So too (once grown-up) men “hate to be dominated.  Men attempt to climb workplace hierarchies in part because of their strong desire for a job with no close supervision” (p. 151).  In the judgment of Anne Campbell, “Deep inside, men are always on their own against the world” (p. 184).  That’s the way it is–and perhaps the way it should be!  Men also desire physically attractive women, for “researchers found that feminine beauty affects a man’s brain at a very primal level–similar to what a hungry man gets from a meal or an addict gets from a fix” (p. 59).  Like it or not, women are forever in a beauty contest, competing for men’s attention.  Women, by contrast, find men’s physical stature and financial accomplishments more alluring.  They want to “look up to” their husbands, both physically (wanting someone six inches taller) and intellectually.   Universally, women value a mate’s financial resources more highly than do men.  Men often prefer women who earn less than they do, but women almost never do.  Strangely enough, “highly paid professional women have an even stronger preference for high-earning men than do women working in less well-paid jobs” (p. 63).

Fathers are important not only as breadwinners, however; they are necessary for a child’s well-being, especially serving as disciplinarians and “guides to the outside world” (p. 80).  Fatherless children suffer.  They have significantly more developmental problems and die more frequently.  “Swedish boys in single-parent families are four times as likely to develop a narcotics-related disease, and girls are three times as likely” (p. 80).  The pain of losing a father through divorce is powerfully expressed by Jonetta Rose Barras, in Whatever Happened to Daddy’s Little Girl?:  “A girl abandoned by the first man in her life forever entertains powerful feelings of being unworthy or incapable of receiving any man’s love.  Even when she receives love from another, she is constantly and intensely fearful of losing it.  This is the anxiety, the pain, of losing one’s father” (p. 94).

Mothers, of course, are as important as fathers, and women instinctively “dream of motherhood” (p. 190).  In 1970 Germaine Greer published an influential manifesto, The Female Eunuch, that mocked motherhood.  “Now, thirty years later, she says she is ‘desperate for a baby. . . .  She mourns her unborn babies . . . and [has] pregnancy dreams, waiting with vast joy and confidence for something that will never happen’” (p. 205).  Today’s young women, frequently ingesting feminists’ formulae for the “good life,” choose to make careers primary, but they routinely lament their barrenness when they reach their 40s.  Lugging a leather briefcase to a lush office proves to be a poor substitute for a baby at one’s breast.  Giving birth to a baby profoundly changes most women, who discover a mysterious and joyous bond between themselves and their young.  Naomi Wolf, once pregnant, recorded that “the hormones of pregnancy” so changed her that she had to “question my entire belief system about ‘the social construction of gender’” (p. 205).

Greer and Wolf, however, no longer represent feminism.  Rather than acknowledge the research that challenges their prejudices, most feminists reject it.  They angrily denounce it, without citing evidence, as biased and anti-woman.  Gloria Allred labels such research “harmful and dangerous” (p. 19).  Feminists, deeply embedded in publishing houses, promote school textbooks that celebrate atypical women, actually giving “more attention to Maria Mitchell, a nineteenth-century astronomer who discovered a comet, than to Albert Einstein” (p. 40).  Scholars (many of them women) who dare question feminist claims face disrespectful audiences and personal attacks, and they have difficulty finding publishers for their research.

What seems clear, Rhoads says, is that there are “two kinds of females, one kind of male” (p. 29).  Men, universally, see themselves as protectors and providers.  Women seem to divide (perhaps in accord with their testosterone levels) into semi-masculine and fully feminine groups.  Strong feminists, Rhoads suggests, embody the competitive, aggressive traits of men.  Many of them–Gloria Steinem, Kate Millett, Simone de Beauvoir–never married.  “A disproportionate number of female business executives were athletes in school” (p. 173).  Thus the only segment of women in America with a majority identifying themselves as “feminists” is a tiny cadre “making more than $100,000 a year” (p. 35).  These powerful women, when pregnant, often “see the body as an imperfect tool that the more perfect self should control.  They tend to experience pregnancy and birth as unpleasant because they are so out of control.”  They cannot understand home-birthers, who see themselves as “actively growing the baby” in their wombs.  Rather, they “see the baby as a separate entity,” a “foreign body growing inside my body” (p. 36).  So naturally they cannot understand the clear majority of women who want to be home with their kids, who appreciate traditional husbands who protect and provide.

Yet, Rhoads concludes, these traditional men and women are the ones who find life most satisfying.  Women really do find joy in rearing children and homemaking.  Men really thrive when they can succeed in work and thereby support a family.  Taking sex differences seriously explains why this is so.  Everyone concerned with marriage and family, with the health of young people, should take seriously Rhoads’ research.  As the distinguished Rutgers University professor of anthropology Lionel Tiger says:  “The Empress of Androgyny has no clothes.  Steven Rhoads provides a responsible, clear, exhaustive and convincing description of human sex differences and what they mean for social policy and personal life.  While members of the academy rush to consume ‘natural’ foods and protect ‘nature,’ they simultaneously ignore and even avoid ‘human nature,’ especially in the sexual sphere where political intensity is greatest.  Rhoads offers a generous-minded but hard-headed corrective to ideological fatuities and concernocrat assertions that have polluted the intellectual air.  And his scholarship is as punctilious as his writing is efficient” (book jacket).

* * * * * * * * * * * * * * * * * * *  * * * * * *

Mary Eberstadt’s Home-Alone America:  The Hidden Toll of Day Care, Behavioral Drugs, and Other Parent Substitutes (New York:  Sentinel, c. 2004), adds a journalist’s perspective to Rhoads’ more scholarly treatise.  Though she is a research fellow at the Hoover Institution, and provides both footnotes and bibliography to document her case, she writes for a general audience and addresses “one of the fundamental changes of our time:  the ongoing, massive, and historically unprecedented experiment in family-child separation in which the United States and most other advanced societies are now engaged” (p. xiii).  As the book’s subtitle indicates, she argues, courting controversy, that we simply must fill homes with adults who rightly nurture children.

In a chapter devoted to day care, markedly similar to a chapter in Rhoads’ treatise, Eberstadt notes that day care has become a fixture because 70% of mothers with children under age six are now working.  However necessary for moms who lack other options, day care harms kids.  Child care advocates, especially those who promote universal, governmentally-funded child care, are sadly misguided.  “In sum,” she insists, “the real trouble with day care is twofold:  One, it increases the likelihood that kids will be unhappy, and two, the chronic rationalization of that unhappiness renders adults less sensitive to children’s needs and demands in any form” (p. 19).  Rather than “overparenting” or excessive maternalism–portrayed by “separationists” as harmful to children–kids need parents who are continually present, constantly involved in their activities.

When parents are absent, children face the “furious child problem.”  Shooters in our schools most clearly illustrate it.  But less lethal forms of savagery–”feral behavior,” Eberstadt calls it–have markedly increased and generally develop in “stressed, single parent homes,” where children spend little time with the only adults capable of inculcating academic and social skills.  Careful studies show that, even accounting for “differences in family income and in parental education, marital status, and total hours worked, the more hours parents are away from home after school and in the evening, the more likely their children are to test in the bottom quartile on achievement tests [emphasis added]” (p. 37).

The same goes for physical and mental fitness.  “Fit parents, fat kids” describes the U.S.  While “boomers” embrace vegetarianism and haunt health clubs, their kids (at home alone) veg-out on chips and soda while playing video games.  Between 1960 and 2000, “the percentage of overweight children and teenagers tripled” (p. 41).  Too little breast-feeding, too much TV, too little exercise, too few family meals, too much fast-food grazing.  In sum:  “Today’s child fat problem is largely the result of adults not being there to supervise what kids eat” (p. 54).  Mental disorders and suicides have soared.  Depression, autism, learning disabilities, ADD all point to troubled children.  Yet Eberstadt wonders whether much of this may simply be “a legitimate emotional response to the disappearance from children’s lives of protecting related adults” (p. 78).

Rather than stay home with their kids, parents increasingly rely on psychiatric drugs to keep them pacified.  In one decade (1987-1996) the number of kids taking such drugs tripled.  Prozac eases girls’ depression; Ritalin curbs boys’ exuberance.  Physicians dispense prescriptions rather than probe the hidden hurts in kids’ hearts.  Parents and teachers who ban even pictures of guns from school routinely endorse mind-altering drugs such as Ritalin.  A significant number of school shooters, such as Kip Kinkel in Oregon and Eric Harris in Colorado, were taking prescribed drugs.  Ritalin has been widely prescribed and now thrives on an underground black market as kids with prescriptions turn dealers with their surplus pills.  Angry musicians, including Kurt Cobain and Eminem, loudly lamented being placed on Ritalin as children.  Eberstadt fears that “children and teenagers are increasingly treated with performance-enhancing drugs not only to help them compete, but also to relieve the stresses that their long, out-of-home, institutionalized days add to the adults around them, the teachers, parents, and other authorities” (p. 102).

She believes this, in part, because of “the primal scream of teenage music,” perhaps the most disturbing chapter in the book.  Older folks (like myself) who can’t stand the music and simply hope it goes away should carefully heed Eberstadt’s analysis of it.  Digging beneath the profanity and violence of rap music, Eberstadt argues that “if yesterday’s rock was the music of abandon, today’s is that of abandonment” (p. 106).  The lyrics of Pearl Jam, Kurt Cobain, Eminem et al. rail against “the damage wrought by broken homes, family dysfunction, checked-out parents, and (especially) absent fathers” (p. 106).  Tupac Shakur, a violent rapper gunned down in his 20s in Las Vegas, was “a boy who ‘had to play catch by myself,’ who prays:  ‘Please send me a pops before puberty’” (p. 114).  Eminem’s songs emphasize “the crypto-traditional notion that children need parents and that not having them has made all hell break loose” (p. 117).  In a song written for a movie he made, Eminem “studies his little sister as she colors one picture after another of an imagined nuclear family, failing to understand that ‘momma’s got a new man.’  ‘Wish I could be the daddy that neither of us had,’ he comments” (p. 117).  Though their parents don’t want to admit it, kids buying Eminem’s albums by the millions generally crave the Ozzie and Harriet homes of the ’50s.

For one thing, kids in the ’50s were free from the “ravages of ‘responsible’ teenage sex” that now devastate our young people.  STDs run rampant, thanks to absent parents and the “safe sex” mantra of “sex educators.”  Five of the ten most frequently reported diseases in modern America are STDs.  Parents and teachers who are paranoid about smoking tobacco tolerate and even encourage far more lethal experimentation with sex.  But “if tobacco were doing to teenage girls’ lungs what intercourse and oral sex are now doing to their ovaries and other female organs, there would be no more adult talk of ‘safe sex’ than there is talk of ‘safe cigarettes.’  The difference is that one can always stop smoking, whereas some of the STDs are for keeps” (p. 140).  Kids are now “sexually active,” it’s clear.  And empty homes, with both parents working from dawn to dark, provide perfect places for teenage trysts–91% of which take place there after school.

Sex abuse also escalates when parents aren’t around.  Men rarely abuse their biological children.  A British psychiatrist, Theodore Dalrymple, declares:  “‘He who says single parenthood and easy divorce says child sexual abuse'” (p. 137).  Another scholar, David Blankenhorn, notes that 86 percent of child abusers “were known to the family, but were someone other than the child’s father” (p. 137).  “In the statistics on teenage STDs lurks one of the saddest stories in this book,” says Eberstadt.  “Here is a clear-cut example that laissez-faire parenting has caused real harm to millions of teenagers, most seriously the girls whose bodies now carry viruses latent with short- and long-term problems–everything from infertility to increased risks of various cancers.  Many of them do not even know what they have, and neither do their happy-talk parents who continue on in their unenlightened happy-talk way–responsibly buying their responsible adolescents birth control, all the while clinging to ideological reassurances about ‘responsible’ teenage sex” (p. 138).

In conclusion, Eberstadt says we need not more day care and more parent-child separation but, rather, “the adoption of a higher standard that acknowledges what has too long gone unacknowledged:  the benefits of increasing the number of intact adult-supervised homes” (p. 172).

156 David Horowitz on the Left


               Few thinkers understand the American Left as well as David Horowitz.  Reared in New York as a “red diaper baby,” deeply committed to Marxism, in the ’60s he edited the most widely-read counter-cultural periodical, Ramparts Magazine, helping to inspire and orchestrate the anti-war movement of that era.  Making an about-face in the ’70s, he has become a trenchant critic of today’s Left.  He fully understands its ideology and knows personally many of its most prominent spokesmen.  Horowitz’s latest book, Unholy Alliance:  Radical Islam and the American Left (Washington, D.C.:  Regnery Publishing, Inc., c. 2004) provides readers a valuable analysis of the latest efflorescence of radicalism and its influence on the liberal mainstream of this nation.

            The debris from the World Trade Towers had barely settled before Leftists began to blame America for both the murderous attacks and the manifold woes of the world.  Rather than condemn the murderous Moslem terrorists, intellectuals such as Susan Sontag and Barbara Kingsolver decried the “root causes” responsible for terror.  Activists staged “peace vigils” and “teach-ins” to protest America’s villainy as her troops attacked the Taliban tyrants in Afghanistan.  Columbia University Professor Eric Foner, an unabashed Marxist who was elected president of both the American Historical Association and the Organization of American Historians in the 1990s, declared:  “I’m not sure which is more frightening:  the horror that engulfed New York City or the apocalyptic rhetoric emanating daily from the White House” (p. 15).  President Bush’s “Manichean vision” was “deeply rooted in our Puritan past and evangelical present,” Foner declared, making it the moral equivalent of the fanaticism of Osama bin Laden.

            When the war against terror shifted to Iraq, the Left mounted a furious attack on President Bush.  Anti-war demonstrations, organized by International ANSWER (an openly Bolshevik-style group noted for its support for North Korea), featured speakers who called America a “rogue” or “terrorist” state and likened George Bush to Adolf Hitler.  “No Blood for Oil,” the protestors screamed.  A handful of Democratic Congressmen, such as John Conyers and Charles Rangel, supported these radical protests, and professors in hundreds of universities paraded to various podia to revive and revise the anti-Vietnam War rants of the ’60s.  Along with the war, the protestors reviled capitalism–specifically the “globalization” of detested companies such as Halliburton–and demanded the implementation of their utopian visions of “social justice.”  In the process, militant Leftists in America have sided with Islamists abroad as part of their endeavor to radically change their own country.

The Marxist critique of America suffused the radical Islamism of Iran’s Ayatollah Khomeini and (subsequently) his protégé, Osama bin Laden.  When Iranian students seized hostages at the American embassy in Teheran in 1979, they referred to the U.S. as the “Great Satan.”  Enjoying the support of Iran’s Communist Party, Khomeini adopted a Leninist approach to transforming Iran, establishing revolutionary tribunals, purging dissidents, and making friends with the U.S.S.R.  The only free, democratic states in the Middle East, Lebanon and Israel (the “Little Satan”), were targeted for destruction.  Christian Lebanon–a jewel of freedom and prosperity in that region–was soon destroyed by Syria and the PLO.

When the U.S. attacked Afghanistan and Iraq, aging anti-Vietnam War protesters, heeding the call of Ramsey Clark and others, found new life, flying the Palestine Liberation Organization flag much as they flew the Vietcong flag decades ago.  Horowitz notes that youthful protesters, trashing cities such as Seattle when they hosted meetings of the World Trade Organization, are manipulated by hard-core communists in groups like International ANSWER.  Allied organizations, including the Coalition for Peace and Justice, blessed by the National Council of Churches, have brought a religious fervor to the anti-war movement.  Horowitz adeptly traces its nihilistic views to their ideological source:  Karl Marx, who said, “Everything that exists deserves to perish” (p. 50).  Undaunted hate for what is, unmitigated hope for what is not but is to come, marks socialism.  Thus part of the socialist agenda includes an Anti-Americanism intent on replacing the American system with something akin to Cuba.  Leftists at the beginning of the 21st century illustrate an affinity with their 20th century predecessors, when, Whittaker Chambers said, “men banded together by the millions in movements like Fascism and Communism,” determined to undermine their own nations.  Consequently, “treason became a vocation whose modern form was specifically the treason of ideas” (p. 48).

            Many intellectuals on the Left take Noam Chomsky (a Massachusetts Institute of Technology linguist) as the North Star for the movement.  “No individual has done more to shape the anti-American passions of a generation,” says Horowitz.  Professors cite him, thousands of university students flock to his lectures, and he enjoys an enormous international reputation.  Anti-American Europeans take particular delight in citing his analyses.  Chomsky claims to be an anarchist and certainly advocates a nihilistic agenda:  destroy the United States, whose history is laced with little more than atrocities and genocide, and something better (precisely what, he never says) will replace it.  America is the “Great Satan” to Chomsky, and his venom inspires both critics and enemies of this country.  Less than two weeks after American forces invaded Afghanistan, for example, Chomsky told a friendly crowd that the U.S. was the “greatest terrorist state” on the planet.  He also predicted that American troops would orchestrate a genocide, annihilating millions of civilians in that country.

            Aligned with Chomsky is Howard Zinn, whose “signature book, A People’s History of the United States, is a raggedly conceived Marxist caricature that begins with Columbus and ends with George Bush.  It has sold over a million copies, greatly exceeding the sales of any comparable history text” (p. 102).  Praised by professors such as Eric Foner, Zinn has been feted by academics and touted by movie directors and music celebrities.  His historical treatise is required reading in hundreds of classrooms.  The New York Times Book Review endorsed it “as a step toward a coherent new version of American history.”  Zinn sees American history as a long record of injustice wherein the powerful have exploited the weak.  Indians, slaves, labor unionists, socialists, et al. have endured brutality throughout this nation’s history.  Following the lead of Chomsky and Zinn, “Entire fields–’Whiteness Studies,’ ‘Cultural Studies,’ ‘Women’s Studies,’ ‘African American Studies,’ and ‘American Studies,’ to mention some–are now principally devoted to this radical assault on American history and society and to the ‘deconstruction’ of the American idea” (p. 106).

            Veteran “movement” activists, such as Leslie Cagan, who brings 40 years of radicalism to her position as “national coordinator” for the Coalition United for Peace and Justice, celebrate the virtues of Communism.  She lived 10 years in Cuba, years that “made it seem like I died and went to heaven” (p. 173).  She and others in the current anti-war movement have embraced the cause of radical Islamists, ever supporting terrorists who are brought to trial, and rallying to the ACLU’s defense of Professor Sami al-Arian, who used his position at the University of South Florida to finance and promote terrorist groups such as Islamic Jihad.  One of the professor’s organizations, the Islamic Committee for Palestine, raised money to subsidize Palestinian “martyrs” by urging donations of “$500 to kill a Jew” (p. 190).  When the FBI arrested al-Arian, he instantly attained “victim” status and was staunchly defended by USF’s faculty union and the American Association of University Professors!

            Hysterical opposition to the Patriot Act is another Leftist trademark.  Bernardine Dohrn, who three decades ago helped lead the Weather Underground (responsible for bombings and various terrorist acts), is now a law professor at Northwestern University and enjoys the esteem of her colleagues in the American Bar Association.  In a 2003 article published in Monthly Review, a Marxist periodical, she urged resistance to both American imperialism abroad and counter-terrorism at home.  She somberly warned against imminent McCarthy-type measures everywhere threatening our liberties–specifically evident in John Ashcroft’s moves against Islamic charities (clear channels, Horowitz says, for moving funds from the U.S. to Middle East terrorists).

            Professor Dohrn’s views have been endorsed by prominent spokesmen in the Democratic Party.  House Minority Leader Nancy Pelosi and former Vice President Al Gore became stridently anti-Bush and anti-war as the war against Iraq developed.  Senator Ted Kennedy claimed that Bush and a cabal of conspirators concocted the plan for war in Texas in order to gain political advantages.  “This whole thing,” said Kennedy, “was a fraud” (p. 236).  Former President Jimmy Carter (revealing an affinity for the Left that would be manifestly evident when he sat in a special box with Michael Moore at the 2004 Democratic Convention) proclaimed positions that garnered him the Nobel Peace Prize.  Carter was praised by the Nobel Committee for “promoting social and economic justice” and condemning “the line the current U.S. Administration has taken on Iraq” (p. 216).  Carter claimed that the Iraq war reversed 200 years of American foreign policy, which had “been predicated on basic religious principles, respect for international law, and alliances that resulted in wise decisions and mutual restraint” (p. 221).  All these principles, he said, had been trampled underfoot by George W. Bush.

            In truth, Horowitz suggests, the anti-war movement has little to do with the specific situation in the Middle East.  It’s simply the latest edition of a century-long struggle between the socialist Left, committed to replacing the American system with a socialist utopia, and the patriotic (if oft-naïve) citizens who support their country’s economic and foreign policy traditions.

* * * * * * * * * * * * * * * *

            To understand Horowitz, one can supplement his earlier autobiography, Radical Son, with a recent compilation of articles edited by Jamie Glazov, Left Illusions:  An Intellectual Odyssey (Dallas:  Spence Publishing Company, 2003).  The collection is taken from books and articles, often written for on-line publications such as salon.com and FrontPageMagazine.com.  The book is divided into ten sections, tying together essays on topics such as race, the new left, Antonio Gramsci (the Italian Communist who has deeply influenced American Leftists), the post-communist left, and the war on terror.  Introducing Horowitz, Glazov quotes Camille Paglia’s appraisal of him as an “original and courageous” thinker whose “spiritual and political odyssey [will prove] paradigmatic for our time” (p. xii).

That odyssey is documented by Glazov in a short essay, describing Horowitz’s role in shaping the ’60s generation, followed by his turn to conservatism.  He slowly realized, in the ’70s, that “social engineers could not reshape human nature,” contrary to one of the core Marxist dogmas, and that the Left’s rhetoric and aspirations were, sadly enough, sheer illusions.  He lost the faith that binds together the revolutionary left.  Reflecting on it, Horowitz believes Sigmund Freud rightly understood (in Civilization and Its Discontents) the problem.  Socialists dream of a beautiful world wherein love and justice reign.  Consequently, Horowitz laments, socialism is “an adult fairy tale.  Socialism was a wish for the comforting fantasies of childhood to come true.  I had an additional thought:  the revolutionary was a creator, just like God.  Socialism was not only a childish wish, but a wish for childhood itself:  security, warmth, the feeling of being at the center of the world” (p. 100).

            In a chapter entitled “The Road to Nowhere,” Horowitz develops the arguments of a Polish philosopher, Leszek Kolakowski, an eminent former Marxist, who said:  “The self-deification of mankind, to which Marxism gave philosophical expression, has ended in the same way as all such attempts, whether individual or collective:  it has revealed itself as the farcical aspect of human bondage” (p. 126).  Kolakowski clearly saw the deeply religious roots of Marxist ideology, an endeavor to end man’s estrangement and bring into being a “new man.”  Frankfurt School Marxists, such as Herbert Marcuse, derided the “commodity fetishism” of the capitalist system that resulted in a truncated “one dimensional” man.  But we now know the pervasive ills that have resulted wherever the Marxist recipe has been followed.  For example, in 1989, Soviet citizens ate half the meat Russians enjoyed in 1913 under the Czars.  When you’re not eating anything, “commodity fetishism” looks more like blessed abundance!  Tiny Taiwan and Switzerland each exported more manufactured goods than the Soviet Union, whose “factories” were models of inefficiency.  South African Blacks under apartheid “owned more cars per capita than did citizens of the socialist state” (p. 133).  The socialist hopes were all illusions.

Since he apostatized, Horowitz has suffered endless, venomous attacks from his former colleagues.  This results, in part, from Horowitz’s thorough understanding of the cause he once championed.  He recognizes how constantly the Left indulges in “Telling It Like It Wasn’t.”  In an essay by this title he focuses on a PBS documentary, “1968:  The Year that Shaped a Generation,” which was virtually dictated by former leaders of the Students for a Democratic Society such as Tom Hayden, who now defend their ’60s radicalism as a species of liberal reformism.  But Hayden misrepresents himself, Horowitz says, for:  “By 1968, Hayden was already calling the Black Panthers ‘America’s Vietcong’ and planning the riot he was going to stage at the Democratic convention in Chicago that August” (p. 76).  That riot, as scripted by Hayden and the SDS, shattered the calm of the city.  Consequently, Hubert Humphrey lost the presidential election, and the Democratic Party’s defeat “paved the way for a takeover of its apparatus by forces of the political left–a trauma from which the party has yet to recover” (p. 77).

            By 1974, new-style Democrats–Ron Dellums, Pat Schroeder, David Bonior, Bella Abzug–asserted themselves.  Having helped shape the anti-war movement, Horowitz knows that the slogan “Bring the Troops Home” was merely a cover for the real goal:  facilitating the victory of North Vietnam.  “Let me make this perfectly clear:  Those of us who inspired and then led the anti-war movement did not want merely to stop the killing, as so many veterans of the domestic battles now claim.  We wanted the communists to win” (p. 111).  Mounting evidence indicates that the war was not lost on the battlefields, where America could have prevailed and saved millions from Communism.  America lost the war because it lost the will to persevere.  Democrats thwarted Nixon’s efforts to establish a negotiated peace in Southeast Asia.  They cut off funds for South Vietnam and Cambodia and “precipitated the bloodbath that followed” (p. 78).  “The mass slaughter in Cambodia and South Vietnam from 1976 to 1978 was the real achievement of the New Left and could not have been accomplished without Hayden’s sabotage of the Humphrey presidential campaign and the anti-communist Democrats” (p. 78). 

            I’ve touched upon only a few of the themes Horowitz addresses in this book.  He has always been fiercely partisan, but, he says:  “I make no apologies for my present position.  My values have not changed, but my sense of what supports them and makes them possible has.  It was what I thought was the humanity of the Marxist idea that made me what I was; it is the inhumanity of what I have seen to be the Marxist reality that has made me what I am” (back cover). 

* * * * * * * * * * * * * * * *

            Horowitz and his long-term colleague Peter Collier have edited The Anti-Chomsky Reader (San Francisco:  Encounter Books, c. 2004) in an effort to expose the duplicity and destructiveness of one of the most influential members of the radical Left in America.  An MIT professor of linguistics, Chomsky early established his reputation as an academic.  Then, during the Vietnam War, he took center stage by writing impassioned indictments of America’s foreign policies.  “According to the Chicago Tribune, Chomsky is ‘the most cited living author’ and ranks just below Plato and Sigmund Freud among the most cited authors of all time” (p. vii).  In some circles he’s revered as one of the 20th century’s greatest thinkers, and he certainly has attracted many thousands of devotees.  The “documentaries” of Michael Moore, the recent political machinations of billionaire George Soros, and web sites such as MoveOn.org all reflect Chomsky’s views.  Horowitz and Collier contend, however, that Chomsky’s influence actually results from providing “an authentic voice to the hatred of America that has been an enduring fact of our national scene since the mid-1960s” (p. viii).  More perniciously, two linguists who have studied his academic work (Robert Levine of Ohio State University and Paul Postal of New York University) accuse him of “a deep disregard of, and contempt for, the truth; a monumental disdain for standards of inquiry; a relentless strain of self-promotion; notable descents into incoherence; and a penchant for verbally abusing those who disagree with him” (p. ix).

            In the lead article of the collection, Stephen J. Morris, a Johns Hopkins University professor, accuses Chomsky of “Whitewashing Dictatorship in Communist Vietnam and Cambodia.”  Slighting scholarly literature, ignorant of the complex history of the region, Chomsky wrote about Vietnam on the basis of left-wing journalistic accounts and his own one-week visit to the country in 1970.  Applauding America’s withdrawal in 1974, he defended Communist rule in Vietnam in The Political Economy of Human Rights.  The book was, Morris says, “an attempt to reconstruct the anti-Western ideology of the New Left; it also is the most extensive rewriting of a period of contemporary history ever produced in a nontotalitarian society” (pp. 8-9).  Beyond ignoring the millions slaughtered as the Communists extended their control from Vietnam to Cambodia, Chomsky actually defended Pol Pot’s vicious regime by attempting to deny the genocide that transpired under its rule.

            Thomas Nichols examines related issues in “Chomsky and the Cold War” and demonstrates the anti-American bias in his works, wherein he routinely delights to assert the moral equivalency of the U.S. and the U.S.S.R.  During the ’80s, he wrote with “almost pathological hostility” regarding President Reagan, dismissing him as an ignorant figurehead of a flawed Administration.  When Czechoslovakia’s courageous Vaclav Havel (an informed and articulate critic of socialism) spoke to the U.S. Congress in 1990, Chomsky called it an “embarrassingly silly and morally repugnant Sunday School sermon” (p. 60).  Anti-Communist cold warriors were always wrong!

            In “Chomsky’s War Against Israel,” Paul Bogdanor documents the professor’s contempt for documentary evidence, making him an “intellectual crook” according to Arthur M. Schlesinger (p. 98).  Chomsky defended Yasser Arafat and the PLO and overlooked the genocidal rhetoric and terrorist attacks of Muslims.  Conversely, he routinely denounces Israel.  He ignored “the Saudi reaction to the capture of Adolf Eichmann, ‘who had the honor of killing five million Jews,’ or the Jordanian announcement that by perpetrating the Holocaust, Eichmann had conferred ‘a real blessing on humanity,’ and that the best response to his trial would be ‘the liquidation of the remaining six million’ to avenge his memory” (p. 91).  Even more disturbing, in “Chomsky and Holocaust Denial,” Werner Cohn (a sociology professor at the University of British Columbia who has published a book-length study on the topic) charts links between Chomsky and a small coterie of European writers, including Israel Shahak, “the world’s most conspicuous Jewish anti-Semite” (p. 119), and Robert Faurisson, a neo-Nazi French writer. 

            These essays document what Horowitz labels Chomsky’s “Anti-American Obsession.” His life-long commitment to socialism has led him to support Marxist movements around the world, ignoring their failures while praising their objectives.  Even when forced to criticize certain glaring socialist catastrophes and brutalities, he laments them as forgivable failures to develop the socialist utopia of Chomsky’s dreams.  And that commitment explains the incessant Anti-Americanism which is perhaps the most distinguishing dimension of his writings.  

155 Aussie Academics: Biology; Philosophy; History

                Though Australia stands, in many ways, on the periphery of Western Civilization, some of her scholars deserve careful reading.  In part this is because as “outsiders” they often bring a refreshing perspective to their respective disciplines.  Indeed, Michael J. Denton’s Evolution:  A Theory in Crisis (c. 1984) helped launch the challenging “Intelligent Design” movement in biology.  More recently Denton has published a sequel, titled Nature’s Destiny: How the Laws of Biology Reveal Purpose in the Universe (New York:  The Free Press, c. 1998).  Cogently outlining his thesis, he says he aims “first, to present the scientific evidence for believing that the cosmos is uniquely fit for life as it exists on earth and for organisms of design and biology very similar to our own species, Homo sapiens, and second, to argue that this ‘unique fitness’ of the laws of nature for life is entirely consistent with the older teleological religious concept of the cosmos as a specially designed whole, with life and mankind as its primary goal and purpose” (p. xi).  The cosmos appears as if it were precisely designed to enable intelligent beings to flourish on a very special place, planet earth.  The more we learn about our world, the more it reveals a “deeper order” that orchestrates all that is. 

            Though teleology–the Aristotelian notion that there is purpose and design to the world–has been discarded by many modern thinkers, Denton insists it makes sense.  Indeed, as Fred Hoyle (no friend of theism) acknowledged, “‘a commonsense interpretation of the facts suggests that a super intellect has monkeyed with physics, as well as chemistry and biology, and that there are no blind forces worth speaking about in nature'” (p. 12).  This is evident in elementary matters, as Denton makes clear in a lengthy, fascinating discussion of the marvelous properties of water.  “What is so very remarkable about the various physical properties of water . . . is not that each is so fit in itself, but the astonishing way in which, in many instances, several independent properties are adapted to serve cooperatively the same biological end” (p. 40).  Without water there would be no life–and its unique composition serves to facilitate life.  Joining water, light is likewise essential for life on earth.  The sun’s radiation, screened by intricately coordinated atmospheric gases, stimulates and sustains living creatures.  That the right amount of the right kind of light reaches earth is a “staggering” coincidence.  That water is transparent to this light is equally amazing.  Indeed, we should “be awed and staggered” by such “coincidences,” defying mathematical probabilities, that are absolutely necessary for the world to be as it is. 

Denton then peruses the presence of radioactive substances, the movement of tectonic plates, the marvelous rightness of the atmosphere and atmospheric pressure, the role of carbon and iron in the processes of life, the positive influence of planets such as Jupiter on the earth, the unique properties of oxygen and carbon dioxide in sustaining life, the mysterious power of photosynthesis, the incredible information contained in DNA, the sophisticated functioning of proteins within the cell, the life-sustaining efficiency of hemoglobin in the blood, the marvel of the cell’s membrane, and the brain’s computing power.  “The emerging picture is obviously consistent with the teleological view of nature.  That each constituent utilized by the cell for a particular biological role, each cog in the watch, turns out to be the only and at the same time the ideal candidate for its role is particularly suggestive of design.  . . . .  The prefabrication of parts to a unique end is the very hallmark of design.  Moreover, there is simply no way that such prefabrication could be the result of natural selection” (p. 233).  So too for man.  Denton finds the cosmos perfectly designed for our flourishing.  He notes our unique capacity to see and speak, our hand’s marvelous dexterity, our fire-making and using capacities, our suitability for our place in the cosmos, and our propensity for mathematics and abstract thought.  All things considered, the “chain of coincidences underlying our existence . . . is simply too long and the appearance of contrivance too striking” (p. 261) to be attributed to naturalistic chance.  

                Having devoted the first part of the book to “life,” he turns to the question of “evolution.”  That life should appear on planet earth is, quite simply, miraculous.  Denton emphasizes that few living creatures existed before the Cambrian Explosion–a 5 million year sliver of time, 600 million years ago–which witnessed the “great and never-to-be-repeated burst of creative growth” responsible for all the main branches of the tree of life.  Thenceforth occurred an apparently “inevitable unfolding of a preordained pattern, written into the laws of nature from the beginning” (p. 282).  Indeed, given the “immensely complex” composition of living organisms, “it is hard to understand how undirected evolution via a series of independent changes could ever produce a radical redesign in any sort of system as complex as a living organism.”  “In effect,” Denton says, “modern biology has revealed us a watch, a watch with a trillion cogs!–a watch which wonderfully fulfills William Paley’s prophetic claim in this famous section from his Natural Theology:  or, Evidences of the Existence and Attributes of the Deity, Collected from the Appearances of Nature, published in 1802, that ‘every indication of contrivance, every manifestation of design, which existed in the watch, exists in the works of nature; with the difference, on the side of nature, of being greater and more, and that in a degree which exceeds all computation’” (p. 350).  Soon, he hopes, science will increasingly side with natural theology and defend the “anthropocentric faith” Isaac Newton envisioned two centuries ago.

                Denton’s two treatises provide persuasive building blocks for the Intelligent Design movement.  Though personally agnostic, he remains open to and respectful of religious perspectives.  And he certainly thinks nature’s design reveals its underlying intelligence.

* * * * * * * * * * * * * * * * * * * *

The late philosopher David Stove had an amazing ability to cogently refute shoddy logic, especially in the philosophy of science.  Though Stove remains largely unknown outside Australia, a sampler of his work has been edited by Roger Kimball and titled Against the Idols of the Age (New Brunswick, NJ:  Transaction Publishers, c. 1999).  In the book’s preface, Kimball commends Stove as a healthy antidote to the intellectual cowardice and pernicious illogic that pervade far too many fashionable theories. 

                In the book’s first section, “The Cult of Irrationalism in Science,” he deals with “Cole Porter and Karl Popper:  The Jazz Age in the Philosophy of Science.”  The mood of jazz–”anything goes,” as Cole Porter crooned–provides “the key to Popper’s philosophy of science” (p. 5).  In the philosophy of science, Popper spawned thinkers such as Thomas Kuhn (of The Structure of Scientific Revolutions fame) who amplified and made respectable Nietzsche’s “transvaluation of all values.”  Rather than discovering truth, they alleged, science merely advances theories and probabilities, all of which decay like fallen leaves.  The best example of such Jazz Age nihilism is P.K. Feyerabend, a University of California professor who called himself “a ‘Dadaist’ and his philosophy ‘epistemological anarchism.’  He maintains that science knows, and should know, no rules of method, no logic” (p. 14).  Thus witchcraft and astrology and even the sorcery of Carlos Castaneda’s Don Juan fictions, as well as Newton and Pasteur, have “scientific” standing. 

                Confronting such blatant irrationalism, Stove simply asks if Popper, Kuhn, Feyerabend, et al. make sense.  He examines their words.  They assert, for example, that “unfalsifiable” and “irrefutable” statements are the same.  This is, however, patently untrue.  An “unfalsifiable” statement means “consistent with every observation statement,” whereas an “irrefutable statement” means “known for certain” (p. 21).  “They are no more related in meaning than, say, ‘weighty’ in ‘weighty thinker,’ and ‘overweight.’  Someone who identified weighty thinkers with overweight thinkers, and took himself to have gained a new insight into the nature of thinkers, would be guilty of a stupid enough pun.  Someone who identifies irrefutable propositions with unfalsifiable ones, and takes himself to have gained a new insight into the nature of scientific propositions is guilty of no better” (p. 210).     

                In “Idols Contemporary and Perennial,” Stove deals with some social and political issues.  He attacks Harvard University Professor Robert Nozick in “‘Always apologize, always explain’:  Robert Nozick’s War Wounds.”  Following WWII, the only nation able to resist Communism’s march toward world domination was the United States, and Nozick illustrates how during the Vietnam War “America’s capacity for such resistance remained intact, [but] her willingness did not.  For that war was lost, not through defeat of American armies in the field, nor yet through treachery among them, but through a massive sedition at home” (p. 93).  Consequently, Nozick’s Philosophical Explanations reveals “the gruesome and disabling wounds which were inflicted on American life, and on American intellectual life in particular, by the defeat in Vietnam” (p. 94).  Yet Nozick, strangely, wore such wounds as badges of honor.  He celebrated the defeat of America as a “moral” advance.  The U.S. learned, he claimed, the signal virtue of “non-coerciveness.”  Just be nice to everyone and tolerate everything!  Impose nothing, especially anything as controversial as universal truth, on anyone. 

                Philosophizing in a non-coercive manner, Nozick foregoes rigorous proofs and indulges in pleasing “explanations,” an exercise that “is as insubstantial intellectually as it is over-charged emotionally:  in fact it is like nothing so much as a paper kite driven by a fifty horsepower motor” (p. 99).  Nozick’s views, Stove shows, are rooted in the notions of Kant, who declared that our experience “‘constitutes’ nature.  If this is not madness, and more specifically the self-importance of the human species run mad, it will do so until the real thing comes along” (p. 103).  While declaring his intent to address tough questions in Philosophical Explanations, Nozick instead indulges in verbal gymnastics that enable him–and philosophers like him–to “sound nicer:  that is, ‘gentler, softer, more considerate of others, respecters of their rights, and so forth'” (p. 107). 

                To Stove, such spinelessness cannot be considered philosophy.  Philosophers should seek and demand truth, a fundamentally “coercive” notion, “since what is true is independent of what anyone wants or believes” (p. 111).  Still more:  “No ideal could be more destructive of human life than the ideal of non-coerciveness” (p. 111).  Were parents not coercive, their offspring would never survive.  Were teachers not coercive, students would never learn.  Nozick’s endeavor, ultimately, reduces to an “autism” akin to America’s withdrawal from Vietnam.  “Autism is your only non-stop guaranteed-non-coercive fun.  At least, it is, if ‘fun’ is the right word” (p. 112).  Whatever it is, Stove says, it ought not be taken seriously!

                Finally, in the third section of the book, Stove tackles “Darwinian Fairytales.”  Whatever one may think of the empirical evidence, however one may respect the “authorities” endorsing it, Darwinism’s illogic, he says, deserves pillorying.  According to Darwin, constant competition ruthlessly weeds out the unfit and facilitates the evolution of species.  However, Stove insists, “the facts of human life” manifestly disprove this thesis, leaving us with “Darwinism’s Dilemma.”  For we do, in fact, cultivate religious values, help each other, nurture each other, build hospitals to care for the sick, and even give our lives for others.  Some Darwinians say the “Cave Man” (but not us moderns) lived according to the survival-of-the-fittest code; others say the ruthless “Hard Man” still reigns, directing our species’ development; and still others, taking the “Soft Man” approach, cheerfully contradict themselves, declaring Darwin was right about natural selection while defending the need for welfare programs, foreign aid, etc.  What Stove insists is that the species we know best–our own–amply illustrates the very antithesis of Darwin’s fundamental thesis.

                Darwin erred, egregiously, by embracing the demographic theory of Thomas Malthus as the key to understanding evolution, whereas many “organic populations” never “obey this principle” (p. 240).  Microorganisms, parasites, and insects may seem to illustrate the idea that a species proliferates geometrically until food supplies are exhausted.  But more advanced creatures defy Malthus’ view.  Both domestic pets and “huge African wild” animals frequently “fail to increase in numbers, or even decline, in the presence of abundant food” (p. 241).  More importantly, the very species Malthus studied–man–easily demonstrates the “grotesque falsity” of his thesis.  Darwin’s modern defenders similarly fail Stove’s logic tests.  In “Genetic Calvinism, or Demons and Dawkins,” he ridicules Oxford University Professor Richard Dawkins’ portrait of  “selfish genes” guiding biological evolution in strictly deterministic fashion, much like the predestinarian God of Calvinist theology.  To call genes selfish, Stove insists, counters common sense.  To say a gene is “selfish” is akin to saying a virus is “studious, or shy.  You could just as intelligibly describe an electron as being slatternly, a triangle as being scholarly, or a number as being sex mad” (p. 255).  Yet Dawkins attained international eminence by propounding such nonsense. 

                Neo-Darwinists like Dawkins struggle (as did Darwin himself) to explain many things, especially the “altruism” that pervades creation, since it ought not exist in a dog-eat-dog Darwinian world.    Some, like E. O. Wilson, promote “sociobiology” to suggest that altruism is nothing but selfishness dictating preferential treatment for next-of-kin.  One would expect, then, Stove insists, that in bacteria, which reproduce by fission and “have 100 percent of their genes in common,” altruism would prevail. In fact, there is utterly “no kin altruism” evident (p. 295).  Few animals even recognize their first cousins, much less treat them favorably.  But human beings adopt orphans, send money to unknown starving people, and illustrate the utter folly of such “shared genes” musings.

                With sarcasm and relentless logic, Stove makes his case.  He admits that Darwinism may be the most persuasive theory afloat, but it simply cannot be true.   He’s no theist.  Indeed he’s rigorously skeptical about most everything.  But he’s a truth seeker and truth teller, and one profits from the clarity of his critiques.

* * * * * * * * * * * * * *

                In The Killing of History:  How Literary Critics and Social Theorists Are Murdering Our Past (San Francisco:  Encounter Books, c. 1996), Keith Windschuttle documented the demise of traditional history in many realms.  For 2400 years, beginning with Thucydides, historians have sought to discern and narrate what actually happened in the past.  Mistakes might be made, interpretations might vary, but they sincerely believed there is “truth” to tell.  That ancient endeavor has lately been discounted by thinkers swayed by Nietzsche’s equation of history and myth.  Nietzsche “wanted to replace the whole of Western philosophy with a position that held there are no facts, only interpretations, and no objective truths, only the perspectives of various individuals and groups” (p. 24).  His disciples, such as Martin Heidegger and Jacques Derrida, insisted that all “history” is merely a momentary perspective, historically shaped, and valuable insofar as it empowers whoever constructs it.  Consequently, today’s young people are “taught to scorn the traditional values of Western culture–equality, freedom, democracy, human rights–as hollow rhetoric used to mask the self-interest of the wealthy and powerful” (p. 5).  “Cultural studies,” focused on popular culture (especially movies and TV), have replaced the disciplined investigation of documents and discovery of facts.  Massaging texts and words, rather than portraying persons and locating events, enamors historians.  And everyone’s free to creatively construct his own past. 

This approach to writing history is assailed in Windschuttle’s chapter entitled “Semiotics and the Conquest of America.”  A torrent of books re-evaluating Columbus’s landfall appeared in 1992.  In David Stannard’s American Holocaust, for example, we’re told that “‘The road to Auschwitz led straight through the heart of the Americas'” (p. 39).  Tzvetan Todorov, in The Conquest of America, declared there’s no difference between Christians ingesting the sacramental bread and wine and Aztecs cutting out the hearts of their sacrificial victims.  Hernando Cortes and the conquistadores are routinely demonized.  But Indians, who routinely offered human sacrifices to their gods and indulged in cannibalism, are always treated sympathetically.  They lived according to their cultures’ code.  To historians like Stannard and Todorov, cultural relativism is an article of faith until one deals with Cortes or Christians, who are wrong all the time in all places! 

In fact, Cortes conquered Mexico because the Aztecs’ Indian foes assisted him.  And his much-lamented Spanish brutalities were, primarily, the result of following his Indian allies’ approach to war, for the Tlascalans insisted:  “‘In fighting the Mexicans . . . we should kill all we could, leaving no one alive:  neither the young, lest they should bear arms again, nor the old, lest they give counsel’” (p. 58).  The religion and culture of the Aztecs, Windschuttle insists, “made it necessary for Cortes to destroy Tenochtitlan and kill most of its inhabitants.  The Mexica had no concept of surrender and the transfer of power to the victor.  During the final stages of the siege, Cortes made several attempts to negotiate with the remaining Mexican lords but was rebuffed.  They refused any terms save a swift death.  Even with all their warriors either dead or unarmed and the people starving, they responded to further mass killings from cannon and handgun not by surrendering but by pressing on to destruction.  Exasperated, Cortes decided to raze the city, and unleashed his native allies who massacred the remnants of the defenseless men, women and children” (p. 54). 

Turning to the history of Hawaii and Australia, Windschuttle shows how postmodern historians are re-imagining and rewriting the past with little concern for empirical data.  Underlying this approach is Michel Foucault, the anti-humanist, anti-history historian whose theories largely shape “the directions history is now taking” (p. 131).  He’s especially noted for his rejection of the “humanism of the modern era” (p. 134).  To Foucault, there is no “autonomous” human person, no subjective self.  Indeed, neither consciousness nor free will nor external reality is real.  Our words and the way we interpret them are all there is. 

Foucault’s radical relativism, of course, subverts itself.  He claimed that all cultural groups have their own “truths,” none of which is objective or universal.  Histories are mere “fictions.”  Yet he assumed, of course, that his thesis–all cultures normalize only small “truths”–is True for all cultures at all times everywhere!  Toward the end of his life, he began backpedaling, suggesting that perhaps one must be a “subject” of some sort, capable of real moral acts.  “He defines the basic practice of ethics as self-mastery that is derived from ‘the thoughtful practice of freedom.’  Unfortunately, neither he nor his supporters like to admit that he has thereby jettisoned key passages of his earlier work.  But rather than admit he was mistaken or wrong, they dealt out equivocations such as ‘shifts of emphasis’, ‘discontinuities’ and a similar range of euphemisms” (pp. 148-149). 

With the collapse of the USSR one would have expected a related retreat of Marxism.  Such has not, however, occurred.  Tactics have simply shifted.  Rather than hoping to overthrow capitalistic regimes through revolution, Marxists now work to reform and ultimately transform them through “creeping socialism.”  Adopting the approach of Antonio Gramsci, they have turned to “leftist participation in the upper reaches of government, education, the law and the media, as well as lobby groups concerned with environmental, feminist, homosexual, ethnic and welfare issues” (p. 185).  Infiltrating the historical profession, they have replaced the empirically-based narrative with a “grand theorist” method that explains the past in terms of class struggle and historical dialectic.

Windschuttle finally examines efforts to kill history by reducing it to a social science or re-casting it as imaginative literature–as did Hayden White, who declared, in his influential Metahistory:  “‘The aged Kant was right, in short:  we are free to conceive “history” as we please, just as we are free to make of it what we will’” (p. 258).  And that, precisely, is our problem!  Kant’s heirs are killing history!

154 A Century of Holiness Theology

                Everyone interested in the “cardinal doctrine” of The Church of the Nazarene must read Mark R. Quanstrom’s A Century of Holiness Theology:  The Doctrine of Entire Sanctification in the Church of the Nazarene, 1905 to 2004 (Kansas City:  Beacon Hill Press of Kansas City, c. 2004).  Written as a Ph.D. dissertation, it meticulously documents the denomination’s story without special pleading for the author’s personal position, which is not particularly evident.  Written by an active pastor (22 years in Belleville, IL), it presents the evidence in a readable manner that also bears witness to the author’s concern for and commitment to his church’s distinctive doctrine. 

                Quanstrom begins his study by linking the Church of the Nazarene’s emergence with the “American Ideal” at the beginning of the 20th century.  It was an optimistic era when progressive–and often utopian–aspirations filled the air.  Progressive politicians like Woodrow Wilson envisioned and worked for a perfect world.  Utopian novels, especially Edward Bellamy’s Looking Backward, touted a world-to-come without scarcity or injustice, wherein social and technological developments guaranteed comfort for all.  Christian socialists and advocates of the “Social Gospel,” like Washington Gladden and Walter Rauschenbusch, called for the immediate establishment of the Kingdom of God, free of war and economic inequities.  The first generation of Nazarenes, while rejecting the theological liberalism of Social Gospel devotees, entertained many of the same utopian aspirations.  They thought the world could be transformed, in socio-political as well as personal ways, through the proclamation of Christian holiness.

                Then came World War I.  Sobriety set in.  Utopian visions dissipated amidst the mustard gas of trench warfare.  So, as early as 1919, some doctrinal retrenchments began in Nazarene circles.  It was decided that entire sanctification should be promoted as a personal transformation, a deliverance from the power of sin, rather than a means whereby the Kingdom of God would come into being.  Following decades witnessed significant reductions in the claims made for the work of entire sanctification, even within an individual’s heart.  Consequently, within a century of the church’s founding, “entire sanctification would not be taught so much as an instantaneous change in the heart of the believer appropriated by consecration and faith, but rather more as an unremarkable event in the progress of growth, if taught at all.  The ‘eradication of the sinful nature’ would be terminology that many Nazarenes would eschew, even though the words would remain in the Articles of Faith” (p. 23).  Consequently, the “early Nazarenes’ hopes for the propagation and preservation of the doctrine as they understood it have not been realized” (p. 24).

                Filling in this story, Quanstrom first explores the Church of the Nazarene’s 19th century doctrinal roots.  The holiness theology of Phoebe Palmer (who emphasized consecration and faith), holiness camp meetings, and eminent Methodist preachers (such as John Allen Wood and Daniel Steele) established the position on “Christian perfection” the church adopted.  A.M. Hills, one of the most influential early Nazarene theologians, summed it up in Holiness and Power, a book which was part of the Ministerial Course of Study from 1911 to 1964.  Hills especially emphasized the “instantaneous” nature of the “second work of grace” whereby a believer was entirely sanctified.  Seekers were urged to “believe and receive” the experience, to accept by faith the effectual working of God in their heart if they only surrendered their all to Him. 

Alongside Hills’ book, Possibilities of Grace, a treatise by a Methodist writer, Asbury Lowrey, was included in the course of study in 1911, where it remained until 1956.  “Lowrey,” Quanstrom says, “was as optimistic as any in the holiness movement concerning what would happen if the church received this blessing.  He believed that holiness was the reason for every great reformation in the history of the church” (p. 45).  Following the lead of Phoebe Palmer, Lowrey urged believers to consecrate themselves completely to God, to “place their all upon the altar,” and take God at His Word by believing that the “altar sanctifies the gift.”  Truly sanctified, they need not even consider themselves actually in need of confessing “forgive us our trespasses” when reciting the Lord’s Prayer. 

                As the young denomination took form, following World War I, clarifying (for example) the difference between Pentecostals and Nazarenes, a “fundamentalist leavening” began.  Whereas mainline denominations, such as the Presbyterians and Methodists, had divided into modernist and fundamentalist factions, the General Superintendents declared, at the 1928 General Assembly, that “there will be no discussion of modernism or fundamentalism.  We are all fundamentalists, we believe the Bible, we all believe in Christ, that He is truly the Son of God” (p. 56).  They declared that the scripture is “infallible,” for “the Bible is the Word of God.  We believe it from Genesis to Revelation.”  Still more:  “The church must stand first, last and all the time for the whole Bible, the inspired, infallible, revealed Word of God” (p. 56).  Subsequently, delegates to the General Assembly expanded the denomination’s article of belief to read:  “We believe in the plenary inspiration of the Holy Scriptures by which we understand the sixty-six books of the Old and New Testaments, given by divine inspiration, inerrantly revealing the will of God concerning us in all things necessary to salvation; so that whatever is not contained therein is not to be enjoined as an article of faith” (p. 57, italics Quanstrom’s). 

                During the 1930s, H. Orton Wiley, the architect of the above statement regarding scripture, began work on an official theology for the church, pursuant to a request by J. B. Chapman.  A.M. Hills, at times a colleague of Wiley’s at Pasadena College and clearly an “elder statesman” of the movement, was disappointed that he wasn’t assigned the task.  But he published, with an independent publisher, his Fundamental Christian Theology, which was added to the Ministerial Course of Study and widely utilized during that decade.  Hills espoused post-millennialism, confident that Christ will return when His disciples have perfected His commission.  He also espoused a lofty view of human freedom, rather typical of the pre-WWI progressives, that was ratified by the 1928 General Assembly’s declared confidence in man’s “godlike ability of freedom” (p. 72).  So his explanation of entire sanctification was naturally optimistic.  If we simply will to will God’s will we will be holy. 

                H. Orton Wiley’s magisterial, three-volume Systematic Theology appeared during the early ’40s and was instantly recognized as the official theology of the Church of the Nazarene.  Thorough scholarship rooted his work in the classical, orthodox mainstream of the Christian Faith, but he primarily engaged 19th century theologians such as Miley who had worked within the Wesleyan tradition.  Though Wiley never openly differed with A.M. Hills, he subtly and firmly shifted the focus of salvation from man’s “free will” to God’s “free grace,” giving preeminence to God alone in all aspects of salvation.  Emphasizing “prevenient grace,” he embraced the Protestant notion of moral depravity without denying the reality of free will, thereby distancing himself from the rather radical “moral freedom” espoused by Hills. 

                Wiley fully endorsed the article on entire sanctification as set forth (in 1928) in the Articles of Faith for the Church of the Nazarene:  “We believe that entire sanctification is that act of God, subsequent to regeneration, by which believers are made free from original sin, or depravity, and brought into a state of entire devotement to God, and the holy obedience of love made perfect.  It is wrought by the baptism with the Holy Spirit, and comprehends in one experience the cleansing of the heart from sin and the abiding, indwelling presence of the Holy Spirit, empowering the believer for life and service.  Entire sanctification is provided by the blood of Jesus, is wrought instantaneously by faith, preceded by entire consecration; and to this work and state of grace the Holy Spirit bears witness.  This experience is also known by various terms representing its different phases, such as ‘Christian Perfection,’ ‘Perfect Love,’ ‘Heart Purity,’ ‘The Baptism with the Holy Spirit,’ ‘The Fullness of the Blessing,’ and ‘Christian holiness.'”

                Wiley especially emphasized the instantaneous nature of the second work of grace.  He stressed the aorist tense of New Testament words describing God’s sanctifying work in the heart, embracing fundamentalism’s commitment to the words of Scripture, and he tried to make sure that Nazarenes would tenaciously proclaim it.  Methodist theologians in the 19th century had failed to defend this position and gradually emphasized “‘growth and development, rather than upon the crises which marked the different stages in personal experience'” (p. 81).  But the great promise to believers is “‘that God has promised a cleansing from all sin through the blood of Jesus.  He lays hold of the promises of God, and in a moment, the Holy Spirit purifies his heart by faith.  In that instant he lives the full life of love.  In him love is made perfect . . .  The law of God is written upon his heart'” (p. 84). 

                Definitively defended by Wiley, the doctrine of entire sanctification was promoted by Nazarene preachers and teachers in the post-WWII era, albeit with less triumphant optimism than earlier.  Tending to share the somber realism of Reinhold Niebuhr, Americans in general rather discounted the perfectibility of man.   This shift was evident in some of the books added to the ministerial course of study, such as A Right Conception of Sin, by Richard Taylor; The Terminology of Holiness, by J.B. Chapman; and Conflicting Concepts of Holiness, by W.T. Purkiser.  “These were works,” says Quanstrom, “which were clinically precise in their definition concerning exactly what it was that was eradicated by the grace of entire sanctification and they were not making the claims that earlier holiness works had” (p. 98).  The doctrine was still clearly declared, though its scope had narrowed.  Rejecting Lowrey’s perfectionism, Purkiser, for example, insisted that sanctified believers certainly should pray that God “forgive us our trespasses,” noting that we pray for others as well as ourselves and (even if cleansed from sin) remain aware of our past sins.

                Concerned that Nazarene preachers have a seminary supporting the church’s doctrines, church leaders opened Nazarene Theological Seminary in 1945.  The first professor of theology was Stephen S. White, who upheld the doctrine of entire sanctification, stressing that it is a distinctively second, instantaneous work of grace, freeing the believer from inbred sin as a result of the baptism of the Holy Spirit.  He especially emphasized (and was supported therein by NTS’s first president and future general superintendent, Hugh Benner) the importance of sin’s “eradication.”  But others, such as Richard Taylor, qualified such claims, distinguishing between sin and infirmity, exempting from cleansing various flaws of personality and temperament. 

Asbury Lowrey’s Possibilities of Grace disappeared from the ministerial course of study in 1960, and The Spirit of Holiness, by Everett Lewis Cattell, the president of Malone College, was added in 1964.  Cattell urged holiness theologians to get back to Wesley, whose A Plain Account of Christian Perfection had been added, for the first time, in 1954.  Reading Wesley revealed his rather “gradual growth” understanding of holiness, as well as his wariness of claiming it as a personal experience.  What Nazarenes like Richard Taylor labeled “infirmities” John Wesley had called “sins.”  Consequently, the 1976 General Assembly added two new paragraphs to the Articles of Faith, indicating the importance of gradual growth as well as crisis experience.  The 1985 General Assembly featured a vigorous discussion concerning the denomination’s article of faith on Original Sin, responding to a commission’s proposals that it be significantly revised to remove the possibility of its “eradication.”  The delegates ultimately decided to add paragraphs in accord with the commission’s proposal, resulting in a softening of the possibility of sinless sanctity.   

This move reflected the growing influence of Mildred Bangs Wynkoop, whose 1973 treatise, A Theology of Love, sparked one of the most significant shifts in the denomination’s history.  Paul Orjala labeled it “one of the most important books ever published by Beacon Hill Press of Kansas City . . . for here is the first modern theology of holiness” (p. 141).  Though it was not added to the course of study until 1986, it exerted influence throughout the church well before then.  Wynkoop emphasized the “credibility gap” between what had been taught and what was actually lived by most Nazarenes, and she proposed a thorough “restructuring of the conceptual framework within which holiness theologians had worked” (p. 143).  Rather openly rejecting the holiness theology of earlier thinkers, she insisted on a “Wesleyan hermeneutic” with a new definition of human nature and sin.  A person is not, she suggested, by nature sinful, and sin is not a “thing” to be removed.  Rather, when the relationship with God is broken we do sinful things. 

Restoring that relationship, therefore, solves the sin problem.  Nothing particularly changes within one’s soul, but a healthy relationship with God develops.  Holiness is interpersonal love–nothing more, nothing less.  Though she tried to distance herself from Pelagianism, she nevertheless insisted that we’re not locked into sin’s bondage by birth.  Denying the reality of inbred sin, she had no need for a second work of grace.  With nothing to cleanse, there’s no need for the “baptism of the Holy Spirit” apart from His role in regeneration.  In truth, we’re free to love or not to love, and the decision is largely ours, ironically reviving A.M. Hills’ focus on the will.  “Specifically, Wynkoop defined a pure heart as a heart that, as Kierkegaard had written, willed one thing” (p. 146).  In the final analysis, says Quanstrom, she discounted the efficacious power of grace and emphasized “the purity of a person’s consecration which would lead to unhindered communion with God” (p. 146).   Thus, however much she may have endeavored to provide a better way to understand entire sanctification, “her definitions tended to undermine the doctrine’s distinctiveness” (p. 150). 

                In 1979 the Board of General Superintendents asked H. Ray Dunning to write a replacement text for Wiley’s Systematic Theology.  It would be titled Grace, Faith, and Holiness:  A Wesleyan Systematic Theology, published in 1988, and added to the course of study in 1990.  Like Wynkoop, Dunning was a Trevecca Nazarene College professor, and he basically shared her position–especially her singular reliance upon Wesley–regarding the doctrine of holiness.  “Dunning’s systematic theology was different,” notes Quanstrom.  “He intended it to be” (p. 160).  Departing from the methodology of Wiley et al., he embraced Wynkoop’s “relational model” of holiness.  One’s very being was shaped by his relationship with God, so being born in a state of sin made no sense, since sin is primarily a severed bond.  A right relationship with God is restored in justification, minimizing any need for a “second” work of grace that would cleanse one from inbred sin. 

                Dunning’s work broadened the doctrinal breach within the Church of the Nazarene, since theologians like Richard Taylor and Donald Metz strongly opposed the “relational” approach of the Trevecca school.  Taylor, upholding the church’s traditional position, “refuted the relational understanding of sin and salvation, calling it heretical” (p. 167).  Though he naturally denied it, Dunning’s views on sin and salvation were pure Pelagianism, Taylor argued.  Consequently, Quanstrom says, within a century of her founding it was questionable “whether or not the Church of the Nazarene had a coherent and cogent doctrine of holiness at all” (p. 169).  To Richard Taylor, the holiness movement was virtually dead because it had ceased proclaiming the doctrine as crafted by Wiley and White.  This had happened because sinners were offended by it and too few believers were willing to die to self in order to live a holy life.  Still more, in Taylor’s judgment, a corps of “liberal” teachers had taken control of the church’s teaching institutions and were undermining her doctrines.  He particularly pointed to Wynkoop’s A Theology of Love as “‘a major contributing cause of the staggering of holiness ranks'” (p. 172).  Aspiring preachers who followed her forfeited “the message of a clear, knowable experience of entire sanctification which cleansed the carnal mind” (p. 172).  Taylor’s longtime colleague at Nazarene Theological Seminary, J. Kenneth Grider, shared his views and was equally critical of Wynkoop.  His own treatise, A Wesleyan-Holiness Theology, espoused a very traditional position, stressing:  “‘This second work of grace is obtained by faith, is subsequent to regeneration, is occasioned by the baptism with the Holy Spirit, and constitutes a cleansing away of Adamic depravity and an empowerment for witnessing and for the holy life'” (p. 173). 

                Hoping to reconcile the factions within the church, the General Superintendents issued a “Core Values” mission statement in 1999, declaring that both crisis and growth, both cleansing and love, are vital parts of living the holy life.  In an insert accompanying the “Core Values” booklet, sent to all pastors in the denomination, a bright future was envisioned:  “‘Many believe that we were raised up, not for the 20th century, but for the 21st century.’”  This was due to the church’s “‘radical optimism of grace.  We believe that human nature, and ultimately society, can be radically and permanently changed by the grace of God.  We have an irrepressible confidence in this message of hope, which flows from the heart of our holy God'” (p. 179). 

                Such optimism, of course, typified the first generation of Nazarenes.  But it is an optimism largely lacking in the post-WWII professors (e.g. Wynkoop and Dunning) who have shaped the denomination’s preachers for the past three decades.  They did so by redefining sin and sanctification to the point that little differentiated a sincere believer from a sanctified saint.  In Quanstrom’s judgment:  “The problem with these re-definitions for the denomination was that they effectively emasculated the promise of entire sanctification, at least as it had been understood at the beginning of the century” (p. 180).  Consequently, as we begin our second century, we have no clear identity as a “holiness denomination.” 

                Quanstrom clearly presents the evidence for the doctrinal shifts within the Church of the Nazarene.  Whether Wiley or Wynkoop was right he does not say.  What seems clear to me, however, is that the position of Mildred Bangs Wynkoop has largely replaced that of H. Orton Wiley.