245 Refuting Relativism

While the recently installed Pope Francis urges empathy with the poor, he also laments the spiritual poverty of those in bondage to what Benedict XVI called “the tyranny of relativism.”  He certainly follows St. Francis of Assisi in urging us to be peacemakers—“But there is no true peace without truth!  There cannot be true peace if everyone is his own criterion, if everyone can always claim exclusively his own rights, without at the same time caring for the good of others, of everyone, on the basis of the nature that unites every human being on this earth.”  His papal predecessor, Benedict XVI, had warned:  “We are building a dictatorship of relativism that does not recognize anything as definitive and whose ultimate goal consists solely of one’s own ego and desires.”  While acknowledging that fanatics too easily assert their confidence in various “truths,” Benedict insisted that we should not cease discerning and proclaiming with certainty self-evident and trustworthy insights and convictions.  “That is why,” he said, “we must have the courage to dare to say:  ‘Yes, man must seek the truth; he is capable of truth.’”

Benedict’s admonitions would not have surprised Allan Bloom, who in 1987 wrote The Closing of the American Mind as “a meditation on the state of our souls, particularly those of the young, and their education” (p. 19).  Youngsters need teachers to serve as midwives—above all helping them deal with “the question, ‘What is man?’ in relation to his highest aspirations as opposed to his low and common needs” (p. 21).  University students, Bloom said, were “pleasant, friendly and, if not great-souled, at least not particularly mean-spirited.  Their primary preoccupation is themselves, understood in the narrowest sense” (p. 83), absorbed in personal feelings and frustrations.  Not “what is man” but “who am I” is their question!  They illustrate “the truth of Tocqueville’s dictum that ‘in democratic societies, each citizen is habitually busy with the contemplation of a very petty object, which is himself’” (p. 86).

This preoccupation with self-discovery and self-esteem, Bloom believed, flowers easily in today’s relativism, a philosophical dogma espoused by virtually everyone coming to or prowling about the university.  Under the flag of “openness” and “tolerance,” no “truths” are acknowledged and everyone freely follows his own feelings.  So even the brightest of our young people know little about history, literature, or theology, for such knowledge resides in books, which remain largely unread, even in the universities.  Minds shaped by films, rock music and television have little depth, and “the failure to read good books both enfeebles the vision and strengthens our most fatal tendency—the belief that the here and now is all there is” (p. 64).  Deepening his analysis in a section titled “Nihilism, American Style,” Bloom traced the philosophical roots of today’s educational malaise preeminently to Nietzsche, Freud, and Heidegger.  An enormous intellectual earthquake has shaken our culture to its foundations.  It is “the most important and most astonishing phenomenon of our time,” the “attempt to get ‘beyond good and evil’” by substituting “value relativism” for Judeo-Christian absolutism (p. 141).

* * * * * * * * * * * * * * * * * * * * * * * * *

What concerned Bloom and the popes at the turning of the millennium was perceptively examined half a century earlier by C.S. Lewis in one of his finest books, The Abolition of Man (New York:  Macmillan, 1947).  First presented during WWII as a series of lectures, the book begins with Lewis’s careful examination of an elementary English textbook he dubbed The Green Book.  While allegedly designed to help students read literature, the text was inadvertently a philosophical justification for relativism, promoting the notion that all values, whether aesthetic or ethical, are subjective and ultimately indefensible.  However, Lewis said:  “Until quite modern times all teachers and even all men believed the universe to be such that certain emotional reactions on our part could be either congruous or incongruous to it—believed, in fact, that objects did not merely receive, but could merit, our approval or disapproval, our reverence or our contempt.  The reason why Coleridge agreed with the tourist who called the cataract sublime and disagreed with the one who called it pretty was of course that he believed inanimate nature to be such that certain responses could be more ‘just’ or ‘ordinate’ or ‘appropriate’ to it than others.  And he believed (correctly) that the tourists thought the same.  The man who called the cataract sublime was not intending simply to describe his own emotions about it:  he was also claiming that the object was one which merited those emotions” (#148 in Kindle).

Coleridge and others who believed in objective Truth (and truths) generally appealed to what Chinese thinkers referred to as “the Tao.  It is the reality beyond all predicates, the abyss that was before the Creator Himself.  It is Nature, it is the Way, the Road.  It is the Way in which the universe goes on, the Way in which things everlastingly emerge, stilly and tranquilly, into space and time.  It is also the Way which every man should tread in imitation of that cosmic and supercosmic progression, conforming all activities to that great exemplar” (#107).  We instantly recognize—through theoretical reason—certain laws of thought (e.g. the law of non-contradiction) or geometry (e.g. a line is the shortest distance between two points); we also know—through practical reason—certain permanent moral maxims (e.g. murder is wrong).  Any effort to reduce universal values to personal feelings inevitably founders in nihilistic confusion.

In truth:  “All the practical principles behind the Innovator’s case for posterity, or society, or the species, are there from time immemorial in the Tao.  But they are nowhere else.  Unless you accept these without question as being to the world of action what axioms are to the world of theory, you can have no practical principles whatever” (#358).  “The human mind has no more power of inventing a new value than of imagining a new primary colour, or, indeed, of creating a new sun and a new sky for it to move in” (#398).  By disregarding the Tao, advocates of any new morality sink into a “void” without a  pattern to follow, a nihilistic abyss promoting “the abolition of Man” (#556).  

* * * * * * * * * * * * * * * * * * * * * * * * 

In The Book of Absolutes:  A Critique of Relativism and a Defense of Universals (Montreal:  McGill-Queen’s University Press, c. 2008), William D. Gairdner updates and amplifies an ancient and perennial proposition.  A distinguished Canadian Olympic athlete with degrees from Stanford University, Gairdner has effectively influenced the resurgence of conservatism in his native land.  Though he acknowledges the present power and pervasiveness of relativism, he finds it “a confused and false conception of reality that produces a great deal of unnecessary anxiety and uncertainty, both for individuals and for society as a whole” (#71).  To rectify this problem he wrote “a book to restore human confidence by presenting the truth about the permanent things of this world and of human existence” (#74).  

The current notion—that truth varies from person to person (or group to group), that all perspectives must be tolerated, that moral judgments must be suspended—has an ancient history which Gairdner explores, noting that earlier generations generally judged it “preposterous.  The ancient Greeks actually coined the word idiotes (one we now apply to crazy people) to describe anyone who insisted on seeing the world in a purely personal and private way” (#88).  There were, of course, Greek Sophists such as Protagoras who declared:  “As each thing appears to me, so it is for me, and as it appears to you, so it is for you.”  But their views withered under the relentless refutations of Socrates, Plato, and Aristotle—all defending objective truth and perennial principles—followed by Medieval philosophers such as Thomas Aquinas and Enlightenment scientists such as Isaac Newton.  

Dissenting from the traditional, absolutist position were thinkers such as Thomas Hobbes, who rejected any rooting of moral principles in a higher law, declaring in Leviathan that we label “good” whatever pleases us.  Indeed, the words good and evil “are ever used with relation to the person that useth them:  there being nothing simply and absolutely so.”  A century later Hobbes’ subjectivism would be enshrined by a thinker markedly different from him in many respects, Immanuel Kant, “the most coolly influential modern philosopher to have pushed us toward all sorts of relativist conclusions” (p. 14).  Building on Kant’s position, Friedrich Nietzsche formulated the relativist slogan:  “there are no facts, only interpretations.”  American pragmatists and cultural anthropologists, European existentialists and deconstructionists took up the catchphrase, and today we live in a postmodern culture deeply shaped by epistemological skepticism and moral relativism.

Sadly, this intellectual shift was bolstered by a profound popular misunderstanding and misrepresentation of one of the monumental scientific discoveries of all time, Einstein’s “Special Theory of Relativity.”  As historian Paul Johnson explains, “‘the belief began to circulate, for the first time at a popular level, that there were no longer any absolutes:   of time and space, of good and evil, of knowledge, and above all of value.  Mistakenly but perhaps inevitably, relativity became confused with relativism.’  And, he adds, ‘no one was more distressed than Einstein by this public misapprehension’” (p. 17).  Nevertheless, it served as a scalpel that helped “‘cut society adrift from its traditional moorings in the faith and morals of Judeo-Christian culture’” (p. 18).   

After explaining various forms of relativism—noting that its moral and cultural forms are most prevalent and pernicious—Gairdner registers some objections to it.  It is, importantly, “self-refuting,” basing its entire case upon the absolute assertion that there are no absolutes.  Thus it cannot withstand Aristotle’s powerful argument, set forth in his Metaphysics, showing how it violates the law of non-contradiction.  That various persons or cultures claim different “truths” hardly affects the fact that a great many beliefs are manifestly wrong, whereas others (e.g. the earth is spherical) are demonstrably right.  The fact that some groups of people (“sick societies”) have condoned human sacrifice or infanticide hardly justifies these practices.  Admittedly, some truths—both scientific (a heliocentric solar system) and moral (slavery is wrong)—become clear only after considerable time or laborious investigation, but that only strengthens their certainty.  Thus:  “Neither believing nor doing makes a thing right or wrong” (p. 39).  

Challenging relativism, Gairdner invokes scholars such as Donald E. Brown, a professor of anthropology, whose 1991 publication, Human Universals, effectively refutes many sophomoric relativistic mantras by listing more than 300 human universals.  For example:  “All humans use language as their principal medium of communication, and all human languages have the same underlying architecture, built of the same basic units and arranged according to implicit rules of grammar and other common features.  All people classify themselves in terms of status, class, roles, and kinship, and all practice division of labour by age and gender.  . . . .  All have poetry, figurative speech, symbolic speech, ceremonial speech, metaphor, and the like.  All use logic, reckon time, distinguish past, present, and future, think in causal terms, recognize the concept and possibility of cheating and lying, and strive to protect themselves from the same” (p. 64).  Everywhere and at all times we humans have acknowledged such universal realities.

There are indubitable, demonstrable constants (laws) throughout the natural world—notably the law of gravity and Einstein’s famous equation, E=mc².  Material things, moving through space, continually change; but laws remain the same.  As David Berlinski puts it, “‘the laws of nature by which nature is explained are not themselves a part of nature.  No physical theory predicts their existence nor explains their power.  They exist beyond space and time, they gain purchase by an act of the imagination and not observation, they are the tantalizing traces in matter of an intelligence that has so far hidden itself in symbols’” (p. 76).  Still more, says Berlinski:  “‘We are acquainted with gravity through its effects; we understand gravity by means of its mathematical form.  Beyond this, we understand nothing’” (p. 78).

Analogously, as human beings we are, by nature, “hardwired” with important “universals.”  The “blank-slate” notion, promulgated by empiricists such as John Locke and B.F. Skinner, cannot withstand the accumulating genetic and cognitive evidence concerning our species.  Beyond using the same nucleic acids and DNA basic to all living organisms, we do a great number of remarkable things:  negotiating contracts; acting altruistically and even sacrificially; establishing elaborate kinship ties; walking erectly; engaging in year-round sexual encounters; recognizing ineradicable male-female differences; manifesting an inexplicably miraculous intelligence, reason, and free will.  Gairdner patiently compiles and evaluates the massive evidence available to show that a multitude of “universals” do in fact define us as human beings.  “Contrary to the claims of relativists everywhere that nothing of an essential or innate human nature exists, we find that there is indeed a basic universal, biologically rooted human nature, in which we all share.  This is so from DNA to the smiling instinct.  It is a human nature that runs broad and deep, and nothing about it is socially constructed or invented” (p. 162).  This is markedly evident in the “number of universals of human language” (p. 217) discovered by scholarly linguists such as Noam Chomsky, who insists “‘there is only one human language,’ which he and his followers later labeled ‘UG,’ or ‘Universal Grammar’” (p. 229).  As an English professor, Gairdner devotes many pages, in several chapters, to an explanation and analysis of recent developments in the study of language, truly one of the defining human characteristics.  Importantly, it can “be seen as a mirror of the internal workings of the mind rather than as a mirror of the external workings of culture or society” (p. 291).

Embedded within this human nature we find a natural law prescribing moral norms.  “The traditional natural law is therefore based on four assumptions:  ‘1.  There are universal and eternally valid criteria and principles on the basis of which ordinary human law can be justified (or criticized).  2.  These principles are grounded both in nature (all beings share certain qualities and circumstances) and in human nature.  3.   Human beings can discover these criteria and principles by the use of right reason.  4.  Human law is morally binding only if it satisfies these criteria and principles’” (p. 164).  It is a hallmark of the philosophia perennis articulated by classical (Plato; Aristotle; Cicero) and Christian (Aquinas; Leibniz; C.S. Lewis) thinkers and noted for its common sense notions regarding God, man, and virtuous behavior.  

Espoused by some of the great architects of Western Civilization—including Aristotle and Cicero,  Aquinas and Calvin, Sir William Blackstone and the American Founders, the Nuremberg judges and Martin Luther King—the natural law tradition has provided the foundation for “the rule of law” so basic to all our rights and liberties.   As defined by Cicero:  “‘true law is right reason in agreement with nature, universal, consistent, everlasting, whose nature is to advocate duty by prescription and to deter wrongdoing by prohibition.’  He further stated that ‘we do not need to look outside ourselves for an expounder or interpreter of this law.’  God, he said, is the author and promulgator and enforcer of this law, and whoever tries to escape it ‘is trying to escape himself and his nature as a human being’” (p. 183).  

So, Gairdner explains:  “The precepts of natural law for rational human creatures are, then, rational directives of logic and morality aimed at the common good for humanity and at avoidance of everything destructive of the good.  This means that human rational fulfillment may be found in such things as preserving the existence of ourselves and others by begetting and protecting children, by avoiding dangers to life, by defending ourselves and our loved ones, by hewing to family and friends, and of course, by hewing to reason itself.  We know many such standards in religion as commandments.  In daily life we know them as natural commands and prohibitions:  love others, do unto them as you would have them do unto you, be fair, do not steal, do not lie, uphold justice, respect property, and so on” (p. 189).  Such precepts, as Aquinas insisted, are intuitively known, per se nota; they are as self-evident as the geometric axioms of Euclid or the North Star’s fixed location in the night sky.  Thus murder and lying and theft and rape are rightly recognized as intrinsically evil.  Honoring one’s parents, respecting the dead, valuing knowledge, and acting courageously are rightly deemed good.  

During the past century, as relativism has flourished, the natural law tradition was widely attacked and abandoned.  Strangely enough, most relativists grant the existence of “an extremely mysterious law of gravity that controls all matter but that is not itself a part of matter, but they will not consent to other types of laws that govern or guide human behaviour, such as a natural moral law” (p. 210).    Apparently, Gairdner says, one of the reasons “for the decline of natural law during the rise of the modern state is that just about every law school, every granting institution, every legal journal, and every public court and tribunal is largely funded by the state.  And no modern state wants to be told by ordinary citizens that any of its laws are not morally binding.  That is why Lord Acton referred to natural law as ‘a revolution in permanence.’  He meant a revolution by those who cherish a traditional society and a morality rooted in human nature against all those who attempt to uproot, reorder, and deny or replace these realities” (p. 182).  

The modern repudiation of absolutes followed developments in 19th and 20th century German philosophy, evident in Hegel, Nietzsche and Heidegger, reaching its apex in Nazi Germany.  Classical and Christian advocates of transcendent metaphysical principles, such as Plato and Aquinas, were discarded by a corps of “existentialists” determined to move “beyond good and evil” and devise a purely secular, humanistic ethos.  French intellectuals, following the lead of Jacques Derrida and Michel Foucault, imported Nietzsche and Heidegger, setting forth the currents of “deconstruction” and “postmodernism” so triumphant in contemporary universities and media centers.  “It was all an echo of Nietzsche’s ultra-relativist claim (later elaborated by Heidegger) that ‘there are no facts, only interpretations’” (p. 252).

Ironically, Derrida himself, toward the end of his life, announced an important “Turn” in his thought.  He acknowledged “the logical as well as spiritual need for a foundation of some kind.”  And out it came, as quite a shock to his adamantly relativist followers:  “‘I believe in justice.  Justice itself, if there is any, is not deconstructible’” (p. 266).  Derrida simply illustrates the perennial power of the natural law—there are some things we just can’t not know!  Derrida’s “Turn” underscores what Gairdner endeavors to do in this book:  “to expose the intellectual weakness of the relativism that pervades modern—especially postmodern—thought and also to offer a basic overview of the absolutes, constants, and universals that constitute the substance of the many fields explored here.  We have seen them at work in culture through human universals, in physics via the constants of nature, in moral and legal thought via the natural law, and in the human body via our hardwired biology.  And not least, of course, in view of its close approximation to human thought itself, we have looked at the constants and universals of human language” (p. 308).  He persuasively demonstrates that “we do not live in a foundationless or relativistic world in which reality and meaning, or what is true and false, are simply made up as we go along and according to personal perceptions.  On the contrary, we live in a world in which every serious field of human thought and activity is permeated by fundamentals of one kind or another, by absolutes, constants, and universals, as the case may be, of nature and of human nature” (p. 308).

# # # 

244 Fewer . . . and Fewer of Us

Among the handful of must-read 20th century dystopian novels—Aldous Huxley’s Brave New World; George Orwell’s 1984; C.S. Lewis’s That Hideous Strength—is P.D. James’s The Children of Men (New York:  Penguin Books, c. 1992), which prods both our imagination and reason by envisioning the potential consequences of demographic trends.  James, a distinguished novelist known mainly for her riveting (and philosophically nuanced) mystery stories, portrays the world in 2021, twenty-six years after the last baby was born, dooming the race to extinction.  She challenged, in a powerful artistic way, one of the prevailing orthodoxies of our day—the threat of overpopulation.  The Children of Men boldly countered the message of Paul Ehrlich’s 1968 best-selling The Population Bomb (one of the most egregiously misguided books published during that pivotal decade), which fueled the mounting fears of ecological catastrophe then gripping the environmental community.  Because earth’s resources are finite, he predicted:  “In the 1970s the world will undergo famines—hundreds of millions of people are going to starve to death.”  Ehrlich was duly lauded by the academic community (after all he was a certified member of the elite, a professor at Stanford University with an enviable reputation as an entomologist) and courted by the complacent media (Johnny Carson gushing over him for his prescience).

One of the few journalists willing to risk censure by differing with Ehrlich was Ben J. Wattenberg, who warned of an actual population implosion rather than an explosion.  Two decades later he wrote The Birth Dearth, examining the “Total Fertility Rate,” which was falling around the globe.  He vainly hoped to stimulate a national conversation on the subject, but few (alas) recognized the reality of the birth dearth.  Returning to his concern in Fewer:  How the New Demography of Depopulation Will Shape Our Future (Chicago:  Ivan R. Dee, c. 2004), he argued that “never have birth and fertility rates fallen so far, so fast, so low, for so long, in so many places, so surprisingly” (p. 5).  “For at least 650 years,” he says, “the total number of people on earth has headed in only one direction:  up.  But soon—probably within a few decades—global population will level off and then likely fall for a protracted period of time” (p. 5).

European, Russian and Japanese populations are virtually in free fall, with a Total Fertility Rate (TFR) significantly less than the requisite replacement level (2.1 per woman).  Europe’s population will likely shrink from 728 million in 2000 to 632 million in 2050.  To replace lost babies, Europe would need to take in nearly two million (rather than the current 376,000) immigrants each year.  Russia has a TFR of 1.14 and will lose 30 percent of its population by mid-century.  Not only are folks having fewer children—they want fewer!  And lest we think this is true only of prosperous, highly industrialized nations, it also applies to Less Developed Countries, where a dramatic reduction in population growth has occurred within the past few decades.  China, for example, had a TFR of 6.06 forty years ago; after instituting a “one child only” policy, by the beginning of the millennium the TFR fell to 1.8!  Similarly, South Korea’s 2005 rate fell to 1.17.  Brazil and Mexico reveal the same depopulating trajectory.  In fact, few nations are repopulating themselves.  America, however, has proved somewhat exceptional, sustaining a replacement level fertility rate—in part through welcoming immigrants who frequently have large families.
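The arithmetic behind such projections is simple and unforgiving.  As a rough illustration (assuming a constant TFR and setting aside the mortality, migration, and age-structure effects that full demographic models include), each generation is smaller than its predecessor by roughly the factor TFR ÷ 2.1.  Taking Russia’s reported rate:

	1.14 ÷ 2.1 ≈ 0.54, and 0.54 × 0.54 ≈ 0.29

At that rate each generation is barely half the size of the one before it, and within two generations the cohort of children shrinks to less than a third of its original size.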

To explain this phenomenon, Wattenberg points first to increased urbanization, where children are something of a liability rather than an asset.  Secondly, as women pursue higher education and careers—and as couples earn more money—they have proportionally fewer children.  “More education, more work, lower fertility” (p. 96).  Thirdly, abortion disposes of 45 million developing children every year.  Fourthly, divorce lowers fertility as single women welcome fewer children than their married counterparts.  Fifthly, contraception (exploding since the ‘60s) empowers couples to enjoy sexual pleasure without undesired consequences.  And sixthly, since couples marry later in life they inevitably have fewer offspring.  The ominous consequences of this depopulation cannot be ignored, because the welfare states established in virtually all modern countries simply cannot support growing numbers of elderly retirees funded by dwindling numbers of younger workers.  Successful businesses thrive by employing creative young workers and selling goods to expanding numbers of consumers—essential factors inevitably absent as populations decline.  Nations—and civilizations—will lose power and influence as their numbers decline.  Less free, less enlightened dictatorial successors may very well replace them.  The world, quite simply, will be a radically different place within a century.  

* * * * * * * * * * * * * * * * * * * * * * 

In What to Expect When No One’s Expecting:  America’s Coming Demographic Disaster (New York:  Encounter Books, c. 2013) Jonathan V. Last details the latest data regarding population prospects.  The book’s title reveals its thesis:  no one’s expecting these days—and paradoxically, as P.J. O’Rourke quipped, “the only thing worse than having children is not having them.”  Failing to heed the Bible’s first injunction—“be fruitful, and multiply, and replenish the earth”—modern man faces an uncertain prospect bereft of children, coddling pets as their “fuzzy, low-maintenance replacements” (p. 3).  The earth, it seems, will grow emptier.  Indeed, “only 3 percent of the world’s population lives in a country whose fertility rate is not declining” (p. 92).  We are moving from the “First Demographic Transition,” wherein children took center-stage and politicians built careers on looking out for them, to the “Second Demographic Transition,” wherein individual adults shun both families and children to pursue their own careers and pleasures.  “Middle-class Americans don’t have very many babies these days.  In case you’re wondering, the American fertility rate currently sits at 1.93,” significantly below the requisite replacement level (p. 4).  At the moment, the deficit is rectified by Hispanic women, who average 2.35 babies, but they too are rapidly choosing to have fewer and fewer.  For example, the once-plenteous supply of Puerto Rican immigrants to New York has collapsed as the island’s birthrate shrank in 50 years from 4.97 to 1.64.  “Some day,” Last says, “all of Latin America will probably have a fertility rate like Puerto Rico’s.  And that day is coming sooner than you think” (p. 113).  Labor shortages in Latin countries will eliminate the need to emigrate and the U.S. population picture will quickly resemble Europe’s.

Glancing abroad:  by 2050 Greece may lose 20 percent of its people; Latvia has lost 13 percent of its population since 1989; and “Germany is shedding 100,000 people a year” (p. 25).  Spain registers barely one baby per woman.  Japan’s population is shrinking and abandoned rural villages bear witness to the trend.  It’s the same in Russia:  “In 1995, Russia had 149.6 million people.  Today, Russia is home to 138 million.  By 2050, its population will be nearly a third smaller than it is today” (p. 25).  Consequently, Vladimir Putin has zealously promoted a variety of failing schemes designed to encourage women to have more children.  But they choose not to!  Other things seem more important.  “Divorce has skyrocketed—Russia has the world’s highest divorce rate.  Abortion is rampant, with 13 abortions performed for every 10 live births.  Consider that for a moment:  Russians are so despondent about the future that they have 30 percent more abortions than births.  This might be the most grisly statistic the world has ever seen.  It suggests a society that no longer has a will to live” (p. 137).

Portents of things to come stand revealed in Hoyerswerda, a German city near the Polish border.  In 1980 the town had 75,000 residents and “the highest birth rate in East Germany” (p. 98).  With the collapse of the Soviet Union, the folks there (and, more broadly, throughout the former East Germany) simply stopped procreating.  The fertility rate abruptly plunged to 0.8 and within three decades the town lost half of its residents.  Hoyerswerda “began to close up shop” (p. 98).  Buildings, businesses, and homes stood vacant.  Similar developments across the country dictated a significant shift from “urban planning” aimed at expanding infrastructure and suburbs to devising ways “to shrink cities” (p. 98).  Parks now proliferate, replacing factories and schools.  The wolf population is actually resurgent, with wolf-packs prowling around dwindling settlements.

Mirroring these European trends is Old Town Alexandria (the D.C. suburb where Last and his wife lived for a while)—a “glorious preserve of eco-conscious yoga and free range coffee.  My neighbors had wonderfully comfortable lives in part because they didn’t take on the expense of raising children” (p. 25).  As a portent of things to come, white, college-educated women, shopping in Alexandria’s stylish boutiques and devotedly determined to pursue a career, have a fertility rate of 1.6—barely more than Chinese women after decades of that nation’s recently-suspended “one-child” policy.  In 1970 the average Chinese woman bore six children and the Communist regime envisioned multiple problems with the ticking population bomb.  Energetic policies were implemented until quite recently, when the rulers realized the implications of population implosion.  But a trajectory has been set and within forty years “the age structure in China will be such that there are only two workers to support each retiree” (p. 13).  

Looking to explain this world-wide pattern, the author lists a variety of “factors, operating independently, with both foreseeable and unintended consequences.  From big things (like the decline in church attendance and the increase of women in the workforce) to little things (like a law mandating car seats in Tennessee or the reform of divorce statutes in California), our modern world has evolved in such a way as to subtly discourage childbearing” (p. 16).  Certainly there are good reasons not to procreate.  Heading the list is money.  Rearing a child may very well cost parents a million dollars!  Financially, it’s the worst investment possible!  “It is commonly said that buying a house is the biggest purchase most Americans will ever make.  Well, having a baby is like buying six houses, all at once.  Except you can’t sell your children, they never appreciate in value, and there’s a good chance that, somewhere around age 16, they’ll announce:  ‘I hate you’” (p. 43).  

Complicating this are welfare state structures such as Medicare and Social Security that “actually shift economic incentives away from having children” (p. 46).  Though Texas Governor Rick Perry was ridiculed for suggesting it, Social Security really is a “Ponzi scheme” that will only work “so long as the intake of new participants continues to increase” (p. 107).  In 1950 three million folks were getting Social Security checks; thirty years later there were 35 million retirees expecting monthly payments; by 2005 nearly 50 million were on the rolls, taking $546 billion a year from taxpayers still working.  In its initial (New Deal) phase, Social Security only exacted one percent of a worker’s paycheck; 30 years later (under LBJ’s Great Society) the rate inched up to three percent; by 1971 it jumped to 4.6 percent; and today (shielded from any adjustments by Barack Obama) it amounts to 6.2 percent.  The sky, you might say, is the limit as an endless line of elders look to their shrinking numbers of children to pay the bills.  The same goes for Medicare—except the prognosis is worse by far!  It simply cannot survive in its present form, given the realities of a shrinking population.  Both programs “were conceived in an era of high fertility.  It was only after our fertility rate collapsed that the economics of the programs became dysfunctional” (p. 109).
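The pay-as-you-go arithmetic is worth spelling out (a simplified sketch, not Last’s own calculation):  in any given year benefits are paid out of current payroll taxes, so the benefit each retiree can receive is roughly the tax rate multiplied by the number of workers per retiree multiplied by the average wage:

	benefit per retiree ≈ tax rate × (workers per retiree) × average wage

At the 6.2 percent rate cited above, a hypothetical ten workers per retiree would fund a benefit of about 62 percent of the average wage; at two workers per retiree (the ratio China is approaching) the same rate funds only about 12 percent.  As the worker-to-retiree ratio falls, either the tax rate must rise or benefits must fall; there is no third option.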

Yet looming above all else is “the exodus of religion from the public square” (p. 84).  Devout Catholics and Protestants have more kids.  They shun the behaviors facilitating population decline—contraception, abortion, cohabitation, delayed marriage, divorce—and church-going couples fully enjoy marriage in ways unavailable to their secular counterparts.  Practicing Protestants increasingly resemble practicing Catholics, procreating more than enough youngsters to support population growth.  But less religious women fall short:  according to a 2002 survey, non-religious women had a fertility rate of only 1.8, and women who rated religion as “not very important” clocked in at 1.3.  Ultimately “there’s only one good reason to go through the trouble” of rearing children:  “Because you believe, in some sense, that God wants you to” (p. 170).  For this reason our government especially should craft family-friendly, child-friendly policies—repudiating the feminist and homosexual ideologies shaping the laws and judicial decrees of the past half-century.

* * * * * * * * * * * * * * * * * * * * * * 

Columnist Mark Steyn, whether writing or speaking, is justly renowned for his infectious humor and verbal dexterity, bringing to his discussions of serious subjects a sustained note of good cheer.  There is little to cheer about, however, in Steyn’s America Alone:  The End of the World as We Know It (Washington:  Regnery Publishing, Inc., c. 2006), wherein he casts a gloomy look at demographic realities and predicts that “the Western world will not survive the twenty-first century, and much of it will effectively disappear within our lifetimes, including many if not most European countries” (p. xiii).  Within 40 years “60 percent of Italians [once lionized for their large and boisterous families] will have no brothers, no sisters, no cousins, no aunts, no uncles” (p. xvii).  Declining populations will leave welfare states unsustainable and civilization unfeasible.  “Civilizations,” said Arnold J. Toynbee in A Study of History, “die from suicide, not murder,” and Western Civilization is in the midst of self-inflicted mortal wounds.  “We are,” Steyn says, “living through a rare moment:  the self-extinction of the civilization which, for good or ill, shaped the age we live in” (p. 3).

Though we’re tempted to think such things have never happened before, Steyn jolts us with a quotation from Polybius (c. 150 B.C.), one of the greatest ancient historians, who said:  “In our own time the whole of Greece has been subject to a low birth rate and a general decrease of the population, owing to which cities have become deserted and the land has ceased to yield fruit, although there have neither been continuous wars nor epidemics. . . .  For as men had fallen into such a state of pretentiousness, avarice, and indolence that they did not wish to marry, or if they married to rear the children born to them, or at the most as a rule but one or two of them, so as to leave these in affluence and bring them up to waste their substance, the evils rapidly and insensibly grew” (The Histories, XXXVI).

Basic to demographic decay, as both Polybius and Steyn argue, is an apparently irreversible moral and spiritual decay that leaves increasing numbers of people listless.  Irreligious folks inevitably lose faith not only in an invisible God but in equally invisible ethical principles and reasons to live hopefully for the future.  Thus Europe’s population has plunged like a raft going over a waterfall in the wake of the de-Christianization of the continent.  Childless places like Japan and Singapore and Albania have little religious vitality.  Standing alone in the midst of all this is the United States, which still enjoys a modest population growth.  True to form, the U.S. is the exceptional Western nation still featuring robust religious activity.  Unfortunately this may not long persist since “most mainline Protestant churches are as wedded to the platitudes du jour as the laziest politician” (p. 98).  They “are to one degree or another, post-Christian.  If they no longer seem disposed to converting the unbelieving to Christ, they can at least convert them to the boggiest of soft-left political clichés, on the grounds that if Jesus were alive today he’d most likely be a gay Anglican bishop in a committed relationship driving around in an environmentally friendly car with an ‘Arms Are for Hugging’ sticker on the way to an interfaith dialogue with a Wiccan and a couple of Wahhabi imams” (p. 100).  Without a resurgence of orthodox, muscular Christianity, Steyn thinks, America too will soon choose the childless path to historical oblivion.

In addition to population implosion, Steyn devotes much attention in America Alone to the threat of Islamic imperialism, facilitated by the growing passivity—the unwillingness to resist terrorism—throughout much of what was once labeled “Western Civilization.”  Indicative of the trend was “the decision of the Catholic high school in San Juan Capistrano to change the name of its football team from the Crusaders to the less culturally insensitive Lions” (p. 158).  (Simultaneously, 75 miles to the south, lock-stepping with the culture, Point Loma Nazarene University—while I was on the faculty—similarly changed its mascot from Crusaders to Sea Lions.)  This loss of a masculine will-to-fight, as well as the will-to-procreate, signifies a collapsing culture.  Indeed, the chief characteristic of our age is “‘deferred adulthood’” (p. 191).  And it takes strong adults to create and sustain a culture.

* * * * * * * * * * * * * * * * * * * 

However realistically we appraise the threat of Islamic Jihadism, demographic realities foretell coming calamities in Muslim lands during the next half-century.  This prospect becomes clear in David P. Goldman’s How Civilizations Die (And Why Islam Is Dying Too) (Washington:  Regnery Publishing, Inc., c. 2011).  Growing numbers of us are aware of the “birth dearth” haunting much of the world, but because it’s underreported, few know that within four decades “the belt of Muslim countries from Morocco to Iran will become as gray as depopulating Europe” (p. x).  For example, females in Iran, though now surrounded by half a dozen siblings, will themselves “bear only one or two children during their lifetimes” (p. x).  “The fastest demographic decline ever registered in recorded history is taking place today in Muslim countries” (p. xv).

Along with his description of demographic patterns in Muslim nations, Goldman’s discussion of “four great extinctions” makes his book worth pondering.  The first extinction took place a millennium before Christ, with the passing of the Bronze Age and the disappearance of cities such as Troy, Mycenae, and Jericho.  The second extinction, two hundred years before Christ, enervated the Hellenic civilization once centered in cities such as Athens and Sparta.  Aristotle says Sparta shrank within a century from 10,000 to 1,000 citizens.  Increasingly large landed estates, run for the benefit of an ever-diminishing aristocracy that cared less and less about rearing children and indulged itself in sexual perversions such as pederasty, left Sparta bereft of people and militarily listless.  The city was, he said, “ruined for want of men” (Politics, II, ix).

The third extinction marked the end of Rome’s power and grandeur in the fourth and fifth centuries A.D.  Even in the glory days of the Empire, when Augustus Caesar reigned, “there was probably a decline” in the empire’s population due to “the deliberate control of family numbers through contraception, infanticide and child exposure” (p. 131).  Augustus himself decreed punishments for “childlessness, divorce, and adultery among the Roman nobility” (p. 131), but nothing worked, and the empire’s needed laborers and soldiers were necessarily drawn from defeated (or volunteering) barbarians from the North.  We are now in the midst of the fourth extinction, when civilizations (East and West) are beginning to show symptoms of rigor mortis.  This extinction began in many ways with the French Revolution in 1789, the “world’s first attempt to found a society upon reason rather than religion” (p. 134), followed by subsequent waves of revolutionary activity that transformed Europe into a bastion of atheistic and anti-natal secularism.

Though Islam seems to be a vibrant religion, currently regaining its virility through movements such as the Muslim Brotherhood, Goldman thinks it is in fact violently (and vainly) reacting to the global secularism fully evident in the dramatic decline of population throughout the Islamic world.  Joining “Western Civilization,” Islam is another dying culture!  So fewer and fewer of us will inherit the earth.  

243 Scared to Death

Though I routinely recommend various books wishing them widely read, I occasionally finish one wishing everyone fully knew its contents, for, as the prophet Hosea said, “My people are destroyed for lack of knowledge” (4:6).  Scared to Death:  From BSE to Global Warming:  Why Scares Are Costing Us the Earth (New York:  Continuum, c. 2007; 2009 reprint), by two British journalists, Christopher Booker and Richard North, is one of those books.  In brief, they document Shakespeare’s insight in A Midsummer Night’s Dream (“In the night, imagining some fear, how easy is a bush supposed a bear”), showing how a succession of unfounded fears have panicked and harmed millions of people.  Each panic followed a “common pattern,” beginning with alleged scientific data portending a catastrophe in the making.  “Each has inspired obsessive coverage in the media.  Each has then provoked a massive response from politicians and officials, imposing new laws that inflicted enormous economic and social damage.  But eventually the scientific reasoning on which the panic was based has been found to be fundamentally flawed” (p. ix).  Though differing in details, they all resemble the “millennium bug” that so exercised millions of folks as January 2000 approached.  Eminent authorities warned of “potentially disastrous global consequences to both business and government” as computers were predicted to malfunction.  Scores of institutions invested millions of dollars preparing for the crisis.  But absolutely nothing happened!

The first part of the book delves into a litany of “food scares” that profoundly affected Great Britain.  Beginning in 1985, a few cattle died as a result of brain infection—known as “cattle scrapie” and ultimately dubbed “Mad Cow Disease.”  At the same time scattered salmonella and listeria outbreaks led anxious experts to blame eggs and cheese as the culprits.  Government scientists and bureaucrats leapt into action, persuaded they needed to protect the public, decreeing the slaughter of herds and flocks.  Flooded with sensational statements in the newspapers and on TV, people around the world suddenly shunned British beef and eggs, bankrupting scores of small farmers.  Hygiene became a pressing and paramount issue, though food poisoning incidents “remained curiously stable” (p. 76).  Absolutely no evidence existed linking brain encephalopathies in livestock to human beings, yet nothing deterred government spokesmen and journalists from hyping the threat.  When the “Mad Cow disease” was finally  laid to rest, more than 8,000,000 cattle and sheep had been destroyed with a total cost of more than three billion pounds.  Comprehensively calculated, the panic cost twice that.  “Without question it was the most expensive food scare the world has ever seen” (p. 126).

Having examined, in detail, health-related scares in Britain, Booker and North devote the second part of Scared to Death to “general scares” that duplicate the same pattern.  “In many ways the first truly modern ‘scare’ was one that began in America” following the publication of Rachel Carson’s Silent Spring in 1962 (p. 167).  She blamed DDT, a powerful insecticide widely used following WWII, for poisoning the environment and causing cancer.  Though it had been remarkably successful—reducing malaria mortality rates by 95 percent—fervent environmentalists quickly crusaded to ban DDT.  Greenpeace and the World Wildlife Fund effectively pushed for a world-wide ban on the substance, despite the fact that, as Michael Crichton said:  “‘Since the ban two million people a year have died unnecessarily from malaria, mostly children.  All together, the ban has caused more than fifty million needless deaths.  Banning DDT killed more people than Hitler’” (p. 170).  No solid studies have found DDT remotely responsible for cancer in human beings.  Indeed its worst consequence seems to be the thinning of eggshells for birds of prey.

An examination of “The Modern Witch Craze” documents the incredible claims of Satanic ritualistic abuse of children enkindled in the 1980s.  Beginning with allegations brought by a California mother who believed her son had been abused at the McMartin Preschool, and sustained by a corps of social workers and counselors who insisted children’s stories could not be doubted, the craze brought dozens of innocent people to trial (in Britain as well as America) and sent many to prison before mounting evidence demonstrated the folly of it all.  Some of the accused committed suicide.  We now know that social workers (armed with state authority) separated the children from their parents for weeks or even months at a time to interview them.  The children “were repeatedly plied with leading questions of a type which would never have been allowed in a courtroom” (p. 191).  Their outlandish stories, venturing into the fantastical, were taken literally by the psychological “experts” (often claiming to help children recover repressed memories) and all too frequently trusted by prosecutors.  In time most of the adult “culprits” were vindicated, and we now know how untrustworthy both children’s stories and social workers’ constructions can be.  But the pain and suffering resulting from the witch craze can hardly be calculated.

Few of us filling our gas tanks with unleaded fuel realize the high price we pay results, in part, from the billions of dollars wasted through the “lead scare” that mandated it.  Concentrated doses of lead (e.g. in ancient Roman water pipes) can certainly be toxic, and its presence in gasoline helped pollute the air.  But lead is a “miraculous” additive to gasoline, significantly improving engine efficiency, and there was absolutely no evidence that leaded gas residue was any threat to public health.  However, a single, scientifically dubious study (by Herbert Needleman, a child psychologist from the University of Pittsburgh) alleging harmful effects on children’s intellectual development was manipulated by politicians and the Environmental Protection Agency to mandate unleaded gasoline and justify a massive social change.  Yet Needleman was acclaimed for his work, receiving the Rachel Carson Award for Integrity in Science in 2004.  According to EPA administrator Carol Browner:  “‘The elimination of lead from gas is one of the great environmental achievements of all time’” (p. 234).  If so, one must wonder precisely what was actually achieved apart from fuzzy feelings about helping the children!

While no one today doubts the lethal effects of smoking cigarettes, the threat of “passive smoking” can hardly be proven.  Smokers harm themselves but not “innocent” bystanders.  Yet during the past several decades activists have successfully campaigned to require, at considerable cost, a “smoke-free” environment virtually everywhere.  Thus for 20 years it has been illegal to smoke in California “workplaces, bars and restaurants, but also within a yard and a half of any public building and on its famous beaches” (p. 254).  Allegations that thousands of people die each year due to “passive smoking” quite simply lack any statistical or factual basis.  Non-smokers may be offended by the smell of tobacco smoke, but they suffer no real harm.  A massive research project, commissioned by the World Health Organization and conducted by 27 esteemed epidemiologists and cancer specialists, demonstrated this.  “Across the board and in all seven countries, their conclusions were consistent.  They found no evidence that there was any ‘statistically significant’ additional risk from passive exposure to smoke, either at home or in the workplace” (p. 256).

Another study, “the longest and most comprehensive scientific study ever carried out into the effects of passive smoking anywhere in the world,” commissioned by the American Cancer Society, similarly concluded that “there was no ‘causal relation between environmental tobacco smoke and tobacco-related mortality’” (p. 261).  One would think such evidence would lead to retractions and shifts in public pronouncements and policy.  Wrong!  Anti-tobacco fanatics facilely disregarded the evidence, sought to suppress scholarly papers and silence dissenters, linking arms with politicians such as New York’s Mayor Bloomberg on their mission to purify the air of all taints of the hated weed!  Booker and North conclude:  “The triumph of the campaign against passive smoking had provided one of the most dramatic examples in history of how science can be bent and distorted for ideological reasons, to come up with findings that the evidence did not support, and which were in many ways the reverse of the truth.  In this respect, it provided one of the most vivid examples in modern times of the psychological power of the scare” (p. 270).

Add to the fear of tobacco smoke the fear of asbestos, one of the world’s most wonderful fire-resistant minerals, widely used in water pipes, brake linings, and building materials.  It can be woven into cloth-like products or mixed with plaster and cement as a reinforcement stronger than steel.  As with tobacco, however, some forms of asbestos can prove deadly when inhaled and absorbed by the lungs.  This is true, however, of only one kind of it!  The more common “white asbestos” is “by far the most widely used” and “poses no measurable risk to human health” (p. 276).  Fully 90 percent of the asbestos found in America’s buildings was benign.  But the limited numbers of workers dying of cancer due to intensive exposures to the deadly form of asbestos enabled scaremongering lawyers and contractors, buoyed by EPA edicts, to pounce on people’s ill-informed fear of any exposure to any kind of it.  Buildings of all sorts (churches, schools, factories, homes) must be cleansed!  Companies must be punished through lawsuits—and, in time, great corporations such as Johns-Manville were destroyed and even Lloyd’s of London nearly collapsed.  Cunning lawyers extracted billions of dollars from beleaguered asbestos suppliers.  Legislative efforts to curtail the proliferating lawsuits were blocked “by a caucus of Democrat senators [e.g. Joe Biden; Edward Kennedy; John Kerry; Hillary Clinton; John Edwards] who had each received huge sums in campaign funding from law firms” (p. 322).  Ultimately, says Professor Lester Brickman:  “‘Asbestos litigation has come to consist, mainly, of non-sick people . . . claiming compensation for non-existent injuries, often testifying according to prepared scripts with perjurious contents, and often supported by specious medical evidence . . . it is . . . a massively fraudulent enterprise that can rightly take its place among the pantheon of . . . great American swindles’” (p. 273).

Even more devastating is the irrational fear of global warming, “the new secular religion,” which is now fraudulently branded “climate change” since the evidence for actual warming has faded.  Objective historians have long noted significant climate changes—a “pre-Roman Cold” period (700-400 B.C.), a “Roman Warming” time (200 B.C.-500 A.D.), a cold era during the “Dark Ages” (500-900 A.D.), a “Medieval Warming” time (900-1300 A.D.), a “Little Ice Age” (1300-1800), and the “Modern Warming” era we’re now in.  It has been much warmer, and much colder, in the past two millennia.  But hugely influential and well-funded scientists such as Michael Mann have distorted the record with sensational “evidence,” including his spurious “hockey stick” graph that flattened out both the Medieval Warming and Little Ice Age.  Alarmists such as Mann seek to demonstrate temperature change with data from a single “1993 study of one group of trees in one untypical corner of the USA” (p. 359) and an “unqualified acceptance of the recent temperature readings given by hundreds of weather stations across the earth’s surface” (p. 359).  Ignored is evidence from weather satellites or the probable contamination of weather stations near urban centers.

Bolstered by suspicious scientific pronouncements, activists such as Al Gore and Barack Obama have ignited widespread fears and orchestrated policies designed to fundamentally alter human behavior on earth through such things as the 1997 Kyoto Protocol.  Though Gore’s documentary—“An Inconvenient Truth”—won awards in various quarters, it was perceptively labeled by an Australian paleoclimatologist as “‘a propaganda crusade’” largely “‘based on junk science’”:  “‘His arguments are so weak that they are pathetic.  It is incredible that they and his film are commanding public attention’” (p. 378).  Calling for curtailing the burning of fossil fuels while also opposing the development of nuclear energy (by far the best solution to the problem of greenhouse gases), Gore and his green corps demand the development of various forms of “green energy.”  Interestingly enough, environmentalist rhetoric subtly shifted from warnings regarding “global warming” to admonitions for “green energy”!  Yet to this point highly-touted “green alternatives” such as wind turbines make little dent on the production of carbon dioxide—e.g. the 1200 turbines built in Britain produce only one-eighth of the electricity supplied by one coal-fired plant in Yorkshire!

In fact, “climate change” is most likely driven by solar activity and clouds, with only minimal impact attributable to human activities.  “In many respects, however, the alarm over global warming was only the most extreme example of all the scares described in this book.  Yet again it had followed the same familiar pattern:  the conjuring up of some great threat to human welfare, which had then been exaggerated far beyond the scientific evidence; the collaboration of the media in egging on the scare; the tipping point when the politicians marshaled all the machinery of government in support of the scare; and finally the wholly disproportionate regulatory response inflicting immense economic and social damage for a highly questionable benefit” (p. 403).  

* * * * * * * * * * * * * * * * * * * * * * * * * * *

Oklahoma Senator James Inhofe’s The Greatest Hoax:  How the Global Warming Conspiracy Threatens Your Future (Los Angeles:  WND Books, c. 2012) seeks to counter the positions promoted by Al Gore and environmental alarmists.  Throughout the book Senator Inhofe pillories Gore, oft-times portrayed by the media as a “climate prophet” or “Goricle.”  Indeed, “Katie Couric famously said that Gore was a ‘Secular Saint,’ and Oprah Winfrey said that he was the ‘Noah’ of our time” (Kindle #1182).  Obviously Gore and environmentalists have embraced and promote a religion rather than a scientific position.  Thus dissenters from the environmentalist dogma like Inhofe are treated as heretics akin to “Holocaust deniers”!  To Robert F. Kennedy Jr., those who dare differ with Gore are traitors!  To deal with them, one journalist “called for Nuremberg-style trials for climate skeptics” (#1372)!  Their research must be proscribed, their publications censored!

Folks such as Couric and Kennedy are, manifestly, full-fledged true believers who revel in hysterical rhetoric.  Senator Inhofe, in opposition, joins a distinguished minority of highly-informed people who question the devotees of “climate change.”  Thus they cite credible scientists such as Dr. Claude Allegre, a noted French geophysicist, “a former French Socialist Party leader, a member of both the French and U.S. Academies of Science, and one of the first scientists to sound the global warming alarm—who changed around 2006 from being a believer to a skeptic” (#1903).  Joining Allegre, Dr. David Bellamy, a highly acclaimed figure in the UK, was “also converted into a skeptic after reviewing the science.”  Bellamy said that “‘global warming is largely a natural phenomenon’” and that catastrophic fears were “‘poppycock.’  ‘The world is wasting stupendous amounts of money on trying to fix something that can’t be fixed,’ and ‘climate-change people have no proof for their claims.  They have computer models which do not prove anything’” (#1919).  

Sitting on the Senate Environment and Public Works Committee, Inhofe has political acumen and access to substantive scientific studies.  Consequently he played a critical role in defeating President Obama’s “cap and trade” proposals.  (He was, importantly, working at the same time to pass the Clear Skies Act, designed to improve air quality, so he can hardly be dismissed as an enemy of environmental health.)  He proudly labels himself “a one man truth squad” on the global warming issue and includes a great deal of personal detail regarding his background and his concerns about the state of the American Union.  Consequently:  “This book constitutes the wake-up call for America—the first and only complete history of the Greatest Hoax, who is behind it, the true motives, how we can defeat it—and what will happen if we don’t” (#88).  He knows, for example, from the testimony of EPA Administrator Lisa Jackson, that even if the U.S. enacted the most stringent policies designed to reduce carbon levels in the atmosphere “it would only reduce global temperatures by 0.06 degrees Celsius by 2050.  Such a small amount is hardly even measurable” (#140).  Still more:  “‘No study to date has positively attributed all or part [of the climate change observed] to anthropogenic causes’” (#706).  

So what’s actually taking place within the global warming scaremongering?  “Looking back, it is clear that the global warming debate was never really about saving the world; it was about controlling the lives of every American.  MIT climate scientist Richard Lindzen summed it up perfectly in March 2007 when he said ‘Controlling carbon is a bureaucrat’s dream.  If you control carbon, you control life’” (#440).  There’s no question that “progressives” from Woodrow Wilson to Barack Obama have striven to take control of our lives, purportedly to maximize pleasure and minimize pain for the public.  More broadly, to Vaclav Klaus, President of the Czech Republic:  “The global warming religion is an aggressive attempt to use the climate for suppressing our freedom and diminishing our prosperity.”  It is a “totally erroneous doctrine which has no solid relation to the science of climatology but is about power in the hands of unelected global warming activists” (#19).  Klaus writes with an understanding of what European leaders such as French President Jacques Chirac envision when they tout the Kyoto treaty as “‘the first component of an authentic global governance’” (#553).  Equally perceptive, Canada’s Prime Minister Stephen Harper “called Kyoto a ‘socialist scheme’” (#561).  Consequently, Inhofe concludes:  “it is crystal clear that this debate was never about saving the world from man-made global warming; it was always about how we live our lives.  It was about whether we wanted the United Nations to ‘level the playing field worldwide’ and ‘redistribute the wealth.’  It was about government deciding what forms of energy we could use” (#3280).  

Senator Inhofe’s book takes its title from his “Greatest Hoax” Senate speech, and he is deeply convinced that “global warming” or “climate change” is indeed a bogus scenario manufactured by liberal elites who “truly believe that they know how to run things better than any individual country ever could.  In this way they are like ‘super-liberals’ on an international scale.  On one of its websites, the UN even claims that its ‘moral authority’ is one of its ‘best properties’” (#653).  This moral authority apparently resides in the UN’s self-righteous commitment “to the utopian ideals of global socialism” (#653), frequently promoted as necessary for “sustainable development.”  The spurious nature of this Hoax became clear when “Climategate, the greatest scientific scandal of our time broke” (#2319).  A careful reading of the emails between scientists in the UK and US (reprinted in considerable detail as an appendix to this treatise) reveals, in the words of Clive Crook:  “‘The closed-mindedness of these supposed men of science, their willingness to go to any lengths to defend a preconceived message, is surprising even to me.  The stink of intellectual corruption is overpowering’” (#2359).  

The Greatest Hoax is important primarily because of its author’s position in government.  Inhofe has, to the degree possible for a busy politician, studied the evidence, assembled the data, and come to a reasoned conclusion regarding one of the most momentous issues of our day.  If the global warming alarmists are wrong, following their admonitions could irreparably harm not only this nation but the world, plunging us into a cataclysmic economic and social black hole.

# # # 

242 Nancy Pearcey’s Apologetics

As a restless, questing college student immersed in the relativism and subjectivism of her milieu, Nancy Pearcey found her way to Francis Schaeffer’s L’Abri Fellowship in Switzerland in 1971.  Here, for the first time, she found Christians (many of them long-haired hippies) seriously discussing and providing answers to the “Big Questions” she was asking.  Though reared in a Christian home and nurtured in a Lutheran church, she lacked the coherent, in-depth understanding of the faith Schaeffer set forth.  In subsequent decades, through advanced academic work, personal study and reflection, she established herself through a variety of publications as a thoughtful exponent of an orthodox evangelical worldview, writing for a popular audience.  She skillfully laces her discussions with quotations and illustrations, both personal and historical, making them accessible to all thoughtful readers.  Though at times overly simplistic (too easily reducing all issues to a “two-storey” graphic) and superficial (sharing Schaeffer’s distaste for significant Catholic thinkers), she still provides helpful guidance in charting a meaningful framework for understanding and responding to our world.  Her most recent treatise, Saving Leonardo:  A Call to Resist the Secular Assault on Mind, Morals, & Meaning (Nashville:  B&H Publishing Group, c. 2010), continues her endeavor to engage the culture from a Christian perspective.  

She begins by evaluating the everywhere-evident “threat of global secularism,” a massive cultural current transforming our world, primarily through our educational and artistic milieu.  Though often oblivious to its subtlety and power, we Christians must awaken to its threat.  Following the example of Early Church thinkers, we must “address, critique, adapt, and overcome the dominant ideologies of our day” (p. 14), bearing in mind J. Gresham Machen’s maxim:  “‘False ideas are the greatest obstacle to the reception of the gospel’” (p. 15).  To Pearcey—as to John Henry Newman—the false idea of modern secularism is its reduction of all truth (including metaphysical, theological and moral truth) to personal preference.  “Whatever works for you” goes the modern mantra!  To which Christians must respond:  Absolutely Not!  Rightly grasped, Christianity is not primarily a personal perspective or an inner feeling of peace and optimism, but a trustworthy knowledge of what really IS.  This means we cannot accept the fact/value distinction that frequently dominates worldview discussions.  Many modern thinkers insist that whereas they deal objectively with scientific “facts,” all ethical “values” are subjective.  Unfortunately, a number of “Christian” thinkers embrace this disjunction.  Following Schaeffer’s lead, however, Pearcey insists this two-storey view cannot but fail anyone seeking an integrated philosophy.  What must be recovered, she says, is a pre-Enlightenment perspective, seeing an interrelated symbiosis of natural and spiritual realities equally authored by an all-wise Creator.  

Having alerted us to the secularist threat, Pearcey gives us a “crash course on art and worldview”—nicely (if not lavishly) illustrated with scores of reprints in this well-appointed volume—that helps explain how secularism emerged during the past two centuries.  Throughout the ancient, medieval, and Renaissance-Reformation eras, styles changed but the underlying purpose endured:  highlighting beauty that reveals truth about God, man and nature, both visible and invisible.  As Walker Percy, one of America’s finest 20th century novelists, declared, “art is ‘a serious instrument for the exploration of reality.’  It is ‘as scientific and cognitive as, say, Galileo’s telescope or Wilson’s cloud chamber’” (p. 99).  

Enlightenment intellectuals, however, restricted “truth” to natural science.  Consequently, “art is merely decorative.  Ornamental.  Entertaining.  Isaac Newton called poetry ‘ingenious nonsense.’ . . .  Hume denounced poets as ‘liars by profession.’  Philosopher Jeremy Bentham agreed:  ‘All poetry is misrepresentation’” (p. 98).  It might soothe one’s inner turmoil or exalt one’s expectations, but it reveals nothing about anything ultimately real.  So too, many thought, for religion.  But revolutionary 18th century developments sparked not only political upheavals such as the French Revolution but artistic celebrations of highly individualistic and Romantic perspectives.  While scientists may well weigh and measure the external world of nature (how things are), artists insisted on freely imagining how things might or ought to be.  Thus, by the end of the 19th century, movements such as “impressionism”—and soon thereafter “cubism”—flourished as monuments to this disconnect between art and objective reality.  Rather than representing Reality like a photograph, Romantic art serves as a film projector in a theater, casting images on the screen, and throughout the 20th century, as Pearcey persuasively illustrates, this conviction intensified.  

As an antidote to these developments, Pearcey recommends a recovery of great Christian artistic works—including the music of Bach, the “fifth gospel,” which is, amazingly, quite popular in Japan.  Through the work of Masaaki Suzuki, a famous conductor, thousands of Japanese have learned to play and appreciate the work of the gifted Baroque composer.  “‘Bach works as a missionary among our people,’ Suzuki said in an interview.  ‘After each concert people crowd the podium wishing to talk to me about topics that are normally taboo in our society—death, for example.  Then they inevitably ask me what “hope” means to Christians.’  He concluded:  ‘I believe that Bach has already converted tens of thousands of Japanese to the Christian faith’” (p. 267).  And along with recovering great art we need to cultivate the high-quality art absent in popular culture.  In our churches and homes we need to powerfully represent the Gospel story, shunning the “spiritual junk food” and “sentimentalism” that so frequently masquerade as Christian “music” and “art.”    

To R. Albert Mohler, Jr., President of the Southern Baptist Theological Seminary, “Nancy Pearcey helps a new generation of evangelicals to understand the worldview challenges we now face and to develop an intelligent and articulate Christian understanding . . . Saving Leonardo should be put in the hands of all those who should always be ready to give an answer—and that means all of us.”  

* * * * * * * * * * * * * * * * * * *

During the past century, the cultural consequences of taking natural science as the sole guide to truth have been increasingly (indeed alarmingly) evident.  Illustrating this trend, Eric Temple Bell, a professor at the California Institute of Technology and former president of the Mathematical Association of America, declared that modern (i.e. non-Euclidean) geometry makes mathematics and logic, as well as metaphysics and ethics, endlessly tentative, asserting, in The Search for Truth, that there is no such thing as “Truth.”  Trashing Euclid and Plato, Aristotle and Aquinas—all of whom “forged the chains with which human reason was bound for 2,300 years”—Bell celebrated the brave new world of modernity freed from any illusions regarding absolutes of any sort.  Consequently, as Pope Benedict XVI noted in his inaugural message, we struggle with the “dictatorship of relativism” that renders all certainties suspect.  

Rightly responding to such “modern” views, Nancy Pearcey supported, in Total Truth:  Liberating Christianity from Its Cultural Captivity (Wheaton:  Crossway Books, 2003), Francis Schaeffer’s position as articulated in his 1981 address at the University of Notre Dame:  “‘Christianity is not a series of truths in the plural, but rather truth spelled with a capital “T.”  Truth about the total reality, not just about religious things.  Biblical Christianity is Truth concerning total reality—and the intellectual holding of that total Truth and then living in the light of that Truth’” (p. 15).  Most needed, Schaeffer and Pearcey insist, is a Christian “worldview” that fundamentally shapes the lives of millions of ordinary believers, thus transforming their culture.  “A worldview is like a mental map that tells us how to navigate the world effectively,” Pearcey says.  “It is the imprint of God’s objective truth on our inner life” (p. 23).  

Unfortunately, too many Christians (Evangelicals included) have reduced their faith to a purely internal, “spiritual” realm disconnected from the physical and social worlds.  In the opinion of Sidney Mead, a distinguished historian:  “‘This internalization or privatization of religion is one of the most momentous changes that has ever taken place in Christendom’” (p. 35).  Yet, as Charles Malik noted, we need “‘not only to win souls but to save minds.  If you win the whole world and lose the mind of the world, you will soon discover you have not won the world’” (p. 63).  Thus pastors and teachers should do apologetics as well as preach salvation.  “Every time a minister introduces a biblical teaching, he should also instruct the congregation in ways to defend it against the major objections they are likely to encounter.  A religion that avoids the intellectual task and retreats to the therapeutic realm of personal relationships and feelings will not survive in today’s spiritual battlefield” (p. 127).  

Despite their many positive accomplishments—and Pearcey generously notes them—American Evangelicals may not survive today’s battles unless they take ideas seriously.  Though there was a scholarly (largely Calvinistic) dimension to 19th century Evangelicalism, the movement became, in revivalists’ hands, inordinately populist, more concerned with converting the masses than cultivating their minds.  Believers were, then, unprepared to resist and refute powerful anti-Christian ideas propounded by Darwin, Marx, Freud, et al.  “The overall pattern of evangelicalism’s history is summarized brilliantly by Richard Hofstadter in a single sentence.  To a large extent, he writes, ‘the churches withdrew from intellectual encounters with the secular world, gave up the idea that religion is a part of the whole life of intellectual experience, and often abandoned the field of rational studies on the assumption that they were the natural province of science alone’” (p. 323).  

So it’s now time, Pearcey declares, to regain lost ground, to reestablish a Christian perspective, to redeem minds as well as hearts.  To do so, Christians need to structure their worldview in accord with three guiding certainties:  creation; fall; redemption.  An originally good creation has been scarred by sin, but God’s gracious redemptive work in Christ has (to a degree) restored His original intent.  Every worldview requires a creation story.  Materialists, ancient and modern, explain the world in terms of mindless matter-in-motion, and pantheists, whether Stoics or “process” thinkers, endow Nature with divine attributes.  Every worldview includes an explanation for the evil surrounding us—inadequately evolved species or demonic social institutions or malignant genes.  And every worldview promises redemption, whether through scientific breakthroughs or political revolutions or inner enlightenment.  Thus, as John Milton wrote, “the goal of learning ‘is to repair the ruins of our first parents’” (p. 129).  

To make a Christian case for Creation, Pearcey says we must deal cogently with Darwinism, stressing that it is, as Huston Smith said, “‘supported more by atheistic philosophical assumptions than by scientific evidence’” (p. 153).  Excluding any possibility of God, as Carl Sagan declared, “Nature . . . is all that IS, or WAS, or EVER WILL BE!”  Adamantly upholding this assumption, naturalistic Darwinism has become a “universal acid” eating away many of the most fundamental cultural certainties basic to Western Civilization.  “Half a century ago G.K. Chesterton was already warning that scientific materialism had become the dominant ‘creed’ in Western culture—one that ‘began with Evolution and has ended in Eugenics.’  Far from being merely a scientific theory, he noted, materialism ‘is really our established Church’” (p. 157).  Consequently:  “‘The so-called warfare between science and religion,’ wrote historian Jacques Barzun, should really ‘be seen as the warfare between two philosophies and perhaps two faiths.’  The battle over evolution is merely one incident ‘in the dispute between the believers in consciousness and the believers in mechanical action; the believers in purpose and the believers in pure chance’” (p. 173).  

The deleterious and far-reaching cultural impact of Darwinism stands illustrated by Joseph Stalin, who, as a seminary student, lost his faith in God after encountering Darwin’s theory.  Subsequently he imposed his murderous form of atheism upon a large swathe of the world.  Less murderously, American thinkers—particularly pragmatists such as John Dewey and Oliver Wendell Holmes, Jr.—launched an effective assault on many of the traditions vital to this nation.  To them, all “truths” evolve in accord with naturalistic evolution, and thus no permanent standards of right and wrong, in any area of life, actually exist.  Everyone “constructs” his own reality and writes his own rules.  Darwin himself realized, and feared, this logical outgrowth of his theory, confessing, in a letter:  “‘With me, the horrid doubt always arises whether the convictions of man’s mind, which has been developed from the mind of the lower animals, are of any value or at all trustworthy’” (p. 243).  

To refute Darwinism, Pearcey follows the lead of Phillip Johnson, the U.C. Berkeley law professor who wrote Darwin on Trial and Reason in the Balance and helped launch the “intelligent design” movement.  Rather than bog down in secondary details regarding the age of the earth or the reality of microevolution, Christians need to “focus on the crucial point of whether there is evidence for Intelligent Design in the universe” (p. 174).  Importantly, we must grasp the significance of recent scientific insights, summed up by John Wheeler, a noted physicist, who said:  “‘When I first started studying, I saw the world as composed of particles.  Looking more deeply I discovered waves.  Now after a lifetime of study, it appears that all existence is the expression of information’” (p. 179).  The cosmos, it seems, is not ultimately mindless matter-in-motion; it is, rather, imprinted with an immaterial pattern, bearing information, or (as Christians have always believed) a Logos responsible for the design everywhere evident.  Just ponder for a moment the widely-heralded fact that every cell in your body contains as much information as 30 volumes of the Encyclopedia Britannica!  This Logos-structured world is increasingly evident as we begin to grasp the majesty of DNA, aptly defined by President Bill Clinton as “‘the language in which God created life’” (p. 1919).   

Darwin himself recognized this manifest design but sought to discount it as only “apparent,” not real.  Similarly, his modern apostle, Richard Dawkins, admits (in The Blind Watchmaker) that “‘Biology is the study of complicated things that give the appearance of having been designed for a purpose’” (p. 183).  But be not deceived, he insists:  it’s all an illusion spun by the random machinations of natural selection.  Such statements, Pearcey shows, pervade evolutionary literature, imploring us all to ignore common sense and believe the sacrosanct theory launched by Darwin.  Outside the faithful community of naturalistic evolution, however, we find alternatives expressed by thinkers such as Nobel Prize-winner Arno Penzias, who says:  “‘Astronomy leads us to a unique event, a universe which was created out of nothing, one with the very delicate balance needed to provide exactly the conditions required to permit life, and one which has an underlying (one might say “supernatural”) plan.’  In fact, he says, ‘The best data we have are exactly what I would have predicted, had I nothing to go on but the five books of Moses, the Psalms, the Bible as a whole’” (p. 189).  

Thus, Pearcey argues, the Bible contains the necessary ingredients for a coherent worldview.  Taking it seriously, living in accord with its precepts, gives us the basis for cultural activities.  Interested readers may begin this endeavor by consulting the annotated “recommended reading” list she supplies.  

* * * * * * * * * * * * * * 

For several years Nancy Pearcey worked with the late Chuck Colson, doing much of the research underlying his BreakPoint radio program and coauthoring his monthly column in Christianity Today.  Re-examining their coauthored How Now Shall We Live? (Wheaton:  Tyndale House Publishers, Inc., c. 1999), one suspects that Pearcey did the bulk of the research and preliminary writing, with Colson adding his personal touch (scores of personal anecdotes, mostly taken from his prison ministry) and imprimatur.  This is especially evident in the book’s structure (basically duplicated, with added scholarly references, in Pearcey’s subsequent Total Truth), stressing four themes:  Creation (where did I come from?); Fall (what’s wrong with me and the world?); Redemption (is there any hope?); and Restoration (how can we help repair what’s wrong?).  A probing query from Ezekiel, struggling in Babylon during the exile, sets the stage:  “How should we then live?” (33:10, KJV).  

We should live, Colson and Pearcey answer, by crafting and following a biblical worldview, empowered by the realization, as St. Hippolytus said in the third century, that when Jesus ascended “‘His divine spirit gave life and strength to the tottering world, and the whole universe became stable once more, as if the stretching out, the agony of the Cross, had in some way gotten into everything’” (p. 13).  Thus everything, rightly understood, points to and leads to the Christ.  All truth is God’s truth!  As the great astronomer Johannes Kepler declared:  “‘The chief aim of all investigations of the external world should be to discover the rational order and harmony which has been imposed on it by God’” (p. 51).  

Our challenge, as Christians, is to both discover and proclaim this truth in a world increasingly skeptical of all truth claims, a world frequently claiming to find neither rational order nor harmony anywhere.  Such skepticism grows logically from the philosophical naturalism dominant in our culture and on display in school textbooks, PBS programs, and the EPCOT Center in Disney World.  In accord with the “just so” naturalistic story, for billions of years things just happened without design or purpose.  Lifeless chemicals somehow ignited biological beings (proteins and molecules) mysteriously structured by DNA strands.  And we human beings are nothing more than enlivened chemicals, inexplicably endowed with consciousness, self-awareness, and conscience.  This is the materialist creed; consequently, as C.S. Lewis said:  “‘The Christian and the materialist hold different beliefs about the universe.  They can’t both be right.  The one who is wrong will act in a way which simply doesn’t fit the real universe’” (p. 307).  

To Colson and Pearcey, the Christian worldview enables us to live wisely and well in the real universe, for “Christianity gives the only viable foundation for intellectual understanding and practical living” (p. 489).  Whatever our gifts, whatever our vocation, we may play a vital role in God’s work so long as we do it Soli Deo Gloria—only to the glory of God!  We especially need to give attention to our homes, churches, neighborhoods, and schools, bearing in mind the words of the tempter in C.S. Lewis’s The Screwtape Letters:  “‘What I want to fix your attention on is the vast overall movement towards the discrediting, and finally the elimination of every kind of human excellence—moral, cultural, social, or intellectual’” (p. 331).  Our children and friends, as well as the world, need godly (i.e. good) music and literature, art and philosophy, films and TV.  Rather than bring Rock & Roll into the sanctuary we need to take Bach into the marketplace!  Rather than giving children video games, offer them Lewis’s Narnian chronicles and J.R.R. Tolkien’s Lord of the Rings trilogy.  

People also need to live in neighborhoods where broken windows are repaired and playgrounds are safe, where laws are enforced and elders respected.  So well-informed political action (e.g. voting!), particularly on a local level, should be part of the Christian vocation.  Rather than trying to ape (and somehow Christianize) the world we should seek to transform it with divinely-rooted truths.  Thereby we will implement the vision of Francis Schaeffer, to whom this treatise is dedicated, volunteering for service in Christ’s corps of intellectual warriors, contending for the Faith once delivered to the saints.  

241 “Dupes” and “The Communist”

In Witness, Whittaker Chambers—one of the most celebrated repentant Communists of the 20th century, who helped expose Alger Hiss and other Soviet spies—reflected on his years working within the Party:  “While Communists make full use of liberals and their solicitudes, and sometimes flatter them to their faces, in private they treat them with the sneering contempt that the strong and predatory almost invariably feel for the victims who volunteer to help in their own victimization.”  Such malleable liberals are analyzed in depth by Cold War historian Paul Kengor in Dupes:  How America’s Adversaries Have Manipulated Progressives for a Century (Wilmington:  ISI Books, c. 2010).  He skillfully documents the prescience of Norman Thomas, the perennially nominated presidential candidate of the Socialist Party, who said Americans could not be persuaded to candidly establish “socialism” but under the aegis of “liberalism” would gradually put it in place “‘without ever knowing how it happened’” (p. 479).  

Due to the collapse of the USSR, historians such as Kengor have profited from “the massive declassification of once-closed Cold War archives, from Moscow to Eastern Europe to the United States” (p. 3), and the archival materials depicting the Communist Party USA are especially illuminating.  Though American Progressives were not Communists, they fully supported the Communist wish list:  “workers’ rights, the redistribution of wealth, an expansive federal government, a favoring of the public sector over the private sector, class-based rhetoric (often demagoguery) toward the wealthy, progressively high tax rates, and a cynicism toward business and capitalism, to name a few.  The differences were typically matters of degree rather than principle” (p. 4).  To Kengor, however, such degrees of difference really matter, for by assisting the Communist movement they “knowingly or unknowingly contributed to the most destructive ideology in the history of humanity.  This is no small malfeasance” (p. 11).  

Following the Bolsheviks’ triumph in Russia, Communists organized two Chicago subsidiaries which soon merged into the Communist Party USA (CPUSA) that for 50 years dutifully followed edicts from Moscow.  To bring about revolution in America, however, Communists needed to strategically misrepresent themselves and subtly subvert the nation’s social and economic structures.  Thus they encouraged and showcased “Potemkin Progressives” such as Corliss Lamont—an atheistic “humanist” professor at Columbia University who helped lead the Friends of the Soviet Union and in 1933 wrote Russia Day by Day to celebrate the glories of the Soviet Union.  They also promoted the agenda of another Columbia professor, John Dewey—the renowned pragmatist philosopher who largely shaped the progressive educational agenda that has dominated America’s public schools for nearly a century.  His educational ideas incorporated significant swathes of Marxism and were actually implemented in Russia in the 1920s before being adopted by American schools.    

Invited to visit the USSR in 1928, Dewey was given the standard Potemkin village tour touting the grandeurs of Communism.  Returning home, he wrote glowing reports of what he’d seen, affirming that the Bolshevik agenda was “a great success.”  He especially endorsed the Soviet schools, the “ideological arm of the Revolution,” which would lead to the success of “The Great Experiment and the Future” in Russia (pp. 90-99).  Dewey was not uninformed of the brutalities accompanying this great experiment and acknowledged its “secret police, inquisitions, arrests and deportations” (p. 100).  But he glibly rationalized them as necessary for the regime to prosper.  He would thus head a corps of influential intellectuals urging President Franklin D. Roosevelt to formally recognize and establish diplomatic relations with the USSR.  Soon thereafter, however, faced with mounting evidence regarding Stalin’s Great Purge and its massive bloodletting, Dewey bravely retracted his endorsement of the Stalinist regime.  He was especially distressed by Stalin’s attack on Leon Trotsky and joined a commission to Mexico to defend him.  For his apostasy the once-acclaimed philosopher was vilified and branded by Stalinists “an enemy of the people.”  

Though Stalinists distrusted and denounced President Roosevelt, they diligently sought to infiltrate his administration in the 1930s.  Because of his prominence and influence, Harry Hopkins—appointed by FDR to head the WPA, serving as his “right-hand man during World War II,” living in the White House and accompanying Roosevelt to the major WWII conferences—doubtlessly “stands as the most sensational case among the potential Soviet agents” (p. 124).  Newly available archival evidence demonstrates that Hopkins was in fact a Soviet spy who effectively duped the president on behalf of his “buddy” Uncle Joe, and manipulated programs such as Lend-Lease to benefit Russia.  Concurrently, FDR rejected warnings regarding Stalin and followed his “hunch” that the despot could be trusted.  Relying on Hopkins’ advice toward the end of the war, he believed Stalin wanted nothing “‘but security for his country, and I think that if I give him everything I possibly can and ask nothing from him in return, noblesse oblige, he won’t try to annex anything and will work with me for a world of democracy and peace’” (p. 165).  We now know, of course, that Stalin rapidly occupied Eastern Europe after WWII and provoked a Cold War that endured for 40 years, enabling the slaughter of millions of people and devastating the economies of dozens of countries.  FDR died ignorant of the havoc resulting from the fact that “from the start to the finish of his administration, the great New Dealer was greatly trashed, hated, and duped by Communists at home and abroad” (p. 181).  

Communists always recognized the power of propaganda, so using the arts—and especially the cinema, as Lenin stressed—was imperative.  Accordingly, the shaping of Hollywood became a central objective for Soviet agents in America, and they found dozens of easily duped liberal “stars.”  Playwrights such as Lillian Hellman, Dashiell Hammett, and Arthur Miller (whose The Crucible amply illustrates the process) provided the scripts.  Singers including Paul Robeson and actors such as Katharine Hepburn, Lucille Ball, Gregory Peck and Humphrey Bogart were easily enlisted in the “progressive” (i.e. communist) cause.  Consequently, in 1947 the House Committee on Un-American Activities summoned a number of Hollywood celebrities to testify.  Some, including Ronald Reagan (head of the Screen Actors Guild and “hero” of the hearings) and Gary Cooper, were “friendly witnesses” who documented and denounced Communist activities.  Unfriendly witnesses—most notably the “Hollywood Ten” (four of whom we now know were dedicated Communists)—parroted the Party line, labeling their critics as fascists on a witch-hunt and insisting there wasn’t a trace of communism in Hollywood.  Benefiting from the support of the press and powerful Democrat politicians (e.g. Claude “Red” Pepper of Florida), the propagandized public soon believed that it was the House Committee on Un-American Activities, rather than Communists in Hollywood, which was the real threat to the nation!

When the Korean War erupted, Corliss Lamont and Lillian Hellman defended the North Koreans.  When the United States joined the conflict in Vietnam, protesters such as Tom Hayden (perennially elected from his Santa Monica base to assorted offices in California), Dr. Benjamin Spock (author of a fabulously successful book on child-rearing), Arthur “Pinch” Sulzberger Jr. (in due time publisher of the New York Times), and John Kerry (the Democrat Senator and nominee for President in 2004) easily absorbed and articulated the Communist agenda.  Violent revolutionaries such as Bill Ayers worked for the defeat of the “American Empire” and the total transformation of America.  Many of these anti-war radicals of the ‘60s ultimately infiltrated and significantly shaped the Democrat Party, where they still exert influence. 

Resisting such radicals stood Ronald Reagan, fully aware of Communist strategies since his days as head of the Screen Actors Guild.  “For years as a private citizen and candidate, Reagan had fiercely opposed the accommodationist policy of detente and spoken frankly about the true nature of Soviet Communism” (p. 366).  To him, the USSR was an “evil empire” to be confronted and defeated.  Consequently, scores of “dupes” sought to deride and destroy him.  For example, Henry Steele Commager, an influential historian, called the President’s “evil empire” speech “‘the worst presidential speech in American history, and I’ve read them all.’”  Why?  Because, said Commager, of Reagan’s “‘gross appeal to religious prejudice’” (p. 393).  Senator Ted Kennedy chastised Reagan “for ‘misleading Red-scare tactics and reckless Star Wars schemes’” (p. 403).  Yet, says Kengor:  “As we now know from a highly sensitive KGB document, the liberal icon [Kennedy], arguably the most important Democrat in the country at the time, so opposed Ronald Reagan and his policies that the senator approached Soviet dictator Yuri Andropov, proposing to work together to undercut the American president” (p. 407).  

With the demise of the USSR and the emergence of radical Islam as the great threat to America, progressive “dupes” shifted gears while preserving their fundamental ideology.  Thus they opposed President George W. Bush’s Iraq policies.  Senate Democratic leader Harry Reid called him a “loser” and prominent politicians routinely labeled him a “liar.”  “Leftists in media and academia joined politicians like [Edward] Kennedy in attacking the White House” (p. 432).  Eric Foner—a Columbia University historian, avowed socialist, and former president of the American Historical Association—declared:  “‘I’m not sure which is more frightening:  the horror that engulfed New York City or the apocalyptic rhetoric emanating daily from the White House’” (p. 432).  

The unexpected emergence of Barack Obama was quickly promoted by “Progressives for Obama,” spearheaded by Tom Hayden, the leader of the Students for a Democratic Society in the ‘60s who had recruited a number of fellow-travelers such as Daniel Ellsberg and Jane Fonda (whom he married).  Admittedly, Obama is a “progressive liberal” rather than an SDS-style Marxist.  Yet “Hayden saw in Obama a long-awaited vehicle for ‘economic democracy,’ an instrument to channel an equal distribution of wealth—‘economic justice,’ or ‘redistributive change,’ as Obama himself once put it.  Hayden said that, ‘win or lose, the Obama movement will shape progressive politics . . . for a generation to come’” (p. 469).  Though Hayden has successfully operated within the Democratic Party in California, other ‘60s radicals (Bill Ayers and his wife, Bernardine Dohrn, as well as Mark Rudd and Michael Klonsky) promote the cause within higher education.  Education, Ayers says, “‘is the motor-force of revolution’” (p. 475).  Ayers and Barack Obama worked together in Chicago to funnel money into the city’s schools so as to advance the cause of “social justice.”  Klonsky and Ayers co-authored an article that “raved about Arne Duncan, longtime head of Chicago public schools, whom the pair described as ‘the brightest and most dedicated schools leader Chicago has had in memory.’  Today Duncan is President Obama’s secretary of education” (p. 472).  

One of the aging radicals most thrilled with Obama’s election was a physician, Quentin Young, a long-term advocate of socialized medicine who had helped launch Obama’s career “in the living room of Bill Ayers and Bernardine Dohrn” (p. 477).  “Young noted that Obama, as a state senator in Illinois, had supported a ‘single-payer universal healthcare system’” that could be implemented when Democrats took complete control of Congress and the White House (p. 477).  Evaluating the 2008 election, Pravda declared “‘that like the breaking of a great dam, the American descent into Marxism is happening with breathtaking speed, against the backdrop of a passive, hapless sheep.’  That ‘final collapse,’ said the pages of the chief party organ of the former USSR, ‘has come with the election of Barack Obama’” (p. 478).  

For nearly a century communists have patiently worked behind the scenes, promoting their cause through progressive dupes.  Now, amazingly enough, in 2008 “Americans had voted CPUSA’s way:  the party could not contain its excitement over Obama’s victory.  The election of Barack Obama was the chance for a wish list to come true—a potential host of nationalizations, from the auto industry to financial services to health care, beginning with more modest steps like establishing the ‘public option’ in health-care reform, plus massive government ‘stimulus’ packages, more public-sector unionization and control, more redistribution of wealth, more collectivization.  ‘All these—and many other things—are within our reach now!’ celebrated Sam Webb in his keynote speech for the New York City banquet of People’s Weekly World, the official newspaper of CPUSA, which reprinted the speech under the headline ‘A New Era Begins.’  With the election of Obama, said Webb, the ‘impossible’ had become ‘possible’” (p. 478).  A “century of dupery” has succeeded!  

* * * * * * * * * * * * * * * * * * * 

In The Communist:  Frank Marshall Davis:  The Untold Story of Barack Obama’s Mentor (New York:  Threshold Editions, c. 2012), Paul Kengor moves from the general story told in Dupes to the very particular case of a little-known, card-carrying Communist who significantly influenced our nation by helping shape the young Barack Obama while he was growing up in Hawaii.  Kengor’s purpose, however, is not to question Obama’s ideology or agenda.  “My purpose is to show that Frank Marshall Davis—who clearly influenced Obama—was a communist member and closet member of CPUSA with private loyalties to Mother Russia” (p. 18).  The story can now be more clearly told thanks to the treasure trove of documents that became available following the collapse of the USSR.  Though Davis’s associates and pro-Soviet journalistic pieces elicited the attention of the House Committee on Un-American Activities following WWII, he and his defenders always denied he was actually a Communist.  But the propriety of the Committee’s concern has been validated by Davis’s recently-revealed admission that sometime during the war he had “‘joined the Communist party’” (p. 92).  

Kengor tells the story of Davis, from his Arkansas City, Kansas, roots to his Chicago involvement (as a journalist) in Communist causes to his final days in Hawaii.  Three pivotal years in Atlanta in the early ‘30s, witnessing the notorious Scottsboro Boys trial (nine black youths were accused of raping two white girls), exacerbated his anger at racism and his receptivity to the CPUSA’s lavishly-funded Scottsboro propaganda campaign.  Returning to Chicago and initially working for the Associated Negro Press, he dove quickly into the intellectual waters colored by the views of “dupes” such as John Dewey and Margaret Sanger.  He also interacted with both secret members of the Party (such as the singer Paul Robeson) and more open devotees, including the celebrated writers Langston Hughes and Richard Wright (who later resigned from and lamented his support for the Party).  Following the CPUSA line they supported movements such as the American Peace Mobilization and promoted “progressive” causes of various hues.  (Though self-consciously communists, they invariably insisted on using the term “progressive” to define both themselves and their “social justice” objectives.)  Davis also worked with prominent “progressive” black leaders in Chicago including Robert Taylor and Vernon Jarrett (one the maternal grandfather, the other the father-in-law, of Valerie Jarrett, widely considered President Obama’s closest friend and adviser).  And he joined Harry Canter (subsequent to his years in Moscow) and his son, David, working with newspapers to advance workers’ unions; in due time the younger Canter mentored David Axelrod, who became Barack Obama’s political guru.  

In Chicago, “Frank Marshall Davis was increasingly involved in events sponsored or covertly organized by the communist left” (p. 88), teaching a History of Jazz for the Abraham Lincoln School (widely labeled the “little Red school house”), and joining assorted communist fronts.  Fortuitously, he was enabled to freely “uncork his opinions” in the pages of “a full-blown pro-CPUSA newspaper of his own lead and editorship:  the Chicago Star” (p. 104).  He especially vilified anti-Communist statesmen such as Winston Churchill and Harry Truman, closely following the “Party line, not questioning Stalin” (p. 103).  Recruited to write for the Star were communist luminaries such as Howard Fast, the Hollywood writer who was awarded the Stalin Prize in 1953.  Florida’s leftist Senator Claude “Red” Pepper also graced the paper’s pages, promoting his favorite cause:  socialized medicine.  Pepper’s chief-of-staff, Charles Kramer, whom we now know was a Soviet agent, both “handed over important information to the USSR” and wrote a bill to create a National Health Program (p. 121).  And Lee Pressman, another Soviet agent and “close colleague” of both Kramer and Alger Hiss, added his weight to the Star’s roster of writers.  These writers, of course, “co-opted the ‘progressive’ label, claiming to be merry liberals” simply devoted to fulfilling the American dream of “social justice” (p. 125). 

Then Davis abruptly left his beloved Chicago in 1948, moving to Hawaii, where he wrote a weekly column for the Honolulu Record, a Communist paper, and worked closely with Harry Bridges, the “progressive” leader of the International Longshoremen’s and Warehousemen’s Union (ILWU).  Though oft-celebrated by the likes of Nancy Pelosi, the ILWU was “one of the most communist-penetrated and -controlled unions of the time” (p. 145).  While Davis claimed to receive no money from the Record, there is every reason to believe he was generously subsidized by the CPUSA in Hawaii, where it was hoped a “mass revolutionary movement” would establish “a satellite in the Soviet orbit” (p. 150).  With his pen Davis was strategically placed to assume a pivotal role in Stalin’s strategy in the Pacific.  So he consistently attacked President Truman, the Marshall Plan, and America’s military excursions in Asia (Korea; Vietnam).  A recently opened 600-page FBI file on Davis reveals that he also took numerous telephoto pictures of Hawaii’s shoreline.  Consequently, he was listed as a security threat on the government’s Security Index, joining a select group of folks deemed highly dangerous to the nation.  

Little actually came of the CPUSA plan for Hawaii, and an aging Frank Davis slipped into the obscurity of retirement.  Yet though he accomplished little as a journalist, he left a larger imprint on the world through his acquaintance with Stanley Dunham, Barack Obama’s grandfather, with whom he enjoyed drinking and playing poker.  As is clear in Dreams from My Father, wherein “Frank” frequently appears, young Barack Obama (desperate for male guidance) easily slipped within Davis’s sphere of influence as he sought to define himself.  “‘Away from my mother, away from my grandparents, I was engaged in a fitful struggle.  I was trying to raise myself to be a black man in America, and beyond the given of my appearance, no one seemed to know exactly what that means’” (p. 233).  But Frank Davis provided some clues—and a reading list of radicals such as Frantz Fanon.  Consequently, Kengor says:  “Frank remained a thread in the life and mind of Obama” (p. 237).  Thus, when Obama arrived in California as a college student at Occidental, he was considered a “fellow believer” by one of his then-Marxist friends, John Drew, who, in a recent interview with Kengor, recalled that “‘Obama was definitely a Marxist and that it was very unusual for a sophomore at Occidental to be as radical or as ideologically attuned as young Barack Obama was’” (p. 251).  

In sum:  “The people who influence our presidents matter” (p. 298).  To understand President Obama we need to weigh the role of his “mentor,” Frank Marshall Davis, in his formation.  The Communist thus provides essential information in evaluating him.  

240 Logic of Liberty

Michael Polanyi (1891-1976), a distinguished Hungarian chemist who immigrated to England in the 1930s and devoted his mature years to philosophical inquiry, was one of the premier thinkers of the 20th century.  His Personal Knowledge remains an epistemological classic, especially appealing to philosophical scientists such as John Polkinghorne.  Similarly significant is his The Logic of Liberty:  Reflections and Rejoinders (Chicago:  The University of Chicago Press, c. 1951), an eloquent defense of both personal and political freedom as necessary for reason and social well-being.  Appalled by the claims of reductionistic materialists—for whom thought is nothing more than mechanical or chemical reactions in the brain—he proposed, in a series of essays, to demonstrate that the freedom requisite in scientific inquiry is equally needed in other realms.  

Unfortunately, the advance of science during the past two centuries was too frequently accompanied by a philosophical Positivism that “believed implicitly in the mechanical explanation of all phenomena” (p. 11) and denied any non-empirically demonstrable truths.  Thus virtues such as courage and wisdom and standards regarding beauty and goodness were considered purely subjective preferences or social conventions.  Rightly understood, however:  “Science or scholarship can never be more than an affirmation of the things we believe in.  These beliefs will, by their very nature, be of a normative character, claiming universal validity; they must also be responsible beliefs, held in due consideration of evidence and of the fallibility of all beliefs; but eventually they are ultimate commitments, issued under the seal of our personal judgment.  To all further critical scruples we must at some point finally reply:  ‘for I believe so’” (p. 31).  Choosing what we believe to be true necessarily follows a process of free inquiry best illustrated by the scientific community.  One person seeks to solve a problem or explain a phenomenon and, having satisfied his own mind, informs others of his discovery.  His thesis is then tested by qualified scholars and, if found true, accepted by them.  A creative thinker like Einstein set forth novel ideas—but he needed the assent of his peers to establish the truth of his position.  “This unity between personal creative passion and willingness to submit to tradition and discipline is a necessary consequence of the spiritual reality of science” (p. 40).  Only free thinkers, working within a free society, can effectively advance science, as its still-birth or demise in totalitarian societies clearly reveals.  

The same is true in economics, wherein it is clear that “the central planning of production” so celebrated in socialist circles “is strictly impossible” (p. 111).  This was evident when Soviet planners tried to collectivize Russian agriculture, dictating “the scope and the kind of cultivation to be practiced on every one of the twenty-five millions of peasant farms” (p. 131).  Disastrous consequences followed—famines, rebellions, and the slaughter of recalcitrant Ukrainian kulaks.  “Lenin’s attempt to replace the functions of the market by a centrally directed economic system caused far greater devastation than the worst forms of laissez faire ever did” (p. 169).  There is in fact a “spontaneous order based on persuasion” basic to both scientific and economic development; any effort to dictate truths or policies cannot but fail in markedly destructive ways.  Setting forth sophisticated mathematical reasons, especially emphasizing the importance of the “polycentricity” profoundly evident in the “postural reflexes which keep us in equilibrium while sitting, standing or walking,” Polanyi shows why multitudes of free persons making decisions are always better than bureaucrats dictating policies (p. 176).  This truth ultimately dawned on Trotsky, one of the leaders of the Bolsheviks, who “declared that it would require a Universal Mind as conceived by Laplace to make a success of such a system” (p. 126).  

Still more:  Polanyi was distressed by two 20th century developments ultimately destructive to the good society—skepticism and utopianism, wherein “an utter disbelief in the spirit of man is coupled with extravagant moral demands” (p. 4).  Europe was ravaged by nihilistic, revolutionary humanitarians because a deep-seated “skepticism had destroyed popular belief in the reality of justice and reason” (p. 5).  Compassion became a political platform (“social justice”) rather than an individual virtue, and it “was turned into merciless hatred and the desire for brotherhood into deadly class-war” (p. 5).  This was evident in super-planners like Friedrich Engels, who declared “that men ‘with full consciousness will fashion their own history’ and ‘leap from the realm of necessity into the realm of freedom,’” thereby showing “the megalomania of a mind rendered unimaginative by abandoning all faith in God.  When such men are eventually granted power to control the ultimate destinies of their fellow men, they reduce them to mere fodder for their unbridled enterprises” (p. 199).  

* * * * * * * * * * * * * * * * * * * * * 

While better known for his classic anti-socialist manifesto, The Road to Serfdom, Friedrich A. Hayek’s magnum opus is doubtless The Constitution of Liberty (Chicago:  The University of Chicago Press, c. 1960), judged by Henry Hazlitt in a Newsweek review as “one of the great political works of our time . . . the twentieth century successor to John Stuart Mill’s essay, ‘On Liberty.’”  Though recognized and awarded a Nobel Prize as an economist, Hayek sought in The Constitution of Liberty to address “the pressing social questions of our time” and provide “a comprehensive restatement of the basic principles of a philosophy of freedom” (p. 3).  Citing John Stuart Mill, he wrote with this conviction:  “It is impossible to study history without becoming aware of ‘the lesson given to mankind by every age, and always disregarded—that speculative philosophy, which to the superficial appears a thing so remote from the business of life and the outward interest of men, is in reality the thing on earth which most influences them, and in the long run overbears any influences save those it must itself obey’” (p. 113).  

To provide a philosophical foundation for liberty is imperative, for “liberty is not merely one particular value but . . . the source and condition of most moral values” (p. 6).  He wrote with the tragic awareness that for nearly a century people around the world had been embracing “western ideals at a time when the West had become unsure of itself and had largely lost faith in the traditions that have made it what it is.  This was a time when the intellectuals of the West had to a great extent abandoned the very belief in freedom which, by enabling the West to make full use of those forces that are responsible for the growth of all civilization, had made its unprecedented quick growth possible” (p. 21).   

In the first of the book’s three parts, Hayek asserts “the value of freedom.”  Rightly defined, freedom is neither political nor metaphysical, neither the power to do things celebrated by political progressives nor the inner freedom of the will noted by theologians.  It is, quite simply, the “independence of the arbitrary will of another” (p. 12).  Individuals freely thinking and making decisions, freely cooperating with other individuals doing the same, enable civilization to develop and thrive.  Every individual is fallible and limited, so no one has the knowledge and wisdom necessary to dictate policies, but thinking and acting together we accomplish what is best for mankind.  “The argument for liberty is not an argument against organization, which is one of the most powerful means that human reason can employ, but an argument against all exclusive, privileged, monopolistic organization, against the use of coercion to prevent others from trying to do better” (p. 37).  It was aptly summarized by Cato, who Cicero says believed “the Roman constitution was superior to that of other states because it ‘was based upon the genius, not of one man, but of many; it was founded, not in one generation, but in a long period of several centuries and many ages of men.  For, said he, there never has lived a man possessed of so great a genius that nothing could escape him, nor could the combined powers of all men living at one time possibly make all the necessary provisions of the future without the aid of actual experience and the test of time’” (p. 57).  

The liberty Cato celebrated and Hayek defends developed in 18th century England, rooted in “an interpretation of traditions and institutions which had spontaneously grown up and were but imperfectly understood” (p. 54).  Unlike the utopianism evident in Rousseau and spawned by the French Revolution, culminating in various Sparta-style totalitarian democracies, British thinkers such as Adam Smith, Edmund Burke, and William Paley took an empirical, historical approach, locating “‘the essence of freedom in spontaneity and the absence of coercion’” rather than “‘in the pursuit and attainment of an absolute collective purpose’” (p. 56).  Whereas the French approach, implementing Rousseau’s reliance on the “general will,” promoted popular sovereignty and even declared that “the voice of the people is the voice of God,” British thinkers (and Americans such as Madison) sought to limit the power of passing majorities.  Thus, as Burke continually insisted, a respect for tradition safeguards freedom’s flourishing.  Unfortunately, during the 20th century “the French tradition has everywhere progressively displaced the English” (p. 55).  Even in 19th century England, Benthamite Philosophical Radicals displaced the Whigs in shaping liberalism, preparing the way for the democratic (Fabian) socialism installed by Clement Attlee in 1945.   

In socialist systems individuals cede to the state the responsibility for many things (employment, education, health care, etc.), whereas in free societies they take responsibility for their actions.  “Liberty and responsibility are inseparable” (p. 71).  Basic to liberty is “finding a sphere of usefulness, an appropriate job”—perhaps “the hardest discipline that a free society imposes on us.  It is, however, inseparable from freedom, since nobody can assure each man that his gifts will be properly used” (p. 80), and it is up to the individual person to discern and develop his talents.  Nothing we do outweighs the importance of finding and following one’s vocation, playing a productive role in the world.  Importantly:  “In a free society a man’s talents do not ‘entitle’ him to any particular position” (p. 82).  Nor does a free society insure him against failure or distress.  “All that a free society has to offer is an opportunity of searching for a suitable position, with all the attendant risk and uncertainty which such a search for a market for one’s gifts must involve” (p. 82).  

Taking responsibility for themselves, lovers of liberty resist any form of “equality” other than “equality before the law” (p. 85).  We are, as human beings, remarkably different in physique, intelligence, aptitude, ambition, inheritance, etc.  Endeavoring to eliminate such distinctions in the guise of establishing equality is to violate the natural order of things, for the “boundless variety of human nature—the wide range of differences in individual capacities and potentialities—is one of the most distinctive facts about the human species” (p. 86).  Egalitarians committed to abolishing such differences inevitably propose leveling everyone, through governmental mandates, to a common plane by redistributing the wealth, regulating behaviors, subsidizing failures, providing for old age, and excusing assorted deviancies.  “The modern tendency to gratify this passion and to disguise it in the respectable garment of social justice is developing into a serious threat to freedom” (p. 93), whereas a bona fide understanding of justice restricts it to the virtue of giving others what is due them.  “It is,” Hayek says, “one of the great tragedies of our time that the masses have come to believe that they have reached their high standard of material welfare as a result of having pulled down the wealthy, and to fear that the preservation or emergence of such a class would deprive them of something they would otherwise get and which they regard as their due” (p. 130).  

In part two of the book, “freedom and the law,” Hayek critiques all kinds of coercion.  “True coercion occurs when armed bands of conquerors make the subject people toil for them, when organized gangsters extort a levy for ‘protection,’ when the knower of an evil secret blackmails his victim, and, of course, when the state threatens to inflict punishment and employ physical force to make us obey its commands” (p. 137).   Notably, there is a difference between commands and laws!  Whereas commands negate personal freedom, good laws preserve and enable it to thrive.  Commands (whether issued by Czars in Russia or the White House) privilege cronies and impair adversaries; laws (enforced by judges) insure the even-handed enforcement of policies and adjudication of disputes.  In his Second Treatise of Government John Locke said:  “The end of the law is, not to abolish or restrain, but to preserve and enlarge freedom.  For in all the states of created beings capable of laws, where there is no law there is no freedom.  For liberty is to be free from restraint and violence from others; which cannot be where there is no law:  and is not, as we are told, a liberty for every man to do what he lists.”  Embracing Locke’s understanding, Hayek says:  “The conception of freedom under the law that is the chief concern of this book rests on the contention that when we obey laws, in the sense of general abstract rules laid down irrespective of their application to us, we are not subject to another man’s will and are therefore free.  It is because the lawgiver does not know the particular cases to which his rules will apply, and it is because the judge who applies them has no choice in drawing the conclusions that follow from the existing body of rules and the particular facts of the case, that it can be said that laws and not men rule” (p. 153).  

Lex, Rex!  Law is King!  Locke and other English (Whig) thinkers insisted that the rule of law liberates individuals.  Still more:  a written constitution and the separation of powers guarantee their freedom.  Both distinguished the United States of America at its inception.  In his History of Freedom Lord Acton noted:  “Europe seemed incapable of becoming the home of free States.  It was from America that the plain ideas that men ought to mind their own business, and that the nation is responsible to Heaven for the acts of State . . . burst forth like a conqueror upon the world they were destined to transform, under the title of the Rights of Man.”  Unlike the English, the Revolutionary colonists acknowledged that we are “endowed by our Creator with certain unalienable rights” and inscribed their convictions on paper.  Thus no one (presidents and judges and passing congressional majorities included) can arbitrarily force free Americans to submit to governmental mandates.  No one reading America’s founding documents can avoid concluding that limiting governmental power was their objective.  “Thus the great discovery was made of which Lord Acton later said:  ‘Of all checks on democracy, federalism has been the most efficacious and the most congenial. . . .  The Federal system limits and restrains sovereign power by dividing it, and by assigning to Government only certain defined rights.  It is the only method of curbing not only the majority but the power of the whole people, and it affords the strongest basis for a second chamber, which has been found essential security for freedom in every genuine democracy’” (p. 184).  The Ninth and Tenth Amendments to the Constitution accentuated this commitment.  

These Amendments were blatantly ignored by the progressive architects of the New Freedom, New Deal, Fair Deal, and Great Society in the 20th century.  In 1933, an “extraordinary” man, FDR—equipped with an “attractive voice and limited mind”—believed he knew what the country needed.  He “conceived it as the function of democracy in times of crisis to give unlimited powers to the man it trusted, even if this meant that it thereby ‘forged new instruments of power which in some hands would be dangerous’” (p. 190).  Constitutional principles were cast aside in order to “get the country moving.”  The “liberalism” of the Founding Fathers, which called for limited government, became the coercive democratic “Liberalism” of the Democrat Party.  Largely lost in that process was the liberty won in the American Revolution.  Thus 20th century America slowly moved in the direction of the European Continent, where “absolute” governments in France and Prussia had “destroyed the traditions of liberty” (p. 193) in favor of an administratively imposed socialistic “equality.” 

This process has been facilitated by intellectual developments designed to “discredit the rule of law” and support “a revival of arbitrary government” (p. 233).  Arguing the importance of “social” or “distributive” justice, posing as defenders of the “poor” and “disadvantaged,” progressives scoffed (in the words of Anatole France) “at ‘the majestic equality of the law that forbids the rich as well as the poor to sleep under bridges, to beg in the streets and to steal bread’” (p. 235).  Influential jurists, especially in post-WWI Germany, crafted a “legal positivism” that supplanted the Natural Law (dismissed as a “metaphysical superstition”) and attributed individual rights to government rather than God.  The government can, for instance, either grant or withdraw the “right to life” from selected groups such as the unborn or disabled or unproductive.  The government can either guarantee or abolish property rights.  Bolsheviks, Fascists, and Nazis all embraced and implemented this “legal positivism.”  Less brutally, socialists in England and progressives in America promulgated the same philosophy, replacing the rule of law with majority rule.  Thus Dean Roscoe Pound, one of America’s finest legal scholars, noted the “paternalistic policies of the New Deal” and declared:  “‘Even if quite unintended, the majority are moving in the line of administrative absolutism which is a phase of the rising absolutism throughout the world’” (p. 247).  

Such absolutism cannot but characterize welfare states, leading Hayek to title part three of the book “Freedom in the Welfare State.”  Without question, social reformers throughout the West have, since 1848, promoted various versions of socialism which, for a century or so, generally “meant common ownership of the means of production and their ‘employment for use, not for profit’” (p. 254).  Discredited by disillusioning developments in Russia and Germany, however, reformers committed to “social justice” rephrased and redirected their agenda to establishing a “welfare state” that redistributes income.  Personal liberties must be restricted in order to promote the general welfare, and government must use “its coercive powers to insure that men are given what some expert thinks they need,” providing for their “health, employment, housing, and provision for old age” (p. 261).  Especially by monopolizing social security and medical care—highly effective means of income redistribution—welfare states become dictatorial.  

Redistribution through taxation (e.g. the graduated income tax proposed by Marx and Engels in 1848) further typifies welfare states and is “universally accepted as just” (p. 306).  Earlier thinkers, such as J.R. McCulloch, had warned:  “‘The moment you abandon the cardinal principle of exacting from all individuals the same proportion of their income or of their property, you are at sea without rudder or compass, there is no amount of injustice and folly you may not commit’” (p. 308).  But zealous reformers saw it as a singularly effective means to achieve their vaunted “equality,” and by the dawn of the 20th century most nations had sanctioned it.  Though allegedly a way to make the wealthy bear their “fair share,” in fact it makes the “masses . . . accept a much heavier load than they would have done otherwise” (p. 311).  It effectively increases the power of the state, which is the real goal of socialists of all stripes.  

Inevitably these welfare state policies and monopolies prove inefficient and ease the slide toward bankruptcy, but that never deters the utopians promoting them.  Unfortunately, as Justice Louis Brandeis warned, dissenting in Olmstead v. United States in 1928:  “Experience should teach us to be most on our guard to protect liberty when the Government’s purposes are beneficent.  Men born to freedom are naturally alert to repel invasion of their liberty by evil-minded rulers.  The greatest dangers to liberty lurk in insidious encroachment by men of zeal, well meaning but without understanding.”  To understand Brandeis’ concern and to fully appreciate the importance of liberty as both a crucial ingredient of human nature and an ultimate good for human society, no better treatise exists than The Constitution of Liberty. 

239 “The Spirit of Munich”–Assessing Appeasement

 In The Gathering Storm (volume one of Winston Churchill’s account of the Second World War) the former prime minister penned these memorable words:  “It is my purpose, as one who lived and acted in these days, to show how easily the tragedy of the Second World War could have been prevented; how the malice of the wicked was reinforced by the weakness of the virtuous; how the structure and habits of democratic States, unless they are welded into larger organisms, lack those elements of persistence and conviction which can alone give security to humble masses; how, even in matters of self-preservation, no policy is pursued for even ten or fifteen years at a time.  We shall see how the counsels of prudence and restraint may become the prime agents of mortal danger; how the middle course adopted from desires for safety and a quiet life may be found to lead direct to the bull’s-eye of disaster.”  And his martyred contemporary, Dietrich Bonhoeffer, further declared:  “Silence in the face of evil is itself evil.  God will not hold us guiltless.  Not to speak is to speak.  Not to act is to act” (pp. 17-18).  

Few historical truths are more easily documented than this:  appeasing evil is evil!  So Bruce S. Thornton’s The Wages of Appeasement:  Ancient Athens, Munich, and Obama’s America (New York:  Encounter Books, c. 2011) deserves careful reading and reflection.  A professor of classics and humanities at California State University as well as a fellow at the Hoover Institution, the author brings both erudition and acumen to his text, laying the groundwork for his discussion with an account of ancient Athens’ failure to resist the aggressions of Philip II of Macedon.  The city’s civic virtues had decayed—as Demosthenes so eloquently declared—and with the loss of courage to defend themselves came the loss of freedom so celebrated by Pericles a century earlier.  So Athens slowly slid into obscurity and irrelevance.  

Two millennia later the Athenian approach marked Prime Minister Neville Chamberlain’s 1938 attempt in Munich to forge a “peace with honor” with Adolf Hitler.  For nearly two decades, disillusioned with the consequences of WWI, the resolve of men such as Chamberlain had softened as increasing numbers of jaded intellectuals (e.g. Bertrand Russell) and cheerful clerics (e.g. the “Red” Dean of Canterbury) embraced self-abasement, disarmament, pacifism, and a naive faith in the efficacy of internationalism.  “The ‘sniggering of the intellectuals at patriotism and physical courage,’ as [George] Orwell put it of prewar English anti-patriotism, ‘the persistent effort to chip away English morale and spread a hedonistic, what-do-I-get-out-of-it attitude to life, has done nothing but harm’” (p. 279).  One of the “sniggering intellectuals” Orwell condemned was the historian G. M. Trevelyan, who said, “‘Dictatorship and democracy must live side by side in peace, or civilization is doomed’” (p. 107).  George Lansbury, a Labour Party leader, actually admitted “he would ‘close every recruiting station, disband the Army and disarm the Air Force.  I would abolish the whole dreadful equipment of war and say to the world “do your worst”’” (p. 94).   And the world did its worst in short order.  As Mussolini and Hitler flexed their muscles—invading Ethiopia and the Rhineland—few men of courage opposed them.  Sufficiently emboldened, Hitler pursued his designs, fearing little resistance from England and France.  “‘I saw them at Munich,’ he said.  ‘They are little worms’” (p. 118).  WWII, with all its horrors, inexorably followed.

“The spirit of Munich,” said Alexander Solzhenitsyn when accepting his Nobel Prize, “has by no means retreated into the past; it was not a brief episode.  I even venture to say that the spirit of Munich is dominant in the twentieth century.  The intimidated civilized world has found nothing to oppose the onslaught of suddenly resurgent fang-baring barbarism, except concessions and smiles.  The spirit of Munich is a disease of the will of prosperous people; it is the daily state of those who have given themselves over to a craving for prosperity in every way, to material well-being as the chief goal of life on earth.  Such people—and there are many of them in the world today—choose passivity and retreat, anything if only the life to which they are accustomed might go on, anything so as not to have to cross over to rough terrain today, because tomorrow, see, everything will be all right.  (But it never will!  The reckoning for cowardice will only be more cruel.  Courage and the power to overcome will be ours only when we dare to make sacrifices)” (pp. 24-25).    

To Solzhenitsyn, America’s retreat from Southeast Asia, abandoning Vietnam just as victory was imminent, revealed the collapse of courage most visible in this nation’s intellectual and political elite.  The spirit of Munich, Thornton says, spread throughout the burgeoning anti-war and anti-American community in the ‘60s and ‘70s, seriously compromising our intelligence agencies as well as demoralizing our armed forces.  It saturated the Carter Administration, whose “appeasing response to the Iranian crisis” in 1979 opened the gates to Islamic Jihad around the world.  The architect of the Iranian revolution, the Ayatollah Khomeini, had set forth his objective in 1942:  “‘Those who study jihad will understand why Islam wants to conquer the whole world.  All the countries conquered by Islam or to be conquered in the future will be marked for everlasting salvation’” (p. 166).  He pulled no punches, declaring:  “‘Those who know nothing of Islam pretend that Islam counsels against war.  Those [who say this] are witless.  Islam says:  Kill all the unbelievers just as they would kill you!’” (p. 166).  The Koran and the sword are welded together—the book guides the faithful; the sword slays the infidels.  When the Iranians seized our Embassy and took 66 Americans hostage, President Carter equivocated and pleaded, sending a “groveling” letter assuring Khomeini of his commitment to “good relations ‘based upon equality, mutual respect and friendship’” (p. 168).  Though the hostages were released when Ronald Reagan was inaugurated, our world forever changed as radical Muslims embraced jihad.  Despite a series of attacks on American installations, neither Reagan nor Bill Clinton responded decisively, confirming Osama bin Laden’s estimation that U.S. power was “‘built on foundations of straw’” (p. 190).  Clinton, who regularly “wilted” when faced with politically risky decisions, responded to al Qaeda’s attack on the U.S.S. Cole by ordering “all U.S. Navy vessels to head for the safety of open waters and to avoid using the Suez Canal” (p. 197).  

Then came September 11, 2001!  A very different president, George W. Bush, responded quite differently, committing American troops to wars in Afghanistan and Iraq.  But despite initial support, as the wars dragged on President Bush had to deal with hordes of increasingly shrill critics—mainly Democrat luminaries such as Al Gore, Barbara Boxer, Jimmy Carter and Howard Dean, who in 2004 commandeered anti-war throngs by spouting “Marxist clichés about ‘imperialism’ and ‘colonialism’ and the evils of capitalism” (p. 203).  Joining the anti-war brigade, Barack Obama “campaigned on a foreign policy predicated on moralizing internationalism, a preference for diplomacy and transnational institutions, a focus on human rights and foreign development, and the assumption that the United States was flawed and in need of some humility after the reckless aggression and oppressive practices of the Bush administration” (p. 214), signaling “a return to the Carter philosophy that had helped put in power an Islamist regime in Iran and ignited a Soviet global expansion” around the world (p. 216).  “Thus we rationalize away the jihadists’ careful justifications for their violence in the theology of Islam and seek to ameliorate what we think are the true causes—poverty, lack of political participation, or historical injustices—rather than realizing that those who believe they are divinely sanctioned to kill others will not be talked or bribed out of their beliefs, but can only be destroyed” (p. 280).  

So there is every reason to acknowledge that Obama, committed as he is to an “outreach” to Muslims, personifies the “spirit of Munich.”  He appeases the Islamists just as Chamberlain appeased the Nazis, even insisting that administrative officials—and journalists—sanitize their language in order to falsely portray Muslims (at home and abroad) as “peace-loving” and jihad as “spiritual improvement rather than violence against the enemies of Islam” (p. 255).  In his repeated laments for the sins of the West, he simply ignores the historical evidence that “over the centuries Muslims have conquered, killed, ravaged, plundered, and enslaved Christians and occupied their lands in vastly greater numbers than all the dead resulting from European colonial incursions or America’s recent wars in Muslim lands put together” (p. 263).  Facing such a hostile world, Thornton insists, America must respond in ways atypical of the democracies led by men such as Chamberlain and Obama, which almost always make short-sighted, self-serving, emotionally-based decisions.  

* * * * * * * * * * * * * * 

Markedly different from the spirit of Munich so evident in today’s progressive intelligentsia was the “Iron Lady,” Margaret Thatcher!  As Claire Berlinski makes clear in “There is No Alternative”:  Why Margaret Thatcher Matters (New York:  Basic Books, c. 2011), appeasement was not in her genes!  Rather, she demonstrated the kind of courage that only comes from deeply-held convictions, convictions established in her early years as she attended a Methodist church.  In contrast to those unprincipled politicians who forever float with the winds of popular opinion, Lady Thatcher refused to bend when principles such as freedom and justice were at stake.  To her there were never two sides to an issue—there was only one, the right side!  In a remarkable statement, responding to those who urged compromise and “consensus,” she declared:  “To me consensus seems to be:  the process of abandoning all beliefs, principles, values and policies in search of something in which no one believes, but to which no one objects; the process of avoiding the very issues that have to be solved, merely because you cannot get agreement on the way ahead.  What great cause would have been fought and won under the banner ‘I stand for consensus’?” (Downing Street Years, p. 167).

In particular, she “was one of the most vigorous, determined, and successful enemies of socialism the world has known” (p. 5).  Living in a “weakly socialist nation” slowly shaped over the decades by the Labour Party following Fabian strategies, she smelled Britain’s festering decadence and refused to sanction the “basic immorality” of socialism, believing that “socialism itself—in all its incarnations, wherever and however it was applied—was morally corrupting.  Socialism turned good citizens into bad ones; it turned strong nations into weak ones; it promoted vice and discouraged virtue; and even when it did not lead directly to the Gulags, it transformed formerly hardworking and self-reliant men and women into whining, weak and flabby loafers.   Socialism was not a fine idea that had been misapplied; it was an inherently wicked idea” (pp. 7-8).  

She emerged as a political powerhouse with a speech to the Conservative Party in 1976, calling for a return to free enterprise economics, a repudiation of the Labour Party which was then “‘committed to a program which is frankly and unashamedly Marxist,’” a shift which would bring about “‘the rebirth of a nation, our nation—the British nation’” (p. 68).  She urged her party to launch a crusade by appealing “‘to all those men and women of goodwill who do not want a Marxist future for themselves or their children or their children’s children.  This is not just a fight about national solvency.  It is a fight about the very foundations of the social order.  It is a crusade not merely to put a temporary brake on socialism, but to stop its onward march once and for all’” (p. 69).  Citing Shakespeare’s rendition of King Henry V’s words before the pivotal Battle of Agincourt, she concluded:  “‘As was said before another famous battle:  “It is true that we are in great danger; the greater therefore should our courage be”’” (p. 69).  

And the courage of King Henry was needed when the Iron Lady came to power in 1979, for Britain was widely regarded as the Sick Man of Europe (a “disgrace,” in the opinion of Henry Kissinger), “sunk to begging, borrowing, stealing” (p. 10).  Little remained of the nation that a century earlier had proudly orchestrated the Pax Britannica, ruling one fourth of the world.  “It was the world’s undisputed premier naval power; it controlled the world’s raw materials and markets; it had long been the world’s leading scientific and intellectual power; it was the financial center of the world and the premier merchant carrier; it had invented the Common Law; it had invented modern parliamentary democracy” (p. 9).  Only faded remnants of such glory days remained.  To Thatcher, Britain’s decline was “a punishment for the sin of socialism” installed by her countrymen in 1945 when they elected Clement Attlee prime minister and established an enveloping welfare state.  Having defeated the Nazis in war, they surrendered to the socialists in peace!  She thought that “in 1945 the good and gifted men and women of Britain had chosen a wicked path.  They had ceased to be great because they had ceased to be virtuous.  In ridding Britain of socialism, she intimated, she would restore it to virtue.  She would make it once again worthy of greatness” (p. 13).  Consequently:  “Hatred of communism, hatred of Marxism, hatred of socialism—and an unflinching willingness to express that hatred in the clearest imaginable terms—was the core of Thatcherism” (p. 47).  

“Thatcherism,” Berlinski says, was rooted in the religiously devout, industrious middle-class rearing that nurtured Margaret Thatcher.  Her father was a Wesleyan lay preacher whose influence and convictions shaped her.  As a child she began speaking publicly by reading passages in church, sharing in her father’s ministry and resolving to follow his example, frequently citing the Scriptures in her political pronouncements.  Accordingly:  “She did what was right, she did what was right, she did what was right.  She did it because her father told her to” (p. 21).  Her education (a chemistry degree from Oxford, supplemented by a law degree earned while working as a research chemist) equipped her.  Her marriage, to a prosperous businessman, Denis Thatcher, sustained her.  But no external factors fully explain her!  She was, quite simply, a remarkable woman with sharply-honed political skills who guided Britain through her “longest sustained period” of “economic expansion of the postwar era” (p. xix).  Using extensive interviews with her allies and enemies, Berlinski enables us to better appreciate Thatcher’s genius and success.  

Early on, as Prime Minister, she refused to back down to Argentina and successfully waged a 1982 war to maintain British control of the Falkland Islands.  “Without this victory, it is unlikely that the Thatcher Revolution could have occurred” (p. 158).  Standing firm and winning the war greatly enhanced the prestige of both Thatcher and her country, building the popular support she needed to win “a massive victory in the 1983 general election” (p. 179) and subsequently to deal with domestic issues.  “‘We had to fight the enemy without in the Falklands,’ she said.  ‘We always have to be aware of the enemy within, which is much more difficult to fight and more dangerous to liberty’” (p. 183).  The foremost “enemy within,” by all odds, was the thoroughly Marxist trade unions, effectively strangling the British economy.  So Thatcher defied and denatured them, despite a long and bloody coal miners’ strike.  She did so by “stockpiling coal, training the military to drive trains in the event of a sympathy strike by the railway workers, accelerating the development of nuclear power, importing electricity by cable from France, and refurbishing coal-fired power stations to permit them to run on oil” (p. 208).  

Thatcher’s opposition to Marxism included a deep hostility to the USSR, which, when she became Britain’s prime minister in 1979, “appeared to be not only invincible, but ascendant” (p. 270).  In her 1976 speech to her party, she boldly announced her convictions, saying:  “‘I stand before you tonight in my Red Star chiffon evening gown, my face softly made up and my fair hair gently waved, the Iron Lady of the Western World.  A Cold War warrior, an Amazon philistine, even a Peking plotter.  Am I any of these things?  Well, yes, if that’s how they wish to interpret my defense of values and freedoms fundamental to our way of life’” (p. 263).  Undeterred by her critics, impervious to their poison darts, she gladly joined Ronald Reagan in doing whatever possible to throttle Communism wherever it appeared in the 1980s.  “Publicly, Thatcher—and only Thatcher, among the leaders of the world—supported Reagan unwaveringly, despite massive domestic and international pressure to do otherwise” (p. 273).  By doing so, she earned Reagan’s enduring gratitude and friendship.  Together they successfully dealt with a new kind of Communist, Mikhail Gorbachev, whom they both liked, and in short order the Soviet Union collapsed.  

Reviewing the book, Peter Schweizer, the author of Reagan’s War, concluded:  “Finally the Iron Lady gets her due.  Claire Berlinski brilliantly lays out how Margaret Thatcher’s strength and conviction changed the world.  Without a Prime Minister Thatcher there might not have been a President Ronald Reagan.  And Berlinski reminds us how the whole world would benefit from a new Thatcher today.”  

* * * * * * * * * * * * * * * * * * * 

In Obama’s Globe:  A President’s Abandonment of U.S. Allies Around the World (New York:  Beaufort Books, c. 2012), Bruce Herschensohn places the nation’s foreign policy within the context of the past half-century, showing how the Carter and Obama administrations failed to rightly understand and handle challenges abroad.  Unlike FDR and JFK, Ronald Reagan and George W. Bush, all of whom boldly pressed for victory over our foes, presidents Carter and Obama tried to appease the nation’s enemies and in the process deserted our friends.  “Carter’s abandonment of El Salvador and Nicaragua,” for example, “ended with a fall of those two governments that cost over 70,000 deaths of Central Americans fighting against Soviet proxies who had taken advantage of the opportunity given to them” (p. 27).  He lacked what JFK called “the stuff of presidents,” envisioning himself as a purveyor of “human rights,” a peace-maker rather than a commander-in-chief.  

Following the Carter approach, President Obama annulled agreements with Poland and the Czech Republic in order to curry favor with Russia.  Distancing himself from Britain, Obama (through his Secretary of State Hillary Clinton) proclaimed America’s neutrality regarding the ongoing dispute between Argentina and the United Kingdom over the Falkland Islands.  In Tunisia and Egypt, Obama shunned long-term allies, opening the gates for Islamists (notably the Muslim Brotherhood) to take control of Arab nations.  To Herschensohn:  “The celebration of Mubarak’s fall was much more reminiscent of 1979’s fall of the Shah of Iran and the welcome of the Ayatollah Khomeini” (p. 60).  And in Iran, when millions took to the streets demanding liberty from the oppression imposed by Khomeini and his heirs, Obama refused to give even verbal support!  So too in Syria—Obama has feebly protested the genocidal regime of Bashar al-Assad but done nothing of substance to overthrow him.  His policy—termed “leading from behind” by a White House advisor—illustrates timidity and constant concern for his own political position.  Obama’s treatment of Israel further illustrates his abandonment of American allies around the globe.  By consistently referring to the West Bank and Gaza Strip as land “occupied by Israel,” and by announcing, on May 19, 2011, his desire to return to the 1967 boundaries between Israel and Palestine (a non-existent state), the president made what Herschensohn calls “one of the worst, if not the very worst statement made by any U.S. President regarding a friendly nation that won a war” (p. 89) and egregiously distanced himself from our only ally in the Middle East.  

Wherever Herschensohn looks (and he deals with many more areas than I’ve indicated), President Obama seems committed to appeasing our enemies and abandoning our allies, hoping the world will be safer without American leadership.  

238 Bad Religion, Toxic Charity, Dead Aid

 To St. Augustine:  “Prudence is love choosing wisely between the things that help and those that hinder” (De Morib. Eccl. xv).  Contrary to the Beatles’ message, love is not all you need, for many well-motivated acts do much harm.  So cautionary tales, such as Ross Douthat’s Bad Religion:  How We Became a Nation of Heretics (New York:  Free Press, c. 2012), should provoke serious reflection on what we are actually doing in our religious life.  Douthat is the youngest-ever op-ed columnist for The New York Times as well as a practicing, traditional Catholic who writes with a deep concern for the current and future well-being of Christianity in America.  He argues:   “America’s problem isn’t too much religion, as a growing chorus of atheists have argued; nor is it an intolerant secularism, as many on the Christian right believe.  Rather, it’s bad religion:  the slow-motion collapse of traditional faith and the rise of a variety of pseudo-Christianities” (p. 3).  Clearly “most Americans are still drawing some water from the Christian well.  But a growing number are inventing their own versions of what Christianity means, abandoning the nuances of traditional theology in favor of religions that stroke their egos and indulge or even celebrate their worst impulses” (p. 4).  Consequently, we are less a Christian nation than a nation of heretics!  

Knowing the word “heretic” is a loaded term, Douthat takes Alister McGrath’s definition for his own:  “‘a form of Christian belief that, more by accident than design, ultimately ends up subverting, destabilizing or even destroying the core of Christian faith’” (p. 9).  Its converse is the historic orthodoxy defined in ecumenical creeds that have distinguished conservative Christians (both Catholic and Protestant) for centuries.  During the past half-century, however, such orthodoxy has virtually disappeared.   Douthat documents the vigorous health of America’s churches following WWII—churches and seminaries overflowed; preachers such as Billy Graham and Fulton Sheen effectively reached millions with a soul-saving gospel; serious thinkers and writers such as Reinhold Niebuhr and Jacques Maritain, Walker Percy and C.S. Lewis provided compelling intellectual guidance.  It was, quite simply, an American age of faith.

“The crucial element” in this era, Douthat says, “was a deep and abiding confidence:  not just faith alone, but a kind of faith in Christian faith, and a sense that after decades of marginalization and division, orthodox Christians might actually be on the winning side of history.”  The churches “at midcentury offered believers a relatively secure position from which to engage with society as a whole—a foundation that had been rebuilt, as we have seen, rather than simply inherited, and that seemed the stronger for it” (p. 53).  “For a fleeting historical moment, it seemed as though the Christian churches might” in fact “become something more like what the Gospels suggested they should be:  the salt of the earth, a light to the nations, and a place where even modern man could find a home” (p. 54).  

With the rapidity of a punctured balloon, however, this burgeoning religious world deflated in “the locust years” of the ‘60s and ‘70s.  Despite desperate attempts to soften standards and accommodate cultural trends—especially regarding sex and marriage, abortion, euthanasia, and women’s ordination—the “Protestant Mainline’s membership stopped growing abruptly in the mid-1960s and then just as swiftly plunged” (p. 58).  As if sharing the same harness, the post-Vatican II Catholic Church dramatically lost priests, monks, nuns, schools, and mass-attendees.  “Only what Dean Kelley described as the ‘conservative churches’ bucked these trends” (p. 60), though in general “the heretics carried the day completely.  America in those years became more religious but less traditionally Christian; more supernaturally minded but less churched; more spiritual in its sentiments but less pious in its practices” (p. 64).  Reflecting this societal shift, a surging “dismissive attitude” triumphed in the nation’s elite institutions—universities, media, bureaucracies—so that by the century’s end “the tastemakers and power brokers and intellectual agenda setters” snidely dismissed orthodox Christianity as “completely déclassé” (p. 82). 

Churches zealously accommodating to the culture—substituting a message of “social justice” for personal redemption, replacing theology with sociology, embracing Harvey Cox’s prescriptions in The Secular City—were the biggest losers as their seminaries and congregations quickly shrank.  They perfectly illustrated Dean William Ralph Inge’s dictum:  “He who marries the spirit of the age is soon left a widower.”  Claiming to function “in the spirit of Vatican II,” accommodating Catholics (especially in universities, religious orders and liturgical committees) quickly distanced themselves from embarrassing vestiges of antiquity.  By the mid-‘80s, one scholar noted that “‘the dismantling of traditional Roman Catholic theology, by Catholics themselves, is by now a fait accompli’” and seminarians were taught “‘that Jesus of Nazareth did not assert any of the divine or messianic claims the Gospels attribute to him and that he died without believing he was Christ or the Son of God, not to mention the founder of a new religion’” (p. 100).  

Resisting the spirit of the age, of course, were believers who dared to be somewhat old-fashioned, and “it became increasingly clear that what vitality remained in American Christendom was being sustained by the unexpected alliance between Evangelicals and Catholics” (p. 115).  Their stance was visibly present in the mesmeric Pope John Paul II, who, George Weigel says, “‘did not propose to surrender to modernity.  He proposed to convert it’” (p. 119).  “In effect, John Paul made his pontificate a rallying point for the resistance to the redefinition of Christianity.  And rally many Catholics did” (p. 120).  They were joined in that endeavor by Evangelicals such as Francis Schaeffer, who early urged his readers to oppose the culture of death and deftly critiqued many of the threats posed by modernity.  

Turning from his historical assessment, Douthat points out various heresies now captivating Christianity in America.  There is, first, the effort to supplement the New Testament canon.  Accommodationist scholars such as John Dominic Crossan, Bart Ehrman, and Elaine Pagels, employing historical criticism and generally discarding any notion of biblical inspiration, propose adding various “gospels” to the canon, reverting to variants of the Gnosticism condemned by the early Church.  Their views are invariably non-judgmental and tolerant of all sexual orientations, promoting self-esteem and the political agenda of the Democrat Party.  But ultimately, “whether we end up with Jesus the Gnostic mystic, the Cynic philosopher, the proto-feminist, or the apocalyptic prophet—the present-day theological implications of his ‘real’ identity usually turn out to look a lot like the accommodationist Christianity of the Protestant Mainline” (p. 161).  Traditional orthodoxy, rooted in the thought of St. Paul, is discarded by emulating rather than worshipping Christ, treating the crucifixion as an example of brotherly love, and reducing the resurrection to a psychological insight.  Equally attuned to the spirit of the age—and equally heretical—popular, entrepreneurial preachers such as Joel Osteen promote a God who “gives without demanding, forgives without threatening to judge, and hands out His rewards in this life rather than in the next” (p. 183).  The Houston megachurch pastor has effectively refashioned “Christianity to suit an age of abundance, in which the old war between monotheism and money seems to have ended, for many believers, in a marriage of God and Mammon” (p. 183).  Douthat cites a study “suggesting that 50 of the 260 largest churches in America now preach prosperity theology” (p. 192).    

Even less attached to historical Christianity are various New Age spokesmen, such as Elizabeth Gilbert, who promote a mystical, pantheistic “God within” who “‘dwells within you as you yourself, exactly the way you are’” (p. 230).  The author of the phenomenally successful Eat, Pray, Love, Gilbert felt inspired to leave her husband and travel the world in search of personal enlightenment, carving out for herself a satisfying religion.  “‘You have every right,’” she writes, “‘to cherry-pick when it comes to moving your spirit and finding peace in God’” (p. 214).  Whatever works for you must be true!  The God within speaks to Gilbert in her own voice—the same message “preached by a cavalcade of contemporary gurus, teachers, and would-be holy men and women” as well as the same “theology that Elaine Pagels claims to have rediscovered in the lost gospels of the early Christian Church” (p. 215).  Fittingly, “the greatest popularizer of God Within theology” is Oprah Winfrey, effectively (and profitably) using her TV empire to spread the message.  “It’s the church of the Oprah Winfrey Network, you might say:  religion as a path to constant self-affirmation, heresy as self-help, the quest for God as the ultimate form of therapy” (p. 230).  Needless to say, the God of the New Age resembles neither the Yahweh of the Jews nor the Holy Trinity of the Christians.    

To “recover” Christianity in America, Douthat urges and prays for revival—a return to the kind of good religion so brilliantly set forth in G.K. Chesterton’s Orthodoxy and C.S. Lewis’s Mere Christianity.  He hopes “to persuade even the most skeptical reader that traditional Christian faith might have more to offer this country than either its flawed defenders or its fashionable enemies would lead one to believe” (p. 293).  It should be:  1) “political without being partisan;” 2) “ecumenical but also confessional;” 3) “moralistic but also holistic;” and, 4) “oriented toward sanctity and beauty.”  “We are waiting, not for another political savior or television personality, but for a Dominic or a Francis, an Ignatius or a Wesley, a Wilberforce or a Newman, a Bonhoeffer or a Solzhenitsyn.  Only sanctity can justify Christianity’s existence; only sanctity can make the case for faith; only sanctity, or the hope thereof, can ultimately redeem the world” (p. 292).  

* * * * * * * * * * * * * * * * * * * 

In Toxic Charity:  How Churches and Charities Hurt Those They Help (And How to Reverse It) (New York:  HarperOne, c. 2012), Robert D. Lupton offers invaluable advice regarding compassionate ministries.  Having worked for more than 40 years in inner-city Atlanta and studied projects around the world, he is determined to discover how best to help the needy.  Early in his work he joined a group of sincere believers giving Christmas gifts to a needy family.  While the gifts were being opened he noticed the father of the family quietly slipping away, obviously humiliated by the fact he was unable to buy toys and clothes for his own family.  To Lupton that incident provided a key to ministry:  giving care without providing a cure cannot be right.  Providing momentary assistance without orchestrating lasting development cannot be wise.  

Unfortunately, though we Americans are quite charitable, “much of that money is either wasted or actually harms the people it is targeted to help” (p. 1).  “Take Haiti, for example.  No other country in the Western Hemisphere has received more charitable aid and services from governments and nonprofits.  Yet its poverty and dysfunction continue to deepen” (p. 36).  So too in America:  “For all our efforts to eliminate poverty—our entitlements, our programs, our charities—we have succeeded only in creating a permanent underclass, dismantling their family structures, and eroding their ethic of work.  And our poor continue to become poorer” (p. 3) as we promote “disempowering charity through our kindhearted giving.  And religiously motivated charity is often the most irresponsible” (p. 4).  

This is particularly evident in many “mission trips,” sending groups of teenagers or young adults to impoverished areas around the globe.  In 2006, 1.6 million American Christians took such trips, spending $2.4 billion.  However:  “The money spent by one campus ministry to cover the costs of their Central American mission trip to repaint an orphanage would have been sufficient to hire two local painters and two new full-time teachers and purchase new uniforms for every student in the school” (p. 5).  Clearly these folks sought to uplift the impoverished.  Without question they learned something from their endeavor.  But the ultimate, too often unasked and unanswered question is this:  did they actually help the people they “helped”?  Lupton insists they did not.  Such trips effectively harm the poor, discouraging their work ethic and promoting a demeaning dependency.  In fact, they do little more than polish the self-image of the helpers!  Too easily we forget this axiom:  “Little affirms human dignity more than honest work.  One of the surest ways to destroy self-worth is subsidizing the idleness of able-bodied people” (p. 151).  

Lupton records a conversation with Juan, the Nicaraguan director of Opportunity International, who lamented that “entrepreneurship declines as dollars and free resources flood in, how people become conditioned to wait for the next mission group to arrive instead of building their businesses through their own efforts.  He talked about how dignity is eroded as people come to view themselves as charity cases for wealthy visitors, how they pose with smiling faces for pictures to be taken back for the marketing of the next group.  ‘They are turning my people into beggars,’ Juan said” (p. 21).  He discovered what Jacques Ellul declared, in Money and Power:  “‘It is important that giving be truly free.  It must never degenerate into charity, in the pejorative sense.  Almsgiving is Mammon’s perversion of giving.  It affirms the superiority of the giver, who thus gains a point on the recipient, binds him, demands gratitude, humiliates him and reduces him to a lower state than he had before’” (p. 34).  

The same occurs in poverty-stricken neighborhoods in America.  In 1991, Jimmy Carter launched the Atlanta Project, “the largest private antipoverty initiative in Atlanta and the boldest effort of its kind in the country” (p. 87).  A massive organization, relying on the best and brightest scholars, promised to transform the city.  Hundreds of folks were hired and dozens of offices were opened, offering various kinds of training and financial aid.  But in a few years little remained of the Carter initiative.  Its “greatest achievement,” a Stanford University analysis concluded, was “‘consolidating application forms for social services from sixty-four pages to eight.  All of this for $33.6 million’” (p. 92).  An alternate approach was taken by some entrepreneurs who bought an aging golf course adjacent to an impoverished section of Atlanta.  Investing wisely and rebuilding shrewdly, they developed a world-class course, attracting well-heeled competitors.  In the process, small businesses opened in the adjoining neighborhood and scores of jobs were afforded residents.  A thriving eddy of prosperity spread its goodness in dramatic contrast to the Carter project.  

Given the problem of toxic charity, Lupton suggests benevolent organizations take “The Oath for Compassionate Service:  

  • Never do for the poor what they have (or could have) the capacity to do for themselves.
  • Limit one-way giving to emergency situations.
  • Strive to empower the poor through employment, lending, and investing, using grants sparingly to reinforce achievements.
  • Subordinate self-interests to the needs of those being served.
  • Listen closely to those you seek to help, especially to what is not being said—unspoken feelings may contain essential clues to effective service.
  • Above all, do no harm” (p. 9).  

As is evident in this oath, Lupton consistently concentrates on practical, common sense solutions to poverty.  Anyone concerned to wisely invest either time or money in charitable endeavors will profit from a careful study of this book.  

* * * * * * * * * * * * * * * * * 

In Dead Aid:  Why Aid is Not Working and How There is a Better Way for Africa (New York:  Farrar, Straus and Giroux, c. 2009), Dambisa Moyo, a Zambian economist with an earned doctorate from Oxford, demonstrates the deadening influence of the massive amounts of “aid” given African nations.  Whether distributed by international organizations (e.g. the World Bank), nations (e.g. the U.S.), or humanitarians (such as the rock star Bono), billions of dollars have harmed their recipients.  In the five decades following the demise of European colonialism, more than US$2 trillion has been showered on “developing” nations, primarily in Africa, reflecting the desire of the rich to help the poor.  And what has happened?  “Aid has helped make the poor poorer and growth slower” (#218 in Kindle edition).  Indeed, aid “has been, and continues to be, an unmitigated political, economic, and humanitarian disaster for most parts of the developing world” (#224).  This is not true of “emergency” or “charity-based aid” sent by organizations such as World Vision to alleviate a famine or provide clean water for villages!  Her criticism is directed at the “billions transferred each year directly to poor countries’ governments” (p. 8) through grants and loans.  

Moyo provides a succinct history of post-WWII efforts to lift impoverished peoples into prosperity.  Decade by decade, a variety of strategies, mediated through multitudinous organizations, have been tried and proven unsuccessful, basically because they propped up “client regimes” controlled by corrupt dictators while rendering the alleged recipients of this largesse dependent and dispirited.  “Vast sums of aid not only foster corruption—they breed it” (p. 52).  Sadly enough:  “One of the most depressing aspects of the whole aid fiasco is that donors, policymakers, governments, academicians, economists and development specialists know, in their heart of hearts, that aid doesn’t work, hasn’t worked and won’t work.  Commenting on at least one aid donor, the Chief Economist at the British Department of Trade and Industry remarked that ‘they know it’s crap, but it sells the T-shirts’” (p. 46).  

The few African exceptions to this pattern, such as Botswana, “succeeded by ceasing to depend on aid” (p. 38).  And the model for helping impoverished nations shines forth in places such as India and Chile, where an emerging middle class, practicing self-reliance and encouraging free markets, has replaced aid-dependency.  Amazingly, representatives from China, making investments, building roads, seeking raw materials, and establishing profitable industries, have done far more good for Africa than Western aid!  “The mistake the West made was giving something for nothing.  The secret of China’s success is that its foray into Africa is all business.  The West sent aid to Africa and ultimately did not care about the outcome; this created a coterie of elites and, because the vast majority of people were excluded from wealth, political instability has ensued” (p. 152).  “China, on the other hand, sends cash to Africa and demands returns.  With returns Africans get jobs, get roads, get food, making more Africans better off, and (at least in the interim) the promise of some semblance of political stability.  It is the economy that matters” (p. 152).  

Consequently, Moyo proposes market-based solutions for Africa’s problems.  As “Senegal’s President Wade remarked in 2002:  ‘I’ve never seen a country develop itself through aid or credit.  Countries that have developed—in Europe, America, Japan, Asian countries like Taiwan, Korea and Singapore—have all believed in free markets.  There is no mystery there.  Africa took the wrong road after independence’” (p. 149).  And multiplied millions of people have suffered the consequences.  

237 “Hit the Road, Barack”

   A recent issue of Newsweek featured a cover story by a distinguished Harvard historian, Niall Ferguson, titled “Hit the Road, Barack,” setting forth a substantial argument for the current president’s speedy retirement.  I share Ferguson’s position, largely as a result of reading, during the past four years, 40-plus books by and about President Barack Obama (cf. the appended sources).  Most of them, like prosecutors’ briefs, critique the president and are generally polemical in nature.  It’s important to remember, however, that a prosecutor’s brief, despite its imbalance, may be eminently accurate and trustworthy; a passionate polemic, despite its extreme wording, may present reliable information and highlight wrongful behavior.  For example, in 1964 J. Evetts Haley, a distinguished Texas historian with considerable personal experience in state politics, published A Texan Looks at Lyndon:  A Study in Illegitimate Power.  Many urbane reviewers cavalierly dismissed its intensely negative portrait of President Johnson.  Fifty years later, however, in light of Robert Caro’s exhaustive multi-volume biographical work (detailing Lyndon Johnson’s power-pursuits and machinations), much that Haley said seems spot-on.  Had voters in 1964 believed Haley and repudiated LBJ, the nation would have been spared much agony, most particularly the trauma resulting from his Vietnam War strategies and the trillions of wasted dollars concurrently poured into his Great Society boondoggles.    

So I take many of Barack Obama’s critics seriously, particularly when there appears to be a clear consensus among them regarding the evidence regarding his background and philosophy, his maneuvers and accomplishments.  In this essay I will note several areas wherein I join Obama’s critics in finding him unworthy of re-election.   

1.  AN ILL-PREPARED AND INCOMPETENT CHIEF EXECUTIVE 

Doubts about Obama’s executive competence have been amply confirmed during the past four years.  Not only had he never run a business—he hadn’t even chaired an academic department or legislative committee before entering the White House!  His role as a community organizer was limited to orchestrating protests without establishing effective structures or lasting solutions.  He is, as former President Bill Clinton allegedly declared, an inept “amateur”—the title of a book by Edward Klein, the one-time foreign affairs editor of Newsweek and former editor-in-chief of the New York Times Magazine.  Hillary Clinton too, according to some sources, has found Obama frustrating to work with and is “fed up with ‘a president who can’t make up his mind.’”  After hundreds of interviews and wide-ranging research, Klein concluded Obama is “at bottom temperamentally unsuited to be the chief executive and commander in chief of the United States.”  Similar illustrations revealing a president frequently “indecisive and dilatory” appear in Richard Miniter’s persuasive case in Leading from Behind:  The Reluctant President and the Advisors Who Decide for Him.  

In Debacle:  Obama’s War on Jobs and Growth and What We Can Do Now to Regain Our Future, Grover G. Norquist (president of Americans for Tax Reform) and John R. Lott, Jr. (a distinguished economist who has taught at some of the nation’s elite universities), massively and minutely document the president’s failure to deal with what’s often called the Great Recession.  Shortly before Obama’s inauguration, Democrat Senator Byron Dorgan, evaluating the new economic team, warned:  “You’ve picked the wrong people!”  Thus Obama made promises impossible to deliver regarding job creation and economic recovery, rammed through a stimulus bill that subsidized his allies while ultimately harming the nation, and plunged us into almost unfathomable debt.  Rather than helping America recover from the recession, Barack Obama (duplicating FDR’s failures in the 1930s) has aggravated and prolonged it.  

Truth to tell, “the man we have elected president of the United States doesn’t know what he’s doing,” says David Gelernter, a Yale computer science professor and Orthodox Jew who portrays Obama as the personification of the educational failures now plaguing schools shaped by the ’60s generation.  Writing in America-Lite:  How Imperial Academia Dismantled Our Culture (and Ushered in the Obamacrats), Gelernter insists the president speaks for the mindless “adversary culture” incubated in the elite universities (Occidental; Columbia; Harvard) he attended, where he absorbed the views of PORGIs (post-religious, global intellectuals).  Obama is, consequently, “merely a mouth for garden-variety left-liberal ideas—a mouth of distinction, a mouth in a million, but a mere mouth just the same.  He is important not as a statesman but as a symptom, a dreadful warning.  He is important not because he is exceptional but because he is typical.  He is the new establishment; he represents the post-cultural revolutionary PORGI elite.”  By temperament and training an able agitator, the president lacks the statesmanlike skills needed to lead this nation.    

2.  AN IDEOLOGICAL LEFTIST 

Mounting evidence points to Barack Obama’s leftist convictions and agendas.  In his formative years virtually every influential person in his life promoted such notions.  As Dinesh D’Souza shows, he deeply admires his Kenyan father, Barack Sr., an anti-colonial socialist.  His mother, something of a poster girl for the atheistic, anti-American countercultural ethos of the ’60s, embraced various left-wing utopian causes throughout her life.  His primary philosophical mentor in Hawaii, Frank Marshall Davis (as Paul Kengor recently demonstrated in The Communist), was an avid Stalinist.  The books he read and the professors he admired at Occidental and Columbia were frequently Marxist.  The radical circles (e.g. unrepentant Weathermen terrorists Bill Ayers and Bernardine Dohrn) within which he moved in Chicago were markedly shaped by Saul Alinsky, the noted socialist organizer who wrote Rules for Radicals.  Jeremiah Wright, the pastor he frequently lauded, was a fiery advocate of a Marxist-rooted liberation theology.  His chief political guru, David Axelrod, was mentored by Harry and David Canter; the elder, Harry, spent time in the ’30s in Moscow translating Lenin’s works.  Thus Axelrod has deep roots in radical Chicago circles, consistently espouses leftist ideals, and exemplifies leftist strategies.   

Though muted and disguised with rhetorical skill, Obama’s leftist agenda became evident during his first term as he sought to centralize power and expand the welfare state, today’s vehicle for attaining traditionally socialist objectives.  Through legislation (Obamacare; Dodd-Frank) and regulation (EPA edicts; NLRB policies) he has dramatically departed from the clear federalism of the Constitution.  Vowing, on the eve of his election, to “fundamentally transform” this nation, the president has consistently worked to establish a European-style social democracy and “spread the wealth around.”  Consummating a century of progressivism running through Woodrow Wilson, Franklin D. Roosevelt, and Lyndon B. Johnson, Barack Obama presides over what Mark R. Levin describes (in Ameritopia) as “a post-constitutional, democratic utopia of sorts.  It exists behind a Potemkin-like image of constitutional republicanism.”  Manifestly, warns Daniel Hannan, a British member of the European Parliament, the United States “is Europeanizing its health system, its tax take, its day care, its welfare rules, its approach to global warming, its foreign policy, its federal structure, its unemployment rate.”  

Obama’s ideology informs his commitment to an environmentalism made clear in important administrative appointments.  His secretary of transportation, Ray LaHood, wants to “coerce people out of their cars” into trains and bicycles; so billions of dollars from the “stimulus package” went to light rail projects, despite their well-documented economic infeasibility.  His secretary of energy, Steven Chu, hoped higher gas prices would make Americans more like Europeans:  facing $8 a gallon at the pump, they would be forced to buy better cars and use public transportation.  His science and technology advisor, John Holdren, a one-time close associate of Paul Ehrlich, seeks to implement the vision etched in extremist ’60s manifestoes such as The Population Bomb.  His special advisor for green jobs, Van Jones, forced to resign after his communist commitments came to light, openly acknowledged using environmentalism as a popular vehicle to establish racial and social justice—in his words, “spreading the wealth” and “changing the whole system.”   

His “mania for green energy,” says David Limbaugh in The Great Destroyer, “exceeds all bounds of reason or prudence.  He has dedicated tens of billions of dollars to a wide assortment of fantastic green projects, often falsely advertising them as being geared toward creating jobs and sparking economic growth.”  Philosophically opposed to using or developing the enormous fossil fuel resources of the United States and Canada, he exploited the 2010 BP Deepwater Horizon oil spill in the Gulf of Mexico to impose a moratorium on exploratory drilling off all the nation’s coasts.  When a federal judge blocked him, he defied the judge and issued an analogous order restricting leases (acting through Interior Secretary Ken Salazar, who as a senator from Colorado had said he would oppose all offshore drilling even if gas prices soared to $10 a gallon).  He effectively cancelled the Keystone XL pipeline from Canada to Texas, alleging environmental concerns when in fact hundreds of similar pipelines safely and efficiently transport oil throughout the nation.  

Enamored with the illusion that “green energy” will create jobs and make America self-sufficient, Obama authorized dozens of “investments” from the Recovery Act in companies such as the now-bankrupt Solyndra (repeatedly celebrated by the president as a perfect example of his aspirations).  Amply illustrating the “crony capitalism” endemic to this regime, Solyndra demonstrated the president’s version of environmentalism.  In Throw Them All Out, Peter Schweizer notes that 80 percent of the renewable energy companies subsidized by the Department of Energy are either owned or run by Obama donors!  “This is,” Schweizer asserts, precisely the same as “Boss Tweed’s financial payoffs writ large,” since the folks who worked for and donated millions of dollars to Obama’s election campaign soon “received billions in government-backed loans and outright grants.”  One of Obama’s staunchest financial supporters, Pat Stryker, a Colorado billionaire, rejoiced to see one of her companies, Abound Solar, awarded $400 million in federal grants as well as $4.7 billion in loan guarantees.  Right in the middle of disbursing these quid pro quo deals one finds lobbyists such as Deana Perlmutter, then-wife of the fabulously wealthy Colorado Democrat Congressman Ed Perlmutter.    

Compounding this corrupt cronyism, hundreds of “green” projects have abjectly failed.  Illustrating the folly of most of Obama’s “green energy” investments, the DeSoto Solar Center in Florida, touted by the president as the “largest solar power plant in the United States” and given $150,000 of stimulus money, hired 400 construction workers to build the facility and ended up employing a grand total of two permanent workers.  In the same pattern, the city of Seattle received $20 million by promising to weatherize homes in needy neighborhoods, creating 2,000 jobs and retrofitting 2,000 homes; ultimately a splendid total of 14 (mainly administrative) jobs and three better-insulated homes bore witness to the efficacy of federal largesse!  

3.  INDIFFERENT TO OUR INALIENABLE NATURAL RIGHTS 

Working out his progressive philosophy, Barack Obama holds that the rights enshrined in the Declaration of Independence and Bill of Rights are granted by government rather than God.  He adamantly supports abortion rights, unconcerned about our most basic human right, life.  While running for president he made every effort to obfuscate the issue, but the facts prove him the most pro-abortion chief executive in the nation’s history.  He supports partial-birth abortion and promised (when speaking to Planned Parenthood’s Action Fund in 2007):  “The first thing I’d do as President is sign the Freedom of Choice Act,” thereby annulling all efforts to regulate or reduce the number of abortions in the nation.  He wants tax monies to subsidize abortions—thus increasing the millions of dollars Planned Parenthood and other abortion providers rake in from public coffers before lavishly supporting him and his “party of death.”  He is, furthermore, one of the few Democrats to oppose saving the lives of babies who survive abortions:  when an effort was made in Illinois to enact legislation identical to the federal Born-Alive Infants Protection Act, Obama spoke and voted against it.  

Still more:  he has strongly, if often subtly, moved to infringe upon and deny the “free exercise” of religion guaranteed by the First Amendment.  Early on, his spokesmen crafted a linguistic shift from “freedom of religion” to “freedom to worship.”  What you do within your own mind, or within the walls of an approved facility, will be tolerated; but there is no guaranteed freedom to actually live out your religious beliefs regarding such things as abortion or same-sex marriage.  Inevitably, as his Department of Health and Human Services bureaucrats began implementing Obamacare, they required religious organizations to provide “contraceptive care,” abortifacient drugs, and sterilization.   

Shirking his constitutional duty to enforce federal laws, Obama simply announced (through his justice department) that he would not defend the Defense of Marriage Act.  Indeed, Attorney General Eric Holder and his subordinates in the justice department were soon in court arguing against it!  Awakened by this, New York Archbishop Timothy Cardinal Dolan warned the president that his “campaign against DOMA, the institution of marriage it protects, and religious freedom” would “precipitate a national conflict between church and state of enormous proportions.”  When the political winds favored it, the president re-embraced his support for same-sex marriage—a deeply held determination first announced in 1996 and evident in his early decision to annul the “don’t ask, don’t tell” policy in the nation’s armed forces.  Responding to such visible political currents, numerous organizations have endorsed homosexual rights and activities, and gay-rights activists have effectively worked through the media to brand as bigots individuals and institutions who support traditional marriage.  

Obama’s disdain for the Second Amendment and our right to self-defense, verbally expressed at various times in his career, has been carefully documented by the NRA and by scholars such as John Lott.  Lott, the economist who wrote More Guns, Less Crime, taught with Barack Obama at the University of Chicago Law School in the 1990s.  When they met, Obama said to Lott:  “Oh, you’re the gun guy.”  When Lott affirmed the statement, Obama said:  “I don’t believe that people should be able to own guns.”  When Lott suggested they might meet and discuss the issue, Obama simply “grimaced and turned away, ending the conversation.”  He revealed the same convictions when addressing a group of supporters in San Francisco in 2008, disdaining backward folks in “small towns” who “get bitter” and “cling to guns or religion or antipathy to people who aren’t like them . . . as a way to explain their frustrations.”  

If Katie Pavlich’s well-documented and disturbing Fast and Furious:  Barack Obama’s Bloodiest Scandal and Its Shameless Cover-Up and Richard Miniter’s Leading From Behind are substantially true, the president and his attorney general are possibly responsible for a scheme to advance their anti-gun agenda by funneling guns to Mexican drug cartels under the direction of the ATF.  “Emails released under congressional subpoena,” Pavlich says, “suggest that Attorney General Eric Holder and Homeland Security Secretary Janet Napolitano and their senior lieutenants were involved in devising and approving the program in 2009.”  Some critics even believe “that Operation Fast and Furious was built to fail, from a straight law enforcement point of view, and built to succeed in promoting gun control.  Certainly that has been its net effect, helping to justify the administration’s new regulations on long guns.”  Things backfired badly, however, when hundreds of innocents, including Border Patrol agent Brian Terry, were killed with the rifles.  The newly empowered Republicans in Congress launched an investigation into the operation, but the president has stifled it by placing the whole endeavor under the protection of “executive privilege.”  

4.  A CHICAGO-STYLE POLITICIAN

In The Case Against Barack Obama David Freddoso evaluated Obama’s claim to be a “new” kind of politician, a reformer intent on constructive change.  His speeches certainly move the multitudes, but his activities replay an old Chicago script.  He never supported reform or change in Chicago.  He has always worked hand-in-glove with the entrenched political machine there, supporting the notoriously corrupt Strogers (father and son) who helped run Cook County’s political machine.  The inner circle of Obama’s campaign staff consists of veteran Chicago operatives, and Mayor Daley’s hand deftly massaged the Obama presidential campaign via his veteran publicist, David Axelrod.   

Much the same must be said of Obama’s years in Springfield as a state senator, where he linked up with Emil Jones, the senate president who quickly envisioned for Obama a route to the U.S. Senate.  Jones incarnates the patronage system that distinguishes Chicago politics—using tax money to distribute grants and subsidize all sorts of programs (and relatives) to perpetuate one’s career.  When he entered the U.S. Senate Obama repaid his benefactor by earmarking millions of dollars for some of Jones’ pet projects—as well as an important one of his own, designating $1 million for the University of Chicago Medical Center, where his wife Michelle was a vice president; her $200,000 salary conveniently doubled just as her husband entered the U.S. Senate!  Soon thereafter she changed the center’s bidding process so as to give the Blackwell Consulting firm a package deal worth $600,000, a nice remuneration for a long-time political ally and friend, Robert Blackwell.  

Barack Obama was barely into the first year of his presidency when Michelle Malkin published Culture of Corruption:  Obama and His Team of Tax Cheats, Crooks, and Cronies.  Researching the records of various associates and appointees, Malkin found “a dysfunctional and dangerous conglomerate of business-as-usual cronies” including Rahm Emanuel, Valerie Jarrett, Joe Biden, Larry Summers, Tim Geithner, and Eric Holder.  Vice President Biden, while hardly as wealthy as many senators, has placed his children in cushy positions as lobbyists and hedge-fund operatives and profited early from the services of the now-imprisoned Texas Ponzi-scheme financier R. Allen Stanford.  While publicly decrying lobbyists, Biden earmarked more than $3.4 million for clients his son represented.  One of Obama’s confidants, Jim Johnson, who headed Fannie Mae from 1991 to 1998, was named to the pre-inaugural transition team despite having “accepted more than $7 million in below-market-rate loans from Countrywide”; his successor at Fannie, Franklin Raines, quickly netted a sweet $1 million loan from Countrywide before retiring with a golden parachute worth $240 million shortly before both Fannie and Countrywide collapsed, helping trigger the Great Recession.  

David Freddoso, in Gangster Government:  Barack Obama and the New Washington Thugocracy, takes his title from an article by Michael Barone, one of the nation’s most respected political analysts; it aptly sums up the president’s penchant for rewarding his political allies with financial windfalls.  The tried-and-tested Chicago way has, it seems, become the Obama way.   

SOURCES 

Alinsky, Saul D.  Rules for Radicals:  A Pragmatic Primer for Realistic Radicals.  New York:  Random House, c. 1971.  

Blackwell, Ken, and Ken Klukowski.  Resurgent:  How Constitutional Conservatism Can Save America.  New York:  Threshold Editions, c. 2011.  

Bork, Robert H., ed.  “A Country I Do Not Recognize”:  The Legal Assault on American Values.  Stanford, CA:  Hoover Institution Press, c. 2005.  

Brooks, Arthur C.  The Battle:  How the Fight Between Free Enterprise and Big Government Will Shape America’s Future.  New York:  Basic Books, c. 2010.

_____.  The Road to Freedom:  How to Win the Fight for Free Enterprise.  New York:  Basic Books, c. 2012.  

Corsi, Jerome R.  The Obama Nation:  Leftist Politics and the Cult of Personality.  New York:  Threshold Editions, c. 2008.  

Crowley, Monica.  What the (Bleep) Just Happened?  The Happy Warrior’s Guide to the Great American Comeback.  New York:  HarperCollins Broadside Books, c. 2012.  

D’Souza, Dinesh.  The Roots of Obama’s Rage.  Washington:  Regnery Publishing, Inc., c. 2010.  

_____.  Obama’s America:  Unmaking the American Dream.  Washington:  Regnery Publishing, Inc., c. 2012.  

Freddoso, David.  The Case Against Barack Obama:  The Unlikely Rise and Unexamined Agenda of the Media’s Favorite Candidate.  Washington:  Regnery Publishing, Inc., c. 2008.  

_____.    Gangster Government:  Barack Obama and the New Washington Thugocracy.  Washington:  Regnery Publishing, Inc., c. 2011.  

Gelernter, David.  America-Lite:  How Imperial Academia Dismantled Our Culture (and Ushered in the Obamacrats).  New York:  Encounter Books, c. 2012.  

Hannan, Daniel.  The New Road to Serfdom:  A Letter of Warning to America.  New York:  HarperCollins, c. 2010.  

Hewitt, Hugh.  The Brief Against Obama:  The Rise, Fall & Epic Fail of the Hope & Change Presidency.  New York:  Center Street, c. 2012.  

Higgs, Robert.  Crisis and Leviathan:  Critical Episodes in the Growth of American Government.  New York:  Oxford University Press, c. 1987.  

Horowitz, David, and Jamie Glazov, eds.  The Hate America Left.  N.p., n.d.  

Horowitz, David, and Jacob Laksin.  The New Leviathan:  How the Left-Wing Money Machine Shapes American Politics and Threatens America’s Future.  New York:  Crown Forum, c. 2012.  

Horowitz, David, and Richard Poe.  The Shadow Party:  How George Soros, Hillary Clinton, and Sixties Radicals Seized Control of the Democratic Party.  Nashville:  Thomas Nelson, c. 2006.  

Kengor, Paul.  The Communist:  Frank Marshall Davis:  The Untold Story of Barack Obama’s Mentor.   New York:  Threshold Editions, c. 2012.  

Kerpen, Phil.  Democracy Denied:  How Obama is Ignoring You and Bypassing Congress to Radically Transform America—and How to Stop Him.  Dallas:  BenBella Books, Inc., c. 2011.  

Kibbe, Matt.  Hostile Takeover:  Resisting Centralized Government’s Stranglehold on America.  New York:  HarperCollins, c. 2012.  

Klein, Aaron.  Fool Me Twice:  Obama’s Shocking Plans for the Next Four Years Exposed.  New York:  WorldNetDaily Books, c. 2012.  

_____.   The Manchurian President.  New York:  WorldNetDaily Books, c. 2010.  

_____.   Red Army:  The Radical Network That Must Be Defeated to Save America.  New York:  HarperCollins, c. 2011.  

Klein, Edward.  The Amateur:  Barack Obama in the White House.  Washington:  Regnery Publishing, Inc., c. 2012.  

Kurtz, Stanley.  Radical-in-Chief:  Barack Obama and the Untold Story of American Socialism.  New York:  Threshold Editions, c. 2010.  

Levin, Mark R.  Ameritopia:  The Unmaking of America.  New York:  Threshold Editions, c. 2012.  

_____.  Liberty and Tyranny:  A Conservative Manifesto.  New York:  Threshold Editions, c. 2009.  

Limbaugh, David.  Crimes Against Liberty:  An Indictment of President Barack Obama.  Washington:  Regnery Publishing, Inc., c. 2010.  

_____.   The Great Destroyer:  Barack Obama’s War on the Republic.  Washington:  Regnery Publishing, Inc., c. 2012.  

Malkin, Michelle.  Culture of Corruption:  Obama and His Team of Tax Cheats, Crooks, and Cronies.  Washington:  Regnery Publishing, Inc., c. 2009.  

McCarthy, Andrew C.  The Grand Jihad:  How Islam and the Left Sabotage America.  New York:  Encounter Books, c. 2010.  

Miniter, Richard.  Leading from Behind:  The Reluctant President and the Advisors Who Decide for Him.  New York:  St. Martin’s Press, c. 2012.  

Norquist, Grover G., and John R. Lott Jr.  Debacle:  Obama’s War on Jobs and Growth and What We Can Do Now to Regain Our Future.  New York:  John Wiley & Sons, Inc., c. 2012.  

Obama, Barack Hussein.  The Audacity of Hope.  New York:  Crown, c. 2006.

_____.  Dreams from My Father.  New York:  Crown, c. 1995.  

Olson, Walter.  Schools for Misrule:  Legal Academia and an Overlawyered America.  New York:  Encounter Books, c. 2011.  

Pavlich, Katie.  Fast and Furious:  Barack Obama’s Bloodiest Scandal and Its Shameless Cover-Up.  Washington:  Regnery Publishing Inc., c. 2012.  

Prager, Dennis.  Still the Best Hope:  Why the World Needs American Values to Triumph.  New York:  HarperCollins, c. 2012.  

Robison, James, and Jay Richards.  Indivisible:  Restoring Faith, Family, and Freedom Before It’s Too Late.  New York:  FaithWords, c. 2012.

Schweizer, Peter.  Throw Them All Out:  How Politicians and Their Friends Get Rich Off Insider Stock Tips, Land Deals, and Cronyism That Would Send the Rest of Us to Prison.  New York:  Houghton Mifflin Harcourt Publishing Company, c. 2011.  

Sirico, Robert.  Defending the Free Market:  The Moral Case for a Free Economy.  Washington:  Regnery Publishing, Inc., c. 2012.  

Steele, Shelby.  A Bound Man:  Why We Are Excited About Obama and Why He Can’t Win.  New York:  Free Press, c. 2008.  

Twight, Charlotte A.  Dependent on D.C.:  The Rise of Federal Control Over the Lives of Ordinary Americans.  New York:  St. Martin’s Press, c. 2002.

Woods, Thomas E. Jr., ed.  Back on the Road to Serfdom:  The Resurgence of Statism.  Wilmington:  ISI Books, c. 2010.  

236 Stanley Jaki

During his long and prolific (41 books) academic career Stanley L. Jaki, a Hungarian-born Benedictine priest, received many of the prestigious honors bestowed upon eminent scholars—e.g. the Lecomte du Noüy Prize, the Templeton Prize, the Gifford Lectureship.  In A Mind’s Matter:  An Intellectual Autobiography (Grand Rapids:  William B. Eerdmans Publishing Company, c. 2002), he charted the significant events and accomplishments in his life as a physicist-theologian with earned doctorates in both disciplines.  This is not a conventional, chronological life-story, however, since Jaki wrote “for those who, because they have found the message of my books instructive, would like to see its development through the eyes of their author” (p. vii).  He also wrote to counter thinkers who are “systematically shredding the last bits of its Christian cultural inheritance.  This is what weighs most heavily on my mind” (p. 217).  This volume provides a handy précis of his thought (though extensive reading in the corpus of his books would of course provide deeper insight into his positions). 

  As a 20th century Christian intellectual, Jaki sought to offset the secular humanism birthed half-a-millennium ago in the Renaissance, an epoch wherein it “was not man added to God, but man minus God, the God of the supernatural dispensation.  The Renaissance wanted to dispense man of any concern about that God” (p. x).  In subsequent centuries an army of secularists endeavored “to deny any tie, factual or possible, between Christianity and science” (p. x), repudiating any Supernatural dimension to Reality.  “For most in academe the basic dogma is that science is the savior of mankind, and is already liberating mankind from that highest form of superstition, which is Christian belief in the supernatural” (p. 57).  Thus there is a mighty battle being waged for man’s mind, a battle Jaki resolutely joined whenever possible, fully aware of “the great contestation which has taken on a frightening vigor for the past two or three decades and got into high gear during the 1990s.  It is a wholesale attack by the champions of naturalism and secularism on the supernatural as mainly represented by the Catholic Church” (p. xii).  This rejection of God has, however, been accompanied by a growing ennui, a general malaise of meaninglessness.  To Jaki:  “All our cultural ills and woes, the disintegration of Western culture unfolding before our very eyes, are due to a growing loss of the sense of purpose” (p. 171).  

Jaki began his scholarly work with a doctoral dissertation on ecclesiology, a treatise that occupies a “place of honor” in the library of Pope Benedict XVI because Jaki early discerned some of the devastation that would follow liturgical and theological innovations in the wake of Vatican II—“the greatest self-inflicted wounds which theologians have ever inflicted on the Church in the shortest conceivable time” (p. 17).  Following the Second World War he came to the United States and, while teaching theology, found himself drawn to graduate studies in physics with a special interest in the history of science.  To his dismay, he found that scientists, “who in their own field demanded utmost carefulness with data, were shown to act in a cavalier manner with respect to historical facts relating to their topics” (p. 32).  So he devoted himself to rectifying the record, exploring archives rather than experimenting in laboratories.  His theological concerns “led me first into the deep waters of modern physics, and from there to the even deeper currents of the history and philosophy of science.  The work I have done in that field was dedicated to the defense of certain theses—the existence of mind as distinct from matter; the fundamental importance for scientific method of an epistemology embodied in the classical proofs of the existence of God; the limited validity or relevance of exact science or physics; the crucial importance of Christian belief in creation for the unique rise of science” (p. 120).  

He especially discovered—and sought to make known—that the allegations of Enlightenment intellectuals such as Voltaire regarding the “alleged darkness of the Christian Middle Ages” were basically “the work of those who write intellectual history from the dark recesses of their prejudices” (p. 40).  This was definitively proven by the great French scholar Pierre Duhem, a “kindred mind” whose insights (in Système du monde) provided Jaki exhaustive evidence regarding the scientific accomplishments of Medieval Christian scholars.  In truth:  “The only viable birth of science took place in a culture [Christendom] steeped in a vision wherein history, cosmic and human, appeared to be subject to a single one-directional movement, for which a straight arrow may be the most appropriate symbol” (p. 51).  In addition to rehabilitating the Middle Ages, Jaki sought to demonstrate the errors of naturalistic biology, flowering in the soil of Darwin, whose inner core stands revealed in letters he wrote responding to the inquiries of a young German who wondered “whether evolutionary theory is compatible with belief in biblical revelation in general and in Christ in particular” (p. 64).  With uncharacteristic frankness Darwin acknowledged “that he did not think that there ever was a revelation” (p. 64).  This exchange led Jaki to compose lectures published as The Savior of Science, wherein he “set forth for the first time the scientific impact of the Logos doctrine” inscribed in the Nicene Creed (p. 66).  The saint responsible for much of the creed’s formulation, “Saint Athanasius clearly perceived an all-important implication:  The divinity of the Logos demanded that the universe created by the Father in the Son be fully logical, that is, fully ordered as befitted a truly divine Logos” (p. 66).  The universe is logical, not chaotic; ordered, not chance-driven.  “At any rate, the evidence of design, indicative of some purpose, is overwhelming everywhere in nature” (p. 174).  

Invited to deliver the 1976-1977 Gifford Lectures (the most prestigious honor granted to natural philosophers), Jaki wrote The Road of Science and the Ways to God, meticulously detailing significant developments in the history of science and showing how the greatest minds—Newton, “Galileo, Copernicus, Oresme, and Buridan, all endorsed natural theology insofar as they held that reflection on the natural world could propel the mind to recognize the Creator” (p. 95).  He especially stressed their common commitment to methodical realism, with its assumption of a rationally-ordered cosmos.  Indeed, “the gradual de-Christianization of the West logically brought about a progressive turning away from the objectively real to the subjectively perceived” (p. 183).  With this established, he turned to a defense of the classical arguments for the existence of God.  

In addition to his scientific studies, Jaki devoted considerable attention to John Henry Newman, one of the 19th century’s intellectual giants, who “again and again predicted the coming collapse of the liberal Western world’s intellectual and moral fabric” (p. 202).  Newman “characterized his entire life as a struggle against the principle of liberalism.  He specified it as the natural man’s standard that all religions are equally good, that there had been no supernatural revelation, that man never experienced a Fall and therefore stood in no need of a supernatural salvation” (p. 203).  Newman perceptively diagnosed the deadly threat of naturalism (with its scientific trappings) posing as mankind’s savior.  Nor was he ever a “Darwinist, not even an evolutionist,” primarily because the word “‘evolution’ eventually became synonymous with randomness and chance and therefore with discontinuity.  Evolution no longer means that something evolves because it has been there at least in embryo in the first place” (p. 214).  Still more:  “Newman’s mind was immune to the illogicalities of Darwin’s arguments” (p. 214).  Thus, while touring London’s Botanical Garden in 1876, he delighted in the wonders of the flora and exclaimed:  “But what argument could the Evolutionists bring against this as evidence of the work of Mind?” (p. 215).  

The evidence of Mind in nature (and the functioning of mind within man) reverberates throughout A Mind’s Matter, recalling one of Jaki’s finest treatises, Brain, Mind and Computers, wherein he persuasively demonstrated the long succession of failed endeavors to reduce man’s mind to any kind of mechanical device, computers included.  The mystery of the human mind simply cannot be dispelled by scientific means.  Nor can the ultimate Mystery of a Mind-designed Universe be reduced to atoms-in-motion or the chance-and-necessity of natural selection.  While a prior appreciation for his works enables one to absorb this book more fully, anyone concerned with the intersections of science and theology in the past century will profit from perusing it.  

* * * * * * * * * * * * * * * * * *

Though most of Jaki’s works focus on the history of science, in Means to Message:  A Treatise on Truth (Grand Rapids:  Wm. B. Eerdmans Publishing Co., c. 1999) he developed some of the guiding philosophical themes basic to his endeavors.  He always “considered the study of history to be a branch of philosophy” which “teaches by examples, which are such only if they serve as so many mirrors in which one can take a proper measure of oneself and of society” (p. 4), for “to write history is to do philosophy” (p. 199).  His historical perspective frequently enabled him to remain unmoved by sensational and inevitably transient “scientific” claims, alleged discoveries, and end-of-discussion theories.  For him:  “Philosophy has to be the love of reality all across the spectrum, which is, however, complexity incarnate” (p. 2).  There are neither simple answers nor demonstrable encyclopedic theories suitable for the love of wisdom.   

In response to the question “What is Truth?” Jaki offers Aquinas’ definition:  “adaequatio rei ad intellectum” [the correspondence of a thing to the intellect] (p. 10).  The thing exists, and the mind can know it.  Philosophy, rightly done, begins with the reality of objects.  The failure to begin with an objectively given reality “is responsible for the fact that the history of philosophy may appear to be a chain of errors” (p. 17).  Beginning with one’s own mind, reducing truth to subjectivity, following the lead of thinkers such as Descartes and Spinoza, Kant and Kierkegaard, sabotages the real task of philosophy, founded on Chesterton’s great philosophical and linguistic tour de force:  “There is an is!” (p. 23).  “The reality of objects should be a truth clearer than daylight” (p. 27).  Yet modern philosophers “want clear ideas,” whether or not they stand anchored in existent things.  

Though deeply interested in science, Jaki insists that it provides no foundation for philosophy.  “The road that connects philosophy and exact science is a one-way road.  One can travel from philosophy to science, but not from science to philosophy, unless one confuses science with the philosophy which scientists throw around their science” (p. 54).  Unfortunately, great scientists such as Einstein, ignoring his own confession that “the man of science is a poor philosopher” (p. 43), frequently make seriously flawed philosophical pronouncements.  In truth, science is “a superb tool for handling things, because they all have quantitative properties, but is of no help in understanding anything else about things, let alone what things are” (p. 63).  As a primary example Jaki cites Werner Heisenberg, who tried to expand his discoveries in quantum mechanics into a philosophical principle of uncertainty (rejecting causality) regarding everything, sinking into demonstrably fallacious reasoning.  

The failure of science to truly understand human nature stands revealed in the denial of free will often asserted under the rubric of “brain science,” earlier explored in Jaki’s Brain, Mind and Computers.  “In a sense,” he says, “the philosophy of free will amounts to a declaration similar to the immediacy of kicking a stone, or Samuel Johnson’s famed demonstration of external reality.  The one-liner, ‘Sir (said he), we know our will is free, and there’s an end on’t’ . . . capsulized all that can be said in essence about free will as a reality” (p. 65).  Philosophers like Spinoza and the Enlightenment thinkers who followed him denied free will by reducing the mind to a machine—a practice still widely evident in evolutionary psychology manifestoes.  As formulated by Fichte:  “‘In my immediate consciousness I appear to myself to be free, but through reflection upon the whole of nature I find that freedom is utterly impossible’” (p. 69).  Despite their assertions, however, those who deny the reality of free will inevitably employ it in their denials!  No intellectual writes a book persuaded he cannot refrain from doing so!  “Poincare’s now century-old dictum, ‘no determinist argues deterministically,’ sums it all up” (p. 70).  As a Christian, Jaki makes the significant point that God “created man to be free so that man’s service may have that merit which only a freely performed act can have.  God therefore has to remain a subtly hidden God, lest man should find himself ‘constrained’ to obey Him” (p. 78).   

We are free because we can think rationally:  we know we are conscious and can understand ourselves and our surroundings.  Our minds function much differently from the mechanical processes of the material world.  “Unlike bodies, thoughts are not extended.  Unlike bodies that move necessarily, mental operations are often performed with an explicit sense of freedom and for a purpose at that” (p. 127).  Our ability to formulate and use words—the “incredibly strange faculty which is language” (p. 130)—especially illustrates a non-material depth to our minds.  “What, however, can be known (it had been pointed out in 1927, and by an unabashed materialist) is a frightful conundrum:  If one’s mental processes are the equivalent to actions of atoms, one can have no reason to assume that one’s beliefs are true.  Those beliefs may be sound chemically, but not intellectually.  Hence there remains no reason for even supposing that one’s brain (or one’s computer) is composed of atoms” (p. 133).  “All this appears especially baffling when seen from Darwin’s evolutionary perspective, in which material needs and use precede all developments, including the development of intellectual faculties.  Already Wallace pointed out to Darwin that his explanation of the evolution of the mind was equivalent to putting the cart before the horse. . . .  The only answer Darwin could give was an imperious No!, which he wrote on the margin of a reprint of an essay which Wallace sent him.  This reply may have satisfied Darwin’s resolve to fight tooth and nail anything indicative of something non-material in man, but it left the problem fully intact” (pp. 129-130).  

That there is purpose in the cosmos seems manifestly evident to Jaki, though it was denied by the 18th century Enlightenment thinkers who reduced man, as Carl Becker approvingly wrote, to “‘little more than a chance deposit on the surface of the world, carelessly thrown up between two ice ages by the same forces that rust iron and ripen corn, a sentient organism endowed by some happy or unhappy accident with intelligence indeed, but with an intelligence that is conditioned by the very forces which it seeks to understand and control’” (p. 84).  Add Darwin’s 19th century theory of natural selection to the mix, wherein, Jaki says:  “The comedy of philosophical myopia was crowned by the matter-of-fact acceptance of the view that an evolutionary process, which is seemingly and allegedly purposeless, could produce a being whose very nature is to act for a purpose.  The view implies a monumental non-sequitur, which remains the Achilles’ heel of an evolutionary science turned into an ideology of evolutionism.  The latter has no better foundation than the miscegenation of chance and necessity.  Of these two, chance remains a glorious cover-up for necessity.  As to necessity, it is refuted by the very freedom whereby it is posited” (p. 82).  

As in creation, there is a moral end or purpose to life.  The lack of such purpose is revealed in Captain Ahab, who declares in Melville’s Moby Dick:  “All my means are sane, my motive and my object mad.”  Thus Jaki asks, “Was this merely madness or was it a sin?  By trying to cover up with a psychiatric label something essentially unethical, that is, sinful, Captain Ahab anticipated modern man’s desperate footwork to obliterate the categories of moral good and moral evil so that he might escape the profoundly moral predicament of human existence, steeped, to put it bluntly, in sin” (p. 155).  Many efforts to destroy ethical distinctions wrap themselves in “scientific” theories (e.g. relativity, natural selection, and sociobiology).  “In fact,” Jaki notes, “science, when left to itself, invites the opposite to moral probity.  On the basis of science alone, mankind is but another animal species, locked in a grim struggle for survival” (p. 163).  As Darwin himself admitted, “‘A man who has no assured and no present belief in the existence of a personal God or a future existence with retribution and rewards, can have for his rule of life, as far as I can see, only to follow those impulses and instincts which are the strongest or which seem to him the best ones’” (p. 163).  Consequently:  “For the first time in history man is experimenting with one-parent families, taken for a normal alternative to old-fashioned monogamous unions.  Same-sex unions are receiving legal protection to the extent of their being granted the right of adoption.  Advocates of polygamy have begun to raise their voice, and they need no other arguments than the ones used with success on behalf of same-sex unions” (p. 156).  Such developments stem from reducing “the categories of the ethically good and ethically evil” rooted in Transcendent Reality to “the categories of legally permissible and legally prohibited” subjectively determined by human beings (p. 157).  There is, therefore, nothing absolutely forbidden.  

“One can bemoan the fact,” says Jaki, “that the Ten Commandments have turned in public perception into the Ten Counsels, or something much less, but the process is undeniable and logical.  If man, as he actually exists, is believed to do the good by natural inclination, there remains no barrier against viewing any and all inclination of man as something naturally good and thereby entitled to legal protection.  The latter, as is well known, is ultimately a function of counting the votes cast at regularly repeated elections” (p. 158).  This democratic way of setting moral standards was, however, decisively rejected by Moses, who declared:  “Neither shall you allege the example of the many as an excuse for doing wrong, nor shall you, when testifying in a lawsuit, side with the many in perverting justice” (Ex. 23:2).  

In citing Moses, of course, Jaki makes clear his conviction that morality must be derived from a Higher Authority than man’s reason or experience.  That God is Real ever remains central to his thought; that His existence is demonstrable remains his informed conviction, for the very limits of our own reality necessarily suggest the limitless Reality underlying our existence.  “The explanation can only be found in a being which is self-explaining in the sense that it possesses all perfection in an absolutely perfect way, and above all the perfection to exist and do so without any limitation” (p. 165).  Such an elucidation is deeply philosophical and theological rather than scientific, for scientific methods simply cannot probe intangible realities.  Demonstrating the existence of God or the soul requires thinking that is “radically inferential” inasmuch as it “has for its object something which is no longer material but strictly spiritual” (p. 176).   

This was the approach of Medieval Scholastics such as Thomas Aquinas, whom Jaki cheerfully follows.  And he shares the commendation given them by Condorcet, himself a hostile witness:  “‘We owe to the Schoolmen,’ Condorcet wrote, ‘more precise notions concerning the ideas that can be entertained about the supreme Being and his attributes; the distinction between the first cause and the universe which it is supposed to govern; the distinction between spirit and matter; the different meanings that can be given to the word liberty; what was meant by creation; the manner of distinguishing the various operations of the human mind; and the correct way of classifying such ideas as it can form of real objects and their properties’” (p. 179).  

# # #