290 Some Polish Perspectives–Legutko & Kolakowski

Though Poland as a nation has frequently suffered occupation and exploitation, Polish artists (Chopin) and thinkers (Pope John Paul II) have blessed the world with their works.  Ryszard Legutko’s recent The Demon in Democracy:  Totalitarian Temptations in Free Societies (New York:  Encounter Books, c. 2016) adds another name to the list of writers dealing with the nature of the modern world.  When communists controlled his country, Legutko edited an underground philosophy journal espousing the principles of Solidarity—the movement that liberated the nation 30 years ago.  “Solidarity,” he says, “stood up in defense of human dignity (in its original and not the corrupted sense), access to culture, respect for the truth in science and for nobility in art, and a proper role given to Christian heritage and Christian religion.  It seemed that suddenly those great ideas at the root of Western civilization—which this civilization had slowly begun to forget—were again brought to life and ignited as a fire in the minds of the members of a trade union” (#819).  Sadly, following the collapse of the Iron Curtain these values quickly evaporated within the liberal-democracy that replaced communism in Poland and now dominates much of Europe.

As a practicing philosopher Legutko has, for many years, pondered developments within this “liberal democracy” and has concluded it contains some of the same flaws that made communism so pernicious.   Leftism, even under a “democratic” banner, is still collectivist and authoritarian.  He makes clear that the “liberal-democracy” he critiques is not the classic system espoused by Thomas Jefferson or Winston Churchill but the modern system evident in both the social democracies supporting the European Parliament and America’s Democrat Party.  After watching with amazement how easily former communists became champions of liberal-democracy, Legutko argues they both “proved to be all-unifying entities compelling their followers how to think, what to do, how to evaluate events, what to dream, and what language to use.  They both had their orthodoxies and their models of an ideal citizen” (#152).  The European Union (EU) increasingly dictates to rather than represents the people of the continent’s nations.  “Even a preliminary contact with the EU institutions allows one to feel a stifling atmosphere typical of a political monopoly, to see the destruction of language turning into a new form of Newspeak, to observe the creation of a surreality, mostly ideological, that obfuscates the real world, to witness an uncompromising hostility against all dissidents, and to perceive many other things only too familiar to anyone who remembers the world governed by the Communist Party” (#174).  

What the two systems share, most deeply, is a commitment to change the world through technology—to “modernize” everything, to bring into being both a new human being and a perfect world.  The past provides neither things worth preserving nor guidance for the future inasmuch as it followed superstitious, medieval, old-fashioned notions.  Neither cultural traditions nor churches nor traditional families nor written constitutions matter, for what’s imperative is the construction of a totally new, modern world.  Rather than accepting the givenness of things as created, both communists and liberal-democrats endeavor to transform them; rather than dealing with reality they propose to construct it.  “In both systems a cult of technology translates itself into acceptance of social engineering as a proper approach to reforming society, changing human behavior, and solving existing social problems” (#233).  “In one system [the U.S.S.R.] this meant reversing the current of Siberia’s rivers, in the other [the U.S.], a formation of alternative family models; invariably, however, it was the constant improvement of nature, which turns out to be barely a substrate to be molded into a desired form” (#240).

Legutko devotes a chapter to the shared communistic and liberal-democratic perspective on history—what astute thinkers such as C.S. Lewis condemned as “historicism.”  This is the notion derived from Hegel that there is a predetermined, irresistible evolutionary force shaping human events.  To swim with its progressive current is to embrace and champion all things modern.  To be on the “right side of history” is to be altogether wise and righteous.  To oppose, to react against this course of events demonstrates stupidity and misanthropy.  For communists, forcefully establishing an egalitarian socialism is the goal; for liberal-democrats, the same end must be attained through peaceful, electoral means.  Both deeply believe they are the change-agents entrusted with perfecting both human nature and the world in general.  “A comparison between the liberal-democratic concept of history and that of communism shows a commonality of argument as well as of images of the historical process” (#412), generally drawn from Marxist sources:  1) the triumphant march of freedom, vanquishing tyrannies of various sorts (monarchies; churches); 2) the liberation of various victim (class; race; gender) groups; and 3) the ultimate, thoroughly scientific enlightenment of homo sapiens.

To make his case persuasive, Legutko suggests we imagine the differences between an old man and a youngster.  By virtue of his experience, the old man fears change, knowing it often stems from immaturity and ignorance.  The old man knows much about what has happened, including the tragedies and misfortunes resulting from well-intended, imprudent decisions.  But the youngster thinks he and his companions rightly envision a better world and need only to act quickly to achieve it.  “The old man is balanced in his reactions and assessments, looking for the appropriate courses of action in the world which, according to him, was founded on human error, ignorance, poor recognition of reality, and premature ventures; the youngster has an excitable nature, moving from desperation to euphoria, eagerly identifying numerous enemies whose destruction he volubly advocates, and equally happy to engage in collaborative activities with others because—he believes—the world is full of rational people.  The old man says that, given the weaknesses of the human race, institutions and communities (families, schools, churches) should be protected because over the centuries they have proven themselves to be tools to tame humans’ evil inclinations; the young man will argue that such institutions and communities need to be radically exposed to light, aired out, and transformed because they are fossils of past injustices.  The old man is a loner who believes that only such an attitude as his can protect the integrity of the mind; the youngster eagerly joins the herd, enjoying the uproar, mobilization, and direct action” (#536).

Obviously the modern mind is that of a youngster, full of technical information and lofty aspirations, optimistically envisioning “the promise of a great transformation” that has enraptured so many intellectuals since the Renaissance.  Such intellectuals envision themselves as leaders on the “cutting edge of history,” and they endlessly engage in “a favorite occupation of the youngster:  to criticize what is in the name of what will be, but what a large part of humanity, less perceptive and less intelligent than himself, fails to see” (#565).  A century ago, the U.S.S.R. served as a lodestar for “youngsters” such as John Reed, who sought therein the realization of their dreams.  More recently, the “youngsters” took to the barricades in Paris in 1968 or marched in America’s streets in support of Ho Chi Minh.  The ’60s revolutionaries chanted “a medley of anarchist slogans, a Marxist rhetoric of class struggle and the overthrowing of capitalism, and a liberal language of rights, emancipation, and discrimination.  Capitalism and the state were the main targets, but universities, schools, family, law, and social mores were attacked with equal vehemence” (#1580).  One need only study carefully the rhetoric and policies of Bernie Sanders and Elizabeth Warren to note the empowerment of those ’60s revolutionaries.

Indicative of one adolescent aspect of the modern mind is the importance of entertainment—a point persuasively made three decades ago by Neil Postman in Amusing Ourselves to Death.  In earlier, more religious times, entertainment was understood to be a non-consequential activity designed to provide a brief break from the serious work assigned us.  But, Legutko says:  “In today’s world entertainment is not just a pastime or a style, but a substance that permeates everything:  schools and universities, upbringing of children, intellectual life, art, morality and religion” (#753).  Modern entertainment resembles the divertissement so acutely diagnosed by Pascal at the beginning of the modern era:  it’s an activity “that separates us from the seriousness of existence and fills this existence with false content” (#753).  We don’t escape reality for a few hours—we immerse ourselves in an imaginary world.  “By escaping the questions of ultimate meaning of our own lives, or of human life in general, our minds slowly get used to that fictitious reality, which we take for the real one, and are lured by its attractions” (#760).

Rivaling historicism in its importance for both communists and liberal-democrats is utopianism, generally flying the multicolored flag of social justice.  “Utopia is thus not a political fantasy but a bold project bolder than others because it aims at a solution to all the basic problems of collective life that humanity has faced since it began to organize itself politically.  Utopia is—I beg the reader’s pardon for such a vile-sounding phrase—the final solution” (#931).  Beginning in the Renaissance, various utopians proposed political solutions to man’s ancient ills and aspirations, insisting “man can achieve greatness and be equal to God, because he has unlimited creative potential” (#931).  The republic envisioned by America’s Founders was not utopian, but the egalitarian liberal-democracy promoted by 20th century progressives—from Richard Ely and Woodrow Wilson to John Rawls and Barack Obama—certainly is.

Counterintuitively, the “classical liberalism” that began with Adam Smith and Thomas Jefferson celebrating individualism slowly became “a doctrine in which the primary agents were no longer individuals, but groups and the institutions of the democratic state.  Instead of individuals striving for the enrichment of social capital with new ideas and aspirations, there emerged people voicing demands called rights and acting within the scope of organized groups.”  Special interest groups, working within a relentlessly expanding state, orchestrated legislative enactments and judicial decisions, “demanding legal acceptance of their position and acquired privileges.  In the final outcome the state in liberal democracy ceased to be an institution pursuing the common good, but became a hostage of groups that treated it solely as an instrument of change securing their interests” (#1205).  Ironically, today’s liberals (most notably homosexuals and feminists) are hardly liberal, inasmuch as they strive to regulate virtually every aspect of life, including “language, gestures, and thoughts” (#1284).  They’re just Leftists intent on imposing their agenda.

The political system shaped by both communists’ and liberal-democrats’ historicist-utopianism becomes all-intrusive, ever intent on removing all vestiges of property or class distinctions.  Leftist ideologies of the ’60s now dominate the liberal-democratic academic and media complex.  And the Christian churches, sidelined by pernicious church-state separation decrees, have largely accommodated themselves to the deeply anti-Christian ways of modernity.  Consequently, many churches have tailored their teachings to fit “the requirements of the liberal-democratic state and, consequently, to revise their doctrines substantially, sometimes beyond recognition” (#2885).  Having successfully marched through our cultural institutions, triumphant liberals have “managed to silence and marginalize nearly all alternatives and all nonliberal views of political order” (#1536).

Reading Legutko’s provocative and deeply informative analysis of these realms both clarifies and challenges our understanding of our world.  I share my good friend John Wright’s strong endorsement of this work.  It is, as John O’Sullivan says in his Introduction, a “culturally rich, philosophically sophisticated, and brilliantly argued book” that deserves our attention if we’re concerned about our civilization.

* * * * * * * * * * * * * * * * * * * * * *

Fortunately for the general reader, first-rate philosophers often write accessible essays, addressing both current issues and perennial truths.  Thus Leszek Kolakowski, a Polish thinker rightly renowned for his magisterial, three-volume Main Currents of Marxism, published a score of short essays in Modernity on Endless Trial (Chicago:  The University of Chicago Press, c. 1990) that offer serious readers valuable insights into some main intellectual currents of the 20th century.  Whenever an erstwhile Marxist casts a favorable glance at Christianity it makes sense for believers to consider his reasons.

One set of essays focuses “On Modernity, Barbarity, and Intellectuals.”  Strangely enough, a corps of intellectuals has orchestrated the barbarism that has emerged during the last three centuries—an era labeled “modernity.”  Since Kolakowski cannot see how “postmodern” differs from “modern,” he discerns the loss of religion (and loss of taboos) as the primary current in modern (and postmodern) times, leading to “the sad spectacle of a godless world.  It appears as if we suddenly woke up to perceive things which the humble, and not necessarily highly educated, priests have been seeing—and warning us about—for three centuries and which they have repeatedly denounced in their Sunday Sermons.  They kept telling their flocks that a world that has forgotten God has forgotten the very distinction between good and evil and has made human life meaningless, sunk into nihilism” (pp. 7-8).  A series of influential, secularizing skeptics prepared the way for the destructiveness of “Nietzsche’s noisy philosophical hammer” crafted to re-order the world (p. 8).  The “intellectuals” responsible for this process were not the scholars—scientists or historians—who “attempt to remain true to the material found or discovered” (p. 36) apart from themselves.  A barbarizing “intellectual” is someone who wishes not “simply to transmit truth, but to create it.  He is not a guardian of the word, but a word manufacturer” (p. 36).  Invariably, such intellectuals are seductive, spinning wondrous tales of utopian vistas.

To Nihilists such as Nietzsche, truth is illusory.  Consequently, various cultures’ “truths” are equally “true” even if they are obviously contradictory!  Such cultural relativism—declaring all cultures are equal, praising the Aztecs as well as the Benedictines—easily embraces an admiration for various forms of what was once judged barbarism.  The sophisticated, scholarly “tolerance” so mandatory in elite universities and journals ends by granting “to others their right to be barbarians” (p. 22).  What we are witnessing is the Enlightenment devouring itself!  In Kolakowski’s judgment:  “In its final form the Enlightenment turns against itself:  humanism becomes a moral nihilism, doubt leads to epistemological nihilism, and the affirmation of the person undergoes a metamorphosis that transforms it into a totalitarian idea.  The removal of the barriers erected by Christianity to protect itself against the Enlightenment, which was the fruit of its own development, brought the collapse of the barriers that protected the Enlightenment against its own degeneration, either into a deification of man and nature or into despair” (p. 30).  

Another set of essays deals with “the Dilemmas of the Christian Legacy,” for modernity’s secularizing process has significantly, if indirectly, shaped much of the Christian world “through a universalization of the sacred,” sanctifying worldly developments as “crystallizations of divine energy” (p. 68).  The “Christianity” rooted in process theology—as propounded by Teilhard de Chardin for example—envisions universal salvation and unending evolutionary progress.  “In the hope of saving itself, it seems to be assuming the colors of its environment, but the result is that it loses its identity, which depends on just that distinction between the sacred and the profane, and on the conflict that can and often must exist between them” (p. 69).  Kolakowski detects and dislikes what he finds in these circles—“the love of the amorphous, the desire for homogeneity, the illusion that there are no limits to the perfectibility of which human society is capable, immanentist eschatologies, and the instrumental attitude toward life” (p. 69).  Losing their sense of the sacred, this-worldly philosophies and religions fail to provide any basis for culture.  Indeed:  “With the disappearance of the sacred, which imposed limits to the perfection that could be attained by the profane, arises one of the most dangerous illusions of our civilization—the illusion that there are no limits to the changes that human life can undergo, that society is ‘in principle’ an endlessly flexible thing and that to deny this flexibility and this perfectibility is to deny man’s total autonomy and thus to deny man himself” (p. 72).  A rejection of the sacred invites the denial of sin and evil.

Though not overtly Christian, Kolakowski himself rejected the atheistic Marxism of his early years, found Christianity the best hope for the world, and became a cheerleader for, if not a devotee of, the Faith.  “There are reasons why we need Christianity,” he argues, “but not just any kind of Christianity.  We do not need a Christianity that makes political revolution, that rushes to cooperate with so-called sexual liberation, that approves our concupiscence or praises our violence.  There are enough forces in the world to do all these things without the aid of Christianity.  We need a Christianity that will help us move beyond the immediate pressures of life, that gives us insight into the basic limits of the human condition and the capacity to accept them, a Christianity that teaches us the simple truth that there is not only a tomorrow but a day after tomorrow as well, and that the difference between success and failure is rarely distinguishable” (p. 85).

Given his critique of modernity, Kolakowski has little patience with the modernist (or liberal) Christianity that focuses on “social justice,” peace, and ephemeral earthly progress—the this-worldly political agenda so routinely proclaimed in some quarters.  “Christianity is about moral evil, malum culpae, and moral evil inheres only in individuals, because only the individual is responsible” (p. 93).  To even speak of “a ‘morally evil’ or ‘morally good’ social system makes no sense in the world of Christian belief” (p. 93).  The vacuous “demythologization” project of modernists such as Rudolf Bultmann elicits Kolakowski’s erudite disdain, for it was merely a fitful gasp of the irrational skepticism launched centuries ago by William of Occam and the nominalists, then subtly advanced by David Hume and the 18th century empiricists.  In truth, “there is no way for Christianity to ‘demythologize’ itself and save anything of its meaning.  It is either-or:  demythologized Christianity is not Christianity” (p. 105).

Demythologized Christianity contradicts itself.  In this respect it’s simply another utopian political ideology.  Having early advocated the Marxist version of utopia, Kolakowski easily detects the many currents of such blissful imagining—popularly expressed in John Lennon’s song “Imagine.”  Consider the fantasies of folks who envision a world wherein fraternity is realized, where equality prevails in every realm.  They “keep promising us that they are going to educate the human race to fraternity, whereupon the unfortunate passions that tear societies asunder—greed, aggressiveness, lust for power—will vanish” (p. 139).  Inevitably they establish dictatorships designed to enforce the mirage of equality.  Allegedly admirable goals—caring for the impoverished and weak—require the abolition of private property and a state-controlled economy, the abolition of the free market.  However noble the intentions, “the abolition of the market means a gulag society” (p. 167).

In the name of compassion, giving preferential treatment to various disadvantaged groups, societies easily “retreat into infantilism” (p. 173).  Citizens become dependent, childlike welfare recipients.  The State assumes more and more responsibility to care for everyone’s needs, and we “expect from the State ever more solutions not only to social questions but also to private problems and difficulties; it increasingly appears to us that if we are not perfectly happy, it is the State’s fault, as though it were the duty of the all-powerful State to make us happy” (p. 173).  The State, of course, cannot possibly do this.  Yet this blatantly utopian longing drove some of the most powerful mass movements of the 20th century, most of them Marxist to some degree.  Marx, of course, didn’t envision the gulags that would result from the implementation of his socialistic ideas!  But Lenin and Trotsky were, in fact, faithful to his precepts, installing a “dictatorship of the proletariat” that could not but violently pursue its agenda.  Reducing ethics to “fables” and doing whatever was necessary to advance his cause, Lenin simply implemented his Marxist principles.

289 Notable Conversions–Andrew Klavan & Sally Read

 

When I review books I hope some readers find bits of valuable information and perhaps pick up a copy if it interests them.  But some books I not only read and relish but wish everyone could enjoy the enlightenment and beauty they afford.  Such is Andrew Klavan’s The Great Good Thing:  A Secular Jew Comes to Faith in Christ (Nashville:  Nelson Books, c. 2016), wherein a gifted writer speaks persuasively, reaffirming the perennial allure of the Incarnate Savior, our Lord Jesus Christ.  Klavan is well-known in the literary world, considered by Stephen King a “most original American novelist of crime and suspense.”  But rather than keeping us in suspense Klavan, in The Great Good Thing, tells us about his conversion, culminating with his Christian baptism at the age of 49.  “No one could have been more surprised than I was,” he says.  “I never thought I was the type.  I had been born and raised a Jew and lived most of my life as an agnostic.  I believed in the fullest freedom of thought into the widest reaches of fact and philosophy.  I believed in science and analysis and reasonable explanations.  I had no time for magical thinking of any kind.  I couldn’t bear solemn piety.  I despised even the ordinary varieties of willful blindness to the tragic shambles of life on earth.”  In short, for half-a-century he’d been a hard-boiled realist—“a worldling by nature” (p. xiii).

Flourishing as a writer, Klavan “was one of the men of the coasts and cities, at home among the snarks and cynics of these postmodern times” (p. xvi).  Yet here he was, confessing “that Jesus Christ was Lord” and accepting “the uniquely salvific truth of his life and preaching, death and resurrection—this, it seemed to me even in the moment, was to renounce my natural place in the age, to turn against my upbringing and my kind.  It felt, so help me, as if I were flinging myself off the deck of a holiday cruise ship, falling away from its lighted ballrooms and casinos, from the parties and the music and sparkling wine of Fashionable Ideas, to go plunging down and down and did I mention down into a wave-tossed theological solitude” (p. xv).  In a sense it made no sense!  But in a deeper sense, it was a coming together of the central themes of his novels wherein his “heroes were always desperately on the run, desperately trying to get at a truth that baffled their assumptions and philosophies” (p. xvi).  They wanted to make sense of the world but couldn’t find the key.

Slowly, through much reading and writing and personal experience, he discovered the key—the answer to Pilate’s question, “What is truth?”—could be found only in the message proclaimed by The Gospel According to St. John!  Jesus Is the Truth!  Klavan’s spiritual journey, rather like C.S. Lewis’s, took place over a number of years wherein he moved from agnosticism to belief in God.  He’d begun praying and found his life improved by the discipline.  He’d “become like a character in one of my own stories, desperately trying to unknit the fabric of fact and perception, to separate the warp of psychology from the weft of objective truth, before time ran out” (p. xix).  He fully understood the risks entailed—a successful Jewish writer, safely ensconced in an upscale Santa Barbara suburb, daring to declare himself a Christian.  What would that mean?  “‘Oh, God,’ I prayed fervently more than once, ‘whatever happens, don’t let me become a Christian novelist!’” (p. xx).  “Would I descend into that smiley-faced religious idiocy that mistakes the good health and prosperity of the moment for the supernatural favor of God?” (p. xx).  And in becoming a Christian he determined not to forsake his Jewish ancestry and culture.  Could it happen?

Well, it did.  He found Christ—or, in that paradoxical mystery of redemption, Christ found him!  Consequently, he found himself “rejoicing.  I was convinced and fully convinced:  my mind was God’s, my soul was Christ’s, my faith was true.  How had that happened and why?  Given the spiritual distance I’d traveled, given the depths of my doubts, given the darkness of my most uncertain places, and given, most of all, the elation and wonder I felt at the journey’s end, it seems to be a story worth telling” (p. xxv).  

It’s a story worth telling—and for us it’s a story worth reading!  Klavan recounts his early years in Great Neck, New York, “a wealthy town, a well-tailored suburban refuge from the swarming city,” where he was immersed in an upper-middle-class, secularized Jewish community, the son of a successful New York morning drive radio personality.  But as a child he was inwardly unhappy and spent much time daydreaming, constructing elaborate fantasies featuring himself as the invariably tough-guy hero.  Much of his school-time was devoted to fantasizing rather than studying.  He seemed to be a good student, “but it was all fraud.  I could read well and write well and talk glibly and even figure out math problems in my head.  So I could bluff my way through subjects I knew nothing about, and neither my teachers nor my parents, nor even my friends, were aware that I was hardly doing any schoolwork at all” (p. 28).  In fact he learned nothing—“no historical facts, no mathematical formulas, no passages from the books we were supposed to have read” (p. 28).

Nor did he learn much about Judaism.  His family’s Jewishness was purely cultural, extending to only a few traditions.  “God was not a living presence in my home” (p. 45), and his required attendance (“suffocating torture”) at Hebrew school in the local synagogue left no impression on him.  “My father used to say, ‘You can’t flunk out of being Jewish.’  But man, I tried” (p. 48).  Forced to submit to his Bar Mitzvah, he ad-libbed his way through it and was startled by the “fortune in gifts” he received.  But inwardly he felt only “rage and shame” at participating in a ritual mouthing words he disbelieved.  He knew he was a hypocrite and hated himself for it.  “With great pomp and sacred ceremony, they had made me declare what I did not believe was true—and then they had paid me for the lie with these trinkets!  I felt that I had sold my soul” (p. 55).

Though his family’s Judaism hardly affected young Klavan, a brief exposure to a thoroughly Christian family did!  A woman, Mina, who worked for his family and became virtually a family member, gave him “a substantial portion of what mothering I had” (p. 61).  With neither husband nor children of her own, “she just took care of people, that’s all” (p. 62).  Though he didn’t really understand it, Mina “was a true Christian.  Religious, I mean, even devout” (p. 64).  She “never mentioned Jesus to me, but he was alive and real to her” (p. 64).  Allowed to go to her gaily decorated, music-filled house one Christmas, Klavan felt himself in a “wonderland” surrounded by cheerful, caring people.  Many years later, preparing for his baptism, he marveled that “Jesus had first entered my consciousness” at that first Christmas at Mina’s house (p. 68).  Thenceforth, even in his most agnostic, secular stages, he retained a deep fondness for Christmas, even celebrating the season with a sincerity lacking in some Christian circles!  But:  “It was Christmas we loved, the bright tradition, not Christ, never Christ” (p. 74).

Nor did he love schooling of any kind!  Early on he’d determined to become a writer, and he thought only personal experience could teach him what he wanted to know.  So he entered a program designed to enable youngsters to finish high school early and launched out on his own, at the age of 17, to enter “the world of Experience behind the walls” (p. 119).  He worked at various jobs, traveled hither and yon, and certainly experienced many things.  In the midst of his wandering, for reasons he cannot recall, he applied for admission to the University of California at Berkeley and was accepted.  But he was late on the scene.  “The radical years were over.  The riots and mayhem I’d been hoping to see had passed like a storm” (p. 123).  Though nominally a student, he mainly drank and slept and tried to teach himself how to write.  

When he did go to class, he “went through all the usual razzle-dazzle shenanigans:  bluff and fakery.  I read none of the books.  I conned and wrote my way to passing grades” (p. 124).  But for some reason he always bought the books required for the courses and kept them.  Back then, when postmodernism was just beginning its onslaught, some university professors still assigned the “classics” and encouraged students to engage in the “Great Conversation, an interchange carried on across the centuries by the major thinkers and artists of the Western canon.  The idea was that by studying this conversation you could move closer to the Truth and so find a fuller wisdom about reality and what made for the Good life” (p. 134).  So Klavan’s growing library contained the works of the masters and one day, lying listlessly in bed, he picked up a William Faulkner novel.  Suddenly he discovered what literature was all about!  And he began to read, on his own, the classic literary works of Western Civilization.  “Without knowing it, I had joined the Great Conversation” (p. 138).

Not only did he discover the classics at Berkeley—he found a wife, Ellen, the daughter of the chairman of the Berkeley English department.  Ellen’s parents embraced him, and he managed to graduate from the university as well as marry her and make a lasting marriage of invaluable worth.  Though he’d been “a fool in many ways,” by marrying her “somehow—and not for the last time in my life—I had managed to stumble into the great good thing” (p. 158).  The young couple then moved to New York and managed to survive, working at various jobs while he tried to become the writer of his dreams.  Amidst his manifest lack of success as a writer he began struggling with mental and emotional issues that led him to enter a five-year stint of psycho-analysis, wherein a gifted therapist greatly helped him to get mentally healthy.  In the midst of his depression (both mental and financial), however, he decided to write a suspense novel—using a pseudonym, since such was not the genre of “serious” writers.  He and his brother quickly wrote—and sold—the book, which then won the Edgar Award for best paperback mystery.  Better yet, they also got a movie deal.  He not only made money but discovered “that telling such stories was my gift” (p. 170).  In time he would become a highly-acclaimed and financially successful suspense novelist.  

Among the many events that opened his mind to God was the birth of his first child.  “Sex, birth, marriage, these bodies, this life, they were all just representations of the power that had created them, the power now surging through my wife in this flood of matter, the power that had made us one:  the power of love.  Love, I saw now, was an exterior spiritual force that swept through our bodies in the symbolic forms of eros, then bound us materially, skin and bone, in the symbolic moment of birth.”  Watching the baby emerge from the womb, Klavan experienced a truly mystical moment.  “I became not one flesh with my wife but one being beyond flesh with the love I felt for her.  My spirit washed into that love and became part of it, a splash in a rushing river.  In that river of love, I went raging down the plane of Ellen’s body until the love I was and the love that carried me melded with the love I felt for the new baby we had made together and I became part of the love as well,” and he saw he “was about to flow out into the infinite.  I saw that, beyond the painted scenery of mere existence, it was all love, love unbounded, mushrooming, vast, alive, and everlasting.  The love I felt, the love I was, was about to cascade into the very origin of itself, the origin of our three lives and of all creation” (p. 191).

In time he realized:  “You cannot know the truth about the world until you know God loves you, because that is the truth about the world” (p. 236).  Tasting the reality of love, he sought Love!  He began slipping into churches and even attending services—and then met an engaging Episcopal rector.  He appreciated the music as well as the messages and began to “realize there was a spiritual side to life, a side I had been neglecting in my postmodern mind-set” (p. 195).  Intuitively, he knew morality itself requires a transcendent foundation.  “An ultimate Moral Good cannot just be an idea.  It must be, in effect, a personality with consciousness and free will” (p. 205).  “In the chain of reasoning that took me finally to Christ, accepting this one axiom—that some actions are morally better than others—is the only truly nonlogical leap of faith I ever made.  Hardly a leap really.  Barely even a step.  I know it’s so.  And those who declare they do not are, like Hamlet, only pretending” (p. 206).

Coming to faith in Christ proved momentous:  “My personality was so transformed I hardly recognized myself” (p. 211).  Filled with joy, Klavan flourished as a writer and father.  His written works reveal his mind’s journey, refuting the postmodernism firmly entrenched in the nation’s intelligentsia and working through the anti-Semitism obvious in the Western Christian Culture he’d come to love.  He saw the truth revealed in the words of one of his own characters in True Crime:  “‘You want to believe in God,’ the pastor says, ‘you’re gonna have to believe in a God of the sad world’” (p. 225).  Sin has shattered our world, and it’s full of evil—including the Holocaust.  But the Savior has saved us from sin!  “In this new mental freedom, I came to see that the dilemma I had been wrestling with—my love of a culture that had done so much evil and yet produced such lasting beauty—was only my personal portion of the greater human paradox.  We are never free of the things that happen.  Even evil weaves itself into the fabric of history, never to be undone.  Yet at the same time—at the very same time—each of us gets a new soul with which to start the world again.”  Jesus “offered a spiritual path out of the history created by Original Sin and into the newborn self remade in his image.  It is the impossible solution to the impossible problem of evil.  All reason says it can’t be so.  But it’s the truth that sets us free” (p. 229).

This book is so good that (as is evident in the long quotations) it’s tempting just to duplicate the entire text!  So let me just share Eric Metaxas’ encomium:  “Andrew Klavan’s superb new book deserves to become a classic of its kind.  Klavan’s immense talents as a writer are on full view in what must certainly rank as his most important book to date.  Tolle lege [take up and read—the words Augustine heard prior to his conversion].”

* * * * * * * * * * * * * * * * * * * * * * * * * * * * 

Inasmuch as all heaven rejoices when a single sinner repents, all converts are equally treasured.  But inasmuch as every person is unique, each conversion story adds depth and texture to the ways of God with man. Thus one of the most recent conversion accounts, Night’s Bright Darkness:  A Modern Conversion Story (San Francisco:  Ignatius Press, c. 2016), by Sally Read, a contemporary English poet, merits our attention.  

Reared in a militantly atheist home—her father a vociferous Marxist journalist—Sally Read represents much about today’s secularism.  “At ten I could tell you that religion was the opiate of the masses; it was dinned into me never to kneel before anyone or anything.  My father taught me that Christians, in particular, were tambourine-bashing intellectual weaklings.  As a young woman I could quote Christopher Hitchens and enough of the Bible to scoff at.  My father would happily scoff with me” (#83).  There is neither God nor soul.  Matter alone exists, she thought.  Yet as she began working as a nurse in a psychiatric ward she met patients whose sufferings and dyings gave her pause.  And amidst the intemperate drinking and casual sex that punctuated her work-week routine, she occasionally felt strangely drawn to old churches in the neighborhood.

Then her father’s death at the age of fifty-six distressed her.  “I felt as if a god had died.  The creator of my world and my protector had gone” (#231).  Feeling abandoned and inwardly empty, she felt as if she were in hell and wondered what, indeed, life is all about.  She lost weight and hair.  She “even considered, in a desperate and vague way, invoking God.”  Perhaps some kind of faith would make life “liveable.  But it seemed entirely unfeasible to believe in any God; I thought I could never lower myself to that degree of self-delusion” (#246).  Looking back, she now considers that desolate phase of her life a blessing, for God was mysteriously working therein to bring her to Himself.  “His absence was so painfully loud it seems, now, to prove his existence,” for He “reaches us wherever we are, even if we are so far from knowing him that we mistake him completely.  His infinity always contains our finitude” (#253).  

In the six years following her father’s death, Sally Read became a published poet, married an Italian and gave birth to a baby girl, Florenzia.  She had to “battle” for both a wedding and a child in a world which welcomed neither, but in her heart she just knew such things were right.  Then ultimate, metaphysical questions began to haunt her, and while pregnant she wondered, “What was it I was creating?” (#313).  Having successfully published two volumes of poetry, she envisioned writing a more journalistic work designed to help women nourish their emotional and reproductive health.  Personally, she “had suffered debilitating physical and psychological effects while taking oral contraceptives” and wanted to explore that issue as well as “abortion and its effects on women” (#364).  So she interviewed various women while researching the book and in Rome encountered some devout American Catholics whose husbands were studying at pontifical universities.  Though they lived by standards utterly unlike hers, she was strangely warmed by their zest for life and constant awareness of God in their daily activities.

Consequently, she began visiting churches and got acquainted with a godly guide, Father Gregory, a Ukrainian studying for the priesthood, whose gentle counsel and literary references helped nudge her to belief in God.  He encouraged her to pray, though she had no idea how.  But she picked up T.S. Eliot’s Four Quartets and sensed almost immediately an inner peace, an acceptance of a Reality around her that unleashed a torrent of tears.  When she told Father Gregory of that moment, he sent her a poem by St. John of the Cross which further inspired her.  Entering a nearby church she felt:  “The strange calm that had come upon me that night the week before had settled into a new longing to know what to do.”  Looking up, she saw an icon of Christ’s face in a window and said:  “‘If you’re there, you have to help me’” (#650).  And He did!  “I felt almost physically lifted up.  My eyes stopped crying instantly, my face relaxed.  It was like being in the grip of panicked amnesia, when suddenly someone familiar walked into the room and gave myself back to me—a self restored to me more fully than before.  It was a presence entirely fixed on me as I was on it, and it both descended toward me and pulled me up.  I knew it was him.  This was the hinge of my life; this compassion and love and humility so great it buckled me as it came to meet me.  Later, I would read in Simone Weil’s writings what seemed a very similar experience—how, as she prayerfully recited George Herbert’s poem ‘Love,’ ‘Christ himself came down and took possession of me’” (#658).

Thenceforth she freely prayed, reciting the Our Father, knowing she was safe in His arms and feeling “as if the Birth, the Crucifixion, the Resurrection were plunged into my being in one gorgeous blow—this is how it is to all of a sudden know the meaning of reality:  the heart kick-started to sense its intrinsic architecture of logic, love, and reason” (#658).  She intimately sensed the Presence of the Living Lord.  “There was a feeling of being known in every cell.  My aloneness was taken away from me; and though it has often since returned, I know that loneliness is the illusion and Christ beside me the reality.  This was my earliest prayer:  being attuned to Christ’s presence, which by grace I perceived in those early days as strongly as my daughter’s breathing or the sound of the blackbird singing at night in the garden.  Prayer became essential,” and she sensed “being touched—if so pale a word can describe the sensation of being broken and healed—touched that he had come to me when I had rejected him and spoken against him and published lies about him in my books” (#680).

At that point, though Catholics had certainly guided her, she had no interest in Catholicism, with all its dogmas and rules.  But she began reading the Gospels and found the Jesus revealed therein quite unlike the “gentle Jesus, meek and mild” proclaimed in liberal churches.  She came to see that the Truth was something given to us, not something we fashion, something best established in the Church of the Apostles.  And the Truth she encountered led her, step by step, into the Catholic Church.  Night’s Bright Darkness reveals a poet determined to discern and beautifully describe the Reality into whose fullness she entered.

# # #

288 Debunking Utopia

In Dostoyevsky’s The Brothers Karamazov, Father Zossima tried to counsel a distraught woman by encouraging her to embrace an “active love” by helping others.  She unfortunately failed to follow his advice, settling into an abstract “love for humanity.”  Father Zossima called hers a “love in dreams” and noted that “love in action is a harsh and dreadful thing compared with love in dreams,” for when daydreaming, or imagining how we can help “humanity,” we slip into a non-existent future that helps no one.  Consequently we label “utopian” those societies designed for beings quite unlike our species.  Nima Sanandaji, in Debunking Utopia:  Exposing the Myth of Nordic Socialism (Washington, D.C.:  WND Books, c. 2016), reminds us, with ample facts, that socialism forever fails simply because it cannot succeed in the real (as opposed to the imaginary) world.  He makes three important points:  1) post-WWII Scandinavia’s economic success results from the region’s cultural roots rather than socio-political structures; 2) trying to duplicate Nordic economic structures elsewhere cannot but fail; and 3) “democratic socialism” is now collapsing even in Scandinavia due to its intrinsic flaws—indeed, “the days when the Nordic countries could actually be socialist are long gone” (#2333).

Sanandaji’s family emigrated from Iran to Sweden in 1989, and he personally enjoyed all the benefits of that nation’s generous (socialistic) welfare state, ultimately finishing a Ph.D. in economics, publishing 20 books and scores of scholarly papers.  After years of carefully examining the “democratic socialism” of Nordic countries (Sweden, Norway, Finland, Iceland, and Denmark), he understands why it’s seriously flawed and now imploding.  Indeed, though American socialists (e.g. Bernie Sanders) and  progressives (e.g. President Obama) naively laud them, today “only one of the five Nordic countries has a social democratic government” (#107).  In various ways Scandinavians seem to be moving away from a failed model.  

Without question the Nordics enjoy many good things—longevity, education, health care, women’s rights, generous vacations, etc.  But the good life evident there, Sanandaji insists, results mainly from Nordic culture rather than socialist structures.  They all “have homogenous populations with nongovernmental social institutions that are uniquely adapted to the modern world.  High levels of trust, a strong work ethic, civic participation, social cohesion, individual responsibility, and family values are long-standing features of Nordic society that predate the welfare state.  These deeper social institutions explain much of the success of the Nordics” (#337).  Imagining that the United States—or African or South American countries—could duplicate the Nordic model without the Nordic culture is simply wishful (and extraordinarily frivolous) thinking.  Still more:  it’s important to acknowledge that many of the world’s finest places, enjoying the highest level of well-being, are places like Switzerland and Australia which differ markedly from the Nordics.

For instance, using one of the main criteria for national well-being—longevity—we find Japan, Switzerland, Singapore, and Australia at least as good as the Nordics.  “Instead of politics, the common feature seems to be that these are countries where people eat healthily and exercise” (#423).   Rather than thinking welfare states make everyone healthy through universal health care, we should understand the life-style ingredients that truly matter.  Then consider the celebrated economic “equality” praised by the likes of  Bernie Sanders.   Sanandaji’s brother Tino (also an economist) notes:  “‘American scholars who write about the success of the Scandinavian welfare states in the postwar period tend to be remarkably uninterested in Scandinavia’s history prior to that period.  Scandinavia was likely the most egalitarian part of Europe even before the modern era’” (#521).  

In part this grew out of the region’s agrarian roots.  For centuries hard-working farmers had survived in an unusually difficult environment.  Necessarily they forged a culture “with great emphasis on individual responsibility and hard work” (#630).  They also secured property rights and embraced a market system that enabled them to thrive as independent yeomen committed to the “norms of cooperation, punctuality, honesty, and hard work that largely underpin Nordic success” (#659).  These norms were then brought to the United States by Scandinavian immigrants in the 19th century, and we find transplanted Swedish-American and Norwegian-American communities distinguished by conscientious, law-abiding, hard-working people.  Consequently they thrived and easily entered the mainstream of their new nation.  Today the eleven million Americans who identify themselves as Nordic are doing even better than their kinsmen still living in Scandinavia and “have less than half the average American poverty rate” (#830).   Culture, not economics, explains the difference!   

Rather than helping improve Scandinavia, Sanandaji says, socialism has actually harmed the region!  As he notes:  “The Economist explains:  ‘In the period from 1870 to 1970 the Nordic countries were among the world’s fastest-growing countries, thanks to a series of pro-business reforms such as the establishment of banks and the privatization of forests.  But in the 1970s and 1980s the undisciplined growth of government caused the reforms to run into the sands.’  Around 1968 the Left radicalized around the world . . . .  The social democrats in Sweden and other Nordic countries grew bold, and decided to go after the goose that laid the golden eggs:  entrepreneurship” (#1104).  Implementing “democratic socialism” they targeted and taxed the “rich”—the businessmen, the wealth-creators, the very folks responsible for their nations’ prosperity.  Though Scandinavian countries enjoyed remarkable prosperity immediately following WWII, by becoming welfare states they struggled for the next half-a-century to preserve it.  “Third Way socialist policies are often upheld as the normal state of Swedish policies.  In reality, one can better understand them as a failed social experiment, which resulted in stagnating growth and which with time have been abandoned” (#1127).

Rather than celebrating the glories of socialism, the Nordics have learned a sad lesson and recently turned toward a more free market economy.  They grew “rich during periods with free-market policies and low taxes, and they have stagnated under socialist policies.  Job growth follows the same logic” (#1250).  Small government and low taxes spell prosperity; intrusive government and high taxes make for slow (or no) growth.  Recognizing this—and retreating from state-run monopolies—educational and health care facilities have been “opened up in Sweden as voucher systems, allowing for-profit schools, hospitals, and elderly care centers to operate with tax funding” (#1458).  Such moves “drove up wages, evident by the fact that these individuals gained 5 percentage points’ higher wages than similar employees whose workplaces had not been privatized” (#1485).

Sanandaji has written this book to warn Americans who look favorably on “democratic socialism” in a nation “only very marginally more economically free than Denmark” (#2348).  Noting that Franklin D. Roosevelt was the “architect of the American welfare state,” he then reminds us that FDR also warned:  “‘The lessons of history, confirmed by the evidence immediately before me, show conclusively that continued dependence upon relief induces a spiritual and moral disintegration fundamentally destructive to the national fibre.  To dole out relief in this way is to administer a narcotic, a subtle destroyer of the human spirit.  It is inimical to the dictates of sound policy.  It is in violation of the traditions of America’” (#1512).  Recently a German scholar, Friedrich Heinemann, has validated FDR’s concern regarding “moral disintegration” in welfare states.  Data from the World Values Survey indicate “a self-destructive mechanism exists” in them which dissolves norms.  This is sadly evident, Sanandaji says, in Nordic lands, whose celebrated “work ethic” has dissipated.  Many young Scandinavians work less diligently than their parents, fail to form solid families, and falsely claim to be “sick” to avoid work when it suits them.

Debunking Utopia should give us Americans additional pause as we endeavor to shape our nation’s immigration policies as well as its economy, for Sanandaji dolefully describes the mounting problems Sweden faces as a result of opening the border to refugees from Muslim nations.  “Sweden is in a ditch because many politicians, intellectuals and journalists—on both the left and the right—have claimed that refugee immigration is a boon to the country’s economy and that large-scale immigration is the only way of sustaining the welfare state. . . .  But of course, serious research has never shown that refugee immigrants boost the Swedish economy.  The truth is quite the opposite” (#2216).  In truth, poverty and crime and educational problems have accelerated as waves of immigrants have washed over the country.

In fact, Americans should look seriously at their own traditions and seek to revive them rather than fantasizing about any form of “democratic socialism.”  As an old country song declares:  “there ain’t no living in a perfect world,” and “the true lesson from the Nordics is this:  culture, at least as much as politics, matters.  If the goal is to create a better society, we should strive to create a society that fosters a better culture.  This can be done by setting up a system wherein people are urged to take responsibility for themselves and their families, trust their neighbors and work together.  The Nordic countries did evolve such a culture—during a period when the state was small, when self-reliance was favored.  For a time these societies prospered while combining strong norms with a limited welfare state, which was focused on providing services such as education rather than generous handouts.  Then came the temptation to increase the size of the welfare state.  Slowly a culture of welfare dependency grew, eroding the good norms” (#2395).  Only by resurrecting those good norms—and abandoning the failed welfare state—can Scandinavians or Americans truly prosper.  

                     * * * * * * * * * * * * * * * * * * * * * * * * * * 

“Two decades ago, New Zealand went through a dramatic transformation, from a basket case welfare state saddled with crushing public debt, rampant inflation, and a closed and moribund economy, to what is today widely regarded as one of the freest and most prosperous countries in the world.  This is the story of how that happened.”  So Bill Frezza sums up his New Zealand’s Far-Reaching Reforms:  A Case Study on How to Save Democracy from Itself (Guatemala, Guatemala:  The Antigua Forum, c. 2015).  Following WWII—and before such reforms—as more and more money was printed to fuel more and more welfare programs, New Zealand graphically illustrated H. L. Mencken’s quip:  “Every election is a sort of advance auction sale of stolen goods.”  Indeed, Robert Muldoon won election as prime minister in 1975 by running on a platform described by critics as a “denial of economic reality accompanied by bribery of the voters.”

But all this changed through reforms orchestrated by two politicians (Roger Douglas and Ruth Richardson) representing opposing parties, who made “common cause in a fight against their own party leaders” to save their nation from corrosive inflation and ultimate bankruptcy in the late 1980s and early 1990s.  They both demanded “honest and open accounting” wherewith to tell the truth regarding the nation’s condition.  “If private businesses kept their books the way most governments keep their books, our jails would be full of CEOs” (#361).  But the reformers determined:  “The country was going to be run more like a successful business than a public piggy bank” (#554).  Richardson especially focused on reforming the educational system, turning it toward charter schools answerable to the parents.  Douglas worked to reduce corporate and personal income taxes, eliminate inheritance taxes, and establish a consumption tax.  Since they eliminated government-sponsored enterprises, such as Fannie Mae and Freddie Mac in the U.S., New Zealand didn’t suffer the housing bubble that burst and devastated America in 2008.

Consequently, “New Zealand enacted its most lasting reforms when advocates for efficient government, free markets, free trade, sound money, and prudent fiscal policy came together” and legislative acts were passed that “forever changed the way New Zealand’s government did business” (#248).  Frezza explains the important acts and shows how they changed the country.  Government agencies were privatized and compulsory union membership eliminated.  “All civil service employees were moved from job-for-life union contracts and a seniority based advancement regime to individual employment contracts and a merit based regime” (#331).  Opposition to such changes was inevitably intense.  “As a poster child for the bitter medicine being administered, Ruth became the most hated politician in New Zealand.  Effigies were burned in the streets, protesters poured a pot of blue paint on her (she saved the ruined dress for a charity auction), and police had to protect her on her jog to work every morning” (#549).

But the reforms worked and made a lasting difference.  The 2014 Index of Economic Freedom ranks “New Zealand fifth in the world, behind Hong Kong, Singapore, Australia, and Switzerland with ratings of 90 percent or higher for rule of law, freedom from corruption, business freedom, and labor freedom” (#623).  The nation’s GDP increased fourfold while the national debt shrank from $25 billion in 1993 to $15 billion in 2007.  Trade, especially with China, has flourished.  In retrospect, Frezza says (wondering if the New Zealand story can be duplicated):  “The list of success factors required for a democracy to flourish economically is not long:  honesty, integrity, transparency, accountability, efficiency, thrift, prudence, flexibility, freedom, leadership, and courage.  Does anyone care to stand up and deny that these are virtues not just of good government but of a good life?  Although universally acclaimed by economists, philosophers, and theologians, why are these virtues so hard to find in governments and politicians?” (#668).  Unfortunately, politicians such as Roger Douglas and Ruth Richardson, willing to risk losing elections and incurring criticism, rarely appear.  But without them majoritarian democracies will, it seems, sadly enough, generally follow a destructive path.

* * * * * * * * * * * * * * * * * * * * * * * *

In The Problem with Socialism (Washington, D.C.:  Regnery Publishing, c. 2016), Thomas DiLorenzo notes that a recent poll showed “43 percent of Americans between the ages of eighteen and twenty-nine had a ‘favorable’ opinion of socialism” and preferred it to capitalism (#85).  Another poll indicated 69 percent of voters under 30 would support a socialist for President—as Bernie Sanders’ near victory in the Democrat Party primaries certainly illustrated.  Misled by a multitude of educators, these young folks fail to grasp G.K. Chesterton’s insight:  “the weakness of all Utopias is this, that they take the greatest difficulty of man [i.e. original sin] and assume it to be overcome, and then give an elaborate account [i.e. redistribution of wealth] of the overcoming of the smaller ones.  They first assume that no man will want more than his share, and then are very ingenious in explaining whether his share will be delivered by motorcar or balloon” (Heretics).   Or, as Lady Margaret Thatcher famously quipped:  the ultimate problem with socialists is that they eventually run out of other people’s money to spend.

Though socialism in the 19th century meant the “government ownership of the means of production,” in the 20th century it morphed into redistributive measures designed to eliminate all sorts of inequalities through progressive taxes and regulatory edicts.  Inevitably socialists want government to control as many industries as possible (e.g. health care), confiscate as much land as possible (e.g. national forests), and destroy capitalism “with onerous taxes, regulations, the welfare state, inflation, or whatever they thought could get the job done” (#127).  Just as inevitably, nations embracing socialism impoverish themselves.  Africa bears witness to the fact that 40 years of socialistic experiments left its nations “poorer than they had been as colonies” (#183).  Indeed, one of DiLorenzo’s chapters is entitled:  “socialism is always and everywhere an economic disaster.”  A glance at American history shows how socialistic endeavors in colonial Jamestown utterly failed.  But, Matthew Andrews says:  “‘As soon as the settlers were thrown upon their own resources, and each freeman had acquired the right of owning property, the colonists quickly developed what became the distinguishing characteristic of Americans—an aptitude for all kinds of craftsmanship coupled with an innate genius for experimentation and invention’” (#255).  Socialism, whether of the dictatorial or majority-rule democratic variety, is all about planning.  It’s preeminently “the forceful substitution of governmental plans for individual plans” (#588).  Planned economies always look wonderful to the planners.  But the plans inevitably founder when implemented because they run counter to human nature.

Socialists further violate human nature by seeking to dictate economic equality (e.g. “free” education, health care, housing, food, etc.), which “is not just a revolt against reality; it is nothing less than a recipe for the destruction of normal human society,” as became brutally evident in Russia and China (#374).  By eliminating capitalism’s division of labor and freeing each person to cultivate his own talent as well as his own garden, socialism (Leon Trotsky believed) would enable the perfection of our species so that the “human average will rise to the level of an Aristotle, a Goethe, a Marx.”   That such never happens—indeed could never happen—effectively refutes such utopianism.  “How remarkable it is that to this day, self-proclaimed socialists in academe claim to occupy the moral high ground.  The ideology that is associated with the worst crimes, the greatest mass slaughters, the most totalitarian regimes ever, is allegedly more compassionate than the free market capitalism that has lifted more people from poverty, created more wealth, provided more opportunities for human development, and supported human freedom more than any other economic system in the history of the world” (#697).  

The intrinsic deficiencies of socialism are also on display in those “islands of socialism in a sea of capitalism—government-run enterprises like the U.S. Postal Service, state and local government public works departments, police, firefighters, garbage collection, schools, electric, natural gas, and water utilities, transportation services, financial institutions like Fannie Mae, and dozens more” (#500).  Though there may very well be practical reasons for their existence, they are “vastly more inefficient, and offer products or services of far worse quality than private businesses” (#508).   Economists generally hold “that the per-unit cost of a government service will be on average twice as high as a comparable service offered in the competitive private sector” (#508).  That privately owned and operated firms like UPS and FedEx prosper, while the USPS needs abiding subsidies, surprises no economist.  Nor does it surprise anyone that USPS employees “earn 35 percent higher wages and 69 percent greater benefits than private industry employees” (#558).  

This problem ultimately led to recent changes in Scandinavia, where free-market reforms are currently reversing decades of “democratic socialism.”  The Swedish Economic Association recently reported “that the Swedish economy had failed to create any new jobs on net from 1950 to 2005.”  Indeed, Sweden is actually “poorer than Mississippi, the lowest-income state in the United States” (#880).  Within a half-century, the nation slipped “from fourth to twentieth place in international income comparisons.”  It has simply proved “impossible to maintain a thriving economy with a regime of high taxes, a wasteful welfare state that pays people not to work and massive government spending and borrowing” (#855).  Of Denmark’s 5.5 million people, 1.5 million “live full-time on taxpayer-funded welfare handouts” (#890).  One Swedish economist, Per Bylund, observes that by giving out “benefits” and thereby “‘taking away the individual’s responsibility for his or her own life, a new kind of individual is created—the immature, irresponsible, and dependent.’”  Thus the celebrated, carefully planned Swedish “welfare state” has unintentionally created multitudes of “‘psychological and moral children’” (#872).  

Sadly enough, DiLorenzo concludes, socialism ultimately harms the very folks it’s designed to help—the poor.  It’s a “false philanthropy.”   And it should be resisted wherever possible. 

287 The Kingdom of Speech; Undeniable; Evolution 2.0

For five decades Tom Wolfe has remained a fixture atop the nation’s literary world—helping establish the “new journalism,” publishing essays and novels, credibly claiming to discern the pulse and diagnose the condition of America.  His most recent work, The Kingdom of Speech (New York:  Little, Brown and Company, c. 2016), finds him entering (with his customary wit) the somewhat arcane worlds of biological evolution and linguistics, finding therein much to question and pillory while educating us in the process.  He was prompted to research the subject when he read of a recent scholarly conference where “eight heavyweight Evolutionists—linguists, biologists, anthropologists, and computer scientists” had given up trying to answer “the question of where speech—language—comes from and how it works” (#18).   It’s “as mysterious as ever,” they declared!  Amazingly, one of the eight luminaries was Noam Chomsky, for 50 years the brightest star in the firmament of linguistics!  Now for academics such as Chomsky this is no small admission, for:  “Speech is not one of man’s several unique attributes—speech is the attribute of all attributes” (#36).  When the regnant Neo-Darwinian theory of evolution fails to explain language it fails to explain virtually all that matters!   

To put everything in historical context, Wolfe guides us through some fascinating developments in evolutionary theory, including deft portraits of Alfred Wallace and Charles Darwin (who maneuvered to co-opt Wallace’s insights while establishing himself as the singular architect of the theory of biological evolution of species through natural selection).  While styling himself an empirical scientist, Darwin subtly propounded a cosmogony that closely resembles the creation stories of many American Indians.  In fact, Darwin’s story, with its “four or five cells floating in a warm pool somewhere” developing into a world teeming with remarkable creatures was, rightly understood, a “dead ringer” for that of the Navajos!  “All cosmologies, whether the Apaches’ or Charles Darwin’s, faced the same problem.  They were histories, or, better said, stories of things that had occurred in a primordial past, long before there existed anyone capable of recording them.  The Apaches’ scorpion and Darwin’s cells in that warm pool somewhere were by definition educated guesses” (#281).  They were all “sincere, but sheer, literature” (#293).  

While telling his story, however, Darwin recognized that speech “set humans far apart from any animal ancestors.”  Other traits he might passably explain, but he utterly failed to show how “human speech evolved from animals” (#205).  “Proving that speech evolved from sounds uttered by lower animals became Darwin’s obsession.  After all, his was a Theory of Everything” (#215).  Critiquing this theory was England’s most prestigious linguist, Max Muller, who insisted there is a radical difference in kind between man and beast—and that difference is language.  “Language was the crux of it all.  If language sealed off man from animal, then the Theory of Evolution applied only to animal studies and reached no higher than the hairy apes” (#860).  Muller was eminent and arrogant—and he made fun of Darwin.  And then, just when Darwin mustered up the nerve to publish The Descent of Man, and Selection in Relation to Sex, declaring apes and monkeys evolved into human beings, the pesky Alfred Wallace (who had been busily writing trenchant biological treatises) wrote an article, “The Limits of Natural Selection as Applied to Man,” pointing out certain uniquely human traits, including language, impossible to explain through natural selection.  “No, said Wallace, ‘the agency of some other power’ was required.  He calls it ‘a superior intelligence,’ ‘a controlling intelligence.’  Only such a power, ‘a new power of definite character,’ can account for ‘ever-advancing’ man” (#694).  But this Darwin could not allow!  All must be the result of purely material, natural processes!  “He had no evidence,” Wolfe says, but he told a good “just so” story that captured much of the public mind.  Yet his followers, for 70 years, gave up trying to explain the origin of language and turned to simpler evolutionary matters, upholding the Darwinian standard and insisting, with Theodosius Dobzhansky:  “Nothing in Biology Makes Sense Except in the Light of Evolution.”  But not even Dobzhansky ventured to suggest precisely how speech evolved!  

Then came Noam Chomsky, who (as a graduate student at the University of Pennsylvania) set forth a revolutionary theory of linguistics, a “radically new theory of language.  Language was not something you learned.  You were born with a built-in ‘language organ’” (#1000).  Along with your heart and liver, you’re given it—a biological “language acquisition device” (routinely referred to as the LAD in the “science” of linguistics).  Chomsky summed it all up in his 1957 Syntactic Structures and thereby became “the biggest name in the 150-year history of linguistics” (#1012).  But what, precisely, was this LAD?  Was it a free-standing organ or an organ within the brain?  Like all else in the evolutionary scheme, it had to be something material.  But where could it be found?  Take it on faith, Chomsky said—in time empirical scientists would find it!  

After 50 years of absolute preeminence in the field of linguistics, however, Chomsky suddenly faced an antagonist!  Daniel L. Everett, having spent 30 years living with a small tribe in the Amazon jungle—the Piraha, arguably the most primitive tribe on earth—dared to challenge the Master!  He declared Chomsky’s theory falsified by the Indians he studied.  They “had preserved a civilization virtually unchanged for thousands . . . many thousands of years” (#1313), and no “language organ” or “universal grammar” could explain how they spoke.  When Everett presented his findings to the public a decade ago—declaring they provided “the final nail in the coffin for Noam Chomsky’s hugely influential theory of universal grammar” (#1393)—a “raging debate” ensued.  In fact, it was total war, with Chomsky and his epigones determined to destroy Everett!  They questioned his integrity, discounted his credentials, and schemed to ostracize him from the academic community.  

Fighting back, in 2008 Everett published Don’t Sleep, There Are Snakes, summarizing his 30 years among the Piraha.  Amazingly, for a linguistics treatise, it became something of a best-seller!  “National Public Radio read great swaths of the book aloud over their national network and named it one of the best books of the year” (#1637).  Dismissing Chomsky’s celebrated theory, Everett argued:  “Language had not evolved from . . . anything.  It was an artifact” (#1631).  He followed this up with Language:  The Cultural Tool, insisting “that speech, language is not something that had evolved in Homo sapiens, the way the breed’s unique small-motor-skilled hands and . . . or its next-to-hairless body.  Speech is man-made.  It is an artifact . . . and it explains man’s power over all other creatures in a way Evolution all by itself can’t begin to” (#1675).  Soon he found some distinguished defenders, including Michael Tomasello—co-director of the Max Planck Institute for Evolutionary Anthropology.  In an article entitled “Universal Grammar Is Dead,” Tomasello opined:  “‘The idea of a biologically evolved, universal grammar with linguistic content is a myth’” (#1663).  Then Vyvyan Evans published The Language Myth and simply dismissed the innate “language instinct” notion.  Still others soon joined the growing condemnation of the Chomsky thesis!  

Chomsky of course responded, defending himself—but subtly retracting some of his earlier hypotheses.  Then, in a long, convoluted article, we find him confessing:  “‘The evolution of the faculty of language largely remains an enigma’” (#1734).  An enigma, no less!  Fifty years of feigning The Answer!  (It seems Chomsky knows less than Aristotle, who concluded that humans have a “rational soul” enabling them to function in uniquely human ways.)  And to Tom Wolfe, this at least became crystal clear:  “There is a cardinal distinction between man and animal, a sheerly dividing line as abrupt and immovable as a cliff:  namely, speech” (#1890).  He thinks:  “Soon speech will be recognized as the Fourth Kingdom of Earth.”  In addition to the mineral, vegetable, and animal worlds, there is “regnum loquax, the kingdom of speech, inhabited solely by Homo Loquax” (#1938).  How interesting to find Wolfe affirming what an earlier (and deeply Christian) literary giant, Walker Percy, identified (in Lost in the Cosmos) as the “delta factor”—the symbolic language unique to our species.  There’s an immaterial dimension to language, rendering it impossible to reduce to (or explain by) mere matter.  

* * * * * * * * * * * * * * * * * * * * * * * * * 

While practicing their craft, scientists cannot but ask philosophical questions.  The empirical details of their discipline may very well prove interesting to certain scholars, but the deeper philosophical implications of their findings constantly press for examination and explanation.  Thus, in Undeniable:  How Biology Confirms Our Intuition that Life Is Designed (New York:  HarperOne, c. 2016), Douglas Axe, a highly-credentialed biologist (degrees from Cal Tech and Cambridge University; research articles published in peer-reviewed journals), notes:  “The biggest question on everyone’s mind has never been the question of survival but rather the question of origin—our origin in particular.  How did we get here?” (#195).  We cannot but wonder:  “What is the source from which everything else came?  Or, to bring it closer to home:  To what or to whom do we owe our existence?  This has to be the starting point for people who take life seriously—scientists and nonscientists alike.  We cannot rest without the answer, because absolutely everything of importance is riding on it” (#275).  

Axe mixes many enlightening personal anecdotes—struggling to survive within an antagonistic academic establishment while entertaining serious questions concerning the dogmas espoused therein—with an expertise honed in laboratories (most notably at Cambridge University) and through interactions with both eminent biologists and cutting-edge publications.  But he urges us not to rely upon prestigious authorities.  We should trust our common sense, believing what we see and intuitively know rather than what we’re told to see and believe.  He shares St. Paul’s probing conviction that “the wrath of God is revealed from heaven against all ungodliness and unrighteousness of men, who suppress the truth in unrighteousness, because what may be known of God is manifest in them, for God has shown it to them.  For since the creation of the world His invisible attributes are clearly seen, being understood by the things that are made, even His eternal power and Godhead, so that they are without excuse, because, although they knew God, they did not glorify Him as God, nor were thankful, but became futile in their thoughts, and their foolish hearts were darkened.  Professing to be wise, they became fools” (Ro 1:18-22).  

At an early age children (even if reared in atheist homes) prove St. Paul’s point, sensing there’s an ultimate God-like source responsible for a world that seems to function in accord with certain regularities and principles.  This Axe labels the universal design intuition—the recognition of an intelligent dimension to all that is.  Thus children “innately know themselves to be the handiwork of a ‘God-like designer,’” only to suffer schools wherein they’re generally “indoctrinated with the message that they are instead cosmic accidents—the transient by-products of natural selection” (#843).  To refute that materialistic dogma, philosophical rather than scientific, Axe presents in-depth scientific information pointing to intelligent design as the answer to our deepest questions.  He’s particularly adept at showing how the latest findings in molecular biology (in particular the tiny and incredibly complex proteins he examines in the laboratory) and cosmology make purely naturalistic explanations truly improbable.  Fortunately for the general reader, Axe explains things in simple, intelligible ways while demonstrating his mastery of the materials.  And he insists:  “What is needed isn’t a simplified version of a technical argument but a demonstration that the basic argument in its purest form really is simple, not technical” (#898).  We don’t need a Ph.D. in science to understand the common sense science basic to the question of origins.  

Axe’s argument actually takes us back to the Aristotelian metaphysical tradition (though he doesn’t overtly align himself with it), for the world we observe contains real beings (what he calls “busy wholes”) innately oriented to discernible ends.  There’s more to Reality than mindless matter—there’s information, reason, a Logos giving shape to that matter.  “When we see working things that came about only by bringing many parts together in the right way, we find it impossible not to ascribe these inventions to purposeful action, and this pits our intuition against the evolutionary account” (#1264).  Consider such amazingly complex creatures as spiders, salmon, and whales, each of which “is strikingly compelling and complete, utterly committed to being what it is” (#1117).  The material, formal, efficient, and final causes necessary to understand and explain them simply cannot be denied!  Thus life doesn’t just happen as a result of atoms randomly bouncing through space.  And to imagine life originated in a primordial pond of inorganic compounds violates both the empirical evidence of science and the laws of thought.  To anyone with eyes to see, “life is a mystery and masterpiece—an overflowing abundance of perfect compositions” that cannot be explained in accord with Darwin’s natural selection (#1129).  

Having presented his case, Axe says:  “The truth we’ve arrived at is important enough that we have a responsibility to stand up for it.  Think of this as a movement, not a battle.  When a good movement prevails, everyone wins” (#2835).  He further believes that Darwinian devotees are now on the defensive, retreating on many fronts.  They know Darwin himself understood his theory’s vulnerability, admitting:  “If it could be demonstrated that any complex organ existed, which could not possibly have been formed by numerous, successive, slight modifications, my theory would absolutely break down.”  But though he credibly described the survival of the species he simply failed—as have his successors—to explain their arrival!  A century of intensive research leaves unanswered the truly fundamental questions:  How did organic life (e.g. the first cell, with its genetic instructions for making proteins) arise from inorganic materials?  Why are humans uniquely conscious and marked by distinctively non-utilitarian traits such as altruism?  

Unlike many advocates of Intelligent Design, who insist they are not making an argument for the existence and power of God, Axe forthrightly moves from his scientific data and philosophical arguments to “naming God as the knower who made us.  I see no other way to make sense of everything we’ve encountered on our journey” (#3096).  The material world can only be—and be understood—because of an immaterial world, the spiritual and supernatural world.  “In the end,” he says, “it seems the children are right again.  The inside world is every bit as real as the outside one.  Consciousness and free will are not illusions but foundational aspects of reality, categorically distinct from the stuff of the outside world.  Following the children, if we allow ourselves to see the outside world as an expression of God’s creative thought, everything begins to make sense” (#3190).  

* * * * * * * * * * * * * * * * * * * * * * * * * * * * 

An electrical engineer by training and profession, fully immersed in cutting-edge computer developments, Perry Marshall has a consuming interest in evolutionary theory that prompted him to publish Evolution 2.0:  Breaking the Deadlock Between Darwin and Design (Dallas:  Benbella Books, Inc., c. 2015) in hopes of bringing about a rapprochement between folks primarily committed to Scripture and those strongly rooted in evolutionary Science.  Neither a “God of the Gaps” young earth creationism nor a mindless matter-in-motion Old School Darwinism will suffice, given the best available evidence.  Reared in a young-earth creationist milieu but educated as a scientist, Marshall long struggled with seemingly unanswerable questions.  But:  “One day I had a huge epiphany:  I suddenly saw the striking similarity between DNA and computer software.  This started a 10-year journey that led me down a long and winding road of research, controversy, and personal distress before I discovered a radical re-invention of evolution” (#333).   This combination of a divinely-engendered creation and evolutionary process he calls Evolution 2.0 and urges it upon his readers.  

Codes provide patterns for both computers and biological organisms.  Though we struggle to understand mysterious powers such as gravity and thermodynamics, we fully understand how to create computer codes.  And we absolutely know that codes do not write themselves!  Codes exist because minds devise them!  Without intelligent coders there could be no codes!  He supports both creationists (believing God encodes creation) and Darwinists (believing much of the evolutionary account).   Independently wealthy, Marshall even offers a multi-million dollar prize to the “first person who can discover a process by which nonliving things can create code” (#416).  But:  “Nobody knows how you get life from nonlife.  Nobody knows how you get codes from chemicals.  So nobody gets to assume life just happens because you have some warm soup and a few billion years; we want to understand how and why” (p. 178).  He stresses, in bold print:  “Codes are not matter and they’re not energy.  Codes don’t come from matter, nor do they come from energy.  Codes are information, and information is in a category by itself” (p. 187).  

Marshall begins this very personal book by confessing that his childhood cosmological beliefs were seriously challenged by the data he faced as a scientist.  He sincerely wanted to discover the truth and resolved to find it, whatever the cost.  “At the core of my being, I knew I could not live apart from integrity; I could not somehow make myself believe something that was demonstrably untrue” (p. 6).  Trusting his engineering training, he resolved to let it guide him, fully aware it might lock him into atheism.   The electrical engineering he’d mastered is highly mathematical—everything works precisely.  And as he delved into biological science he found tiny living organisms working just as precisely, following sophisticated instructions.  He also found that evolutionary theory nicely explained much that is evident in living creatures, confirming Darwin’s insights.   But one of the Darwinists’ core beliefs—that random mutations fully explain the information necessary for living beings—he found untenable.  There must be some intelligent source for the information markedly present in all that lives.  

It soon dawned on him that computer codes and biological DNA are remarkably alike.  It’s information that enables them both to work in such wonderful ways.  Prior to any evolving organisms there must be information, precisely coded in the DNA, that enables them to function.  “The pattern in DNA is a complex, intricate, multilayered language.  An incredibly efficient and capable language at that” (p. 165).  It’s not “natural selection” but “natural genetic engineering” that best explains the living world.  Marshall carefully discusses important things such as “transposition,” “horizontal gene transfer,” “epigenetics,” “symbiogenesis,” and “genome duplication” to illustrate the wonderful ways cells function.  “One cell can hold a gigabyte of data; plant and animal tissues have a billion cells per cubic centimeter” (p. 102).  A simple cell has information equivalent “to about a hundred million pages of the Encyclopedia Britannica” (p. 106).  Indeed:  “As amazing as Google is, a single cell possesses more intelligence than a multibillion-dollar search engine” (p. 235).  And within each cell there are the true building blocks of all organisms—tiny, information-laden proteins that enable the cell to thrive.  “No human programmer has ever written software that does anything even close to this.  Could randomness do it?  No chance” (p. 142).

Only God could have written the software evident in our world.  That many of the greatest scientists of the past—Newton, Kepler, Kelvin, Mendel, Planck—believed in such a God should encourage us to follow their example.  For himself, Marshall says:  “after years of research, expense, scrutiny, and debates, my conclusion is:  Not only is Evolution 2.0 the most powerful argument for a Designer that I’ve ever seen (!), but people of faith were on the cutting edge of science for 900 of the last 1,000 years.  The rift between faith and science might heal if everyone could see how evolution actually works” (p. 270).  Marshall has read widely and provides helpful bibliographic materials.  Though not a trained philosopher, he clearly understands sophisticated arguments and logic, and his scientific preparation enables him to both grasp and explain current data.  While not as conclusively persuasive as he might like, he does provide a valuable treatise on this absolutely crucial issue.

286 Clinton Cash; Armageddon; Stealing America

Peter Schweizer is an investigative journalist with a muckraker’s penchant for pursuing the darker dimensions of American politics, looking for scoundrels whose behavior needs exposing.  So in Architects of Ruin he detailed governmental corruption underlying the 2008 financial collapse; in Makers and Takers he highlighted the many faults of the welfare state; and in Throw Them All Out he brought to light the many suspicious stock trades enriching members of Congress.  Just recently, in Clinton Cash:  The Untold Story of How and Why Foreign Governments and Businesses Helped Make Bill and Hillary Rich (New York:  HarperCollins, c. 2015), he documents the extraordinary number of questionable ties linking the Clintons and their foundation to wealthy foreign governments and businessmen.  Almost all of his critical findings rest on circumstantial evidence.  Demonstrable quid pro quo transactions are by their very nature enshrouded in secrecy and rarely leave overt proof.  But Schweizer’s evidence leads the reader to suspect the Clintons of massive corruption and malfeasance in office.  Legally, there’s a Latin phrase—res ipsa loquitur (the thing speaks for itself)—that fully applies to Schweizer’s evidence.  When first published, the book was attacked and dismissed by the Clinton-supporting mainline media.  Thus ABC’s George Stephanopoulos (without disclosing the fact that he personally had contributed $75,000 to the Clinton Foundation) glibly assured viewers that nowhere did Schweizer establish any “direct action” taken by Hillary “on behalf of the donors.”  There were, he declared, no quid pro quo deals.  However, subsequent Congressional and FBI investigations make Schweizer’s case increasingly credible.  Res ipsa loquitur!  

Admittedly there has always been considerable dishonesty in American politics.  But the Clintons have been unusually close to wealthy foreigners, raking in millions of dollars for speeches and garnering  contributions for the Clinton Foundation.  Indeed, “the scope and extent of these payments are without precedent in American politics.  As a result, the Clintons have become exceedingly wealthy” (#167 in Kindle).  Indeed:  “No one has even come close in recent years to enriching themselves on the scale of the Clintons while they or a spouse continued to serve in public office” (#201).  “Dead broke” in 2001, Hillary claimed, they quickly prospered (accumulating $136 million within a decade) by circumventing the law which prohibits foreign interests from contributing to political campaigns.  Lavish speaking fees and gifts to the Clinton Foundation (which employed friends and covered lush “expense” accounts for the inner circle) were the “legal” (in fact the only discernable) ways whereby the Clintons became inordinately wealthy.  “The issues seemingly connected to these large transfers are arresting in their sweep and seriousness:  the Russian government’s acquisition of American uranium assets; access to vital US nuclear technology; matters related to the Middle East policy; the approval of controversial energy projects; the overseas allocation of billions in taxpayer funds; and US human rights policy, to name a few” (#236).  

Symptomatic of things to come was President Bill Clinton’s pardon (just before leaving the White House in 2001) of billionaire fugitive Marc Rich, who was living abroad to avoid facing a variety of charges.  One of the world’s richest men, he’d been indicted for illegal trading practices and tax evasion.   His “business ties included a ‘who’s who’ of unsavory despots, including Fidel Castro, Muammar Qaddafi, and the Ayatollah Khomeini.”  Rich “owed $48 million in back taxes that he unlawfully tried to avoid and faced the possibility of 325 years in prison,” earning him a place on the FBI’s Most Wanted List.  A federal prosecutor, Morris Weinberg, said:  “The evidence was absolutely overwhelming that Marc Rich, in fact, committed the largest tax fraud in the history of the United States.”  Rather than risk imprisonment, Rich remained abroad.  Fortunately for Rich, he had a charming former wife, Denise, who ingratiated herself with the Clintons, making lavish contributions—moving $1.5 million into their coffers.  President Clinton then pardoned Marc Rich soon after Denise donated “$100,000 to Hillary’s 2000 Senate campaign, $450,000 to the Clinton Library, and $1 million to the Democrat Party” (#270).  These transactions were helped along by Rich’s lawyer (and former White House counsel) Jack Quinn, who pled Rich’s case with Bill and Hillary.  Informed of the pardon, Rudolph Giuliani—then mayor of New York and the former U.S. attorney who had spearheaded the Rich investigations—refused to believe it.  Surely it was “impossible” that a president would pardon him.  But Clinton did.  Ever mindful of the letter of the law, he evaded clear quid pro quo connections, but what rational person could deny it!  Res ipsa loquitur!  Rich’s was merely one of Clinton’s many last-minute pardons—crooks, con men, relatives, ex-girlfriends, former cabinet members and congressmen.   

Such suspicious Clintonian behavior persisted—indeed escalated—during the following years as Bill and Hillary established the Clinton Foundation and erected the Clinton Library, soliciting funds from various donors and negotiating huge fees (often amounting to $500,000 or more) for speaking engagements around the world—especially in developing nations such as Uzbekistan and Kazakhstan where despots flush with cash sought to multiply it.  Skeptical journalists such as Christopher Hitchens wondered:  “why didn’t these third world oligarchs ‘just donate the money directly [to charities in their own countries] rather than distributing it through the offices of an outfit run by a seasoned ex-presidential influence-peddler?’” (#300).  Their activities caused the Obama team to voice significant concern regarding Hillary’s financial ties when she was appointed Secretary of State, so she promised to fully disclose their financial activities and secure State Department approval before accepting gifts to the foundation from foreign sources.  But she quickly broke these promises:   “Huge donations then flowed into the Clinton Foundation while Bill received enormous speaking fees underwritten by the very businessmen who benefited from these apparent interventions” (#395).  Interestingly enough, while ex-presidents’ speaking fees gradually decline once they’re out of office, Bill Clinton’s dramatically escalated when his wife became Secretary of State.  

One of the businessmen most frequently involved in the Clintons’ financial endeavors was a Canadian mining tycoon, Frank Giustra, who first connected with them in the 1990s and frequently provided a luxurious private jet for Bill to fly around the world (or to campaign for Hillary) after he left the White House.  It was Giustra who arranged a meeting between Bill and the dictator of Kazakhstan that led to an involved uranium deal, helped along by then-Senator Hillary Clinton’s activities in Congress.  This “deal stunned longtime mining observers,” and soon thereafter “Giustra gave the Clinton Foundation $31.3 million” (#593).   Yet another uranium deal involved a Canadian company and Russian investors who sought to gain control of “up to half of US uranium output by 2015” (#751).  This move was set in motion by Vladimir Putin, who personally discussed various issues, including trade agreements, with Secretary of State Clinton in 2010.  Monies then flowed into the Clinton Foundation, thanks to significant gifts from folks invested in the uranium industry.  “Because uranium is a strategic industry, the Russian purchase of a Canadian company holding massive US assets required US government approval.  Playing a central role in whether approval was granted was none other than Hillary Clinton” (#821).  Though a number of congressmen protested the deal, it was duly authorized by the Committee on Foreign Investment in the United States—a select committee that included the Secretary of State and other Obama Cabinet members.   Coincidentally, Salida Capital, one of the Canadian companies involved in the transaction and possibly a “wholly owned subsidiary of the Russian state nuclear agency,” would give “more than $2.6 million to the Clintons between 2010 and 2012” (#875).  Ultimately, “Pravda hailed the move with an over-the-top headline:  ‘RUSSIAN NUCLEAR ENERGY CONQUERS THE WORLD’” (#969).  

Since most of the millions flowing through the Clintons’ hands go to (or through) the Clinton Foundation, Schweizer devotes many pages to probing its activities as well as providing fascinating portraits of its denizens.  Though its “window-display causes” portray the foundation as admirably charitable, helping victims of AIDS, poverty, obesity, etc., it is more probably both “a storehouse of private profit and promotion” (#1326) and a generous employer for a number of Clinton associates, advisers, and political operatives.  (A recent review of the foundation’s 2014 IRS report reveals that of the $91 million expended only $5 million actually went to needy causes; the rest was devoted to employees, fundraising, and internal expenses.)  In fact, the foundation has virtually no infrastructure and does very little to actually help those in need.  Rather, it seeks to broker deals between “government, business, and NGOs” (#1349).  That some good is ultimately done cannot be denied, but it’s not actually done by the foundation itself.  “While there are plenty of photos of Bill, Hillary, or Chelsea holding sick children in Africa, the foundation that bears their name actually does very little hands-on humanitarian work” (#1356).  When Hillary became Secretary of State, she utilized a special government employee (SGE) rule that allowed her to appoint aides, including Huma Abedin, to her department while they simultaneously garnered a salary from the Clinton Foundation.  “Abedin played a central role in everything Hillary did” (#1589), and according to the New York Times “‘the lines were blurred between Ms. Abedin’s work in the high echelons of one of the government’s most sensitive executive departments and her role as a Clinton family insider’” (#1595).  

The Clintons’ approach to “charitable” work was manifest following the devastating 2010 earthquake in Haiti, which killed some 230,000 people and left 1.5 million more homeless.  Days after the earthquake, Secretary of State Hillary Clinton flew to the island, as did husband Bill.  “With a cluster of cameras around him, Bill teared up as he described what he saw” (#2497).  “The Clintons’ close friend and confidante, Cheryl Mills, who was Hillary’s chief of staff and counselor at the State Department [recently granted immunity for telling the FBI what she knew about the thousands of Hillary’s deleted emails] was assigned responsibility for how the taxpayer money, directed through USAID, would be spent” (#2497).  A special committee, with Bill as cochair, was appointed to distribute these funds, and he made speeches describing how Haiti would marvelously recover under his guidance.  But little construction actually took place!  For example, in “December 2010 Bill and Hillary approved a ‘new settlements program’ that called for fifteen thousand homes to be built in and around Port-au-Prince.  But by June 2013, more than two and a half years later, the GAO audit revealed that only nine hundred houses had been built” (#2712).  

Rather than rebuilding the nation’s infrastructure, the money was spent on “worthless projects,” and “in several cases Clinton friends, allies, and even family members have benefited from the reconstruction circumstances” (#2521).  Consider the story of Denis O’Brien, an Irish billionaire who studiously curried the Clintons’ favor (often making available his Gulfstream 550) while successfully promoting his mobile phone company, Digicel.  The firm profited enormously from its Haitian programs and O’Brien himself collected $300 million in dividends in 2012.  O’Brien invited Bill to speak three times in three years in Ireland, and almost simultaneously his company was granted profitable positions in Haiti.  Then there’s Hillary’s brother, Tony Rodham, who had absolutely no background in the mining industry but became a member of the board of advisors for a mining company that secured a “gold mining exploitation permit”—a “sweetheart deal” that outraged the Haitian senate.  Meanwhile, Bill’s brother Roger collected $100,000 for promising builders he’d arrange a sweet deal with the Clinton Foundation.  “In sum, little of the money that has poured into Haiti since the 2010 earthquake has ended up helping Haitians.  And how that money was spent was largely up to Hillary and Bill” (#2770).  

In conclusion:  “The Clintons themselves have a history of questionable financial transactions” (#2806).  They neither follow the same rules nor receive the same treatment as most Americans, yet they have famously flourished within modern American politics.  That they have succeeded, despite the record of questionable activities detailed in Clinton Cash, should give us pause!  

* * * * * * * * * * * * * * * * * * * * * *

Few political insiders know Bill and Hillary Clinton better than Dick Morris, the architect of Bill’s “triangulation” strategy which enabled him to coast to re-election in 1996.  Morris’s Armageddon:  How Trump Can Beat Hillary (West Palm Beach, FL:  Humanix Books, c. 2016), co-written with his wife, Eileen McGann, offers a unique perspective on this year’s election.  Given Morris’s checkered history, his pronouncements must always be considered with significant reservations!  Much of his life he’s worked as a “hired gun” and shown little ethical concern when involved in the rough and tumble world of partisan politics.  But inasmuch as he was one of Bill Clinton’s most trusted consultants in the 1990s he certainly provides information worth pondering as we consider Hillary’s presidential aspirations.  Morris also discusses Donald Trump’s prospects, but it’s his knowledge of the Clintons that most interests me.  

As the book’s title indicates, Morris writes as an alarmist:  “The ultimate battle to save America lies straight ahead of us:  It’s an American Armageddon, the final crusade to defeat Hillary Clinton” (#138).  Her election, he says, listing a litany of fears, will consign us to “four long years of another bizarre Clinton administration, featuring the Clintons’ signature style of endless drama, interminable scandals, constant lies, blatant cronyism and corruption, incessant conflicts of interest, nepotism, pathological secrecy, hatred of the press, his and her enemies lists, misuse of government power, inherent paranoia, macho stubbornness, arrogant contempt for the rule of law, nutty gurus, and thirst for war.  Those will be the disastrous and unavoidable hallmarks of a Hillary regime” (#246).  With a cast of characters including Bill and Chelsea Clinton, Sidney Blumenthal, David Brock, Terry McAuliffe, et al.—“unqualified and greedy cronies and her money-grubbing family members” roaming Hillary’s White House—the nation will suffer gravely.  When we think of the Clinton scandals, we usually focus on Bill’s sexual escapades, but Morris declares “that almost every single scandal in the Bill Clinton White House was caused by Hillary:  Travelgate, Whitewater, Filegate, her amazing windfall in the commodities futures market, the Health Care Task Force’s illegal secrecy, the household furniture and gifts taken from the White House to Chappaqua, Vince Foster’s suicide, Webb Hubbell’s disgrace—all Hillary scandals” (#412).  

In his first chapter Morris lists “A Dozen Reasons Hillary Clinton Should Not Be President.”  These include:  1) her dismal failure to respond well to the terrorist attack in Benghazi; 2) her compulsive, life-long lying about almost everything; 3) her penchant for hawkish, pro-war pronouncements; 4) her ties with the Muslim Brotherhood, as evident in her closeness to Huma Abedin, whose parents (and she herself) fervently supported the organization; 5) her easily documented record of flip-flops on a variety of issues (e.g. gay marriage, free trade) during the course of her life; 6) her manifest corruption—a “way of life” most evident in her multifaceted financial deals, e-mails, and Clinton Foundation; 7) her obsessive concern for secrecy; 8) her queen-like ignorance regarding ordinary Americans; 9) her economic vacuity; 10) her reliance on disreputable “gurus” such as Sidney Blumenthal; 11) her stubbornness; and 12) her notorious nepotism.  

Clearly Dick Morris dislikes and distrusts Hillary Clinton.  How seriously you take his warnings naturally depends upon how much you trust him.  But when placed in proper context, and compared with other accounts corroborating his data, he’s persuasive.  

* * * * * * * * * * * * * * * * * 

With the election of 2016 approaching, Dinesh D’Souza published two clearly polemical treatises designed to warn America about Hillary Clinton and the Democratic Party:  Stealing America:  What My Experience with Criminal Gangs Taught Me About Obama, Hillary, and the Democratic Party (New York:  Broadside Books, c. 2016), and Hillary’s America:  The Secret History of the Democratic Party (Washington, D.C.:  Regnery Publishing, c. 2016).   For many years D’Souza has espoused conservative principles, shaped in part by his unique story as an immigrant (from India) feeling deeply blessed to thrive in his adopted country.  For me his treatises serve to elicit thought, not to chart a course!  

In 2012 D’Souza gave a friend running for a state office in New York $10,000 and persuaded two others to donate the same amount, for which he reimbursed them.  He knew he was skirting the campaign finance limit but didn’t think he was breaking the law.  Soon thereafter, however, he was pursued by the Justice Department and (unlike virtually all other violators) found himself paying half-a-million dollars in legal fees and serving eight months of nights in a confinement center in San Diego.  That he’d just produced an anti-Obama film (2016:  Obama’s America) was, he believed, anything but coincidental!  Commenting on his case, Harvard law professor Alan Dershowitz said:  “‘What you did is very commonly done in politics, and on a much bigger scale.  Have no doubt about it, they are targeting you for your views’” (p. 14).  In confinement D’Souza “understood, for the first time, the psychology of crookedness.  Suddenly I had an epiphany:  this system of larceny, corruption, and terror that I encountered firsthand in the confinement center is exactly the same system that has been adopted and perfected by modern progressivism and the Democratic Party” (p. 26).  He came to see the party of Obama and the Clintons not simply as “a defective movement of ideas, but as a crime syndicate” (p. 26). 

Pursuing this thesis—however preposterous it might seem—makes for interesting reading.  In particular, one learns much about the criminal underclass populating America’s prisons and its utter cynicism regarding the political system.  The murderers and thieves with whom D’Souza lived noted that most politicians enter “office with nothing and leave as multimillionaires.  So how did this happen?  It just happened?” (p. 47).  If nothing else they understood crime—and they knew criminality undergirds this process!  D’Souza soon grasped the truth of St. Augustine’s famous observation in The City of God:  “What are kingdoms but gangs of criminals on a large scale?  What are criminal gangs but petty kingdoms?”  Translating that truth into contemporary America, D’Souza concludes that “the ideological convictions of Obama, Hillary, and the progressives largely spring out of those base motives and that irrepressible will to power.  The progressives have unleashed a massive scheme for looting the national treasury and transferring wealth and power to themselves, and their ideology of fairness and equality is to a large degree a justification—a sales pitch—to facilitate that larceny.  Previously I didn’t see this very clearly; now I do” (p. 50).  

The same basic message characterizes D’Souza’s Hillary’s America, the book behind the widely-viewed documentary bearing the same title.  He clearly believes Hillary is a threat to the republic, but more basically he argues that the Democratic Party has (since its inception under Andrew Jackson) supported a variety of evil endeavors, running from stealing Indians’ lands to enslaving Africans to endorsing Jim Crow laws and racist eugenics.  To D’Souza the “progressive narrative” of American history is “a lie,” and the Democratic Party must be held accountable for its sordid past.  Hillary Clinton is merely the current representative of a movement that has “brutalized, segregated, exploited, and murdered the most vulnerable members of our society.”  As such Hillary and the Democrats must, he insists, be defeated!  

# # #

285 Deadly Notions

Given our rationality, ideas inevitably have consequences and deeply shape human history.  In The Death of Humanity:  And the Case for Life (Washington:  Regnery Faith, c. 2016), California State University historian Richard Weikart helps explain the “culture of death” so pervasive throughout the past century—during which both belief in the dignity of man and the actual lives of millions of men demonstrably perished.  Consider the case of the serial killer and cannibal Jeffrey Dahmer:  following his arrest in 1991, he said that he believed “‘the theory of evolution is truth, that we all just came from the slime, and when we died . . . that was it, there was nothing—so the whole theory cheapens life.’  With this vision, he saw no reason not to kill and eat other men.  As he confessed, ‘If a person doesn’t think there is a God to be accountable to, then what’s the point in trying to modify your behavior to keep it in acceptable ranges?’” (#224).  Similarly, Eric Harris, one of the killers at Columbine High School in 1999, confessed (in his journal) to loving Thomas Hobbes and Friedrich Nietzsche; furthermore, he wore a T-shirt declaring “Natural Selection” when he launched his killing spree.  

Having survived Auschwitz, the great Austrian psychologist Viktor Frankl analyzed the intellectual currents he held responsible for the Holocaust:  “If we present a man with a concept of man which is not true, we may well corrupt him.  When we present man as an automaton of reflexes, as a mind-machine, as a bundle of instincts, as a pawn of drives and reactions, as a mere product of instinct, heredity, and environment, we feed the nihilism to which modern man is, in any case, prone.  I became acquainted with the last stage of that corruption in my second concentration camp, Auschwitz.  The gas chambers of Auschwitz were the ultimate consequences of the theory that man is nothing but the product of heredity and environment—as the Nazis liked to say, of ‘Blood and Soil.’  I am absolutely convinced that the gas chambers . . . were ultimately prepared not in some Ministry or other in Berlin, but rather at the desks and in the lecture halls of nihilistic scientists and philosophers” (The Doctor and the Soul,  xxvii).  

In many ways, Weikart’s work is an extended commentary on the anthropological ideas Frankl held responsible for genocide—ideas presenting man as “an automaton of reflexes, as a mind-machine, as a bundle of instincts, as a pawn of drives and reactions, as a mere product of instinct, heredity, and environment.”  During the past 200 years a multitude of thinkers have embraced varieties of philosophical materialism and rejected the traditional Christian “sanctity-of-life” ethic.  Some of them, beginning with Julien Offray de La Mettrie in 1747, imagined humans in terms of Man a Machine.  If man is a machine running within a mechanistic universe utterly devoid of meaning or purpose, it follows that he is as irresponsible for his behavior as is the moon circling the earth.  Picking up on this notion, Ludwig Feuerbach famously said “Man is what he eats” (Der Mensch ist, was er isst!), and Karl Marx drank deeply from this fountain of atheism as he began his revolutionary career.  In our day, Francis Crick, celebrated for his DNA discoveries, “has probably done as much as anyone to promote the idea that humans can be reduced to their material basis” (#807) and speaks for many eminent academics.  Since we’re nothing but genes and molecules, i.e. matter-in-motion, Crick could propose that “‘No newborn infant should be declared human until it has passed certain tests regarding its genetic endowment and that if it fails these tests it forfeits the right to life’” (#826).   

Other philosophical materialists reduce man to a highly-evolved animal, following Darwin’s dictum that he was “created from animals” and has no soul.  That view, as adumbrated by People for the Ethical Treatment of Animals’ Ingrid Newkirk, holds:  “A rat is a pig is a dog is a boy.  They are all mammals.”  PETA enthusiasts, of course, elevate animals to human status, demanding they be treated tenderly.  But others lower humans to animals, treating them as disposable if deemed worthless.  Following Darwin, human life must be devalued, since all animals share a common ancestry and natural selection requires the denial of any purpose to life.  Necessarily there can be no moral standards—might makes right as the fittest survive and individuals struggle for supremacy in life-and-death competition.  

Weikart traces the powerful trajectory of Darwinism (a theme he earlier documented in From Darwin to Hitler:  Evolutionary Ethics, Eugenics, and Racism in Germany and Hitler’s Ethic:  The Nazi Pursuit of Evolutionary Progress), culminating in some of the pronouncements of Peter Singer—Princeton University’s professor of bioethics.  Studying Singer—famed for his promotion of “animal liberation,” infanticide, bestiality, and “unsanctifying human life”—one realizes how much that’s wrong with our world can be traced to Charles Darwin, whose moral relativism justified any behavior which increased an individual’s survival potential (“even killing one’s own offspring”) if advantageous.  “Singer admits that Darwinism informs his own position that humans are not special or uniquely valuable.  He claims that Darwin ‘undermined the foundations of the entire Western way of thinking on the place of our species in the universe’” (#1054).  Without those Christian foundations, there is no reason to condemn the might-makes-right victors in the struggle for existence.  

Differing somewhat from Darwin (who stressed environmental factors and understood little of what we label genetics), there are biological determinists such as Harvard University’s Steven Pinker.  In The Blank Slate:  The Modern Denial of Human Nature, he justifies infanticide inasmuch as one’s “genes made me do it.”  Pinker labels just-born humans “neonates,” whose killing he calls “neonaticide” rather than murder.  Since the neonate lacks “morally significant traits” and is not demonstrably a full person, he has no more right to life than a mouse.  Hard-wired by our genes, we have no free will and simply follow what’s prescribed for us.  Criminals are thus not to be held responsible for their crimes—a position increasingly held by criminologists and judges who attack this nation’s incarceration policies.  

Biological determinism was strongly asserted, more than a century ago, by Charles Darwin’s cousin, Francis Galton, who eagerly embraced the theory of natural selection and applied it to eugenics—a “‘new religion’ that would humanely improve humans biologically” (#1792).   In Galton we encounter “Social Darwinism” in its purest form.  Enthusiasts for this endeavor promoted a blatantly-racist agenda in Europe and America, passing laws and influencing a variety of academic disciplines.  Tarnished by its association with the Nazis, eugenics fell into disrepute following WWII, but it has recently revived under the rubric of “sociobiology” and “evolutionary psychology.”  Sociobiology’s architect, Harvard University’s E. O. Wilson, restricted reality to “chance and necessity” and insisted “that everything about humans—behavior, morality, and even religion—is ultimately explicable as the result of completely material processes” (#1955).  Given this assumption, virtually any behavior may be “good” as long as it contributes to the evolutionary process.  If one finds animal species engaging in suicide or infanticide or incest, a number of evolutionary devotees declare such behavior may very well be appropriate for humans as well.

Environmental determinists glibly declare “my upbringing made me do it”—as did Clarence Darrow (the famed defense attorney at the Scopes monkey trial) when he defended Nathan Leopold and Richard Loeb in a celebrated Chicago case a century ago.  The two young men, both brilliant and wealthy, had murdered a 14-year-old boy simply to carry out the “perfect” crime.  Darrow (working out the implications of the Darwinism he would soon defend in the Scopes trial) declared they were simply acting out what had been programmed into them and were thus guiltless of any crime!  Weikart traces the genealogy of this position across 200 years, running from Helvetius through Robert Owen and his socialist supporters to Marx and his 20th century revolutionaries such as Stalin and Mao.  Prominent American psychologists, led by John B. Watson and B.F. Skinner, declared that one can produce any kind of behavior by applying the right stimulus.  Thus criminals ought not be held responsible for their acts—society shapes them and they do what they cannot but do.

Yet another powerful component in the “culture of death” is the “love of pleasure” most sharply evident in the works and influence of the Marquis de Sade, who embraced any kind of behavior (including sadism) that “feels good.”  He and other Enlightenment thinkers recovered and promoted Epicurus and Lucretius—ancient writers clearly at odds with the Christian tradition.  Subsequently, Jeremy Bentham and John Stuart Mill constructed an ethical system—Utilitarianism—reducing all moral questions to a pleasure/pain calculus.  There are no “natural” rights, only more or less pleasurable experiences.  Maximizing pleasure becomes the sole “good,” whether one considers an individual or a society.  Later, Sigmund Freud set forth his highly-influential psychology, reducing almost every question to its sexual implications and satisfactions.  His case for sexual liberation had enormous influence, particularly as the counterculture of the ‘60s worked out its hedonistic ethos.

Some of the 20th century’s most toxic notions flow from existential and postmodern philosophers.  One of the chief sources for both movements is Friedrich Nietzsche, who declared, in Also sprach Zarathustra:  “Die at the right time; thus teaches Zarathustra . . . .  Far too many [people] live and hang much too long on their branches.  May a storm come to shake all these rotten and worm-eaten ones from the tree.”  When Clarence Darrow mounted his defense of Leopold and Loeb he invoked Nietzsche as well as Darwin to explain away their responsibility for murder.  Nietzsche certainly shared Darwin’s view of human origins, writing:  “‘You have made your way from worm to man, and much of you is still worm’” (#3307).  When one carefully studies the careers of Mussolini and Hitler it becomes evident that many of the most murderous regimes were influenced by the atheistic existentialism of Nietzsche, including his contempt for the less fit in life’s struggle.  We have a picture of Hitler looking at a bust of Nietzsche in 1938.  A caption for the picture proudly claims “Nietzsche was a forerunner of Nazism” (#3300), and Hitler certainly wanted to move “beyond good and evil” in his will-to-power ambitions.  Traditional ethical notions, such as opposing suicide and infanticide, were to be discarded in an endeavor to purify and elevate the race.

Having looked at the many thinkers responsible for our culture of death, Weikart documents how suicide, euthanasia, infanticide, and abortion have become increasingly acceptable in much of our world.  Thus we find two medical ethicists, in 2012, proposing we re-conceptualize infanticide as “after-birth abortion” to ensure its social acceptability.   “Death-With-Dignity” initiatives have succeeded in Washington, Oregon, and California and promise to succeed elsewhere as secularism replaces Christianity as the nation’s moral foundation.  Many secularists (including the famous “situation ethicist” Joseph Fletcher) insist that merely being human does not make one a “person” with a right to life.  Persons, Fletcher asserted, “must have certain qualities, such as the ability to make moral decisions, self-awareness, self-consciousness, and self-determination” (#4078).  Similarly, Peter Singer says, neither an unborn “fetus” nor a newly-born baby can be considered a “person.”  Nor do severely handicapped individuals or terminally ill comatose patients qualify as “persons.”

In his “Conclusion,” Weikart says:  “Humans on display in zoos.  Comparing farm animals in captivity to Holocaust victims.  ‘After-birth abortion.’  Physicians killing patients, sometimes even when they are not sick or in pain.  Accusing fetuses of assaulting their mothers, just because they are living peaceably in utero.  Promoting ‘love drugs’ to make us more moral.  Granting computer programs moral status.  These are just a few examples that powerfully illustrate how sick our society is.  As many intellectuals have abandoned the Judeo-Christian sanctity-of-life ethic in favor of secular philosophies, we have descended into a quagmire of inhumanity.  Some today view humans as nothing more than sophisticated machines or just another type of animal.  For them, humans are nothing special—just another random arrangement of particles in an impersonal cosmos” (#4936).

The Death of Humanity deserves careful study and reflection.  J. Budziszewski, one of today’s finest Christian philosophers, says:  “So often I have heard the question, ‘How did we ever become so muddled in this twenty-first century?  What happened?’  This is a question for a historian, who can weave a single coherent story about a great many sources of confusion.  Richard Weikart is that historian, and I will be recommending his sane and lucid book often.”  As will I—and am so doing with this review!   

* * * * * * * * * * * * * * * * * * * * * * 

In Architects of the Culture of Death (San Francisco:  Ignatius Press, c. 2004), Donald De Marco and Benjamin Wiker provide brief vignettes of 23 thinkers, grouped together in seven sections, who bear responsibility for the dehumanizing “culture of death” facilitating the killing of innocent persons.   De Marco is a philosopher; Wiker is a biologist; both are committed Catholics who write to promote the “Personalism” associated with Pope John Paul II and deeply embedded in two millennia of Christian thought.  “It is precisely because of the infinite value of each human person, as revealed especially in the great drama of Jesus Christ, that truly Christian culture must be a Culture of Life, a culture that sees the protection of persons and their moral, intellectual, and spiritual development as the defining goals of society.  Whatever contradicts these goals can have no place in the Culture of Life” (p. 14).  

Clearly at odds with the Culture of Life is Friedrich Nietzsche, one of the “will worshippers” who celebrated a “Will to War, a Will to Power, a Will to Overpower” (p. 41).   His heroes were “Supermen” like Julius Caesar who imposed their will on others, using whatever (frequently violent) means necessary.  In 1940 an American historian, Crane Brinton, diagnosed the impact of his literary works:  “Point for point he preached . . . most of the cardinal articles of the professed Nazi creed—a transvaluation of all values, the sanctity of the will to power, the right and duty of the strong to dominate, the sole right of great states to exist, a renewing, a rebirth, of German and hence European society. . . .  The unrelieved tension, the feverish aspiration, the driving madness, the great noise Nietzsche made for himself, the Nazi elite is making for an uncomfortably large part of the world” (p. 52).

Though not connected with them in any formal way, Nietzsche certainly shared much with eminent eugenicists of his era, who all embraced Charles Darwin’s notions of evolution through “natural selection” and the “survival of the fittest.”  Though Darwin himself evaded the implications of his theory for human beings for much of his life, it became clear in 1871, with the publication of his Descent of Man, that he was a eugenicist.  And he was also “a racist and a moral relativist” (p. 76).  Thus his cousin, Francis Galton, enthusiastically worked out the social implications of Darwinism by promoting eugenic measures designed to improve the race.  Just as we can breed better dogs we can breed better babies.  Inferior members of the species are best left to die off or forced to embrace celibacy.  Private correspondence between cousins Galton and Darwin proves how totally the latter endorsed the work of the former, so the two share responsibility for what we term “Social Darwinism.”  Embracing some of the deadlier aspects of this movement, the German zoologist Ernst Haeckel championed a rather ruthless form of evolutionary philosophy he called Monism, “drawing out the full implications of Darwinism” (p. 107).  He fervently espoused “eugenics and racial extermination” and “abortion, infanticide, and euthanasia as well” (p. 107).  Haeckel’s books were widely read at the turn of the 20th century and demonstrably influenced many of the policies crafted by Adolf Hitler.

“Secular utopianists,” preeminently Karl Marx, prepared the way for mass-murderers such as Stalin and Mao.  Though his devotees religiously absolve Marx from any responsibility for the behavior of Communist regimes—asserting all efforts to implement his teachings strayed from the founder’s intent—there is clearly a deadly dimension to all efforts to establish a perfectly egalitarian world.  In fact, “Marx could not be more limpid in his call for violence.  He advocated hanging capitalists from the nearest lampposts” (p. 125).  Aligned with Marx (sharing both his atheism and his communism) was the French existentialist Jean-Paul Sartre, whose “philosophy leads logically and directly to despair and suicide. . . .  His world of atheism is a kingdom of nothingness plunged into intellectual darkness, convulsed with spiritual hate and peopled by inhabitants who curse God and destroy each other in their vain attempt to seize his vacant throne” (p. 175).  (There is thus some warrant for Paul Johnson to suggest, in his biography of Darwin, that Pol Pot, the genocidal Cambodian Communist, derived some of his murderous ideas from both Sartre, who introduced him to Darwin, and Darwin himself!)

While the “pleasure seekers” might not seem to promote the culture of death, at least indirectly they do!  Thus Helen Gurley Brown, who made Cosmopolitan magazine a stellar success (especially on college campuses), singularly promoted “feel-good sex.”  Her Sex and the Single Girl “[was] a ‘shameless, unblushing, runaway, unmitigated’ manual advising and instructing women on how to seduce men and enjoy their inalienable right to have as much sex as humanly possible” (p. 237).  Her message helped shape the enormously successful television show “Sex and the City,” mainstreaming her ideas.   Inevitably she approved adultery, contraception, and abortion—anything that gave pleasure was fine.

So too “sex planners” added their notions to the anti-life brew.  Margaret Mead, named “Mother of the World” by Time magazine in 1969, was certainly one of the most influential anthropologists of the 20th century and reached a broad women’s audience through her regular columns (1961-1978) for Redbook magazine, helping “bring the twentieth-century sexual revolution to its culmination”  (p. 250).  As a young woman she published Coming of Age in Samoa (1928) and instantly became an academic superstar.  Though her misleading portrayal of the sexually libertine Samoans was “autobiography disguised as anthropology,” the book would be required reading in hundreds of university classes and help undermine the Christian tradition’s commitment to chastity and opposition to abortion.  Joining Mead as a spurious “scholar” was Alfred Kinsey, who sought to justify his own covert homosexuality and pedophilia with allegedly scientific statistical studies of the sexual behavior of the American male and female.  The Kinsey Reports lent an aura of respectability to deviant behaviors simply by falsely stating that large numbers of Americans actually practiced them.

Finally, there are the “death peddlers”—Derek Humphry, who in Final Exit championed suicide; Jack Kevorkian, the pathologist who bragged about his “mercy-killing” activities and “personifies the Culture of Death” as vividly as anyone; and Peter Singer, the Princeton philosopher who seeks to discard the “traditional Western ethic” which for 2,000 years has promoted the “sanctity of life.”  “Taking Darwinism to its ultimate conclusions” (p. 363), Singer denies significant differences between humans and other animals.  He also believes a “person” is a human being with certain capacities, and thus not all humans qualify as “persons.”  His books—and his international prestige as one of the preeminent ethicists in the world—bear witness to the triumph, in many sectors, of a noxious ideology.

284 Hillary’s History

In a court of law, eyewitness testimony is highly privileged, considered “first-hand” evidence most worthy of consideration.  So too historians relentlessly seek out “primary” sources—eyewitness accounts showing “how it actually was.”  Eyewitnesses may, of course, render skewed accounts—shaped by personal biases or faulty memories or delimited vision.  They may very well be a bit inarticulate and disjointed in telling their stories.  So juries and historians take such things into account and try to put everything in its proper context.  But in the final analysis eyewitness testimony and primary sources provide us our surest route to historical truth.

One recent eye-witness account meriting attention is Gary J. Byrne’s Crisis of Character:  A White House Secret Service Officer Discloses His Firsthand Experience with Hillary, Bill, and How They Operate (New York:  Center Street, c. 2016).  After serving in the Air Force, Byrne realized his vocational aspirations and became “an elite White House Secret Service officer, a member of its Uniformed Division,” entrusted with guarding the President, his family and staff.  He began his assignment when George H.W. Bush (affectionately referred to as “Papa Bush”) was still President.  “I assumed every president would follow Papa Bush’s example,” Byrne says.  “The work ethic, love of country, work environment, and respect for the people serving would be constant, and politics would never matter” (p. 36). 

But his high regard for the Bush family turned to anguish as he watched the Clintons occupy the White House and witnessed firsthand—among other things—the Monica Lewinsky affair.  In addition, he “saw a lot more.  I saw Hillary, too.  I witnessed her obscenity-laced tirades, her shifting of blame” (p. ix) and other traits disqualifying her from most any high office, much less the presidency.  He and his fellow officers “were measured by the highest of ethical requirements” while “[t]hose at the very pinnacles of power held themselves to the very lowest standards—or to none whatsoever” (p. x).  “The Clintons are crass.  Papa Bush is class” (p. 277).  To Byrne, Hillary “simply lacks the integrity and temperament to serve in the office.  From the bottom of my soul I know this to be true.  So I must speak out” (p. xi).

Byrne’s critical comments are confirmed and underscored by other agents, who provided Ron Kessler the information recorded in In the President’s Secret Service:  Behind the Scenes with Agents in the Line of Fire and the Presidents They Protect—an historical narrative of the agency.  In the chapter devoted to the Clintons, Kessler says that Bill was charming, if utterly undisciplined, but “Hillary Clinton could make Richard Nixon look benign.  Everyone on the residence staff recalled what happened when Christopher B. Emery, a White House usher, committed the sin of returning Barbara Bush’s call after she had left the White House.  Emery had helped Barbara learn to use her laptop.  Now she was having computer trouble.  Twice Emery helped her out.  For that Hillary Clinton fired him” (p. 146).  He would then be unemployed for a year, thanks to the vindictive First Lady!  One agent said:  “‘When she’s in front of the lights, she turns it on, and when the lights are off and she’s away from the lights, she’s a totally different person.’”  Off stage she was “‘very angry and sarcastic and is very hard on her staff.   She yells at them and complains.’”  Though publicly she pretended to adore the agents assigned to protect her, she “‘did not speak to us.  We spent years with her.  She never said thank you’” (p. 147).  That other agents share Byrne’s disdain for Hillary lends his account considerable credibility!

Agent Byrne first encountered the Clintons in 1992 when he worked at some of the candidate’s campaign rallies.  Chatting with a sheriff from Arkansas, he mentioned the many rumors then revolving around the Clintons.  The sheriff “gave me a thousand-yard stare.  ‘Let me tell you something, Gary.  Everything—everything they say about them is true.  The Clintons are ruthless.  And [the media-led public] don’t even know the half of it’” (p. 39).  The next six years amply proved to Byrne the truth of that sheriff’s assertion.  The polite, orderly White House deteriorated into “helter-skelter” chaos as the Clinton crew failed to “focus, pace themselves, or even delegate.  Staff wore jeans and T-shirts and faced each problem with grand ideological bull sessions” (p. 50).  Hillary Clinton’s “doting, barely post-adolescent staffers resembled enabling, weak-willed parents.  She threw massive tantrums” (p. 56) which only intensified as the years passed.  Her friendly, empathetic public facade belied the private fury evident in “antics [that] made my job interesting.  She’d explode in my face without reservation or decorum, then confide in some visiting VIP, ‘This is one of my favorite officers, Gary Byrne’” (p. 60).

Byrne provides important details regarding various scandals and insights into personalities in the Clinton White House, but he is best known for his testimony regarding the Monica Lewinsky affair that figured largely in the impeachment of the president.  She was what the Secret Service called a “straphanger” or “loiterer”—a young volunteer intern with political connections, wandering about the White House seeking access to powerful persons.  Lewinsky clearly stalked President Clinton, doing everything possible to frustrate the agents who tried to shield him from her advances.  But rather quickly it became an open secret that she and Clinton were having an affair—one of many such trysts the president engaged in while living in the White House, including sessions with Eleanor Mondale, the daughter of the former vice president.  Still more, a fellow agent told Byrne:  “‘You have no idea what it’s like on the road’” (p. 107), where women regularly traipsed in and out of Clinton’s quarters.  He “had difficulty managing where he saw his many mistresses, whether it was at the White House or on the road.  It baffled the Uniformed Division as to how he could manage all these women without any of them realizing there were so many others.  We wondered how he got any work done and joked that he would have been better at running a brothel in a red-light district than the White House” (p. 127).

After encountering Lewinsky, President Clinton put her on the White House payroll and gave her his top-secret phone number so they could have intimate talks.  To Byrne:  “paying a mistress with taxpayer funds and giving her security clearance?  These were new lows” (p. 111).  Ultimately the semen-stained blue dress would prove the president guilty of perjury and lead to his impeachment.  When special prosecutor Ken Starr, investigating the Paula Jones case, learned of the Lewinsky affair, he brought the weight of the Justice Department to bear on Byrne, seeking information helpful to his investigation.  So, very much against his will, he was subpoenaed and forced to tell what he had observed in the White House.  Testifying via videotape before a grand jury, he would soon be seen by the nation on C-SPAN—though he had been promised his testimony would remain sealed.  As a Secret Service agent he had vowed to protect the president—committed to never revealing “information that might jeopardize [his] safety and security”—so he refused to discuss certain things.  But as a citizen he had to reveal certain details relevant to the Starr inquiry.  Consequently, he became one of the most important under-oath witnesses regarding the Clintons’ behavior in the White House.

Now safely removed from that crisis-ridden epoch, Byrne can look back and assess it.  While testifying, he remembered that Arkansas sheriff’s words regarding the Clintons’ ruthlessness, and he confesses to fearing them and what might happen to him and his family because of his testimony.  Still more, he’s outraged:  “I was compelled to tell the truth, but why the hell was neither the president nor Mrs. Clinton ever really compelled to tell the damn truth?” (p. 165).  Bill Clinton misbehaved and lied and easily moved on virtually unscathed while many “little people” had their lives ruined by his behavior and his wife’s machinations.  “This is the man I was protecting?  That’s what I tolerated?  I had tried and tried to prevent harm to this president, but he failed us all!” (p. 177).  

Two decades later, Byrne says:  “Our collective amnesia about the Clinton White House is dangerous because it could happen again—maybe with a different Clinton dealing the cards, but with the same stacked deck” (p. 273).  So he has written this book to dissuade us from electing Hillary, particularly in light of her careless handling of classified materials and suspicious work with the Clinton Foundation.  He “was there with the Clintons.  I could not keep silent then, and I can’t keep silent now” (p. 274).  

* * * * * * * * * * * * * * * * * * * * *

In the early ‘60s David Schippers led the Justice Department’s Organized Crime and Racketeering Unit, successfully prosecuting mobsters such as Sam Giancana.   A lifelong Democrat who twice voted for Bill Clinton, he was renowned for his skills as a prosecutor and trial attorney.  More importantly:  he was known as a man of integrity.  As the House of Representatives began the inquiries which led to the impeachment of President Clinton, Schippers was brought to Washington to lead an oversight investigation of the Justice Department and ultimately became Chief Counsel of the House Managers entrusted with pursuing evidence for the president’s impeachment.  In Sellout:  The Inside Story of President Clinton’s Impeachment, Schippers provided an “insider’s account” of what happened nearly 20 years ago.

In the light of evidence he probably knew better than anyone else, Schippers believed Clinton should have been removed from his office for his “high crimes and misdemeanors.”  Though the president claimed to be “proud of what we did” during the impeachment process—declaring he “saved the Constitution”—Schippers thought him demonstrably guilty of “some of the most outrageous conduct ever engaged in by a president of the United States” (p. 3).  He quickly learned to detect and deeply abhor the Clintons’ guiding modus operandi:  do anything to avoid the truth.  White House spin-masters manipulated the media (portraying the president as a victim) and glossed over his incessant lies, which were obvious to skilled lawyers who saw through his legalistic obfuscations.  To Schippers, Clinton’s real “high crimes and misdemeanors” were perjury and obstruction of justice.  But the president and his media accomplices successfully reduced the whole inquiry to nothing more than questions of lamentable sex with Monica Lewinsky.  “The White House never ceased to astound and dismay me in the extent to which it demonstrated its utter contempt for the Judicial Branch, the Legislative Branch, and the American people” (p. 171).

As much as anyone, then, David Schippers understands the Clintons’ duplicitous behavior.  So when he commends a recent book by Dolly Kyle we may assume he validates much of her account in Hillary:  The Other Woman:  A Political Memoir (Washington, D.C.:  WND Books, c. 2016).  Schippers says the book “is as timely as tomorrow’s newspaper” inasmuch as it contains “Ms. Kyle’s firsthand knowledge obtained over many years” (#56).  Acutely aware of the investigations he conducted 20 years ago, he affirms the truth of Kyle’s memoir since she’s known the Clintons for half-a-century and occupies “a unique position to reveal the truth about Billy and Hillary that no one else can tell” (#178).  She wrote this book because “Hillary Rodham Clinton is running for president.  She is morally and ethically bankrupt” (#144).  From Kyle’s perspective:  “The average person cannot comprehend that two politicians could have managed to get where they are with so many crimes in their wake, and so little reporting about it” (#1028).  The Clintons are, to be candid:  “lying, cheating, manipulative, scratching, clawing, ruthlessly aggressive, insatiably ambitious politicians . . . and nothing about them has changed in the past forty-plus years, except that they have deluded more and more people” (#1034).

Dolly Kyle met Bill Clinton in 1959 in Hot Springs, Arkansas, when she was eleven years old.  They both graduated from Hot Springs High School in 1964, and she provides many details and insights into the community and families that help us better understand “Billy” Clinton.  She was immediately attracted to him and “a liaison . . . evolved from puppy love to dating to friendship to a passionate love affair” (#209) that lasted, off-and-on, for 40 years.  Their affair was pretty much an “open secret” in Arkansas, though it attracted little media attention.  “I’m not proud (and have repented) of having that decades-long affair with Billy Clinton, but it is a fact” (#448).  They became lawyers, married other persons, had children, and repeatedly interacted with each other.  Sadly enough, for too many years she simply thought of him as a lovable rascal, indulging his appetites with a series of willing women.  “I didn’t realize until many years later, that Billy was a serial sexual predator and a rapist” (#1886).  Nor did she then understand Hillary’s role in suppressing any evidence of his philandering.  

When Hillary Rodham moved to Arkansas and married Bill Clinton, she necessarily had contact with her husband’s Arkansas friends, including Dolly Kyle.  Though Dolly retains a lingering affection for “Billy” (despite his wayward ways he’s “a charming rogue who was sexually addicted”), she clearly dislikes Hillary.  In her opinion, Bill moved in with Hillary while they were students at Yale in order to share her wealth and the two have simply used each other to advance their respective careers ever since.    During their early years, it was “generally Hillary’s job to make the money and provide the financial base from which she and Billy could maneuver their way to the White House” (#2424).  “Even their decision to have a child was a calculated political maneuver to make them appear to be a normal couple” (#1533).  

Meeting Hillary for the first time in Little Rock in 1974, Kyle was shocked at the “dowdy-looking woman who appeared [at a distance] to be middle-aged” (#590), wearing thick glasses, a shapeless dress, and sandals; she clearly cared little for style or personal appearance.  When Bill introduced them, Dolly “smiled and extended my right hand in friendship,” but Hillary “responded only with a glare at me.  Finally, seeing my hand still extended, she managed a grudging nod.  She did not condescend to shake my hand” (#608).  Obviously there would be little love lost between these two women in Bill Clinton’s world!  But their encounters were minimized as Bill usually attended events (such as high school reunions) without Hillary and could easily engage in various liaisons to his liking.  At the 30th reunion there occurred “the infamous scene between the two of us that was immortalized under oath in the impeachment investigation” (#935).

Ultimately, when he was president, Bill Clinton’s sexual affairs came under increased judicial scrutiny, and Kyle (under oath in a deposition in the Paula Jones v. Clinton lawsuit) disclosed the nature of their relationship.  She had earlier discovered first-hand the malice and vindictiveness with which Hillary pursued any woman who might endanger her aspirations.  In fact, when an English journalist was about to disclose her affair with Bill as he ran for president in 1992, her own brother, speaking for Billy, warned her:  “If you cooperate with the media, we will destroy you” (#3291).  So in time she concluded:  “While proclaiming himself to be the champion of women’s rights, Billy Clinton has continually betrayed the woman he married, the girl he fathered, and the untold numbers of women he used for his sexual gratification.  Meanwhile, proclaiming herself to be the champion of women’s rights, Hillary Clinton has been behind the threats and intimidation of the women her own husband abused and molested” (#1579).

In addition to providing details regarding Billy’s sexual misconduct, Kyle shares what she knows about the Clintons’ multifaceted adventures in Arkansas and the White House.  She discusses important  personalities such as Webb Hubbell and Vince Foster (one of Bill’s childhood friends and a partner with Hillary at the Rose Law Firm).  She cynically notes that Hillary was first hired and later became a partner of the Rose Law Firm at precisely the same moments her husband became attorney-general and then governor of Arkansas!  Vince Foster “knew the facts about Hillary’s double-billing practices that had enabled her to receive questionable foreign money with strings attached” as well as the “FBI files that had been taken illegally for illegal purposes and would later be found with Hillary’s fingerprints on them” (#3525).  He knew all the details regarding the Clintons’ financial adventures.  In time, Kyle thinks, he committed suicide simply because he could not handle all the stress he experienced as a result of his work with the Clintons, dying under the weight of being betrayed by his friends.  

Dolly Kyle also conveys—as she documents the evils done by the Clintons—a deep sense of betrayal.  She feels personally betrayed, but in a larger sense she’s persuaded they have betrayed an enormous number of others and this nation itself.  Though distressingly disorganized and colored by the author’s personal animosities, Hillary:  The Other Woman certainly gives us first-hand insights into the character (or lack of it) of two of the most prominent politicians of our era.

* * * * * * * * * * * * * * * * * * * * * 

Perhaps the best-known victim of the terrorists’ attacks on September 11, 2001, was Barbara Olson, the wife of the nation’s Solicitor-General, Ted Olson.  Like her husband, she was a lawyer, and had served as both a prosecutor for the Department of Justice and as counsel to a House committee that investigated some of the Clintons’ scandals.  She died aboard the hijacked airplane that smashed into the Pentagon two days before her long-awaited book—ironically titled Final Days—was to be published.  She concluded that book with a solemn reminder and a warning regarding the deeply radical views of Bill and Hillary Clinton which she had earlier catalogued in Hell to Pay:  The Unfolding Story of Hillary Rodham Clinton (Washington:  Regnery Publishing, Inc., c. 1999).

Olson’s eyes opened while investigating allegations regarding missing FBI files and the firing of White House Travel Office employees in order to give the jobs to some of the Clintons’ Arkansas friends.  Immersing herself in the witnesses’ evidence, Olson came “to know Hillary as she is—a woman who can sway millions, yet deceive herself; a woman who has persuaded herself and many others that she is ‘spiritual,’ but who has gone to the brink of criminality to amass wealth and power” (p. 2).  Olson had “never experienced a cooler or more hardened operator,” a more singularly calculating public figure, whose “ambition is to make the world accept the ideas she embraced in the sanctuaries of liberation theology, radical feminism, and the hard left” (p. 3).  Machiavellian to the core, Hillary proved herself to be “a master manipulator of the press, the public, her staff, and, likely, even the president” (p. 3).

Intellectually gifted, Hillary attended Wellesley College in the late ‘60s.  Awash in the currents of the counterculture, she gradually embraced its radical agenda, participating in antiwar marches, defending a Black Panther murderer, and enlisting fellow students to change the world.  She was selected to speak at her commencement following an address by Massachusetts’ Republican Senator Edward Brooke.  Rather than give her prepared speech, however, Hillary “‘gave an extemporaneous critique of Brooke’s remarks’” (p. 41), rudely reproving him.  “We’re not interested in social reconstruction,” she shouted; “it’s human reconstruction” (p. 42).  Nothing less than the Marxist “new man” would satisfy her.

          That youthful obsession, Olson argues, persisted.  Hillary found Western Civilization bankrupt, needing more than reform.  Only “remolding,” only radical new structures, can bring about the “social justice” she pursues.  Such can come only “from the top—by planners, reformers, experts, and the intelligentsia.  Reconstruction of society by those smart enough and altruistic enough to make our decisions for us.  People like Bill and Hillary Clinton.  Hillary, throughout her intellectual life, has been taken by this idea, which is the totalitarian temptation that throughout history has led to the guillotine, the gulag, and the terror and reeducation camps of the Red Guard”  (p. 311).  Overstated?  Well, Olson knew Hillary well!  

283 Feminist Fallout

That the unintended consequences of revolutionary political and social movements frequently surpass their original intent may be easily discerned in the study of history.  This truth poignantly surfaces in Sue Ellen Browder’s Subverted:  How I Helped the Sexual Revolution Hijack the Women’s Movement (San Francisco:  Ignatius Press, c. 2015).  She begins with this confession:  “I can give you no justification for what I did in my former life.  I will only say this is my weak defense:  I was a young woman searching for truth, freedom, and meaning in the world, but I had no clue where to find them” (#37 in Kindle).

In part Subverted is an autobiography, an account of a modern journalist.  As a youngster growing up in Iowa, Browder longed to escape her small-town environs and join the more exciting, opportunity-laden cosmopolitan world she saw in magazines and television.  Determined to become a writer, she entered and then graduated from the University of Missouri’s School of Journalism.  She then worked briefly for a publication in Los Angeles before going to New York and landing a job as a free-lance writer with Helen Gurley Brown’s Cosmopolitan, which in the 1970s was “the undisputed reigning queen of women’s magazines—the hottest women’s magazine in the nation” (#45).  She basked in the glow of early success, seeing her talents displayed in the pages of a publication renowned for promoting the causes she most ardently supported—including the ‘60s sexual revolution.

She’d all too quickly realized her adolescent dream!  “Only later would I realize how dark the dream had become.  Eventually, it would lead to a cacophony of mixed, confused messages in our culture about women, work, sex, marriage, and relationships—errors that have divided our nation and continue to haunt us to this day.  It would lead me to make disastrous decisions” (#63).  But as she and her husband and two children moved about the country, finding a variety of positions and surviving as writers, she continued, for 24 years, publishing articles in Cosmopolitan, telling “lie upon lie to sell the casual-sex lifestyle to millions of single, working women” (#69).

So Browder’s purpose in writing the book is more than autobiographical—she wants to clarify where and why she went so wrong for so long.  It all began with her naive enlistment in the women’s movement.  Though she’d been reared by parents clearly committed to her personal development, reading Betty Friedan’s The Feminine Mystique when she was 17 powerfully affected her.  “‘The only way,” Friedan declared, “for a woman, as for a man, to find herself, to know herself as a person, is by creative work of her own.  There is no other way’” (#265).   That goal Browder successfully pursued.  But she also met and married another writer, Walter Browder, launching a relationship which would put her at odds with the liberationist feminism Friedan promoted.  She naturally “took the Pill without a qualm,” imagining she could “enjoy sterile sex and control my own sex life” (#334), not knowing how it would “put me on a hormone-powered emotional roller-coaster, which regularly plunged me into black pits of depression” (#334).

Despite the Pill she became pregnant and had a baby shortly before moving to New York—another complicating relationship!  In her initial interview with the Cosmopolitan staff (knowing Helen Gurley Brown “saw the single girl as ‘the newest glamour girl of our times’ and viewed children as ‘more of a nuisance than a blessing’”) she carefully avoided mentioning the fact she was a mother.  “At Cosmo, I was a dedicated follower of Planned Parenthood founder Margaret Sanger, the foremost proponent of birth control as a panacea to the world’s problems.  Sanger idolized sex without kids.  ‘Through sex,’ Sanger sang joyously in The Pivot of Civilization, ‘mankind may attain the great spiritual illumination which will transform the world, which will light up the only path to an earthly paradise’” (#556-557).  Writing for “Cosmo,” the author laments, “I danced in Sanger’s procession” (#557).

Hired to write articles for Brown’s magazine, Browder quickly learned that lots “of the alleged ‘real people’ we wrote about in the magazine were entirely fictitious” (#527).  While working in California, she’d seen journalists blithely make up “sources” and write articles without doing the hard work of actually investigating events, so constructing stories about a Cosmo Girl who would “sleep with any man she pleased” and enjoy an upwardly mobile career became quite easy for her.   She just constructed imaginary stories, writing about an unreal world.  She remained “a loyal foot soldier in the women’s movement’s media army.  Even as I rejected the sexual revolution lifestyle as a sham, I scrambled to climb aboard NOW’s freedom train” (#694), promoting “a false path to freedom that was not just reprehensible but evil” (#717).  

Blatant evil triumphed when Betty Friedan led the National Organization for Women to join forces with Larry Lader’s NARAL, an abortion-rights group determined to secure abortion-on-demand.  “At Cosmo,” Browder confesses, “the one assumption I never thought to question in my confusion was whether or not abortion and contraception were good for women” (#930).  On a personal level, Browder herself would abort a baby when family finances seemed to dictate.  But she found that having an abortion was hardly the trivial affair Cosmopolitan readers assumed!  Part of herself, as well as her tiny baby, died on that gurney.  As she would later learn when she researched the subject, Lader’s spurious book, Abortion, was cited repeatedly by Justice Harry Blackmun in his Roe and Doe decisions.  In time, Browder would carefully read and reflect on Blackmun’s role in prescribing abortion-on-demand for the country, finding the man and his judicial work seriously flawed.

Even while writing her Cosmo articles, at home Browder found in her husband and son a different world, a “better way,” a life “filled with light, laughter and love” (#595).  Her success as a writer only temporarily satisfied her, whereas her work as a mother was “sheer delight” (#1831).  She finally realized “that by focusing almost exclusively on money, power, and career, while denying women’s deeper longings for love and a family, the modern women’s movement got its priorities upside down and backward” (#2475).  So she began asking deeper, more philosophical questions.  Initially, she embraced the “self-actualization” psychology of Abraham Maslow—in reality a “self-as-god” way of thinking that cannot but fail.  “Detached from God,” she laments, “I was ready to listen to any blowhard who came my way” (#1436).  Ultimately, she and her husband “went back to church” and found, much too late in many ways, the truth she’d always sought.  “After we returned to church, everything in our lives seemed fresh and new.  Never had we been so happy” (#2192).

The Browders initially entered an Episcopal church in Connecticut.  Later, while living in California and ever-more deeply hungering for God’s Reality, they entered the Catholic Church in 2003.  To her amazement, “This wasn’t the ‘stuffy, old, patriarchal church’ I’d heard about.  The Church’s teachings were all about love, joy, and forgiveness.”  Still more:  “This was a complete system of philosophical thought and mystical faith with answers the entire world needed to hear” (#3031).  Subverted is an engrossing story, packed with important insights, that tells us much that’s gone wrong in our country during the past half-century.  

                                           * * * * * * * * * * * * * * * * * * * * *

In Tied Up in Knots:  How Getting What We Wanted Made Women Miserable (New York:  Broadside Books, c. 2016), Andrea Tantaros sets forth a secular critique of modern feminism that blends praise and protest for what’s happened for and to women during the past half-century.  She grew up taking to heart Betty Friedan’s message in The Feminine Mystique.  Empowered thereby, she pursued a media career and ultimately landed a position with Fox News, where she regularly airs her views before a national audience.  What more could a young woman want?  And she likes what she’s got and still supports the feminist agenda—“If I have to choose between feminism and the pre-feminist days, I will choose feminism without hesitation” (#3320).  Yet, it turns out, amidst all her success there has come a gnawing suspicion that there’s more to life than the feminist mantra of “making it in a man’s world.”  Acting like men, feminists insisted they “pay our bills, open our own doors, and carry our own bags” (#757).  But as they stopped acting like women, real men steadily lost interest in them.  Ah, there’s the rub!

Certainly “women should be equal with men, but, at the same time, women aren’t men.  Equal does not mean the same” (#199).  Yet that’s what many feminists demanded.  Consequently, “feminism doesn’t feel very feminine” (#204).  What Tantaros calls “the Power Trade” negotiated by feminists was in fact “a deal with the devil,” for by imitating men women “abandoned our potent and precious female power” and ceased to act like ladies (#210).  Indeed, many of the movement’s leaders have waged war against men and done lethal harm to healthy heterosexual romance and marriage.  Speaking autobiographically, Tantaros says:  “I have been a one-woman focus group on the tenets of feminism for three decades.  But it wasn’t until I found myself single after two back-to-back long-term relationships that I realized how different the dynamic between the sexes had become” (#233).  In short:  she’d become a highly successful woman with neither husband nor children—and that’s not really how it’s supposed to be!  Sadly:  “Postponing marriage and motherhood comes with huge costs—and no one is telling young girls this” (#2753).

Given her own predicament, she’s written this book to try and understand it.  But her analysis, alas, is too often as superficial as the life she’s lived!  She makes interesting observations, tells vivid anecdotes and cites various studies, but she lacks the philosophical, much less theological, resources to address the real issues that so obviously trouble her.  She knows she wants something but cannot actually understand what it is.  So she daydreams about the “superrelationship” she and her “generation of women” await:    “We want a soulful, sexy, and inspired union that can help us realize our full potential in life.  We want a deep connection with a best friend, an emotional and spiritual confidant, and intellectual counterpart who gets our inside jokes, matches us financially, and who loves us with a passion that rivals Romeo’s.  Women have gained power and are refusing to settle—and that is a good thing.  Women can find that kind of love, but we just have to be patient enough to wait for it and refuse to settle for anything less than what we want:  love, fidelity, kindness, respect” (#1142-44).  Such soaring aspirations rarely find fulfillment simply because they’re basically unreal—so lonely women like Tantaros will forever be “tied up in knots,” I fear.

* * * * * * * * * * * * * * * * * * * *

One of the 20th century’s most remarkable women was Edith Stein, a Jewess who studied philosophy with Edmund Husserl, taught philosophy in German universities, and then converted to the Catholic Church.  She joined the Carmelite order, devoting herself to teaching (clearly her great vocation) in its schools.  When the Nazis gained control of Germany, Stein fled to Holland but was in time arrested and sent to a concentration camp, where she perished.  In 1998 Pope John Paul II elevated her to sainthood; she stands as a wonderful witness to both intellectual brilliance and spiritual sanctity.  During the 1930s she wrote and delivered as lectures a series of papers now collected in volume two of her collected works and titled Essays on Woman, Second Edition, Revised (Washington:  ICS Publications, c. 1996).  That few if any leading feminist writers (e.g. Betty Friedan) are first-rate thinkers becomes clear when one reads how a truly great philosopher addresses the topic!  Given the nature of a collection of papers, many of Stein’s positions are routinely repeated, and a careful perusal of a selected few would reveal the essence of her thought.

Feminism, in accord with a litany of other ideologies, inevitably fails inasmuch as it misrepresents and endeavors to evade Reality.  But as a serious philosopher, Stein understood her task:  to see clearly and better understand whatever is.  Thus she continually sought to probe the essence of womanhood—“what we are and what we should be”—discerning therein direction for evaluating the feminist movement and describing the proper life—and particularly the redemptive form of life—best for females.  She applied St. Thomas Aquinas’ understanding of the analogia entis to her work, seeing God’s image in human beings who need (like a planted seed) both human assistance and divine grace to attain their true end.  Though feminists generally insisted there were no significant differences between men and women, thus calling for identical educational curricula and vocational opportunities, Stein upheld what she considered an indubitable truth:  sexual differences matter greatly.

Thus, in “The Ethos of Women’s Professions,” she sought to discuss work in light of “an inner form, a constant spiritual attitude which the scholastics term habitus” (#718), which necessitates we recognize “specifically feminine” vocations.  To Stein, there are “natural feminine” traits that “only the person blinded by the passion of controversy could deny” (#747).  As both Scripture and common sense make clear, “woman is destined to be wife and mother.  Both physically and spiritually she is endowed for this purpose.”  Giving structure to her bodily being is that spiritual reality—the anima forma corporis—which differentiates her from men of the same species.  Thus she “naturally seeks to embrace that which is living, personal and whole.  To cherish, guard, protect, nourish and advance growth is her natural, maternal yearning” (#755).  Unlike men, with their penchant for abstractions and devotion to tasks, women relish more concrete, living things.  Their “maternal gift is joined to that of companion.  It is her gift and happiness to share the life of another human being and, indeed, to take part in all things which come his way, in the greatest and smallest things, in joy as well as in suffering, in work, and in problems” (#762).   Works of charity, in particular, come quite naturally to her.

Understanding this God-given reality, women rightly enter various professions, and “there is no profession which cannot be practiced by a woman” (#815).  Yet some work—nursing, teaching, social work—more easily accommodates the “sympathetic rapport” that comes naturally to them.  Indeed, “the participation of women in the most diverse professional disciplines could be a blessing for the entire society, private or public, precisely if the specifically feminine ethos would be preserved” (#844).  Still more, in light of the Thomistic position that “Grace perfects nature—it does not destroy it,” women should always seek to flourish in accord with their unique nature, their femininity, serving God through “quiet immersion in divine truth, solemn praises of God, propagation of the faith, works of mercy, intercession, and vicarious reparation” (#858).  Surrendering to God, seeking to do His will, opens the door to human flourishing.  “God created humanity as man and woman,” she concludes, “and He created both according to His own image.  Only the purely developed masculine and feminine nature can yield the highest attainable likeness to God.  Only in this fashion can there be brought about the strongest interpenetration of all earthly and divine life” (#955).

Stein consistently contends for “The Separate Vocations of Man and Woman According to Nature and Grace.”  If we carefully attend to what’s real, “the person’s nature and his life’s course are no gift or trick of chance, but—seen with the eyes of faith—the work of God.  And thus, finally, it is God Himself who calls.  It is He who calls each human being to that to which all humanity is called, it is He who calls each individual to that to which he or she is called personally, and, over and above this, He calls man and woman to something specific, as the title of this address indicates” (#974).  In the biblical creation account, Adam and Eve “are given the threefold vocation:  they are to be the image of God, bring forth posterity, and be masters over the earth” (#990).  Given that assignment, Eve is called to be Adam’s “helpmate”—an “Eser kenegdo—which literally means ‘a helper as if vis-a-vis to him’” (#1000).  Both sexes are equal and equally important, sharing responsibility to “fill the earth and subdue it,” though their roles in carrying out the assignment rightly differ.  Theirs is a complementary relationship:  “man’s primary vocation appears to be that of ruler and his paternal vocation secondary (not subordinate to his vocation as ruler but an integral part of it); woman’s primary vocation is maternal:  her role as ruler is secondary and included in a certain way in her maternal vocation” (#1228).

Rather than point to an evil “patriarchy” or unjust polity, Stein locates the source of the problems women experience in man’s Fall:  “Everywhere about us, we see in the interaction of the sexes the direct fruits of original sin in the most terrifying forms:  an unleashed sexual life in which every trace of their high calling seems to be lost; a struggle between the sexes, one pitted against the other, as they fight for their rights and, in doing so, no longer appear to hear the voices of nature and of God.  But we can see also how it can be different whenever the power of grace is operative” (#1264).  So there is, in God’s Grace, hope for us all:  “The redemption will restore the original order.  The preeminence of man is disclosed by the Savior’s coming to earth in the form of man.  The feminine sex is ennobled by virtue of the Savior’s being born of a human mother; a woman was the gateway through which God found entrance to humankind.  Adam as the human prototype indicates the future divine-human king of creation; just so, every man in the kingdom of God should imitate Christ, and in the marital partnership, he is to imitate the loving care of Christ for His Church.  A woman should honor the image of Christ in her husband by free and loving subordination; she herself is to be the image of God’s mother; but that also means that she is to be Christ’s image” (#1160).

As a committed Catholic, Stein defends the Church’s tradition regarding the priesthood.  “If we consider the attitude of the Lord Himself, we understand that He accepted the free loving services of women for Himself and His Apostles and that women were among His disciples and most intimate confidants.  Yet he did not grant them the priesthood, not even to his mother, Queen of the Apostles, who was exalted above all humanity in human perfection and fullness of grace” (#1375).  Why?  Because in the natural order designed by God, “Christ came to earth as the Son of Man.  The first creature on earth fashioned in an unrivaled sense as God’s image was therefore a man; that seems to indicate to me that He wished to institute only men as His official representatives on earth” (#1391).  Men and women are equally called to enter into communion with their Lord, but they are called to follow different paths in doing so.  “It is the vocation of every Christian, not only of a few elect, to belong to God in love’s free surrender and to serve him” (#1391).  This is, above all, everyone’s vocation, and therein there is “neither male nor female.”

“God has given each human being a threefold destiny,” Stein says:  “to grow into the likeness of God through the development of his faculties, to procreate descendants, and to hold dominion over the earth.  In addition, it is promised that a life of faith and personal union with the Redeemer will be rewarded by eternal contemplation of God.  These destinies, natural and supernatural, are identical for both man and woman.  But in the realm of duties, differences determined by sex exist” (#1627).  Especially in the process of procreating and rearing children, women must carefully sense and assent to God’s plan for man.  Though single women like Stein herself have an important calling, for most women marriage and children should be fundamental—for these are, in truth, most vital to their being and ultimate happiness.

Had feminists in the 20th century thought as deeply as Stein—and followed the truth wherever it leads—much of the negative fallout felt by today’s young women could have been avoided!   Were influential academics as committed to truth-telling as Stein, we’d not be burdened with the strident declarations that there are absolutely no differences between the sexes!  Were Christians more concerned with God’s will than with politically correct posturing, there would be greater focus and effectiveness to the Church’s mission.

# # #  

282 A “Republican” Constitution?

  Following the work of the Constitutional Convention of 1787, a Philadelphian asked Benjamin Franklin:  “Well, Doctor, what have we got, a republic or a monarchy?”  Franklin promptly responded, “A republic, if you can keep it.”  He and his colleagues obviously sought to establish a constitutional republic, subject to laws rather than men, but they also (as was evident in many of their debates) wanted to preserve this “republic” from a host of “democratic” abuses that might threaten it.  This differentiation sets the stage for Randy E. Barnett’s insightful treatise, Our Republican Constitution:  Securing the Liberty and Sovereignty of We the People (New York:  HarperCollins, c. 2016), wherein he argues that we must interpret the Constitution in light of the Declaration of Independence’s memorable assertion:  “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.  That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed.”  

That pre-existing, natural Rights are given by the Creator and possessed by individual persons, not groups of people, marks a true Republic!  Such rights were then secured by a written document, the Constitution, affording coming generations protection from those who would infringe upon them.  “A Republican Constitution views the natural and inalienable rights of these joint and equal sovereign individuals as preceding the formation of governments, so first come rights and then comes government” (#621).  Contrariwise, when one thinks rights reside in collectives—and are therefore posited or granted by certain groups, e.g. a majoritarian government—he champions Democracy.  “A Democratic Constitution is a ‘living constitution’ whose meaning evolves to align with contemporary popular desires, so that today’s majority is not bound by what is called ‘the dead hand of the past.’  The will of yesterday’s majority cannot override the will of the majority today” (#592).  It logically follows that in a Republic there are elected “representatives” who serve the people; in a Democracy there are “leaders” who court and implement the will of their supporters. 

To oversimplify, Americans lived under a Republican Constitution for the first century of this nation’s existence.  During the next century, however, an increasingly Democratic Constitution became normative.  At issue today is this:  can we—will we—find ways to restore the Republic established by Franklin and his colleagues?  To do so requires us, firstly, to rightly understand the Constitution as crafted in 1787, beginning with the Declaration of Independence and its reliance on the “Laws of Nature.”  Here Barnett, a distinguished professor of law at Georgetown University, exemplifies his pedagogical profession, carefully describing and explaining that tradition.  To understand what the Declaration meant by this phrase, Barnett cites an illuminating passage from a sermon delivered by the Reverend Elizur Goodrich in 1776:  “‘the principles of society are the law, which Almighty God has established in the moral world, and made necessary to be served by mankind; in order to promote their true happiness, in their transactions and intercourse.’  The laws, Goodrich observed, ‘may be considered as principles, in respect of their fixedness and operation,’ and by knowing them, ‘we discover the rules of conduct, which direct mankind to the highest perfection, and supreme happiness of their nature.’  These rules of conduct ‘are as fixed and unchangeable as the laws which operate in the natural world.  Human art in order to produce certain effects, must conform to the principles and laws, which the Almighty Creator has established in the natural world’” (#812).  This succinctly summarizes the “Natural Law” tradition.

The Constitution composed in Philadelphia sought to establish a tightly limited government rooted in these natural laws, securing “we the people’s” inalienable rights from the pervasive excesses of democracy under the Articles of Confederation—on display to James Madison wherever “‘measures are too often decided, not according to the rules of justice and the rights of the minor party, but by the superior force of an interested and overbearing majority’” (#1120).  The people are indeed sovereign, the source of the republic’s authority.  But such sovereignty, as clearly recognized by John Jay and James Wilson, the nation’s preeminent judicial thinkers, resided in individuals, not the collectivist “general will” of Rousseau.   

Yet Rousseau’s position helped shape the Democratic Party, which was established by Andrew Jackson and Martin Van Buren in 1832.  “The concept of the will of the people was central to Van Buren’s ‘true democracy.’”  He embraced the great principle, first formally avowed by Rousseau, “‘that the right to exercise sovereignty belongs inalienably to the people,’” who should rule through popular majorities (#1585).  In the 1850s Stephen A. Douglas would pick up on this idea and promote his vision of “popular sovereignty” in defense of allowing the diffusion of slavery wherever the people supported it.  Abraham Lincoln, of course, took a different view, and the Republican Party first waged a war and later passed the 13th, 14th, and 15th Amendments to secure the individual rights of all persons, thus eliminating slavery in this nation.   

Following the Reconstruction era, however, Barnett says we began “losing our Republican Constitution” when the Supreme Court effectively gutted the three Amendments that freed the slaves and recognized their status as citizens, thereby acceding to the will of racist Democrats in the South.  Simultaneously the Court (as personified by Oliver Wendell Holmes) gradually endorsed legislation passed by Progressives (both Democrat and Republican) who wanted to change the nation by implementing a variety of political, economic and social reforms—often through administrative agencies and courts, staffed with the “experts” so beloved by Progressives.  They insisted the Constitution is a “living” compact—a “living and organic thing” said Woodrow Wilson—constantly subject to change in whatever direction a majority of the people desire.  With the triumph of FDR and the New Deal, this “living” Constitution—a will-of-the-people Democratic agreement—became the “law of the land.”  

Though this “Democratic” understanding of the Constitution still prevails in this nation’s corridors of power, Barnett thinks it possible to restore the original, “Republican,” understanding to its rightful place.  The federalism and limited government intended by the Founders in 1787 still matter if we are concerned with our God-given rights and personal liberties.  And since 1986, with the elevation of William Rehnquist to Chief Justice of the Supreme Court, hopeful signs of a renewed federalism (apart from economic policies) have been on the horizon, though President Barack Obama has done everything possible to frustrate this possibility.  Thus Barnett thinks we need to add ten new amendments (initiated by the states) to the Constitution, so as to preserve its Republican nature.  

Though some of Barnett’s presentation will appeal only to readers with suitable backgrounds in legal history and political philosophy, he has set forth a meaningful way to understand the basic issues in this nation’s 200-year history.  Restoring a Republican Constitution would require heroic work in many ways, but it is certainly a goal worth pursuing for citizens concerned for the real welfare of this Republic.

* * * * * * * * * * * * * * * * * 

In Living Constitution, Dying Faith:  Progressivism and the New Science of Jurisprudence (Wilmington, DE, c. 2009) Bradley C. S. Watson aims “to elucidate the connection that American progressivism as philosophical movement and political ideology has with American legal theory and practice” (p. xvi).  Progressivism combined Social Darwinism and Pragmatism (twin ingredients evident in William James and John Dewey, Oliver Wendell Holmes and Louis Brandeis, Theodore Roosevelt and Woodrow Wilson and Barack Obama), and as a result we are now subject to “historicist jurisprudence”—taking whatever exists at the moment as good and true simply because it stands at the current cusp of historical processes, because it is “on the right side of history.”  We have a judicial system that “is not only hostile to the liberal constitutionalism of the American Founders, but to any moral-political philosophy that allows for the possibility of a truth that is not time-bound” (p. xvi).  These Progressives consciously rejected the Natural Law tradition, running from Plato to the American architects of the Constitution, which holds that good law must be anchored in abiding truths authored by God.  

The “living” or “organic” Constitution promoted by Progressives was on display when the Supreme Court, in Planned Parenthood v. Casey (1992), justified abortion as a constitutional right inasmuch as every person has “the right to define one’s own concept of existence, of meaning, of the universe, and of the mystery of human life” (p. 3).  Within a decade the Court (in Lawrence v. Texas) further affirmed an “emerging recognition” of homosexual behavior that would lead, within another decade, to the legalization of same-sex marriage.  Only an ever-evolving “constitution,” utterly unhinged from the written document of 1787, could rationalize such judicial edicts!  But this was clearly the Progressive vision set forth by Herbert Croly a century ago when he urged jurists to discard “Lady Justice,” blindfolded and holding a scale in her hands.  To replace her he suggested a studious woman wearing spectacles, committed to “social justice,” with suitable tools at hand with which to accomplish her goals.  Judges were to decide how to make the world better, not to give “what is due” to all persons.

To show how Progressivism has changed the nation, Watson revisits the “Constitution of the Fathers” which set forth the American Creed, beginning with the Declaration of Independence’s great affirmation that we “hold these truths to be self-evident, that all men are created equal.”  As Abraham Lincoln—one of the greatest of the Constitution’s interpreters—believed, “there are such things as natural rights that do not change with time, that the American Constitution is dedicated to preserving them, and that the role of great political actors, while responding to urgent necessities, is to look backward rather than forward” (p. 38).  When Lincoln famously declared (in 1863) that this nation was “conceived in liberty, and dedicated to the proposition that all men are created equal,” he clearly appealed to “the laws of nature and nature’s God,” undergirding America’s constitutional republic.  

Yet even as Lincoln was invoking God’s laws, Charles Darwin was unleashing an intellectual revolution, reducing everything to evolution through natural selection.  Consequently, Social Darwinists, enamored with evolutionary “progress,” declared themselves freed from all allegedly eternal principles and embraced the historical developments that improve both the human animal and society.  Change is constant—and under the guidance of natural selection (which is helped along by scientifically-trained experts in the social world) it is always for the better!  In America, enthusiastic Darwinists, most notably John Dewey, provided a philosophy (Pragmatism) for committed Progressives from FDR to Barack Obama, who sought to improve things through “progressive education, the welfare state, and the redistribution of capital” (p. 83).  “Long before ‘the courage to change’ became an effective presidential campaign slogan, Dewey helped ensure that ‘change’ would have a central position in American political rhetoric” (p. 84).   

After retelling the story of Progressivism’s political triumphs, running from Woodrow Wilson’s “New Freedom” through FDR’s “New Deal” to LBJ’s “Great Society,” Watson explains how it shaped “the new science of jurisprudence” whereby the “moral realism” of Madison and Lincoln was replaced by skepticism and sociological jurisprudence.  Thus Progressive jurists, Richard Epstein says, “‘attacked the twin doctrines that most limited government power—federalism, on the one hand, and the protection of individual liberty and private property, on the other. . . .  However grandly their rhetoric spoke about the need for sensible government intervention in response to changed conditions, the bottom line, sadly, was always the same:  replace competitive processes, by hook or by crook, with state-run cartels’” (p. 117).  

To influential jurists such as Oliver Wendell Holmes, the Constitution means whatever the Supreme Court decrees.  He and his disciples openly disdained any objective moral standards—right and wrong simply changed over the course of time as the stronger rightly dominated the weaker!  Thus “Holmes is a candidate for many labels—pragmatist, utilitarian, Nietzschean, social Darwinist, nihilist” (p. 132).  Rather like Thrasymachus in Plato’s Republic, Holmes considered “justice” to be whatever the dominant person or system determined.  Whatever the established government wants, it rightly gets.  In a democracy, whatever the majority of the people want, they should get.  In time, their wants will change, so laws (or constitutions) must change to implement their desires.  By rejecting the Natural Law, Holmes and his followers clearly repudiated Lincoln and Madison, but they also rejected “the very notion that human beings are creatures of a certain type, with transcendent purposes and ends that do not change with time.  The new jurisprudence was suspicious of the very idea of justice itself” (p. 145).  

Obviously dismayed by the impact of this “new science of jurisprudence,” Watson concludes his work by noting “the future is now.”  For the good of our nation, for the good of coming generations, it’s imperative to return to the wisdom of the Founders as endorsed by Abraham Lincoln.  To do so requires us first of all to recover our language.  Progressives, as Orwell’s 1984 makes clear, manipulate language, massaging it to attain their ends.  Thus advocates of same-sex marriage effectively change the meaning of marriage, a noun which by definition requires an opposite-sex union, something affirmed through centuries of “common law and American constitutional law” (p. 186).  Such advocates dramatically illustrate the power of philosophical Nominalism—saying so makes it so!  More radically, Watson predicted, “courts will routinely declare men to be women and vice versa, according to the political pressures of the age” (p. 191).   

* * * * * * * * * * * * * * * * * *

  Living in a “constitutional republic,” we Americans should (one would think) seriously seek to understand the document that sets forth its principles and precepts.  To do so, it’s helpful to consult The Constitution:  An Introduction (New York:  Basic Books, 2015) by Michael Stokes Paulsen and Luke Paulsen.  The book is a father (Michael) and son (Luke) collaboration, written during nine summer vacations while Luke was in high school and college and while Michael was teaching law at the University of Minnesota.  Their partnership initially involved Michael writing a chapter and Luke editing it with an eye on readability for students and non-lawyers, the two hoping “to provide a reasonably short, reader-friendly, intelligent introduction to the United States Constitution in all respects—its formation, its content, and the history of its interpretation” (#87).  

Successfully separating from Great Britain, this nation’s founders inscribed their convictions in two pivotal documents:  The Declaration of Independence and The Constitution of the United States, both declaring “the ultimate right of the people to  freely chosen self-government, protective of their natural rights” (p. 4).  When the Articles of Confederation failed to function effectively, a distinguished company of men—the “Framers”—gathered in Philadelphia in 1787 to compose “something entirely new:  a written constitution for a confederate republic, covering a vast territory and embracing thirteen separate states.  . . . .  There was literally nothing in the world like what the framers were trying to achieve” (p. 23).  That it was to be written was hugely important, establishing a government of laws, not men, clearly setting limits to what it could do and not do.  Thus “the meaning of the Constitution is fixed by the original meaning of its words.  The people can change their written Constitution by amendment, but they should not be able to evade or supplant the ‘supreme Law of the Land’ simply by inventing their own meanings for words or altering meanings to suit their purpose” (p. 27).  

As a result of considerable debate and compromise, the Constitution prescribed a federalism balancing powers within the national government (two legislative bodies, an independent executive, an unelected judiciary) and reserving important rights to the states.  When working rightly, this checks-and-balances system guards personal freedom within the legitimate controls of good government.  Though each branch of government has extensive powers, they are limited to those “enumerated” or “granted” and further curtailed by the first ten amendments.  Thus James Madison “worried aloud, when introducing his proposed Bill of Rights in the House of Representatives, that liberties like religious freedom not be set forth in language too narrow, as if to suggest that they were granted by the Constitution rather than recognized in the Constitution” (p. 99).  The Paulsens effectively describe the work of the Founders, providing helpful biographical vignettes of the leading Framers and celebrating their genius.  But one of their compromises—the three-fifths provision regarding slavery—sullied their work and scarred the new nation’s face for 70 years, until a bloody war and three constitutional amendments abolished slavery.  

Having detailed the important components of the written Constitution, the authors address arguments set forth by proponents of a “living Constitution.”  Obviously the Founders crafted a permanent document which would not change over time, except as properly amended.  But various actions (by all three branches of the government), beginning in the first administration of George Washington and advanced by John Marshall’s Supreme Court, slowly expanded the national government’s powers.  With the Union’s victory in the Civil War and Reconstruction, those powers grew quickly, as was evident in Lincoln’s Emancipation Proclamation, and it is clear “that the Civil War was fought over the meaning of the Constitution—and over who would have the ultimate power to decide that meaning” (p. 155).  Then the 14th Amendment abruptly “transferred vast areas of what formerly had been exclusive state responsibility to federal government control” (p. 181).  

With the demise of Reconstruction, however, the authors lament what they call the epoch of “betrayal”—the years from 1876 to 1936 when the Supreme Court “abandoned the Constitution,” denying equal rights to women, upholding racial segregation, nullifying social welfare legislation, etc.  Here it seems to me they think that whenever the Court failed to endorse progressive legislation and ideas it “betrayed” the Constitution.  Other scholars, more libertarian or conservative in their orientation, definitely see these years quite differently!  Fortunately, say the Paulsens, FDR rode to the rescue, and the New Deal Court rightly restored the Constitution by correcting earlier abuses.  FDR’s appointees upheld the constitutionality of his commitment to extend “national government power over the economy” (p. 220), though they curtailed the executive branch’s authority by annulling one of President Truman’s orders in the pivotal Youngstown case.  Especially important was the Warren Court’s Brown v. Board of Education, ending racially segregated schools and launching “the process of dismantling America’s history of legal racial apartheid” (p. 220).  

From 1960 to the present, the national government has expanded dramatically, leaving little of the Constitution’s original “federal” structure standing.  As judicial activists in the courts have sustained this process, we increasingly have an unwritten constitution, meaning whatever the current Supreme Court desires it to be, the Court even claiming for itself the “supreme authority to interpret the Constitution—provocatively elevating its own decisions to the same level as the Constitution itself.  However questionable that claim, nobody successfully challenged it” (p. 262).  Such arrogance was fully on display in the Roe v. Wade decision that imposed abortion-on-demand throughout the land.  “Not even Dred Scott, warped as it was in its distortion of constitutional text, so completely seemed to disregard the text as Roe did” (p. 270).  Other critical decisions—ranging from affirmative action to same-sex marriage—further illustrate the withering of the “written Constitution” which once preserved this nation as one under laws rather than men.

# # #  

281 Something, Not Nothing

    If I stumble over something in the dark, I know something’s there.   It’s not something I’m dreaming about, something solely in my mind.  What it is I know not, though when carefully inspected it’s obviously a stool.  That it’s there I’m certain—such sensory information can be painfully indubitable.  It’s something!  What it is I may later determine, finding it’s clearly a four-legged steel stool, useful for reaching things on high shelves but injurious to the bare foot!  Why it’s there, however, involves an altogether different kind of reasoning, as Aristotle famously demonstrated in his Metaphysics.   When asking why the stool was there—or why it was made of steel rather than wood—I unconsciously assume the truth of an ancient philosophical proposition:  Ex nihilo nihil fit—nothing comes from nothing.  The same reasoning process ensues when I venture into the world around me.  That there’s material stuff I encounter is indubitable.  What it is I can ascertain through certain tests.  But why it exists requires a philosophical, not a scientific way of thinking.  

Empirical questions we rightly investigate using scientific means.  But there are deeper questions which cannot be similarly pursued, since they address non-empirical realities such as goodness, beauty, and God.  Thus Einstein allegedly said “scientists make lousy philosophers.”  In ancient Greece most pre-Socratic thinkers were empirical, monistic materialists, though some did think a mysterious kind of infinite, non-material Being existed.  “The decision of this question,” Aristotle said, “is not unimportant, but rather all-important, to our search for truth.  It is this problem which has practically always been the source of the differences of those who have written about nature as a whole.  So it has been and so it must be; since the least initial deviation from the truth is multiplied later a thousandfold” (On the Heavens, I, 5; 271 [5-10]).  

Aristotle’s insight is nicely illustrated in Lawrence M. Krauss’s A Universe from Nothing:  Why There is Something Rather Than Nothing (New York:  Atria, c.  2013).   A noted physicist-turned-cosmologist, Krauss tries to show, as the book’s title says, how the universe literally came from nothing.  Realizing the linguistic pit he’s digging, however, he tries to re-define the word “nothing” to mean, it seems to me:  “well, almost nothing,” since there’s a mysterious but necessarily material realm that magically gives birth to the material world.  Krauss also realizes the word “why” brings with it all sorts of philosophical baggage—especially denoting a rational direction and purpose to the cosmos—which he resolutely refuses to consider.  So he declares that scientists such as himself deal only with “how” questions—the only ones worth pondering.   And “the question” he cares about, “the one that science can actually address, is the question of how all the ‘stuff’ in the universe could have come from no ‘stuff,’ and how, if you wish, formlessness led to form” (#130 in Kindle).  Dismissive of  both philosophy and theology, he insists that he and his guild alone can provide the answers to life’s important questions.  But he slides, incessantly, from “how” to “why” questions, showing how  “scientists make lousy philosophers.”  

On one level, A Universe from Nothing offers the general reader a fine summary of what scientists have discovered during the past century.  It is indeed a fascinating “cosmic mystery story”—filled with black holes and quarks and dark matter—told with zest and skill.  We have before us an amazing amount of data regarding the age and shape of the material world, though the conclusions reached regarding the data have certainly changed with time.  “String” theories have given way to “multi-universe” hypotheses.  The “steady-state” position once championed by distinguished physicists has been replaced by the “big-bang” view now accepted by most “authorities.”  To theists who for centuries have believed God created (ex nihilo) all that is, the big-bang notion fits easily into their cosmology—the universe simply came into being, in an instant, as God spoke it into existence.  An eternal, purely spiritual Being could easily bring into being all that is.  But to materialists such as Krauss there must be a purely material Source—and he devotes this treatise to showing how it might in fact conceivably exist.  And as he chooses to use the word, “‘nothing’ is every bit as physical as ‘something,’ especially if it is to be defined as the ‘absence of something’” (#241).  

There is thus an Alice-in-Wonderland quality to Krauss—words simply mean whatever he chooses them to mean.  “‘When I use a word,’ Humpty Dumpty said, in rather a scornful tone, ‘it means just what I choose it to mean—neither more nor less.’ ‘The question is,’ said Alice, ‘whether you can make words mean so many different things.’ ‘The question is,’ said Humpty Dumpty, ‘which is to be master—that’s all.’”  So too Krauss insists words such as “nothing” mean what he wants them to mean, not what they really mean!  (And, to confuse matters even further, important word meanings shift as the book’s argument develops!)  The book thus offers an enormous amount of data accompanied by only a passing awareness of logic—a vital part of the philosophical thinking he disdains!  That he first asked the late Christopher Hitchens to pen an introduction to this treatise—and then turned to Richard Dawkins, who assented to do so—indicates the “new atheist” agenda undergirding this book!  That Dawkins could have seriously referred to the “selfish genes” and “memes” so memorably lampooned by the Australian philosopher David Stove shows that even “scientific” superstars can lack basic reasoning skills!  And a similar deficiency blemishes Krauss’s presentation.  

* * * * * * * * * * * * * * * * *

In Why Does the World Exist?  An Existential Detective Story (New York:  Liveright Publishing Corporation, c. 2012), Jim Holt employs his journalistic expertise to explore what Martin Heidegger labeled the greatest of all philosophical questions:  Why is there something rather than nothing at all?  That is the “super-ultimate why” question!  For many years Holt has pondered this and voraciously read first-rate tomes regarding it—as is evident in his “philosophical tour d’horizon” and “brief history of nothing.”  For this book, however, he primarily conducted interviews around the world with the foremost thinkers who are trying to fathom the mystery.  Unlike Lawrence Krauss, Holt understands that the ultimate origin question requires a “meta-scientific” approach, for as the great Harvard astronomer Owen Gingerich said, this is essentially a teleological, not a strictly scientific, question.  

Holt interviewed thinkers as diverse as Adolf Grunbaum, a distinguished philosopher of science and dogmatic atheist who simply dismissed the question as meaningless, and Richard Swinburne, a devout Eastern Orthodox theist who has devoted his life to demonstrating the validity of the traditional belief in “God the Father, maker of heaven and earth, and of all things visible and invisible.”  He talked with David Deutsch, who thinks quantum physics justifies a “many worlds” or “multiverse” hypothesis—if there are an infinite number of universes, then it is quite probable that our universe would have just popped into existence.  Then he sought out Steven Weinberg, author of The First Three Minutes and widely regarded as one of the greatest 20th-century cosmologists, who famously said:  “The more the universe seems comprehensible, the more it also seems pointless.”  Yet in his Dreams of a Final Theory, published in 1993, Weinberg admitted there was simply too much that physicists don’t know for any of them to pontificate on ultimate issues, illustrating an “epistemic modesty” that “was refreshing after all the wild speculation I’d been hearing over the past year” (p. 155).    

Since Plato postulated the eternal existence of intellectual forms, many mathematicians have been Platonists of some sort, believing, as Alain Connes says, “‘there exists, independently of the human mind, a raw and immutable mathematical reality’” (p. 172).  Connes is a distinguished French mathematician who shares Kurt Godel’s confidence in the reality of this non-material numeric realm.  “How else can we account for what the physicist Eugene Wigner famously called the ‘unreasonable effectiveness of mathematics in the natural sciences’?” (p. 172).   Another world-class mathematician, Oxford’s Roger Penrose, is an “unabashed Platonist” who takes “mathematical entities to be as real and mind-independent as Mount Everest” (p. 174).  When interviewed, Penrose said there are really three worlds, “‘all separate from one another.  There’s the Platonic world, there’s the physical world, and there’s also the mental world, the world of our conscious perceptions’” (p. 177).      

John Leslie, considered by many “the world’s foremost authority on why there is Something rather than Nothing,” confesses he thought when he was young that he’d found the answer to the question.  But then he learned, ‘“to my horror and disgust,’” that “‘Plato had got the same answer twenty-five hundred years ago!’” (p. 197).  Subsequently he developed “extreme axiarchism,” positing that “reality is ruled by abstract value—axia being the Greek word for ‘value’ and archein for ‘to rule’” (p. 198).  “‘For those who believe in God,’ he thinks, ‘it has even provided an explanation for God’s own existence:  he exists because of the ethical need for a perfect being.  The idea that goodness can be responsible for existence has had quite a long history—which, as I’ve said was a great disappointment for me to discover, because I’d have liked it to have been all my own’” (p. 199).  

Holt ends the book rather as he began it—intrigued by all sorts of theories but persuaded by none!  Though the question he’s asking is fundamentally serious, there’s a certain intellectual detachment, almost a levity, to the book.  But it does provide an engaging survey of the cosmological scene, leaving the reader to sort out what’s important or irrelevant to him.

* * * * * * * * * * * * * *

When the erudite Boston College philosopher Peter Kreeft says “This is, quite simply, the single best book I have ever read on what most of us would regard as the single most important question of philosophy:  Does God exist?  It will inevitably become a classic,” one is wise to read carefully Michael Augros’ Who Designed the Designer:  A Rediscovered Path to God’s Existence (San Francisco:  Ignatius Press, c. 2015).  Unlike the many works of apologetics that rely on cosmology, with its heavy load of scientific theory and evidence, this treatise simply asks us to reason carefully.  Rather than think inductively, collecting facts, we must think deductively, following reason.  Simple, self-evident assumptions—absolute, universal propositions such as the Pythagorean theorem—carefully developed into arguments, lead necessarily to certain indubitable conclusions.  “As the argument advances,” he promises, “I will never ask you to believe in someone else’s findings or observations.  Instead, all the reasoning will begin from things you yourself can immediately verify” (p. 12).  That “equals added to equals make equals” or that “every number is either even or odd” cannot be denied, precisely because such propositions are self-evident.  

So Augros begins with the simple truth that children incessantly ask why?  “This endearing (if sometimes trying) property of children is human intellectual life in embryo.   In its most mature forms of science and philosophy, the life of the human mind still consists mainly in asking why and in persisting in that question as long as there remains a further why to be found.  Ultimately we wonder:  Is there a first cause of all things?  Or must we ask why and why again, forever, reaching back and back toward no beginning at all?  Does every cause rely on a prior cause?  Or is there something that stands in need of no cause, but just is?” (p. 9).  In response, Augros unambiguously intends “to show, by purely rational means, that there is indeed a first cause of all things and that this cause must be a mind” (p. 10).  In many ways he simply seeks to fully demonstrate the elegant simplicity and persuasiveness of the ancient Kalam argument so successfully defended in our day by William Lane Craig:  

Premise 1:  Everything that begins to exist has a cause.

Premise 2:  The universe began to exist.

Conclusion:  Therefore, the universe must have a cause.
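
For readers who like to see the logical skeleton laid bare, the syllogism amounts to universal instantiation followed by modus ponens.  The following sketch in the Lean proof language (purely illustrative, with placeholder names of my own choosing for “begins to exist,” “has a cause,” and “the universe”) shows that the inference itself is formally airtight:

    -- Illustrative only: the Kalam syllogism rendered as universal
    -- instantiation plus modus ponens.  All names are placeholders.
    example (Thing : Type)
        (beginsToExist hasCause : Thing → Prop)
        (universe : Thing)
        (premise1 : ∀ x : Thing, beginsToExist x → hasCause x)  -- Premise 1
        (premise2 : beginsToExist universe)                     -- Premise 2
        : hasCause universe :=                                  -- Conclusion
      premise1 universe premise2

Formal validity, of course, is the easy part; all the philosophical work lies in defending the two premises, which is precisely what Craig and, in his own way, Augros undertake.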

Then let’s begin!  Whenever we reason we seek to find the causes of things.  As Aristotle put it:  “Evidently there is a first principle, and the causes of things are neither an infinite series nor infinitely various in kind” (Metaphysics).  On this point, “Twenty-five centuries’ worth of great philosophers and scientists nearly all are agreed” (p. 30).  But this cause is not necessarily temporal!  The universe might well be eternal and still stand in need of a First Cause!  An acting cause, such as a potter making a vase, is simultaneous with, not prior to, the product he’s producing.  “Recognizing causal priority as distinct from temporal priority opens the door to a first cause of an eternal effect” (p. 32).  Thus “the great thinkers who all insist there is a first cause used the expression first cause not to mean (necessarily) a cause before all other causes in time, but a cause before all others in causal power.  It meant a cause of other causes that does not itself depend on any other cause.  It meant, in other words, something that exists and is all by itself, without deriving its existence or causal action from anything else.  And it meant not a thing stuck in the past, but a thing existing in the present” (pp. 32-33).  Ultimately, “it is impossible for things caused by something else to be self-explanatory.  There must also be something by which things are caused and which is not itself caused by anything” (p. 37). 

Granting the certain existence of a first cause, however, is only the first step in demonstrating the existence of God, who is the First Cause and whose Mind sketched the blueprint for the universe (a word derived from the Latin for “turned into one”).  Unlike the Greek polytheists, who assigned events to various gods, monotheists following Moses think there is only One true Cause of all that is.  Carefully considered, the material world—matter-in-motion—could not have caused itself and is quite evidently “the first thing from the first cause” (p. 66).  “Matter is not the first cause.  It is impossible for it to be so.  Matter is subject to motion.  The first cause, on the other hand, is not” (p. 60).  Only a non-material Being could be a self-mover, moving everything else.  The ancient Chinese thinker Lao-Tzu noted that “‘to turn a wheel, although thirty spokes must revolve, the axle must remain motionless; so both the moving and the non-moving are needed to produce revolution.’  This reasoning sounds the death knell for the theory that matter is the first cause.  Matter, energy, and fundamental particles are all subject to motion.  The first cause [the axle] is not” (p. 62).   

Thus the first cause must be non-material, incorporeal, spiritual.  Given our immersion in material things it is, admittedly, difficult to conceive of purely non-material realities!  But just as a mathematical point (which has no parts) is not a visible dot on the paper but a necessary, indivisible reality-without-parts, so too there are metaphysical realities that utterly transcend the physical world.  And the first cause, though not material, is “the most intensely existing thing” of all!  There is a hierarchy to the universe, leading from fundamentally material to essentially non-material beings.  Plants are superior to rocks, animals are better than plants, and human beings are higher than fish and pheasants.  “Mineral, vegetable, animal, human.  These kinds of beings form a ladder of sorts.  Ascending from one rung to another, we find something more capable of including beings within its own being” (p. 91).  Higher beings possess more fullness of being.  On the highest rung, possessing the most being, is the Supreme Being, giving being to all lesser beings.  And since it is axiomatic that “nothing gives what it does not have,” we conclude that everything that exists owes its existence to the One who most fully exists, who simply IS.  

Since we are thinking beings making sense of all sorts of things, it follows that the Supreme Being is the ultimate Thinker.  Even atheistic scientists cannot but acknowledge the seeming intellectual dimension to the cosmos.  Thus Richard Dawkins cautions his fans to beware of taking seriously the “apparent” design of things.  And Stephen Hawking confesses that the “apparent laws of physics” seem to be amazingly well-designed to make for a life-welcoming universe.  But atheists cannot open the door to such non-material realities as “purpose” without bringing into question their materialist dogma.  So the evolutionary biologist Richard Lewontin confessed:  “It is not that the methods and institutions of science somehow compel us to accept a material explanation of the phenomenal world, but, on the contrary, that we are forced by our a priori adherence to material causes to create an apparatus of investigation and a set of concepts that produce material explanations, no matter how counter-intuitive, no matter how mystifying to the uninitiated.  Moreover, that materialism is an absolute, for we cannot allow a Divine Foot in the door” (pp. 147-148).  But, Augros counters, even our limited minds can “understand all things, at least in a general way” and then conceptualize a universe.  Using our limited minds we legitimately envision an Omniscient Mind knowing all things—a First Cause responsible for their existence.  Indeed:  “The intelligence of the first cause of all things explains the look of design everywhere in the universe” (p. 113).  

Rightly discerned, this omnipresent design gives things their distinctive beauty and goodness.  Wonder, both Plato and Aristotle noted, is basic to the philosophic quest—pausing to note the sheer givenness of all that is, reflecting on its mysterious configurations, delving into the why-ness of what’s beheld.  Thus Whittaker Chambers, in Witness, dated his break with Communism to the moment he beheld the sheer beauty of his infant daughter’s ear.  He was overwhelmed with wonder while gazing at “the delicate convolutions of her ear—those intricate, perfect ears.  The thought passed through my mind:  ‘No, these ears were not created by any chance coming together of atoms in nature (the Communist view).  They could have been created only by immense design’” (p. 100).  Then there’s a fascinating passage in Sir Arthur Conan Doyle’s Memoirs of Sherlock Holmes, where Watson recalls Holmes reflecting on “What a lovely thing a rose is!”  Gazing at the color and configuration of a moss-rose, the great detective declared:  “There is nothing in which deduction is so necessary as in religion.  It can be built up as an exact science by the reasoner.  Our highest assurance of the goodness of Providence seems to me to rest in the flowers.  All other things, our powers, our desires, our food, are really necessary for our existence in the first instance.  But this rose is an extra.  Its smell and its color are an embellishment of life, not a condition of it.  It is only goodness which gives extras, and so I say again that we have much to hope from the flowers.”  As we wonder (with Chambers and Holmes) at the beauty and goodness of beings, we cannot but think there must be a first cause, a Supreme Being, responsible for all this.  

In the book’s “Epilogue,” Augros notes he stands on “the shoulders of giants” such as Aristotle and Aquinas.  Though primarily relying on ancient and medieval thinkers and differing in his approach from Rene Descartes, he shares some of the “first modern” philosopher’s confidence:  “The existence of God would pass with me as at least as certain as I have ever held the truths of mathematics.”  Thinkers such as Descartes have ever worked by “deducing the logical consequences of timelessly valid principles.  It is not by chance that those principles have arisen in the thoughts of great minds again and again down through the centuries.  They are the common heritage of the human mind.  ‘Nothing comes from nothing.’  ‘What is put into action depends on what acts by itself.’  ‘Nothing gives what it does not have.’  ‘Some things are nobler than others.’  And on and on.  Such are the laws of being, expressed in terms too universal for science to employ, let alone refute.  We are free to ignore them, since the explicit recognition of their truth is in no way necessary for our daily existence.  . . . .  They just quietly await our notice.  The conclusion that God exists, when deduced from principles like these, is true and hard-won knowledge, worthy of the name” (p. 208).  

That such laws of being point persuasively to the existence of God is the conclusion of this highly readable treatise.  Thus, with Thomas Hibbs, Honors College Dean at Baylor University, I say:  “I know of no other book about the existence and nature of God that is as readable and enjoyable as this one.”