283 Feminist Fallout

   That the unintended consequences of revolutionary political and social movements frequently surpass their original intent may be easily discerned in the study of history.  This truth poignantly surfaces in Sue Ellen Browder’s Subverted:  How I Helped the Sexual Revolution Hijack the Women’s Movement (San Francisco:  Ignatius Press, c. 2015).  She begins with this confession:  “I can give you no justification for what I did in my former life.  I will only say this in my weak defense:  I was a young woman searching for truth, freedom, and meaning in the world, but I had no clue where to find them” (#37 in Kindle).  

In part Subverted is an autobiography, an account of a modern journalist.  As a youngster growing up in Iowa, Browder longed to escape her small-town environs and join the more exciting, opportunity-laden cosmopolitan world she saw in magazines and television.  Determined to become a writer, she entered and then graduated from the University of Missouri’s School of Journalism.  She then worked briefly for a publication in Los Angeles before going to New York and landing a job as a free-lance writer with Helen Gurley Brown’s Cosmopolitan, which in the 1970s was “the undisputed reigning queen of women’s magazines—the hottest women’s magazine in the nation” (#45).  She basked in the glow of early success, seeing her talents displayed in the pages of a publication renowned for promoting the causes she most ardently supported—including the ’60s sexual revolution.  

She’d all too quickly realized her adolescent dream!  “Only later would I realize how dark the dream had become.  Eventually, it would lead to a cacophony of mixed, confused messages in our culture about women, work, sex, marriage, and relationships—errors that have divided our nation and continue to haunt us to this day.  It would lead me to make disastrous decisions” (#63).  But as she and her husband and two children moved about the country, finding a variety of positions and surviving as writers, she continued, for 24 years, publishing articles in Cosmopolitan, telling “lie upon lie to sell the casual-sex lifestyle to millions of single, working women” (#69).  

So Browder’s purpose in writing the book is more than autobiographical—she wants to clarify where and why she went so wrong for so long.  It all began with her naive enlistment in the women’s movement.  Though she’d been reared by parents clearly committed to her personal development, reading Betty Friedan’s The Feminine Mystique when she was 17 powerfully affected her.  “‘The only way,’” Friedan declared, “‘for a woman, as for a man, to find herself, to know herself as a person, is by creative work of her own.  There is no other way’” (#265).  That goal Browder successfully pursued.  But she also met and married another writer, Walter Browder, launching a relationship which would put her at odds with the liberationist feminism Friedan promoted.  She naturally “took the Pill without a qualm,” imagining she could “enjoy sterile sex and control my own sex life” (#334), not knowing how it would “put me on a hormone-powered emotional roller-coaster, which regularly plunged me into black pits of depression” (#334).  

Despite the Pill she became pregnant and had a baby shortly before moving to New York—another complicating relationship!  In her initial interview with the Cosmopolitan staff (knowing Helen Gurley Brown “saw the single girl as ‘the newest glamour girl of our times’ and viewed children as ‘more of a nuisance than a blessing’”) she carefully avoided mentioning the fact she was a mother.  “At Cosmo, I was a dedicated follower of Planned Parenthood founder Margaret Sanger, the foremost proponent of birth control as a panacea to the world’s problems.  Sanger idolized sex without kids.  ‘Through sex,’ Sanger sang joyously in The Pivot of Civilization, ‘mankind may attain the great spiritual illumination which will transform the world, which will light up the only path to an earthly paradise’” (#556-557).  Writing for “Cosmo,” the author laments, “I danced in Sanger’s procession” (#557).

Hired to write articles for Brown’s magazine, Browder quickly learned that lots “of the alleged ‘real people’ we wrote about in the magazine were entirely fictitious” (#527).  While working in California, she’d seen journalists blithely make up “sources” and write articles without doing the hard work of actually investigating events, so constructing stories about a Cosmo Girl who would “sleep with any man she pleased” and enjoy an upwardly mobile career became quite easy for her.   She just constructed imaginary stories, writing about an unreal world.  She remained “a loyal foot soldier in the women’s movement’s media army.  Even as I rejected the sexual revolution lifestyle as a sham, I scrambled to climb aboard NOW’s freedom train” (#694), promoting “a false path to freedom that was not just reprehensible but evil” (#717).  

Blatant evil triumphed when Betty Friedan led the National Organization for Women to join forces with Larry Lader’s NARAL, an abortion-rights group determined to secure abortion-on-demand.  “At Cosmo,” Browder confesses, “the one assumption I never thought to question in my confusion was whether or not abortion and contraception were good for women” (#930).  On a personal level, Browder herself would abort a baby when family finances seemed to dictate.  But she found that having an abortion was hardly the trivial affair Cosmopolitan readers assumed!  Part of herself, as well as her tiny baby, died on that gurney.  As she would later learn when she researched the subject, Lader’s spurious book, Abortion, was cited repeatedly by Justice Harry Blackmun in his Roe and Doe decisions.  In time, Browder would carefully read and reflect on Blackmun’s role in prescribing abortion-on-demand for the country, finding the man and his judicial work seriously flawed.  

  Even while writing her Cosmo articles, at home Browder found in her husband and son a different world, a “better way,” a life “filled with light, laughter and love” (#595).  Her success as a writer only temporarily satisfied her, whereas her work as a mother was “sheer delight” (#1831).  She finally realized “that by focusing almost exclusively on money, power, and career, while denying women’s deeper longings for love and a family, the modern women’s movement got its priorities upside down and backward” (#2475).  So she began asking deeper, more philosophical questions.  Initially, she embraced the “self-actualization” psychology of Abraham Maslow—in reality a “self-as-god” way of thinking that cannot but fail.  “Detached from God,” she laments, “I was ready to listen to any blowhard who came my way” (#1436).  Ultimately, she and her husband “went back to church” and found, much too late in many ways, the truth she’d always sought.  “After we returned to church, everything in our lives seemed fresh and new.  Never had we been so happy” (#2192).  

The Browders initially entered an Episcopal church in Connecticut.  Later, while living in California and ever-more deeply hungering for God’s Reality, they entered the Catholic Church in 2003.  To her amazement, “This wasn’t the ‘stuffy, old, patriarchal church’ I’d heard about.  The Church’s teachings were all about love, joy, and forgiveness.”  Still more:  “This was a complete system of philosophical thought and mystical faith with answers the entire world needed to hear” (#3031).  Subverted is an engrossing story, packed with important insights, that tells us much about what’s gone wrong in our country during the past half-century.  

                                           * * * * * * * * * * * * * * * * * * * * *

In Tied Up in Knots:  How Getting What We Wanted Made Women Miserable (New York:  Broadside Books, c. 2016), Andrea Tantaros sets forth a secular critique of modern feminism that blends praise and protest for what’s happened for and to women during the past half-century.  She grew up taking to heart Betty Friedan’s message in The Feminine Mystique.  Empowered thereby she pursued a media career and ultimately landed a position with Fox News, where she regularly airs her views before a national audience.  What more could a young woman want?  And she likes what she’s got and still supports the feminist agenda—“If I have to choose between feminism and the pre-feminist days, I will choose feminism without hesitation” (#3320).  Yet, it turns out, amidst all her success there has come a gnawing suspicion that there’s more to life than the feminist mantra of “making it in a man’s world.”  Acting like men, feminists insisted they “pay our bills, open our own doors, and carry our own bags” (#757).  But as they stopped acting like women, real men steadily lost interest in them.  Ah, there’s the rub!

Certainly “women should be equal with men, but, at the same time, women aren’t men.  Equal does not mean the same” (#199).  Yet that’s what many feminists demanded.  Consequently, “feminism doesn’t feel very feminine” (#204).  What Tantaros calls “the Power Trade” negotiated by feminists was in fact “a deal with the devil,” for by imitating men women “abandoned our potent and precious female power” and ceased to act like ladies (#210).  Indeed, many of the movement’s leaders have waged war against men and done lethal harm to healthy heterosexual romance and marriage.  Speaking autobiographically, Tantaros says:  “I have been a one-woman focus group on the tenets of feminism for three decades.  But it wasn’t until I found myself single after two back-to-back long-term relationships that I realized how different the dynamic between the sexes had become” (#233).  In short:  she’d become a highly successful woman with neither husband nor children—and that’s not really how it’s supposed to be!  Sadly:  “Postponing marriage and motherhood comes with huge costs—and no one is telling young girls this” (#2753).  

Given her own predicament, she’s written this book to try to understand it.  But her analysis, alas, is too often as superficial as the life she’s lived!  She makes interesting observations, tells vivid anecdotes and cites various studies, but she lacks the philosophical, much less theological, resources to address the real issues that so obviously trouble her.  She knows she wants something but cannot actually understand what it is.  So she daydreams about the “superrelationship” she and her “generation of women” await:  “We want a soulful, sexy, and inspired union that can help us realize our full potential in life.  We want a deep connection with a best friend, an emotional and spiritual confidant, an intellectual counterpart who gets our inside jokes, matches us financially, and who loves us with a passion that rivals Romeo’s.  Women have gained power and are refusing to settle—and that is a good thing.  Women can find that kind of love, but we just have to be patient enough to wait for it and refuse to settle for anything less than what we want:  love, fidelity, kindness, respect” (#1142-44).  Such soaring aspirations rarely find fulfillment simply because they’re basically unreal—so lonely women like Tantaros will forever be “tied up in knots,” I fear.  

* * * * * * * * * * * * * * * * * * * *

One of the 20th century’s most remarkable women was Edith Stein, a Jewess who studied philosophy with Edmund Husserl, taught philosophy in German universities, and then converted to the Catholic Church.  She joined the Carmelite order, devoting herself to teaching (clearly her great vocation) in its schools.  When the Nazis gained control of Germany, Stein fled to Holland but was in time arrested and sent to a concentration camp, where she perished.  In 1998 Pope John Paul II elevated her to sainthood; she stands as a wonderful witness to both intellectual brilliance and spiritual sanctity.  During the 1930s she wrote and delivered as lectures a series of papers now collected in volume two of her collected works and titled Essays on Woman, Second Edition, Revised (Washington:  ICS Publications, c. 1996).  That few if any leading feminist writers (e.g. Betty Friedan) are first-rate thinkers becomes clear when one reads how a truly great philosopher addresses the topic!  Given the nature of a collection of papers, many of Stein’s positions are routinely repeated, and a careful perusal of a selected few reveals the essence of her thought.  

Feminism, in accord with a litany of other ideologies, inevitably fails inasmuch as it misrepresents and endeavors to evade Reality.  But as a serious philosopher, Stein understood her task:  to see clearly and better understand whatever is.  Thus she continually sought to probe the essence of womanhood—“what we are and what we should be”—discerning therein direction for evaluating the feminist movement and describing the proper life—and particularly the redemptive form of life—best for females.  She applied St. Thomas Aquinas’ understanding of the analogia entis to her work, seeing God’s image in human beings who need (like a planted seed) both human assistance and divine grace to attain their true end.  Though feminists generally insisted there were no significant differences between men and women, thus calling for identical educational curricula and vocational opportunities, Stein upheld what she considered an indubitable truth:  sexual differences matter greatly.  

Thus, in “The Ethos of Women’s Professions,” she sought to discuss work in light of “an inner form, a constant spiritual attitude which the scholastics term habitus” (#718), which necessitates we recognize “specifically feminine” vocations.  To Stein, there are “natural feminine” traits that “only the person blinded by the passion of controversy could deny” (#747).  As both Scripture and common sense make clear, “woman is destined to be wife and mother.  Both physically and spiritually she is endowed for this purpose.”  Giving structure to her bodily being is that spiritual reality—the anima forma corporis—which differentiates her from men of the same species.  Thus she “naturally seeks to embrace that which is living, personal and whole.  To cherish, guard, protect, nourish and advance growth is her natural, maternal yearning” (#755).  Unlike men, with their penchant for abstractions and devotion to tasks, women relish more concrete, living things.  Their “maternal gift is joined to that of companion.  It is her gift and happiness to share the life of another human being and, indeed, to take part in all things which come his way, in the greatest and smallest things, in joy as well as in suffering, in work, and in problems” (#762).  Works of charity, in particular, come quite naturally to her.  

Understanding this God-given reality, women rightly enter various professions, and “there is no profession which cannot be practiced by a woman” (#815).  Yet some work—nursing, teaching, social work—more easily accommodates the “sympathetic rapport” that comes naturally to them.  Indeed, “the participation of women in the most diverse professional disciplines could be a blessing for the entire society, private or public, precisely if the specifically feminine ethos would be preserved” (#844).  Still more, in light of the Thomistic position that “Grace perfects nature—it does not destroy it,” women should always seek to flourish in accord with their unique nature, their femininity, serving God through “quiet immersion in divine truth, solemn praises of God, propagation of the faith, works of mercy, intercession, and vicarious reparation” (#858).  Surrendering to God, seeking to do His will, opens the door to human flourishing.  “God created humanity as man and woman,” she concludes, “and He created both according to His own image.  Only the purely developed masculine and feminine nature can yield the highest attainable likeness to God.  Only in this fashion can there be brought about the strongest interpenetration of all earthly and divine life” (#955).  

Stein consistently contends for “The Separate Vocations of Man and Woman According to Nature and Grace.”  If we carefully attend to what’s real, “the person’s nature and his life’s course are no gift or trick of chance, but—seen with the eyes of faith—the work of God.  And thus, finally, it is God Himself who calls.  It is He who calls each human being to that to which all humanity is called, it is He who calls each individual to that to which he or she is called personally, and, over and above this, He calls man and woman to something specific, as the title of this address indicates” (#974).  In the biblical creation account, Adam and Eve “are given the threefold vocation:  they are to be the image of God, bring forth posterity, and be masters over the earth” (#990).  Given that assignment, Eve is called to be Adam’s “helpmate”—an “Eser kenegdo”—which literally means “a helper as if vis-a-vis to him” (#1000).  Both sexes are equal and equally important, sharing responsibility to “fill the earth and subdue it,” though their roles in carrying out the assignment rightly differ.  Theirs is a complementary relationship:  “man’s primary vocation appears to be that of ruler and paternal vocation secondary (not subordinate to his vocation as ruler but an integral part of it); woman’s primary vocation is maternal:  her role as ruler is secondary and included in a certain way in her maternal vocation” (#1228). 

Rather than point to an evil “patriarchy” or unjust polity, Stein locates the source of the problems women experience in man’s Fall:  “Everywhere about us, we see in the interaction of the sexes the direct fruits of original sin in the most terrifying forms:  an unleashed sexual life in which every trace of their high calling seems to be lost; a struggle between the sexes, one pitted against the other, as they fight for their rights and, in doing so, no longer appear to hear the voices of nature and of God.  But we can see also how it can be different whenever the power of grace is operative” (#1264).  So there is, in God’s Grace, hope for us all:  “The redemption will restore the original order.  The preeminence of man is disclosed by the Savior’s coming to earth in the form of man.  The feminine sex is ennobled by virtue of the Savior’s being born of a human mother; a woman was the gateway through which God found entrance to humankind.  Adam as the human prototype indicates the future divine-human king of creation; just so, every man in the kingdom of God should imitate Christ, and in the marital partnership, he is to imitate the loving care of Christ for His Church.  A woman should honor the image of Christ in her husband by free and loving subordination; she herself is to be the image of God’s mother; but that also means that she is to be Christ’s image” (#1160).  

As a committed Catholic, Stein defends the Church’s tradition regarding the priesthood.  “If we consider the attitude of the Lord Himself, we understand that He accepted the free moving services of women for Himself and His Apostles and that women were among His disciples and most intimate confidants.  Yet He did not grant them the priesthood, not even to His mother, Queen of the Apostles, who was exalted above all humanity in human perfection and fullness of grace” (#1375).  Why?  Because in the natural order designed by God, “Christ came to earth as the Son of Man.  The first creature on earth fashioned in an unrivaled sense as God’s image was therefore a man; that seems to indicate to me that He wished to institute only men as His official representatives on earth” (#1391).  Men and women are equally called to enter into communion with their Lord, but they are called to follow different paths in doing so.  “It is the vocation of every Christian, not only of a few elect, to belong to God in love’s free surrender and to serve him” (#1391).  This is, above all, everyone’s vocation, and therein there is “neither male nor female.”  

“God has given each human being a threefold destiny,” Stein says:  “to grow into the likeness of God through the development of his faculties, to procreate descendants, and to hold dominion over the earth.  In addition, it is promised that a life of faith and personal union with the Redeemer will be rewarded by eternal contemplation of God.  These destinies, natural and supernatural, are identical for both man and woman.  But in the realm of duties, differences determined by sex exist” (#1627).  Especially in the process of procreating and rearing children, women must carefully sense and assent to God’s plan for man.  Though single women like Stein herself have an important calling, for most women marriage and children should be fundamental—for these are, in truth, most vital to their being and ultimate happiness.  

Had feminists in the 20th century thought as deeply as Stein—and followed the truth wherever it leads—much of the negative fallout felt by today’s young women could have been avoided!  Were influential academics as committed to truth-telling as Stein, we’d not be burdened with the strident declarations that there are absolutely no differences between the sexes!  Were Christians more concerned with God’s will than with politically correct posturing, there would be greater focus and effectiveness to the Church’s mission.

# # #  

282 A “Republican” Constitution?

  Following the work of the Constitutional Convention of 1787, a Philadelphian asked Benjamin Franklin:  “Well, Doctor, what have we got, a republic or a monarchy?”  Franklin promptly responded, “A republic, if you can keep it.”  He and his colleagues obviously sought to establish a constitutional republic, subject to laws rather than men, but they also (as was evident in many of their debates) wanted to preserve this “republic” from a host of “democratic” abuses that might threaten it.  This differentiation sets the stage for Randy E. Barnett’s insightful treatise, Our Republican Constitution:  Securing the Liberty and Sovereignty of We the People (New York:  HarperCollins, c. 2016), wherein he argues that we must interpret the Constitution in light of the Declaration of Independence’s memorable assertion:  “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.  That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed.”  

That pre-existing, natural Rights are given by the Creator and possessed by individual persons, not groups of people, marks a true Republic!  Such rights were then secured by a written document, the Constitution, affording coming generations protection from those who would infringe upon them.  “A Republican Constitution views the natural and inalienable rights of these joint and equal sovereign individuals as preceding the formation of governments, so first come rights and then comes government” (#621).  Contrariwise, when one thinks rights reside in collectives—and are therefore posited or granted by certain groups, e.g. a majoritarian government—he champions Democracy.  “A Democratic Constitution is a ‘living constitution’ whose meaning evolves to align with contemporary popular desires, so that today’s majority is not bound by what is called ‘the dead hand of the past.’  The will of yesterday’s majority cannot override the will of the majority today” (#592).  It logically follows that in a Republic there are elected “representatives” who serve the people; in a Democracy there are “leaders” who court and implement the will of their supporters. 

To oversimplify, Americans lived under a Republican Constitution for the first century of this nation’s existence.  During the next century, however, an increasingly Democratic Constitution became normative.  At issue today is this:  can we—will we—find ways to restore the Republic established by Franklin and his colleagues?  To do so requires us, firstly, to rightly understand the Constitution as crafted in 1787, beginning with the Declaration of Independence and its reliance on the “Laws of Nature.”  Here Barnett, a distinguished professor of law at Georgetown University, exemplifies his pedagogical profession, describing and explaining that tradition.  To understand what the Declaration meant by this phrase, Barnett cites an illuminating passage from a sermon delivered by the Reverend Elizur Goodrich in 1776:  “‘the principles of society are the law, which Almighty God has established in the moral world, and made necessary to be served by mankind; in order to promote their true happiness, in their transactions and intercourse.’  The laws, Goodrich observed, ‘may be considered as principles, in respect of their fixedness and operation,’ and by knowing them, ‘we discover the rules of conduct, which direct mankind to the highest perfection, and supreme happiness of their nature.’  These rules of conduct ‘are as fixed and unchangeable as the laws which operate in the natural world.  Human art in order to produce certain effects, must conform to the principles and laws, which the Almighty Creator has established in the natural world’” (#812).  This succinctly summarizes the “Natural Law” tradition.

The Constitution composed in Philadelphia sought to establish a tightly limited government rooted in these natural laws, securing “we the people’s” inalienable rights from the pervasive excesses of democracy under the Articles of Confederation—on display to James Madison wherever “measures are too often decided, not according to the rules of justice and the rights of the minor party, but by the superior force of an interested and overbearing majority” (#1120).  The people are indeed sovereign, the source of the republic’s authority.  But such sovereignty, as clearly recognized by John Jay and James Wilson, the nation’s preeminent judicial thinkers, resided in individuals, not the collectivist “general will” of Rousseau.   

Yet Rousseau’s position helped shape the Democratic Party which was established by Andrew Jackson and Martin Van Buren in 1832.  “The concept of the will of the people was central to Van Buren’s ‘true democracy.’”  He believed in the great principle first formally avowed by Rousseau, “that the right to exercise sovereignty belongs inalienably to the people,” who should rule through popular majorities (#1585).  In the 1850s Stephen A. Douglas would pick up on this idea and promote his vision of “popular sovereignty” in defense of allowing the diffusion of slavery wherever the people supported it.  Abraham Lincoln, of course, took a different view, and the Republican Party first waged a war and later passed the 13th, 14th, and 15th Amendments to secure the individual rights of all persons, thus eliminating slavery in this nation.   

Following the Reconstruction era, however, Barnett says we began “losing our Republican Constitution” when the Supreme Court effectively gutted the three Amendments that freed the slaves and recognized their status as citizens, thereby acceding to the will of racist Democrats in the South.  Simultaneously the Court (as personified by Oliver Wendell Holmes) gradually endorsed legislation passed by Progressives (both Democrat and Republican) who wanted to change the nation by implementing a variety of political, economic and social reforms—often through administrative agencies and courts, staffed with the “experts” so beloved by Progressives.  They insisted the Constitution is a “living” compact—a “living and organic thing” said Woodrow Wilson—constantly subject to change in whatever direction a majority of the people desire.  With the triumph of FDR and the New Deal, this “living” Constitution—a will-of-the-people Democratic agreement—became the “law of the land.”  

Though this “Democratic” understanding of the Constitution still prevails in this nation’s corridors of power, Barnett thinks it possible to restore the original, “Republican,” understanding to its rightful place.  The federalism and limited government intended by the Founders in 1787 still matter if we are concerned with our God-given rights and personal liberties.  And since 1986, when William Rehnquist became Chief Justice of the Supreme Court, hopeful signs of a renewed federalism (apart from economic policies) have been on the horizon, though President Barack Obama has done everything possible to frustrate this possibility.  Thus Barnett thinks we need to add ten new amendments (initiated by the states) to the Constitution, so as to preserve its Republican nature.  

Though some of Barnett’s presentation will appeal only to readers with suitable backgrounds in legal history and political philosophy, he has set forth a meaningful way to understand the basic issues in this nation’s 200-year history.  Restoring a Republican Constitution would require heroic work in many ways, but it is certainly a goal worth pursuing for citizens concerned for the real welfare of this Republic.

* * * * * * * * * * * * * * * * * 

In Living Constitution, Dying Faith:  Progressivism and the New Science of Jurisprudence (Wilmington, DE:  ISI Books, c. 2009) Bradley C. S. Watson aims “to elucidate the connection that American progressivism as philosophical movement and political ideology has with American legal theory and practice” (p. xvi).  Progressivism combined Social Darwinism and Pragmatism (twin ingredients evident in William James and John Dewey, Oliver Wendell Holmes and Louis Brandeis, Theodore Roosevelt and Woodrow Wilson and Barack Obama), leaving us subject to “historicist jurisprudence”—taking what is at the moment as good and true simply because it is the current cusp of historical processes, “on the right side of history.”  We now have a judicial system that “is not only hostile to the liberal constitutionalism of the American Founders, but to any moral-political philosophy that allows for the possibility of a truth that is not time-bound” (p. xvi).  These Progressives consciously rejected the Natural Law tradition, running from Plato to the American architects of the Constitution, which holds that good law must be anchored in abiding truths authored by God.  

The “living” or “organic” Constitution promoted by Progressives was on display when the Supreme Court, in Planned Parenthood v. Casey (1992), justified abortion as a constitutional right inasmuch as every person has “the right to define one’s own concept of existence, of meaning, of the universe, and of the mystery of human life” (p. 3).  Within a decade the Court (in Lawrence v. Texas) further affirmed an “emerging recognition” of homosexual behavior that would lead, within another decade, to the legalization of same-sex marriage.  Only an ever-evolving “constitution,” utterly unhinged from the written document of 1787, could rationalize such judicial edicts!  But this was clearly the Progressive vision set forth by Herbert Croly a century ago when he urged jurists to discard “Lady Justice,” blindfolded and holding a scale in her hands.  To replace her he suggested a studious woman wearing spectacles, committed to “social justice,” with suitable tools at hand with which to accomplish her goals.  Judges were to decide how to make the world better, not to give “what is due” to all persons.

To show how Progressivism has changed the nation, Watson revisits the “Constitution of the Fathers” which set forth the American Creed, beginning with the Declaration of Independence’s great affirmation that we “hold these truths to be self-evident, that all men are created equal.”  As Abraham Lincoln—one of the greatest of the Constitution’s interpreters—believed, “there are such things as natural rights that do not change with time, that the American Constitution is dedicated to preserving them, and that the role of great political actors, while responding to urgent necessities, is to look backward rather than forward” (p. 38).  When Lincoln famously declared (in 1863) that this nation was “conceived in liberty, and dedicated to the proposition that all men are created equal,” he clearly appealed to “the laws of nature and nature’s God,” undergirding America’s constitutional republic.  

Yet even as Lincoln was invoking God’s laws, Charles Darwin was unleashing an intellectual revolution, reducing everything to evolution through natural selection.  Consequently, Social Darwinists, enamored with evolutionary “progress,” declared themselves freed from all allegedly eternal principles and embraced the historical developments that improve both the human animal and society as well.  Change is constant—and under the guidance of natural selection (which is helped along by scientifically-trained experts in the social world) it is always for the better!  In America, enthusiastic Darwinists, most notably John Dewey, provided a philosophy (Pragmatism) for committed Progressives from FDR to Barack Obama, who sought to improve things through “progressive education, the welfare state, and the redistribution of capital” (p. 83).  “Long before ‘the courage to change’ became an effective presidential campaign slogan, Dewey helped ensure that ‘change’ would have a central position in American political rhetoric” (p. 84).   

After retelling the story of Progressivism’s political triumphs, running from Woodrow Wilson’s “New Freedom” through FDR’s “New Deal” to LBJ’s “Great Society,” Watson explains how it shaped “the new science of jurisprudence” whereby the “moral realism” of Madison and Lincoln was replaced by skepticism and sociological jurisprudence.  Thus Progressive jurists, Richard Epstein says, “‘attacked the twin doctrines that most limited government power—federalism, on the one hand, and the protection of individual liberty and private property, on the other. . . .  However grandly their rhetoric spoke about the need for sensible government intervention in response to changed conditions, the bottom line, sadly, was always the same:  replace competitive processes, by hook or by crook, with state-run cartels’” (p. 117).  

To influential jurists such as Oliver Wendell Holmes, the Constitution means whatever the Supreme Court decrees.  He and his disciples openly disdained any objective moral standards—right and wrong simply changed over the course of time as the stronger rightly dominated the weaker!  Thus “Holmes is a candidate for many labels—pragmatist, utilitarian, Nietzschean, social Darwinist, nihilist” (p. 132).  Rather like Thrasymachus in Plato’s Republic, Holmes considered “justice” to be whatever the dominant person or system determined.  Whatever the established government wants, it rightly gets.  In a democracy, whatever the majority of the people want, they should get.  In time, their wants will change, so laws (or constitutions) must change to implement their desires.  By rejecting the Natural Law, Holmes and his followers clearly repudiated Lincoln and Madison, but they also rejected “the very notion that human beings are creatures of a certain type, with transcendent purposes and ends that do not change with time.  The new jurisprudence was suspicious of the very idea of justice itself” (p. 145).  

Obviously dismayed by the impact of this “new science of jurisprudence,” Watson concludes his work by noting “the future is now.”  For the good of our nation, for the good of coming generations, it’s imperative to return to the wisdom of the Founders as endorsed by Abraham Lincoln.  To do so requires us first of all to recover our language.  Progressives, as Orwell’s 1984 makes clear, manipulate language, massaging it to attain their ends.  Thus advocates of same-sex marriage effectively change the meaning of marriage, a noun which by definition requires an opposite-sex union, something affirmed through centuries of “common law and American constitutional law” (p. 186).  Advocates of same-sex marriage dramatically illustrate the power of philosophical Nominalism—saying so makes it so!  More radically, Watson predicted, “courts will routinely declare men to be women and vice versa, according to the political pressures of the age” (p. 191).   

* * * * * * * * * * * * * * * * * *

  Living in a “constitutional republic,” we Americans should (one would think) seriously seek to understand the document that sets forth its principles and precepts.  To do so, it’s helpful to consult The Constitution:  An Introduction (New York:  Basic Books, 2015) by Michael Stokes Paulsen and Luke Paulsen.  The book is a father (Michael) and son (Luke) collaboration, written during nine summer vacations while Luke was in high school and college and while Michael was teaching law at the University of Minnesota.  Their partnership initially involved Michael writing a chapter and allowing Luke to edit it with an eye on readability for students and non-lawyers, hoping “to provide a reasonably short, reader-friendly, intelligent introduction to the United States Constitution in all respects—its formation, its content, and the history of its interpretation” (#87).  

Successfully separating from Great Britain, this nation’s founders inscribed their convictions in two pivotal documents:  The Declaration of Independence and The Constitution of the United States, both declaring “the ultimate right of the people to freely chosen self-government, protective of their natural rights” (p. 4).  When the Articles of Confederation failed to function effectively, a distinguished company of men—the “Framers”—gathered in Philadelphia in 1787 to compose “something entirely new:  a written constitution for a confederate republic, covering a vast territory and embracing thirteen separate states. . . .  There was literally nothing in the world like what the framers were trying to achieve” (p. 23).  That it was to be written was hugely important, establishing a government of laws, not men, clearly setting limits to what it could do and not do.  Thus “the meaning of the Constitution is fixed by the original meaning of its words.  The people can change their written Constitution by amendment, but they should not be able to evade or supplant the ‘supreme Law of the Land’ simply by inventing their own meanings for words or altering meanings to suit their purpose” (p. 27).  

As a result of considerable debate and compromise, the Constitution prescribed a federalism balancing powers within the national government (two legislative bodies, an independent executive, an unelected judiciary) and reserving important rights to the states.  When working rightly, this checks-and-balances system guards personal freedom within the legitimate controls of good government.  Though each branch of government has extensive powers, they are limited to those “enumerated” or “granted” and further curtailed by the first ten amendments.  Thus James Madison “worried aloud, when introducing his proposed Bill of Rights in the House of Representatives, that liberties like religious freedom not be set forth in language too narrow, as if to suggest that they were granted by the Constitution rather than recognized in the Constitution” (p. 99).  The Paulsens effectively describe the work of the Founders, providing helpful biographical vignettes of the leading Framers and celebrating their genius.  But one of their compromises—the three-fifths provision regarding slavery—sullied their work and scarred the new nation’s face for 70 years until a bloody war and three constitutional amendments abolished it.  

Having detailed the important components of the written Constitution, the authors address arguments set forth by proponents of a “living Constitution.”  Obviously the Founders crafted a permanent document which would not change over time, except as properly amended.  But various actions (by all three branches of the government), beginning in the first administration of George Washington and advanced by John Marshall’s Supreme Court, slowly expanded the national government’s powers.  With the Union’s victory in the Civil War and Reconstruction the powers of the national government grew quickly, as was evident in Lincoln’s Emancipation Proclamation, and it is clear “that the Civil War was fought over the meaning of the Constitution—and over who would have the ultimate power to decide that meaning” (p. 155).  Then the 14th Amendment abruptly “transferred vast areas of what formerly had been exclusive state responsibility to federal government control” (p. 181).  

With the demise of Reconstruction, however, the authors lament the epoch of “betrayal”—the years from 1876 to 1936 when the Supreme Court “abandoned the Constitution,” denying equal rights to women, upholding racial segregation, nullifying social welfare legislation, etc.  Here it seems to me they think that whenever the Court failed to endorse progressive legislation and ideas it “betrayed” the Constitution.  Other scholars, more libertarian or conservative in their orientation, definitely see these years quite differently!  Fortunately, say the Paulsens, FDR rode to the rescue, and the New Deal Court rightly restored the Constitution by correcting earlier abuses.  FDR’s appointees upheld the constitutionality of his commitment to extend “national government power over the economy” (p. 220), though they curtailed the executive branch’s authority by annulling one of President Truman’s orders in the pivotal Youngstown case.  Especially important was the Warren Court’s Brown v. Board of Education, ending racially segregated schools and launching “the process of dismantling America’s history of legal racial apartheid” (p. 220).  

From 1960 to the present, the national government has expanded dramatically, leaving little of the Constitution’s original “federal” structure standing.  As judicial activists in the courts have sustained this process, we increasingly have an unwritten constitution, meaning whatever the current Supreme Court desires it to be, the Court even claiming for itself the “supreme authority to interpret the Constitution—provocatively elevating its own decisions to the same level as the Constitution itself.  However questionable that claim, nobody successfully challenged it” (p. 262).  Such arrogance was fully on display in the Roe v. Wade decision that imposed abortion-on-demand throughout the land.  “Not even Dred Scott, warped as it was in its distortion of constitutional text, so completely seemed to disregard the text as Roe did” (p. 270).  Other critical decisions—ranging from affirmative action to same-sex marriage—further illustrate the withering of the “written Constitution” which once preserved this nation as one under laws rather than men.

# # #  

281 Something, Not Nothing

    If I stumble over something in the dark, I know something’s there.   It’s not something I’m dreaming about, something solely in my mind.  What it is I know not, though when carefully inspected it’s obviously a stool.  That it’s there I’m certain—such sensory information can be painfully indubitable.  It’s something!  What it is I may later determine, finding it’s clearly a four-legged steel stool, useful for reaching things on high shelves but injurious to the bare foot!  Why it’s there, however, involves an altogether different kind of reasoning, as Aristotle famously demonstrated in his Metaphysics.   When asking why the stool was there—or why it was made of steel rather than wood—I unconsciously assume the truth of an ancient philosophical proposition:  Ex nihilo nihil fit—nothing comes from nothing.  The same reasoning process ensues when I venture into the world around me.  That there’s material stuff I encounter is indubitable.  What it is I can ascertain through certain tests.  But why it exists requires a philosophical, not a scientific way of thinking.  

Empirical questions we rightly investigate using scientific means.  But there are deeper questions which cannot be similarly pursued since they address non-empirical realities such as goodness, beauty, and God.  Thus Einstein allegedly said “scientists make lousy philosophers.”  In ancient Greece most pre-Socratic thinkers were empirical, monistic materialists, though some did think a mysterious kind of infinite, non-material Being existed.  “The decision of this question,” Aristotle said, “is not unimportant, but rather all-important, to our search for truth.  It is this problem which has practically always been the source of the differences of those who have written about nature as a whole.  So it has been and so it must be; since the least initial deviation from the truth is multiplied later a thousandfold” (On the Heavens, I, 5; 271 [5-10]).  

Aristotle’s insight is nicely illustrated in Lawrence M. Krauss’s A Universe from Nothing:  Why There is Something Rather Than Nothing (New York:  Atria, c. 2013).  A noted physicist-turned-cosmologist, Krauss tries to show, as the book’s title says, how the universe literally came from nothing.  Realizing the linguistic pit he’s digging, however, he tries to re-define the word “nothing” to mean, it seems to me:  “well, almost nothing,” since there’s a mysterious but necessarily material realm that magically gives birth to the material world.  Krauss also realizes the word “why” brings with it all sorts of philosophical baggage—especially denoting a rational direction and purpose to the cosmos—which he resolutely refuses to consider.  So he declares that scientists such as himself deal only with “how” questions—the only ones worth pondering.  And “the question” he cares about, “the one that science can actually address, is the question of how all the ‘stuff’ in the universe could have come from no ‘stuff,’ and how, if you wish, formlessness led to form” (#130 in Kindle).  Dismissive of both philosophy and theology, he insists that he and his guild alone can provide the answers to life’s important questions.  But he slides, incessantly, from “how” to “why” questions, showing how “scientists make lousy philosophers.”  

On one level, A Universe from Nothing offers the general reader a fine summary of what scientists have discovered during the past century.  It is indeed a fascinating “cosmic mystery story”—filled with black holes and quarks and dark matter—told with zest and skill.  We have before us an amazing amount of data regarding the age and shape of the material world, though the conclusions reached regarding the data have certainly changed with time.  “String” theories have given way to “multi-universe” hypotheses.  The “steady-state” position once championed by distinguished physicists has been replaced by the “big-bang” view now accepted by most “authorities.”  To theists who for centuries have believed God created (ex nihilo) all that is, the big-bang notion fits easily into their cosmology—the universe simply came into being, in an instant, as God spoke it into being.  An eternal, purely spiritual Being could easily bring into being all that is.  But to materialists such as Krauss there must be a purely material Source—and he devotes this treatise to showing how it might in fact conceivably exist.  And as he chooses to use the word, “‘nothing’ is every bit as physical as ‘something,’ especially if it is to be defined as the ‘absence of something’” (#241).  

There is thus an Alice-in-Wonderland quality to Krauss—words simply mean whatever he chooses them to mean.  “‘When I use a word,’ Humpty Dumpty said, in rather a scornful tone, ‘it means just what I choose it to mean—neither more nor less.’ ‘The question is,’ said Alice, ‘whether you can make words mean so many different things.’ ‘The question is,’ said Humpty Dumpty, ‘which is to be master—that’s all.’”  So too Krauss insists words such as “nothing” mean what he wants them to mean, not what they really mean!  (And, to confuse matters even further, important word meanings shift as the book’s argument develops!)  There is thus an enormous amount of data accompanied by only a passing awareness of logic—a vital part of the philosophical thinking he disdains!  That he first asked the late Christopher Hitchens to pen an introduction to this treatise—and then turned to Richard Dawkins, who assented to do so—indicates the “new atheist” agenda undergirding this book!  That Dawkins could seriously refer to the “selfish genes” and “memes” so memorably lampooned by the Australian philosopher David Stove shows how badly such “scientific” superstars lack basic reasoning skills!  And a similar deficiency blemishes Krauss’s presentation.  

                                             * * * * * * * * * * * * * * * * *

In Why Does the World Exist?  An Existential Detective Story (New York:  Liveright Publishing Corporation, c. 2012), Jim Holt employs his journalistic expertise to explore what Martin Heidegger labeled the greatest of all philosophical questions:  Why is there something rather than nothing at all?  That is the “super-ultimate why” question!  For many years Holt has pondered this and voraciously read first-rate tomes regarding it—as is evident in his “philosophical tour d’horizon” and “brief history of nothing.”  For this book, however, he primarily conducted interviews around the world with the foremost thinkers who are trying to fathom the mystery.  Unlike Lawrence Krauss, Holt understands that the ultimate origin question requires a “meta-scientific” approach, for as the great Harvard astronomer Owen Gingerich said, this is essentially a teleological, not a strictly scientific, question.  

Holt interviewed thinkers as diverse as Adolf Grunbaum, a distinguished philosopher of science and a dogmatic atheist who simply dismissed the question as meaningless, and Richard Swinburne, a devout Eastern Orthodox theist who has devoted his life to demonstrating the validity of the traditional belief in “God the Father, maker of heaven and earth, and of all things visible and invisible.”  He talked with David Deutsch, who thinks quantum physics justifies a “many worlds” or “multiverse” hypothesis—if there are an infinite number of universes, then it is quite probable that our universe would have just popped into existence.  Then he sought out Steven Weinberg, who wrote The First Three Minutes, is widely regarded as one of the greatest 20th-century cosmologists, and famously said:  “The more the universe seems comprehensible, the more it also seems pointless.”  Yet in his Dreams of a Final Theory, published in 1993, Weinberg admitted there was simply too much physicists don’t know for any of them to pontificate on ultimate issues, illustrating an “epistemic modesty” that “was refreshing after all the wild speculation I’d been hearing over the past year” (p. 155).    

Since Plato postulated the eternal existence of intellectual forms, many mathematicians have been Platonists of some sort, believing, as Alain Connes says, “‘there exists, independently of the human mind, a raw and immutable mathematical reality’” (p. 172).  Connes is a distinguished French mathematician who shares Kurt Gödel’s confidence in the reality of this non-material numeric realm.  “How else can we account for what the physicist Eugene Wigner famously called the ‘unreasonable effectiveness of mathematics in the natural sciences’?” (p. 172).  Another world-class mathematician, Oxford’s Roger Penrose, is an “unabashed Platonist” who takes “mathematical entities to be as real and mind-independent as Mount Everest” (p. 174).  When interviewed, Penrose said there are really three worlds, “‘all separate from one another.  There’s the Platonic world, there’s the physical world, and there’s also the mental world, the world of our conscious perceptions’” (p. 177).      

John Leslie, considered by many “the world’s foremost authority on why there is Something rather than Nothing,” confesses that when he was young he thought he’d found the answer to the question.  But then he learned, “‘to my horror and disgust,’” that “‘Plato had got the same answer twenty-five hundred years ago!’” (p. 197).  Subsequently he developed “extreme axiarchism,” positing that “reality is ruled by abstract value—axia being the Greek word for ‘value’ and archein for ‘to rule’” (p. 198).  “‘For those who believe in God,’ he thinks, ‘it has even provided an explanation for God’s own existence:  he exists because of the ethical need for a perfect being.  The idea that goodness can be responsible for existence has had quite a long history—which, as I’ve said, was a great disappointment for me to discover, because I’d have liked it to have been all my own’” (p. 199).  

Holt ends the book rather as he began it—interested in all sorts of interesting theories but persuaded by none!  Though the question he’s asking is fundamentally serious, there’s a certain intellectual detachment, almost a levity, to the book.  But it does provide an interesting survey of the cosmological scene, leaving the reader to sort out what’s important or irrelevant to him.

* * * * * * * * * * * * * *

When the erudite Boston College philosopher Peter Kreeft says “This is, quite simply, the single best book I have ever read on what most of us would regard as the single most important question of philosophy:  Does God exist?  It will inevitably become a classic,” one is wise to read carefully Michael Augros’ Who Designed the Designer?  A Rediscovered Path to God’s Existence (San Francisco:  Ignatius Press, c. 2015).  Unlike the many works of apologetics that rely on cosmology, with its heavy load of scientific theory and evidence, this treatise simply asks us to reason carefully.  Rather than think inductively, collecting facts, we must think deductively, following reason.  Simple, self-evident assumptions—absolute, universal propositions such as the Pythagorean theorem—carefully developed into arguments, lead necessarily to certain indubitable conclusions.  “As the argument advances,” he promises, “I will never ask you to believe in someone else’s findings or observations.  Instead, all the reasoning will begin from things you yourself can immediately verify” (p. 12).  That “equals added to equals make equals” or “every number is either even or odd” cannot be denied simply because they are self-evident.  

So Augros begins with the simple truth that children incessantly ask why?  “This endearing (if sometimes trying) property of children is human intellectual life in embryo.   In its most mature forms of science and philosophy, the life of the human mind still consists mainly in asking why and in persisting in that question as long as there remains a further why to be found.  Ultimately we wonder:  Is there a first cause of all things?  Or must we ask why and why again, forever, reaching back and back toward no beginning at all?  Does every cause rely on a prior cause?  Or is there something that stands in need of no cause, but just is?” (p. 9).  In response, Augros unambiguously intends “to show, by purely rational means, that there is indeed a first cause of all things and that this cause must be a mind” (p. 10).  In many ways he simply seeks to fully demonstrate the elegant simplicity and persuasiveness of the ancient Kalam argument so successfully defended in our day by William Lane Craig:  

Premise 1:  Everything that begins to exist has a cause.

Premise 2:  The universe began to exist.

Conclusion:  Therefore, the universe must have a cause.

Then let’s begin!  Whenever we reason we seek to find the causes of things.  To Aristotle:  “Evidently there is a first principle, and the causes of things are neither an infinite series nor infinitely various in kind” (Metaphysics).  On this point, “Twenty-five centuries’ worth of great philosophers and scientists nearly all are agreed” (p. 30).  But this cause is not necessarily temporal!  The universe might well be eternal and still stand in need of a First Cause!  An acting cause, such as a potter making a vase, is simultaneous with, not prior to, the product he’s producing.  “Recognizing causal priority as distinct from temporal priority opens the door to a first cause of an eternal effect” (p. 32).  Thus “the great thinkers who all insist there is a first cause used the expression first cause not to mean (necessarily) a cause before all other causes in time, but a cause before all others in causal power.  It meant a cause of other causes that does not itself depend on any other cause.  It meant, in other words, something that exists and is all by itself, without deriving its existence or causal action from anything else.  And it meant not a thing stuck in the past, but a thing existing in the present” (pp. 32-33).  Ultimately, “it is impossible for things caused by something else to be self-explanatory.  There must also be something by which things are caused and which is not itself caused by anything” (p. 37). 

Granting the certain existence of a first cause, however, is only the first step in demonstrating the existence of God, Who Is the First Cause and whose Mind sketched the blueprint for the universe—a word whose Latin root means “turned into one.”  Unlike the Greek polytheists, who assigned events to various gods, monotheists following Moses think there is only One true Cause of all that is.  Carefully considered, the material world—matter-in-motion—could not have caused itself and is quite evidently “the first thing from the first cause” (p. 66).  “Matter is not the first cause.  It is impossible for it to be so.  Matter is subject to motion.  The first cause, on the other hand, is not” (p. 60).  Only a non-material Being could be a self-mover, moving everything else.  The ancient Chinese thinker, Lao-Tzu, noted that “‘to turn a wheel, although thirty spokes must revolve, the axle must remain motionless; so both the moving and the non-moving are needed to produce revolution.’  This reasoning sounds the death knell for the theory that matter is the first cause.  Matter, energy, and fundamental particles are all subject to motion.  The first cause [the axle] is not” (p. 62).   

Thus the first cause must be non-material, incorporeal, spiritual.  Given our immersion in material things it is, admittedly, difficult to conceive of purely non-material realities!  But just as a mathematical point (which has no parts) is not a visible dot on the paper but a necessary, indivisible reality-without-parts, so too there are metaphysical realities that utterly transcend the physical world.  And the first cause, though not material, is “the most intensely existing thing” of all!  There is a hierarchy to the universe, leading from fundamentally material to essentially non-material beings.  Plants are superior to rocks, animals are better than plants, and human beings are higher than fish and pheasants.  “Mineral, vegetable, animal, human.  These kinds of beings form a ladder of sorts.  Ascending from one rung to another, we find something more capable of including beings within its own being” (p. 91).  Higher beings possess more fullness of being.  On the highest rung, possessing the most being, is the Supreme Being, giving being to all lesser beings.  And since it is axiomatic that “nothing gives what it does not have,” we conclude that everything that exists owes its existence to the One who most fully exists, who simply IS.  

Since we are thinking beings making sense of all sorts of things, it follows that the Supreme Being is the ultimate Thinker.  Even atheistic scientists cannot but acknowledge the seeming intellectual dimension to the cosmos.  Thus Richard Dawkins cautions his fans to beware of taking seriously the “apparent” design of things.  And Stephen Hawking confesses that the “apparent laws of physics” seem to be amazingly well-designed to make for a life-welcoming universe.  But atheists cannot open the door to such non-material realities as “purpose” without bringing into question their materialist dogma.  So the evolutionary biologist Richard Lewontin confessed:  “It is not that the methods and institutions of science somehow compel us to accept a material explanation of the phenomenal world, but, on the contrary, that we are forced by our a priori adherence to material causes to create an apparatus of investigation and a set of concepts that produce material explanations, no matter how counter-intuitive, no matter how mystifying to the uninitiated.  Moreover, that materialism is absolute, for we cannot allow a Divine Foot in the door” (pp. 147-148).  But, Augros counters, even our limited minds can “understand all things, at least in a general way” and then conceptualize a universe.  Using our limited minds we legitimately envision an Omniscient Mind knowing all things—a First Cause responsible for their existence.  Indeed:  “The intelligence of the first cause of all things explains the look of design everywhere in the universe” (p. 113).  

Rightly discerned, this omnipresent design gives things their distinctive beauty and goodness.  Wonder, both Plato and Aristotle noted, is basic to the philosophic quest—pausing to note the sheer givenness of all that is, reflecting on its mysterious configurations, delving into the why-ness of what’s beheld.  Thus Whittaker Chambers, in Witness, dated his break with Communism to a moment of wonder at the sheer beauty of his infant daughter’s ear.  He was overwhelmed while gazing at “the delicate convolutions of her ear—those intricate, perfect ears.  The thought passed through my mind:  ‘No, these ears were not created by any chance coming together of atoms in nature (the Communist view).  They could have been created only by immense design’” (p. 100).  Then there’s a fascinating passage in Sir Arthur Conan Doyle’s Memoirs of Sherlock Holmes, where Watson recalls Holmes reflecting on “What a lovely thing a rose is!”  Gazing at the color and configuration of a moss-rose, the great detective declared:  “There is nothing in which deduction is so necessary as in religion.  It can be built up as an exact science by the reasoner.  Our highest assurance of the goodness of Providence seems to me to rest in the flowers.  All other things, our powers, our desires, our food, are really necessary for our existence in the first instance.  But this rose is an extra.  Its smell and its color are an embellishment of life, not a condition of it.  It is only goodness which gives extras, and so I say again that we have much to hope from the flowers.”  As we wonder (with Chambers and Holmes) at the beauty and goodness of beings, we cannot but think there must be a first cause, a Supreme Being, responsible for all this.  

In the book’s “Epilogue,” Augros notes he stands on “the shoulders of giants” such as Aristotle and Aquinas.  Though primarily relying on ancient and medieval thinkers and differing in his approach from Rene Descartes, he shares some of the “first modern” philosopher’s confidence that “the existence of God would pass with me as at least as certain as I have ever held the truths of mathematics.”  Thinkers such as Descartes have ever worked by “deducing the logical consequences of timelessly valid principles.  It is not by chance that those principles have arisen in the thoughts of great minds again and again down through the centuries.  They are the common heritage of the human mind.  ‘Nothing comes from nothing.’  ‘What is put into action depends on what acts by itself.’  ‘Nothing gives what it does not have.’  ‘Some things are nobler than others.’  And on and on.  Such are the laws of being, expressed in terms too universal for science to employ, let alone refute.  We are free to ignore them, since the explicit recognition of their truth is in no way necessary for our daily existence . . . .  They just quietly await our notice.  The conclusion that God exists, when deduced from principles like these, is true and hard-won knowledge, worthy of the name” (p. 208).  

That such laws of being point persuasively to the existence of God is the conclusion of this highly readable treatise.  Thus, with Thomas Hibbs, Honors College Dean at Baylor University, I say:  “I know of no other book about the existence and nature of God that is as readable and enjoyable as this one.” 

280 Losing Our Mind?

   In 1960 America’s schools were widely considered among the world’s best.  Then came the ‘60s revolution, which significantly changed the culture and eroded educational standards.  Within two decades concern about the schools’ quality had grown, and in 1983 a presidential commission issued A Nation at Risk to alert the public to manifest deficiencies in our schools.  In 1987 Allan Bloom voiced his lament over the quality of university education in The Closing of the American Mind (which became a surprise best-seller).  Mounting concern in political circles led to federal initiatives such as President George W. Bush’s “No Child Left Behind,” seeking to arrest the decline by insisting that certain standards be met to ensure children were receiving a decent education.  But despite all the discussions—and the massive expenditure of funds—America’s K-12 students now score near the bottom on standardized tests administered in industrialized nations.    

To assess this issue, Mark Bauerlein and Adam Bellow edited a volume titled The State of the American Mind:  16 Leading Critics on the New Anti-Intellectualism (West Conshohocken, PA:  Templeton Press, c. 2015).  In their Foreword—“America:  Have We Lost Our Mind?”—the editors state that Americans had historically been characterized by “independent thought and action, thrift and industriousness, delayed gratification and equal opportunity” (#92 in Kindle).  Such traits had largely disappeared by the mid-80s as traditional content-focused courses, designed to transmit the core knowledge and wisdom of the past, were replaced by student-centered activities aiming to enhance self-esteem and critical thinking.  Consequently:  “Instead of acquiring a richer and fuller knowledge of U.S. History and civics, American students and grown-ups display astounding ignorance of them, and their blindness is matched by their indifference to the problem” (#157).  The “rugged individualism” of the past has dissolved into rampant self-absorption.  “Not only has self-reliance become a spurious boast (‘You didn’t build that’), but dependency itself has become a tactical claim” (#157).  Rather than celebrating their freedom to think and debate, large numbers of “Americans accept restrictions on speech, freedom of association, rights to privacy, and religious conscience” (#165).  The closing of the American mind seems to continue, especially in the nation’s educational institutions.

E. D. Hirsch Jr.’s “The Knowledge Requirement:  What Every American Needs to Know” updates his 1987 manifesto, Cultural Literacy, urging the schools to recover their commitment to transmitting knowledge of history, civics, mathematics, science, and literature.  Now armed with the fact that SAT scores have declined for 50 years, Hirsch restates his case and blames the decline on the fact that “general knowledge” is not emphasized in the schools.  As teachers emphasize “skills” rather than “mere facts,” many students learn very little and demonstrate it by performing poorly on international exams.  Ironically, Mark Bauerlein, in “The Troubling Trend of Cultural IQ,” notes that IQ scores have significantly increased during the past century—we’re actually getting smarter!  But higher IQs have not resulted in more knowledge.  Thus an alarming number of high school graduates (two-thirds of the students entering the Cal State University system) need remedial courses in math and writing, demonstrate a dwindling vocabulary, and have limited general knowledge. 

In “Biblical Literacy Matters,” Daniel Dreisbach draws a dismal portrait that should concern everyone, for there has been “an alarming decline in biblical literacy” that includes an “ignorance of key biblical texts, stories, characters, doctrines, themes, rituals, and symbols” (#749).  Compared with George Washington, who often brought biblical phrases into his writings, today’s politicians frequently prove inept when trying to appear biblically astute—e.g. Howard Dean citing Job as his favorite New Testament book!  “In his 1800 assessment of education in America, Pierre Samuel Du Pont de Nemours observed, ‘Most young Americans . . . can read, write and cipher.  Not more than four in a thousand are unable to write legibly—even neatly.’  He attributed America’s high literacy rate to frequent Bible reading, which, he also said, ‘tends to increase and formulate ideas of responsibility’” (#864).  Two hundred years later we can hardly say the same!  And this loss of biblical literacy bodes ill for a nation whose laws and political premises are so suffused with biblical precepts.

In “The Rise of the Self and the Decline of Intellectual and Civic Interest,” Jean Twenge identifies one of the most important problems plaguing modern education.  Teachers for decades have stressed the absolute, if not ultimate, importance of self-esteem.  All students, we’re told, must feel good about themselves—and any problems they have must be attributed to a lack of self-esteem.  Believing in yourself—not learning history or mastering calculus or becoming virtuous—is the pedagogical goal!  So today’s students routinely consider themselves masterful mathematicians or writers though their test scores demonstrate the converse.  Their inflated self-evaluation is bolstered by the rampant grade-inflation everywhere evident.  In 2012, 37 percent of high school seniors had an A average.  Whereas in the 1960s the most common grade in college “was a C, by 2000 the most common grade was an A” (#2269).  It’s revealing that “the ethnic group with the lowest self-esteem is Asian Americans” who “also demonstrate the best academic performance, possibly because their culture emphasizes hard work rather than self-belief” (#2370).  

Radio host Dennis Prager argues, in “We Live in the Age of Feelings,” that the ‘60s bequeathed us an age in which we fail to reason rightly.  Rather than wondering whether something is “true” or “right,” today’s youngsters almost inevitably ask “how do I feel about it?”  They value their own feelings rather than the well-being of others, their own response to music and art rather than classical aesthetic criteria, and their own sexual satisfaction (e.g. cohabitation, abortion, sodomy) rather than the good of society (e.g. marriage and children).  Feelings fuel the ubiquitous concern of the young for “social justice”—meaning favoritism for the poor and disadvantaged, supporting the “poor man, even if he is in the wrong” (#3407).  Such convictions lead to such educational “reforms” as re-writing history books to exaggerate the role of women, homosexuals, and ethnic minorities, thereby erasing the truth of the American story.  

In a more foundational essay, R.R. Reno, the editor of First Things, identifies “The New Antinomian Attitude” as the “greatest threat” we face, for it has led to “an Empire of Desire” that flourishes in our postmodern world and corrupts our culture.  “Ministered to by a therapeutic vocabulary of empowerment, the pedagogy of multiculturalism, and our dominant, paradoxical moral code of nonjudgmentalism, this empire has come to dominate the American Mind” (#3741).  Whatever we want we will have!  Not even stubborn realities such as sexual differences will deter us.  In adopting this antinomian attitude “we’ve empowered the dictatorship of relativism, which is closely allied with the harrying mentality of political correctness” (#3881).  And with this we have effectively constricted the reasonableness needed for a healthy culture.  So we are, in fact, losing our mind!

* * * * * * * * * * * * * * * * * * * * * * 

Though I generally encourage reading the books I review, sometimes I think people should merely know about a book without laboring to digest it first-hand.  So though the information in Terry Moe’s Special Interest:  Teachers Unions and America’s Public Schools (Washington D.C.:  Brookings Institution Press, c. 2011) is truly important, the treatise is clearly designed for scholars rather than the general public.  His thesis is clear and disturbing:  though individual teachers may very well be deeply committed to their students, the teachers unions have another, overriding objective—protecting and advancing the welfare of their members.  Thus we have such things as New York City’s “Rubber Rooms,” where 700 teachers daily do nothing (since they are incompetent) while continuing to draw their salaries and full benefits.  They’re teachers who don’t teach!  But they’re protected by their union, which makes them impossible to fire.   

While not bad enough to be sent to the Rubber Rooms, another 5-10 percent of our teachers are mediocre at best and clearly harming their students.  If we could merely replace the bottom 10 percent, “we could dramatically improve student achievement.  The U.S. could move from below average in international comparisons to near the top” (#157).  Educational reformers know this, but every effort to change the system has failed for one simple reason:  unions oppose efforts to discipline ineffective teachers, to allow school choice, or to institute merit pay.  Before 1960 unions had little power and exerted little influence.  The National Education Association (now the largest union of any kind in the U.S.) was a professional organization largely controlled by school administrators.  In the ‘60s, however, the unions began to win legislative and judicial victories that enabled them, by 1980, to establish “what was essentially a new system of public education” (#218).  “The rise of the teachers unions, then, is a story of triumph for employee interests and employee power.  But it is not a story of triumph for American education” (#1419).  

The unions mastered the art of financing politicians (almost exclusively Democrat) who would in turn generously appropriate money to the schools and require union membership.  Unions “were the nation’s top contributors to federal elections from 1989 through 2009” (#251).  They also effectively marshal their members as “volunteers” to work in important campaigns (especially school board elections and bond proposals).  And they have effectively aligned themselves with other powerful public sector unions to mutually enrich themselves at the public purse.  When confronted with the dismal record of student achievement (near the bottom compared with other developed countries), the unions loudly insist the problem is simply financial—given enough money, all would be well in our schools!  Yet the U.S. spends “more than twice as much on education—per student, adjusted for inflation—as it spent in 1970 (and more than three times as much as in 1960)” (#296).  Unions insist that small classes ensure better learning and demand we hire more teachers to man small classrooms.  Yet whereas in 1955 there were 27 students per teacher, there are now 14, and the students are demonstrably less well-educated.  Smaller classes mainly mean more teachers—and more union dues—but less effective instruction.

Yet amidst the generally dismal story of America’s schools there are a few “small victories for sanity.”  New Orleans has witnessed some “path-breaking” improvements launched in the wake of the devastation wrought by Hurricane Katrina in 2005!  The city’s “teachers were dismissed and the local union and its formidable power” was crushed (#4414).  Then state and local officials were free to establish “a full-blown choice system filled with autonomous charter schools” now enrolling 60 percent of the city’s children (#4420).  Short of a hurricane, however, constructive change rarely comes in the nation’s largest cities!  Consider Washington, D.C., dead last in test scores and “long known for having one of the worst, most incompetently run school systems in the country” despite its lavish funding (#4753).  When Adrian Fenty was elected mayor in 2007 he resolved to reform the system and brought Michelle Rhee on board as Superintendent of Schools to do so.  She sought to “build a new personnel system around performance:  rewarding good teachers for their success, creating strong incentives to promote student achievement, and—just as important—attracting a new breed of teachers” who would improve things (#4811).  But Rhee soon exited because the unions opposed her every move and helped orchestrate the defeat of Mayor Fenty at the next election.  

In Moe’s opinion, unless the teachers unions are radically curtailed, there is no hope for the children in public schools.  The symbiotic bond between the unions and the Democrat Party must be dissolved.  School vouchers, school choice, charter schools, and new technological options offer positive alternatives to the established order that may in time improve things.  But ultimately, for any meaningful reforms to take place, the teachers unions must somehow be sidelined.    

                                         * * * * * * * * * * * * * * * * * * * * * 

In John Dewey and the Decline of American Education:  How the Patron Saint of Schools Has Corrupted Teaching and Learning (Wilmington, DE:  ISI Books, c. 2006) Henry T. Edmondson III demonstrates the old adage that “ideas have consequences.”  After briefly noting the general consensus regarding the decline of the nation’s schools during the 20th century, he declares that John Dewey’s educational philosophy explains much of it.  Indeed, his dolorous “impact on American education is incalculable” (p. xiv).  Committed to the pragmatic proposition that truth is “what works” when solving problems, and holding “that belief in objective truth and authoritative notions of good and evil are harmful to students,” Dewey disdained metaphysics, ethics, history and theology.  (His anti-religious statements rival those of militant atheists such as Nietzsche and Marx, whose moral nihilism and socialist aspirations he sought to promote in the schools).  Deeply influenced by Rousseau’s Emile, he considered books—and especially the classics that constituted the core of traditional pedagogy—impediments to the experiential “learning-by-doing” he favored.  Students should work out their own moral standards through “values clarification” discussions rather than study Aristotle’s Ethics or McGuffey’s biblically-laced Readers.  

Surveying the academic scene in 1964, historian Richard Hofstadter said:  “‘The effect of Dewey’s philosophy on the design of curricular systems was devastating’” (p. 37).  Rather than studying the traditional “liberal arts,” the schools now seek to “liberate” students from the shackles of the past, encouraging “creativity” and engendering “self-esteem.”  Socialization—most notably progressive social change—increasingly replaced learning and scholarly proficiency as the central mission of the schools.  “Thanks in no small part to Dewey,” Edmondson says, “much of what characterizes contemporary education is a revolt against various expressions of authority:  a revolt against a canon of learning, a revolt against tradition, a revolt against religious values, a revolt against moral standards, a revolt against logic—even a revolt against grammar and spelling” (p. 56).  

To rightly respond to the educational problems we face, Edmondson invokes Flannery O’Connor, who simply advised parents:  anything that John Dewey says “do, don’t do.”  To make our schools good for our children, the ghost of Dewey must be exorcised!  Banishing such things as “whole language learning” (which leaves students unable to read and spell well), “fuzzy” math (which replaces memorizing with analysis) and “values clarification” would be a healthy place to begin!  Making the study of history central to the curriculum is essential—as is the discipline of memorizing facts about the past.  Learning logic—unlike indulging in “critical thinking”—would equip youngsters to actually think rather than emote.  In short:  we must rescue our children from the pernicious pragmatism of John Dewey.

                                       * * * * * * * * * * * * * * * * * * * * 

Years ago I read and favorably reviewed David Gelernter’s Drawing Life:  Surviving the Unabomber—a moving account written by one of the nation’s most prestigious computer experts—a professor at Yale, an Orthodox Jew who was seriously injured by one of the notorious Unabomber’s mailed explosives.  Subsequently I’ve found various of Gelernter’s books (several of them historical monographs) worth perusing.  This is quite true of his America-Lite:  How Imperial Academia Dismantled Our Culture (and Ushered In the Obamacrats) (New York:  Encounter Books, c. 2012).  “Everyone knows,” Gelernter begins, “that American civilization changed in the 1960s and ‘70s” (#20).  “In 1957, Americans were pleased with America and proud of it,” but within 20 years “that proud confidence was gone, crumbled like mud-bricks into flyblown clouds of dust” (#105).  Revolutionary cultural changes—evident in an eroding of public civility and “the etiquette that used to govern relations between men and women”—have fundamentally changed America.  In significant ways, America has been Europeanized.  Responsible for these changes are those Gelernter dubs PORGIs—“post-religious globalist intellectuals”—who disdain both patriotism (love of country) and patriarchy.  

The PORGIs have effectively orchestrated a cultural “coup” by occupying the universities and making their prestigious degrees the price of admission to social prominence and financial success.  They are energized by their leftist ideology, which is “a new religion” morphing into such things as “earth worship” and the “sacralization of homosexuality” clearly akin to “ancient paganisms” (#402).  Addicted to—and intoxicated by—various theories (e.g. social justice, global warming, affirmative action), the intellectuals portray themselves as champions of truth and righteousness.  But, strangely enough, they have little interest in any concrete facts that might refute their theories!  As Hannah Arendt said, evaluating the social revolutionaries of the ‘60s:  “‘The trouble with the New Left and the old liberals is the old one—complete unwillingness to face facts, abstract talk, often snobbish and nearly always blind to anybody else’s interest’” (#356).  

Rather than dealing with the oft-inconvenient facts before them, today’s intellectuals “invent theories and teach them to Airheads.  Airheads learn them and believe them” (#286).  Airheads “never need to think at all” since they need only repeat the theories and dogmas fed them by their professors.  To Gelernter, “Barack Obama and his generation of airheads, the first ever to come of age after the cultural revolution, are unique in American history.  All former leftist movements were driven by ideology.  Obama’s is driven by ignorance” (#1479).  The president “himself is merely a mouth for garden-variety left-liberal ideas—a mouth of distinction, a mouth in a million, but a mere mouth just the same.  He is important not as statesman but as symptom, a dreadful warning.  He is important not because he is exceptional but because he is typical.  He is the new establishment; he represents the post-cultural revolution PORGI elite” (#1491).  Obama’s historical ignorance distresses Gelernter!  That a president could refer to “the bomb” (rather than the bombs) that “fell on Pearl Harbor,” or brag that his great-uncle helped liberate Auschwitz (a Polish camp liberated by the Russians), demonstrates his prestigious but vacuous “education” in the most elite schools of the country!  “What kind of mishmash inhabits this man’s brain?” (#1532).  

Unfortunately, “There is a pattern here.  This president is not an ideologue; he does not reach that level.  He is a PORGI Airhead:  smart, educated, ignorant.  And there is a deeper, underlying pattern.  Obama has learned theories about the police versus black men.  They are wrong.  He has learned theories about ‘real causes’ of terrorism and about ‘isolated extremists’ and ‘Islamophobia.’  They are wrong.  He applied his theories just the way he was taught.  But the theories, being wrong, gave him wrong answers.  That is the PORGI elite, the new establishment” (#1614).  Obama will soon be replaced.  But the ‘60s revolution has succeeded inasmuch as “a new generation of Obamacrats enters America’s bloodstream every year, in late spring, when fresh college graduates scatter like little birds or puffs of dandelion seed to deliver a new crop of Airhead left-wingers to the nation and the world” (#1497).  Knowing little about history, having read little literature, and lacking any grounding in logic or philosophy, these graduates have been trained by their college experience primarily to be faithful leftists.  

Despite their impoverished education, their growing power has resulted in what Gelernter calls “Imperial Academia,” a confirmation of one of President Dwight D. Eisenhower’s 1961 warnings concerning the pernicious power of an emergent “scientific-technological elite,” which (through government funding and political influence) posed a threat “gravely to be regarded.”  Massive amounts of federal money now flow into academic institutions, which in turn provide the government with highly-trained experts who support big-government solutions to the nation’s problems. 

# # # 

279 Stalin’s Harvest of Sorrow

 Since WWII Jews around the world have routinely resolved to “never forget” Hitler’s brutal effort to destroy the Jewish people.  So too all of us should determine to never forget the far costlier devastation visited upon Russia by Joseph Stalin.  In concentration camps such as Belsen and Auschwitz the Nazis slaughtered some six million people, but a decade earlier, in the Ukraine and adjacent Cossack areas in southern Russia, the Bolsheviks killed nearly twice as many peasants—totaling more than all deaths in WWI.  The late English historian Robert Conquest devoted much of his life to finding, rigorously documenting, and publishing the truth regarding what transpired in the Soviet Union between WWI and WWII.  One of his most powerful treatises is Harvest of Sorrow:  Soviet Collectivization and the Terror-Famine (New York:  Oxford University Press, c. 1986).  The book’s title is taken from “The Armament of Igor,” a poem lamenting that:  “The black earth /  Was sown with bones / And watered with blood / For a harvest of sorrow / On the land of Rus.’”  

For many centuries Russian peasants were serfs—working the land of aristocratic landowners who often exploited them.  Reform movements in the 19th century, much like anti-slavery movements in America, led to their liberation in the 1860s.  While their lot remained harsh by modern standards, it slowly improved, though like sharecroppers following the Civil War in America they were generally landless and impoverished in a nation firmly controlled by the Tsar and aristocracy.  Thus the Bolshevik revolution in 1917 was initially welcomed by peasants who often seized and carved up the large estates they worked on, hoping for the better life promised by the upheaval.  Yet they “‘turned a completely deaf ear to ideas of Socialism’” (p. 44).  As Boris Pasternak made clear, in a passage in Doctor Zhivago:  “‘The peasant knows very well what he wants, better than you or I do . . . .  When the revolution came and woke him up, he decided that this was the fulfillment of his dreams, his ancient dream of living anarchically on his own land by the work of his hands, in complete independence and without owing anything to anyone.  Instead of that, he found he had only exchanged the old oppression of the Czarist state for the new, much harsher yoke of the revolutionary super-state’” (p. 52).  

Realizing that the innate love of farmers for land ownership and free markets militated against his totalizing ideology, Lenin noted that he would ultimately “‘have to engage in the most decisive, ruthless struggle against them’” (p. 45).  He’d found that Communists such as himself knew little about economics—as was evident when he tried to abolish money and banking—and quickly launched the New Economic Policy, effectively restoring important aspects of capitalism.  He also had to find effective ways to encourage agricultural productivity, so he delayed collectivizing agriculture in the 1920s.  By the end of that decade, however, Joseph Stalin had seized sufficient power to undertake the radical restructuring of Russian agriculture.  A 1928 grain crisis prompted Party bureaucrats to mandate production quotas, taxes and distribution mechanisms.  They also needed scapegoats to blame and singled out the best, hardest working and most prosperous farmers (the kulaks, who owned a few acres and a handful of animals and even hired laborers as needed), who seemed to qualify as closet capitalists and “wreckers.”  As Stalin declared:  “‘We have gone over from a policy of limiting the exploiting tendencies of the kulak to a policy of liquidating the kulak as a class’” (p. 115).   

Stalin and the Soviet Politburo established the All Union People’s Commissariat of Agriculture, staffed by alleged “experts,” which was authorized to push the peasants into collectives and set utterly utopian, ludicrous goals for yearly harvests.  Such policies (part of Stalin’s Five Year Plan) led to an “epoch of dekulakization, of collectivization, and of the terror-famine; of war against the Soviet peasantry, and later against the Ukrainian nation.  It may be seen as one of the most significant, as well as one of the most dreadful, periods of modern times” (p. 116).  Farmers who failed to meet their quotas or “hoarded” grain (even seed grain!) were arrested and resettled in remote regions if not shot or sent to camps.  Conquest documented, in mind-numbing, heart-rending detail, this deliberate destruction of those who stood in the way of Stalin’s grand socialistic agenda.  To the Party, in the words of a novelist, “‘Not one of them was guilty of anything; but they belonged to a class that was guilty of everything’” (p. 143).  And in the “class struggle” intrinsic to Marxist analysis, evil classes must be destroyed.  Sifting through all the documents available to him, Conquest estimates that at least fourteen million peasants perished.  “Comparable to the deaths in the major wars of our time,” Stalin’s “harvest of sorrow” may rightly be called genocide.  

Above all, Stalin targeted the peasants of the Ukraine, the Don and Kuban, where a massive famine transpired in the early ‘30s.  Party activists (generally dispatched from the cities and lacking any knowledge of agriculture) presided over the process.  One of them recalled:  “‘With the rest of my generation I firmly believed that the ends justified the means.  Our great goal was the universal triumph of Communism, and for the sake of that goal everything was permissible—to lie, to steal, to destroy hundreds of thousands and even millions of people, all those who were hindering our work or could hinder it, everyone who stood in the way’” (p. 233).  One of the few Western journalists daring to discern and tell the truth, Malcolm Muggeridge, said:  “‘I saw something of the battle that is going on between the government and the peasants.  The battlefield is as desolate as any war and stretches wider; stretches over a large part of Russia.  On the one side, millions of starving peasants, their bodies often swollen from lack of food; on the other, soldier members of the GPU carrying out the instructions of the dictatorship of the proletariat.  They have gone over the country like a swarm of locusts and taken away everything edible; they had shot or exiled thousands of peasants, sometimes whole villages; they had reduced some of the most fertile land in the world to a melancholy desert’” (p. 260). 

Consequently, Soviet agriculture imploded.  In 1954 Nikita Khrushchev admitted that despite the more highly-mechanized farming techniques in the collectives “Soviet agriculture was producing less grain per capita and fewer cattle absolutely than had been achieved by the muzhik with his wooden plough under Tsarism forty years earlier” (p. 187).  And what’s true for agriculture is true for the rest of the USSR under Communist rule—socialism inevitably destroys whatever it controls.  

* * * * * * * * * * * * * * * * * * * * 

One of the last of Robert Conquest’s books dealing with 20th century Russia is Stalin:  Breaker of Nations (New York:  Penguin Books, c. 1991).  Benefitting from “the great flow of new information” recently available from Russian archives, he endeavored to portray one of the few men in history who harmed both his own country and much of the world.  Immersed within a system rooted in “falsehood and delusion”—what Boris Pasternak described as the “reign of the lie”—Stalin was an arch-deceiver who misled and betrayed virtually all his associates and allies.  Indeed, he “invested his whole being in producing illusion or delusion.  It was above all this domination by falsehood which kept even the post-Stalin Soviet Union in a state of backwardness, moral corruption, economic falsification and general deterioration until in the past decade the truth became too pressing to be avoided” (p. 325).  He was, as Churchill labeled him, “an unnatural man”—the personification of the “moral nihilism” basic to both Nazism and Bolshevism.  Along with Hitler, Mao, Ho Chi Minh, Fidel Castro and Pol Pot—brutal ideologues determined to shape the world in accord with their fantasies—he contributed much to one of the bloodiest centuries in history.

Stalin was born in 1879 in Georgia, an ancient nation in the Caucasus annexed to Russia by Tsar Alexander I in 1801.  Christened Iosif Vissarionovich Dzhugashvili, he assumed the name Stalin (“man of steel”) when he joined underground revolutionary activities designed to overthrow the Tsar.  Periodically arrested—and sent into exile in Siberia—he gained renown for his self-discipline, party loyalty, writing skills, and ability to get things done.  He was, however, a rather minor figure until after the 1917 Bolshevik Revolution.  Thereafter he proved useful to Lenin, who valued his loyalty as well as willingness to manage unpleasant tasks.  In 1922, when Lenin suffered his first of several strokes, Stalin began to effectively maneuver himself into powerful positions within the Politburo, jockeying with Trotsky for preeminence.  Lenin apparently distrusted him, however, and disapproved of him as his successor, confiding to his wife, in a document hidden from the public for 33 years:  “‘Stalin is too rude, and this defect, though quite tolerable in our midst and in dealings among us Communists, becomes intolerable in a General Secretary.  This is why I suggest that the comrades think about a way to remove Stalin from that post and appoint another man who in all respects differs from Comrade Stalin in his superiority, that is, more loyal, more courteous, and more considerate of comrades . . . .’” (p. 101).  According to one of his secretaries, Lenin had resolved “‘to crush Stalin politically’” but died in 1924 before doing so.

Unlike Trotsky, who advocated the primacy of world-wide revolutionary struggle, Stalin determined to first establish “Socialism in One Country,” and he rallied (through cajolery, intrigue, slander) enough followers to impose his will on the USSR.  Once in power, he “planned to launch the party on an adventurist class-war policy of crash industrialization and collectivization, adventurist beyond even the most extreme of the plans hitherto rejected as beyond the pale for their Leftism” (p. 141).  “Stalinism was, in part at least, the result of a simple preconception—the nineteenth-century idea that all social and human actions can be calculated, considered and predicted” (p. 322).  Such policies, crafted by alleged “experts” who often knew very little about agriculture or industry or anything but Party ideology, were ruthlessly imposed and quickly impoverished virtually everyone but Party functionaries.  As Conquest’s Harvest of Sorrow (recounting Stalin’s liquidation of the Ukrainian peasantry, “the greatest known tragedy of the century,” which killed some fifteen million souls) and his The Great Terror (recounting Stalin’s elimination of all rivals within the Communist Party) definitively demonstrate, few monsters in all of history have ruled so barbarously.    

When WWII broke out, Stalin did whatever was necessary to further his own objectives.  Thus he cheerfully aligned himself with Hitler when it looked like the two dictators would help each other, expanding their power over vast sections of Europe.  Betrayed by Hitler when the Nazis invaded Russia, Stalin then turned to Churchill and Roosevelt—flattering and dissembling and manipulating these “allies” to secure invaluable materials with which to drive back the Germans and ultimately control Eastern Europe.  Quite capable of charming those he encountered, he favorably impressed visitors such as America’s presidential envoy Harry Hopkins and Britain’s Foreign Secretary Anthony Eden.  President Roosevelt, though warned to be careful in negotiating with “Uncle Joe,” followed his personal “hunch” and determined to “give him everything I possibly can and ask for nothing in return,” trusting him not to “annex” any territory and to “work with me for a world of democracy and peace” (p. 245).  In Conquest’s view, FDR’s naive judgment “must be among the crassest errors ever made by a political leader” (p. 245).   

Following WWII, Stalin resumed his ruthless policies—waging a “cold war” abroad and purging all possible enemies of his regime within Russia—before dying in 1953.  “In real terms, Milovan Djilas’s conclusion stands up:  ‘All in all, Stalin was a monster who, while adhering to abstract, absolute and fundamentally utopian ideas, in practice had no criterion but success—and this meant violence, and physical and spiritual extermination’” (p. 327).  To understand the man and his evil deeds, Conquest’s Stalin:  Breaker of Nations is a trustworthy source with which to begin.

                                           * * * * * * * * * * * * * * * * * 

When Stalin’s daughter, Svetlana Alliluyeva, fled the Soviet Union in 1967, she brought with her a manuscript—Twenty Letters to a Friend (New York:  Discus Books, c. 1967)—describing important aspects of her life; the book became an instant best-seller published in many languages.  She wrote it in 35 days in 1963 just to put her thoughts on paper and did not envision publishing it while living in her own country.  She mainly recorded memories of her mother and father, bearing witness to the insatiable longing children have to be with and love their parents, but in the process she tried to make sense of what happened around her and thus gives us insight into what took place in Russia during her lifetime, for:  “The twentieth century and the Revolution turned everything upside down” (p. 30).

Several years after her father died she took her mother’s family name, Alliluyeva—a word akin to “Hallelujah,” meaning “Praise ye the Lord.”  She fondly remembers both her mother and her maternal grandparents.  When Stalin married her mother, her grandparents became part of a nurturing extended family.  Grandfather Alliluyev was born a peasant in Georgia but became a skilled mechanic who joined the Russian Social Democratic Workers’ Party in 1898 and retained an old-fashioned revolutionary idealism and personal integrity until he died in 1945.  Her grandmother was also from Georgia, the descendant of German settlers, and spoke Georgian with a German accent.  Reared in a Protestant church, she “was always religious,” stoutly resisting the atheistic propaganda surrounding her.  “By the time I was thirty-five,” Svetlana says, “I realized that grandmother was wiser than any of us” (p. 54).  That wisdom, she came to believe, was nurtured by a religious perspective she ultimately shared.   

Pondering her maternal grandparents’ influence, Svetlana credited their love for Georgia for instilling in her a love for the beauty of nature.  “O Lord,” she wrote, “how lovely is this earth of yours and how perfect, every blade of grass, every flower and leaf!  You go on supporting man and giving him strength in this fearful bedlam where Nature alone, invincible and eternal, gives solace and strength, harmony and tranquility of spirit” (p. 82).  Amidst all the destruction wrought by various “madmen” who ravage the earth, its “beauty and majesty” needs to be revered.  Still more:  “It seems to me that in our time faith in God is the same thing as faith in good and the ultimate triumph of good over evil” (p. 83).  Consequently, “By the time I was thirty-five and had seen something of life, I, who’d been taught from earliest childhood by society and my family to be an atheist and materialist, was already one of those who cannot live without God” (p. 83).   

Svetlana’s mother, Nadya, was born in the Caucasus but grew up in St. Petersburg, immersed in revolutionary activities.  There she met and soon married Joseph Stalin, much older than she, whose first wife had died.  Nadya was sincerely devoted to the revolutionary cause, strictly followed Party rules, and was willing to sacrifice her all for the good of the people.  Thus she worked a great deal and spent limited time with her children, though when present she orchestrated lots of fun and games.  Stalin himself proved to be a poor husband, so:  “Because my mother was intelligent and endlessly honest, I believe her sensitivity and intuition made her realize finally that my father was not the New Man she had thought when she was young, and she suffered the most terrible, devastating disillusionment” (p. 117).  In 1932, following an argument with him regarding the genocidal famine taking place in the Ukraine pursuant to Stalin’s orders, she went to her room and killed herself with a pistol, though Svetlana was told she had died of appendicitis.  “Our carefree life, so full of gaiety and games and useful pastimes, fell apart the moment my mother died” (p. 133).  Svetlana was six years old.  

“For ten years after my mother died, my father was a good father to me” (p. 133).  He had always been the more affectionate parent, making sure Svetlana was well cared for in every way, including an excellent education.  But when she finished her schooling and became more independent, their relationship frayed.  Discovering the real reason for her mother’s death while reading an English magazine further depressed her.  By now she was also aware of the growing list of classmates, friends and relatives who had been sent into exile or killed under her father’s rule.  When only seventeen she met and fell in love with Alexei Kapler, a noted filmmaker, who seemed to her to be “the cleverest, kindest, most wonderful person on earth” (p. 187).  Soon thereafter Kapler was arrested and sentenced to the Gulag for five years, apparently for daring to court Stalin’s daughter!  “After that my father and I were estranged for a long time.”  Indeed, “I was never again the beloved daughter I had once been” (p. 192).  In 1944 she married Grigory Morozov, a fellow university student.  Stalin didn’t approve of him either—both Kapler and Morozov were Jews and he harbored a deep anti-Jewish prejudice.  He refused to meet him and welcomed the news that they divorced soon after she gave birth to a son.  She then married the son of a prominent Bolshevik, with whom she had a daughter.  This marriage garnered her father’s approval but quickly dissolved.   

Despite their estrangement, father and daughter occasionally spent time together following WWII.  She found him difficult to talk with and thought the obsequious men surrounding him (Beria, Malenkov, Bulganin) odious.  The Communist Party hardly resembled what was envisioned by sincere revolutionaries in 1917.  It “had nothing in common with the spirit of my grandfather and my grandmother, my mother, the Svanidzes and all the old Party people I knew.  It was all hypocritical, a caricature purely for show” (p. 207).  How superior were the simple people of the “old Russia” such as Stalin’s own mother!  Her “grandmother had principles of her own.  They were the principles of one who was old and God-fearing, who’d lived a life that was upright and hard, full of dignity and honor.  Changing her life in any way whatever was the furthest thing from her mind.  She passed on all her stubbornness and firmness, her puritanical standards, her unbending masculine character and her high requirements for herself, to my father.”  Still more, when Svetlana visited her paternal grandmother’s grave, she wondered how she could think about her “without my thoughts turning to God, in whom she believed so devoutly?” (p. 214).  In 1962, less than a decade after her father died, Svetlana was baptized in the Orthodox Church.  For her, she explained:  “The sacrament of baptism consists in rejecting evil, the lie.  I believed in ‘Thou shalt not kill,’ I believed in truth without violence and bloodshed.  I believed that the Supreme Mind, not vain man, governed the world.  I believed that the Spirit of Truth was stronger than material values.  And when all of this had entered the heart, the shreds of Marxism-Leninism taught me since childhood vanished like smoke.”  

In the godless world of Stalin’s USSR, however, there was little to celebrate.  For his daughter, nothing “turned out well” for those she knew.  “It was as though my father were at the center of a black circle and anyone who ventured inside vanished or perished or was destroyed in one way or another” (p. 231).  Yet despite it all Svetlana found reason for hope.  Much about the Russian character evident in her faithful nurse, Alexandra Andreevna, “Granny,” still survives.  “But what is good in Russia is traditional and unchanging” and ultimately “it is this eternal good which gives Russia strength and helps preserve her true self” (p. 232).  When Svetlana’s mother died, “Granny” became “the only stable, unchanging thing left.  She was the bulwark of home and family, of what, if it hadn’t been for her, would have gone out of my life forever” (p. 237).  Though not conventionally religious, she retained a deeply moral perspective and faith.  

No doubt influenced by both her maternal grandmother and “Granny,” Svetlana developed a deeply religious conviction.  “The Good always wins out,” she said.  “The Good triumphs over everything, though it frequently happens too late—not before the very best people have perished unjustly, senselessly, without rhyme or reason” (p. 242).  She had witnessed how her father and his revolutionary comrades “tried to do good by doing evil” and ruthlessly “sacrificed senselessly thousands of talented” human beings (p. 244).  Yet she also knew that:  “Everything on our tormented earth that is alive and breathes, that blossoms and bears fruit, lives only by virtue of and in the name of Truth and Good” (p. 245).  

278 Still the Best Hope

Dennis Prager is one of the more thoughtful “talk-show hosts” on radio, sustaining an audience for several decades.  A conservative Jew, he consistently tries to look at the “big picture” when addressing current issues and brings to his subjects a thoughtful religious perspective.  He has also written several fine books, the most recent of which is Still the Best Hope:  Why the World Needs American Values to Triumph (New York:  Broadside Books, c. 2012).  The words in the book’s title were crafted by President Abraham Lincoln in 1862 when he addressed Congress and declared America to be “the last best hope of earth.”  To Prager those words still ring true, so he wrote the book to defend and promote the uniquely American trinity of values, conveniently inscribed on every coin minted in this nation:  “Liberty”—the personal freedom which flourishes alongside limited government and free enterprise; “In God We Trust”—which indicates that our natural rights and moral responsibilities are God-given; and “E Pluribus Unum”—which declares Americans to be a diverse people united by principles rather than blood or ethnicity.  

Prager wrote this book with a sense of urgency, believing we stand at a crossroads offering us three incompatible religious and/or ideological options, devoting roughly one-third of the book to each:  1) Leftism; 2) Islamism; 3) Americanism.  He explains:  “The American value of ‘Liberty’ is at odds with a Sharia-based society and with the Leftist commitment to material equality; ‘E Pluribus Unum’ is at odds with the Leftist commitment to multiculturalism; and ‘In God We Trust’ conflicts with both the Leftist commitment to secularism and the Islamic ideal of a Sharia-based state” (p. 10).  Though he certainly has read widely and thought deeply, Prager relies more on illustrations than scholarly studies, broad generalizations rather than meticulous documentation.  This is not to discount his presentation but simply to make it clear he writes for the general public, not the academy.   

Leftism, emerging in the French Revolution and thenceforth fueling scores of revolutionary movements around the world, is very much a religious movement, though of a secular sort.  Energized by Karl Marx, it seeks to destroy Western Christian Culture and replace it with a scientifically-based, egalitarian society.  Its religious nature was evident in Hillary Clinton’s touting “the politics of meaning”—granting primacy to this-worldly concerns, continually seeking to establish a heaven-on-earth through political orchestration.  It dominates organizations such as the American Civil Liberties Union, the National Organization for Women, and the National Council of Churches.  It rules the media (e.g. the New York Times and NBC) and almost all liberal arts colleges and universities (e.g. Harvard, Columbia, UCLA and Occidental).  Prager thinks “Western universities have become Left-wing seminaries” (p. 97).  To soften and promote their ideological posture, Leftists usually call themselves “progressives” or “liberals” or “feminists” or “environmentalists”—much like denominations within a religion—but they share some core convictions.  They seek to make America a thoroughly secular place, resembling the “social democracies” in Europe which have sought to shed their national distinctions by joining the European Union, and they want to transform America to make it more egalitarian via universal health care, a command economy, minimum wages, cradle-to-grave welfare programs, affirmative action, race-based college admission policies, etc.  

Importantly, Leftists oppose traditional religions and seek to suppress, if not eliminate, their presence—their free expression—in the public square.  Philosophically committed to materialism, they necessarily believe:  “Man has supplanted the biblical God.  ‘God is man,’ said Marx.  ‘And man is God,’ said Engels” (p. 38).  Though some of them may “believe” in a deity of some sort, they reject “the personal, morally judging, transcendent God of the Bible” (p. 40).  What they really reject is special Revelation, with its clear-cut distinctions between good and evil.  To Prager, who regularly teaches classes on the Hebrew Bible, “the dividing line is belief in divine scripture.  Those who believe that God is the ultimate author of their scripture (the Old and New Testaments for Christians, the Torah for Jews) are rarely Leftist.  On the other hand, those Christians and Jews who believe that the Bible is entirely man-made are far more likely to adopt Leftist values” (p. 40).  

The Left believes, above all, in improving the world, making it a better place, creating a utopia of some sort.  It thinks we should not seek to understand things as they are but to devise ways to change them, to even transform such basic things as human nature.  As Robert F. Kennedy said:  “There are those that look at things the way they are, and ask, ‘Why?’  I dream of things that never were, and ask, ‘Why not?’”  Or we’re urged to sing along with the Beatles’ John Lennon and Imagine a perfectly peaceful world cleansed of private property and freed from greed—a world where there’s “no heaven or hell” and “everyone lives for today” (p. 69).  Thinking so makes it so!  As “a famous dissident joke stated:  ‘In the Soviet Union the future is known; it’s the past that is always changing’” (p. 209).  Good intentions, not effective actions, qualify one for membership in the “inner ring” of the self-anointed saviors.  “Because the Left relies heavily on feelings and intentions,” Prager says, “wisdom and preexisting moral value systems do not count for much” (p. 77).  Consequently, there is an adulation and courting of young people and their tastes (e.g. clothes, slang and music).  

Yet despite all their allegedly “good intentions”—despite all the propaganda circulating through the schools and media—“the Left’s moral record is among the worst of any organized group or idea in history” (p. 168).  Almost everything it has touched it has made worse:  “morals, religion, art, education from elementary school to university, and the economic condition of the welfare states it created” (p. 168).  Most appalling is the number of innocents murdered by Stalin, Mao, Castro et al.—100 million, according to The Black Book of Communism.  Softer versions of socialism, now evident throughout Europe and touted by presidential candidate Bernie Sanders, have established ultimately unsustainable welfare programs such as Britain’s National Health Service that slowly slide into faceless bureaucracies failing the very people they claim to serve.  Though claiming to represent and care for ordinary people, “If the Left had its way, the citizens of the state would be told how to live in almost every way:  what to drive and when; what lightbulbs to use; what temperature to keep their homes; what men would be permitted to say to women; what school textbooks must include; when God could be mentioned, and when not; how much of their earnings people may keep; what art would be funded and what art would not; what food children could be fed; how enthusiastically to cheer girls’ sports teams; and much more.  The list of Left-wing controls over our lives is ever expanding” (p. 208).  

Turning to Islam, which along with Leftism is devoted to the destruction of Western Civilization, Prager admits he treads through a minefield wherein charges of “Islamophobia” are routinely ignited against anyone daring to find fault with any aspect of the faith.  Yet we must fully understand—and dare to critique—an ideology mixing religion and politics which has for 1400 years threatened Western Civilization.  One must of course try to distinguish between Islam and Islamism—the former a faith calling individuals to certain obligations, the latter a political movement promoting world domination.  There are certainly decent Muslims with whom one may establish concord, but there are also legions of fanatical Islamists supporting terrorism.  In fact, we must realize that Islam has historically allowed little personal freedom (whether religious, intellectual, or economic) and approves the militant establishment and expansion of its Caliphate.  Thus, according to perhaps the greatest Muslim thinker, Ibn Khaldun, Islam “demands jihad, holy war” and “Muslims are therefore enjoined to wage jihad in order to make converts to Islam” (p. 251).  

Islamic jihadists now seek to destroy Israel and America—primarily because they prevent “the expansion of Islamist rule” (p. 288).  Though the jihadists now seem to lack the economic and military strength needed to accomplish such aspirations, those aspirations must be understood if we are to respond to the many acts of terrorism and aggression we now face.  Prager responds to a variety of pro-Islamic arguments (e.g. the Koran contains inspiring verses; most Muslims are peace-loving; Muslim Spain enjoyed a “golden age” of religious tolerance; Muslims don’t impose Islam on conquered peoples), showing that partial truths do not validate an ideology whose negative aspects mandate its rejection.  

Having evaluated America’s rivals, Prager turns to defending her and her “unique values,” the first of which is liberty (“the essence of the American idea”).  Millions of immigrants, from 1607 onwards, have risked everything seeking various kinds of freedom (religious, political, economic) in this land.  For example:  “More black Africans have immigrated to the United States voluntarily—looking for freedom and opportunity—than came to the United States involuntarily as slaves” (p. 313).  Prizing liberty, many generations of Americans favored limited government because personal “liberty exists in inverse proportion to the size of the state.  The bigger the government/state, the less liberty the individual has.  The bigger the government, the smaller the citizen” (p. 316).  As a God-given right, liberty stands rooted in the very Being of God as revealed in the Judeo-Christian Scriptures, and He is absolutely essential as the sustaining Source of all values.  As John Adams insisted:  “Our constitution was made only for a moral and religious people.  It is wholly inadequate to the government of any other.”  Explaining how these values earlier helped shape America, Prager provides scores of important illustrations regarding such things as individual responsibility, distinctions between good and evil, and the sanctity of property, marriage, and life.  

                                                            * * * * * * * * * * * * * * * * * * 

When I began my teaching career in the mid-60s, I routinely taught a history course, Western Civilization, which was almost everywhere basic to college curricula.  Two decades later, relocating to a college which had replaced “Western Civilization” with “World Civilizations,” I unsuccessfully argued for a return to the much more focused and manageable course on the West.  As Rodney Stark notes in How the West Won:  The Neglected Story of the Triumph of Modernity (Wilmington, Delaware:  ISI Books, c. 2014):  “Forty years ago the most important and popular freshman course at the best American colleges and universities was ‘Western Civilization.’  It not only covered the general history of the West but also included historical surveys of art, music, literature, philosophy, and science.  But this course has long since disappeared from most college catalogues on grounds that Western civilization is but one of the many civilizations and it is ethnocentric and arrogant for us to study ours” (#42).  Thus we witnessed Stanford abandon its “widely admired ‘Western Civilization’ course just months after the Reverend Jesse Jackson came on campus and led members of the Black Student Union in chants of ‘Hey-hey, ho-ho, Western Civ has got to go.’  More recently, faculty at the University of Texas condemned ‘Western Civilization’ courses as inherently right wing, and Yale even returned a $20 million contribution rather than reinstate the course” (#49).  

In light of this, Stark offers his book as a sturdy (indeed, contrarian!) defense of the currently maligned West.  Doing so, he challenges many of the voguish views of the academy, arguing that the fall of the Roman Empire was in fact beneficial, that the “Dark Ages” never happened, that the Crusades are defensible, that global warming in earlier eras was a blessing, that the “Scientific Revolution” clearly began in the Medieval period rather than the 17th century, that the Protestant Reformers replaced a repressive Catholic system with equally repressive Protestant systems, and that Europe’s colonies impoverished rather than enriched their sponsors.  Still more:  he argues that non-Western societies such as the Chinese and Islamic, Mayan and Indic, failed to become “modern” because of intrinsic factors making such a transition impossible.  To Stark, the West’s distinctiveness resides in its ideas; contrary to many historians (operating within a generally materialistic—whether Darwinian or Marxist—philosophical perspective), he thinks economic developments do not fully explain why cultures and civilizations rise and fall.  

Glancing at the world of Classical Beginnings (500 BC-AD 500), he finds:  “At the dawn of history most people [whether in China or India or Mesopotamia or Egypt] lived lives of misery and exploitation in tyrannical empires that covered huge areas” (#151).  Subject to arbitrary and frequently despotic rulers, forced to work within a command economy, deprived of secure title to property, the masses of mankind lived poorly.  Consequently, “in 1900 Chinese peasants were using essentially the same tools and techniques that they had been using for more than three thousand years.  The same was true in Egypt” (#228).  But then, “In the midst of all this misery and repression a ‘miracle’ of progress and freedom took place in Greece among people who lived not in an empire but in hundreds of small, independent city-states.  It was here that the formation of Western civilization began” (#158).  

Despite the persistence of slavery, the Greeks tasted and celebrated (in both games and politics) the luxury of freedom.  Thriving as individuals, they flourished in such areas as warfare, democracy, economics, literature, the arts, technology, speculative philosophy, and formal logic.  Importantly (as Herodotus noted in explaining the differences between Egypt and Greece), “the ancient Greeks took the single most significant step toward the rise of Western science when they proposed that the universe is orderly and governed by underlying principles that the human mind could discern through observation and reason” (#473).  This was possible because—as Anaxagoras and Plato saw—there is a Mind (Nous) underlying the physical cosmos—a monotheistic perspective that undergirds the West’s triumphs. 

Anticipating and complementing developments in Greece, Jewish theologians also proclaimed a “rational God” who was eternal, immutable, conscious, and revealed to us through both creation and scripture.  Due to Alexander’s conquests and the subsequent Roman occupation of their land, many Jews were quite cosmopolitan—two centuries before Christ Jerusalem was actually known as “Antioch-at-Jerusalem.”  Early Christians such as Justin Martyr drew upon the best Greek thinkers (“Christians before Christ”) as they developed their theology.  Both Christian and Greek philosophers (preeminently Plato) revered “the divine gift of reason” which “has sown the seeds of truth in all men as beings created in God’s image’” (#698).  Thus, to Augustine:  “‘Heaven forbid that God should hate in us that by which he made us superior to the animals.  Heaven forbid that we should believe in such a way as not to accept or seek reasons, since we could not even believe if we did not possess rational souls’” (#751).  Confidence in the rationality of the Creator—as well as His providential care for creation—enabled later Christian thinkers to undertake the significant scientific and historical studies basic to Western Civilization.  

By contrast:  “Islam holds that the universe is inherently irrational—that there is no cause and effect—because everything happens as the direct result of Allah’s will at that particular time.  Anything is possible.  Attempts at science, then, are not only foolish but also blasphemous, in that they imply limits to Allah’s power and authority.  Therefore Muslim scholars study law (what does Allah require?), not science” (#825), and Islam, for 1400 years, has demonstrably failed to develop anything comparable to the science and technology, literature and philosophy of the West.  Similarly, in China, the Confucian reverence for the past encouraged an opposition to change clearly illustrated by the great Chinese admiral Zheng He, who led large fleets (involving several hundred ships) across the Indian Ocean to the coast of East Africa between 1405 and 1433 A.D.  His expeditions, which could easily have led to Chinese colonization of much of the globe, came to naught when the emperor dismantled his ships and forbade further construction of oceangoing vessels.  Even the blueprints for Zheng’s ships were destroyed!

Following the fall of Rome (“the most beneficial event in the rise of Western civilization”), the West emerged from the crucible of Greek and Christian culture.  In the “Not-So-Dark” Middle Ages, its genius emerged and flourished.  Political decentralization encouraged creativity and competition, progress and prosperity.  An “agricultural revolution” enabled Medieval Europeans to eat better and live longer—as did the favorable climate during the “Medieval Warming” era (800-1250 A.D.).  “As food became abundant, the population of Europe soared from about 25 million in 950 to about 75 million in 1250” (#2737).  Harnessing wind and water with sophisticated machinery (often shaped in blast furnaces) enabled them to irrigate land and grind grain and navigate seas.  Germans and Scandinavians, Hungarians and Slavs were successfully converted and began contributing to the creative Christian culture responsible for impressive monuments—Gothic cathedrals; universities at Oxford and Paris; scientific inquiries and advances under the guidance of brilliant thinkers such as Nicole Oresme and Jean Buridan; and magisterial scholarly works such as Thomas Aquinas’ Summa Theologica.  

Within that Medieval incubator there emerged, Stark stresses, the “freedom and capitalism” essential for the modern world.  Slavery slowly disappeared throughout Christendom.  It “ended in medieval Europe only because the Church extended its sacraments to all slaves and then banned the enslavement of Christians” (#2349).  Only in the Christian world was slavery eliminated!  Persons were increasingly free (despite the persistence of serfdom) to work voluntarily and creatively—and increasingly to take part in the political life of their communities.  Capitalism emerged throughout Europe during the late Middle Ages, long before the Protestant Reformation.  Private property, commercial activities flourishing through free markets, and capital investments rendering income all brought about an incredible economic transformation.  Above all else:  “If there is a single factor responsible for the rise of the West, it is freedom.  Freedom to hope.  Freedom to act.  Freedom to invest.  Freedom to enjoy the fruits of one’s dreams as well as one’s labor” (#2663).  This freedom flourished in Medieval Europe and shaped the future of the West.  

Dramatically evident in 1492, the West quickly expanded to control much of the globe in successive centuries.  Technological developments, markedly evident in superior military equipment and trade goods, enabled relatively small groups of Europeans to conquer or colonize the Americas.  They also proved decisive in numerous conflicts with Muslims, ensuring their retreat from Europe.  “In 1800 Europeans controlled 35 percent of the land surface of the globe.  By 1878 this figure had risen to 67 percent.  Then in the next two decades, Europeans seized control of nearly all of Africa, so that in 1914, on the eve of World War I, Europeans dominated 84 percent of the world’s land area” (#6604).  Intellectually, the “Enlightenment” proved equally decisive.  Though it did indeed prompt various heterodox notions, the Enlightenment must be understood, in accord with Alfred North Whitehead, as rooted in many of the scientific and theological insights of Medieval thinkers—most especially the rationality of God and His world.  “For, as Albert Einstein once remarked, the most incomprehensible thing about the universe is that it is comprehensible:  ‘A priori one should expect a chaotic world which cannot be grasped by the mind in any way . . .  That is the “miracle” which is constantly being reinforced as our knowledge expands.’  And that is the ‘miracle’ that testifies to a creation guided by intention and rationality” (#5963).  

Due to this “miracle,” we Moderns enjoy unprecedented prosperity.  The standard of living has dramatically increased during the past two centuries.  Enjoying “political freedom, secure property rights, high wages, cheap energy, and a highly educated population,” the West now features an unprecedented quality of life.  Back-breaking manual labor has been largely replaced by machines.  Ordinary people enjoy “consumer” goods available only to the super-wealthy in earlier centuries.  Hardly the catastrophe denounced by romantics (from William Wordsworth to Al Gore), technology has greatly improved the lot of ordinary folks.  And this is, quite simply, how the West won! 

277 Islam’s Cultural Jihad

To understand both Time magazine’s selection of Angela Merkel as its 2015 “Person of the Year” for welcoming nearly a million Syrian immigrants to Germany and the pro-Islamic rhetoric and policies of the Obama administration, one must see them as nearly verbatim implementations of European pronouncements streaming for 40 years from international organizations such as the United Nations and the European Union.  That Barack Obama, while campaigning for the presidency in 2008, delivered a major speech in Berlin, Germany, indicates the degree to which he consistently seeks to align himself with European nations and their Islamic overtures.  For example, the EU’s ban on words which might offend Muslims—e.g. jihad; fundamentalists; Islamic terrorism—is scrupulously followed by the American president.  So Obama administration spokesmen, following terrorist attacks, consistently refer to violent extremists, ever insisting they have no necessary ties to Islam—which is, of course, a necessarily “peaceful” religion.  Indeed, as Attorney General Loretta Lynch immediately declared following the slaughter in San Bernardino, California, we must, above all else, avoid Islamophobia whenever “extremists” indulge in terrorist acts!

Should one want to get inside such thinking, he should heed Bat Ye’or’s Europe, Globalization, and the Coming Universal Caliphate (Madison, N.J.:  Fairleigh Dickinson University Press, c. 2011), an expansion and update of her earlier Eurabia:  The Euro-Arab Axis.  Doing so enables one to see that what’s now happening in America has long been developing in what was once the heart of Western Civilization, as Muslims implement their “Koranic duty to Islamize the planet,” since the whole earth is Allah’s and his people, the Muslims, are to enforce his rule.  Importantly:  “Muslims can never be guilty of occupation or oppression because Allah granted them the whole world; jihad returns to them what belongs to them as true believers” (#317).  So while “Westerners define terrorism as murderous attacks that blindly target civilian populations or individuals, committed by criminal gangs that act outside of recognized formations and do not respect the laws of war,” Muslims “judge terrorism by its motives, not its methods.  Any enterprise aimed at extending Islamic territory is considered ‘resistance.’  Palestinian jihadists, who popularized all modern terrorist methods, are always called ‘resistants’ in official OIC documents” (#930).  

Bat Ye’or is an Egypt-born Jewess who has devoted her life to historical research, publishing important treatises illustrating the plight of Jews and Christians under Islam in The Dhimmi:  Jews and Christians under Islam and The Decline of Eastern Christianity under Islam:  From Jihad to Dhimmitude.  In an enlightening foreword to The Dhimmi, the great French philosopher Jacques Ellul noted that there exists in the secularized West a “current of favorable predispositions to Islam,” notably evident in the many euphemistic discussions of jihad.  But by setting forth the historical facts, Bat Ye’or dares contradict such prevailing assumptions.  “Historians,” she says, “professionally or economically connected to the Arab-Muslim world,” have misled the public with treatises “which were either tendentious or combined facts with apologetics and fantasy.  After World War II, the predominance of a left-wing intelligentsia and the emergence of Arab regimes which were ‘socialist’ or allied to Moscow consolidated an Arabophile revolutionary internationalism” that remains strong in much of the contemporary world.  

Jihad, in fact, necessarily characterizes Islam, Ellul says, for it is a sacred duty for the faithful—“Islam’s normal path to expansion.”  Almost never the inner “spiritual” combat imagined by some pro-Islamic writers seeking to make the religion palatable to non-Muslims, actual jihad advocates “a real military war of conquest” followed by an iron-handed policy of “dhimmitude”—the brutal reduction of conquered peoples to Islamic law.  Indeed the word Islam means submission—not peace!  Muslims divide the world into two—and only two—realms:  the “domain of Islam” and “the domain of war.”  At times there will be tactical concessions and “peaceful” interludes.  But ultimately, all devout Muslims are committed to conquer and control as much of the globe as possible.  Ellul stresses this “because there is so much talk nowadays of the tolerance and fundamental pacifism of Islam that it is necessary to recall its nature, which is fundamentally warlike!”  Writing presciently in 1991, Ellul declared:  “Hostage-taking, terrorism . . . the weakening of the Eastern Churches (not to mention the wish to destroy Israel) . . . all this recalls precisely the resurgence of the traditional policy of Islam.”  

Turning to Bat Ye’or’s Europe, Globalization, and the Coming Universal Caliphate, we find a careful study of the multitudinous documents generated by various congresses operating under the auspices of the UN and EU as well as assorted Muslim-controlled organizations such as the Organization of the Islamic Conference (OIC) which work to establish an “EU Mediterranean policy.”  Illustrating the dictum of Ayatollah Khomeini, the leader of Iran’s revolution—“If Islam is not political, it is nothing”—the OIC fuses religion and politics.  “Close to the Muslim Brotherhood, it shares its strategic and cultural vision, that of a universal religious community, the Ummah based upon the Koran, the Sunna and the canonical orthodoxy of shari’a” (#1429).  It’s supported by 65 countries and represents some 1.3 billion Muslims.  The OIC vows “‘to support the restoration of the complete sovereignty and territorial integrity of a member-state under foreign occupation.’  Such a principle could be applied to every jihad waged by Muslims in various countries to expand the reach of Islam and to install shari’a there, whether in Europe, Africa or Asia” (#4156).  

The ideas set forth by these organizations rather quickly set the tone and substance of policies that have shaped much of the modern world, providing for “the progressive Islamization of the West”; in Bat Ye’or’s words, “they establish the major elements of a new global system of totalitarian social and political domination impervious to Western democratic institutions” (#215).  “Europe is a perfect ally, serving the expansionist ambitions of the Ummah, the universal Muslim community” (#902).  In the past jihadist warriors conquered vast swaths of land and subjected the residents who survived to the dhimmitude that slowly destroyed them.  Today’s “jihad ideology of world conquest, propelled by billions of petrodollars and facilitated by the complacency of European governments and the rivalry between Western powers, is flourishing in every corner of the world.  The driving force of this process is the Organization of the Islamic Conference (OIC), which has been dedicated since its creation in 1969 to the elimination of the State of Israel and the eventual implementation of shari’a over the West” (#257).  

Accordingly, Muslim supporters (many of them former Nazis, such as Paul Dickopf, the former SS officer who became the head of Interpol, and Kurt Waldheim, who served as the UN Secretary General) especially stressed a “multilateralism” and “multiculturalism” paradoxically combined with an anti-Israel agenda which included the strident anti-Americanism routinely expressed in UN resolutions.  Multiculturalism, devoted to the notion that all cultures are equally admirable, served as a rationale for accepting the reality of “Muslim immigrants’ refusal to integrate into Western societies” (#1752) while simultaneously insisting that European nations provide employment, housing, medical care, education, etc.  It also mandated that Europeans promote Islamic culture among immigrants and celebrate the utterly spurious “immense contribution of Islamic culture and civilization to Europe’s development and to include it in school and university syllabuses” (#1954).  

The strong support of the European Community for Yasser Arafat and the Palestinian Liberation Organization (leading to, of all things, a Nobel Peace Prize for the murderous Arafat) sharply illustrates the legitimization of modern jihadism.  One of the most powerful organizations, the Parliamentary Association for Euro-Arab Cooperation (PAEAC), formed “in 1974 in response to Palestinian terrorism and the oil boycott . . . injected Eurabia into the very heart of Europe.  In effect, to its initial anti-Israeli and anti-American program the association added a new element relating to the internal politics of the EEC:  the promotion in European countries of an extensive Muslim immigration on which would be conferred the same social, political, and national rights as the indigenous populations” (#550).  Yet the rights and privileges (e.g. freedom of religious expression, equality under law) Muslims demand for themselves in Europe are precisely those denied non-Muslims in Muslim nations!  As Montalembert noted long ago:  “When I am weaker, I ask you for liberty because it is your principle; but when I am strong, I take it away from you because it is not my principle.”  

Bat Ye’or repeatedly discusses the nation of Israel, pointing out how the very survival of this tiny nation is at risk.  She illustrates the deep hostility toward Jews ever-evident in Islamic history, and she shows how this hostility continues in conferences hosted by Arab countries whose publications represent “a monument to hatred and anti-Jewish incitement that goes well beyond Nazi literature, with sentences such as, ‘Jews are the enemies of Allah, the enemies of faith and of the worship of Allah’” (#2860).  Israel, they insist, has no right to exist, and all the land must be returned to Palestinians (this explains why an independent Palestinian “state” alongside Israel is unacceptable to Muslims).  Pro-Palestinian edicts—fully evident in World Council of Churches publications, United Nations resolutions, and elite universities’ “divestment from Israel” posturing—are pervasive. 

What’s taking place, Bat Ye’or insists, is the steady break-up of the nation states that once constituted Europe.  Without their consent, the historic peoples of France and Germany, Italy and Spain, have lost their identity as the European Union has taken control of the continent and acceded to almost every Islamic demand, especially regarding immigration.  Dependent on Middle East oil and hoping to profit from immigrant labor, the EU has provided ways for Muslims to settle in Europe without forfeiting their Islamic culture.  Second and third generations insist on the teaching of Arabic and pro-Islamic materials in the schools.  Leaders from Muslim communities must be included in the political system, and where possible sharia law must be established to settle intra-Muslim issues.  Slowly, through demographic growth, Muslims hope to gain power in various places.  The Caliphate now effectively dominates a number of European cities.  And across the Atlantic, under “President Obama, America is engaging more radically along such a path,” pursuing “outreach” and education and easing “the bureaucratic process in obtaining US visas and avoid[ing] embarrassing delays” in entering the country (#3527).  

Europe’s “globalist and pacifist trends are obvious in the American Democratic administration under President Barack Obama,” which strongly supports UN policies and embraced a booklet titled Changing Course:  A New Direction for U.S. Relations with the Muslim World.  “For a European familiar with EU surrender policy, President Obama’s policy had no surprises.  Western guilt, apologies, flatteries, tributes, anti-Zionism/antisemitism, open-doors immigration, were all part of the dhimmitude paraphernalia” (#4521).  Thus in his 2009 Ramadan address, the president praised “‘Islam’s role in advancing justice, progress, tolerance, and the dignity of all human beings’” (#4534).  Such statements, Bat Ye’or sadly concludes, reflect a civilization in the process of collapse, a people willing to submit to Islam.  Europe lost its bet that money and appeasement would pacify Muslims.  And the United States, she fears, is tilting in the same direction.  

                           * * * * * * * * * * * * * * * * *

In Londonistan:  How Britain Created a Terror State Within (London:  Gibson Square, c. 2007), Melanie Phillips detailed how she thought “Britain is even now sleepwalking into Islamization,” blithely ignoring the “pincer attack from both terrorism and cultural infiltration and usurpation” daily changing the very nature of the nation (#68).  For many years Phillips was an acclaimed reporter and columnist—“a darling of the left”—for the Guardian, probably the leading leftist newspaper in England, “the paper of choice for intellectuals, the voice of progressive conscience, and the dream destination for many, if not most, aspiring journalists.”  Her disillusionment with the Left began when she honestly followed the evidence while researching and writing articles on a wide variety of subjects—immigration, education, environmentalism, marriage and family, feminism, multiculturalism, health care, Israel and foreign affairs.  

Though only nominally Jewish, Phillips found (to her surprise) that her colleagues on the Guardian branded her as a Jew who could not deal dispassionately with Israel.  Indeed, anything but a pro-Palestinian stance was anathematized by Britain’s leftists, who routinely equate Israelis with Nazis!  “The more I read,” she wrote in her autobiography, Guardian Angel, “the more horrified I became by the scale of the intellectual and moral corruption that was becoming embedded in public discourse about the Middle East—the systematic rewriting of history, denial of law and justice and the corresponding demonization and delegitimisation of Israel.”  Indicative of this process is the widespread sale and use of The Protocols of the Elders of Zion (a vicious anti-Semitic work popular among the Nazis) and Hitler’s Mein Kampf in the Muslim world.  Pondering this phenomenon, she concluded—wisely, I think—that “Israel represents not a regional dispute but a metaphysical struggle between good and evil.  That is why the cause of Palestine is key to the Islamists’ demands” (#2661).  

Half a century ago many European intellectuals embraced Antonio Gramsci, the Italian communist who urged his readers to abandon violent revolution while launching a “long march” through various societal institutions—schools, churches, judicial systems, media outlets, law enforcement and charitable organizations.  “This intellectual elite was persuaded to sing from the same subversive hymn-sheet so that the moral beliefs of the majority would be replaced by the values of those on the margins of society, the perfect ambience in which the Muslim grievance culture could be fanned into the flames of extremism” (#2768).  What needed, above all, to be replaced was the Mosaic code, the Judeo-Christian morality fundamental to Western Civilization.  In many ways, the struggle now taking place is between the adherents of two books—the Bible and the Koran!  Inasmuch as the books themselves simply cannot be harmonized, there is little hope for reconciliation between their believers.  

Hostility towards Israel was accompanied by lock-step support for the Palestinians and Islamic residents in England.  For decades the British have welcomed immigrants of all sorts, asking no questions and offering them “a galaxy of welfare benefits, free education and free health care regardless of their behavior, beliefs or circumstances” (#912).  Multitudes of Muslims thus arrived determined to preserve their traditions while enjoying England’s standard of living.  While recurrently aroused by terrorist attacks such as the subway bombings in 2005, neither politicians nor public (eased along by a very proper English tolerance and political correctness) pays serious heed to the cultural currents transforming significant sections of their country.  Trying to win over the hearts and minds of immigrant Muslims, naively convinced that “moderates” will support the nation that welcomed them, few Englishmen grasp the nature of Islamic extremism.  By “defining ‘extremism’ narrowly as supporting violence against Britain, it makes the catastrophic mistake of treating the aim of Islamizing Britain as an eccentric but unthreatening position and not one to be taken at all seriously” (#84).  The so-called “moderates” in Islam, Phillips thinks, are hardly moderate at all.  Yes, they condemn terrorist attacks, but they simultaneously deny such attacks have any roots in “real” Islam, “denying what was a patently obvious truth that these attacks were carried out by adherents of Islam in the name of Islam” (#2011).  

The Islamizing process is markedly evident throughout English society.  In the schools, teachers are pressured to present Islam in a positive manner, becoming agents of propaganda and indoctrination rather than truthful understanding.  Unable to distinguish between truth and lies, the ordinary Brit easily swallows pro-Muslim pronouncements that lack historical credibility and set forth misleading “mythology, distortion and libels” (#2453).  Underwriting this process, oil-rich Saudi sheiks have poured enormous sums into educational institutions—establishing “chairs” in universities, building elementary school facilities, publishing various pro-Islamic materials.  “Schools have ceased to transmit to successive generations either the values or the story of the nation, delivering instead the message that truth is an illusion and that the nation and its values are whatever anyone wants them to be.  In the multicultural classroom, every culture appears to be taught except Britain’s indigenous one.  Concern not to offend minority sensibilities has reached the risible point where piggy banks have been banished from British banks in case Muslims might be offended” (#470).  “This moral inversion has been internalized so completely that the more Islamic terrorism there is, the more hysterically British Muslims insist that they are under attack by ‘Islamophobes’ and a hostile West.  Any attempt by British society to defend itself or its values, either through antiterrorist laws or the reaffirmation of the supremacy of Western values, is therefore denounced as Islamophobia, as even use of the term ‘Islamic terrorism’ is regarded as ‘Islamophobic’” (#492).  Importantly, Phillips says, the multicultural assault on distinctively English ways results from “a repudiation of Christianity, the founding faith of the nation and the fundamental source of its values, including its sturdy individualism and profound love of liberty” (#1748).  

Police are discouraged from enforcing English law in Muslim communities lest they be charged with Islamophobia, and a growing number of policemen (hired to satisfy the strident doctrines of ethnic diversity) are simply jihadists committed to the ultimate triumph of their faith.  Judges, more committed to “human rights” and “transnational progressivism” than England’s common law traditions, seek to impose multicultural values.  The Association of Muslim Lawyers now calls for the “formal recognition of a Muslim man’s right under Sharia law to have up to four wives” (#2366).  Step-by-step, “Sharia law is steadily encroaching into British institutions.  In February 2008 the Archbishop of Canterbury caused a furor when he declared that Muslim families should be able to choose between English and Islamic law in marital and family issues.  But the fact is that Britain is already developing a parallel sharia jurisdiction in such matters, with blind eyes turned to such practices as forced marriage, cousin marriage, female genital mutilation and polygamy; indeed, welfare benefits are now given to the multiple wives of Muslim men” (#158).  

In England today mosques attract more attendees than Christian churches, which have largely replaced “the fundamental doctrines of Christianity” with the “worship of social liberalism.  The Church stopped trying to save people’s souls and started trying instead to change society. . . .  Miracles were replaced by Marx” (#3197).  Facing an ideology (Islam) determined to destroy Christianity, the Church in England capitulated in hopes of surviving as an emasculated but comfortable institution.  So London now serves as a center for Islamic study, featuring scores of research and educational institutions with newspapers and publishing houses distributing radical Islamic materials throughout the world.  Christian pastors and evangelists face prosecution if they make comments critical of Islam since “Islamophobic” speech is banned in England.  Though he recently seems to have altered some of his views, Prince Charles once suggested he be known as a “defender of faith” rather than the “Defender of the Faith”!  He seemed to have concluded his nation is no longer Christian and is now truly multi-religious.  Amazingly enough, “the Church of England has been in the forefront of the retreat from Judeo-Christian heritage” (#512)—it’s on its knees, not before the LORD but before Islamic intimidation, and the Church of England’s prostration rather painfully illustrates the plight of England today, says Phillips.  

276 We Cannot Be Silent

Heading toward the White House, Barack Obama pledged to “fundamentally transform” America.  And he clearly has!  Yet in many ways he has simply consummated a process launched 50 years ago by the ‘60s Generation.  And one of the most significant transformations—the sexual revolution—has most deeply affected us all.  Signaling what was to come, one of the leaders of the ‘60s generation, Michael Lerner, celebrated his 1971 marriage to a teenage girl with a wedding cake inscribed with these words:  “Smash monogamy!”  Those words were also embraced by the Weatherman faction of the SDS, led by Obama’s friend Bill Ayers.  (Interestingly enough, Hillary Clinton in the 1990s dubbed the self-ordained Lerner her “personal rabbi”—though she obviously has resolved to preserve at least one more-or-less monogamous union.)   

To address this transformation R. Albert (Al) Mohler, Jr., the president of Southern Baptist Theological Seminary and one of America’s most distinguished evangelical thinkers, has published an important treatise:  We Cannot Be Silent:  Speaking Truth to a Culture Redefining Sex, Marriage, & the Very Meaning of Right & Wrong (Nashville:  Nelson Books, c. 2015).  Acknowledging the impact of the “vast moral revolution” which has swept “away a sexual morality and a definition of marriage that has existed for thousands of years,” he both analyzes the upheaval and offers Christians a way to deal with it, noting that Flannery O’Connor “rightly warned us years ago that we must ‘push as hard as the age that pushes against you.’  This book is an attempt to do just that” (p. 1).     

Though it was oft-unperceived and lacked the violent explosiveness of the French or Russian revolutions, the cultural revolution launched in the 1960s has profoundly reshaped our world.  To Roger Scruton, an astute contemporary philosopher:  “The left-wing enthusiasm that swept through institutions of learning in the 1960s was one of the most efficacious intellectual revolutions in recent history, and commanded a support among those affected by it that has seldom been matched by any revolution in the world of politics” (Fools, Frauds and Firebrands:  Thinkers of the New Left).  Consequently, says Mohler:  “We are facing nothing less than a comprehensive redefinition of life, love, liberty, and the very meaning of right and wrong” (p. 1).  

While the Supreme Court’s recent (2015) redefinition of marriage (Obergefell v. Hodges) has vividly illustrated the sexual revolution, Mohler insists “it didn’t start with same-sex marriage.”  Indeed:  “Any consideration of the eclipse of marriage in the last century must take account of four massive developments:  birth control and contraception, divorce, advanced reproductive technologies, and cohabitation.  All four of these together are required to facilitate the sexual revolution as we know it today.  The redefinition of marriage couldn’t have happened without these four developments” (p. 17).  Though Evangelicals have generally avoided the implications of the first three of these four, Mohler devotes separate chapters to each to demonstrate the validity of his thesis.  

At the dawn of the 20th century eugenicists such as Margaret Sanger began promoting birth control as a means to purify the race—“More from the fit, less from the unfit.”  Though contraception had hitherto been condemned by all major branches of Christianity, accommodating modernity was in the air, and the Church of England led the way by endorsing birth control (within marriage) in 1930.  Almost all Protestants quickly followed suit.  Indeed, by 1960 few evangelicals even considered it a moral issue.  Nor did they pay much attention to the U.S. Supreme Court’s 1965 decision in Griswold v. Connecticut that granted married Americans the right to purchase contraceptives.  Issuing that decision, Justice William O. Douglas admitted that nothing in the Constitution justified it, but he insisted there must be somewhere therein a “right to privacy, including the right to access to birth control, in what he defined as ‘penumbras’ that were ‘formed by emanations from those guarantees that help give them life and substance’” (p. 20).  The Court’s rationale in Griswold would soon be wielded first to justify abortion in Roe v. Wade and then same-sex marriage in Obergefell v. Hodges.  “Erotic liberty” was declared a constitutional right!  

Along with contraception, divorce had been “inconceivable for most Christians throughout the history of the Christian church” (p. 22).  But during the 1960s no-fault divorce—first signed into law by then-governor Ronald Reagan in California in 1969—soon made it easier to terminate a marriage than to dissolve a business partnership.  The disastrous results—broken families and fatherless children—were clearly unintended but ultimately momentous.  But most churches failed either to anticipate the revolution or to deal wisely with it.  “No-fault divorce is a rejection of the scriptural understanding of covenant that stands at the very heart of the Christian gospel.  Nevertheless Christian churches generally surrendered” to the culture “and abdicated their moral and biblical responsibility to uphold marriage in its covenantal essence” (p. 24).  Indeed, by failing to strongly resist no-fault divorce evangelicals lost “credibility to speak to the larger issue of sexuality and marriage” (p. 25).  In yet another realm—reproductive technologies—few evangelicals have shown either understanding of or sensitivity to the ethics involved.  So just as the Pill allowed sex without babies, so too in vitro fertilization, surrogate motherhood, and other technologies allowed women (independently of men) to have babies without sex.  

Only when dealing with the fourth of Mohler’s factors—cohabitation—have evangelicals seemed alert to the sinful nature of the sexual revolution.  But on this score they now occupy an increasingly small segment of the culture.  What was once condemned as “living in sin” or “shacking up” has become widely accepted in America.  Most women under 30 who now bear children do so while still unmarried.  Thereby they virtually ensure their children’s failure in many important areas, and they represent what Tom Smith calls “‘a massive change in one generation, a change that is so great that the majority of parents of young children today were raised in a different type of family than they live in today’” (p. 3).  

Having scanned the historical components of the sexual revolution, Mohler turns to the recently rapid successes of the homosexual movement, culminating in the redefining of marriage itself.  Though in 2004 eleven states passed initiatives defending traditional marriage, less than a decade later “not one effort to define marriage as the exclusive union of a man and a woman succeeded” (p. 34).  Younger people in particular approve same-sex relationships and activities.  A monumental moral revolution, fueled by the entertainment industry, is in process.  Its success was carefully crafted and implemented by cunning activists who especially worked within academic disciplines and liberal churches to validate their cause.  Remarkably:  “‘Homophobia’ is now the new mental illness and moral deficiency, while homosexuality is accepted as the new normal” (p. 41).  Liberal churchmen now declare homosexuality not sinful but an optional lifestyle, and many evangelicals (e.g. Brian McLaren and Tony Campolo), all too frequently biblically compromised and anxious to be compassionate, have joined the chorus supporting the new morality now established in the legalization of same-sex marriage.  

Consequently, Christians committed to a deeply-biblical and traditional ethic must now awaken and begin patiently responding to the revolution.  So, Mohler reminds us:  “In the Christian understanding, same-sex marriage is actually impossible so we cannot recognize same-sex couples as legitimately married” (p. 54).  Christians must remember that no government can create or define marriage—that’s already been done by God and revealed in both Nature and Scripture.  “Evangelical Christians, in particular, should recognize natural law as a priceless testimony to the comprehensive grace of God, a testimony that displays his glory and pattern for human flourishing” (p. 63).  

Turning to the latest expansion of sexual rights, the “transgender revolution” promoted by Oprah Winfrey et al., Mohler notes that “an entire civilization” has been turned “upside down” by severing “gender” from “sex.”  Decades ago the politically correct establishment decreed that though there are only two biological sexes there may well be a variety of self-selected genders.  So some schools now ban gendered nouns (boys and girls) and pronouns (he and she).  This is because, as Katy Steinmetz explains:  “There is no concrete correlation between a person’s gender identity and sexual interests; a heterosexual woman, for instance, might start living as a man and still be attracted to men.  One oft-cited explanation is that sexual orientation determines who we want to go to bed with and gender identity determines what you want to go to bed as” (p. 68).  Reality is whatever we want it to be!  And we now face an “omnigender” collage that includes “Queer/Questioning, Undecided, Intersex, Lesbian, Transgender/Transsexual, Bisexual, Allied/Asexual, Gay/Genderqueer”!  Just whatever!

Amidst all this confusion, defenders of traditional marriage face a daunting challenge!  Fortunately, the Bible provides a solid basis for beginning to rebuild families that nurture healthy children within the context of a divinely-blessed, lifelong, monogamous covenant.  There is, in fact, only one way to live rightly together as men and women!  Whatever transpires in the surrounding culture, God’s people have been given clear commandments regarding sexual relations.  And we must also struggle to preserve the legal “right to be Christian” in an increasingly anti-Christian country wherein “Erotic liberty has been elevated as a right more fundamental than religious liberty” (p. 124).  It’s important to listen carefully when President Obama and his administrative enforcers shift from the language of the Constitution—the “free exercise” of religion—to the freedom to “worship,” which can easily be confined within the walls of a “house of worship.”  Certainly we must always speak the truth in love and seek to reach all men and women with the grace of the Gospel.  But to “bear witness to Christ and the gospel in contemporary culture,” as Robert George says, means “to make oneself a ‘sign of contradiction’ to those powerful forces who equate ‘progress’ and ‘social justice’ with sexual license.” 

                                        * * * * * * * * * * * * * * * * * * * 

The sexual revolution, now culminating in the legalization of same-sex marriage and celebration of transgender declarations, triumphed within a culture devoid of a Natural Law ethos.  Though slowly giving way to an evolutionary worldview, wherein there are no established essences to things, the Natural Law (rooted in Aristotle and Cicero, Augustine and Aquinas and America’s Founding Fathers) still provides a rationale for and defense of heterosexual marriage that forever makes sense.  Conjoined with an earlier treatise he co-authored with Robert George and Sherif Girgis—What Is Marriage?  Man and Woman:  A Defense (New York:  Encounter Books, c. 2012)—Ryan T. Anderson’s Truth Overruled:  The Future of Marriage and Religious Freedom (Washington, D.C.:  Regnery Publishing, c. 2015) merits serious study and distribution.  “With its decision in Obergefell v. Hodges, the Supreme Court of the United States,” Anderson asserts, “has brought the sexual revolution to its apex—a redefinition of our civilization’s primordial institution, cutting its link to procreation and declaring sex differences meaningless” (#80).  Five unelected, elitist judges have rashly claimed the power to trash the most important association known to man!  The Court not only “decided a case incorrectly—it has damaged the common good and harmed our republic” (#1009).  

Consequently, folks who dare declare their support for traditional, heterosexual marriage are now pilloried as bigots (akin to racists) committed to immoral forms of sexual discrimination.  Christians espousing heterosexual monogamy and everyone who dares condemn sodomy are now instructed “to take homosexuality off the sin list.”  Facing the fact that the ground has shifted around us, Christians must, Anderson says, clearly think through how to respond, taking to heart the patience and perspicuity of the pro-life movement.  We must, first, identify and reject the judicial activism so evident in both Roe v. Wade and Obergefell v. Hodges.  Poor jurisprudence can, and must, be refuted on the highest of intellectual levels.  Then we must take steps to preserve our constitutionally guaranteed freedoms “to speak and live according to the truth” (#209).  

To do so, Princeton Professor Robert George says:  “‘We must, above all, tell the truth:  Obergefell v. Hodges is an illegitimate decision.  What Stanford Law School Dean John Ely said of Roe v. Wade applies with equal force to Obergefell:  “It is not constitutional law and gives almost no sense of an obligation to try to be.”  What Justice Byron White said of Roe is also true of Obergefell:  “it is an act of ‘raw judicial power.’”  The lawlessness of these decisions is evident in the fact that they lack any foundation or warrant in the text, logic, structure, or original understanding of the Constitution.  The justices responsible for these rulings, whatever their good intentions, are substituting their own views of morality and sound public policy for those of the people and their elected representatives.  They have set themselves up as super-legislators possessing a kind of plenary power to impose their judgments on the nation.  What could be more unconstitutional—more anti-constitutional—than that?’” (#1031).  Importantly, Professor George’s strong critique of the Court can be found, in equally emphatic language, in the dissenting opinions of the four justices (John Roberts, Antonin Scalia, Samuel Alito, Clarence Thomas) in Obergefell.  

The author’s “goal is to equip everyone, not just the experts, to defend what most of us never imagined we’d have to defend:  our rights of conscience, our religious liberty, and the basic building block of civilization—the human family, founded on the marital union of a man and a woman” (#237).  “Whatever the law or culture may say, we must commit now to witness to the truth about marriage:  that men and women are equal and equally necessary in the lives of children; that men and women, though different, are complementary; that it takes a man and a woman to bring a child into the world.  It is not bigotry but compassion and common sense to insist on laws and public policies that maximize the likelihood that children will grow up with a mom and a dad” (#267). 

To declare this truth we must first insist that words mean something.  Marriage can only describe a conjugal union, the fleshly union of a male and female human being.  To accept the Supreme Court’s verdict is to grant its faulty “assumption that marriage is a genderless institution” (#288), nothing more than an agreement between persons to enjoy some sort of emotionally rewarding relationship.   The Court’s position was, of course, largely set in place by the sexual revolutionaries who promoted cohabitation, no-fault divorce, single parenting, and the hook-up culture dramatically evident on university campuses.  

Still more, as a conjugal union marriage is designed for and ordered to procreation, a fact vociferously denied by sexual revolutionaries.  In the marital act two become one flesh.  It’s not an ethereal, spiritual bond between “loving” persons but an intensely physical act, uniting a man and woman in a thoroughly “comprehensive” manner.  Note, Anderson says, this “parallel:  The muscles, heart, lungs, stomach and intestines of an individual human body cooperate with each other toward a single biological end—the continued life of that body.  In the same way, a man and a woman, when they unite in the marital act, cooperate toward a single biological end—procreation” (#407).  Bringing children into the world entails forging intact families suitable for their rearing.  “Marriage is based on the anthropological truth that men and women are complementary, the biological fact that reproduction depends on a man and a woman, and the social reality that children deserve a mother and a father” (#470).  

To redefine marriage in accord with the sexual revolution charts a dire course for our future, says Anderson:  “The needs and rights of children will be subordinated to the desires of adults.  The marital norms of monogamy, exclusivity, and permanence will be weakened.  Unborn children will be put at even more risk than they already are.  And religious liberty—Americans’ ‘first freedom’—will be threatened” (#692).  We already see the harms done by single parenting, whereby children suffer on almost every score—increased poverty, abuse, delinquency, substance addictions, dysfunctional relationships.  So too a “study undertaken by sociologist Mark Regnerus of the University of Texas demonstrated the negative impacts among children being raised in the context of a same-sex home” (#1509).  

And there’s more to come as proponents of erotic rights envision moving beyond same-sex marriage to “legally recognizing sexual relationships involving more than two partners” (#765).  The California legislature recently passed a bill allowing a child to have three legal parents.  Though the governor vetoed it, such legislation will quickly cascade from similar chambers in the wake of the Supreme Court’s recent decision.  Yet other theorists propose temporary marriage licenses—leasing a spouse, much as you lease a house, for as long as he or she suits you.  Once marriage has been reduced to a “lifestyle option” valued primarily for its benefits to autonomous adults, little remains of that most essential “little platoon,” the family.  And precisely that, for the sexual revolutionaries, has been the purpose all along.  As Michael Lerner and the Weathermen said, “smash monogamy.”  It all fits nicely into the agenda of Marx and Engels, who placed the abolition of families high on their list in order to create a pure, socialist society.  

Turning to the question of what we can now do, Anderson leads us back to the carefully-wrought, timelessly true theological position of the Christian Church.  The creation account in Genesis provides a wonderful prescription whereby a man and a woman form a divinely-ordained covenant best illustrated in “God’s own covenant-making love in Jesus Christ” (#1670).  This new covenant of grace reaffirms the old covenant, with its rules regarding sex and marriage.  “Sex, gender, marriage, and family all come together in the first chapters of Scripture in order to make clear that every aspect of our sexual lives is to submit to the creative purpose of God and be channeled into the exclusive arena of human sexual behavior—marriage—defined clearly and exclusively as the lifelong, monogamous union of a man and a woman” (#1739).  

Today, of course, there are revisionist thinkers within the religious world who explain away the clear words of Scripture and insist the modern world requires a new morality better attuned to its desires.  In their view, convictions rooted in antiquity have no more value than pre-scientific notions regarding astronomy or immunology.  To such thinkers—and the many churches embracing their views—orthodox believers “must speak a word of compassionate truth.  And that compassionate truth is this:  homosexual acts are expressly and unconditionally forbidden by God through his Word, and such acts are an abomination to the Lord by his own declaration” (#1778).  Strong words!  But compassion need not walk weakly, extending approval to everyone in every situation!  Without a mental toughness, we will fail to resist the sledgehammer blows now bludgeoning traditional marriage.

Similarly, we dare not stand aside (under the auspices of kindness and tolerance) while this nation’s religious liberties are attacked.  Revolutionaries of all sorts, sexual revolutionaries included, know they must establish their ideologies in a people’s legal structures.  No one thinking clearly about America’s recent history can avoid concluding that Christians who dare deviate from the erotic revolution’s dictates will be punished.  Given the decades-long shift to administrative law courts (invisible to many of us), people are increasingly fined for failing to measure up to the precepts of sexual “equality” or mouthing “hate speech.”  So florists and bakers and photographers refusing to participate in gay weddings have been found guilty and harshly fined for their conscience-bound commitment to traditional marriage.   “Erotic liberty” outweighs religious liberty and threatens to entirely subvert it.  

Rightly read, Truth Overruled and We Cannot Be Silent should prompt us to share their truths and support their proposals if we care for our families, churches, and a good society.  

275 Poverty & Economy

While teaching Ethics in the 1970s I often used, as a supplementary text, Ron Sider’s Rich Christians in an Age of Hunger.  Written by a Mennonite theologian committed to alleviating hunger and poverty around the world, it challenged readers to thoughtfully address pressing world problems by citing biblical texts and explaining economic structures—generally informed, I now realize, by Marxist critiques (the rich exploit the poor) and Keynesian prescriptions (deficit spending).  Back then I thought Sider surely knew more than I and properly assessed the issues he addressed.  Two decades later, however, Sider announced that though his biblical perspectives were defensible his economic positions had been skewed by his misunderstanding of free market economics.  Unfortunately, like Sider, too many theologians and preachers make economic pronouncements quite untethered to economic wisdom.  

Thus it’s good to consider a book I wish I’d had in the ‘70s, written by a fine theologian (Wayne Grudem, PhD, University of Cambridge, now teaching at Phoenix Seminary) and a skilled economist (Barry Asmus, PhD, now serving as a senior economist at the National Center for Policy Analysis), titled The Poverty of Nations:  A Sustainable Solution (Wheaton:  Crossway Books, c. 2013).  Blending their expertise, they seek “to provide a sustainable solution to poverty in the poor nations of the world, a solution based on both economic history and the teachings of the Bible” (p. 25).  They provide a richly-documented and amply-illustrated treatise, engaging and understandable for anyone concerned with rightly alleviating poverty in our world.

In an endorsement that sums up the book’s message, Brian Wesbury, former Chief Economist for the Joint Economic Committee of the U.S. Congress, says:  “I became an economist because I fell in love with the idea that a nation’s choices could determine whether citizens faced wealth or poverty.  Thirty years of research has led me to believe that wealth comes from a choice to support freedom and limited government.  I became a Christian because I fell in love with Jesus Christ.  The Bible says we were created in God’s image and that while we should love our neighbor, we are also meant to be creators ourselves.  I never thought these were mutually exclusive beliefs.  In fact, I believe biblical truth and free markets go hand in hand.  I have searched far and wide for a book that melds these two worldviews.  Asmus and Grudem have done it!  A top-flight economist and a renowned theologian have put together a bullet-proof antidote to poverty.  It’s a tour de force.  The church and the state will find in this book a recipe for true, loving, and lasting justice.”  High praise indeed!

Asmus and Grudem first focus on the right “goal” to pursue:  increasing a nation’s GDP, which means producing valuable goods and services.  Though popular programs for redistributing existing goods—through taxation or apparently benevolent “aid” programs, or “debt-relief” subsidies for poor countries, or “fair trade” crusades allegedly helping poor coffee farmers, or printing more money—may momentarily appear to reduce poverty, ultimately such endeavors do little to improve economic conditions.  Nor is depending on donations God’s ideal for human flourishing.  “God’s purpose from the beginning has been for human beings to work and create their own goods and services, not simply to receive donations” (p. 72).  Certainly there is an important place for charitable assistance and governmental “safety nets,” but real economic development requires wealth-creation through the creativity of a people adding to their own community’s goods and services.  In short:  “Producing more goods and services does not happen by depending on donations from other countries; by redistributing wealth from the rich to the poor; by depleting natural resources; or by blaming factors and entities outside the nation, whether colonialism, banks that have lent money, the world economic system, rich nations, or large corporations.”  Only one objective should prevail, the “primary economic goal” of “continually producing more goods and services, and thus increasing its GDP” (p. 106).

To justify their case, Asmus and Grudem carefully analyze and reject eight historical “economic systems that did not lead to prosperity”—hunting and gathering; subsistence farming; slavery; tribal ownership; feudalism; mercantilism; socialism and communism; the welfare state and its illusory equality.  Though certain advantages may be associated with each of these systems, they were all basically stagnant and generally enriched only a small percentage of the population.  Surveying the past, it seems clear that only the “free market” system facilitates widespread economic prosperity.  Rightly defined:  “A free-market system is one in which economic production and consumption are determined by the free choices of individuals rather than by governments, and this process is founded in private ownership of the means of production” (p. 131).

The authors carefully distinguish between the free-market system they support and the “crony” or “state” or “oligarchic” forms of capitalism they reject.  In particular, to function rightly, free markets require the “rule of law” that extends to political and business elites as well as ordinary folks.  Poor countries almost always have poor (i.e. corrupt or incompetent) leaders!  Property must be protected, contracts and deeds must be upheld, and harmful products must be banned.  A free market cannot work amidst anarchy, so a good if limited government is essential.  And the free market also needs a stable currency and low taxes to encourage the development of goods and services.  Various aspects of free market economics—specialization, trade, competition, prices, profits and losses, entrepreneurship—are explained and defended.  “The genius of a free-market system is that it does not try to compel people to work.  It rather leaves people free to choose to work, and it rewards that work by letting people keep the fruits of their labor” (p. 133).

Neither one person nor any bureaucracy guides the free-market economy—the collective wisdom of countless individuals making choices enables it to work well.  This meshes well with the Bible’s celebration of “human freedom and voluntary choices” (p. 188).  Freedom is truly essential for human flourishing of any sort.  Thus Asmus and Grudem carefully detail “twenty-one specific freedoms” (e.g. to own property, to buy and sell, to travel and relocate, to trade, to start businesses, to work at any job) that should be protected in any good society.  Sustained by a free people, the free market works!  “With no central director or planner, it still enables vast amounts of wealth to be created, and the benefits to be widely distributed, in every nation where it is allowed to function.  No other system encourages everyone to compete and cooperate, and gives people such economic freedom to choose and produce, and thus enhances prosperity.  Slowly but surely, countries around the world are seeing the win-win nature of a free-market system” (p. 184).

There is, furthermore, a moral as well as economic component to the free market.  Obviously wrongdoing occurs within free-market economies!  No Christian should be alarmed at the reality of sin pervading all areas of human behavior!  But the opportunities for massive corruption are more strikingly evident in the socialistic, state-controlled economies prevalent throughout the developing world.  By encouraging individual freedom and responsibility, free markets recognize the intrinsic dignity of persons created in the image of God who create the goods and services basic for human flourishing.  Taking care of oneself, acting in one’s own self-interest, is to be distinguished from covetousness.  As Brian Griffiths says:  “‘From a Christian point of view therefore self-interest is a characteristic of man created in the image of God, possessed of a will and a mind, able to make decisions and accountable for them.  It is not a consequence of the Fall.  Selfishness is the consequence of the Fall and it is the distortion of self-interest when the chief end of our lives is not the service of God but the fulfillment of our own ego’” (p. 208).

To Rick Warren, who has energetically supported programs around the world while pastoring Saddleback Community Church, this book merits serious attention from evangelicals.  He’s traveled extensively and “witnessed firsthand that almost every government and NGO (non-profit) poverty program is actually harmful to the poor, hurting them in the long run rather than helping them.  The typical poverty program creates dependency, robs people of dignity, stifles initiative, and can foster a ‘What have you done for me lately?’ sense of entitlement.”  Thus, Warren continues:  “The biblical way to help people rise out of poverty is through wealth creation, not wealth redistribution.  For lasting results, we must offer the poor a hand up, not merely a handout.”  To enable us to do so, The Poverty of Nations “should be required reading in every Christian college and seminary, by every relief and mission organization, and by every local church pastor.”  At Saddleback, Warren says, “this book will become a standard text that we will use to train every mission team we have in 196 countries.”

 * * * * * * * * * * * * * * * * * * * * *

For several decades Henry Hazlitt was one of the most eminent “public intellectuals” in America—writing columns on economics for the New York Times and contributing to other periodicals, publishing books, and engaging in discussions with the nation’s leading thinkers.  In the opinion of H.L. Mencken, he was “one of the few economists in human history who could really write.”  At a dinner honoring him, Ludwig von Mises declared:  “In this age of the great struggle in favor of freedom and the social system in which men can live as free men, you are our leader.  You have indefatigably fought against the step-by-step advance of the powers anxious to destroy everything that human civilization has created over a long period of centuries. . . .  You are the economic conscience of our country.”

In his most acclaimed and influential treatise, Economics in One Lesson (New York:  Three Rivers Press, c. 1946; re-issued in 1961 and updated in 1978), Hazlitt relied on basic common sense to encourage common folks to grasp economic principles and thereby become better citizens.  With Adam Smith he believed that what is “prudence in the conduct of every private family can scarce be folly in that of a great kingdom.”  Thus the simple difference between good and bad economists is this:  “The bad economist sees only what immediately strikes the eye; the good economist also looks beyond.  The bad economist sees only the direct consequences of a proposed course; the good economist looks also at the longer and indirect consequences.  The bad economist sees only what the effect of a given policy has been or will be on one particular group; the good economist inquires also what the effect of the policy will be on all groups” (p. 16).  Good economists, as the architects of the Iroquois Confederacy recognized centuries ago, should propose and implement policies with an eye on “the seventh generation,” not the current crowd.

Above all, Hazlitt sought to refute some prevailing “economic fallacies” that “have almost become a new orthodoxy” (p. 9).  The fallacies he analyzed were primarily those espoused by John Maynard Keynes and his disciples shaping the New Deal.  Brilliant thinkers such as Keynes are often “bad” inasmuch as they dismiss concerns for the long-term impact of their policies.  They think only of themselves and their immediate problems.  But if we care for others, “the whole of economics can be reduced to a single lesson and that lesson can be reduced to a single sentence.  The art of economics consists in looking not merely at the immediate but at the longer effects of any act or policy; it consists in tracing the consequences of that policy not merely for one group but for all groups” (p. 17).  Unfortunately, folks follow men such as Keynes, who are “regarded as brilliant economists, who deprecate saving and recommend squandering on a national scale as the way of economic salvation; and when anyone points to what the consequences of these policies will be in the long run, they reply flippantly, as might the prodigal son of a warning father:  ‘In the long run we are all dead’” (p. 16).

To illustrate his thesis, Hazlitt explains the “broken-window fallacy.”  If a baker’s window is broken, he must pay a repairman to fix it.  The repairman thus has more money to spend and that money trickles through the village enriching a variety of people.  What’s not recognized, however, is the baker’s lost savings—money he could have used to buy a new suit or expand his business.  To imagine the vandal’s destructive act could stimulate economic development and add to the community’s welfare is an illusion.  “Yet the broken-window fallacy, under a hundred disguises, is the most persistent in the history of economics” (p. 25).  Wartime spending, for example, certainly seems to stimulate the economy, providing employment and bolstering selected industries, but it does enormous harm in the process.
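A simple tally (using round numbers of my own, not Hazlitt’s) makes the seen and the unseen concrete:

     Seen:    the glazier gains $250 of business repairing the window.
     Unseen:  the tailor loses the $250 suit the baker would otherwise have bought.
     Net:     the same $250 changes hands either way, but the village ends with
              one window and no suit instead of one window and a new suit;
              it is one suit ($250 of real goods) poorer.

Once both transactions are written down, nothing in the ledger is hidden; the fallacy survives only because the suit that was never made has no spokesman.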

So too government spending saps a people’s economic strength.  Though politicians from Franklin D. Roosevelt to Barack Obama have continually presented it “as a panacea for all our economic ills,” in the long run all such expenditures must be paid for by someone, some way, some day.  Just as a broken window stimulates certain activity and enriches certain workers, so too does government spending.  “The government spenders forget that they are taking the money from A in order to pay it to B.  Or rather, they know this very well; but while they dilate upon all the benefits of the process to B, and all the wonderful things he will have which he would not have had if the money had not been transferred to him, they forget the effects of the transaction on A.  B is seen; A is forgotten” (p. 37).

The “Forgotten Man” was more fully described by William Graham Sumner in 1883:  “‘As soon as A observes something which seems to him to be wrong, from which X is suffering, A talks it over with B, and A and B then propose to get a law passed to remedy the evil and help X.  Their law always proposes to determine what C shall do for X, or, in the better case, what A, B and C shall do for X . . . .  What I want to do is to look up C . . . .  I call him the Forgotten Man . . . .  He is the man who never is thought of.  He is the victim of the reformer, social speculator and philanthropist, and I hope to show you before I get through that he deserves your notice both for his character and for the many burdens which are laid on him’” (p. 195).  To which Hazlitt adds:  “It is C, the Forgotten Man, who is always called upon to stanch the politician’s bleeding heart by paying for his vicarious generosity” (p. 195).

But since people rarely want to pay more taxes, politicians generally resort to printing and spending more money.  Their grandstanding easily garners votes in the next election and often appears to help a nation, but the invisible wheels of inflation will surely (if slowly) destroy her, for “inflation itself is merely a form, and a particularly vicious form, of taxation” (p. 31), ultimately harming most those least able to afford it.  Most surely it is “the opium of the people” (p. 174).  Yet throughout human history:  “Each generation and country follows the same mirage.  Each grasps for the same Dead Sea fruit that turns to dust and ashes in its mouth.  For it is the nature of inflation to give birth to a thousand illusions” (p. 171).

In a series of short chapters, Hazlitt explains and condemns a variety of government policies and programs—government loans (whether for houses or schooling); subsidies (for both farmers and businesses); “full employment” (attained only under totalitarian regimes); regulations (whether for industries or favored species); protective tariffs (always favoring special producers rather than consumers); foreign aid (whether military or economic); “parity” prices (beloved by farmers); government price-fixing (whether during WWII or under Richard Nixon); rent control (favoring an elite, enriched class of renters); excessive union wage-rates (the constant goal of both the AFL-CIO and NEA); minimum wage laws (inevitably costing jobs and harming production); unemployment benefits (skyrocketing under Barack Obama); “just” prices and wages (arbitrarily set by bureaucrats rather than established by the market); etc.

In the book’s final edition Hazlitt addressed “the lesson after thirty years.”  In short:  little has changed since 1946.  Economists and politicians continue to pursue deficit spending policies that inflate the currency and cannot but harm the nation in the long run.  Thus within a century the American dollar has shrunk to a nickel!   
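The arithmetic behind that claim (my rough calculation, not Hazlitt’s) is worth spelling out:  if a dollar’s purchasing power falls to a nickel over a hundred years, the implied average annual inflation rate r satisfies

     (1 + r)^100 = 1 / 0.05 = 20,   so   r = 20^(1/100) - 1 ≈ 0.03.

That is, a seemingly modest three percent annual inflation, compounded for a century, suffices to erase some ninety-five percent of a currency’s value.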

* * * * * * * * * * * * * * * * * * *

As he watched Roosevelt’s New Deal bear fruit in Lyndon Johnson’s Great Society (whereby a tenfold growth in government activities took place), Henry Hazlitt protested the mushrooming power and size of the federal government and its necessarily dictatorial “Planned Economy,” setting forth his objections in Man vs. the Welfare State (New Rochelle, N.Y.:  Arlington House, c. 1969).  He noted that FDR’s generation of politicians had promised not only to “bring perpetual full employment, prosperity, and ‘economic growth,’ but solve the age-old problem of poverty overnight.  And the end result is not merely that accomplishment has fallen far short of promises, but that the attempt to fulfill the promises has brought an enormous increase in government spending, an enormous increase in the burden of taxes, chronic deficits, chronic inflation; ‘Social Security’ has brought an ominous increase in social insecurity” (#92).

Basically Hazlitt tries to help the reader understand the short-sightedness of welfare state economics.  There simply cannot be “salvation through government spending” because the deficit spending involved is no better than “creating money out of thin air.”  It wrongly equates income (or money) with goods and services, which are the only true measure of a nation’s wealth.  Nor can we evade the ominous consequences of indebtedness by arguing (as did Harvard’s John Kenneth Galbraith and kindred “liberal” economists) that “we owe it to ourselves.”  Unfortunately, as David Hume observed two centuries earlier, the practice of “contracting debt will almost infallibly be abused in every government.”  And rather than repay the debt, governments inflate (and thus debase) the currency—effectively repudiating it!  Putting it bluntly, “the government’s creditors have been swindled” (#254).

Swindlers of all sorts succeed by subtly misleading their victims, and “the welfare state can arise and persist only by cultivating and living on a set of economic delusions in the minds of the voters” (#527).  Among these delusions are the worth of minimum wage laws, price controls, consumer protection regulations, relief programs, Social Security, guaranteed annual income, guaranteed jobs, the negative income tax, and various “soak the rich” endeavors.  The swindlers assert, through the mouths of prominent politicians, that “social justice” demands that those who can pay for the welfare state be forced to do so.  Listening to such rhetoric, almost all voters assume the “rich” are people much richer than they—only later do they awaken to the fact that they themselves are the “rich” who must pay the bills!  Admitting they will rob Peter to pay Paul, they imagine they are only depriving the “rich” Peter of his property.

President Lyndon Johnson once said:  “‘We are going to try to take all of the money that we think is unnecessarily being spent and take it from the “haves” and give it to the “have nots” that need it so much’” (#3022).  The main mechanism for doing this is the progressive income tax, established by American Progressives in the 16th Amendment a century ago.  Those promoting it fully understood its baleful economic prospects, but they wanted to use it for social transformation.  “In the Communist Manifesto of 1848, Marx and Engels frankly proposed ‘a heavy progressive or graduated income tax’ as an instrument by which ‘the proletariat will use its political supremacy to wrest, by degrees, all capital from the bourgeois, to centralize all instruments of production in the hands of the State,’ and to make ‘despotic inroads on the right of property, and on the conditions of bourgeois production’” (#1537).

In fact, “the government has nothing to give to anybody that it doesn’t first take from someone else,” and almost all welfare state policies illustrate “the shrewd observation of the French economist, Bastiat, more than a century ago:  ‘The State is the great fiction by which everybody tries to live at the expense of everybody else’” (#1028).  Doing so inevitably leads to disasters such as were evident half a century ago in Uruguay, which had embraced “democratic socialism” a century ago, or in Venezuela today.  The cost of living balloons and the GNP declines.  Ultimately, the welfare state cannot but destroy the economy!

274 A Model Historian: Rick Kennedy

During the 42 years I taught in Nazarene universities I was privileged to work alongside some truly gifted scholars who lived out their calling as Christian professors.  Dr. Rick Kennedy, a professor of history at Point Loma Nazarene University who received his Ph.D. from the University of California, Santa Barbara, was certainly one of these—combining commitments to teaching and research, conscientiously upholding the orthodox Christian tradition, working winsomely with students, and actively worshiping in San Diego’s First Presbyterian Church.  In addition, he has for years worked diligently within the Conference on Faith and History and now serves as that organization’s secretary.  In a laudatory review of Kennedy’s latest work, Thomas Kidd, a prominent professor of history at Baylor University, commends him as “a formidable academic historian” who has written “many serious books and articles on American intellectual history.”

Kennedy has recently added to his list of publications a fine biography, part of an excellent series of religious biographies edited by Mark Noll, entitled The First American Evangelical:  A Short Life of Cotton Mather (Grand Rapids:  William B. Eerdmans Publishing Company, c. 2015).  “In this book,” Kennedy says, placing Mather (the son of Increase Mather, an equally significant Massachusetts clergyman) in his historical milieu, “I will focus on Cotton and his self-conscious desire to tug against the slide of genteel Protestantism” (#152).  And as the title indicates, he wants to identify Cotton Mather as the “first evangelical”—a position usually assigned to leaders of the First Great Awakening such as Jonathan Edwards, the great theologian, and George Whitefield, the wondrously winsome itinerant English evangelist.  Two decades before the Great Awakening broke out in the 1730s, many of its elements emerged under Mather’s ministry in Boston:  “Thousands of people and a large number of churches rallied to the way Cotton Mather articulated and modeled what he called an ‘all day long faith’ and described as a way of walking ‘to the very top of Christianity’” (#162).

Along with many Puritans, Mather refused to be labeled a “Calvinist” and selected the term “Eleutherian” (a Greek word for freedom) to denote his singular commitment to the New Testament message, for as St Paul declared to the Galatians:  “It is for freedom that Christ has set us free!”  In the 1690s, he led a small band of “Eleutherians” who were committed to the “evangelical interest,” a more intense form of discipleship and piety than was then evident in Boston.  He became “a standard-bearer for a my-utmost-for-his-highest type of Christianity, which moderates saw as too extreme” (#1733).  Thus to Kennedy:  “Herein lies the birth of the evangelical tradition in America:  A coalition of ministers and laypeople rallied to Cotton Mather’s call to a zealous, freedom-loving, Bible-focused Protestantism that was open to spiritual activities and communications” (#1748).

Born in Boston in 1663, Cotton was the “oldest child of Increase Mather and Maria Cotton.  Both of his grandfathers, Richard Mather and John Cotton, were revered founders of the colony, powerful ministers, and model Puritans” (#338).  But young Mather witnessed “the last decades of Puritan Boston” while preparing (through the Boston Latin School and Harvard College) to join his father in pastoring Boston’s prestigious North Church, “probably the largest and richest Protestant congregation in America” (#1090).  He soon demonstrated prowess as a preacher as well as effectiveness in pastoral visitation, working with small prayer groups, launching jail ministries, promoting missions to the Indians, and fervently praying both publicly and by himself in his study.  “‘My life is almost a continual conversation with heaven,’ Cotton wrote in 1713” (#770).  Still more, he sought to follow “‘the advice of the ancients:  If you wish to be always with God, always pray, always read’” (#480).

He devoutly pursued a scholarly life, acquiring a vast library (believing “his study was a kind of holy ground”) and writing prolifically on a variety of subjects.  Above all he loved and lived in the Bible, continually seeking to understand and expound it.  Attuned to intellectual currents in Europe, he “was the first important scholar in America to realize that a battle for the Bible was brewing among Protestant scholars” (#2268).  Many were taking a highly critical approach (manifest in the rationalism personified by Spinoza) that disregarded the Supernatural.  To effectively defend the traditional, orthodox commitment to the Bible’s trustworthiness—indeed its infallibility—he embraced and articulated the philosophical “reasonableness” (rooted in Aristotle’s Topics) akin to the “courtroom jurisprudence” that Professor Kennedy has effectively emphasized throughout his publications.  

For Cotton Mather, such reasonableness mines a multitude of sources, so the testimonies of the Christian tradition, preserved in its classic texts, merit respectful attention.  He considered himself an historian—history being “‘one of the most needful and useful accomplishments for a man that would serve God’” (#2101)—and Kennedy argues he was “the greatest American historian of the seventeenth and eighteenth centuries” (#1532).  “In the pulpit he upheld the Bible as divine testimony.  In a book he titled Reasonable Religion, he declared that Christians are not reasonable ‘if we don’t receive that book which we call the Bible or, the Scripture, as a Divine Testimony’” (#612).  Similarly, credible witnesses to miracles (whether in the Bible or in history) should not be dismissed merely because they testify to supernatural events.  In that spirit he wrote a religious and political history of New England called Magnalia Christi Americana, the “Great American Deeds of Christ,” a work that made him “an internationally known historian” when it was published in 1702 (#1542).  Indeed, he had become “the most famous American in the British Empire” (#1758).

Sadly enough, few Americans today could identify Cotton Mather.  If they’ve heard of him, they likely remember a minor incident—his peripheral role in the notorious Salem witch trials.  When he heard reports of girls engaged in witchcraft, he urged that they be brought into “the kind of healing program that had worked for” some disturbed girls he’d worked with in Boston.  Secular officials intervened, however, and the trials were held.  One of the judges, Samuel Sewall, “later declared to his church that he was willing to ‘take the blame and shame’ of the trials upon himself.  In his history of New England, Cotton agreed that the executions proceeded from mistaken principles” (#1367).  Though he is frequently “associated with the witch trials,” he never attended them; “nor did he have any authority within the situation” (#1367).  The extent of his influence was urging leniency in dealing with the accused.

As Thomas Kidd indicates, this biography of Cotton Mather is a “gleefully revisionist” treatise.  Professor Kennedy seeks to show his subject in a positive light, quite different from the dour portraits drawn by more muckraking writers.  Committed to his understanding of  “reasonableness,” Kennedy is as open to Mather’s reports of God’s providential and miraculous workings in New England as Mather was open to the same realities in both Scripture and history.  The book’s thesis, identifying Mather as the “first American Evangelical,” will certainly engender scholarly debates, but it seems reasonable to me to find the same spiritual hungers and convictions cultivated by Cotton Mather in New England in 1715 surfacing with more clarity and power in the First Great Awakening in the 1730s.  

* * * * * * * * * * * * * * * * 

Professor Rick Kennedy has long pondered the proper way to research and reason, to think and write and teach history—especially as it comes to bear on the Christian Tradition.  Early in his career he published an excellent article entitled “Miracles in the Dock:  A Critique of the Historical Profession’s Special Treatment of Alleged Spiritual Events” in Fides et Historia, XXVI:2 (Summer 1994), wherein he mounted a vigorous attack on David Hume’s flawed rejection of miracles and called historians to recover an earlier, better way of doing history.  Kennedy then expanded that argument in a scholarly monograph titled A History of Reasonableness:  Testimony and Authority in the Art of Thinking (Rochester:  University of Rochester Press, c. 2004).

He began his treatise with a story John Locke told in his Essay Concerning Human Understanding.  A 17th century King of Siam (modern Thailand) refused to believe a Dutch ambassador’s description of ice!  Having never seen frozen water, he could not imagine such a thing existed; his personal experience negated the report.  John Locke himself, trying to find reasons to believe others’ experiences as well as his own, suggested that tentatively accepting the testimonies of credible witnesses makes sense.  So the King of Siam should have at least taken the ambassador at his word and subsequently sought to see if other reports confirmed his assertion.  For Kennedy this story “gets at the deep traditional issues of testimony and authority in the art of reasoning” and gives us a segue into a discussion of their historical development.

To rightly reason on the basis of testimony and authority is no minor matter!  Unless we can do so, much that holds us together grows tenuous.  “For leaders to act, for juries to decide, and for history to teach, people have needed to trust testimony and authority” (p. 4).  From Aristotle on, as Kennedy shows by examining an impressive number of important textbooks used during the course of 20 centuries, thinkers and teachers concerned with education took seriously the role of testimony and authority.  And inasmuch as Aristotle set forth many important definitions and distinctions in his Topics, we may take him (joining St Thomas Aquinas) as “The Philosopher” since he stands at the heart of the “classical tradition” so central to Western Civilization.

There is a marked difference between what we “know from within ourselves and what we learn from others” (p. 13).  Gifted children quickly become proficient in mathematics, seeing clearly what simply must be true.  An adolescent can become a world-class mathematician or chess master, but few would want him to be the nation’s president!  That’s because the things we learn from others, such as history and wisdom, must develop throughout a life rightly lived and are learned through dialectic (dialogical reasoning) and rhetoric (writing and speaking effectively) rather than logic and geometry.  “Aristotle ingeniously created an intellectual device that served this and other purposes.  He called it topics” and it became basic to “the liberal arts curriculum for two thousand years” (p. 13).  

Much that we learn, Aristotle insisted, comes from others by way of testimony and authority.  It is a form of “social” knowledge and is essential for “social” creatures such as ourselves.  Throughout the past, a multitude of thoughtful human beings have discovered truths regarding God, man, and the cosmos that we can quickly appropriate by believing them, accepting their authority.  He set forth the “pattern followed by most of the textbook writers discussed in this book, a pattern of writing about testimony from the perspective of honest people giving and receiving the best information available to them” (p. 16).  Such knowledge, of course, is not nearly as self-evident and certain as Euclid’s axioms or sense experiences, and we must take care to be neither overly gullible nor dogmatically skeptical.  But without such knowledge and the capacity to reason well we would live seriously circumscribed and intellectually impoverished lives.

Influential educators, especially Cicero and Quintilian in ancient Rome, simplified, synthesized and prescribed the principles set forth in Aristotle’s Topics.  Then Christian thinkers, such as Augustine, Boethius, and Cassiodorus, preserved this tradition of carefully evaluating and trusting testimony and authority; their works were used in schools throughout the Middle Ages.  During the Reformation, Luther’s close associate Philipp Melanchthon “reached deeply into the works of Aristotle, Augustine, and the best Medieval theologians in order to strengthen not only the role of dialectic as the foundation to all aspects of the liberal arts curriculum but also as the foundation of a Christian reasonableness in general” (p. 117).  

Things began to change, however, when 17th century thinkers such as Francis Bacon and Rene Descartes charted new directions more suitable to the budding scientific approach to truth that was concisely summed up in the newly-established Royal Society’s motto, “Nullius in verba” (On no one’s word).  Bacon specifically sought to surpass Aristotle’s “common sense” philosophy, and Descartes endeavored to confine all knowledge to mathematical strictures.  Thus Gottfried Wilhelm Leibniz (along with Newton a co-founder of calculus) could imagine settling “all disputes” through “computation” (p. 197).  In the hands of David Hume, this approach easily led to the denial of almost all testimony—especially when applied to miracles.  Important textbooks, notably The Port-Royal Logic, certainly tried to maintain a balance between truths discerned mathematically and truths delivered through historical witnesses.  And gifted disciples of Aristotle, such as Richard Whately and John Henry Newman, eloquently upheld his views and emphasized “the reasonableness of Christianity.”  But during the past four centuries the steady rejection of Aristotle’s Topics is quite evident.

Consequently, C.S. Lewis’s lament in The Lion, the Witch, and the Wardrobe (“Why don’t they teach logic at these schools?”) describes the plight of modern education.  Philosophers following Immanuel Kant reduced knowledge to what can be subjectively discerned.  The autonomous self stands alone, determining what is true, or good, or beautiful.  In America, John Dewey insisted one learns singularly through personal experience, through “doing.”  Reflecting the influence of such thinkers, today’s teachers promote “Critical Thinking,” encouraging even the youngest scholars to stand defiantly alone and decide for themselves what is true or good or beautiful for them.  Rarely are they taught to trust authorities or historical testimonies or “common sense” traditions.

In the book’s final paragraphs Professor Kennedy reflects on his own intellectual pilgrimage.  Growing up in California in the 1960s, he embraced the bumper sticker philosophy:  “Question Authority.”  Throughout his many years in school, culminating in his doctoral studies, he was urged to become an independent, “critical” thinker.   Historians, he learned, were to be ever-vigilant, doubting rather than trusting sources, subjecting everything to one’s personal judgment.  Fortunately, he worked with a number of “good teachers who modeled what they did not preach” (p. 310).  So he began to appreciate the wisdom of pre-modern thinkers such as Aristotle and Augustine.  And he has come to believe, along with John Locke, that the King of Siam should have believed the Dutch ambassador’s words regarding the existence of ice in northern climes.  And he thinks, as did Locke:  “We should do well to commiserate our mutual Ignorance, and endeavour to remove it by all the gentle and fair ways of Information” (Essay Concerning Human Understanding, IV, xvi.4).  

* * * * * * * * * * * * * * * * * * *

In Jesus, History, and Mt. Darwin:  An Academic Excursion (Eugene, Oregon:  Wipf & Stock, c. 2008), Rick Kennedy invites readers to join him in thinking about the discipline of history while climbing (with his two young sons and a good friend) one of the peaks in California’s Evolution Group in the Sierra Nevada.  The book is by design a very personal account:  “Back in the 1970s, I learned to love university life.  I eventually became a professor of history.  I started out a Bible-trusting Christian and have not lost my faith.  This book is about the reasonableness of biblical Christianity in universities.  By reasonableness, I mean the warranted credibility, if not the persuasiveness, of Christian claims about ancient history” (p. 1).  He thus follows Aristotle, trusting various sources, not the questioning Socrates, and provides, in easily-read form, the basic argument set forth in his History of Reasonableness.

While climbing Mt. Darwin, Kennedy also ponders the “natural history” set forth by Charles Darwin, which is at times posed as a rival to biblical faith, since Darwin “then inferred that since the creation of new species did not need God, then it is best to assume that God was not involved” (p. 1).  While he thinks “Darwin’s theory works within the boundaries of credibility that are standard to Natural history,” he doesn’t “think it is true to the extent that it should influence the core of Christian history” (p. 26).  There’s simply a sharp difference between studying pre-historical rocks and human history preserved in documents.

* * * * * * * * * * * * * * * * * * * 

Twenty years ago, a decade into his professorial career, Rick Kennedy published a fine treatise entitled Faith at State:  A Handbook for Christians at Secular Universities (Downers Grove:  InterVarsity Press, c. 1995).  Though designed for students at state institutions, the book can be read by students and professors wherever they study and teach.  Still more:  it reveals Kennedy’s deep commitment to the importance and worth of higher education.  Universities, Kennedy insisted, need Christians on campus, for the academy influences our culture and the Christian voice needs to be heard therein.  He cited the oft-quoted statement of Charles Malik to emphasize this point:  “‘At the heart of all the problems facing Western civilization . . . lies the state of mind and the spirit of universities’” (p. 13).  The university at its best may be portrayed as the “Academical Village” Thomas Jefferson envisioned for the University of Virginia.  Kennedy likes the “village” image because universities provide relaxed environments enabling folks to find “time to chat, gossip or take a walk” (p. 19).  They are—or should be—comfortable, nurturing communities.  They provide the facilities where learning takes place.  And Christian students should take advantage of every opportunity to learn!

     To help students attain their goals, Kennedy introduces them to the essentials of university life.  “There are lots of good faculty members at every school,” he says.  “The job of the student is to seek out the good ones and avoid the bad ones” (p. 35).  There are peers with whom one can discuss and thereby learn.  There are nearby Christian churches and on-campus organizations which can assist students (Kennedy himself was encouraged by both the Navigators and the Church of the Nazarene in San Luis Obispo, California).  And there are great books:  the Bible and the writings of wonderful Christian thinkers from St Augustine to John Henry Newman will help Christian students integrate their faith with what they are learning.

     Above all, Kennedy calls Christian students to help recall universities to their original mission:  the search for truth.  Most universities still have (sadly ignored) mission statements of a deeply religious character.  Christians on campus can urge professors and students to remember and live out that mission.  They can also discover the excitement and joy of thinking!  Reason itself, Kennedy says, brings its own rewards.  Readable, thoughtful, buoyantly celebrating the love of learning, Faith at State is a book one should give students who are embarking on their academic voyage.  

# # #