300 Unchanging Witness

       One of the most widely quoted rules in Church History was set forth in the fifth century by St. Vincent of Lerins, who said questions regarding biblical interpretation and doctrinal standing should be settled by deciding:  “Quod ubique, quod semper, quod ab omnibus”—what has been believed everywhere, always, and by all.  Universality!  Antiquity!  Consent!  Various heretics, various schismatic sects, may tout their brilliance or novel insights, but the Church must follow what her Sacred Tradition declares.  “We shall follow universality,” Vincent said, “if we confess that one faith to be true, which the whole Church throughout the world confesses; antiquity, if we in nowise depart from those interpretations which it is manifest were notoriously held by our holy ancestors and fathers; consent, in like manner, if in antiquity itself we adhere to the consentient definitions and determinations of all, or at the least of almost all priests and doctors” (A Commonitory, II, 6).

Certainly indifference to history and disrespect for tradition mark our era.  Students in America’s high schools and universities frequently learn less about the nation’s past than about how they’re anointed to rectify its wrongs in pursuit of social justice.  NFL players show contempt for the nation’s flag in their effort to demonstrate indignation at racial injustice.  And in most, if not all, Christian churches there has been, for many decades, a general contempt for traditions of any sort, be they musical, doctrinal, or ethical.  The “unchanging witness” of 20 centuries merits little attention or emulation.  But if we think clearly, as C.S. Lewis insisted, we must “keep the clean sea breeze of the centuries blowing through our minds, and this can only be done by reading old books.”  Thus Brent Strawn, in The Old Testament Is Dying:  A Diagnosis and Recommended Treatment (Grand Rapids:  Baker Academic, c. 2017), warns us that one of the most important sources for Christian belief is being seriously neglected by most Christians.  Strawn is a professor at Emory University who did his undergraduate work at Point Loma Nazarene University, generously thanking three of his teachers (and friends of mine) for their exemplary instruction—Reuben Welch, Robert Smith, and Frank Carver.

Strawn begins by noting that, when he spoke in various churches, not even the older folks knew much about the Old Testament.  Just as languages, such as Kiowa or Minoan, can actually die and disappear when they’re no longer spoken, so too (Strawn fears) the Old Testament is no longer “spoken” in modern churches and is dying.  His use of this “linguistic analogy” is one of the book’s strengths.  There are today a few Kiowa-speakers, but it’s no longer a living language.  So too there are multitudes of scholars who can read Hebrew and study the Old Testament, but if it does not enter into a community’s life it becomes essentially dead.  And if the Old Testament is not a living presence in the Christian community something truly essential will be lost.  As the great theologian Karl Barth insisted, the “‘language of Canaan’ is absolutely necessary if one wishes to speak precisely about—or better, to confess—the essence of the Christian faith” (#588).  The dying “language of Canaan,” Strawn says, has been supplanted by three rival discourses—“the so-called New Atheism, Marcionites Old and New, and what I am calling, for lack of a better term, the New Plastic Gospels of the ‘happiologists’” (#601).

To help us understand the seriousness of the situation, we’re given copious evidence in a lengthy chapter regarding the lack of OT knowledge in the American populace and in the sermons, hymnody, and lectionary of the churches.  A careful Pew study in 2010 actually found that “‘Atheists and agnostics, Jews and Mormons’” score higher than “‘evangelical Protestants, mainline Protestants and Catholics on questions about the core teaching, history and leading figures of major world religions’” (#687).  More specifically, barely half of the populace knows that “the Golden Rule is not one of the Ten Commandments” and less than half can identify the four Gospels.  Analyses of sermons preached in American churches reveal a distressing absence of biblical content, and the hymns (and especially “praise choruses”) have recently replaced biblical messages with experiential expressions, cheapening the worship services by eliminating crucial aspects of Christian doctrine.  In churches using lectionaries, which require the reading of several biblical texts in each service, the Old Testament has been deemphasized.

To demonstrate why these data should concern all of us, Strawn expands upon his analogy between languages and Scripture.  Throughout history languages have grown and died.  In certain instances abbreviated or seriously altered “pidgins” develop, wherein something of the original language persists.  Pidgins, however, can easily slide into “creoles,” which are in fact new languages.  With regularity languages pass away, following a period of repidginization “when a generation of speakers stops communicating its language on a regular basis to its children” (#1516).  Youngsters identifying with another culture generally resist learning or using the outdated language spoken by their ancestors.  The language dies, and “when a language dies, a great deal is thus lost—and on more than one level.  Languages are repositories of life:  ‘They . . . contain our history’” (#1556).  The Scriptures, and the theological formulations rooted in them, are the language of the Church.  When we stop speaking scripturally we lose our history—and, in time, the substance of our faith.  If her language dies, the Church too will perish.

Consequently, there are alarming “signs of morbidity” crying out for our attention.  The “New Atheism” is one of those signs.  Reading Richard Dawkins, a biologist who ventures into theological terrain, one quickly sees that he is “no Bible scholar,” to put it mildly.  Point by point Strawn shows how Dawkins misrepresents (at times ludicrously) biblical passages he cites.  Inasmuch as his atheism rests upon a repudiation of Christianity, he focuses on snippets of scripture he dislikes and frequently rejects what he fails to understand!  In interesting ways Dawkins joins the “Marcionites Old and New,” duplicating the approach of a second-century heretic, Marcion, in his effort to eliminate the Old Testament (with its lawgiving, wrathful God).  Marcion also deleted significant sections of the New Testament from the Christian canon, endeavoring to distill a pure gospel of love.  He was pidginizing the language of the Church!  Interestingly enough, the acclaimed German Church historian and devotee of Protestant Liberalism, Adolf von Harnack, openly sided with Marcion!

Finally, the “New Plastic Gospels:  The Happiologists” provide yet another indicator of “morbidity” in the Church.  Listening to them—preachers of the “prosperity gospel” such as Norman Vincent Peale and Joel Osteen or authors such as Bruce Wilkinson in The Prayer of Jabez—is rather like hearing “babblers in the nursery, not Shakespeare and not Einstein” (#2606).  In terms of his linguistic analogy, Strawn argues the happiologists speak a “brand-new creole” dialect so different from traditional Christianity as to constitute a new faith.  Overflowing with optimism, the happiologists ignore some of life’s stubborn realities, including sickness and death.  Sadly, as D.R. McConnell says, ‘“The time when a dying believer needs his faith the most is when he is told that he has it the least. . . .  Perhaps the most inhuman fact revealed about the Faith movement is this:  when its members die, they die alone’” (#2995).  The very existence of this “Faith movement” bodes ill for the Unchanging Faith of the Christian Tradition.

To help rectify the situation, Strawn provides “recommended treatment” as he closes his treatise.  He finds the answer by showing how Hebrew—the only “dead” language that has been successfully revived—now thrives in the nation of Israel.  Much hard scholarly work was needed for this to occur, and the people themselves had to support it.  So too Christians must rededicate themselves to learning and speaking the Scriptures as a second language!  As to how to do this, carefully studying Deuteronomy provides important clues—we need to routinely read the Old Testament and incorporate it into sermons, lectionaries, and hymns, fully aware that it is for many folks truly a “second language.”

The Old Testament Is Dying nicely blends deep scholarship with an accessible presentation and merits much attention.

* * * * * * * * * * * * * * * * * * * * * * * 

In Unchanging Witness:  The Consistent Christian Teaching on Homosexuality in Scripture and Tradition (Nashville:  B&H Academic, c. 2016), S. Donald Fortson III and Rollin G. Grams treat the issue of homosexuality as a slice of the significant “crisis of authority regarding the place of Scripture and the church’s witness in theology and ethics” (#284).  Just as abortion is about more than abortion, so too same-sex activity is about more than sexuality.  To the authors, “the challenge of the pro-homosexuality advocates in parts of Western Christianity extends beyond their view of homosexuality.  These advocates not only challenge the orthodox teaching of the church through the centuries; they also challenge scriptural authority, the Bible’s teaching on human sinfulness, the work of Christ on the cross, and the transformative power at work in believers’ lives through Christ and the Holy Spirit.  In a word, these advocates challenge the essence of the gospel” (#7714).  Champions of homosexual “rights” and “same-sex marriage” inevitably ignore or repudiate traditional condemnations of their views.  “This book is our call back to reality.  We issue that call by saying what God has said in his Word and by presenting what the church has affirmed throughout its history” (#298).  In short:  “Both the teaching of the Bible and the teaching of Christian tradition have uniformly taught the same thing:  homosexual practice is sinful” (#341).

To tackle their task, Fortson and Grams first chart the historical development of the gay rights movement, rooted in the revolutionary ferment of the 1960s.  Gay activists early determined to infiltrate institutions—especially the media, schools, and liberal religious denominations.  Traditionalists who opposed them were intimidated and labeled intolerant bigots.  Sympathetic scholars such as John McNeill in The Church and the Homosexual provided tendentious treatises justifying a “new” understanding of Scripture and Tradition.  Thus, McNeill declared, the “sin of Sodom” was actually inhospitality, not sodomy!  A Yale historian, John Boswell, often called “the patron saint of gay Christians,” published Christianity, Social Tolerance, and Homosexuality in 1980, asserting:  “‘The early Christian church does not appear to have opposed homosexual behavior per se’” (#673).  To refute such blatant untruths is the purpose of Unchanging Witness.  Massive, carefully cited evidence from the Church Fathers, providing ancient guidelines for ethical living, includes an unswerving condemnation of sexual sin.  “Early Christians condemned all practices that involved members of the same gender participating in sexual acts with one another.  This included pederasty, male dominance/rape, effeminacy, lesbianism, male homosexuality, transsexuality, prostitution, temple prostitution, orgies, and homosexual ‘marriages’” (#742).  Contra McNeill, “The sin of Sodom, clearly identified as homosexual practice, was often cited as a sin against nature and one upon which God has poured and will pour out his wrath” (#917).

Throughout the Middle Ages this condemnation persisted, and (contra Boswell) sodomy was frequently ranked as the most heinous sexual sin, often “paired with bestiality as an ‘irrational’ sin” (#1089) requiring the most severe forms of penance.  Hildegard of Bingen, one of the great medieval mystics, spoke for her era, saying:  ‘“A man who sins with another man as if with a woman sins bitterly against God and against the union with which God united male and female.  Hence both in God’s sight are polluted, black and wanton, horrible and harmful to God and humanity, and guilty of death; for they go against their Creator and His creature, which is in them’” (#1346).  Little regarding such condemnation changed during the Renaissance and Reformation, wherein the “unspeakable sin” received censure and punishment in both Catholic and Protestant realms.

In the 20th century, however, the “unchanging witness” of the Church wavered as homosexual activists and their allies sought to justify their lifestyle.  Without equivocation the Catholic Church, following popes John Paul II and Benedict XVI, upheld its traditional position, inviting thereby “vicious” attacks by “Western cultural elites” (#2008).  So too, both globally and in the U.S., Orthodox churches have remained true to Tradition.  Many Evangelical denominations (including the Missouri Synod Lutherans, Southern Baptists, the Presbyterian Church in America, and the Church of the Nazarene) have remained equally firm.  “Most striking in evangelical statements concerning homosexual practice is the unequivocal commitment to Scripture as the final word on the subject.  Specific biblical texts are cited in most of the statements.  An underlying assertion of evangelicals is that the Old and New Testaments comprehensively and consistently condemn homosexual practice as sinful before God” (#3041).

But more liberal (“mainline”) churches have readily embraced the gay-rights agenda.  Though their “official” statements may retain traditional views, in practice clergy in the United Church of Christ and the Episcopal Church have openly supported homosexual behavior.  Portentously, in 1989 Episcopal Bishop John S. Spong “ordained an openly gay man living with a partner” (#3192), and in 2003 Episcopalian “Gene Robinson, who left his wife and children to live with a homosexual lover, was elected as a bishop in New Hampshire” (#3227).  Liberals (almost always clergy imposing their opinions on a much more conservative laity) orchestrated similar changes within the Presbyterian Church in the United States of America (PCUSA), the Evangelical Lutheran Church in America (ELCA), the American Baptists, and the United Methodists.  Inevitably, they appeal to “a fictitious Jesus, the welcoming and affirming prophet who would never turn away anyone or call people to repentance and self-denial” (#3472).  “By denying God’s decrees in one way or another, several entire denominations have baptized themselves not into Christ but into Western culture” (#7619).  Thereby, to cite the great theologian Wolfhart Pannenberg:  “‘If a church were to let itself be pushed to the point where it ceased to treat homosexual activity as a departure from the biblical norm, and recognized homosexual unions as a personal partnership of love equivalent to marriage, such a church would stand no longer on biblical grounds but against the unequivocal witness of Scripture.  A church that took this step would cease to be the one, holy, catholic and apostolic church’” (#3487).

After assessing the historical and contemporary record, Fortson and Grams carefully and at considerable length examine all Scriptural passages relevant to the issue.  Unfortunately, imaginative interpretations, “unsupported by academic research, are touted by scholars writing in favor of homosexuality” (#705).  Contrary to the alluring claim that Scripture sets forth a simple, situational “ethic of love,” both Old and New Testaments clearly undergird the Church’s historical condemnation of homosexual activity.  Only inexcusable ignorance of the biblical texts or fallacious (special pleading) reasoning explains how modern “exegetes” in seminaries and churches justify same-sex sodomy!  Jesus did indeed promote a “love ethic,” but it was an ethic deeply rooted in the Mosaic Law and the Prophets!  Any “love” that circumvents the Commandments fails to qualify as Christ-like love.  “Neither can Paul’s ethics be reduced to a principle of love” (#3632), for he demonstrably looked to the Old Testament for guidance in prescribing Christian conduct.  Nothing could be clearer than the message of St. Jude in his first-century Epistle, reminding his readers that “Sodom and Gomorrah, and the cities around them in a similar manner to these, having given themselves over to sexual immorality and gone after strange flesh, are set forth as an example, suffering the vengeance of eternal fire” (v. 7).

* * * * * * * * * * * * * * * * * * * * *

Geoffrey Kirk was an Anglican priest who entered the Roman Catholic Church a decade ago following his unsuccessful effort to resist the ordination of women in the Church of England.  Concerned that there was little serious scholarship available, he seeks to provide an in-depth biblical and historical analysis in Without Precedent:  Scripture, Tradition, and the Ordination of Women (Eugene, OR:  Wipf & Stock, c. 2016), setting forth his reasons for preserving a male-only priesthood—which is not to say he opposes women in ministry so long as they do not seek ordination.  He does so, fundamentally, because he believes such is the will of Jesus Christ, the Head of the Church.  Christianity is, after all, “a religion centered upon a God who became a man (worse still, a ‘Father’ who sent his ‘Son’)—and one whose every sacred text and whole history was mired in perennial patriarchy” (p. 7).  To make such a faith compatible with “Christian feminism” involves such linguistic contortions and historical revisions as “to require the elimination of every last vestige of Christian doctrine.  Incarnation, Atonement, Final Judgement, Hell and Heaven:  all must go” (p. 8).  Indeed, carefully attending to much that’s taught in today’s churches, “the governing myth is now, not of Fall and Redemption, but of self-awareness and personal fulfillment.  The existential question posed by the old story was:  ‘How shall I be saved?’  The leading question under the new dispensation is:  ‘How can I be happy?’” (p. 46).

Inasmuch as (according to Judith Lorber) “‘the long term goal of feminism must be no less than the eradication of gender as an organizing principle of post-industrial society’” (p. 27), it cannot be easily reconciled with the ancient Judaeo-Christian mindset evident in Jesus.  That He embraced that worldview stands evident when He chose His disciples, for:  “In choosing twelve males to figure the reconstituted twelve tribes, descended from the twelve sons of Israel, Jesus was consciously employing—and reinforcing—the patriarchal language of a world-view very far from that of modern feminism” (p. 43).  Neither Jesus nor anyone else in the New Testament world seemed concerned with what is today called “gender inclusivity.”  To do so “would have meant reversing the cultural norms both of the culture which gave Jesus birth and the society into which the church was born.  How could Christians have embraced it, if they had no specific dominical authority for it—no word from the Lord?” (p. 47).

Rather than looking to the Word of the Lord Jesus for guidance, modern feminism stands rooted in the revolutionary ideology of the Enlightenment, with its optimism regarding the transformation of human nature through societal change.  “Since the end of the seventeenth century liberal Christians have been engaged in a self-destructive program of assimilating the content of the scriptures to the insights of the Enlightenment.  The presuppositions of a post-Christian—often anti-Christian—culture have been imposed upon authors who were ignorant of them, and whose own presuppositions were radically different” (p. 64).  Thus we find feminists railing against St. Paul as a misogynist for maintaining the traditional Jewish restriction of liturgical activity, in both temple and synagogue, to males.  To Kirk “there is something tragic in the notion of accusing Paul for not campaigning for the ‘rights of man and of the citizen.’  It is not merely an anachronism, it is an insult,” for he “was aiming not at social justice, but sanctity” (p. 58).  Then we find a certain “Junia” mentioned by Paul in his letter to the Romans suddenly elevated to apostolic authority by evangelical feminists!  And Mary Magdalene now appears as an apostle, the “apostola apostolorum”!

Kirk shows how nothing in the texts or in Christian tradition justifies such assertions, but feminists following Harvard’s Elisabeth Schüssler Fiorenza exercise their “imagination” when writing history, pretending to find historical examples of duly ordained female clergy.  Never let demonstrable facts interfere with the story you’re determined to tell!  And this new story, replete with assurances that female deacons and presbyters were ordained in the early centuries, has gained currency in many denominations.  Doing so, Kirk insists, aligns them with a variety of heretical, gnostic-rooted movements but runs counter to the Scripture and Tradition basic to Christianity.  Whatever case for the ordination of women you choose to make, it cannot be justified by Scripture and Tradition!

299 Absolute Power Corrupts Absolutely – Exhibit A: LBJ

       Lord Acton (Sir John Dalberg-Acton), one of the most learned 19th century historians (allegedly knowing everyone worth knowing and reading everything worth reading), famously declared, in a letter to an Anglican clergyman:  “Power tends to corrupt and absolute power corrupts absolutely.  Great men are almost always bad men, even when they exercise influence not authority.”  Acton’s insight clarifies the careers of dictators such as Napoleon and Stalin, but it also stamps the trajectories of democratically-elected politicians from Rome’s Republic (e.g. Tiberius and his brother Gaius Gracchus) to today’s USA (e.g. Woodrow Wilson and Franklin D. Roosevelt, whom Vice President John Nance (“Cactus Jack”) Garner branded a “power-hungry dictator”).  That majoritarian democracies easily suppress liberty can be routinely demonstrated, and many 19th century thinkers feared what Gerald Massey described as “the tramp of Democracy’s earthquake feet.”  In short, as arguably the greatest student of democracy, Alexis de Tocqueville, said, egalitarian movements too often illustrate Richelieu’s quip that leveling the “surface facilitates the exercise of power.”

  Though some of his predecessors clearly evidenced power’s allure, Lyndon B. Johnson impressed all who knew him as insatiably addicted to its toxin.  So Robert A. Caro, writing the definitive biography of Johnson—The Years of Lyndon Johnson—included the word “power” in each of his four volumes (the fifth volume, covering the presidential years, has yet to be published).  And the same word—“power”—stands out in one of LBJ’s most controversial critiques, A Texan Looks at Lyndon:  A Study in Illegitimate Power, by J. Evetts Haley.  When it was published, in 1964, I was mid-way through my graduate studies in history at the University of Oklahoma and accepted its widespread dismissal as a “hatchet job” lacking substance—a malicious verbal vendetta, motivated by petty animosities, written by one of LBJ’s enemies.  Nevertheless, since Haley had taught at the University of Texas and published some respected monographs on Western history (e.g. The XIT Ranch of Texas and a biography of Charles Goodnight) I acquired and read the book.

As a budding historian, I checked Haley’s footnotes and found ample citations—though mostly taken from newspapers rather than documents.  At that time, of course, the “primary sources” historians seek were unavailable, and I considered A Texan Looks at Lyndon more a “Philippic” than serious scholarship.  Nevertheless, old Demosthenes’ “Philippics,” warning against Philip of Macedon’s ambitions, certainly contained important truths regarding the tyranny-to-come under his son, Alexander the Great.  And I remembered Cicero’s stirring orations, also called “Philippics,” denouncing Mark Antony’s dictatorial ways as a threat to the Republic.  Similarly, Haley’s attack on Johnson raised serious questions concerning the president’s character.  In time I learned that Haley was not only an historian but personally quite active in Texas politics, running unsuccessfully both for a seat in the U.S. House of Representatives and for Governor of the state.  He knew, first-hand, the notable political figures of his day, including LBJ and his associates.  In that sense, he was an eye-witness of many events as well as a scholar recording them.  That Haley (a Texas conservative) and Caro (a New York City liberal) basically agree in assessing LBJ’s character certainly suggests the accuracy of their conclusions.

Just recently, as I finished the fourth volume of Caro’s The Years of Lyndon Johnson, I reread A Texan Looks at Lyndon to see if Haley’s claims appear credible 50 years later.  Haley’s biographer, Bill Modisett, recently declared that “none of the assertions contained in the book have ever been proven wrong and all of them have been verified through publications since that time.”  More importantly, I found that Haley rightly discerned what one most needs to find out about someone:  his essence, his character.  Indeed, Haley’s assessments are even more amply evident in Caro’s less polemical and exhaustively documented work, which began with The Path to Power (New York:  Vintage Books, c. 1981).  Immersed in all the available details, Caro found a “dark” current within LBJ, “a hunger for power in its most naked form, for power not to improve the lives of others, but to manipulate and dominate them, to bend them to his will.”  His was a “hunger so fierce and consuming that no consideration of morality or ethics, no cost to himself—or to anyone else—could stand before it” (p. xix).  He had “a genius for discerning a path to power, an utter ruthlessness in destroying obstacles in that path, and a seemingly bottomless capacity for deceit, deception and betrayal in moving along it” (p. xx).  As Johnson himself declared:  “I do understand power, whatever else may be said about me.  I know where to look for it, and how to use it.”  Though often unreported, there is, as Joachim Joesten’s 2013 treatise claims, The Dark Side of Lyndon Baines Johnson.

LBJ’s path to power began in central Texas’s “Hill Country,” where young Lyndon experienced poverty and shame as a child.  His parents were exemplary folks, but their young son (early determined to “be somebody”—to be President, in fact) decided living virtuously brought few rewards and quite early turned to scheming and manipulating to gain his goals.  Determined to get an education, he attended the only college in the Hill Country—Southwest Texas State Teachers College in San Marcos, accredited only four years before he arrived in 1927.  Before graduating, needing money, Johnson secured a teaching position in Cotulla, a small town 60 miles from the Mexican border populated mainly by Mexicans.  He poured enormous energy into his teaching tasks and showed genuine concern for his students, illustrating another character trait that persisted throughout his life—a concern for the impoverished and disadvantaged.  After replenishing his funds, LBJ returned to San Marcos to finish his college studies.  Though he studied sufficiently to succeed as a student, he seemed to major in campus politics and student affairs, early on manifesting one of his most obvious traits:  “obsequious to those above him, he was overbearing to those who were not” (p. 153).  Still more, he was notoriously untrustworthy, “snaky all the time” (p. 188).  Unable “to tell unvarnished truth about even the most innocuous subject” (p. 156), he even stole a campus election that he won by one vote.  Assessing LBJ’s college years, Caro concludes:  “The methods Lyndon Johnson used to attain power on Capitol Hill were the same ones he had used on College Hill, and the similarity went far beyond the stealing of an election” (p. 199).

Following his graduation, Johnson took a job at Houston’s Sam Houston High School, where he taught speech and coached the debate team to victory in the state championship.  Soon after beginning his second year of teaching, however, he accepted the position of private secretary to Richard Kleberg, recently elected to the U.S. House of Representatives, and departed for the nation’s capital, arriving in December 1931.  He was, Caro says, “on his way.”  His boss, Congressman Kleberg, one of the richest men in Texas, took little interest in the daily duties of his position.  He was a generous, kindly man, but very much a playboy disinclined to work.  So young Lyndon took control of the office—managing the staff, answering the mail, raising funds, dealing with constituents’ requests, doling out offices.  He quickly mastered the intricacies of D.C. politics, adeptly learning the means whereby one climbed the ladder of success.

To do so he followed the pattern evident when he was in college—especially by ingratiating himself with older, powerful men.  In Washington, none was more powerful than Sam Rayburn, the Speaker of the House, and it was Rayburn who arranged for LBJ (at the age of 26) to be appointed director of the National Youth Administration in Texas.  Returning to his home state, Johnson selected the staff, directed the funds, and hired the workers as the NYA prescribed.  In the process, he made the contacts and built the political machine he needed to further his own ambitions.  Most significantly, he helped a Houston businessman, Herman Brown, build the Marshall Ford Dam in the Hill Country, which brought electricity to that region.  In years to come, money flowing from Brown & Root’s apparently bottomless coffers would play a truly major role in LBJ’s ascent to national stature.  Throughout Johnson’s career, men carrying envelopes stuffed with $100 bills shuttled to-and-fro, cementing important deals on his behalf!

When, in 1937, Congressman James P. Buchanan died, Johnson saw his opportunity to replace him, though he was largely unknown in the district dominated by the city of Austin.  Fortunately for Johnson, however, President Roosevelt was not only known but wildly popular.  So LBJ ran a very simple campaign:  “Roosevelt.  Roosevelt.  Roosevelt.  One hundred percent for Roosevelt.”  He also knew how much money matters in elections, and he secured lots of it.  With money he enlisted party leaders and staged extravagant barbecues that attracted hundreds of voters.  With lots of money he could also buy lots of votes, which he did.  He campaigned furiously throughout the district, exuding an air of “friendliness and sincerity, a love of people” (p. 415).  And, thanks to his support in rural areas, he won!  Then the young congressman, the protégé of Speaker Rayburn, quickly made his mark in Washington, ingratiating himself with President Roosevelt, funneling New Deal funds to his district, and expanding his own political base.  “A hallmark of Johnson’s career,” Caro says, “had been a lack of any consistent ideology or principle, in fact of any moral foundation whatsoever—a willingness to march with any ally who would help his personal advancement” (p. 663).  Haley anticipated Caro’s verdict by noting that “the most exacting logician can search the Johnson utterances and public record and find no conclusive evidence of dedication to any eternal verity; no statement of basic spiritual belief; no yardstick based on moral principle by which his personal life is guided, or by which public policy is measured and determined” (p. 15).

LBJ’s ever-evident ambition prompted him to run for the U.S. Senate in 1941, unsuccessfully waging “the most expensive campaign in the history of Texas” (p. 718).  Failing in that endeavor, he ran again in 1948, garnering fame as “Landslide Lyndon” by virtue of his narrow (87 votes after 200 votes were added to Johnson’s total six days after the election!) victory over Governor Coke R. Stevenson, one of the state’s most honorable and venerated public servants, known affectionately as “Mr. Texas,” who was, Haley says, “a close student of the Constitution” and “never voted for a tax bill.”  Still more:  “He was anathema to the ultra-liberal New Deal elements” supporting LBJ (p. 21).  In 30 pages Haley highlighted what took Caro 265 richly documented pages to detail!  But both writers find much about Johnson the man revealed in his ’48 election.  Telling various audiences whatever he knew they wanted to hear—and changing the message from place to place—he ingratiated himself with credulous voters.  Relentlessly spreading untruths, deliberately lying, Johnson tried to tarnish Stevenson’s integrity, and “no one,” Caro says, “could destroy a reputation better than Lyndon Johnson” (p. 253).  Egregiously misrepresenting his WWII activities, LBJ posed as a war hero, though his “war” experience amounted to flying as an observer on one flight over New Guinea.

And most importantly, as Caro demonstrates in a chapter simply titled “The Stealing,” he relied on the “Duke of Duval,” George Parr, to pack the ballot boxes sufficiently to win the election.  Legal challenges threatening to reverse the verdict ultimately reached the U.S. Supreme Court, but thanks to good friends in high places (most especially Abe Fortas providing skilled advice and Justice Hugo Black refusing Stevenson’s petition) LBJ “won” the election and became Texas’s junior Senator.  Many years later, Luis Salas, one of the men intimately involved in the fraud in Duval County, lamented his complicity, saying:  “Johnson did not win that election.  It was stolen for him” (p. 388).  In a manuscript he showed Caro, Salas detailed his involvement “in one of the most notorious scandals of politics that opened the road for L. B. Johnson to reach the presidency of this country’” (p. 392).  As he had done in earlier elections, beginning when he was a college student, LBJ cheated whenever necessary to win.

Having won the election, Johnson became, as Caro titles the third volume of his biography, Master of the Senate (New York:  Vintage Books, c. 2002).  He entered a legislative body notorious for its traditions, which included slowly moving up the ranks, but he quickly found ways to control it, becoming his party’s leader within two years, then Majority Leader (the youngest in American history) when Democrats regained control of the Senate.  His was a “rise unprecedented in its rapidity” (p. xxii).  To rise so fast he courted allies in the Democratic cloakroom, talking to liberals like a liberal, to conservatives like a conservative, “asserting both positions with equal, and seemingly total conviction” (p. xvi).  He knew how to distribute funds and favors and did so effectively.  He mastered and manipulated the arcane “rules” of the Senate to his own advantage.  And he could always, Hubert Humphrey noted, “size up” a man, sensing his most vulnerable spots and calculating how to exploit him to get his way, for as Stuart Symington recalled, “there was a sort of cruelty there” (p. 571).

Johnson also relied on his amazing skill to obsequiously ingratiate himself with older, powerful men.  In the Senate, none was more powerful than Georgia’s Richard Russell, who was deeply committed to the nation’s armed forces and farmers—and to maintaining segregation throughout his beloved South.  Committee chairmen were disproportionately from the South, and Russell ruled them.  LBJ flattered and befriended the frequently lonely bachelor, inviting him home to feast on Lady Bird’s cooking (as he had earlier done with Speaker Sam Rayburn).  He suddenly developed a love for baseball, one of Russell’s passions.  He seemed to share all the elder man’s convictions and was soon the recipient of his favors.  So LBJ aligned himself with the senator from Georgia and became Majority Leader as a result.  In the Senate, Johnson followed a life-long pattern, appearing utterly “humble—deferential, obsequious, in fact” when accruing power.  Then “he became, with dramatic speed and contrast, autocratic, overbearing, domineering” (p. 1018).  He became, quite literally, the Master of the Senate!

In 1952 he enthusiastically supported Russell’s unsuccessful bid for his party’s presidential nomination, learning thereby that a Southerner could not become a national candidate.  So Johnson needed to cater to Northerners and become known as more than a “Southerner” without losing the support of Russell and his allies.  This meant, in the 1950s, delicately walking a tightrope when dealing with the growing issue of Civil Rights.  Given the Supreme Court’s 1954 Brown decision and Martin Luther King’s marches, there was a “rising tide” of discontent and agitation regarding segregation in America.  Growing up in the Hill Country, LBJ lacked much of the anti-black antipathy evident in men such as Richard Russell.  His brief stint teaching Mexican children had revealed his rather remarkable commitment to helping the disadvantaged.  So he could cautiously entertain supporting certain civil rights initiatives if he could do so without losing his political base in the South.  He could sincerely “help somebody” as long as it didn’t hinder his driving desire to “be somebody.”  That “somebody” he wanted to be was President!  To attain that objective he needed to burnish his image in the North by getting some sort of civil rights legislation passed by the Senate, and in 1957 he finessed a plan that persuaded enough Senators to vote for a bill serving as a prologue to more vigorous legislation a decade later, when he would be President.

His passage to the presidency is detailed in the fourth volume of Robert Caro’s The Years of Lyndon Johnson, titled The Passage of Power (New York:  Alfred A. Knopf, c. 2012).  As the 1960 election approached, Johnson calculated his chances of winning his party’s nomination for the presidency.  Uncharacteristically indecisive, he delayed and wavered, testing the waters to see how likely he was to succeed.  Hoping that none of his rivals would secure the nomination in the primaries, LBJ pinned his hopes on triumphing in a deadlocked convention, but the delegates instead nominated John F. Kennedy.  Then, unexpectedly (perhaps pressured by Sam Rayburn), JFK asked Johnson to be his vice presidential running mate.  Equally astounding, LBJ assented.  According to one of his aides, Johnson declared:  “‘Power is where power goes . . . [and] I’ll still control the Senate’” (p. 109).  Others, including FDR’s vice president, John Nance Garner, might have thought the position worthless, but LBJ imagined it might open to him further opportunities.  According to Clare Boothe Luce, who asked him why he accepted the nomination, “he replied:  ‘Clare, I looked it up; one out of every four Presidents has died in office.  I’m a gamblin’ man, darlin,’ and this is the only chance I got’” (p. 115).  On the campaign trail, his assignment was clearly to establish beachheads for Kennedy in the South (and most especially in Texas).  Consequently, Nixon won only three southern states and JFK won one of the closest elections in American history.

Prior to 1961, vice presidents were largely ceremonial figures, but Lyndon Johnson resolved to turn the position into one of power, working both within the White House (unsuccessfully trying to slip into Kennedy’s inner circle) and the Senate (unsuccessfully scheming to retain his position as presiding officer of Senate Democrats, thus undermining the Constitution’s commitment to a clear separation of powers).  In fact, he found himself very much an outsider, excluded from exercising the power he desperately sought, derisively labeled “Rufus Cornpone” by the New Frontiersmen who now ran the country.  When important decisions were made, such as responding to the Cuban Missile Crisis, the vice president was not consulted.  Sophisticated socialites in D.C. smirked while wondering “whatever became of Lyndon Johnson?”  According to Jacqueline Kennedy, her husband distrusted Johnson and “‘was truly frightened’” at “‘what would happen if LBJ ever became President’” (p. 224).  Brother Bobby truly hated Johnson; he detested liars and said Johnson “‘lies all the time . . . lies continually about everything’” (p. 230).  Bobby “despised the way Johnson treated subordinates” and considered him a “‘mean, bitter, vicious’” man (p. 580).  Rumors freely circulated that Kennedy might choose another running mate when running for reelection in 1964.

Then came November 22, 1963, when President Kennedy was assassinated in Dallas.  In an instant LBJ attained his life-long ambition, becoming President of the United States.  He hadn’t been aboard Air Force One on the President’s flight to Dallas (excluded as usual from the inner circle), but now it was his to command.  No longer a supplicant hoping for handouts from the Kennedy clan, he demanded Jacqueline fly back with him in the plane carrying her husband’s body, symbolizing thereby Johnson’s legitimacy as his successor.  He also decided, inasmuch as possible, to retain Kennedy’s appointees (including Bobby as attorney general) to facilitate his transition to the nation’s highest position.  Knowing he needed the support of his party’s liberals, he vowed to carry through on many of JFK’s legislative goals—most notably civil rights—and succeeded in ways impossible for Kennedy.  And he scrupulously avoided making necessary decisions regarding America’s involvement in the war in Vietnam, fearing it would jeopardize his election in 1964.  “To watch Lyndon during the transition is to see political genius in action” (p. xvi).

Johnson’s power-quest succeeded.  And Caro’s massive study amply illustrates Haley’s judgment:  his love for “money and power” corrupted whatever he touched, both people and institutions.  While assuring Americans (in his 1964 State of the Union message) that he was committed to “utmost thrift and frugality,” he proceeded to impoverish the nation’s treasury with his “war on poverty”—grandiosely promising “better schools, and better health, and better homes, and better training, and better job opportunities to help more Americans, especially young Americans, escape from squalor and misery and unemployment,” to “not only relieve the symptom of poverty, but to cure it and, above all, to prevent it.”  Promising to keep the nation out of war, he secretly planned precisely the opposite.  Pretending to be a conservative when it brought him votes, he voted as a liberal and enacted deeply socialistic policies that brought him power.  Details regarding LBJ’s presidency await Caro’s final volume, but the verdict seems quite clear:  “power tends to corrupt, and absolute power corrupts absolutely.”  And as Exhibit A there stands LBJ!

# # # 

298 Forensic Faith

    J. Warner Wallace is a retired Los Angeles County detective, noted for appearing repeatedly on NBC’s Dateline, Fox News, and Court TV, explaining thereon how to conduct “cold case” investigations.  Until he was 35 years old, he was an atheist, a religious skeptic skilled in dissecting and mocking Christian beliefs.  But, as he says in a brief booklet entitled Alive:  A Cold-Case Approach to the Resurrection (Colorado Springs:  David C. Cook, c. 2014), he “heard a pastor preach a sermon that described the resurrection of Jesus,” apparently believing “Jesus rose from the dead and was still alive today” (#6).  His interest piqued, Wallace “decided to investigate the resurrection as I would any unsolved case from the distant past.  My journey led me out of atheism to the truth of Christianity” (#12).  Subsequently, he began reading the Gospels in light of the principles basic to his work as a homicide detective.  Looking back, he recalls:  “Somewhere on my journey from ‘belief that’ to ‘belief in,’ a friend told me about C.S. Lewis.  After reading Mere Christianity, I purchased everything Lewis had written.  One quote from God in the Dock stuck with me through the years.  Lewis correctly noted, ‘Christianity is a statement which, if false, is of no importance, and if true, is of infinite importance.  The one thing it cannot be is moderately important’” (#153).

Though many great apologists (from Tertullian to Calvin to David Limbaugh) have had legal training, detective Wallace brings a unique background to his work, for:  “There are many similarities between investigating cold cases and investigating the claims of Christianity.  Cold-case homicides are events from the distant past for which there is often little or no forensic evidence” (#136).  In Cold-Case Christianity:  A Homicide Detective Investigates the Claims of the Gospels (Colorado Springs:  David C. Cook, c. 2013), Wallace effectively uses examples from his detective days to explain “ten important principles every aspiring detective needs to master” and then shows how they enable serious investigators to validate the New Testament’s claims.

First of all, “don’t be a know-it-all”!  Good detectives approach the evidence with objective humility, refusing to follow either their own presuppositions or hasty, simplistic solutions.  Lawyers and judges constantly ask jurors to disregard their biases and examine the evidence before them.  In his atheist days, Wallace took for granted the philosophical naturalism pervading our intellectual milieu, automatically rejecting miraculous or spiritual realities.  But once he set aside his presuppositions, looking carefully at the evidence presented by the Gospels, he began to see the strength of their claims.  Secondly, detectives “learn how to ‘infer,’” to follow a chain of evidence—to reason abductively.  There’s a great difference between engaging in possible (i.e. imaginable) theories and reasonable (i.e. logical) thought.  The truth that convicts a felon or persuades a historian will be feasible, straightforward, exhaustive, logical, and superior to competing theories.  Weighing the central Christian claim, that Christ arose from the dead, while considering various explanations for the account, Wallace found himself compelled to infer that something supernatural must have occurred.  Nothing else makes sense.  Thus he also discovered that faith does not negate reason; in fact, “faith is actually the opposite of unbelief, not reason.  As I began to read through the Bible as a skeptic, I came to understand that the biblical definition of faith is a well-placed and reasonable inference based on evidence” (#678).

Principle #3 is:  “think circumstantially.”  Evaluate events and witnesses’ accounts in their totality, realizing that bits and pieces of apparently unrelated materials often fall into a coherent picture when put together.  We’re tempted to disregard evidence that is “merely circumstantial” as somehow lacking significance.  But California judges, following the state’s Criminal Jury Instructions, routinely emphasize:  “Both direct and circumstantial evidence are acceptable types of evidence to prove or disprove the elements of a charge, including intent and mental state and acts necessary to a conviction, and neither is necessarily more reliable than the other.  Neither is entitled to any greater weight than the other.”  Courtroom convictions, especially in cold-case trials, are frequently made purely on the basis of “circumstantial evidence.”  It’s not necessary to have an eye-witness to a crime to convict the perpetrator!  (In fact, circumstantial evidence is often better than an eyewitness, since it cannot lie!)  Similarly, when one considers various cosmological data, it seems reasonable to conclude there’s a Creator responsible for the universe.  “The cumulative circumstantial case for God’s existence is much like the circumstantial case we made in our murder investigation” (#946).

Witnesses, of course, are vitally important to detectives, juries and judges.  But it’s important to follow Wallace’s fourth principle:  “test your witnesses.”  Information gained from a witness is invaluable, but only if he is trustworthy!  Skilled detectives master the art of reading witnesses.  Subtle clues, both in a witness’s words and physical mannerisms, often lead the investigator to believe or disbelieve what’s said.  Four “critical areas should be examined . . . .  If we can establish that a witness was present, has been accurate and honest in the past, is verified by additional evidence, and has no motive to lie, we can trust what that witness has to say” (#1101).  Multiple witnesses are even more persuasive—especially if they differ on trivial matters while concurring on essential facts.  “I would far rather have three messy, apparently contradictory versions of the event than one harmonized version that has eliminated some important detail.  I know in the end I’ll be able to determine the truth of the matter by examining all three stories” (#1121).  Reading the Gospels, with their slightly different perspectives, confirmed for Wallace their truthfulness!  “All four accounts are written from a different perspective and contain unique details that are specific to the eyewitnesses.  There are, as a result, divergent (apparently contradictory) recollections that can be pieced together to get a complete picture of what occurred.  All four accounts are highly personal, utilizing the distinctive language of each witness” (#1239).

Importantly, when interrogating witnesses, “hang on every word”—the fifth forensic principle.  Carefully recording and pondering a witness’s words often makes the difference between skilled and run-of-the-mill detectives.  When Wallace studied the New Testament, he invested much time investigating its words.  “Every little idiosyncrasy stood out for me.  Every word was important.  The small details interested me and forced me to dig deeper” (#1405).  It became obvious to him, for example, that Mark relied on Peter for his information.  Sixthly, good detectives “separate artifacts from evidence.”  Materials added to witnesses’ accounts must be considered “artifacts” and weighed less heavily when discerning what exactly happened.  Ancient documents, including the Gospels, include some artifacts, such as the account of the woman accused of adultery in John 7:53-8:11.  Careful scholarship enables one to disregard such artifacts as irrelevant to the investigation.  So too it’s important to note Principle #7—“resist conspiracy theories.”  Turning to the New Testament’s claim that Jesus arose from the grave, Wallace notes that various theories have been given to explain it, but simple assent to the testimonies of men who died for this belief makes the most sense.

Principle #8 is:  “respect the ‘chain of custody.’”  Good detectives carefully document and preserve important evidence.  So too the Early Church took care to preserve the eyewitness accounts basic to the Christian faith.  Still more:  those ancient believers seemed to embrace Wallace’s ninth principle:  “know when ‘enough is enough.’”  Given the magnitude of its claims, the New Testament is remarkably brief!  In a courtroom, juries and judges look for sufficient, not overwhelming, evidence.  What ultimately matters is what’s called the “standard of proof.”  What we want to know is what’s reasonable, not theoretically possible.  Not everything that can be said needs to be said, and certainty in a trial necessarily comes without knowing every shred of evidence.  Perfection cannot be attained in a hall of justice!  Finally, cold case detectives must always “prepare for an attack” (Principle #10).  Skilled defense attorneys will try to disprove or discount detectives’ work.  When he became a Christian, Wallace understood atheistic arguments because he had once propounded them.  Listening to the “New Atheists” who gained popularity a decade ago, he found:  “It wasn’t as though these skeptics were offering anything new.  Instead they were presenting old arguments with new vigor, humor, cynicism, and urgency.  They were much like the defense attorneys I had faced over the years” (#2241).  Dealing with them as a Christian apologist, one must, above all, insist upon the objectivity of truth, in particular the historical reliability of the Gospels.

Having introduced the reader to a detective’s guiding principles, Wallace proceeds to carefully consider various evidences pointing to the New Testament’s reliability, setting forth arguments familiar to students of apologetics.  Doing so, he shows how seriously he has studied both the Scripture and the Early Church.  Though neither a biblical scholar nor an ancient historian, he has clearly consulted the texts and sought to rightly understand them.  Thus it’s reasonable to infer “from the circumstantial evidence . . . that the Gospels were written very early in history, at a time when the original eyewitnesses and gospel writers were still alive and could testify to what they had seen” (#2744).  Considering corroborating evidence, found in both secular history and second century Christian documents, Wallace concludes that “we can have confidence that the essential teachings of the Gospels have remained unchanged for over two thousand years” (#4144).

Over the years I’ve read dozens of apologists’ treatises.  Few of them fascinated me as much as this one, primarily because of its unique, detective’s perspective.  Wallace writes clearly, understands the contemporary world, and sets forth his case persuasively.    

                                                  * * * * * * * * * * * * * * * * * * * * *

Though police officers occasionally apprehend and arrest culprits at the crime scene, detectives are called in to study the evidence “inside the crime scene” and identify the suspect who is “outside the crime scene.”  Thus detective J. Warner Wallace followed up his initial work of apologetics—Cold-Case Christianity—with a fine treatise entitled God’s Crime Scene:  A Cold Case Detective Examines the Evidence for a Divinely Created Universe (Colorado Springs:  David C. Cook, c. 2015).  To refer to creation as “God’s crime scene” might initially startle a reader, but the title makes sense when one sees a detective’s mind in action, trying to locate a person by looking at the evidence for his activity.  As was true in his earlier work, the materials presented are generally available in other works of apologetics, but Wallace provides the unique perspective of a skilled sleuth and effectively makes his case, pointing to someone “outside” the physical world responsible for its substance and structure.

Detectives like Wallace “investigate causes.  Who caused this murder?  What motivated this suspect to commit this crime?  Criminal investigations are largely causal investigations.  Detectives learn to ask good questions about causation to determine the identity of a suspect” (p. 27).  What does the evidence suggest regarding whether or not a crime was committed?  So too, what cosmological evidence leads one to conclude that a Creator created the universe?  Wallace cites copious current data (i.e. the “Big Bang” Standard Cosmological Model, including amazing details regarding a “finely-tuned” universe) that cogently point to the fact “that our universe came into being from something beyond the space, time, matter and energy of our universe” (p. 37).  Assuming the principle of sufficient reason, we wonder “Why is there something rather than nothing?” and conclude Someone—“a purposeful Fine-Tuner”—made it.  “Inside evidence” cannot fully explain the existence of the universe.  “The evidence points to a cause outside of space, time, and matter” (p. 44).  Evidence at a murder scene invariably points toward a murderer, not an accidental confluence of random events.  So too the amazingly fine-tuned cosmos provides evidence (“signs of design”) pointing toward an intelligent Artist orchestrating it all.

Origin of life questions provide yet more reasons to believe in God.  Having investigated murders, Wallace fully understands the radical difference between living persons and deceased corpses.  We can easily detect and describe the difference between the living and the dead, and as we explore the mysteries of living organisms we find that “the complexity required for cells to metabolize and reproduce is mind boggling.  Cells are packed with miniature biological machines resembling (and often exceeding) the best work of human engineers” (p. 72).  Though not a trained scientist, Wallace nicely explains what life scientists have found—amino acids, proteins, DNA, etc.  As a detective he seeks to answer the “where, what, why, when, and how” questions regarding life’s origin.  Purely naturalistic answers to the mystery of life’s origin prove ever ephemeral, especially when dealing with the vast amount of “information” basic to all that lives, for “the laws and forces of nature cannot produce information, but information is required for life to begin” (p. 321).

As manifestly evident (and deeply mysterious) as the reality of life is the reality of human consciousness!  “Consciousness poses one of the most difficult conundrums for philosophers and scientists.  As philosopher David Chalmers lamented, ‘Conscious experience is at once the most familiar thing in the world and the most mysterious.  There is nothing we know about more directly than consciousness, but it is far from clear how to reconcile it with everything else we know.  Why does it exist?  What does it do?  How could it possibly arise from lumpy gray matter?’” (p. 122).  Though persons can see and touch physical things, their mental states, their minds, are known only to themselves.  In particular, we first intend to do things and then do them.  We think about things apart from ourselves, and those thoughts are purely mental, non-material, realities.  We’re also able to think logically, following a train of argumentation that cannot be reduced to the chemical reactions within the brain.  As the renowned philosopher Thomas Nagel recently wrote:  “‘So long as the mental is irreducible to the physical, the appearance of conscious physical organisms is left unexplained by a naturalistic account of the familiar type.  On a purely materialist understanding of biology, consciousness would have to be regarded as a tremendous and inexplicable extra brute fact about the world’” (p. 136).   

Naturalistic thinkers deny the existence of free will as well as consciousness.  Though he didn’t think clearly about this in his atheist days, Wallace now sees that his deterministic philosophy actually undermined his legitimacy as a detective!  For without free will the criminal justice system has little justification.  If a killer couldn’t have avoided killing, it’s hard to see why he should be punished for his “crime.”  But in the criminal justice system:  “Personal responsibility is assigned to every person who chooses to commit a crime when he or she could have chosen otherwise” (p. 141).  Forty years ago the Supreme Court decreed that “‘a deterministic view of human conduct’ was ‘inconsistent with the underlying precepts of our criminal system.’  In fact, the Court described ‘belief in freedom of the human will and a consequent ability and duty of the normal individual to choose between good and evil’ as the ‘“universal and persistent” foundation stone in our system of law, and particularly in our approach to punishment, sentencing, and incarceration’” (p. 147).  

Along with a belief in free will, detectives almost necessarily believe in “law and order”!  The laws they seek to uphold are society’s way of declaring and enforcing morality.  Laws are necessary because there are really evil people in our world!  They reflect the fact that written on the human heart is a deep awareness of the “natural law,” the notion that good should be done and evil resisted.  Wallace has found that even “hardened criminals” who break the law hold one another accountable to certain moral standards!  The man who kills another man’s wife will inevitably condemn anyone who kills the killer’s mother!  There are right ways—and wrong ways—of treating others.  Such standards are more than personal perspectives or fleeting emotional reactions.  They point to a higher, objective standard, a “transcendental moral truth giver,” a Lawgiver, an “all-powerful, non-material, nonspatial, atemporal, purposeful, personal Creator” whose laws reflect “His nature” (p. 172).   

Having carefully examined all the relevant evidence, Wallace concluded there is Someone responsible for the world we live in, ourselves included.  “I believe God exists because the evidence leaves me no reasonable alternative” (p. 201).  In jury trials, judges explain that verdicts should be based on a “Standard of Proof.”  Jurors can never be 100% sure when they’re making decisions, but they can conclude—“beyond a reasonable doubt”—that a suspect is guilty of a crime.  As a good detective does, Wallace also points us to “expert witnesses,” scholars who have shaped his presentation, providing us with a helpful, up-to-date reading list.  

Praise for God’s Crime Scene comes from distinguished writers such as Eric Metaxas, the bestselling author of Bonhoeffer, who says:  “What if a brilliant prosecutor tried to prove the existence of God using real evidence and crystal clear arguments?  Well, that’s precisely what J. Warner [Wallace] does in this magnificent book—and you get to be the jury.  Don’t blink.  Thrilling and amazing.”  Hank Hanegraaff concurs:  “Sherlock Holmes has nothing on J. Warner Wallace.  In God’s Crime Scene, Wallace uses the tools of a world-class homicide detective to discern whether or not clues point in the direction of a Divine Intruder.  The reader can almost hear the words ‘Elementary, my dear Watson’ as Wallace evaluates the evidence for cosmic design.  A highly readable resource by which seekers and skeptics can follow truth toward its origins.” 

                                            * * * * * * * * * * * * * * * * * * 

In Forensic Faith:  A Homicide Detective Makes the Case for a More Reasonable, Evidential Christian Faith (Colorado Springs:  David C. Cook, c. 2017), J. Warner Wallace urges us to take the apologetic materials presented in his two earlier works and effectively use them as we interact with our increasingly secular culture.  All too many Evangelicals, says John Stonestreet, President of the Colson Center for Christian Worldview, dismiss such endeavors.  In his judgment:  “It certainly sounds spiritual to say things like, ‘Arguments never saved anyone,’ or, ‘No one is ever argued into the kingdom.’  Such are, however, silly straw men” (#193).  In fact, most folks come to faith when they find good reasons to do so.  Truth matters!  “To be human is to reason, to reflect, and to ask questions about life and its meaning” (#198).  And, Stonestreet insists, “Christianity is really True.  With a capital T.  True for everyone, whether they believe it or not.  Christianity describes reality as it actually is” (#209).  

Policemen like Wallace are committed “to protect and to serve” the public.  It’s an honorable—indeed a sacred—calling.  To protect and serve our world, as C.S. Lewis did nearly a century ago, we Christians need to know what we believe and explain it clearly.  We need a “forensic faith,” a faith that can withstand public scrutiny and rigorous argumentation, for in addition to caring for the poor and homeless we need to provide Truth for hungry minds.  We need to become skilled “case makers,” committed to bearing witness to our Lord.  To do so we need solid biblical teaching—and Wallace provides a plethora of texts supporting his position—but we also need good training, learning how, as Origen said centuries ago, “to do battle for the truth.”  Parents and pastors, to protect and serve young believers, simply must engage them in activities designed to discipline them, making them strong disciples able to withstand the challenges and intellectual battles awaiting them.  Taking students on a “forensics” mission trip to UCLA may be more important than a “compassionate” mission trip to Mexico!  

To that end, it would be wise to use Wallace’s books—and to download free materials from his website, ColdCaseChristianity.com!  Inasmuch as far too many collegians forsake their religious views, youth pastors especially should ponder the case he builds for a Forensic Faith!      

297 Proof of Heaven

   Eben Alexander’s Proof of Heaven:  A Neurosurgeon’s Journey into the Afterlife (New York:  Simon & Schuster, c. 2012) is a fascinating, persuasive personal “life-after-life” account given credibility by the author’s medical training and cogent presentation.  After receiving his M.D. from Duke University Medical School, he pursued post-doctoral study and taught for 15 years at Harvard Medical School, operating on “countless patients” and becoming quite expert in dealing with brain injuries.  Though nominally religious (attending an Episcopal church at Christmas and Easter), he’d struggled with some personal issues and doubted the basics of the Christian faith, including the reality of “God and Heaven and an afterlife” (p. 34).  Believing, with Albert Einstein, that “a man should look for what is, and not for what he thinks should be,” he takes a scientific stance, determined to deal with the realities he encountered as a result of a “near-death” experience which forever “changed his mind” regarding heaven.  

In 2008, at the age of 54, Alexander fell ill with bacterial meningitis—“arguably the best disease one could find if one were seeking to mimic human death without actually bringing it about” (p. 133)—and lapsed into a deep coma for seven days.  While his brain shut down completely—“it wasn’t working at all” (p. 8)—he encountered “the reality of a world of consciousness that existed completely free of the limitations of my physical brain” (p. 9).  Consequently, he concluded:  “My experience showed me that the death of the body and the brain are not the end of consciousness, that human experience continues beyond the grave.  More important, it continues under the gaze of a God who loves and cares about each one of us and about where the universe itself and all the beings within it are ultimately going.”  He now knows:  “The place I went was real.  Real in a way that makes the life we’re living here and now completely dreamlike by comparison” (p. 9).  Having encountered Ultimate Reality, he asserts:  “What I have to tell you is as important as anything anyone will ever tell you, and it’s true” (p. 10).  

While Alexander was in the coma, doctors ran all the sophisticated tests modern science prescribes, preserving graphs and images of his damaged brain.  Though his brain showed no activity, he journeyed first into a dark “underworld filled with repulsive creatures and foul smells.”  Then a light descended into the darkness and he heard “a living sound, like the richest, most complex, most beautiful piece of music you’ve ever heard” (p. 38).  Suddenly he was ushered into a beautiful new world—“The strangest, most beautiful world I’d ever seen” (p. 38).  “Below me was countryside.  It was green, lush, and earthlike.  It was earth . . . but at the same time it wasn’t” (p. 38).  He’d entered a really Real world!  A beautiful young “Girl on the Butterfly Wing” joined him, giving him a “look that, if you saw it for a few moments, would make your whole life up to that point worth living, no matter what had happened in it so far” (p. 40).  (After he recovered, he received a picture of one of his deceased biological sisters—whom he’d never seen, even in a picture—and realized the “Girl” looked exactly like her!)  Without speaking she gave him a wonderful message:  “‘You are loved and cherished, dearly, forever.’  ‘You have nothing to fear.’  ‘There is nothing you can do wrong’” (p. 40).  At that moment, Alexander felt “a vast and crazy sensation of relief.  It was like being handed the rules to a game I’d been playing all my life without fully understanding it” (p. 40).  He found his deepest questions answered, but not with words.  “Thoughts entered me directly” (p. 46).  He also felt himself immersed in the Reality of God.  Indeed, “there seemed to be no distance at all between God and myself.  Yet at the same time I could sense the infinite vastness of the Creator, could see how completely minuscule I was by comparison” (p. 47).  

Still more, he understood:  “The world of time and space in which we move in this terrestrial realm is tightly and intricately meshed within these higher worlds.  In other words, these worlds aren’t totally apart from us, because all worlds are part of the same overarching divine Reality” (p. 48).  Because of his illness, he’d taken a remarkable out-of-body “tour—some kind of grand overview of the invisible spiritual side of existence” (p. 69).  And, above all, he’d learned a priceless truth:  he—and we—are loved.  Every one of us!  “Love is, without a doubt, the basis of everything” (p. 71).  This truth is as certain to Alexander as any of the scientific truths necessary for his vocation as a surgeon.  “The unconditional love and acceptance that I experienced on my journey is the single most important discovery I have ever made, or will ever make, and as hard as I know it’s going to be to unpack the other lessons I learned while there, I also know in my heart that sharing this very basic message—one so simple that most children readily accept it—is the most important task I have” (p. 73).  

Applying his scientific understanding of the human brain—and the mind/brain/consciousness questions that have forever fascinated philosophers—Alexander tries to explain how the physical brain serves as a “kind of reducing valve or filter, shifting the larger, nonphysical consciousness that we possess in the nonphysical worlds down into a more limited capacity for the duration of our mortal lives” (p. 80).  We are, spiritually, in touch with an Ultimate Reality that we rarely sense because our brains too easily restrict our consciousness to material realities.  But there is a vast, mysterious universe that is purposeful and spiritual.  Indeed:  “The physical side of the universe is as a speck of dust compared to the invisible and spiritual part” (p. 82).  We are primarily spiritual beings, designed and destined for eternal life with God.  “This other, vastly grander universe isn’t ‘far away’ at all.  In fact, it’s right here . . . .  It’s not far away physically, but simply exists on a different frequency.  It’s right here, right now, but we’re unaware of it because we are for the most part closed to those frequencies on which it manifests” (p. 156).  

When, after seven days, Alexander emerged from his coma, his family observed him smiling.  “‘All is well,’ I said, radiating that blissful message as much as speaking the words.  I looked at each of them, deeply, acknowledging the divine miracle of our very existence” (p. 113).  He was, miraculously, well!  “In fact—though at this point only I knew this—I was completely and truly ‘well’ for the first time in my entire life” (p. 123).  With each passing day his neuroscientist’s knowledge returned.  And so did his “memories of what had happened during that week out of my body . . . with astonishing boldness and clarity.  What had happened outside the earthly realm had everything to do with the wild happiness I’d awakened with, and the bliss that continued to stick with me” (p. 124).  Still more:  he was “also happy because—to state the matter as plainly as I can—I understood for the first time who I really was, and what kind of a world we inhabit” (p. 124).  

Above all, he’d encountered what’s really Real!  “What I’d experienced was more real than the house I sat in, more real than the logs burning in the fireplace.  Yet there was no room for that reality in the medically trained scientific worldview that I’d spent years acquiring” (p. 130).  His own experience led him to plunge “into the ocean of NDE [Near Death Experience] literature” (p. 131).  He found his experience amply confirmed by others!  Years earlier he’d heard about Raymond Moody’s Life After Life, but he’d neither read it nor considered its evidence.  Now he read it carefully and affirmed its contents.  But Alexander also realized that (compared with many other NDEs) his “was a technically near-impeccable near-death experience, perhaps one of the most convincing such cases in modern history.  What really mattered about my case was not what happened to me personally, but the sheer, flat-out impossibility of arguing, from a medical standpoint, that it was all fantasy” (p. 135).  

After a lengthy convalescence, Alexander made his way to church.  To his amazement, the music and architecture which had left him unmoved before his NDE now touched him deeply.  “At last, I understood what religion was really all about.  I didn’t just believe in God; I knew God.  As I hobbled to the altar to take Communion, tears streamed down my cheeks” (p. 149).  That heavenly realm he’d visited while in a coma was, in fact, the same realm celebrated in Christian worship.  Opening our minds to God in meditation and prayer ushers us into that eternal realm wherein we can directly communicate with God, knowing Him as He Is.

                                      * * * * * * * * * * * * * * * * * * *

Mark Twain once quipped:  “The two most important days in your life are the day you were born and the day you find out why.”  As the subtitle of Life After Heaven:  How My Time in Heaven Can Transform Your Life on Earth (New York:  WaterBrook, c. 2017) indicates, Steven R. Musick is less concerned with his own Near Death Experience than with encouraging us to live in the light of truths he discerned therein.  Musick begins by detailing his early life—growing up in Denver, accepting Jesus as his Savior at the age of seven, devoutly attending an Episcopal church.  Financially unable to finish his studies at the University of Colorado, he enlisted in the Navy and was sent to the Great Lakes Naval Station in Chicago in 1975.  He fully embraced and enjoyed the military life and managed to qualify for both the Naval Academy and the SEAL training school.  The prospects of a military career seemed bright.  But then he was given a routine flu inoculation that adversely affected him; when he didn’t recover he was given a “lethal dose” of aminophylline, to which he was unknowingly allergic.  He fell into a five-week period of unresponsive unconsciousness.    

Losing consciousness, he suddenly was weightless, flying through a white tunnel, transported to another realm of reality—“That Place.”  He stood (mysteriously in his own “body”) in a “rolling green meadow,” immersed in indescribable light.  “I barely know how to describe the vibrancy of it all.  It’s like super high-definition television on steroids.  Everything is crystal clear” (p. 38).  It was a world of sheer beauty filled with wondrous music and pure joy.  “It is a perfect paradox of heaven:  I feel absolutely held and absolutely free.  I am physically feeling God’s security.  The safest place imaginable is in the arms of the Father.  Once you’ve felt that, it’s all you want.  Nothing else, from that day to this, satisfies.  It is the overwhelming, wonderful sensation of being held” (p. 39).  He felt utterly at home, being where he was designed to be.  He also saw Jesus.  “He’s a person.  Not a shadowy figure, no figment of my imagination, not translucent or some floating being.  A person.  Solid” (p. 41).  As they talked about the author’s life, Jesus’ “words reveal there is purpose behind it all, a plan woven through my life.  It gives meaning to every moment of it.  And it is okay” (p. 42).  Consequently, “I begin to see my life from the perspective of heaven.  And how different it looks” (p. 43).  

Though he didn’t want to leave heaven, he awakened from his coma and spent many weeks convalescing in the naval hospital.  He struggled to breathe, since his illness had reduced his lung capacity to a third of normal.  Thus disabled, he was discharged from the Navy, moved back to Denver, married his sweetheart, and went back to school.  Unable to find employers willing to hire an obviously unwell employee with a compromised immune system, he began his own business as a financial adviser, becoming modestly successful in time.  Though he constantly remembered his visit to “That Place,” he said nothing to anyone about it because he couldn’t make sense of it.  He did become deeply religious, however, spending much time in Bible study and prayer.  Various experiences reminded him of God’s abiding presence, but he was resigned to living with his infirmity, unable to do many of the daily things most of us take for granted.  

He and his wife attended various churches for many years but never found a permanent home.  Then, in 1984, they discovered Denver Vineyard, a “classic Vineyard” congregation that prayed for and believed in miracles.  He thought, for a couple of years, that miracles surely happened—but not that he might experience one!  Then one night, struggling with his disability, he felt impressed to attend a service.  “The worship was so powerful that night.  I don’t remember the message at all, just a growing sense of God’s presence, the knowledge that we were in a holy place.”  At the end of the service, the pastor invited people who wanted to pray to come forward.  Musick remained seated, but the pastor said:  “‘Wait a minute.  Someone here has been dealing with a malady for years.  A decade.’”  He then added:  “‘You’ve been sick all week.  Sick sick.  I think you have a respiratory thing’” (p. 102).  

Musick was astounded, as he’d told no one in the church about his illness.  He felt prompted to get up and walk to the front of the sanctuary.  He made it half-way.  An associate pastor met him there and put his hand on his chest.  “It felt like electricity went through my body.  I fell to the ground” (p. 103).  He found himself re-entering heaven— “That Place” he’d explored a decade earlier—seeing the “same sights, smell, and sounds” (p. 103).  Again he met and talked with Jesus.  Then he awakened “on the floor of the Denver Vineyard church.”  Getting to his feet, he took “a full breath of air” for the first time in ten years.  He’d been dramatically, miraculously healed in an instant!  Driving home, he talked with God, enjoying an intense intimacy with the Father.  For the first time he shared with his wife details concerning his earlier entrance to heaven, enabling her to better understand and rejoice with him.  His skeptical doctor took out his stethoscope and discovered that his lungs sounded “clear and healthy.”  

Subsequently, Musick intensified his life of prayer, study, and worship.  He and his wife joined a “team that prayed for people” in the church, and they witnessed wonderful healings.  Though he testified regarding his own healing, he didn’t feel inclined to share the heavenly visit that came with his Near Death Experience.  Then, in 2011, he felt impelled to bear witness to what happened to him.  More importantly, however, he wanted to use his experience as a vehicle with which to tell us that “Heaven is a lot closer than you think.”  And if we pay attention to the little “bubbles of heaven” that frequently occur we can live more joyously and productively in Christ’s Kingdom.  “God intends for all his people to experience and to encourage heaven to come to earth.  He wants his presence and power to impact our everyday lives.  He wants his love to characterize our lives” (p. 121).  

                                        * * * * * * * * * * * * * * * * *

Chauncey Crandall, a graduate of the Yale School of Medicine, is a skilled cardiologist who fervently believes the Christian message.  He lives and works on Palm Beach Island in Florida, where “Business moguls, celebrities, major media personalities, music artists, bestselling authors, and athletes either get their daily mail . . . or have their second or third homes.”  At the age of 19, working as a hospital orderly, he encountered death for the first time and “decided I hated death and would devote myself to fighting it with everything I could muster” (p. 2).  Pondering the spiritual as well as physical aspects of dying, he became a Christian, though for a number of years his scientific training kept him from diligently practicing his faith.  “Little did I know, I needed a major dose of God (and more specifically, of His Son, Jesus, and the Holy Spirit) to be able to operate at full capacity in my faith” (p. 33).  

In Touching Heaven:  A Cardiologist’s Encounters with Death and Living Proof of an Afterlife (New York:  FaithWords, c. 2015), Crandall urges us to live better lives by living attuned to heaven.  As a physician, he routinely sees “evidences of the next realm all the time, in my work and ministry; every day, this life gives us glimpses of the next.  These snapshots—from my patients’ bedsides and my personal experiences—are what I want to share with you” (p. 4).  The physical and spiritual realms interpenetrate.  “Both realms are real, just as surely as God is real.  And because of these realities, I now know that life doesn’t end here” (p. 5).  Still more:  he believes “the Lord can make available to us everything here on earth that is available in the kingdom of heaven” (p. 93).  

As he began his medical practice, Crandall talked with people who’d had “out-of-body” experiences.  Some described being suspended above the surgical bed watching doctors work frantically trying to save them.  A classmate in medical school described Jesus sitting in his room when he was deathly ill.  Often he heard of angels appearing and effectively helping people.  But it was only when (in 2000) his own son Chad became sick with leukemia that he seriously began to attend to spiritual realities.  Until then he’d “balked at any intimations that I might need or want more of God” (p. 31).  He’d become an expert at “looking” at patients, noting their symptoms and seeking to heal them.  But by “looking” he could only see material realities.  Then he learned to “see”—to discern deeper and higher realms of reality wherein miracles occur.  “God surrounds every one of us with His kingdom at every turn—with messages and messengers, signs and gifts—and He has given it all to us so that we would turn our eyes and hearts toward Him.  Some people don’t notice because they doubt that He cares.  Many more, though, are missing daily hints of Him simply because they’re not paying attention” (p. 22).  Enabled to see clearly, he beheld “a universe crafted by an Artist who longs to express who He is and deeply connect with all He has created, but who is particularly focused on the ones He fashioned in His image” (p. 23).  

Once Crandall’s son became ill, he began “testing the universe,” fervently seeking to fully know God.  “Having a son diagnosed with leukemia activated my faith like nothing else had—making me vividly aware of the reality and presence of heaven.  It accelerated my spiritual growth” (p. 54).  He and his wife began visiting various churches, looking for revivals at home and abroad, thinking more deeply about the Bible and Christian theology.  From nominally attending a Presbyterian church he moved into Pentecostal circles.  He attended services where people were instantly healed, where the bread and wine for a communion service mysteriously multiplied to supply an unexpectedly large congregation, where 500 youngsters were fed with only 200 prepared meals.  He began to understand the true greatness of God who is very much with us and working miracles for us.  “As time went on, I thought, if He is this big in the world, then He can be even bigger in my medical practice, which opened me up to praying for every patient who would let me (nearly every one of them, as it turns out)” (p. 39).  Seeing some 150 patients a week, he has found his own “mission field.”  “The more I have invited heaven into the operating and exam room, the more healing power I have seen at work—and the more others have recognized the hand of God” (p. 73).  Having personally seen prayers answered for patients who were in comas (even for one man definitively pronounced dead) he confidently attests to the reality of Near Death Experiences validating the reality of heaven—the “really real world.”     

While witnessing many miracles, however, Crandall had to watch his own son fail in his struggle with leukemia.  Even though his “miracle research” prompted him to believe “Chad could be healed, and that prayer was a means to it” (p. 55), his son died.  Trusting medical science as well as prayer, he and his wife secured the best care possible, including a bone marrow transplant from his twin brother.  They tried everything!  In the midst of many dark hours, they sensed God’s presence, though when Chad died Crandall “see-sawed between numbness and anger” (p. 151)—inevitable, human feelings.  In the end:  “Chad’s battle was over.  We as a family had fought our fight with everything we had, and while the enemy may have been rejoicing, thinking that cancer had won, we knew the truth:  Chad was now in heaven’s care—now fully healed” (p. 157).  

# # # 

296 Rebuilding the Culture

    Anthony Esolen, a highly-regarded, scholarly translator of Dante’s Divine Comedy, has written a number of general interest works, including The Politically Incorrect Guide to Western Civilization—a stirring defense of Christian Culture as well as the civilization derived from Jerusalem, Athens, and Rome, which developed in Europe during the Middle Ages.  For 25 years Esolen taught courses in English and Western Civilization at Providence College, though he was just recently forced to leave after writing an article entitled “My College Succumbed to the Totalitarian Diversity Cult.”  (He’s learned that one dare not challenge the secular dogmas now reigning in academia, even in allegedly Catholic institutions!)   

Many of Esolen’s core convictions give structure to his just-published Out of the Ashes:  Rebuilding American Culture (Washington, D.C.:  Regnery Publishing, c. 2017).  He begins with a caveat, promising to “indulge myself in one of civilized man’s most cherished privileges.  I shall decry the decay of civilization” (#56 in Kindle).  Doing so, he identifies with the ancient historian Livy, writing at the time of Christ, who lamented Rome’s moral collapse, “with duty and severity giving way to ambition, avarice, and license, till his fellow Romans ‘sank lower and lower, and finally began the downward plunge which has brought us to the present time, when we can endure neither our vices nor their cure’” (#59).  Though both Livy and Esolen doubtlessly exaggerate the cultural decay of their eras, both merit careful reading and reflection regarding their concerns, for:  “Sometimes entire civilizations do decay and die, and the people who point that out are correct” (#107).  In fact:  “Winter comes and goes in the affairs of men and nations and cultures, and if they are to survive at all they must plant seeds:  they must remember.  What happens if they neglect the planting?” (#131).  So along with alerting us to the culture’s decadence, Esolen wants to challenge us to faithfully plant good seeds in well-tilled, healthy soil, patiently awaiting their flowering.    

To do so we must implement the title of his first chapter:  “Giving Things Their Proper Names:  The Restoration of Truth-Telling.”  Created in God’s image, Adam was tasked with seeing the essence of and accurately naming other creatures.  Confucius rightly noted “that the beginning of wisdom is to give things their proper names” (#259).  Nevertheless, the history of our race reveals a perennial proclivity for lying!  In our day, for example, “pro-choice” devotees routinely lie when describing the unborn babies they want to kill—they endlessly talk about “reproductive rights” and female freedom.  Then we’re told anyone can choose any “gender” he desires and that “a woman can make as good a soldier as a man” (#311).  Given such widespread deceits, we must become counter-cultural and honestly describe things as they are.  “Things, in their beautiful and imposing integrity, do not easily bend to lies,” says Esolen.  “A bull is a bull and not a cow.  Grass is food for cattle but not for man.  A warbler is alive but a rock is not.  The three-hundred-pound stone will not move for a little child or a boy or a feminist professor.  Water expands when it freezes and will break anything unless you allow for that.  Things are what they are.  They know no slogans, and they do not lie.  And they give witness to the glory of God” (#487).  

Further witnessing to the glory of God, we must restore a “sense of beauty.”  In The Strange Death of Europe, Douglas Murray laments the state of modern art, which “nearly all has the aura of a destroyed city.”  Forsaking transcendent meaning or truth, today’s artists “stop aiming to connect to any enduring truths, to abandon any attempt to pursue beauty or truth and instead to simply say to the public, ‘I am down in the mud with you.’”  In particular, he says, modern art “has given up that desire to connect us to something like the spirit of religion or that thrill of recognition—what Aristotle termed anagnorisis—which grants you the sense of having just caught up with a truth that was always waiting for you” (#4783).  

Though highly advanced in many ways, we are literally starved for beauty.  Esolen seriously ponders the example of Henry Adams—the grandson of President John Quincy Adams.  Visiting the Great Exposition in Paris in 1900, where he pondered the panoply of technical marvels on display, Adams fled some fifty miles southwest to take refuge in Chartres’ glorious medieval cathedral.  The difference between the artistry evident in Chartres and the mechanical genius on display in Paris moved him to write his classic study—Mont St. Michel and Chartres.  Though very much a skeptic as regards things theological, Adams sensed the almost infinite distance between the beauty of an edifice devoted to God and the whirling machines devoted to human consumption.  “‘Four fifths of [man’s] greatest art,’ said Henry Adams, was created in those supposedly dark days, to the honor of Jesus and Mary.  The Enlightenment destroyed more great art than it produced, and what the harbingers of the novus ordo saeclorum did not get around to destroying they slandered” (#624).  Recognizing this, we must begin patiently planting the seeds of beauty, especially in our churches.  Truly beautiful poetry and music must be reinstated in our “worship” centers, where too often the tawdry, tasteless, and momentarily fashionable hold sway.  

Persuaded that “a mind is a terrible thing to baste,” Esolen urges us to set about “restoring” schools and colleges to their rightful place in our culture.  This is not to say he favors funding the public schools, which are beyond reform!  Indeed, he argues, the one-room schools of a century ago did a better job of educating youngsters than do today’s massive consolidated training centers.  “A monstrous thing has taken its place—not just a parasite or a cancer feeding off the host, but a disease that has slowly transformed the host into itself, like an all-eating and all-digesting alien.  The word school remains, but not the reality” (#861).  Distressed at the impoverished language skills of his university students, he concludes they have learned “no grammar in grammar school,” so it’s evident “there is not much school there, either” (#854).  Failing to teach grammar, our schools rob our students of the chance to master the English language.  Failing to emphasize the names and dates of history, our schools graduate youngsters without any knowledge of the past.  And most importantly, by eviscerating religion from the curriculum, the schools are trying “to win a temporary consensus by sacrificing what the education of a human being ultimately is for.  We avoid religious questions at the cost of avoiding the most human questions.  And thus education, which should be human, is reduced to the mechanical and the low” (#1178).  Similarly, he finds the nation’s colleges little better than the schools.  Institutions once dedicated to the pursuit of truth through free inquiry now serve as censors, enforcers of political correctness.  In prestigious universities, such as Princeton, students who once studied Shakespeare now slouch through courses on Young Adult Fiction!  The motto of both Harvard University and the author’s own Providence College is Veritas:  Truth.  “The old mottoes assumed the existence of God, the moral law, and the beauty of pursuing truth” (#1266).  While still engraved in stone, such mottoes no longer describe the modern universities.  They no longer attain their ends.  Consequently, if we want to rightly educate our children we must build new schools and colleges, clearly committed to the treasures of Western Civilization that will nourish youngsters’ souls.  

Much that Esolen desires can be attained through “repudiating the sexual revolution:  restoring manhood.”  Indeed, Esolen insists, “Christians must repudiate the whole sexual revolution.  All of it” (#1582).  Recovering the biblical distinction between men and women and restricting behaviors in accord with their natures will prove difficult in 21st century America, but it simply must be done.  “We have to recover a strong sense of the beauty of each, and of their essence as being-for the other; man is for woman, and woman is for man, and both are for God” (#1635).  Countering our gender-bending society, determined to sanction something without precedent in human history, we must create ways for boys to become men.  Without strong, masculine, patriarchal leaders, our culture cannot be revived.  Esolen urges us to “take an honest look at what happens when men retreat from the public square.  You do not get rule by women.  You get anarchy” (#1763).  To see this up close, simply visit sections of Chicago any day of the week!  

Manly men, virtuous men who know that “truth is more important than feelings,” are in short supply these days!  In 1886, penning The Bostonians, Henry James envisioned the disastrous impact of  “the most damnable feminization” that would result if feminism prevailed.  Speaking through his protagonist, Basil Ransom, he said:  “‘The whole generation is womanized; the masculine tone is passing out of the world; it’s a feminine, a nervous, hysterical, chattering, canting age of hollow phrases and false delicacy and exaggerated solicitudes and coddled sensibilities, which, if we don’t look out, will usher in the reign of mediocrity, of the feeblest and flattest and the most pretentious that has ever been.  The masculine character, the ability to dare and to endure, to know and yet not fear reality, to look the world in the face and take it for what it is . . . that is what I want to preserve, or rather, as I may say, to recover’” (#1886).  

And good men will restore the womanhood needed to make homes for families.  Men build houses, but women transform them into homes, the most human of places.   Countering a culture that urges young women to ape men, Esolen urges them to be thoroughly feminine, but not feminists!  Their naturally compassionate, nurturing hearts thrive “best at the hearth, the bedside, the table.  It is the passionate self-giving that makes the home” (#2065).  Women’s home-work, especially rearing children, has a limited focus but unlimited value.  Always remember:  “The world hates the family.  The state is the family’s enemy.  The state grows by the family’s failure, and the state has an interest in persuading people that the family can do nothing on its own.  It hates fatherhood, and makes little pretense otherwise.  It hates motherhood, though it makes a show of championing the unwed mother as well as the mother who, as the ugly phrase puts it, ‘has it all,’ though a moment’s reflection should suffice to show that no one can give his or her all to a career and a family and the local community” (#2188).   

Following discussions of work, leisure, and politics, Esolen finishes his treatise by proposing we  embrace the life of  “pilgrims, returning home,” singularly intent on reaching heaven.  “The pilgrimage was the way of the Cross”—quite different from the current progressives’ endeavors to construct a heaven-on-earth; it requires “you to bend your knee in penitence for your sins” rather than blaming others (past and present) for whatever’s wrong with the world (#3050).  Christians must acknowledge that the world is not our home, and we can help it only by being truly Christian, marked with the Character of Christ.  “He who would save a culture or a civilization must not seek first the culture or the civilization, but the Kingdom of God, and then all these other things, says Jesus, shall be given unto him as well” (#3163).  

* * * * * * * * * * * * * * * * * * * * 

Currently the editor of First Things (my favorite journal), R. R. Reno has written extensively for both scholarly and popular audiences.  As this century dawned, he published In the Ruins of the Church:  Sustaining Faith in an Age of Diminished Christianity (Grand Rapids:  Brazos Press, c. 2002), seeking “to provide spiritual guidance to Christians seeking faithfulness within increasingly dysfunctional churches” (p. 13).  Like Nehemiah of old, he argued we must settle into the ruins of Jerusalem (or the Church) and rebuild her walls.  Reared an Episcopalian, he wrestled with the somber truths regarding his denomination’s disunity and decay.  What was needed was “ressourcement, a return to the sources” (p. 94), preeminently the Scripture, Richard Hooker and the ancient Fathers.  But despite his yeoman-like effort to propose reform within his denomination, there was a latent pessimism underlying his words, leaving the reader wondering if the faith could be sustained.  Thus it was not particularly surprising when Reno was received into the Roman Catholic Church in 2004, explaining:  “as an Episcopalian I needed a theory to stay put, and I came to realize that a theory is a thin thread easily broken.  The Catholic Church needs no theories.”  

Reflecting Reno’s recent position, Resurrecting the Idea of a Christian Society (Washington:  Regnery Faith, c. 2016) acknowledges a “dark side to our national character,” a poverty that is spiritual and ethical rather than economic. “Many now live without a Father in heaven.  Political correctness denies the patrimony of a workable cultural inheritance.  For an increasing number of young people, there’s not even a father at home.  A nation of orphans, literal or metaphorical, will not long endure” (#55).  Surfeited with “health, wealth, and pleasure,” many of us have little interest in either transcendental realities or the needs of our fellow men.  But we desperately need a society that “encourages human flourishing to the degree that the supernatural authority of God’s revelation is proclaimed and the natural authority of his creation sustained” (#97).  Without seeking to legally establish our convictions, Christians should “say, out loud and with confidence, that we’re best off when we live under the authority of the permanence of marriage, accept the duties of patriotism, and affirm the supernatural claims the church makes on our souls” (#97).  

Thus there is, as Reno titles his first chapter, “The Need for a Christian Society.”  To understand this need in America, we must first understand what distinguishes this nation.  To Reno, what most  Americans value above all is the freedom long celebrated by frontiersmen—whether cowboys in Wyoming or the “New Frontiersmen” in John F. Kennedy’s White House.  “Live free or die!”  To “make something” of ourselves, to become “whatever we want to be,” rather defines the American way.  While restricted (as it was in the 18th and 19th centuries) to economic and political realms, such freedom incubated much good; it was primarily a positive “freedom for” human flourishing.  But it took unexpected turns—following a negative “freedom from” recipe in the 20th century—as increasing numbers of persons and groups declared their determination to secure various kinds of “rights.”   Increasingly, folks justified licentiousness (e.g. fornication) and violence (e.g. abortion) while mouthing relativistic and reality-defying slogans.  Yesterday’s “liberals” have become “progressives,” promoting same-sex marriage and “transgender” rights, thus seeking “freedom from human nature itself, a goal that fosters a Jacobin spirit determined to destroy all that stands in its way” (#325).  “The moral relativist is defending freedom, the freedom to define moral truth for oneself” (#308).  

However alluring—however embedded in our national consciousness—such “freedom” is fundamentally false.  “Freedom properly understood is based in a pledge of loyalty, not a declaration of independence.  Our country’s freedoms arise from eternal verities affirmed, not ties severed.  As the Declaration of Independence says, ‘We hold these truths to be self-evident.’  The first and fundamental act is holding, not choosing, standing fast in truth, not making it up.  We are freest when we acknowledge the authority of the truth, not when we seek a god-like independence from all limits” (#408).  Rather than trusting ourselves, we need to listen to seasoned authorities who prescribe wholesome ways to live, for “Our American dream of freedom will become a nightmare if we do not put it in the loyal service of something greater than ourselves” (#479).  

That “nightmare” stands revealed in Charles Murray’s Coming Apart:  The State of White America, 1960-2010, wherein he describes ominous cultural trends, including the abolition of marriage, the scarcity of good jobs (especially for men), and the absence of religion in working class communities (deemed the “weak” by Reno, since they have been mistreated by the elites ruling the nation).  We’re witnessing a class-war between highly-educated elites who control the nation and ordinary folks who must suffer their policies.  Importantly:  “The weapon of mass destruction in our war on the weak has been moral relativism, heedlessly deployed by an elite culture in love with critical strategies for disenchanting old, inherited moral norms” (#629).  More desperately than food stamps and unemployment benefits, America’s working class needs “clear rules that direct them toward decisions that help them lead dignified lives” (#807).  But our academic and cultural elites, determined to impose their toney nonjudgmentalism on the nation, refuse to sanction such rules.  

Confronting the plight of the nation’s poor, Christians must respond appropriately.  Above all this means:  “A Christian society judges nonjudgmentalism unjust” (#808).  A century ago, when the “poor” were economically deprived, many Christians embraced the “Social Gospel” and sought to improve their material conditions.  “Today’s social gospel movement must have the courage to be judgmental” (#1051).  With today’s “poor” suffering from nonjudgmental moral relativism, lacking social capital rather than financial capital, Christians face a more cerebral task, needing to craft a cogent rationale for spiritual renewal.  We “who care about the teaching of Jesus must reckon with a singular fact about American poverty:  its deepest and most destructive effects, its most serious deprivations, are not economic but moral” (#871).  If you’re so inclined, give money to rescue missions or volunteer with Jimmy Carter in building a house with Habitat for Humanity, but “you will do more for the poor by resisting nonjudgmentalism.  Exercising the preferential option for the poor means having the courage to use old-fashioned words such as ‘chaste’ and ‘honorable,’ putting on a tie, turning off trashy reality TV shows, and maintaining standards of deportment.  It means restoring a public culture of moral and social discipline” (#900), including pro-marriage legislation and back-to-basics curricula in the public schools.  

Most fundamentally, Christians rebuild the culture by supporting the Church.  Church-going folks promote a good society.  Religious people donate more time and money to community endeavors than their secular counterparts.  Their generosity flows from their theological convictions.  Loving God and their neighbors, they inevitably promote the commonweal, for they care for both the local congregation and the world-wide Christian community.  They are both appropriately patriotic and concerned for global needs (amply evident in missionary and humanitarian endeavors).  Christians following Jesus in our materialistic culture must commit to openly seeking more than health, wealth, and pleasure.  Confronting a culture in which materialism “denies the existence of higher things, and relativism denies we could know about them even if they did exist” (#1849), we must defend those “higher things” basic to a truly good life.  

Doing so, there’s “the possibility of a Christian society.”  Despite the many laments regarding the decline of religion in America, there remains a “committed core” of Christians—roughly one-fourth to one-third of the populace—who can provide the needed leaven for a resurgent, deeply Christian culture.  These church-going, Bible-believing folks repudiate the sexual revolution, reject same-sex marriage, oppose divorce, and condemn the notion that “as long as we don’t hurt others, we should be able to live however we want.”  In short:  “Because they have a culture, the Faithful can be countercultural” (#2220).  Consequently they generally must live “on the peripheries of cultural and institutional power” (#2250).  But throughout history creative minorities have been the salt and light of the world, and there’s a world of potential in these faithful followers of Jesus.  

Concluding his treatise, Reno admits that:  “It’s easy to be demoralized.  Many powerful forces want to make us ‘dhimmis,’ the Muslim term for non-Muslims who are tolerated as long as they don’t evangelize or challenge the supremacy of Islam” (#2344).   But we must take heart—the Church of Jesus Christ has endured for 2,000 years, and there’s good reason to hope she will continue to thrive, in various ways and various parts of the world, until He comes again.  

# # # 

295 Vanishing Adults

 One of today’s most accomplished United States Senators, Nebraska’s Ben Sasse, has recently published The Vanishing American Adult:  Our Coming-of-Age Crisis—and How to Rebuild a Culture of Self-Reliance (New York:  St. Martin’s Press, c. 2017).  He felt impelled to write this treatise by the growing conviction that “our entire nation is in the midst of a collective coming-of-age crisis without parallel in our history.  We are living in an America of perpetual adolescence.  Our kids simply don’t know what an adult is anymore—or how to become one” (#47 in Kindle).  He realized this problem while serving as the president of Midland University (affiliated with the Evangelical Lutheran Church in America) in Fremont, Nebraska, where many students seemed adrift and unable to assume adult responsibilities.  He also awakened to the fact that his three “pampered daughters” seemed unprepared to flourish in the world awaiting them.  

Sasse devotes the first section of his book to “the problem:  How do we know the situation with our kids has really gotten worse?” (#146).  In essence, the problem is the passivity evident in J. M. Barrie’s story of Peter Pan, who wanted neither to attend “school and learn solemn things” nor to “be a man” (#206).  He wanted to be (in the words of Bob Dylan) “forever young,” without tasks or accountability.  This is something new in America, where children early worked with their parents and, Alexis de Tocqueville observed, “appeared not to need an adolescent stage at all” (#599).  Still more:  Peter Pan represents, for Sasse, the alarming fact that “No civilization has ever embraced endless adolescence” (#219).  Throughout human history children moved rather rapidly through adolescence to adulthood; they aspired to be adults and early imitated their ways.  (Today, strangely enough, many adults remain childish and seek to dress and behave in accord with youthful fashions!)  

It’s clear that American youngsters are getting “softer.”  Childhood obesity has skyrocketed from less than one in twenty 50 years ago to one in five today.  The toy industry, which hardly existed a century ago, now hauls in a billion dollars a year.  Historically undetected behavioral problems now require a bewildering mixture of medications, running from Ritalin to Prozac to Xanax.  Video games, for many, have replaced physical activity.  “Fully one-quarter of Americans between age 25 and 29 now live with a parent—compared to only 18 percent just over a decade ago” (#673), and only 23 percent of them are married.  Some scholars predict that fully one-fourth of the Millennials will never marry.  Whereas 50 years ago fully 90 percent of collegians attended religious services, some 35 percent of today’s Millennials have no religious ties.  They seem to be psychologically vulnerable, seeking “safe spaces” and sensitive to a variety of “micro-aggressions” that hurt their feelings.     

Representing—and rather responsible for—this cultural upheaval, Sasse thinks, is John Dewey, considered by many “America’s foremost philosopher.”  He is, without question, the father of the “progressive” educational agenda now reigning in the nation’s schools, and ultimately “he is responsible for allowing schools to undermine how Americans once turned children into adults” (#425).  Dewey did not envision the school as “an instrument supporting parents” by teaching youngsters reading, writing, and arithmetic.  Neither was it a place to master classical or modern languages, nor to understand history and philosophy.  Rather, the school was to be an agency of the state seeking to shape evolving youngsters into effective workers and citizens.  Though Dewey may not have intended his child-centered program to prolong adolescence, that’s what took place as the 20th century ended.

Since America’s schools contribute to the problem, Sasse entitles one chapter “More School Isn’t Enough.”  We need to take seriously Mark Twain’s quip:  “I never let school interfere with my education.”  As the son of teachers, Sasse treats them respectfully, but all too often our public schools, as Paul Goodman said, engage in “compulsory mis-education.”  We expend lots of money and accomplish little!  During the past 30 years federal spending on education has quintupled without securing any measurable effect—the U.S. now ranks 20th in science and 27th in math on international tests.  Kids spend more time in classrooms than ever before, “yet they leave high school for college or the workforce less prepared and less able to cope with the next stage of their lives” (#1172).  As A Nation at Risk warned in 1983, “‘a rising tide of mediocrity . . . threatens our very future as a Nation and people.’  The authors cried out:  ‘If an unfriendly foreign power had attempted to impose on America the mediocre educational performance that exists today, we might well have viewed it as an act of war’” (#1206).  To counter Dewey’s “progressive” educational philosophy, Sasse turns to Dorothy Sayers, who penned “The Lost Tools of Learning” in 1948.  Sasse considers it “the most important essay on education written in the last century,” and its author the “patron saint of the educational philosophy underpinning this book” (#1363).  Sayers sensed that the “‘artificial prolongation of intellectual childhood and adolescence into the years of physical maturity’” would lead to irresponsibility and societal decline.  Only by recovering the “lost tools of learning” embodied in the trivium/quadrivium-based classical curricula can this trend be reversed.  

Having assessed the problem, Senator Sasse turns to exploring solutions.  Begin, he says, by minimizing, if not eliminating, the “age segregation” which seems closely correlated with antisocial behavior.  Until modern times, multigenerational families lived and worked together, providing healthy routes to adulthood for the young.  Living in a segregated “youth” culture, today’s youngsters know very little about adults and rely unwisely on their peers.  Attending “youth” worship services in church, for example, they fail to learn how wiser and more experienced persons approach God.  Significantly, a study by the Fuller Youth Institute found:  “‘involvement in all-church [intergenerational] worship during high school is more consistently linked with mature faith in both high school and college than any other form of church participation’” (#1532).  A truly good education should bring together the elderly and the youthful.  As Cicero said, in On Old Age:  “‘People who say there are no useful activities for old age don’t know what they’re talking about.  They are like those who say a pilot does nothing useful for sailing a ship because others climb the masts, run along the gangways, and work the pumps while he sits quietly in the stern holding the rudder’” (#1577).  Especially important, in the wake of the Sexual Revolution, older folks need to provide healthy perspectives on marriage and family sorely lacking in today’s youth culture.  

Then we must help youngsters “embrace work pain” in order to grow up.  Senator Sasse is close enough to Nebraska’s soil to appreciate the “hardness” of farm work that made the state!  At the tender age of seven he was sent by his parents to “walk beans” in a local soybean field, learning first-hand the meaning of hard work, and he’s come to believe, with Theodore Roosevelt, that “Nothing in this world is worth having or worth doing unless it means effort, pain, difficulty.”  Consequently, he and his wife agreed to send one of their daughters to work on a Nebraska ranch for a month, where she soon discovered the necessity of manual labor.  But when he became a college president Sasse found 21st century students arriving at Midland University with no appreciation of the pain required to do good work.  As a 37-year-old man he was taken aback to find Midland’s students markedly different from those of his generation.  These “Millennials” shunned responsibilities and mainly relished “sleeping in, skipping class, and partying” (#2072).  To a degree they are rightly branded “needy, undisciplined, coddled, presumptuous” (#2078) and unable to meet adult expectations.  

To successfully transition to adulthood, our youngsters also need to “consume less.”  To do so runs counter to one of the most marked traits of the Millennials, who generally think buying things, getting more stuff, will make them happier.  But despite this nation’s great wealth, surveys show Americans to be less happy than they were half a century ago.  We’ve not learned, with Socrates, that “He who is not contented with what he has would not be contented with what he would like to have.”  Somehow we, and our children, need to learn that life is truly satisfying to the extent we produce goods rather than consume them.  Along with consuming less, we need to travel more—to learn, by personal discovery, how cultures differ and why the world works as it does.  As Mark Twain wrote:  “Broad, wholesome, charitable views of men and things cannot be acquired by vegetating in one little corner of the earth all one’s lifetime.”  

Then we need to “build a bookshelf” suitable for living well.  “Critical, engaged reading skills are not a luxury, but rather a necessity for responsible adults and responsible citizens” (#3511).  We absolutely need a “rebirth of reading” if we’re to flourish as a people.  But younger folks are reading less and less.  (I recently met a New York City commuter who’d been riding the train into the city for 30 years; in earlier years, most every passenger would be reading a newspaper or magazine, but now they all seem to be playing games on hand-held electronic devices!)  Intimidated by the likes of Jesse Jackson, who led Stanford University students chanting “Hey, hey, ho, ho, Western culture’s got to go!,” American colleges and universities have largely eliminated required courses in both foreign languages and the humanities, thus effectively eliminating expansive reading experiences and provoking scholarly protests in the 1980s from E. D. Hirsch (Cultural Literacy) and Allan Bloom (The Closing of the American Mind).  While various writers may propose somewhat different lists of books everyone should read, Senator Sasse cites 60 basic texts he finds truly worthy of consideration—ranging from Homer and Aristotle in the ancient world to Dietrich Bonhoeffer and Martin Luther King in the 20th century.  Each of us would, if we resolved to address the task, set forth a different set of essential books for ourselves and our families.  What would really help, however, is if we would simply do so!  If not, Sasse’s list is a helpful place to start!  

Obviously concerned with the state of the American union, Senator Sasse concludes his work by urging readers to join him in strengthening our culture by helping our children become responsible adults and thereby strengthening the republic.  This is a fine treatise, attuned to significant issues, deserving widespread reading, reflection, and discussion.  

* * * * * * * * * * * * * * * * * * * * * *

For many years Christian Smith, currently a professor at Notre Dame, has released scholarly sociological studies detailing evident characteristics in young Americans.  In Lost in Transition:  The Dark Side of Emerging Adulthood (New York:  Oxford University Press, c. 2011), he and a team of researchers provide information essential for understanding this nation’s 18-23-year-old “emerging adults.”  Though Smith commends much in this demographic group, Lost in Transition focuses only on the darker side of their portrait—their “mistakes and losses, trials and grief, confusions and misguided living” (p. 3).  The youngsters we encounter are the beneficiaries of both higher education and their parents’ willingness to care for them well into their 30s; they are delaying marriage, in part because of widely available contraceptive technologies; and they struggle to adjust to the realities of a global economy that makes employment increasingly problematic.  Still more, they have to a large degree embraced many of the postmodern views (ethical relativism and multiculturalism) espoused by thinkers such as Nietzsche and Derrida and popularized by MTV and simplistic high school teachers.  

Given their postmodern views, many emerging adults are morally adrift, embracing varieties of ethical relativism, one of their more “unsettling” traits.  One third of them claim not to know why anything is right or wrong!  Sixty percent of them take “a highly individualistic approach to morality.  They said that morality is a personal choice, entirely a matter of individual decision.  Moral rights and wrongs are essentially matters of individual opinion” (p. 21).  One interviewed woman thinks stealing is wrong—at least for her!  But if others steal it’s not really wrong—just a “dumb thing to do.”  With moral decisions reduced to personal opinions, no one should “judge” another person’s behavior and there is no need to work for any social consensus on moral standards.  One woman even refused to condemn mass-murdering terrorists!  In her opinion:  “‘It’s not wrong to them.  They’re doing the ultimate good.  They’re just like, they’re doing the thing that they think is the best thing they could possibly do and so they’re doing good’” (p. 28).  Just live according to your notion of “good” and keep quiet regarding anyone else’s!  Many emerging adults know nothing of moral philosophy, traditional religion, or anything other than their inner feelings.  Nearly three-fourths of them simply follow their “instincts,” apparently thinking moral knowledge is innate and intuitively knowable.  Lacking objective moral standards, they think “anything could be morally right, then, as long as someone believes it” (p. 29).  Shocking though it may seem, that’s “the professed outlook of nearly one-third of emerging adults today” (p. 29).  Though a third of emerging adults want to reject such extreme relativistic thinking, they lack the “moral-reasoning skills” to do so.  A substantial minority of the respondents did refer to God or the Bible, but they frequently lacked the conceptual skills to draw upon their religious traditions when explaining their ethical views.  

Their failure to reason well results, Smith suggests, from the “multicultural” indoctrination they receive in virtually all American schools.  If different cultures have different moral standards, they must all be accepted and respected in accord with their own perspectives.  To avoid being labeled a “racist,” emerging adults are ready to approve virtually anything done by groups different from theirs.  This squares with their commitment to what philosophers call “positive law” rather than the “natural law” espoused by classical and Christian thinkers.  Positive law is whatever a regime (whether hereditary or democratic) decrees.  Thus what may be right in Stalin’s USSR could be wrong in Roosevelt’s USA; what was wrong in the 19th century (e.g. abortion) becomes right when the Supreme Court decrees it in the 20th.  

From the authors’ perspective, “the widespread moral individualism and solid minority presence of moral relativism among emerging adults today tells us that the adult world that has socialized these youth for 18 to 23 years has done an awful job when it comes to moral education and formation” (p. 60).  “They are morally at sea in boats that leak water badly” (p. 60).  Especially important is the lack of those “intellectual virtues” Aristotle mandated in his Ethics.  The Postmodern contempt for any form of realism has seriously truncated the Millennials’ reasoning skills.  “Central to many of the confusions in emerging adult moral reasoning is the inability to distinguish between objectively real moral truths or facts and people’s human perceptions or understandings of those moral truths or facts.  The error of not distinguishing these two things is this:  the realities themselves are confused with, and therefore dependent upon, people’s cognitive grasp of them.  What actually exists is conflated into what is believed to exist” (p. 61).  Consequently, as a society we face a huge task.  As the noted philosopher Charles Taylor observed:  “‘We have to fight uphill to rediscover the obvious, to counteract the layers of suppression of modern moral consciousness’” (p. 69).  

In addition to moral relativism, today’s emerging adults are generally devout consumers, especially of the high-tech and entertainment items that have emerged during their lifetimes.  Consequently, “between one-half to two-thirds of emerging adults said that their well-being can be measured by what they own, that buying more things would make them happier, and that they get a lot of pleasure simply from shopping and buying things” (p. 71).  Intangible goods seem irrelevant to them, since less than ten percent “spoke of knowing God or making God proud, deepening their life of faith, or being more religious” (p. 105).  They want stuff, not spirit!  They give little thought to their acquisitiveness, considering it as normal and inescapable as breathing.  Though a few aspects of mass consumption may trouble them, they see no need to change anything, guided as they are by some of the “key assumptions of liberal individualism” (p. 80).  “All that society is, apparently, is a collection of autonomous individuals who are out to enjoy life.  The idea of people changing their own lifestyles or of mobilizing for collective social or economic change is nearly unimaginable” (p. 86).  

Illuminating this consumerist mentality, one respondent said:  “‘A good life for me would be to have more than enough money than I actually need, and live like a kid the rest of my life.  That would be my little heaven in today’s reality.  Yeah, it’s consuming a lot of stuff, but at the same time, if you can afford it, what is money anyway?  Money is meant to be spent, so why not?  You only live once, and if you have the chance to live in excess, why not?’” (p. 95).  This attitude explains their utilitarian approach to education.  “Not many emerging adults talk about the intrinsic enrichment of an education, of the personal broadening and deepening of one’s understanding and appreciation of life and the world that expansive learning affords.  Few emerging adults talk about the value of a broad education for shaping people into informed and responsible citizens in civic life, for producing members and leaders of society who can work together toward the common good” (p. 101).  They go to school for one reason:  to get a good (i.e. well-paying) job.  

When getting more stuff fails to make them happy, many emerging adults turn to the timeless illusions of wine, women, and song!  They routinely seek to get “high, stoned, buzzed, and drunk” (p. 110).  Rather than drinking in moderation, significant numbers of them routinely engage in binge drinking and smoking pot when partying—ways to escape their “boring” lives.  They also seek satisfaction in sexual engagements.  “What were once daring and rebellious acts of ‘love’ outside of committed relationships have now for many emerging adults become routine, almost pedestrian” (p. 148).  Yet, though trumpeted as “liberating” and “fun,” the hook-up culture has proved deeply disturbing—especially for women.  “We were struck by the number of very traumatic breakups that we heard described in interviews, since we assumed that emerging adults generally want to hold off on seriously committed relationships.  But the truth is that, while most emerging adults do want to hold off on marriage, many of them—again, particularly women, it appears—also long for the kind of intimacy, loyalty, and security that only committed relationships can deliver” (p. 154).  The authors conclude that “the sexual revolution’s promise of easy, safe, uncomplicated, fulfilling, casual sex” has dramatically failed (p. 176).  It’s failed simply because it cannot alter one of the most basic aspects of human nature—the need for fidelity and permanence in sexual relations.  Sadly enough:  “not far beneath the surface appearance of a happy, liberated emerging adult sexual adventure and pleasure lies a world of hurt, insecurity, confusion, inequality, shame and regret” (p. 195).  Finally, today’s emerging adults seem unusually disengaged from the “civic and political” world.  “Citizenship is not a word in their vocabularies” (p. 223).  Self-absorbed, uninformed and apathetic, they take little interest in community, church, or national affairs, volunteering little of their time and contributing none of their money.  

No one interested in solid data regarding today’s Millennials can ignore Lost in Transition.  As a nation, we’ve failed to provide the nurturing institutions and winsome mentors obviously needed by the younger generation.  And though Christian Smith admits to not knowing exactly what to do, he insists that speaking the truth is the first step in finding a cure to the “malady” crippling our emerging adults. 

294 “Strangers in a Strange Land”

  Church history records the incessant fluctuations—the triumphs and setbacks, the flourishing and decay—of the Body of Christ.  Throughout the past century, first in Europe and now in America, we have witnessed a cascade of alarming losses experienced by the Roman Catholic Church, the mainline Protestant denominations, and now many of the hitherto robust evangelical American churches.  A spate of recent treatises document and endeavor to explain what’s happened—primarily to the largest of these communions, the Catholic Church, but extending to others as well—and generally offer suggestions as to what’s to be done.   

Among the most notable is The Decline and Fall of the Catholic Church in America (Manchester, New Hampshire:  Sophia Institute Press, c. 2003) by David Carlin, a sociology and philosophy professor whose articles have appeared in publications as disparate as First Things and the New York Times.  As a committed Catholic, he’s dismayed by what’s happened but feels impelled to deal honestly with it.  Thus he argues:  “The root problem is that the Catholic Church in the United States has largely ceased to be Catholic,” turning itself into a culturally-acceptable and innocuous “generic Christianity or Christianity-in-general” (#34 in Kindle)—one of many declining “denominations.”  By discarding one “offensive” dogma after another, the Church now finds itself standing for nothing distinctively Catholic, softly proclaiming little more than “a gentle wish:  ‘Can’t we all just be nice to one another?’” (#44).  

Lacking a distinctive message, the Catholic Church has been dramatically imploding for 50 years.  Easily accessible data reveal the startling decline of weekly church attendance (from around 75 percent in 1965 to 25 percent today), parochial schools (4.5 million grade-school students in 1965, 1.9 million in 2002), monastic communities, and priestly vocations (slipping from 49,000 in 1965 to 4,700 in 2002).  Large numbers of professing Catholics no longer support traditional doctrines (e.g. the Trinity, Incarnation, Resurrection, Real Presence) or ethics (e.g. the condemnation of cohabitation, contraception, divorce, abortion, homosexuality).  Should this trajectory continue, Carlin fears, the Church will simply wither away, along with the mainline Protestant denominations she’s chosen to imitate.   

The Catholic collapse resulted, Carlin thinks, when three currents converged “to produce the ‘perfect storm’”—1) the implementation of “the spirit of Vatican II”; 2) the escape from the Catholic “ghetto”; and 3) the ‘60s’ cultural (i.e. sexual) revolution.  Before the Second Vatican Council, Catholics moved within a Church unchanged for many centuries, but suddenly, in accord with its “spirit,” much in their “immutable” faith seemed up for grabs.  The largely ethnic ghettos formed by turn-of-the-century Catholic immigrants, providing nurture and comfort, dissolved in the ‘60s as Catholics (graduating from elite universities and working in successful corporations) shed their Irish or Italian identities and defined themselves as fundamentally American.  Their freedom to thrive as Americans was signaled by John F. Kennedy’s election in 1960, a testament to their acceptance in this country as well as an opportunity to blend in with their fellow citizens.  Then the cultural revolution of the ‘60s and ‘70s—a sustained rebellion against authority of any sort—“blindsided” the Church. 

Probing these phenomena from a deeper philosophical perspective, Carlin identifies the Cultural Relativism that “seduced a generation” as one of the primary reasons for the Catholic collapse.  University students exposed to the anthropological works of Ruth Benedict and Margaret Mead came to believe cultures shape persons and different cultures prescribe and approve dramatically different, purely man-made moralities.  What’s right within one culture might be considered wrong in another—and there’s no transcultural standard whereby behaviors can be condemned.  Along with Cultural Relativism, Ethical Emotivism was widely embraced.  Influential philosophers declared everyone should simply follow his feelings, consulting his heart when making choices.  More than a bumper sticker, “If it feels good, do it” became a prescription for morality!  Finally, fearing to be identified as an illustration of The Authoritarian Personality (written by members of the Frankfurt School who had emigrated to the United States), many Catholics spurned conservative traditions and mouthed the “Question Authority!” mantra.  

Such developments firmly established Secularism as “the dominant American paradigm” by 1970.  Its “antinomian moral theory entailed a rejection of a long list of traditional religion-based moral rules” regarding sexual behavior and targeted the “family ideal as downright oppressive”—especially to women who needed to be freed from the shackles of patriarchy.  Celebrating tolerance as its singular ideal, Secularism powerfully impacted all segments of American society, which quickly cast loose from its religious anchors.  To Carlin, the Supreme Court’s 1962 Engel v. Vitale decision (banning prescribed prayers in the public schools) marks the triumph of a militantly secular movement in this nation.  Gaining momentum, secularists worked to dismantle the traditional Judeo-Christian moral consensus which had shaped the country.  Thus the prayer-ban Court decision was soon followed by judicial edicts legalizing contraception, abortion, and (just recently) same-sex marriages.  Morality to many Americans became mainly a matter of personal preferences—following what Carlin identifies as the “Personal Liberty Principle” (PLP).  

Conservative (Evangelical and Pentecostal Protestant, Traditional Catholic) Christians certainly rallied to oppose this anti-Christian secularist agenda, sparking the “culture war” that still divides America.  But numbers of Liberal Christians in both mainline Protestant and Roman Catholic circles easily embraced it and thereby precipitated the radical decline in both numbers and doctrinal integrity they have suffered.  Liberal churches, Carlin believes, will inevitably fade away.  And conservative churches, to survive, must awaken to the threats they face from today’s Secularism.  It’s an enemy which must be clearly identified and vigorously resisted.  Rather than adjust to the world, churches that survive must defy it, living in accord with Supernatural, rather than natural, standards.  

Carlin’s treatise is remarkably clear and cogent.  Though focused upon the Catholic Church, his analysis easily extends to all Christian churches.   And while basically pessimistic, his counsels and suggestions are worth heeding.  

* * * * * * * * * * * * * * * * * * * * * * * * *

One of the most widely discussed recent publications is Rod Dreher’s The Benedict Option:  A Strategy for Christians in a Post-Christian Nation (N.Y.:  Penguin Random House, c. 2017).  Over the decades, Dreher moved through Roman Catholicism and Evangelicalism to finally join the Russian Orthodox Church.  A respected journalist and unabashed believer, he writes sorrowfully, lamenting the catastrophic losses Christendom has recently experienced and believing that the “culture war that began with the Sexual Revolution in the 1960s has now ended in defeat for Christian conservatives” (p. 3).  Consequently, a nihilistic secularism prevails.  Not only have abortion, cohabitation, and same-sex marriage gained sanction, but today’s Millennials seem unusually uninterested in the Christian faith and have virtually no knowledge of its content.  Philip Rieff’s telling insight—“The death of a culture begins when its normative institutions fail to communicate ideals in ways that remain inwardly compelling”—seems sadly confirmed.  We face challenges comparable to those Christians such as St. Augustine faced as the Roman Empire collapsed during the fifth century.  

Whereas Augustine faced Vandals literally battering down the walls of his city as he died in 430 A.D., we confront home-grown, anti-Christian barbarians produced by important historical developments:  1) the 14th century’s emergence of philosophical nominalism; 2) the 16th century’s Protestant-driven fragmentation of Christendom; 3) the acidic impact of the 18th century’s Enlightenment; 4) the 19th century’s Industrial Revolution; and 5) the 20th century’s Sexual Revolution.  “Now we are on the far side of a Sexual Revolution that has been nothing short of catastrophic for Christianity.  It struck near the core of biblical teaching on sex and the human person and has demolished the fundamental Christian concept of society, of families, and of the nature of human beings.  There can be no peace between Christianity and the Sexual Revolution, because they are radically opposed.  As the Sexual Revolution advances, Christianity must retreat—and it has, faster than most people would have thought possible” (p. 202).  

More profoundly, the Faith that fomented Western Civilization has been sidelined by a secular humanism that makes Man, not God, its ultimate concern.  Thus Supreme Court Justice Anthony Kennedy, determined to forever establish abortion as a constitutionally guaranteed right, declared in Planned Parenthood v. Casey:  “At the heart of liberty is the right to define one’s own concept of existence, of meaning, of the universe, and of the mystery of human life.”  A nation committed to such a precept will have little patience with orthodox Christians, and Dreher says:  “The church, a community that authoritatively teaches and disciples its members, cannot withstand a revolution in which each member becomes, in effect, his own pope.  Churches . . . that are nothing more than a loosely bound assembly of individuals committed to finding their own ‘truth,’ are no longer the church in any meaningful sense, because there is no shared belief” (p. 44).  

The time has come, Dreher thinks, to radically separate from this sinful world and singularly seek to be the church, challenging rather than cooperating with mainstream social structures.  “Rather than wasting energy and resources fighting unwindable political battles, we should instead work on building communities, institutions, and networks of resistance that can outwit, outlast, and eventually overcome the occupation” (p. 12).  This involves embracing what he calls the “Benedict Option,” a proposal that grew out of his reading of Alasdair MacIntyre’s suggestion in After Virtue (his pivotal treatise on ethics); MacIntyre said cultural barbarians have again inundated Western Civilization and it’s time to await “a new—doubtless very different—St. Benedict,” leading us to build monastic preserves devoted to maintaining truly Christian faith and practice.  

To better understand St. Benedict (a sixth-century reformer), Dreher travelled to Norcia, Italy, and visited with a dozen (mainly young American) monks who recently reopened the ancient monastery, 200 years after it had been closed by Napoleon.  There he saw the ancient Benedictine Rule, blending prayer and manual labor, carefully followed.  Though the Rule was intended for monastics, its truth can easily be extended to any Christian community (family, school, church) committed to shaping its life in accord with love for God and man.  Politically, this means abandoning the effort to “take back America,” following the examples of dissidents within Communist countries (bearing witness to eternal truths—“living in truth,” as did Vaclav Havel), and fighting for religious liberty.  It also leads to homeschooling or establishing classical Christian schools for children, living prayerfully, and creating a robust Christian culture.  Above all, it means making family (a “domestic monastery”) and church the absolute foci of everything we do.      

For those interested in joining Dreher and embracing the Benedict Option, he provides examples and resources.  Clearly there are small communities around the world, such as Tipi Loschi in Italy and the Saint Constantine School in Houston, that are committed to living out their faith in radically countercultural ways.  And though the Benedict Option will never be embraced by large numbers of Christians, it remains a viable means whereby the Faith is preserved and transmitted to coming generations.  

* * * * * * * * * * * * * * * * * * * * *

Philadelphia’s Archbishop Charles J. Chaput has many concerns for the future of Christianity, but in Strangers in a Strange Land:  Living the Catholic Faith in a Post-Christian World (New York:  Henry Holt and Company, c. 2017), he balances those concerns with a robust confidence in the strength of both individual believers and the Church herself to overcome them.  He especially urges us to put things into perspective, noting how dramatically a truly global Christianity has emerged during the past century.  Thus:  “In Africa, 9 million converts enter the Catholic Church each year.  By 2030, if current trends hold, China may have the largest Christian population in the world” (p. 1).  In Europe and America the churches may be struggling, but around the world they may be enjoying their finest hour!  And despite much bad news, there’s much encouraging news in both Europe and America as believers creatively respond to our postmodern and increasingly post-Christian world.  

Rather than despair, Chaput urges us to remember that there have always been, as Augustine taught, two cities—the City of God and the city of man.  “We are born for the City of God.  The road home leads through the City of Man.  So we are strangers in a strange land, yes” (p. 246).  The Church has forever been attacked and has sometimes withered away in various geographic regions.  Yet for 2000 years the Church of Jesus Christ has endured—and surely she will do so until He returns.  Unlike Rod Dreher, who takes St. Benedict as his exemplar, Chaput celebrates St. Augustine, a bishop caring for his flock in the North African city of Hippo.  “For Augustine, the classic civic virtues named by Cicero—prudence, justice, fortitude, and temperance—can be renewed and elevated, to the benefit of all citizens, by the Christian virtues of faith, hope, and charity.  Therefore, political engagement is—or at least it can be—a worthy Christian task” (p. 14).  Despite its many flaws, this world is still a good world—what Augustine called a “smiling place.”  Despairing at the conditions of society can be as sinful as despairing of one’s own salvation.  “As Augustine said in his sermons, it’s no use complaining about the times, because we are the times.  How we live shapes them” (p. 17).   

Turning to our country, the United States, Chaput urges us to remember our godly heritage, honoring what’s good before railing against what’s bad.  Though never perfect, this nation has embraced the Christian religion and encouraged its “free exercise.”  Protestants and Catholics alike have supported America’s guiding principles, routinely giving thanks for the freedoms they enjoyed in this great land.  As late as 1955, a leading Jesuit, John Courtney Murray, could still assert that the American commonwealth “‘is not intelligible and cannot be made to work except by men who possess the public philosophy’ that the founders first brought to building it.”  Insofar as possible, it remains our task to recover the Founders’ vision and make sure the constitutional republic they established will survive.    

Yet times have changed, Chaput acknowledges, honestly documenting the many harmful cultural currents which have eroded much of the nation’s spiritual and ethical landscape.  As would be expected of a Catholic bishop, he devotes considerable attention to the baneful consequences of the Sexual Revolution and the dissolution of the family.  But he also looks more deeply, lamenting changes in an educational system which shows little interest in any search for ultimate Truth.  What the modern world really wants, as Josef Pieper said, “‘is flattery, and it does not matter how much of it is a lie’” (p. 226).  Indeed, we’re surrounded by what Scott Peck described as the People of the Lie.  But a healthy society requires the careful use of words, and Pieper noted that “‘the abuse of political power is fundamentally connected with the sophistic abuse of the word.’  And the degradation of man by man, and the systematic physical violence against human beings, have their beginnings ‘when the word loses its dignity,’ because ‘through the word is accomplished what no other means can accomplish, namely, communication based on reality’” (p. 122).  

Amidst all the dreary details portraying a post-Christian world, it’s easy to despair and retreat to well-fortified cultural castles.  We must honestly assess and respond to the challenges we face, knowing that the “Church of tomorrow won’t look like the Church of today, much less of memory” (p. 187).  It may very well be smaller and poorer, but it can become more disciplined and effective.  Neither despair nor isolation is an option for Chaput.  Christians necessarily have hope because Jesus arose from the grave!  “This small moment, unseen by any human eye, turned the world upside-down and changed history forever” (p. 146).  As a supernatural virtue, hope enables us to see everything in the light of eternity, never despairing of what God may in fact bring to pass.  Thus it’s our duty, John Henry Newman said, to set forth on “‘ventures for eternal life without the absolute certainty of success’” (p. 152).  Despair results from trusting ourselves.  Hope springs eternal because we trust God.  Trusting God means following His precepts, summed up so powerfully by Jesus in the Beatitudes, to which Chaput devotes many pages, and embracing the call to holiness as have saints throughout the centuries.  

For guidance in the 21st century Chaput finds fascinating clues in a second-century document, The Letter to Diognetus—a wonderful manual for Christians marching as pilgrims through a hostile land.  In that ancient letter we’re reminded that the Christian Faith is not a man-made construct.  Rather it was given us by the “Creator of all, the invisible God himself, who from heaven established the truth and the holy incomprehensible word among men, and fixed it firmly in their hearts.”  So Christians live normally, following the daily customs (food, drink, clothing, work) of their countrymen.  “They marry, like everyone else, and they beget children, but they do not cast off their offspring.  They share their board with each other, but not their marriage bed.”  

It is true, the Letter to Diognetus says, that Christians are “‘in the flesh,’ but they do not live ‘according to the flesh.’  They busy themselves on earth, but their citizenship is in heaven.”  In sum:  “What the soul is in the body, that Christians are in the world.  The soul is dispersed through all the members of the body, and Christians are scattered through all the cities of the world.  The soul dwells in the body, but does not belong to the body, and Christians dwell in the world, but do not belong to the world.  The soul, which is invisible, is kept under guard in the visible body; in the same way Christians are recognized when they are in the world, but their religion remains unseen.”  Yet even when unseen they animate and uplift the world.  Loving God and others, Christians are truly the leaven, the salt, and the light of the world.  Not always triumphant, they are called not to succeed but to remain faithful, bearing witness to the Gospel—that greatest of all truths, the Good News the world always needs to hear, especially the post-modern world that despairs of any truths at all, much less one overarching Truth.  Our task, as John Henry Newman said, is “‘not to turn the whole earth into a heaven, but to bring down a heaven upon earth’” (p. 218).  

To do that, knowing that “beauty is the battlefield where God and Satan contend for the hearts of men,” as Dostoyevsky said, part of our task is to preserve and cultivate beauty.  Wherever we encounter it, beauty points upward, symbolizing a transcendent Reality.  Discerning beauty in God’s creation ennobles us and should lead us to tend it wisely.  “Thus the spoiling of the earth with waste and the brutalizing of our human habitats with ugly art and buildings are not just clumsy mistakes of progress, but desecrations.”  Militant Muslims vandalize Buddhist and Roman monuments; iconoclastic Puritans eviscerated cathedrals; “transgressive” modern artists defile the “image of God” in their depictions of human beings.  Western Christian Culture was a God-centered culture, but “God has never been more cast out from the Western mind than he is today.  Additionally, we live in an age when almost every scientific advance seems to be matched by some new cruelty in our entertainment, cynicism in our politics, ignorance of the past, consumer greed, subtle genocides posing as rights like the cult of abortion, and a basic confusion about what—if anything distinctive at all—it means to be human” (p. 229). 

We must remind the world of what it means to be human!  The world needs no more “love, sweet love,” but Christians who live out the Love of God, bearing witness to the eternal verities of the City of God while living responsibly and robustly in the city of man—the best counsel available in our trying times.

293 What’s Happened to the University?

In What’s Happened to the University:  A Sociological Exploration of Its Infantilisation (New York:  Routledge, c. 2017), Frank Furedi appraises developments during the past 50 years in institutions of higher learning.  He began his academic life as a student in 1965 and is now Emeritus Professor of Sociology at the University of Kent in the UK.  In his student days, universities were open to new ideas and touted the virtues of free speech and challenging ideas.  Subsequently, however, they became “far less hospitable to the ideals of freedom, tolerance and debate than in the world outside the university gate.  Reflecting on how this reversal of roles has come about is the principal objective of this book” (p. vi).  Furedi’s distressed that students now seek to ban books that threaten their vulnerable psyches and protest speakers who might offend a variety of sexual and ethnic groups.  The free speech mantras of the ‘60s have turned into speech codes; the former devotees of free speech have frequently become, as powerful professors, enforcers of censorship.  “Safe spaces,” “trigger warnings,” “microaggressions” and “chill out rooms” (replete with play dough and “comfort” animals to relieve anxieties) indicate how many universities have in fact become infantilized.  Thus:  “Harvard Medical School and Yale Law School both have resident therapy dogs in their libraries” (p. 27).  

In some ways this culminates a project educators launched in the 1980s, making “self-esteem” their summum bonum.  Feelings, above all, must be massaged and potential hurts (e.g. poor grades or athletic defeats) eliminated.  Protecting children became a parental obligation easily transferred to the schools.  Parents now accompany and hover over children entering the university.  Administrators serve in loco parentis, not as they did a century ago, by regulating campus behavior, but by protecting students’ feelings, especially if they self-identify as members of certain “vulnerable groups.”  Wellness clinics, counseling services, ethnic and same-sex study centers all cater to psychological or emotional rather than intellectual needs.  Treating students as “biologically mature children, rather than young men and women, marks an important departure from the practices of the recent past” (p. 7).  As one might anticipate, “the more resources that universities have invested in the institutionalization of therapeutic practices, the more they have incited students to report symptoms of psychological distress” (p. 46).  

An incident at Yale University in 2015 illustrates this.  A university committee issued guidelines regarding appropriate Halloween costumes.  One faculty member, Erika Christakis, posted an email suggesting “that ‘if you don’t like a costume someone is wearing, look away, or tell them you are offended’ and concluded that ‘free speech and the ability to tolerate offense are the hallmarks of a free and open society’” (p. 17).  Students then denounced Christakis and her husband (a professor who defended her) for racial insensitivity.  Yale’s President, Peter Salovey, promptly met with tearful undergraduates and shared their felt distress.  Though not dismissed from their positions, Erika and Nicholas Christakis soon left Yale, casualties of the raging intolerance now widespread in academia.  Another incident further illustrates campus conditions.  “Caroline Heldman, a professor in Occidental College’s politics department, recalled that some of her students began experiencing PTSD-related episodes in her classes:  ‘there were a few instances where students would break down crying and I’d have to suspend the class for the day so someone could get immediate mental health care.’  Her antidote to this problem was to introduce a trigger warning on her course” (p. 42).  

  What really matters these days is one’s racial or sexual identity.   “Universities are singularly accommodating to the objectives of cultural crusaders” (p. 65).  To identify as an African-American or Native American or gay man or lesbian woman grants one status and authority quite apart from whatever one may think or say.  In addition, it’s especially important to stress the “victim” status of one’s group, even if the only obvious victims were ancestors who lived decades if not centuries ago.  Doing so enables one to invoke “social justice” and demand preferential treatment of some sort.  “Social justice” increasingly means protesting historic policies and personalities.  So students at the University of Missouri demanded a statue of Thomas Jefferson be removed from campus because he owned slaves.  Schools must be renamed if they memorialize anyone tainted with racist or sexist traits.   Selected cultures must be sacrosanct, making intolerable any “appropriation” of their dress, music, or food.  So many campus cafeterias dare not feature Mexican or Asian food lest students remonstrate!   And, importantly, only women can speak for women, only blacks for blacks, only Indians for Indians!  Authority comes purely from one’s ancestry, not from any scholarly expertise.  Consequently:  “The reverential and self-righteous tone of cultural crusaders echoes the voice of traditional religious moralists” (p. 64).  

To provide “safe space” for culture groups leads to self-segregated dormitories, and there are now dorms reserved for blacks and other minorities at elite schools such as UC Berkeley and MIT!  These “safe spaces” for students protect them from psychic and emotional hurts, shoring up their fragile self-esteem.  No debates are allowed, lest someone be judged wrong!  On many campuses, the notion that “criticism is violence” has gained traction, so teachers are warned to avoid even evaluating their students!  “It is an article of faith on campuses that speakers who espouse allegedly racist, misogynist or homophobic views should not be allowed to speak” (p. 103).  Challenging speakers, such as Heather Mac Donald and David Horowitz, are shouted down or prevented from appearing on campuses, for they might distress the feelings of some groups.  Advocates of safe spaces insist that “tolerance, affirmation and respect” therein provide a good environment for learning, though no empirical studies demonstrate as much.  In fact, from Socrates onward it’s been assumed that learning advances when one is forced to examine his beliefs and test his presuppositions with a commitment to embracing even uncomfortable truths.  

Conjoined with “safe spaces” are the efforts to censor free speech which have accelerated since 1980.  Certain words simply cannot be uttered!  Though profanity (as traditionally understood) flourishes in dormitories and classrooms, legions of taboo words are now forbidden.  Thus one may no longer refer to his “wife”—though “partner” is allowed.  In elite universities one may proudly be a “Native American” but never an “Indian.”   “Censorship, which was once perceived as an instrument of authoritarian attack on liberty, is today often represented as an exercise in sensitive behavior management” (p. 102).  Even threatening ideas must be policed, with professors issuing “trigger warnings” that exempt sensitive students from exposure to them!  Classic texts, ranging from Sophocles’ Oedipus the King to Mark Twain’s Huckleberry Finn to J. D. Salinger’s Catcher in the Rye are now suspect!  Feminists especially object to reading classic texts they brand misogynist.   

Thus “microaggressions,” even though unintentional and even unconscious, cannot be tolerated.  Lurking behind hurtful words there must be gravely immoral thoughts!  “You can’t think that” is now an acceptable policy on some campuses.  According to one influential theorist:  “‘Many racial microaggressions are so subtle that neither target nor perpetrator may entirely understand what is happening.’”  But this may well make them “more harmful to people of color than hate crimes or the overt and deliberate acts of White Supremacists” (p. 119).  Generally speaking, only the ones who suffer from these verbal assaults really understand their evil.  An offense is in the eyes of the beholder!  Students on many campuses are now demanding the right to anonymously inform on their professors’ microaggressions, and “Bias Response Teams” have been formed to enforce proper discipline on them.  To prevent hurt feelings, for example, UCLA now “prohibits people from asking Asian-Americans the question ‘Where are you from, or where were you born?’” lest they feel non-American.  Nor can you say “America is a land of opportunity” lest someone feel that such is not true for him (p. 109).  Correcting a student’s grammar may lead to complaints of “white privilege” and racial bias.  

The culmination of these developments, Furedi says, is “the quest for a new etiquette.”  Traditional ways, including chivalrous conduct, have generally dissolved.  To replace them we find what Jurgen Habermas described as “‘the juridification of everyday life’” (p. 125).  Yet exactly what kinds of behavior may now be condemned or approved and enacted into law remains undecided.  Administrative decrees, more psychological than philosophical in justification, seek to regulate activities but lack deeply moral (and especially religious) justification, so they quickly change and often defy common sense.  “The rhetoric of campus guidelines tends to avoid the language of right and wrong or good and evil, appealing instead to the therapeutic language of feelings” (p. 128).  To make sure feelings are protected, universities employ numbers of sensitivity experts and trainers and workshop “facilitators” to raise “awareness,” enforce speech codes, and punish microaggressions.  Millions of dollars are expended yearly to deal with “sexual harassment” complaints.  Students must be properly acculturated to the modern ethos, so Cambridge University now promotes “events ‘to celebrate Lesbian, Gay, Bisexual and Transgender (LGBT) History Month, Black and Ethnic Minority (BME) History Month, International Women’s Day (IWD), International Day of Persons with Disabilities (IPDP) and Holocaust Memorial Day (HMD)’” (p. 135).  No victim groups may be ignored lest someone’s self-esteem decay!  

None of us really knows where this will all end.  But Umberto Eco was certainly prescient when he said that “‘even though all visible trees of 1968 are gone, it profoundly changed the way all of us, at least in Europe, behave and relate to one another.’”  He added that “‘relations between bosses and workers, students and teachers, even children and parents, have opened up,’” and that therefore “‘they’ll never be the same again’” (p. 134).  If Furedi’s right, universities have wasted their patrimony and may never regain their rightful place in modern culture.  

* * * * * * * * * * * * * * * * * * *

During spring break in 2006, the captains of Duke University’s lacrosse team hired two strippers, including twenty-seven-year-old Crystal Mangum, to perform at an off-campus party.  Such events were not particularly notable, since 20 or so had occurred at the university that year.  But Mangum subsequently claimed to have been raped, provoking a sensational series of events carefully recorded by Stuart Taylor Jr. and KC Johnson in Until Proven Innocent:  Political Correctness and the Shameful Injustices of the Duke Lacrosse Rape Case (New York:  St. Martin’s Press, c. 2007).  Beyond the incident itself, which illustrates the sexualized tone of today’s universities, it’s a disturbing story laced with racial tensions, political aspirations, faculty prejudices, administrative cowardice, and media malpractice.  Though dense with details, the book fully engages the reader, alerting him to some of the troublesome aspects of 21st century culture.

Though best known for its basketball prowess, Duke’s lacrosse team was a perennial powerhouse, routinely competing for national championships.  Accordingly, the team featured many fine athletes, often graduates of elite prep schools where lacrosse was emphasized.  These athletes were, moreover, generally outstanding students, bound for the graduate and professional schools which train the doctors and lawyers their parents envisioned.  The stripper, on the other hand, had a checkered background, marked by a failed marriage, illegitimate children, prostitution, and mental problems.  But she was poor and black, born and reared in Durham.  And the lacrosse players, with one exception, were white, the scions of wealthy families.  The two strippers’ performance lasted all of four minutes, in part because Mangum was apparently too drunk to stand, much less dance.  She and her colleague, Kim Roberts, departed the house, though Mangum passed out on the back stoop.  She said nothing to Kim about being raped, nor did she say anything to a security guard who subsequently called 911 and tried to help her, nor to the police who responded.  Taken to the hospital, she was examined by doctors and nurses, who found no signs of rape.  Finally, however, a feminist nurse who considered herself an advocate for rape victims filed her own report, and it became the basis for later rape accusations.  Throughout the process Mangum’s story continually changed, so it was not clear exactly what had transpired with the lacrosse team. 

When police received information regarding the incident, a Durham detective well-known for his antipathy to Duke students took charge of the investigation.  He had no interest in interviewing Kim Roberts, who best knew what actually happened.  When another policeman interviewed her, six days after the alleged rape, Roberts declared the sexual assault story “a crock,” and her handwritten statement “contradicted Mangum on all important points” (p. 57).  The lead detective also refused to consider any data regarding Mangum’s career as a prostitute, though in time one of her associates testified to “taking her to jobs in three hotels with three different men” on the nights preceding the lacrosse party.  The detective also failed to interview the doctor who had actually performed the pelvic exam when Mangum was admitted to the hospital.  When shown pictures of all the lacrosse players, Mangum could not identify any of them with certainty, and one of those she fingered was nowhere near the party that night.  

Taking an even greater interest in her case was District Attorney Michael Nifong, who envisioned it as leverage for his political career.  He needed the support of the black community in Durham as well as the liberal professors at Duke, so he quickly discerned how supporting Crystal Mangum’s rape accusations would enable him to win the upcoming election.  In a series of inflammatory press releases, Nifong branded the Duke athletes “rapists” fully deserving the vigorous prosecution he would pursue.  Many of his statements were demonstrably false, but newspapers and media outlets across the nation soon picked up on the case, almost unanimously assuming the guilt of the accused players.  Virtually everywhere there was a simple objective:  “Lynch the privileged white boys.  And due process be damned” (p. 121).  Writers for the New York Times and TV personalities such as Nancy Grace and Joe Scarborough cheered the mob of outraged folks determined to punish the “rapists.”  Few journalists cared to find the truth!  (Amazingly, the most balanced publication dealing with the case was Duke’s student newspaper!)  Inevitably, Jesse Jackson showed up, trumpeting his support for an abused black woman, and the local NAACP applauded Nifong’s every move!  

So too the Duke administrators (most especially President Richard Brodhead) and professors (especially from the African-American and women’s studies programs) began to loudly denounce the lacrosse players, apparently committed to the notion that any woman claiming to have been raped must be telling the truth.  Here was an illustration of the “morality tale” of “virtuous black women brutalized by white men” (p. 66).  The Duke faculty launched hysterical attacks on the lacrosse team.  (Many professors simply resented the fact that the public thought of Duke in terms of its athletes, while they wanted the institution to bask in an aura of academic excellence.)  Many had a deep commitment to the feminism on display in the yearly “Take Back the Night” rallies.  And virtually all of them wanted to publicly bear witness to their racial sensitivities and liberal proclivities.  To some teachers, the players should be punished for rape “‘whether it happened or not’” since it would help compensate “‘for things that happened in the past’” (p. 170).  Even as evidence proving the athletes’ innocence steadily mounted, Duke’s professors “served as enthusiastic cheerleaders for Nifong,” and “for many months not one of the more than five hundred members of the Duke arts and sciences faculty—the professors who teach Duke undergraduates—publicly criticized the district attorney or defended the lacrosse players’ rights to fair treatment” (p. 105).  The more radical the professor (e.g. Houston A. Baker, a past president of the Modern Language Association) the more the mainstream media loved to interview him!  Long before any trial, these professors simply assumed the men were guilty—and, of course, an illustration of how America is a racist, sexist society!  Only one lonely professor, a chemist, dared stand up and defend his friend, the lacrosse team’s coach!  In the judgment of Thomas Sowell:  “‘The haste and vehemence with which scores of Duke professors publicly took sides against the students in this case is but one sign of the depth of moral dry rot in even our prestigious institutions’” (p. 117).  

Fortunately for the lacrosse athletes, several had parents with the means and connections to assemble a strong legal defense team.  These lawyers saw early on the flaws in Nifong’s accusations and found solid evidence (especially DNA) upholding the innocence of their clients.  All of the players cooperated with the police, submitting to lie detector exams and volunteering the blood samples requested for DNA tests, which proved to be the “biggest defense bombshell,” since the State Bureau of Investigation reported that “‘no DNA material from any young man tested was present on the body of this complaining witness’” (p. 162).  Then the athletes’ attorneys demonstrated “the staggeringly conclusive evidence of innocence, and of probable Nifong misconduct” (p. 302).  Violating an operating rule for prosecutors, Nifong had refused to even look at evidence collected by defense attorneys, something “unheard of” in legal circles, pushing his case through a grand jury and toward trial.  But in time the evidence would, in fact, become public and the athletes were vindicated.  

Cracks in the prosecution’s case began with blogs such as Liestoppers dissecting the mainstream media’s presentations.  Articles in the New York Times were shown to be filled with egregious errors, deliberately omitting crucial evidence countering Nifong’s claims.  Then a few TV programs—most notably Sean Hannity’s—questioned the assumed guilt of the lacrosse athletes.  Students on the Duke campus—many resenting the malicious role their professors played in the process—increasingly sided with the team and believed that Crystal Mangum had lied.  Ultimately CBS’s 60 Minutes, after a lengthy investigation, declared “the rape claim was a fraud and Nifong was guilty of outrageous misconduct” (p. 282).  When Nifong faced the defense attorneys in a preliminary hearing, his case quickly unraveled.  It became clear that he and one of his expert witnesses had conspired to hide evidence, and he dropped the rape charge.  He “had engaged in grossly unethical—perhaps criminal—misconduct, and the case against the lacrosse players was a travesty” (p. 317).  He lost face, soon resigned his office, and would finally be disbarred.  His effort to punish the innocent “may well have been the most egregious abuse of prosecutorial power ever to unfold in plain view” (p. 356).  In sum, Nifong was guilty of “demonizing innocent suspects in the media as rapists, racists, and hooligans; whipping up racial hatred against them to win an election; rigging the lineup to implicate them in a crime that never occurred; lying to the public, to the defense, to the court, and the State Bar; hiding DNA test results that conclusively proved innocence; seeking (unsuccessfully) to bully and threaten defense lawyers into letting their clients be railroaded” (p. 356). 

But even more shameful than the district attorney were the Duke faculty and administration!  Even when the evidence proved the lacrosse athletes innocent, activist professors remained belligerent and unrepentant!  Eighty-seven professors published a letter repudiating any efforts to make them retract or apologize for their slanders.  Instead, they attacked the bloggers, students, and journalists who defended the athletes.  So too the NAACP, The New York Times, and other powerful organizations refused to retract their slanders or seek to do justice to the maligned men.  Even if they did no wrong, it seems, they represent what’s wrong in this nation’s racist/sexist/classist society!  Anyone concerned with justice in America needs to know what happened at Duke—and is still happening in other sectors of the USA.

292 “Lo, the Poor Indian”

Reflecting a pervasive Enlightenment perspective—and presaging Jean-Jacques Rousseau’s Romantic admiration for America’s “Noble Savage”—Alexander Pope, in his Essay on Man, declaimed: 

Lo, the poor Indian!  whose untutored mind
Sees God in clouds, or hears him in the wind;
His soul proud Science never taught to stray
Far as the solar walk or milky way;
Yet simple nature to his hope has given,
Behind the cloud-topped hill, an humbler heav’n.   

Neither Pope nor Rousseau knew much about the New World’s indigenous inhabitants, but that didn’t dissuade them from making authoritative pronouncements, and similar ignorance has infected much that’s been written or portrayed about Indians ever since.  Thus today many folks imagine they understand them as a result of watching a TV special on the Dakota Access Pipeline or listening to alleged “Native American” spokesmen leading protests in various locales.    

Illustrating this ignorance is the widespread circulation (especially in environmentalist circles) of an alleged statement made by Chief Seattle in 1851.  The quotation declared:  “Every part of this earth is sacred to my people.  Every shining pine needle, every sandy shore, every mist in the dark woods, every clearing and humming insect is holy in the memory and experience of my people.”  Seattle’s words, duplicated in many books and displayed on schoolroom posters, effectively persuaded many Americans that the First Americans were the First Environmentalists, carefully husbanding the natural world, walking softly on Mother Earth.  In fact, the speech was written in 1972 by a Texas scriptwriter working on a film produced by the Southern Baptist Radio and Television Commission!  It fit the mood of the moment, whether or not it had any historical veracity, and became part of the nation’s folklore!  Certainly it helped establish one of the many misleading stereotypes that in the long run serve to harm Indian people.  

Endeavoring to better root us in reality, Naomi Schaefer Riley recently toured the United States and Canada gathering material for her insightful The New Trail of Tears:  How Washington Is Destroying American Indians (New York:  Encounter Books, c. 2016).  To understand anything we need first to describe it and then think clearly to explain it.  So Riley proffers careful descriptions accompanied by reasoned analysis.  Her descriptions remind us of similar accounts through the centuries—tribal peoples beset by a multitude of problems (including the highest poverty rate and lowest life expectancy of any racial group, shocking suicide numbers, alcohol and drug abuse, rape, sexual abuse, and widespread gang activity).  Her analysis, however, invites us to think hard about the glaring failures of the latest in a long list of “saviors”—the federal government.  Rapacious frontiersmen and ruthless armies harmed Indians in the past, but today the primary culprit responsible for their predicament is the government, the ostensibly benevolent Welfare State.  

“As you’ll see in this book,” Riley says, “the problems American Indians face today—lack of economic opportunity, lack of education, and lack of equal protection under law—and the solutions to these problems require a different approach from the misguided paternalism of the past 150 years.  It’s not the history of forced assimilation, war, and murder that have left American Indians in a deplorable state; it’s the federal government’s policies today” (#149).  More troubling:  Indians provide us a “microcosm of everything that has gone wrong with liberalism,” caused by “decades of politicians and bureaucrats showering a victimized people with money and sensitivity instead of what they truly need—the autonomy, the education, and the legal protections to improve their own situations” (#149).  

Consider this:  the federal bureaucracies charged with responsibility for the nation’s one million reservation Indians, the Bureau of Indian Affairs (BIA) and the Bureau of Indian Education (BIE), have some 9,000 employees—roughly one bureaucrat for every 100 Indians.  The feds’ funding “for education, economic development, tribal courts, road maintenance, agriculture, and social services” was almost $3 billion in 2015.  Consequently:  “Tribal leaders only demand more money from Washington to fix their problems.  And the senators and congressmen who represent them are only too glad to oblige in return for the votes of the populations” (#2910).  Yet extraordinary unemployment rates, coupled with tribal ownership of land and reliable welfare payments, leave virtually all reservations poverty-stricken.  Lacking private property rights, reservation Indians (whose lands are tribally owned but held “in trust” by the federal government) almost inevitably suffer what economists call “the tragedy of the commons.”  Theoretically, everyone owns the land, but no one owns any actual parcel or takes responsibility for any of it.  But everyone gets annuities (and in many areas, per capita dividends from tribal casinos) that provide subsistence without needing to work—and therein lies much that’s wrong with the reservations.  

Still more:  endless federal regulations dictate how reservation lands may be used—and make it virtually impossible to use them productively!  Entrepreneurs and venturesome economic projects inevitably run afoul of a nanny state determined to ensure that Indians will always be the “Indians” suitable to bureaucrats who often operate in accord with sentimental myths rather than observable realities.  Thus, for example, Michelle Obama could tell a gathering of Indian youngsters that “‘on issues like conservation and climate change, we are finally beginning to embrace the wisdom of your ancestors’” (#567).  Had she simply driven through most any reservation she could have seen how little ancestral wisdom regarding the “sacred land” may be found in Indian country!  Here the results of the Obamas’ antipathy to developing natural resources are evident.  Reservations sit on enormous coal, uranium, oil, and gas reserves, but “‘86% of Indian lands with energy or mineral potential remain undeveloped because of Federal control of reservations that keeps Indians from fully capitalizing on their natural resources if they desire’” (#450).  Even a superficial assessment of Indian affairs should persuade one that the money expended on behalf of the Indians hardly helps (and probably harms) them.

The greatest natural resource, of course, is people, and children must be well educated in order to develop their potential.  The Bureau of Indian Education (BIE), a notably inefficient bureaucracy, expends about $850 million providing for its “42,000 students (most children on reservations don’t attend BIE schools), which amounts to about $20,000 per pupil, compared with a national average of $12,400” (#87).  Only half of its high school students graduate, and those who do frequently have less-than-adequate skills.  Providing details, Riley sets forth a sobering assessment of the schools under federal jurisdiction.  In one school on the Crow reservation in Montana, for example, $27,304 per pupil was expended—compared with $10,625 in non-Indian state schools.  Yet the graduation rate was 39%!  There, and everywhere you look, Indian schools are “among the worst in the nation” (#1562).  In stark contrast, the Saint Labre Catholic schools in southeastern Montana serve 800 Crow and Cheyenne children.  These Catholic schools take no federal monies and do nicely, enjoying a dropout rate of only one percent!  And large numbers of their graduates go on to study in college.  (There are some bright lights in Indian country, but they’re rare.)  

Aware of the educational failures of reservation schools, distraught parents and students usually blame the lack of discipline and qualified teachers, as well as nepotism-infected tribal administrations, though they also point to the breakdown of the family as the primary culprit.  Some youngsters who graduate from high school then attend one of the 32 federally-funded tribal colleges, where they often study tribal traditions or arts and crafts.  Rarely do they graduate and attend a university, nor do they learn much they can use apart from the reservation.  Sadly:  “Every school on the [Pine Ridge, Sioux] reservation is scrambling for teachers.  But the tribal school—Oglala Lakota College—doesn’t even offer a degree in secondary education” (#2258).  Rather than training youngsters to effectively help their people, most colleges cater to personal proclivities, often traditional arts and crafts.  Thus “‘The Tribal Institute of American Indian Arts in New Mexico,’ according to the Atlantic, ‘spends $504,000 for every degree it confers . . . more than Harvard or MIT’” (#1578).  

The Indians doing the best these days are the ones whose ancestors lost their lands in the 19th century—or individuals who leave the reservation and find their way in the broader culture.  Descendants of the Five Civilized Tribes in eastern Oklahoma are certainly prospering nicely when compared with the reservation-rooted Sioux and Navajo.  So too the Lumbees (in the Lumberton, North Carolina, area), lacking language, chiefs, and tribal land, blended into the area’s population.  Fully assimilated, they supported a decent school system and also embraced the “passionate Baptist faith that, to a person, they today profess” (#1292).  In one Lumbee’s opinion, their success resulted from the tribe’s independence from the federal government:  “Indians had to pay for everything themselves here.  They had pride in the people who built it” (#1300).  They could also own and develop, buy and sell land.  

The Lumbees weren’t wealthy, but they were doing okay.  Then politicians in Washington, D.C., decided to help them!  Overwhelmed with liberal guilt following WWII, the feds decided to allow landless tribes to “reconstitute” themselves.  In 1975 President Ford signed the Indian Self-Determination and Education Assistance Act, opening the coffers for grants for law enforcement, education, and environmental programs.  Increasingly, Indians could qualify for generous welfare programs.  As a result, increasing numbers of younger Lumbees ceased working and now waste their days doing drugs.  Today’s schoolchildren are notably less well-educated than their grandparents!  Whereas churches used to help the needy, the government now hands out money and enables them to idly self-destruct.  An older Lumbee, Ronald Hammonds, a successful cattle farmer, laments:  “‘Women are encouraged to have babies.  It’s economic development.  You get a check.  We’ve got more illegitimate kids than ever, and it’s getting worse.’  He calls the local housing project a ‘breeding ground’ and says that the children are mostly being raised by their grandmothers.  ‘They’ve got no responsibility.  They’re looking for the government as the solution to all our problems’” (#1419).  The only answer to the many problems the Lumbees now face, Hammonds thinks, is to get the government out of their lives.  

And that’s basically the solution Riley recommends:  eliminate the dependency engendered by the reservations!  That would, of course, mean much anguish in Indian communities—and in the non-Indian liberals who empathize with them.  But it may be the only “tough love” way to free the most impoverished peoples in America.  Indicating how little things have actually changed in 150 years, read carefully the final paragraph in Our Wild Indians:  Thirty-three Years’ Personal Experience Among the Red Men of the Great West (Hartford, CT:  A. D. Worthington and Company, c. 1883).  Colonel Richard Irving Dodge, who knew the Indians as well as any 19th century writer and described them with a relentless honesty, harbored no romantic or humanitarian illusions regarding either them or their cultures.  “The only hope for the Indian,” he wrote, “is in the interest and compassion of a few men, who, like the handful of ‘Abolitionists’ of thirty years ago, have pluck and strength to fight, against any odds, the apparently ever losing battle.  These in turn must rely upon the great, brave, honest human heart of the American people.  To that I and they must appeal:  to the press; to the pulpit; to every voter in the land; to every lover of humanity.  Arouse to this grand work.  No slave now treads the soil of this noble land.  Force your representatives to release the Indian from an official bondage more remorseless, more hideous than slavery itself.  Deliver him from his pretended friends and lift him into fellowship with the citizens of our loved and glorious country” (#9377).  

* * * * * * * * * * * * * * * * * * * * * * * 

Inasmuch as the main focus of my graduate study at the University of Oklahoma was Western American History—writing my master’s thesis and doctoral dissertation on Cherokee history—for many years I taught a class entitled “The First Americans.”  One of the books I either required or recommended was Dee Brown’s Bury My Heart at Wounded Knee:  An Indian History of the American West, though I warned students to take it as more of a pro-Indian polemic than balanced history.  Despite its bias, it presented the post-Civil War Indian wars in a very readable way and alerted readers to the mistreatment of tribal peoples.  Were I still teaching today, however, I’d have a better source that covers the same terrain—and basically comes to the same conclusions—with more effort to understand both white and Indian perspectives, to see both good and evil in each group of people.  

It’s Peter Cozzens’ The Earth Is Weeping:  The Epic Story of the Indian Wars for the American West (New York:  Alfred A. Knopf, c. 2016).  Fortunately, says Cozzens, there are now primary sources that were unavailable to Dee Brown, and he can “tell the story equally through the words of Indian and white participants and, through a deeper understanding of all parties to the conflict, better address the many myths, misconceptions, and falsehoods surrounding the Indian Wars” (#351).  He provides in-depth descriptions and interesting details regarding Indian warriors’ training and skills as well as those of the U.S. Army recruits who opposed them.  Still more:  he effectively shows how Indians themselves (through intra-tribal rivalries and conflicts as well as inter-tribal animosities) contributed to their defeat.  In many ways the book simply fills in the details contained in a succinct statement made by Lieutenant Colonel George Crook, who fought many a battle with them:  “I do not wonder, and you will not either, that when Indians see their wives and children starving and their last source of supplies cut off, they go to war.  And then we are sent out there to kill them.  It is an outrage.  All tribes tell the same story.  They are surrounded on all sides, the game is destroyed or driven away, they are left to starve, and there remains but one thing for them to do—fight while they can.  Our treatment of the Indian is an outrage” (#318).  

After setting the stage with a discussion of United States developments and policies, as well as Indians’ tribal traits and migrations onto the Great Plains, Cozzens turns to Red Cloud’s War in 1866.  Determined to halt the movement of miners into Montana’s gold camps, Red Cloud (leading Oglala and Miniconjou Sioux warriors) prevailed, defeating an army detachment in the Fetterman “massacre” and subsequently signing the second Treaty of Fort Laramie in 1868, closing the Bozeman Trail and securing for the Lakotas the “Great Sioux Reservation” (today’s South Dakota west of the Missouri River), to be maintained for their “absolute and undisturbed use and occupation.”  Red Cloud’s “victory” was a rare Indian triumph—and it hardly arrested the westward movement of American pioneers.   

The Lakota and Northern Cheyenne further enjoyed two brief victories in 1876—the battles at the Rosebud and the Little Big Horn in southeastern Montana.  At the Rosebud, General Crook was repulsed by warriors following Crazy Horse.  Days later, Lieutenant Colonel George Armstrong Custer led his Seventh Cavalry into the Little Bighorn region, where he encountered one of the largest encampments of Sioux and Cheyenne (7,000 Indians; 1,800 warriors) ever assembled.  He’d bragged that his Seventh Cavalry could “whip all the Indians in the Northwest,” but at the Little Big Horn he proved himself a poor prophet.  Following Crazy Horse, Sitting Bull, and Gall, the Indian warriors slew 258 troopers, losing only 31 of their own.  Following the battle, Sitting Bull said:  “I feel sorry that too many were killed on each side.  But when Indians must fight, they must” (#5249).  Custer’s last stand, however, was the northern tribes’ last stand, for the army thereafter sent column after column (frequently in winter, burning their lodges and food supplies) after the hostiles and effectively broke their will within a few years.  With the surrender of Crazy Horse, the last renegade Lakotas came to terms with the United States and accepted their lot as reservation Indians.  After taking refuge in Canada for a few years, Sitting Bull too surrendered early in 1881.  “‘Nothing but nakedness and starvation has driven this man to submission,’ concluded a sympathetic army officer, ‘and not on his own account but for the sake of his children, of whom he is very fond’” (#6070).  

On the Southern Plains, at the same time, the Cheyennes and Arapahoes were defeated (in part by Custer’s massacre of Black Kettle’s peaceful village of Southern Cheyennes on the Washita River) and confined to a reservation in the western part of Indian Territory, to be joined soon thereafter by the Kiowa and Comanche (finally defeated in the Red River Wars in the 1870s).  Adding to the relentless might of the military, the Indians further faced the loss of the buffalo—the enormous herds that supplied their every need in 1865 were simply gone by 1875.  Buffalo hunters, killing the animals for their hides, nearly wiped out the species!  Hide hunters, Phil Sheridan said, did “more to settle the Indian Problem in two years than the army had done in thirty.  For the sake of lasting peace, let them kill and skin until the buffalo are exterminated” (#3100).  And without the buffalo, the Indians either starved or begged for rations at army forts.  

In the Far West, the Modocs were defeated in northern California.  The Nez Perces, led by Chief Joseph, were forced from their homeland in Oregon’s Wallowa Valley and conducted an epochal struggle, coursing through 1,700 miles of Idaho and Montana before surrendering near the Canadian border.  The Utes of the Rocky Mountains were defeated and relocated to reservations in Utah and southern Colorado.  In the Southwest, the Apaches under Cochise and Victorio waged some resourceful guerrilla wars, but with the defeat of Geronimo’s small band in 1886 that region was pacified.  At the end, some 5,000 troops were involved in corralling eighteen warriors led by Geronimo and Naiche!  Though there is a certain aura around Geronimo, those who knew him best generally disliked him.  One Apache leader said:  “I have known Geronimo all my life up to his death and have never known anything good about him.”  The daughter of Naiche agreed:  “Geronimo was not a great man at all.  I never heard any good of him” (#7348).  Significantly, the troops who most effectively hunted down the Apache bands were other Apaches, equally skilled in tracking and surviving in harsh environs.  General George Crook, one of the officers engaged in Indian wars for three decades, said:  “In warfare with the Indians it has been my policy—and the only effective one—to use them against each other” (#7566).  

The post-Civil War conflicts in the American West were consummated in a massacre at Wounded Knee, South Dakota, in December 1890.  Hundreds of despairing Lakotas had been captivated by a new religious movement, the Ghost Dance.  A Paiute medicine man in Nevada, Wovoka, meshed native and Christian traditions and urged followers to dance incessantly to usher in a wonderful world devoid of white men and their oppression.  Though most Indians disdained the movement, fervent practitioners worried officials in the Indian Bureau, whose agents insisted the army suppress it.  In a convoluted chapter of the ferment, Sitting Bull was arrested and killed by Indian policemen.  Then a 65-year-old Miniconjou chief named Big Foot decided to lead his band to safety on the Oglalas’ Pine Ridge Reservation.  Confusion and misunderstanding led to a violent confrontation along Wounded Knee Creek, and at least 150 Sioux (mainly women, children, and old men) died.  

Thirty years of Indian wars had ended.  And Peter Cozzens provides the most readable, accurate account of them I’ve read.  

# # # 

291 The War on Humans

   When, during the last presidential debate, Hillary Clinton defended all forms of abortion (the deliberate taking of an unborn, innocent human being’s life at any time in a woman’s pregnancy), she graphically illustrated her party’s position in this nation’s decades-long cultural war.  Though the defenders of life have won some important battles, pro-abortion forces still occupy commanding positions on the battlefield.  That truth is powerfully illustrated in Ann McElhinney and Phelim McAleer’s investigative treatise—Gosnell:  The Untold Story of America’s Most Prolific Serial Killer (Washington:  Regnery Publishing, c. 2017).  Four things powerfully struck me while reading the book:  1) the sheer barbarity of the late-term abortions performed by Kermit Gosnell, M.D., in his Philadelphia Women’s Medical Society clinic, wherein an estimated “40 percent of the babies aborted . . . were over the gestational age limit for legal abortion in Pennsylvania” (#2318); 2) the utter indifference and dereliction of state officials required to inspect and regulate abortion clinics; 3) the lock-step commitment of the nation’s media to ignore, obscure, or at least minimize Gosnell’s crimes; and 4) the irony of some abortions (late-term) qualifying as murder whereas others (the million or so done yearly in Planned Parenthood facilities) have absolute legal protection.  As Kirsten Powers said, “whether Gosnell was killing the infants one second after they left the womb instead of partially inside or completely inside the womb—as in routine late-term abortion—is merely a matter of geography.  That one is murder and the other is a legal procedure is morally irreconcilable” (#157).  

McElhinney and McAleer are Irish journalists who were drawn to the story by its intrinsic merit rather than because of any pro-life convictions.  Indeed, Ann McElhinney had “never trusted or liked pro-life activists” (#127).  Then, as she began covering Gosnell’s trial, she realized that “pro-abortion advocates tend to avoid any actual talk of how an abortion is done and what exactly it is that is being aborted.”  But now she knows!  And she also now knows that “what is aborted is a person, with little hands and a face that from the earliest times has expression.  The humanity in all the pictures is unmistakable, the pictures of the babies that were shown as evidence in the Gosnell trial—first, second, and third trimester babies, in all their innocence and perfection” (#140).  While researching and writing she “wept at my computer.  I have said the Our Father sitting at my desk.  I am no holy roller—I hadn’t prayed in years—but at times” she could do nothing else.  Even more profoundly, she sensed “the presence of evil,” the sheer lack of conscience, pervading the pro-abortion establishment.

The Gosnell case began with a drug investigation launched by a Philadelphia undercover narcotics investigator, Jim Wood, who was getting drug peddlers to reveal the sources of illegal prescriptions for drugs like OxyContin.  A tangled web of informants led Wood to Dr. Kermit Gosnell, who turned out to be “one of the biggest suppliers in the entire state of Pennsylvania,” operating out of his Women’s Medical Society clinic (#302).  Therein investigators discovered far more than a drug emporium!  They found “a filthy, flea-infested, excrement-covered” abortion clinic almost impossible to describe.  Urine and blood discolored the floors; trash, cat excrement, and hair littered the facility.  They found “semi-conscious women moaning in the waiting room.  The clinic’s two surgical procedure rooms were filthy and unsanitary,” featuring rusty equipment and non-sterile instruments (#452).  Unqualified, unlicensed staff members had administered sedatives and cared for the patients—one worker had an eighth-grade education and a phlebotomy certificate!  Another liked being paid in cash (and given free Xanax, OxyContin, Percocet, etc.) because it enabled her to continue drawing fraudulent disability benefits from the Veterans Administration.  “The basement was filled with bags of fetal remains that reached the ceiling” (#527).  In a cupboard there were jars filled with little baby feet—apparently something of a fetish for Gosnell.  Dead babies were found in various containers, stored in refrigerators and freezers.  “Investigators found the remains of forty-seven babies in all” (#632).  It was truly a house of horrors!

Evidence collected from the clinic and Gosnell’s house, as well as testimony from his staff and patients, was presented to a grand jury, which spent a year combing through it.  “The final report, published on January 14, 2011, is a complete page-turner, a chronicle of how America’s biggest serial killer got away with murder for more than thirty years.  In its gruesome 261 pages, the grand jury named and shamed—and in some cases recommended charging—the doctor, his wife, and most of his staff, along with officials in numerous state government agencies, all the way up to the governor” (#798).  Indeed, it became clear that multiple complaints had been filed against the clinic for 30 years and “the incompetents in Harrisburg, Pennsylvania’s state capital, knew or should have known that, even by their own lax rules, Gosnell should not have been carrying out abortions—but they didn’t care” (#1201).  Their dereliction was facilitated by the election of a “pro-abortion Catholic Republican,” Tom Ridge, who took office in 1995 and whose policies proved “catastrophic for the many women and the hundreds of live babies who were injured and killed in Gosnell’s clinic” (#1473).  To one Philadelphia-area reporter, Ridge was “Gosnell’s chief enabler” (#1480).  

Given the laws in Pennsylvania, to be charged with murder it was necessary to prove that some of the babies had been born alive and subsequently killed by Gosnell.  The case before the grand jury also had to be made before an openly pro-abortion judge who “was keen to draw attention away from the abortion establishment closing ranks, protecting one of their own and protecting abortion, regardless of the harm done on the way” (#1685).  Even the “partial-birth” procedure, whereby the baby’s head remained in the birth canal while the torso and legs were outside the mother, could not be labeled “murder” since the law allowed it.  Gosnell’s staff, however, testified to seeing many babies born alive and then killed (snipping their necks with scissors) by the doctor.  Importantly for the trial, they had also taken some pictures of the slain babies that would provide vital evidence for the prosecution.  Ultimately he would be “charged with seven counts of first-degree murder and two counts of infanticide, and conspiracy to commit murder.  But from the evidence, it’s fair to assume that he murdered hundreds—perhaps thousands—over the course of his career” (#2684).  

Refusing a plea deal that would have led to his incarceration but spared his wife, Gosnell stood trial confident that he would be found innocent of all charges.  “His desire to appear as the smartest guy in the room overpowered all reason and good sense” (#2952), and he even fantasized serving as his own defense attorney!  His attorney, considered by many the best in Philadelphia, portrayed him “as a hardworking, selfless man—a pillar of the community with a virtually unblemished record who ran afoul of an overzealous prosecutor” (#3308).  Given a jury cleansed of pro-life persons and a pro-abortion judge who was a drinking companion of the defense attorney, Gosnell thought he could escape punishment by appealing to the pro-abortion ethos prevalent in progressive circles.  Nevertheless, as the evidence was presented and the expert testimony given, showing graphically what takes place in “late term” abortions as well as the killing of born-alive infants, the jury concluded Gosnell was in fact guilty as charged, and he was sentenced to life in prison.    

What the jury saw, however, went largely unreported by the nation’s media.  “If it hadn’t been for a committed group of bloggers, new media journalists, pro-life activists, and Twitter users, the Kermit Gosnell trial very likely would not have made national news” (#3983).  If a journalist mentioned the case it was usually to stress how virtually all other abortions were different from those performed in the Philadelphia clinic.  But then Kirsten Powers wrote a piece for USA Today, harshly condemning the press for neglecting the trial.  She’d found that none of the major TV networks mentioned the case during its first three months.  Nor did President Obama, who had “worked against the Born-Alive Infants Protection Act” while he was in the Illinois Senate, make any comments or face any questions dealing with his position on Gosnell.  Only Fox News covered “the story from the beginning of the trial” (#4189).  The book’s reception further illustrates the media’s pro-abortion bias.  Within days of its release, it sold out at Amazon and Barnes and Noble, outselling all but three non-fiction titles.  But the New York Times refused to put it on its best-seller list, and no mainstream media reviewed it.  

Ann McElhinney and Phelim McAleer have written a fully-documented, compelling treatise.  They obviously read everything relevant to the case, sat day after day witnessing the trial procedures, and later interviewed Gosnell in prison.  Though the surfeit of details—minutely describing the dead babies found in the clinic, investigating the police and prosecutors responsible for bringing the case to trial—may put off readers wanting a short synopsis, Gosnell:  The Untold Story of America’s Most Prolific Serial Killer merits the attention of everyone committed to the sacred “right to life” guaranteed by the Constitution as well as proclaimed in the Scriptures.  

* * * * * * * * * * * * * * * * * * * * * * *  

In the name of Nature, human nature is being denied and degraded in many venues.  Under the guidance of secular humanism, anti-human forces have been unleashed and radical “trans-human” proposals entertained.  As the astute Mortimer Adler long ago predicted (in The Difference of Man and the Difference It Makes), once the clear distinction between human beings and the rest of creation is denied, no reason remains for granting man any special standing (i.e. “human exceptionalism”).  The Great Chain of Being has dissolved, leaving nothing but randomly scattered and essentially equal beings.  For several decades Wesley J. Smith has researched and written about precisely this development, and in The War on Humans (Seattle:  Discovery Institute Press, c. 2014), he challenges some of the growing anti-human (misanthropic) currents in contemporary culture—most notably within an environmentalism “that is becoming increasingly nihilistic, anti-modern, and anti-human” (#107).  This is clear when one confronts the philosophical aspects of the Deep Ecology Movement, which serves for many as a “neo-Earth religion” that considers human beings no more than technologically sophisticated, consumerist parasites destroying Mother Earth.  Consequently, reducing the human population and giving other species unrestricted opportunities to thrive and multiply becomes the goal.   

This “green misanthropy” denies any moral difference between flora and fauna and human beings, whose numbers need reducing in order to enable other species to flourish.  To Paul Watson, head of the Sea Shepherd Conservation Society, humans are the “AIDS of the Earth.”  Only radical surgery, reducing man’s presence and activity on earth, can save the planet.  Similarly, Eric R. Pianka, a biology professor at the University of Texas who was named the Distinguished Texas Scientist of 2006, suggested it would be good if an Ebola pandemic killed 90% of the human population.  To Pianka:  “Humans are no better than bacteria, in fact, we are just like them when it comes to using up resources. . .  We are Homo the sap, not sapiens (stupid, not smart)” (#318).  Needless to say, such activists enthusiastically promote abortion, euthanasia, eugenics, and genetic engineering.  

Conveniently, today’s green misanthropists have found in the hysteria regarding global warming (or climate change) a useful tool with which to promote their agenda.  Smith claims no expertise in dealing with climate change claims, but he does clearly discern the anti-human tone permeating the discussion.  He also notes that atmospheric carbon levels have steadily increased while no significant increase in world temperatures has been detected in 20 years.  Fearsome predictions abound—as in former Vice President Al Gore’s feverish warnings—but only minor changes have actually occurred.  Polar bears still flourish, ice still forms in the arctic, snow still falls, crops still grow, hurricanes and earthquakes continue as usual—life on earth continues much as before.  

Yet numbers of schoolchildren fear they will not live into adulthood!  A U.S. senator introduced legislation to punish anyone daring to question the reality of climate change!  NASA’s James Hansen urged “the jailing of oil executives for committing ‘crimes against nature’ for being global warming ‘deniers’” (#811).  An editor at the Los Angeles Times says the paper will no longer print letters to the editor that doubt global warming!  Something has happened.  Vast numbers of folks have succumbed to green propaganda.  “Illustrating just how wacky global warming Malthusianism can become, the Mother Nature Network published an article lauding Genghis Khan—the killer of millions of people—for wonderfully cooling the planet during his years of conquest” (#690).  The author claimed that the Mongol invasions eliminated enough people (ca. 40 million) to keep 700 million tons of carbon from fouling the atmosphere!  So the planet cooled for a century and Khan’s genocide should be praised!  To Smith:  “Only when the new Earth religion reigns can a vicious barbarian like Khan be canonized a saint” (#697).  

In 1972, Christopher Stone, a University of Southern California law professor, published an article, “Should Trees Have Standing?—Toward Legal Rights for Natural Objects,” arguing that trees, as well as humans, should enjoy legal standing.  Subsequently, courts have increasingly granted environmentalists’ claims that legislation (most notably the Endangered Species Act) should be broadly construed so as to guarantee the preservation of all sorts of creatures and environments.  So now we face a new Earth religion that insists all of Nature has inalienable rights, including the right to exist—i.e. to be respected, to procreate, to have access to water.  Laws in nations such as Ecuador, Switzerland (with its “plant dignity” agenda), and New Zealand (declaring the Whanganui River to be a person) now protect such rights.  “In the 1970s,” Smith says, summarizing his presentation, “the values of Deep Ecology were anathema to most.  Ten years ago, granting ‘rights’ to nature would have been laughed off as a pipe dream.”  Yet, as we have witnessed in the rapid acceptance of such innovations as same-sex marriage, “in contemporary society very radical ideas often gain quick acceptance by a ruling elite growing ever more antithetical to human exceptionalism” (#1305).  “The triumph of anti-humanism within environmental advocacy threatens a green theocratic tyranny.  Like eugenics, the misanthropic agendas discussed in this book are all profoundly Utopian endeavors, meaning that the perceived all-important ends will come eventually to justify coercive means.  Indeed, the convergence of human loathing, concentrated Malthusianism, and renewed advocacy for radical wealth redistribution—all of which are now respected views within the environmental movement, and each of which is dangerous in its own right—threatens calamity.

“Don’t say you weren’t warned” (#1380).  

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * 

In A Rat Is a Pig Is a Dog Is a Boy:  The Human Cost of the Animal Rights Movement (New York:  Encounter Books, c. 2010), Wesley J. Smith carefully distinguishes between “animal welfare” (treating animals rightly) and “animal rights” (treating animals as man’s equal) and seeks to show the dangers posed by the latter.  He writes to alert readers to animal-rights radicals who plant bombs, destroy property, burn buildings, condemn all medical research involving animals, urge rigorous forms of vegetarianism, and even justify murder.  He also writes to inform us of the philosophical implications evident in statements such as Richard Dawkins’ declaration that we are not only like apes—“we are apes,” though differently evolved in minor ways!  

Foundational to the animal rights movement is Peter Singer’s 1975 book, Animal Liberation, wherein the first chapter is titled “All Animals Are Equal.”  Fifteen years later he could boast of having launched “a worldwide movement” that would continue to shape men’s minds, extending the rights of “personhood” to whales and dolphins, dogs and cats, cattle and sheep.  Millions now embrace his assumptions and promote his objectives—“rescuing” various animals, halting animal research, throwing paint on fur coats, etc.  “Meanwhile, tens of millions of human beings would be stripped of legal personhood, including newborn human infants, people with advanced Alzheimer’s disease, or other severe cognitive disabilities—since Singer claims they are not self-conscious or rational—along with animals that do not exhibit sufficient cognitive capacity to earn the highest value, such as fish and birds.”  Working out the implications of his position, Singer concluded:  “Since neither a newborn infant nor a fish is a person the wrongness of killing such beings is not as great as the wrongness of killing a person” (p. 28).  Contending newborn infants are not yet persons, he notoriously justifies infanticide until the baby attains “personhood” as he defines it.  

Though Singer speaks more plainly, many other distinguished academics and activists share his opposition to “speciesism,” the notion that humans are intrinsically superior to other forms of creation.  Thus they suggest that “Animals Are People Too,” positing “a moral equality between humans and animals,” making it “immoral for humans to make any instrumental use of animals” (p. 35).  All creatures capable of feeling pain are declared full-fledged members of the moral community.  Thus the People for the Ethical Treatment of Animals (PETA) once orchestrated a campaign “called ‘Holocaust on Your Plate,’ which compared eating meat to the genocide perpetrated by the Nazis against Jews” (p. 36).  

Ever alert to the opportunity of pushing their agenda through the judicial system, animal-rights activists work relentlessly to establish the “personhood” of animals in the courts.  Steven M. Wise, a law professor who heads the Center for the Expansion of Fundamental Rights, contends “that all animals capable of exercising what he calls ‘practical autonomy’ are entitled to ‘personhood and basic liberty rights,’ based on mere ‘consciousness’ and ‘sentience’” (p. 62).  Cass Sunstein, one of the regulation “czars” appointed by President Obama, thinks animals should be granted legal standing, and Harvard Law School’s Professor Laurence Tribe (one of Obama’s instructors) “has spoken in support of enabling animals to bring lawsuits” (p. 67).  To this point, the main success enjoyed in the courts has been in cases restricting or halting medical research using monkeys or on behalf of “endangered species” such as the spotted owl in the Pacific Northwest.  But there is a powerful movement pushing our legal system to grant full equality to all creatures, great and small.

In addition to the courts, animal-rights advocates are working to proselytize children, primarily through the public schools.  Given their childish affection for bunnies and puppies, children easily respond to emotional appeals on behalf of mistreated animals.  PETA comics portray hunters and fishermen as evil people in publications such as “Your Daddy KILLS Animals!”  Young readers are then warned:  “Until your daddy learns that it’s not ‘fun’ to kill, keep your doggies and kitties away from him.  He’s so hooked on killing defenseless animals that they could be next!” (p. 104).  PETA provides teachers with free curriculum materials and guest speakers espousing vegetarianism as well as condemning all forms of animal mistreatment.  High school students are promised legal assistance should they refuse to dissect frogs or dead animals in biology classes.  One organization, Farm Sanctuary, provides schools with materials promoting the “rescue” of animals imprisoned in “factory farms.” 

Smith makes his case by citing an impressive number of sources and presenting arresting illustrations, alerting us to the problems posed by the animal-rights movement.  He also rightly emphasizes “the importance of being human,” caring for animals well without elevating them to a sacred status.