315 Culture of Fear

One of the more amazing contemporary phenomena—despite our very evident safety and comfort—is the pervasive insecurity and fragility identifiable in various segments of the West.  As the Norwegian philosopher Lars Svendsen says:  “a paradoxical trait of the culture of fear is that it emerges at a time when, by all accounts, we are living more securely than ever before in human history.”  Aware of this, Pope John Paul II frequently encouraged believers to “fear not”—for that biblical phrase, reiterated by the angels announcing Jesus’s coming, indicates the importance of courage in the Christian tradition.  (Indeed, some 365 times the Bible says “be not afraid!”)  But with the waning of Christendom courage seems similarly sidelined.  Thus Alexander Solzhenitsyn said (in his 1978 Harvard Commencement Address):  “A decline in courage may be the most striking feature that an outside observer notices in the West today.  The Western world has lost its civic courage. . . .”  Prophetically, he warned:  “Must one point out that from ancient times a decline in courage has been considered the first symptom of the end?”

Courage, traditionally understood, enables one to conquer his fears, and most of us admire it—at least in theory.  “But in everyday practice,” Frank Furedi says in How Fear Works, “we have become estranged from this ideal and do very little to cultivate it.”  It has frequently, in fact, been “downsized” and even extended to assorted self-help endeavors!  Rather than a moral virtue best evident on the field of battle, it has turned into a therapeutic suggestion.  Thus we commend the “courage” of suffering poor health or recovering from romantic distress or speaking in public.  “The classical virtue of courage rooted it within moral norms that emphasized responsibility, altruism and wisdom.  The twenty-first-century therapeutic version is not based on an unshakable normative foundation; it has become disassociated from moral norms and is adopted instrumentally as a medium for achieving wellness” (#3040).

This cultural shift is generally justified by the necessity of “worst-case thinking” and the “Precautionary Principle,” a philosophical rationale “systematically outlined in the works of the German philosopher Hans Jonas, whose influential 1979 text The Imperative of Responsibility advocated the instrumental use of fear—what he calls the ‘heuristic of fear’—to promote the public’s acceptance of a dreadful view of the future.  Jonas offers what he perceives to be an ethical justification for promoting fear, which is that through its application, this emotion ought to be used to avoid humankind’s infliction of an ecological catastrophe on the planet” (#2749).  Jonas propounded “a teleology of doom based on the premise that modern technology threatens the world with an imminent threat of disaster” (#2756).  Among the intelligentsia he is “something of a philosophical saint” revered for his ecological sensitivities.  However, Furedi warns:  “his promotion of the principle of fear, his elitist contempt for people, and his advocacy of deception and tyranny, are rarely held to account” (#2798).

Inasmuch as courage is rooted in moral convictions, the increased fear in our society indicates a loss of moral certitude.  This phenomenon was diagnosed by Frank Furedi in his 2016 work, What’s Happened to the University?:  A Sociological Exploration of Its Infantilisation.  The author began his academic life as a student in 1965 and is now Emeritus Professor of Sociology at the University of Kent in the UK.  In his own student days universities were open to new ideas and touted the virtues of free speech and debating ideas.  As the decades passed, however, they became “far less hospitable to the ideals of freedom, tolerance and debate than in the world outside the university gate.”  They became fearful!   Students now seek to ban books that threaten their vulnerable psyches and protest speakers who might offend a spectrum of sexual and ethnic groups.  The free speech mantras of the ‘60s have morphed into speech codes; the former devotees of free speech have frequently become, as tenured professors, enforcers of censorship.  Many teachers forego the use of red pens to mark papers lest they damage fragile students’ egos, and “safe spaces,” “trigger warnings,” “microaggressions” and “chill out rooms” (replete with play dough and “comfort” animals to relieve anxieties) indicate how many universities have in fact become infantilized.

Two decades ago Furedi published Culture of Fear: Risk Taking and the Morality of Low Expectations, arguing that moral confusion had hollowed out Western culture, making persons both increasingly less able to deal with risk and uncertainty and less positive about human nature and man’s ability to aspire and adventure.  Now he has revisited the subject in How Fear Works:  Culture of Fear in the Twenty-First Century (London:  Bloomsbury Publishing, c. 2018; Kindle Edition).   To illustrate his thesis Furedi notes:  “Even an activity as banal as forecasting the weather has been transformed into a mini-drama through adopting a rhetoric that inflates the threat posed by relatively normal conditions.  Routine occurrences like storms, heavy snowfall or high temperature have been rebranded as extreme weather by the media.”  Indeed:  “The term ‘extreme weather’ is a paradigmatic culture of fear expression” and is, strangely enough,  “often interpreted through a moralistic narrative that presents it as the inevitable outcome of irresponsible human behaviour” (#338).  Summing up his study, he says “society has unwittingly become estranged from the values—such as courage, judgement, reasoning, responsibility—that are necessary for the management of fear” (#580).

In the past, many of our fears were restrained by religious faith, the confidence that some things were eternally true and worth risking—or even giving—one’s life to secure.  “Religion has always been interwoven with guidelines about what and what not to fear. Secular fear appeals concerning health, the environment, food or terrorism continue this tradition and are also often conveyed through a moral tone.  However, in the absence of a master-narrative that endows the unknown and the threat it poses with shared meaning, people’s response to threats has acquired an increasingly confusing and arbitrary character” (#1875).  Thus as we enter the 21st century “a pessimistic teleology of doom pervades the public deliberations on this subject” (#1202).  Every hurricane elicits warnings regarding climate change—as do arctic cold fronts, volcanic eruptions, and earthquakes!  No solid evidence or logical analysis is required to stoke the fears of folks immersed in our media world.  Think for a moment about the current Socialist superstar in Congress, Alexandria Ocasio-Cortez, who solemnly says we only have 12 years to save the planet!  Such somber predictions of environmental collapse (following the pattern cut out by Rachel Carson 50 years ago in Silent Spring) are often accompanied by warnings of a global demographic time bomb (confidently predicted by Paul Ehrlich in his now thoroughly discredited Population Bomb).

Consider the outlandish rhetoric of many social justice warriors!  Former President Jimmy Carter, for example, recently published a book entitled A Call to Action: Women, Religion, Violence and Power and grandly declared that right now (today!) slavery is “a ‘serious problem in the US’” and is even “‘more prolific now than during the eighteenth and nineteenth century.’”  It is, however, invisible!  Somehow Carter just knows it’s there, unseen and insidious.  “Like the hidden toxins ‘playing their tricks’ . . . modern slavery is not visible to the eye.  Typically, its hidden victims are said to be invisible and, therefore, the number of cases that have been actually detected are only the ‘tip of the iceberg.’”  To the former president “the transatlantic slave trade, which was responsible for the brutal enslavement of 12 to 15 million Africans, is merely a less prolific version of the ‘modern’ variety of the twenty-first century” (#1932).   This mantra is also recited by Jeff Nesbit, a former White House communications director, who said:  “‘No one knows the numbers.  That’s what’s so scary!’” (#1940).  To which Furedi retorts:  what’s scary is the fact that highly influential men such as Carter and Nesbit knowingly spread baseless falsehoods!

Then we’re fed alarming reports of rampant obesity and of children facing a barrage of threats to their well-being.   “In most Western societies, the population is healthier and lives longer than in previous times.  The latest generation of young people is likely to live 20 years longer than their grandparents.  Yet there has never been so much propaganda warning the public about yet another danger to its health” (#1736).  It’s apparently even risky to drink tap water!  “There was a time when people did not walk around holding different brands of bottled water in their hands; they drank tap water unless they lived in areas where tap water was considered to be unsafe, in which case water was boiled.”  But we now see people everywhere “clutching their bottles of water,” gripped by fears of contaminants of some sort.  “In 2016, bottled-water consumption in the US reached 39.3 gallons per person.”  This is done despite the fact “that the fears directed at tap water are not based on an objective evaluation of the risks of drinking it.  From a health perspective, the consumption of bottled water makes little sense.  Unfortunately, the sensible message that tap water is in most places safe to drink and that paying for the bottled variety is unnecessary is often distorted through a narrative of fear.  Instead of merely stating ‘Let’s get real and drink tap water’, opponents of the bottled-water fad frame their argument through the perspective of fear.”

As one might expect from a sociologist, Furedi is most helpful when compiling data and describing problems.  He clearly demonstrates the pervasive fears stalking contemporary society.  And he clearly shows how the lack of courage contributes to their currency.  But while he recognizes the need for the moral virtues, courage included, he fails to acknowledge the deeper philosophical or theological foundations necessary to form courageous persons.

      * * * * * * * * * * * * * * * * * * * * * * * * * *

In The Coddling of the American Mind:  How Good Intentions and Bad Ideas are Setting up a Generation for Failure (New York:  Penguin Publishing Group, c. 2018; Kindle Edition), Greg Lukianoff and Jonathan Haidt stress the harm done to children by teachers and parents excessively fearful for their safety.  The authors had become increasingly distressed by the onerous “speech codes” hindering free thought and expression on university campuses.  “Something began changing on many campuses around 2013, and the idea that college students should not be exposed to ‘offensive’ ideas is now a majority position on campus” (p. 48).  The “rationale for speech codes and speaker disinvitations,” once limited to racist or sexist declarations, “was becoming medicalized:  Students claimed that certain kinds of speech—and even the content of some books and courses—interfered with their ability to function.  They wanted protection from material that they believed could jeopardize their mental health by ‘triggering’ them, or making them ‘feel unsafe’” (p. 6).  To address their concerns Lukianoff and Haidt first wrote a widely-discussed article for The Atlantic Monthly and then, subsequently, this book to unmask three fashionably propagated “Great Untruths”:  1) “The Untruth of Fragility”—the notion that stress or discomfort harms you; 2) “The Untruth of Emotional Reasoning”—the injunction to disavow reason and “always trust your feelings”; and, 3) “The Untruth of Us Versus Them”—the warning that evil people continually seek to damage you.  Consequently the authors say:  “We will show how these three Great Untruths—and the policies and political movements that draw on them—are causing problems for young people, universities, and, more generally, liberal democracies” (p. 4).

To illustrate the falsity of fragility, Lukianoff and Haidt point out how parents trying to protect their youngsters from peanut allergies actually endanger them by preventing children’s powerful immune systems from properly developing.  A careful study revealed:  “Among the children who had been ‘protected’ from peanuts, 17% had developed a peanut allergy.  In the group that had been deliberately exposed to peanut products, only 3% had developed an allergy.  As one of the researchers said in an interview, ‘For decades allergists have been recommending that young infants avoid consuming allergenic foods such as peanut to prevent food allergies.  Our findings suggest that this advice was incorrect and may have contributed to the rise in the peanut and other food allergies’” (p. 21).  Indeed, as Nassim Nicholas Taleb says in Antifragile:  “Just as spending a month in bed . . . leads to muscle atrophy, complex systems are weakened, even killed, when deprived of stressors.  Much of our modern, structured, world has been harming us with top-down policies and contraptions . . . which do precisely this:  an insult to the antifragility of systems.  This is the tragedy of modernity:  as with neurotically overprotective parents, those trying to help are often hurting us the most” (p. 23).

That human beings—Homo sapiens—should renounce reason and trust their feelings is similarly untrue.  Though pop psychologists and media personalities may urge it, trusting your feelings flagrantly contradicts “much ancient wisdom.”  Whether pondering Epictetus or Buddha or Shakespeare or Milton, the best philosophers have insisted we think rather than feel.  Consult, for example, Boethius’ The Consolation of Philosophy, once one of the basic texts for the liberal arts, wherein he praises “Lady Philosophy,” who “chides him gently for his moping, fearfulness, and bitterness at his reversal of fortune” before helping  “him to reframe his thinking and shut off his negative emotions.  She helps him see that fortune is fickle and he should be grateful that he enjoyed it for so long.  She guides him to reflect on the fact that his wife, children, and father are all still alive and well, and each one is dearer to him than his own life.  Each exercise helps him see his situation in a new light; each one weakens the grip of his emotions and prepares him to accept Lady Philosophy’s ultimate lesson:  ‘Nothing is miserable unless you think it so; and on the other hand, nothing brings happiness unless you are content with it’” (p. 35).  Wise words for all ages!

The “us vs. them” untruth has gained currency to a large degree because of identity politics.  When race becomes the key to your identity you easily suspect racism in anyone who differs from you.  When sex defines you, you easily accuse others of sexism when you feel dissatisfied.  A widely-discussed incident at Yale illustrated this.  Erika Christakis, a lecturer at the Yale Child Study Center, responded to an administrative edict regarding Halloween costumes.  She approved concerns for “avoiding hurt and offense,” but “she worried that ‘the growing tendency to cultivate vulnerability in students carries unacknowledged costs.’”  Rather than issue behavioral rules, she suggested:  “Free speech and the ability to tolerate offense are the hallmarks of a free and open society” (p. 57).  Her rather mild email aroused angry students who protested and denounced her for racial insensitivity.  The university president sided with the aggrieved students, and in time Erika resigned from her position.  So goes “academic freedom” in modern America!

As was evident at Yale, intimidation and violence are manifestations of the coddling of the American mind!  Once speech they find objectionable is defined as “hate” speech, it is easy to insist it is a form of violence.  And in response to violence self-defense is justified.  So conservative speakers on university campuses are not only shouted down but physically attacked.  Witch-hunts are employed to root out dissenters on campus.  When a liberal biology professor at Evergreen State College refused to approve a campus shutdown to show solidarity with people of color, students demanded he be fired.  Successfully intimidating the college president, “students chanted, ‘Hey hey/ho ho/these racist faculty have got to go’” (p. 117).   “President Bridges, who at the beginning of the school year had criticized the University of Chicago for its policy protecting free speech and academic freedom, agreed to many of the protesters’ demands.  He announced that he was ‘grateful’ for the ‘passion and courage’ the protesters displayed, and later, he hired one of the leaders of the protests to join his Presidential Equity Advisors” (p. 119).  Most everything that’s wrong with the modern university stands starkly revealed at Evergreen State College!

Having described the “coddling of the American mind,” the authors turn to explaining how it came to be and set forth “six interacting explanatory threads,” beginning with “rising political polarization and cross-party animosity.”  Political positions no longer reflect a positive agenda, rooted in tradition and reflection; rather they are too often fueled by angry disdain for perceived enemies.  Secondly, they point out the importance of “rising levels of teen anxiety and depression.”  An alarming, and very recent, increase in teenage depression and suicide clearly constricts the passage from adolescence to adulthood.  Data recently collected from 139 colleges indicate that “half of all students surveyed reported having attended counseling for mental health concerns” (p. 156).  Importantly, some persuasive studies especially stress the negative role electronic devices play in the lives of our young.

Thirdly, “changes in parenting practices” or “paranoid parenting” clearly contribute to the malady.  The “permissive parenting” associated with Dr. Spock has morphed into the “intensive parenting” now dominant.  Responding to perceived threats to their children—such as being abducted by strangers, something that happens less than 100 times a year—parents overreact.  Though seat belts and bicycle helmets have certainly made children’s lives safer, “efforts to protect kids from risk by preventing them from gaining experience—such as walking to school, climbing a tree, or using sharp scissors—are different.  Such protections come with costs, as kids miss out on opportunities to learn skills, independence, and risk assessment” (p. 169).  Fourthly, there has been a “decline of free play,” something absolutely necessary for childhood development.  A child’s brain needs “thousands of hours of play—including thousands of falls, scrapes, conflicts, insults, alliances, betrayals, status competitions, and acts of exclusion—in order to develop.  Children who are deprived of play are less likely to develop into physically and socially competent teens and adults” (p. 183).  Unfortunately, school children are less likely to have physical education classes or recess.  And rather than learning to play ball with neighborhood kids—and to choose teams and referee the game—kids are shoved into organized leagues with uniforms and trophies and assorted adult paraphernalia irrelevant to healthy personal development.

Fifthly, once in the university students face a burgeoning “campus bureaucracy” devoted to ensuring their comfort and security.  Thus we find the president of Louisiana State University declaring:  “‘Quite frankly, I don’t want you to leave the campus ever.  So whatever we need to do to keep you here, we’ll keep you safe here.  We’re here to give you everything you need’” (p. 199).  Such protective “safetyism” increasingly extends to emotional as well as physical well-being.  Students must be shielded from “microaggressions,” given “trigger warnings” when scary subjects are to be broached, and supplied with “safe spaces” suitable for children.  Finally, students are immersed in “a rising passion for justice in response to major national events, combined with changing ideas about what justice requires” (p. 125).   They then become “social justice warriors” determined to eliminate inequalities and inequities wherever possible.  Little concerned with distributive or procedural notions of justice, they are increasingly devoted to “equal-outcomes social justice,” even if they trample on important concepts such as “innocent until proved guilty.”

Concluding their treatise with a section titled “wising up,” Lukianoff and Haidt first proffer advice for parents who want to rear “wiser, stronger, and antifragile” kids who will become self-reliant adults.  Giving them lots of time for “free play,” encouraging them to walk or bike to school, placing limits on the time they spend with electronic devices, including television, are important aspects of their prescription.  And for “wiser” universities they urge a return to the vigorous pursuit of truth once considered essential for liberal arts education.   Rather than promoting “social justice,” universities should urge persons to freely think and speak, embracing the commitment Benjamin Franklin expressed in founding the University of Pennsylvania:  “‘Nothing is of more importance to the public weal, than to form and train up youth in wisdom and virtue. Wise and good men are, in my opinion, the strength of a state: much more so than riches or arms, which, under the management of Ignorance and Wickedness, often draw on destruction, instead of providing for the safety of a people’” (p. 269).

314 Luther’s Reformation and Its Consequences

In his Requiem for a Nun William Faulkner famously said, “The past is never dead.  It’s not even past.”  This is especially true when it comes to Church history, so it was predictable that to mark the 500th anniversary of the Protestant Reformation (launched by Martin Luther in 1517) a plethora of books were published.  Inevitably—given Luther’s personality and positions—interpretations varied widely and nothing approaching a consensus is possible.  But I read and commend two works, beginning with Lyndal Roper’s Martin Luther: Renegade and Prophet (New York:  Random House Publishing Group, c. 2016, Kindle Edition).  Roper is an Australian historian who did doctoral research at Tübingen University under Professor Heiko Oberman, the author of a notable study of Luther.  Now the first woman to hold the prestigious Regius Chair at Oxford University, she is less interested in Luther’s theology than his personality, seeking “to explore his inner landscapes so as to better understand his ideas about flesh and spirit, formed in a time before our modern separation of mind and body.  In particular,” she says, “I am interested in Luther’s contradictions” (#400).  Thus she diligently mined a wealth of primary sources newly available in archives opened to scholars in the wake of East Germany’s demise.

Roper believes Luther’s “theology sprang from his character, a connection that Melanchthon, one of the first of his biographers and his closest co-worker, insisted upon:  ‘His character was, almost, so to speak, the greatest proof’ of his doctrine.  Luther’s theology becomes more alive as we connect it to his psychological conflicts, expressed in his letters, sermons, treatises, conversations, and biblical exegesis.  Such a rereading of the original sources,” enhanced by psychoanalytical insights, will provide “a richer understanding not only of Luther the man but also of the revolutionary religious principles to which he dedicated his life, the legacies of which are still so powerful” (#405).  His letters especially “give us a sense of the charisma he must have radiated, and the sheer delight his correspondents must have experienced in being his friends.  It was Luther’s vivid friendships and enmities that convinced me that he had to be understood through his relationships, and not as the lone hero of Reformation myth” (#476).

Luther’s early years were spent in Mansfeld, where his father was a prosperous miner, followed by scholarly instruction in the nearby cities, including Erfurt, where he attended the university.  Though the university specified strict rules of behavior, “Luther acidly remembered, ‘Erfurt is a whorehouse and beerhouse’ . . . .   Founded in 1392, the university was the oldest German institution to have a charter, and in the early sixteenth century it boasted an outstanding collection of humanists, interested in the revival of ancient learning and in returning to the sources” (#1029).  Luther was only an “average student,” but he absorbed much of Erfurt’s Weltgeist—both “the via moderna and nominalism, a direction in philosophy that reached back to William of Ockham in the fourteenth century.  Luther’s teachers included cutting-edge nominalists” who promoted the via moderna rather than the via antiqua evident in Thomas Aquinas and Duns Scotus.  Luther especially became committed to “critical thinking” and “empirical evidence,” i.e. primary sources.

Then came his famous awakening in the 1505 thunderstorm!  Fearing he might die, he consecrated himself to the religious life, joined the Augustinian order, and entered its monastery in Erfurt.  Here he followed a normal course of studies but also struggled with what seems to have been an inexplicable “sense of overwhelming guilt.”  Strangely enough:  “Luther seems almost to have luxuriated in feelings of guilt, as if, by driving them to their extreme, he could experience a heightened devotional state of self-hatred that would bring him as close as possible to God” (#1241).  Conversely, his mentor, Johann von Staupitz, “had a relaxed attitude to sin—he once joked that he had given up making vows, for he was simply unable to keep them—but what worried Luther were not the usual sins but the ‘real knots’:  his lack of love of God and his fear of judgment” (#1336).  He would ultimately solve this conundrum by replacing the obligation to love with faith alone as the touchstone of salvation.

In 1511 Luther was sent to Wittenberg, a town of some 2000 residents, the site of a new university, a castle, and a magnificent cathedral—all thanks to the Elector Friedrich.  Here he became a professor and found the academic life fully suited him, plunging into it with gusto, reading and writing and thinking deeply about the Gospel.  By 1517, when he posted his famous 95 Theses, he had discarded scholasticism and declared that Aristotle (whose works were basic to the medieval university curriculum) “was not only unnecessary for the study of theology, but positively harmful” (#1958).  Indeed, Greek philosophy in toto—given its celebration of reason—had no value since it “was just a distraction from the meaning of Scripture, and one must give up on attempting to find God through ‘the whore’ of reason, for the point of faith is that it exceeds rationality and reveals the distance between God and man” (#338).  So:  “‘No one can become a theologian unless he becomes one without Aristotle’” (#1965).  Claiming instead to follow St. Augustine, Luther said:  “‘The truth therefore is that man, made from a bad tree, can do nothing but want and do evil;’” consequently:  “‘Man is by nature unable to want God to be God.  Indeed, he himself wants to be God, and does not want God to be God’” (#1968).  Thus Sola Scriptura became a Reformation dictum.

Yet another dictum was justification by faith alone.  In 1545, the year before he died, Luther recalled how Paul’s Letter to the Romans proved central to the Reformation:  “‘At last, by the mercy of God, meditating day and night, I gave heed to the context of the words, namely, “In it the righteousness of God is revealed, as it is written, ‘He who through faith is righteous shall live.’”  There I began to understand that the righteousness of God is that by which the righteous lives by a gift of God, namely by faith.  And this is the meaning:  the righteousness of God is revealed by the gospel, namely, the passive righteousness with which the merciful God justifies us by faith, as it is written, “He who through faith is righteous shall live.”  Here I felt that I was altogether born again and had entered paradise itself through open gates’” (#275).

Luther’s paradise included increasing sensual indulgence!  Thus he encouraged monks and nuns to marry and himself wedded Katharina von Bora, a “poor noblewoman” who “was, by all accounts, attractive, feisty, and passionate” (#5455).  In a fascinating chapter entitled “Marriage and the Flesh,” Roper describes and analyzes the importance of Luther’s marriage.  Katharina was a valuable helpmate, effectively running the household and allowing Martin to focus on his studies.  She bought and farmed some land and “was famed for her beer brewing, a necessity in a period when water was not safe to drink” (#5579).  But to his friend Melanchthon this step indicated “that something had changed in Luther by 1525, and he did not like it.  The ascetic was becoming a sensualist” (#5498).  And, indeed, Luther entertained “remarkably uninhibited views about sexuality—and consequently marriage” that accorded well with his “radical Augustinianism.  If we can never do anything good, as all human acts are sinful, then sexual acts are no different or worse in kind than other types of sin.  This gloomy anthropology paradoxically freed Luther to take a relaxed view of sexuality.  Lust was part of human nature—it was how God had created mankind” (#5615).

Though Luther insisted he’d found the absolute truth proclaimed in the Scripture, his reformation quickly splintered.  When the great humanist Erasmus differed from him regarding predestination, Luther excoriated him.  Then Erasmus published A Discussion or Discourse Concerning Free Will, asserting man may cooperate with God in the salvation process and denying total depravity, and Luther responded with De servo arbitrio (On the Enslaved Will), arguing that God arbitrarily determines everything.  We are so congenitally sinful that “only God’s grace can enable us to do anything good.”  Indeed, speaking personally, he did “not wish to be given free will.”  His newfound relationship with God required there be no free will, because “‘I am certain and safe, because he is trustworthy and will not lie to me, and also because he is so powerful and great that no devils, no adversities could break him or snatch me from him’” (#5666).

Others joined Erasmus in dissenting from Luther.  His Wittenberg collaborator and supporter, Andreas Karlstadt, began stressing the importance of Gelassenheit—a total surrender of one’s will to God’s Will, “a state of mystical receptivity and openness where the boundaries between oneself and God disappear—as if one were to return to the womb where there is no separation between mother and child” (#4430).  He thus proclaimed the possibility of attaining a kind of Christian perfection Luther could not tolerate.  Then, dressed in lay clothing while celebrating Mass, Karlstadt distributed both bread and wine, allowing anyone present to participate in Communion.  Consequently, of the thousand parishioners present “many of those who took Communion had not kept the obligatory fast but had eaten and drunk beforehand” (#445).   Such behavior outraged many in the community—including the Elector, whose support Luther surely needed! 

On top of Karlstadt’s increasingly aberrant behavior, more radical reformers arrived in Wittenberg!  Known as the Zwickau prophets, three zealous laymen claimed God directly spoke to them.  No need for Bible or trained pastors!  They could read the Bible—as Luther insisted—for themselves.  And they could also—as Luther denied—interpret the Bible as they wished.  “The Zwickau prophets represented a new kind of evangelical movement that owed little or nothing to universities.  God’s spirit, it seemed, was being poured out onto laypeople to preach and prophesy, bypassing traditional authority” (#4534).   Predictably, the radicals appealed to university students, and considerable chaos erupted.  Soon, wherever the reformation took root, evangelicals were “interrupting sermons, destroying altarpieces, tearing up Mass books, urinating in chalices, or mocking the clergy—and they drew on the same repertoire of carnivalesque ritual and comedy that the Wittenberg students had developed” (#4227).  Even more threatening was yet another reformer, Thomas Müntzer, who came to Wittenberg and took an apocalyptic approach to Scripture, saying he felt led to violently usher in the Kingdom of God.  Consequently, the Peasants’ War erupted in 1524 and proved to be “the biggest social uprising in the German lands before the era of the French Revolution began” (#5113).  Celebrating Reformation themes—“freedom,” “Christ alone,” “Scripture alone”—peasants, armed “with pikes and swords had remarkable success” and briefly controlled “vast swathes of south and central Germany” (#5174).

In response, Luther determined to arrest and stabilize the movement he’d launched!  Consistently aligning himself with secular authorities, he insisted only his version of Protestantism be allowed.  So in 1524 he assailed Karlstadt in Against the Heavenly Prophets, and responded to the peasants’ uprisings by publishing Against the Robbing Murdering Thieving Hordes of Peasants.  His attack on the peasants led to their repudiating him as “Brother Fattened-swine and Brother Soft-life,” “Doctor Liar” and “the spiritless, soft-living flesh at Wittenberg.”  Then he had to deal with deviants in Switzerland!  Huldrych Zwingli had orchestrated a reformation in Zurich and shared many of Luther’s views.  But he differed from him regarding the Eucharist.  In 1529 the two men met at the Colloquy of Marburg, where Luther insisted Christ’s words, “This is my body,” be taken literally, affirming the Real Presence of Christ in the bread and wine.  “As it became clear that the two sides could not agree, Luther washed his hands of them, consigning them to the judgment of God, ‘who will certainly decide who is right,’ at which Zwingli burst into tears.  At the end of the meeting, Oecolampadius and Zwingli, pleased that at least they had all now met in person, wanted to embrace their opponents as brothers and allow all of them to take Communion with one another, but Luther bitterly refused” (#6300).

In the final 15 years of his life, Luther continued to teach in Wittenberg and influence the Reformation he had launched.  But his more eirenic associate, Melanchthon, presided over Lutheran theological developments, and secular rulers established essentially “magisterial” (i.e. state-controlled)  churches.  As Roper illustrates with Luther’s letters, he became increasingly bitter and routinely lashed out in anger against his many foes.  Even Melanchthon experienced his wrath!  And though he died with an assurance regarding his own salvation he seemed distressed by much of what the Reformation accomplished.

* * * * * * * * * * * * * * * * * * * * * * * *

For a thoughtful assessment of Luther and the Reformation I commend Brad S. Gregory’s Rebel in the Ranks:  Martin Luther, the Reformation, and the Conflicts That Continue to Shape Our World (New York:  HarperOne, c. 2017).  Gregory is a history professor at Notre Dame who writes with clarity and authority.  The subject is important, he thinks, for “anyone who wants to understand how and why we have the Western ideas and institutions we have today must understand the Reformation and all that followed in its wake” (p. 13).  Though fascinated with Luther, Gregory is more interested in the unexpected consequences of his reformation, which “had the long-term impact of gradually and unintentionally transforming Europe from a world permeated by Christianity to one in which religion would be separate from public life, becoming instead a matter of individual preference” (p. 8).  There had in fact been many “reformers” over the centuries—such as the Cluniacs—calling for the restoration of morality, but Luther and his followers were distinguished by “asserting that many of the Church’s teachings were themselves false.  The problem wasn’t just bad behavior; it was also erroneous doctrine” (p. 9).  “Taken together, these new ideas, practices, and institutions became the foundations for the modern world.  They led eventually to the modern secularization of Western life—an unintended outcome of a sixteenth-century religious revolution” (p. 10).

After retelling Luther’s story, emphasizing the familiar themes of his reformation—sola scriptura, sola fide, etc.—Gregory turns to his central concern, the “fractious” nature of Protestantism, revealing the deeply political aspects of the movement.  Within a decade of its inception, Protestants divided into rival camps, including the despised Anabaptists as well as the officially supported Lutheran and Reformed churches.  Especially in Reformed regions political powers asserted themselves and there occurred a “reversal of clerical and lay roles:  local magistrates are asserting religious authority—and not just in matters of jurisdiction, as in the late Middle Ages, but in matters of doctrine” (p. 100).  Thenceforth Protestants divided and subdivided:  “Lutheranism in Denmark, Sweden, and much of Germany; Reformed Protestantism in Scotland, England (in some respects), the Netherlands, and parts of Germany and Switzerland” (p. 145).

Then the Protestant churches themselves fractured.  For example, Lutherans soon differed in their understanding of Luther.  “A rift opens between Philippists, named after their leader, Philip Melanchthon,” and “the self-described Genuine Lutherans” who “think Melanchthon and the Philippists are betraying Luther’s views with mistaken interpretations of scripture on a whole range of doctrines concerning faith, grace, and works, among other issues” (p. 151).  Reformed Protestants in the Netherlands split between “orthodox” Calvinists and Arminians.  “At the heart of this conflict are theological disagreements about human nature, will, sin, and grace derived from differing interpretations of scripture.”  “Jacob Arminius (1560–1609), a theology professor at the Dutch Republic’s new University of Leiden, arrives at conclusions about core Protestant doctrines that are at odds with those of Calvin (and Luther).  According to Arminius, original sin does not completely corrupt human nature; human beings do have some free will and so can cooperate with God’s grace in salvation.  To card-carrying Calvinists, this is crypto-Catholic backsliding, like taking Erasmus’s position against Luther in their debate about free will and salvation.”  Tensions escalated and led to the Synod of Dort (1618-19), which approved a strong version of Calvinism while dramatically demonstrating “that the principle on which the Reformation rests—‘scripture alone’—is powerful enough to generate rival assertions about what the Bible actually says and therefore rival views about how it is to be applied” (p. 159).

The reformation in England followed the same trajectory.  Though the Tudor and Stuart monarchs tried to control the Church of England, they failed to restrain internal dissent—as was evident in the growing power of the Puritans and their violent revolution in the 1640s, culminating with the beheading of the king.  “Radical Protestants in the English Revolution really come into their own after the execution of Charles I and the proclamation of the Republic in 1649.  Gerrard Winstanley and his Diggers champion a biblical vision similar to the Hutterites: an agrarian, communitarian Christian commonwealth without private property.  The radically different George Fox and other early Quakers are spiritualists who claim illumination by the same ‘inner light’ that they believe inspired Jesus’s first apostles.  Utterly different again are the Fifth Monarchists:  their Christian duty, as they understand it, is to take up arms against Oliver Cromwell’s regime in their own country, hastening the Second Coming of Christ.  Seventh-Day Baptists depart from the already existing General (Arminian) and Particular (Calvinist) Baptists by insisting, as do some other groups, that the Sabbath be celebrated on Saturday rather than Sunday.  And Ranters, like Abiezer Coppe, allegedly take Christian freedom and rejection of the Old Testament law to mean complete sexual permissiveness—for, as scripture says, “To the pure all things are pure” (Titus 1:15).  If you don’t think something is sinful, it’s not sinful for you.  If this all sounds confusing and complicated, that’s because it was—much more chaotic and complex than any brief account can convey.  Like the early German Reformation, the English Revolution shows that scripture interpreted through the Spirit, as Luther emphasized, could come to mean almost anything” (pp. 165-166).

Such unexpected (and unintended) consequences of the Reformation were thoroughly analyzed in Gregory’s earlier, much more detailed The Unintended Reformation, wherein he documented, in successive chapters, first, how God was progressively ignored as a non-material and thus unknowable reality.  Secondly, he shows how Christian doctrines were relativized by contentious theologians; as Erasmus lamented, in 1524:  “What am I to do when many persons allege different interpretations, each one of whom swears to have the Spirit?”  Thirdly, Gregory demonstrates how the nation states increasingly controlled the churches, for “no Protestant regime was even possible save through dependence on secular rulers” (p. 152).   By 1555 it had been decided:  “cuius regio, eius religio—whose kingdom, his religion.”  Fourthly, as a result of the reformations, rival moral authorities presided over diverse moral communities, and in time everyone became not only his own priest and theologian but ethicist.   Fifthly, Gregory notes how the “good life” became increasingly defined as the acquisition of good things.  “The earliest New England Puritans rail against greed and endeavor to punish it in ways that would have made Calvin proud.  By the late seventeenth century, however,” various Christians viewed “material prosperity, including the highly profitable participation in the Atlantic slave trade, as part of God’s benevolent plan for the chosen people of England, his elect imperial nation.  In a dramatic reversal, the pursuit of profit is being aligned with religion, not regarded as a deadly sin or a grave danger to your soul or the common good” (p. 234).  Finally, knowledge became deeply secularized, reduced to describing material entities as a result of powerful prejudices favoring methodological naturalism and evidentiary empiricism.  Metaphysical or theological views were excluded from making any truth claims about anything more than one’s inner feelings.

More celebratory treatments of the Reformation are easily available, but Gregory’s arguments deserve careful thought and reflection, for the fragmentation of Christendom and the secularization of society cannot be ignored.  And his yoking the Reformation to these developments has much merit.  Rooted in his longing for a “world we have lost,” his works provide a deeply Catholic critique of the Reformation—but they are sorrowful rather than scathing in tone.  As Lucy Wooding says:  “This book is truly breathtaking in its scope, erudition and sheer nerve . . .  There may yet be time to fix some of what went wrong in the Reformation.”  Understanding it is a place to start!

313 Postmodernism, Scientism

When I first encountered “postmodernism” several decades ago I wondered at the sheer irrationality of the term itself.  After all, The Oxford English Dictionary defines “modern” as “being at this time; now existing; of or pertaining to this present and recent times.”  By definition, then, nothing can be post-modern!  It is, in fact, oxymoronic—self-contradictory.  So I was gratified, recently reading Alexander Solzhenitsyn’s 1993 essay, “Playing Upon the Strings of Emptiness” (crafted when he was awarded the National Arts Club Medal of Honor for Literature), to find him sharing my view.  “Whatever the meaning intended for this term,” he wrote, “its lexical makeup involves an incongruity:  the seeming claim that a person can think and experience after the period in which he is destined to live.”   Importantly:  “For a post-modernist, the world does not possess values that have reality.  He even has an expression for this:  ‘the world as text,’ as something secondary, as the text of an author’s work, wherein the primary object of interest is the author himself in his relationship to the work, his own introspection.”

Yet, amazingly enough, throughout the past century growing numbers of people embraced the position Solzhenitsyn opposed and adopted the motto propounded in Luigi Pirandello’s 1916 play:  Right You Are If You Think You Are.  In their own inner worlds postmodernists fantasize—or “construct” their own reality—even to the extent of self-selecting their sex!  New York City’s Mayor Bill de Blasio recently defended this, allowing residents to rewrite their birth certificates, choosing any of three sexual categories.  “New Yorkers,” he said, “should be free to tell their government who they are.”  Now boys insisting they are girls join female wrestling teams and easily win matches.  In all such bizarre behaviors we see postmodernism triumphant!  George Orwell, writing 1984, envisioned such a time as ours, when:  “All words grouping themselves round the concepts of objectivity and rationalism were contained in the single word oldthink.”   He prophetically skewered the twin pillars of Postmodernism:  epistemological skepticism and ethical relativism.  What Orwell called “oldthink” (objective reason), postmodernists reject and claim to transcend.

To understand this phenomenon, I commend Explaining Postmodernism:  Skepticism and Socialism from Rousseau to Foucault (New York:  Ockham’s Razor Publishing, Kindle, c. 2011) by Stephen R. C. Hicks.  Primarily, he thinks:  “Postmodernism is the end result of the Counter-Enlightenment attack on reason” (#913).  So to understand it we need to review two centuries of intellectual history, beginning with Immanuel Kant, a philosopher often touted as the personification of the Enlightenment and its dedication to reason, yet who was deeply anti-rational inasmuch as he “asserted that the most important fact about reason is that it is clueless about reality” (#940).  Kant thought we can observe and link together phenomena, but essences—any inner noumena—must remain forever unknowable.   We can describe and manipulate the material world, but the “objects that science explores exist ‘only in our brain,’ so we can never come to know the world outside it” (#1075).  Thus Kant discarded the Enlightenment’s understanding of reason, holding “that the mind is not a response mechanism but a constitutive mechanism.  He held that the mind—and not reality—sets the terms for knowledge.  And he held that reality conforms to reason, not vice versa.  In the history of philosophy, Kant marks a fundamental shift from objectivity as the standard to subjectivity as the standard” (#1143).  “‘I had to deny knowledge,’ wrote Kant in the Preface to the first Critique, ‘in order to make room for faith.’”  Setting forth his “first hypothesis about the origins of postmodernism,” Hicks says: “Postmodernism is the first ruthlessly consistent statement of the consequences of rejecting reason, those consequences being necessary given the history of epistemology since Kant” (#1976).

Subsequent to Kant, various 19th century philosophers (e.g. Schopenhauer and Nietzsche) and theologians worked out the implications of his position.  In particular there transpired a profound shift in Lutheran theology inspired by F.D.E. Schleiermacher, the father of Protestant Liberalism, who declared:  “‘The essence of religion is the feeling of absolute dependence.  I repudiated rational thought in favour of a theology of feeling’” (#1410).  Soon thereafter Soren Kierkegaard “gave irrationality an activist twist” and profoundly influenced (with his “Christian Existentialism”) 20th century theologians such as Karl Barth.  “Faith,” wrote Kierkegaard in Fear and Trembling, “requires the crucifixion of reason”; so he proceeded to crucify reason and glorify the irrational (#2164).  Equally Kantian is the atheistic version of Existentialism set forth by Martin Heidegger, who effectively jettisoned reason and logic “to make room for emotion.”   Heidegger rejected “the entire Western tradition of philosophy . . . based as it is on the law of non-contradiction and the subject/object distinction” and propounded a despairing version of metaphysical nihilism (#1670).  He “is unquestionably the leading twentieth-century philosopher for the postmodernists” (#1518).

In addition to Kant’s philosophical idealism one must understand the importance of Jean-Jacques Rousseau’s socialistic political ideology.  Though postmodernism is certainly a philosophical persuasion, it is equally a political position, leading to Hicks’ “second hypothesis about postmodernism:  Postmodernism is the academic far Left’s epistemological strategy for responding to the crisis caused by the failures of socialism in theory and in practice” (#2153).  Since the French Revolution in 1789, socialism (or progressivism) had become a Rousseau-inspired religion for many.   “Rousseau’s writings were the Bible of the Jacobin leaders of the French Revolution, absorbed by many of the hopeful Russian revolutionaries of the late nineteenth century, and influential upon the more agrarian socialists of the twentieth century in China and Cambodia” (#2204).  Rousseau routinely elevated feeling over reason and determined to follow his “inner light;” he also celebrated the supremacy of simplicity (i.e. the “Noble Savage”) over the artificiality of civilization and its consequent corruptions.

Yet the 20th century’s sorry record of socialist revolutions and regimes effectively refuted its ideology, whereas the much-derided capitalist system had, in fact, made life much better for millions of people.  Marx’s oft-celebrated motto—“From each according to his ability, to each according to his need”—lost its allure in prospering societies wherein virtually all material needs had been satisfied!  So Leftists abruptly stopped talking about “needs” and declared themselves committed to “equality.”  Capitalism now stood accused of failing, not to satisfy basic needs, but to give everyone equal shares of everything.  Rather than seeking to rectify economic injustices, Socialists promoted “multiculturalism” and crusaded to eliminate racial and sexual inequities.  In addition, Marxist activists embraced environmentalism, which promoted “the radical moral equality of all species,” as a movement capable of discrediting capitalism.

In their desire to destroy distinctions and abolish hierarchies, Leftists reveal their deeply nihilistic perspectives.  Indeed:  “Nihilism is close to the surface in the postmodern intellectual movement in a historically unprecedented way.  In the modern world, Left-wing thought has been one of the major breeding grounds for destruction and nihilism.  From the Reign of Terror to Lenin and Stalin, to Mao and Pol Pot, to the upsurge of terrorism in the 1960s and 1970s, the far Left has exhibited repeatedly a willingness to use violence to achieve political ends and exhibited extreme frustration and rage when it has failed.  The Left has also included many fellow-travelers from the same political and psychological universe, but without political power at their disposal” (#4125).  As Nietzsche, one of the architects of postmodernism, said in Daybreak:  “When some men fail to accomplish what they desire to do they exclaim angrily, ‘May the whole world perish!’”

Consequently, some of the most influential postmodernists, awash in despair at the failure of their socialist faith, seem happy to envision the abolition of man.  Michel Foucault, for example,  “speaks almost longingly about the coming erasure of mankind:  Man is ‘an invention of recent date’ that will soon ‘be erased, like a face drawn in sand at the edge of the sea.’  God is dead, wrote Hegel and Nietzsche.  Man too will be dead, Foucault hopes” (#4186).  Deconstructionists such as Foucault and Jacques Derrida seek to get behind or beneath the apparent meaning of language.  More deeply, following the atheistic nihilism of Nietzsche, they deconstruct not only language but Reality itself!  Nothing can be said because, ultimately, nothing ontological is really There.   If there are objective “things” around us (and especially eternal, substantial, non-material realities), they are beyond knowing and thus unreal.  What’s real is simply what, at the moment, we consider real for us, whatever works for us.  So here we are:  men calling themselves women!

* * * * * * * * * * * * * * * * * * * * * * * *

Rivaling postmodernism for modern man’s allegiance is what’s frequently dubbed “scientism”—carefully examined by J.P. Moreland, a professor at Biola University and one of today’s best evangelical philosophers, in Scientism and Secularism:  Learning to Respond to a Dangerous Ideology (Wheaton, IL:  Crossway, Kindle Edition, c. 2018).  A quotation from Dallas Willard, another fine evangelical scholar, nicely sums up Moreland’s thesis:  “The idea that knowledge—and of course reality—is limited to the world of the natural sciences is the single most destructive idea on the stage of life today.”  Anticipating Willard’s concern, C.S. Lewis devoted a significant amount of his writings, beginning with his first Christian work, The Pilgrim’s Regress (1933), to a critique of Scientism.  During the Second World War, delivering some lectures published as The Abolition of Man, he warned:   “The process which, if not checked, will abolish Man, goes on apace among Communists and Democrats no less than among Fascists.”  In fact, he declared that “many a mild-eyed scientist in pince-nez, many a popular dramatist, many an amateur philosopher in our midst, means in the long run just the same as the Nazi rulers of Germany.”

To introduce his case Moreland gives a bit of personal background.  Reared in a very nominal Christian home and church, he entered the University of Missouri determined to pursue a degree in science.  While there, however, he encountered Campus Crusade, had a life-changing conversion experience, and subsequently served as a Crusade staffer for a decade.   He then resumed his academic work and,  “during the process of my various studies . . . constantly bumped into something dark, hideous, and, I dare say, evil.  It was the philosophical notion of scientism, roughly the view that the hard sciences alone have the intellectual authority to give us knowledge of reality” (#240).

“At the very least,” its devotees declare, “this scientific knowledge is vastly superior to what we can know from any other discipline.  Ethics and religion may be acceptable, but only if they are understood to be inherently subjective and regarded as private matters of opinion.  According to scientism, the claim that ethical and religious conclusions can be just as factual as science, and therefore ought to be affirmed like scientific truths, may be a sign of bigotry and intolerance” (#275).  Inasmuch as the public schools and universities embrace and promote it, scientism has become rather like the air we breathe—something so pervasive we hardly notice it.  Sadly, few of us consider “what it does to a culture and to the church.  It puts Christian claims outside of the ‘plausibility structure’ (what people generally consider reasonable and rational)” and makes it difficult for the Gospel to get a fair hearing (#365).  On the defensive, many Christians have left the “reasonable and rational” realm to scientists and embraced various versions of “blind faith.”

Representing such scientism, Robert B. Reich, a Harvard professor and Secretary of Labor in the Clinton administration, recently declared:  “‘The greatest conflict of the 21st century . . . will be between modern civilization and anti-modernists; between those who believe in the primacy of the individual and those who believe that human beings owe their allegiance and identity to a higher authority; between those who give priority to life in this world and those who believe that human life is mere preparation for an existence beyond life; between those who believe in science, reason, and logic and those who believe that truth is revealed through Scripture and religious dogma.’  Reich understands that ideas matter, and he hopes that scientism destroys our confidence in Christianity” (#482).  Indeed, Reich is a sterling example of Moreland’s claim that “Scientism is a silent yet deadly killer of Christianity” (#3173).

Despite the self-assurance of folks such as Robert Reich, scientism is, rightly evaluated, irrational.  Bringing philosophical rigor to the discussion, Moreland builds a persuasive case showing “that strong scientism—the view that true knowledge is found only within science—is self-refuting.  It is self-referentially incoherent, meaning that it refutes or defeats itself” (#657).  To explain:  “when a statement is included in its own subject matter (i.e., when it refers to itself) but fails to satisfy its own standards of acceptability, it is self-refuting” (#667).  A self-refuting statement is necessarily false!  If you say “All sentences are exactly three words long,” or “I do not exist,” or “There are no truths,” you refute yourself.  You make no sense, so you’re speaking nonsense!  So too when you say “Truths can only be verified by the five senses or by science” you refute yourself, for you are stating something that cannot be so verified (#671).

Given his background, Moreland fully understands the scientific world, and he knows its champions assume some utterly non-scientific positions.  For example, they assume there’s a world “out there” to be studied.  Physicists and chemists purport to weigh and measure actual things “independent of mind, language, or theory.”  Whether or not they know it, they are philosophical realists—assuming there’s a world that’s quite real apart from their own inner worlds.  They simply assume our senses and minds enable us to come to grips with and understand a real world.  Scientists further assume the natural world functions in accord with orderly laws (e.g. mathematics or logic or gravity or electromagnetic fields)—constants underlying the changing world of sense perceptions.  Inasmuch as they celebrate “peer review” to establish scientific truth, practicing scientists necessarily believe in “objective truth.”  What’s done in one experiment can be replicated in another—thus what’s discerned is not simply something within the head of the researcher.  “Not only is objective truth a presupposition of science (for most advocates of scientism), but its reality presupposes a certain understanding of truth, namely, the correspondence theory of truth” (#853).  Importantly, Moreland says:  “The conclusions of science cannot be stronger than their presuppositions.  There are many things that science presupposes.  But science itself cannot justify those presuppositions.  It needs philosophy to do that.  And therefore the philosophy of scientism—which is not itself science—ends up also being the enemy of science itself” (#962).

Inasmuch as science cannot advance any pretense of intellectual sovereignty, Moreland invites us to recognize the dignity and worth of other forms of knowing, including logic and math.  Both intellectual disciplines are known not by empirical processes but “by direct rational intuition or awareness, without appealing to sense experience to justify them” (#1096).  We simply know them a priori or at first sight—prima facie.  The natural sciences, however, are limited to a posteriori reasoning, determined by observation and calculation.  Thus mathematics and logic are not sciences!  They are, rather, important ways of thinking which are necessary for the sciences.  And then, perhaps most importantly, we know our own minds in ways inaccessible to empirical science.  My self-consciousness is as manifestly real as the earth and stars.  Knowing myself as I really am requires a non-material process, but it is absolutely essential to living as a human being.  To Moreland this is a critical issue, for:  “Simple introspection—combined with biblical, theological, and philosophical reflection—is the most rational and very best way to learn facts about the nonphysical nature of mental properties and mental/conscious states” (#1335).

Importantly, consciousness “does not fit or is not at home in a naturalistic physical worldview.  As naturalist philosopher Colin McGinn admits, consciousness is one of the most mystifying features of the cosmos,” bordering “on sheer magic because there seems to be no naturalistic explanation for it:  How can mere matter originate consciousness?  How did evolution convert the water of biological tissue into the wine of consciousness?  Consciousness seems like a radical novelty in the universe, not prefigured by the after-effects of the Big Bang; so how did it contrive to spring into being from what preceded it?  A good question indeed!” (#2051).  This leads Moreland to stress the importance of “substance dualism,” the position he takes regarding human nature as composed of both body and soul.  Consciousness is not, as many thinkers insist, merely an excretion of material activity within the brain.  It is, rather, a distinct property of an immaterial soul.

To persuasively refute scientism, Moreland holds, we need to reinstate the West’s traditional “first philosophy” and insist there are ways of knowing Reality apart from and superior to empirical science.  “The idea of first philosophy has been central to the discipline of philosophy since Plato, but with the advent of scientism in the mid-twentieth century (and the public’s general lack of exposure to philosophy in our educational system!), first philosophy has fallen into disfavor” (#1470).  Nevertheless, as Moreland shows in significant sections interacting with contemporary thinkers, a strong case can be made for both the “autonomy” and “authority” of philosophy.  There are, in fact, at least “five things” science cannot explain:  1) the origin of the universe; 2) the origin of the fundamental laws of nature; 3) the fine-tuning of the universe; 4) the origin of consciousness, and, 5) “the Existence of Moral, Rational, and Aesthetic Objective Laws and Intrinsically Valuable Properties.”  Examining each of these things elicits from Moreland many pages of skillful (and highly persuasive) argumentation.  He then makes helpful suggestions concerning the proper ways for Christians to embrace science without elevating it to scientism, with its methodologically naturalistic presuppositions.  “After carefully considering its claims,” Moreland concludes “that it is not science, that it undermines science, that it encourages people to misuse science, and that because it is so widely believed, it ends up hurting Christians who buy into its deceptive lies” (#3167).

Consider, for example, the claim of Stephen Hawking, in The Grand Design, “that quantum physics has made the need for a creator and designer superfluous.”  Hawking thinks “the universe can ‘create itself,’ that is, it came into existence out of nothing.”  Though he was a first-rate scientist, Hawking was a poor philosopher!  Rightly understood, “nothing” means precisely that—no-thing.  And something cannot, logically, come from nothing.  In fact, he assumed there is an eternal “quantum vacuum, which contains energy and is itself located in space.  The universe, according to them, comes into being spontaneously as a fluctuation of the energy in the vacuum.  This is hardly a case of the universe coming into being from nothing!” (#1740).

So too “origin of life” researchers frequently fail to think rightly and are notoriously unable to even define the term.  Thus Antonio Lazcano admits:  “‘Life is like music; you can describe it but not define it’” (#1758).  Indicative of its mystery, there are some 100 definitions of “life”—all suggesting it’s non-material in important ways.  “Interestingly, many philosophers have provided new evidence for this argument by claiming, following biologists, that living things are constituted by information.  But apart from a few exceptions, many, perhaps most philosophers that work in this area have claimed that information is immaterial, more fundamental to reality than matter, and, given its nature, there can be no material explanation for the origin of (immaterial) information and, thus, for the origin of life” (#1778).

A lengthy endorsement of the book by Jeffrey Schwartz merits repeating:  “Scientism and Secularism should be mandatory reading for serious Christians who want to intelligently engage in the interface of philosophy and science.  Moreland elegantly guides the reader through concepts typically reserved for serious analytic philosophers and academics.  In doing so, he provides a desperately needed and highly accessible treatment of elite-level arguments that both seasoned philosophy veterans and enthusiastic amateurs will enjoy.  Moreland thus demonstrates a rare ability to distill complicated and abstract philosophical concepts into a framework for everybody to understand.” 

# # # 

312 Contempt, Compromised, Hoax

In the 1980s Kenneth Starr was one of the legal luminaries circulating within the higher echelons of the federal government—appointed to the D.C. Circuit Court of Appeals by Ronald Reagan and then named Solicitor General by George H.W. Bush.  He was seriously considered for the Supreme Court seat vacated by William Brennan but was passed over when David Souter appeared to be a less controversial candidate.  “Justice Souter was even heard to say, privately, ‘I have the Ken Starr seat’” (p. 307).  When Robert Fiske (the first special prosecutor appointed to investigate Bill and Hillary Clinton’s activities in Arkansas) resigned, Starr was named his replacement, since he was widely acclaimed as a fair, eminently qualified lawyer.  Looking back at his prosecutorial efforts in the 90s, Starr has written Contempt: A Memoir of the Clinton Investigation (New York:  Penguin Publishing Group, Kindle Edition, c. 2018).

In sum, he tells this “story:  Twenty years ago, after a four-year investigation resulting in fourteen criminal convictions in Arkansas and leading to the resignation of the sitting governor of the state, the Whitewater investigation took a bizarre twist.  It was revealed that in 1995 President Bill Clinton had begun an extended Oval Office affair with a twenty-two-year-old White House intern, Monica Lewinsky, then tried to cover it up.  In the fallout from the president’s misdeeds, the nation went through wrenching political turmoil.  Much of the drama was tragically unnecessary, a self-inflicted wound by a talented but deeply flawed president who believed he was above the law.  In the long and painful saga, he showed contempt not only for the law, but for the American people, whom he willfully misled for his political self-preservation.  He also demonstrated a shockingly callous contempt for the women he had used for his pleasure” (p. xi-xii).  Ultimately, Starr thinks:  “By the end of this book, my personal account of the legacy of Bill and Hillary Clinton—a legacy of contempt—I believe most reasonable, open-minded people will agree with me.  Or at least they should agree with my basic proposition:  that President Clinton and the First Lady knowingly embarked on a continuing course of action that was contemptuous of our revered system of justice” (pp. xii-xiv). 

To provide suitable context for his account, Starr shares a bit of his own story.  He was born in Texas and reared in a pastor’s home (his father ministering in the Churches of Christ denomination).  Thereafter, though moving away from his father’s denomination, he says:  “Faith proved to be a pillar of strength in my daily life” (p. 24).  Ever a sterling student, he earned degrees from George Washington University and Brown before taking his law degree at Duke.  Entering the legal profession he found his true life’s calling and fully enjoyed both practicing law and serving as a judge.  Then, much to his sorrow, he was persuaded to accept the position of special prosecutor and investigate the Clintons’ Whitewater adventures.  Almost immediately the president’s political operatives (e.g. James Carville, Lanny Davis, and Sidney Blumenthal) swung into action, portraying him as a “right-wing hit man” (p. 40).  Starr thinks they mainly implemented the strategies of Hillary, the more sinister of the Clintons, for she had been “profoundly influenced by the radical Saul Alinsky, whose ‘rules for radicals’ included tips for budding community activists such as:  ‘Keep the pressure on, never let up,’ ‘Ridicule is man’s most potent weapon,’ ‘Go after people and not institutions,’ and ‘Pick the target, freeze it, personalize it, and polarize it.’  She’d written her ninety-two-page senior honors dissertation on Alinsky, whom she quoted as saying that gaining and holding on to power ‘is the very essence of life, the dynamo of life’” (p. 64).  Embracing Alinsky’s tactics, the Clintons left a trail of “wrongdoing” which “could have been avoided if they’d followed the Golden Rule instead of Alinsky’s rules for radicals” (p. 65).

Presiding over a team of FBI agents and Department of Justice lawyers, Starr collected evidence from witnesses such as Judge David Hale, who repeatedly directed them to various of the Clintons’ shady deals.  “Hale became the epicenter of the Arkansas investigation.  Through his testimony, the mysteries of Whitewater and other financial crimes were illuminated.  If Judge Hale was right, Bill Clinton was a potential felon, assisted by Hillary” (p. 62).  Others in the Clinton entourage—e.g. Webb Hubbell, Jim and Susan McDougal—were interviewed and investigated and convicted of assorted crimes.  But the team’s efforts were impeded at every turn by the mysterious disappearance of important documents, and investigators were constantly frustrated by the Clintons’ disdain for law.  In fact:  “Engaging with the White House was like walking in molasses” (p. 86).  Or, to shift the metaphor, “Talking to Clinton,” Starr found, “was like nailing spaghetti to the wall” (p. 239).

One of Starr’s best lawyers, after taking a deposition from the president in 1995, said he “knew the president ‘was a lying dog’” who had probably committed perjury (p. 98).  While watching a film of Clinton’s deposition, his old friend and business partner Jim McDougal lamented seeing “‘the president of the United States commit perjury,’” and doing so in the White House Map Room.  “The Map Room, to Jim McDougal, was hallowed ground because of his admiration for FDR.  But that sacred soil, so to speak, had been polluted by the self-interested perjury of his hero’s successor.  Despite his own crimes, Jim was morally outraged by the lies under oath of the Man from Hope” (p. 131).  At her deposition, Hillary’s “responses were so glib, so superficial, they were almost ‘in your face,’ alternating on the theme of profound memory loss. In the space of three hours, she claimed, by our count, over a hundred times that she ‘did not recall’ or ‘did not remember’” (p. 100).  Starr and his team “were of one accord that Hillary was a liar” (p. 203).

Starr’s team focused its attention on the Clintons’ financial activities in Arkansas and  “never pursued any case of sexual wrongdoing against Clinton” (p. 157).  In due time, however, the accusations of Paula Jones and Monica Lewinsky intruded into the investigation because of illegal maneuvers Bill Clinton made trying to deny them.  He refused to accept offered “mediation to resolve the case” brought by Paula Jones and “chose a foolhardy course.  He believed he could lie his way out of it” (p. 185).   “Clinton knew what he had done.  He had lied under oath in his deposition.”  Determined to stay in office, he followed “a multifaceted strategy:  First, take care of or at least neutralize Monica, much in the way the White House had taken care of Hubbell.  Second, stonewall the investigation while purporting to cooperate.  Third, send out surrogates to aggressively attack Starr and his team—and to trash Monica” (p. 195).  That strategy, aided by the media, succeeded magnificently, and the American people rallied to Clinton’s defense. 

So Starr ultimately crafted his “referral” and presented it to the House of Representatives, which duly impeached Clinton.  He and his team clearly identified “counts of impeachable offenses” the president had committed:  Clinton “had committed perjury, tampered with witnesses, and obstructed justice in many ways” (p. 247).  But the Clintonistas effectively massaged the media to make Starr the real “bad guy,” and the president prevailed in the court of public opinion.  To explain and justify his work to the American people, Starr assented to an interview with ABC’s Diane Sawyer.  “Jettisoning her usual Kentucky charm, Sawyer immediately went on the offensive.  She lambasted me for producing ‘demented pornography, pornography for puritans.’  On and on.  When she asked me about the tone of the referral, I was matter-of-fact:  ‘Diane, don’t fault career prosecutors for telling the truth’” (p. 278).  But neither Sawyer nor the public cared much for the truth.  They were, instead, determined to discredit Starr!  “Literally for years, my personal integrity and professionalism had been subject to a well-organized, relentless campaign of character assassination” (p. 300).

Over the years I’ve distrusted few politicians more than Bill and Hillary Clinton.  My suspicions stand confirmed by Starr’s Contempt.  Though he fully recognizes their political dexterity, he concludes:  “Tragically, their legacy, despite their accomplishments, despite their talents, is, above all, contempt: contempt for the rule of law that binds us together as citizens, and contempt for human beings—especially women—as inherently worthy of dignity and respect” (p. 306).

* * * * * * * * * * * * * * * * * * * * * * *

Healthy republics require the “rule of law.”  Lex Rex (law is king) must prevail.  To do so, law enforcement must be trustworthy and transparent.  Thus Seamus Bruner’s Compromised: How Money and Politics Drive FBI Corruption (New York:  Bombardier Books, Kindle Edition, c. 2018) should concern us.  The book examines the FBI’s role in a “story of corruption [which] (like so many others) begins with Hillary Clinton” (#164).  It reveals FBI officials involved in “the misdeeds committed during the 2016 election,” including “criminal allegations of lying under oath, obstruction, leaking classified material, coordination with foreign powers, and coordination with the media.”  But it all “began as a complex smokescreen apparently orchestrated by the Clinton team to undermine opponent Trump and obfuscate allegiances.”  Involved in the operation were:  President Barack Obama; his Attorney General Loretta Lynch; and James Comey, the FBI director appointed by Obama.  Working under Lynch in the Department of Justice were Sally Yates, Rod Rosenstein, and Bruce Ohr.  Having failed to block Trump’s election, they worked to undermine his presidency.  To do so they helped orchestrate Robert Mueller’s appointment as a special counsel to investigate Trump’s Russian ties, and he “picked a team full of criminal prosecutors, many of whom are Clinton loyalists and Democrat donors who seem hostile towards Trump” (#450).  In Bruner’s searing judgment, Richard Nixon’s notorious Watergate scandal was “fairly tame compared with the FBI’s actions in 2016” (#2256).

The bad actors in his story had both financial and political reasons for their behavior.  Mueller and Comey, for example, have shrewdly moved in and out of government, working briefly for high-powered law firms or corporations that pay them millions of dollars.  They have “worked as a tag team for twenty years, drifting between FBI and DOJ leadership positions before cashing in on their valuable intel and experience” (#2163).  Comey was thus paid $6 million in one year by Lockheed—probably for his contacts within government rather than any stellar legal expertise!  In 2003 Comey was worth $206,000, “according to documents filed with Congress.”  Two years later “he left the DOJ to join Lockheed as general counsel and senior vice president and moved to Bridgewater Associates in 2010.  When Obama appointed him FBI director in 2013, Comey had amassed well over $10 million in compensation from just two sources:  Lockheed and Bridgewater Associates” (#2452). 

“The FBI and the DOJ have long been lucrative stops in the revolving door between the public and private sectors in D.C.  This intersection of money and politics at the top of the FBI and the DOJ is concerning” (#2149).  Comey and Mueller both “fancy themselves ‘Boy Scouts.’”  But they and their associates “became rich passing back and forth through the revolving door,” though Bruner could not demonstrably “link their huge compensation to direct official action.”  They did, however, receive  inordinate retainers, which are “upfront and ongoing compensation paid to attorneys so that when their services are needed, they will be on call.  This same model, applied to government employees, might explain the massive sums that individuals such as Clinton, Holder, and Comey received.”  Importantly:  “It is not even illegal” (#3417).  Both legal and lucrative!  How sweet it is!   

To investigate President Trump, Mueller employed a dossier compiled by Fusion GPS, an opposition research firm which specializes in digging up dirt on Republican politicians.  The firm received an estimated $12 million for producing the document.  “Some of it went to Christopher Steele, the retired MI6 agent who assembled much of the dossier.  Some of it went to Nellie Ohr, the wife of a top DOJ official.  Some of it went to journalists who promoted the salacious findings.  And some of it allegedly even went to the dossier’s sources, which included Russian officials” (#608).  In short:  Fusion GPS created the Trump dossier and Democrats paid for it.  On the other hand, though considerable evidence exists suggesting a Hillary Clinton-Russia connection, Mueller refused to investigate the Democrat candidate.  “Mueller’s special counsel mandate . . . does not differentiate between Russian interference with the Trump campaign and Russian interference with the Clinton campaign.  The absence of any charges implicating the Clinton-connected Russian agents above should prove once and for all that the Mueller investigation is a political cover-up” (#2065).

The FBI claimed the famous Steele “dossier” justified spying on Trump’s advisors and got a FISA judge to authorize targeting “a sitting U.S. president, which may be an unprecedented abuse of power by the bureau” (#2107).  Indeed:  “Mueller’s team seemingly has one mission—to take down Trump” (#2107).  To read Bruner’s Compromised is to have one’s faith in the federal government seriously compromised!  If law enforcement officials seek personal goods rather than the public welfare, the fabric of our society cannot but fray.

* * * * * * * * * * * * * * * * * * * * * * *

In 1774, on the eve of the American Revolution, John Adams envisioned a “government of laws, and not of men.”  Similarly, Supreme Court Justice Louis Brandeis said:  “if the government becomes a lawbreaker, it breeds contempt for the law; it invites every man to become a law unto himself; it invites anarchy.”  In The Russia Hoax:  The Illicit Scheme to Clear Hillary Clinton and Frame Donald Trump (New York:  HarperCollins, c. 2018), Gregg Jarrett says:  “In truth, this book is a defense of the rule of law” (p. 281).  To that end he seeks to show how the contempt for law evident throughout the careers of Bill and Hillary Clinton persists.  He tells “a story of corruption.  It begins, as it must, with Hillary Clinton” (p. 1).  In the midst of the 2016 presidential campaign, evidence came to light revealing that Hillary Clinton had knowingly flouted important laws as a federal employee.  Indeed, her “egregious breach of rules, regulations, and laws jeopardized national security” (p. 6).

Assigned to investigate her case, FBI Director James Comey maneuvered to exculpate Hillary Clinton “from the sundry crimes she appeared to have committed by storing copious classified documents on her unauthorized private computer system at the Clinton homestead.  Despite a subpoena insisting to preserve her records, tens of thousands of government documents were deleted, her server wiped clean, and numerous devices destroyed” (#52).  But President Obama defended her and Comey penned an “exoneration statement” for her behavior long before his agents interviewed important witnesses, including Hillary Clinton.  “Danny Coulson, who served as deputy assistant director of the FBI during his three decades at the bureau,” lamented:  “‘Comey controlled it from start to finish and came out with the results he wanted’” (p. 24).  “Former assistant director of the FBI Steve Pomerantz is convinced Clinton knew she was breaking the law, but didn’t care:  ‘It is consistent with everything I know about the Clintons.  They make their own rules, and it’s wrong.  Hillary Clinton engaged in conduct that was dangerous to the national security of the United States.  And, of course, lying about it only compounds the problem.  The Clintons have a history of lying.  That’s what they do.  First they commit the offense, then they lie about it.  That’s what they do’” (p. 12).  

Jarrett suspects Comey protected Hillary Clinton because he felt pressure emanating from the Department of Justice headed by Attorney General Loretta Lynch.  “We are expected to believe it was a coincidence that former President Bill Clinton just happened to be on the tarmac of Sky Harbor International Airport in Phoenix, Arizona, at exactly the same time as Attorney General Loretta Lynch on June 27, 2016, a scant five days before Hillary Clinton was to meet with FBI officials for questioning about her suspected wrongdoing.  Perhaps it was also just a coincidence that eight days after the furtive tarmac meeting the decision was announced that criminal charges against Clinton would not be filed” (p. 38).  Such convenient “coincidences” rather routinely speckle the Clintons’ records! 

Hillary Clinton obviously broke the law because she had things to hide!  Those things are amply evident in a chapter Jarrett titles “Clinton Greed and ‘Uranium One’.”  Upon leaving the White House in 2001, the former president and first lady became enormously wealthy, raking in some $230 million before taxes.   Shrouding the sources of this income doubtlessly explains Hillary’s “determination to keep her State Department emails forever hidden from public view” (p. 66).  Tellingly, much of their wealth came from “Bill’s lucrative speaking engagements, especially those abroad, [which] accelerated during the four years his wife presided over the state department.  Two-thirds of his fees came from foreign sources.  It is no surprise that many of the foreign entities who were shelling out substantial dollars to Bill were the very people and governments who were angling for favorable actions or decisions by Hillary” (p. 67).  Then there was the Clinton Foundation, purportedly established to do charitable work around the world.  Contributors surely envisioned enjoying special access to the Clintons, and the foundation quickly raised more than a billion dollars.  “The charity also became a cash conduit, helping Bill collect millions of dollars as he leveraged the foundation to secure his lucrative personal speaking engagements” (p. 68). 

The Clintons’ modus operandi is nicely illustrated in the “Uranium One” deal.  In 2005 Bill Clinton and his friend Frank Giustra, a Canadian businessman interested in buying Kazakh mines, went to Kazakhstan.  Clinton facilitated a deal with President Nursultan A. Nazarbayev, whom he fulsomely praised in a press conference.  “Days later Giustra got his lucrative uranium mines.  Soon thereafter the Clinton Foundation received a $31.3 million donation from Giustra, followed by a pledge to give $100 million more.  The deal also provided Bill with incredibly profitable speechmaking fees.”  In due time, following a merger, “Giustra’s company became a uranium giant called Uranium One.  According to the president of the government agency that runs Kazakhstan’s uranium industry, Hillary Clinton pressured his government to approve the merger.  Clinton herself, who then sat on the powerful Senate Armed Services Committee, had allegedly threatened to withhold U.S. aid if the deal did not go through.  It should come as no surprise that it did” (p. 70).   In fact, “more than half the people outside the government who met with Clinton while she was secretary of state donated money to her foundation” (p. 80).  Tit for tat!  So it goes with the “Clinton Cash” machine!

Rather than pursue an investigation of Hillary Clinton, the FBI and Department of Justice launched an inquiry into Donald Trump’s “collusion” with Russia!  The document cited to justify the case was a “dossier” the DNC had paid for, seeking to damage Trump’s campaign.  “On its face, the ‘dossier’ was a preposterous collection of rumors, innuendos, supposition, and wild speculation” (p. 120).  Having thoroughly examined the evidence—detailing the maneuvers, identifying the participants—Jarrett concludes:  “There was never any real evidence of wrongdoing by the Republican nominee for president.  There was no reasonable suspicion or evidence sustaining probable cause that those in his campaign were collaborating with Russians to influence the 2016 election.  In its purest form, it was a hoax that was manufactured by unscrupulous high-ranking officials within the FBI and the Department of Justice.  Their motives were impure, animated by antipathy for Trump. They were determined to tip the scales of justice and, in the process, undermine electoral democracy” (pp. 87-88). 

Jarrett has done extensive, meticulous research, evident in his many citations, his careful concern for details, and his competence as a lawyer fully conversant with the legal system.  To understand the tumultuous beginnings of Donald Trump’s presidency, The Russia Hoax is most enlightening. 

311 Who Are We?

During the American Revolution, in his celebrated Letters from an American Farmer, J. Hector St. John de Crèvecoeur asked:  “What, then, is the American, this new man?”  It was an apt question for America’s Founders, as it is for us today, for we, unlike the Greeks or Germans, do not derive our sense of national identity from our ethnic roots.  To Crèvecoeur, Americans were Europeans transformed by their new land—the “great American asylum” provided by abundant, fertile soil—where they could become free, self-employed, successful farmers.  They experienced a “great metamorphosis” which made them truly “new” human beings.  “Everything has tended to regenerate them,” he said:  “new laws, a new mode of living, a new social system.  Here they are become men:  in Europe they were as so many useless plants.”  In short, he asserted:  “it is here, then, that the idle may be employed, the useless become useful, and the poor become rich.”

During the next two centuries, the United States would continue to welcome immigrants from Europe who generally sustained the vision of the nation’s Founders, and Americans generally shared a core commitment to the “land where my fathers died, land of the Pilgrims’ pride.”  Within the past half-century, however, that enduring sense of identity has been challenged and is possibly collapsing.  Among the many legislative acts spawned by Lyndon B. Johnson’s “Great Society” was the Immigration Act of 1965, primarily crafted by Senator Ted Kennedy.  Discarding the prior preferences given Caucasian immigrants from European nations, the act opened the nation’s borders to Third World peoples who were likely to enroll in the welfare state’s programs and thus support the Democratic Party.  Kennedy and his progressive allies deftly celebrated the virtues of “diversity” and its prospects of strengthening the nation; so decades before Barack Obama promised to “fundamentally change America” one of the main vehicles for such change had been firmly established by his ideological forbears.

To provide a scholarly assessment of this change, the late Samuel P. Huntington wrote Who Are We?  The Challenges to America’s National Identity (New York:  Simon & Schuster, c. 2004).  Huntington was a professor at Harvard for 50 years, and his earlier work on The Clash of Civilizations and the Remaking of World Order was distinguished for taking seriously the religious nature of the Christian-Muslim conflict long before the 2001 terrorist attacks on the United States.  He argued that the great conflicts in the 21st century will take place for cultural—rather than economic or ideological—reasons, and we must recognize that we now live in a “multipolar, multicivilizational world.”  Concerned for the survival of the West, he insisted we must recover its moral fiber.  Thus antisocial behavior, family fragmentation, disinterest in local associations, the loss of a strong work ethic, and the distressing decline of intellectual excellence, must be reversed if the West is to survive.  “The future health of the West and its influence on other societies depends in considerable measure on its success in coping with these trends.”  Civilizations, history records, are difficult to construct but easy to destroy. 

Since nations can be quickly destroyed, we Americans must deal wisely with the threat of massive immigration.  In fact we have a unique national culture well-described by John Jay in The Federalist Papers:  “Providence has been pleased to give this one connected country to one united people—a people descended from the same ancestors, speaking the same language, professing the same religion, attached to the same principles of government, very similar in their manners and customs, and who, by their joint counsels, arms, and efforts, fighting side by side throughout a long and bloody war, have nobly established liberty and independence.”  Huntington basically revisits and updates Jay’s list of national characteristics, stressing they are precisely what we need today and urging us to “recommit” ourselves to “the Anglo-Protestant culture, traditions, and values that for three and a half centuries have been embraced by Americans of all races, ethnicities, and religions that have been the source of their liberty, unity, power, prosperity, and moral leadership as a force for good in the world” (p. xvii).

Huntington believed that, as a result of developments since the 1965 Immigration Act, the United States is “less a nation than it had been for a century” (p. 5).  Revealing this is the contrast between the poem Robert Frost wrote for John F. Kennedy’s inauguration in 1961 (celebrating the “‘heroic deeds’ of America’s founding with God’s ‘approval’”) and Maya Angelou’s recitation at Bill Clinton’s 1993 inauguration (mentioning 27 racial and ethnic groups without saying the word “America”)!  To Huntington:  “Frost saw America’s history and identity as glories to be celebrated and perpetuated.  Angelou saw the manifestations of American identity as evil threats to the well-being and real identities of people with their subnational groups” (p. 6).  Clinton himself, of course, sided with Angelou rather than Frost, celebrating multiculturalism and diversity and heralding a “‘great revolution to prove that we literally can live without having a dominant European culture’” (p. 18).

But Clinton’s “great” multicultural “revolution” seriously threatens to disunite us, for a nation requires an identifying culture—not a collage of many cultures.  Unfortunately, folks like Clinton and Angelou misunderstand what actually makes America a nation.  They probably do so because they accept “two propositions that are true but only partially true and yet often are accepted as the whole truth.  These are the claims, first, that America is a nation of immigrants, and second, that American identity is defined solely by a set of political principles, the American Creed” (p. 37).  To refute the first of these propositions Huntington says the Europeans coming to colonial America were “settlers” who made a society, not “immigrants” who entered into an already-existent society seeking to benefit from it.  Thus in the 17th and 18th centuries European settlers created a homogeneous “Anglo-Protestant settler society” that “profoundly and lastingly shaped American culture, institutions, historical development, and identity” (p. 39).  The second proposition—that America is constituted by a “Creed”—is another half-truth.  Before the American Revolution, colonists identified themselves in terms of ethnicity and culture, and especially in terms of religion, and though ideals such as liberty and equality were duly celebrated following the Revolution, the people continued to identify themselves in terms of culture and religion.  Indeed, the “American Creed” modern liberals celebrate is basically “Protestantism without God, the secular credo of the ‘nation with the soul of a church’” (p. 69).

To Huntington the “cultural core” of the United States was Anglo-Protestant, for “Americans have been extremely religious and overwhelmingly Christian throughout their history” (p. 83).  English speaking Protestants established both the 17th century colonies and the 18th century nation.  This was especially evident during the First Great Awakening, when “for the first time” people from all the colonies shared a “social, emotional, and religious experience.  It was a truly American movement and promoted a sense of transcolony consciousness, ideas, and themes, which were subsequently transferred from a religious to a political context” (p. 109).  Sustaining the dissenting tradition of the Puritans, a “‘Dissidence of dissent’ describes the history as well as the character of American Protestantism” (p. 65).  So too the vaunted American “individualism” and “work ethic” stem directly from the dissenting Protestant tradition. 

Since the ‘60s, however, the Anglo-Protestant culture in America has been seriously challenged by significant innovations, beginning with the promotion of a “multiculturalism” which is in “essence anti-European,” denigrating Eurocentric values and opposing “‘narrow Eurocentric concepts of American democratic principles, culture, and identity.’  It is basically an anti-Western ideology” (p. 171).  Such was recently promoted by Fr. Arturo Sosa, the Venezuelan now serving as Superior General of the Jesuits, who called the Catholic Church to “show the multicultural face of the God who revealed himself in Nazareth,” promote “universal citizenship” and ultimately “build a multicultural world.”  Multiculturalism now dominates the nation’s schools, so high school students learn more about Harriet Tubman than George Washington.   Stanford University now requires courses on minorities and women, but not on Western Civilization.  And at the beginning of the 21st century “none of the fifty top American colleges and universities required a course in American history” (p. 175). 

Another major challenge we now face is assimilating the 23 million immigrants who have come to the country during the past half-century.  Contrary to the half-truth promoted by “open borders” devotees, immigrants to America a century ago were hardly the “wretched refuse” of the earth.  In fact, as Daniel Patrick Moynihan asserted, most of them were “‘extraordinary, enterprising, and self-sufficient folk who knew exactly what they were doing, and [were] doing it quite on their own’” (p. 189).  Thus the Irish and Italians quickly assimilated and embraced the cultural core of their new country.  They came to America wanting to become Americans.  Recent immigrants, however, frequently seek to preserve their own culture by retaining their own languages and seeking dual-citizenship status.  And if they do become citizens of the U.S. it is “not because they are attracted to America’s culture and Creed, but because they are attracted by government social welfare and affirmative action programs” (p. 219).  Of especial concern to Huntington is the unprecedented Mexican immigration and expansive Hispanization undertaken by folks who frequently think they are reclaiming lands lost in the 19th century.

A final challenge to American identity is a “denationalization” process characterizing influential academic, business, and political elites—fully evident in Barack Obama’s expansive claim to be a citizen of the world.  So too the elite executives of Apple and Walmart and Amazon have few national loyalties.  This globalization of business, Huntington says, “is proving right Adam Smith’s observation that while ‘the proprietor of land is necessarily a citizen of the particular country in which his estate lies . . . the proprietor of stock is properly a citizen of the world and is not necessarily attached to any particular country’” (p. 267).

Having diagnosed the problems, Huntington devotes his final chapters to prescribing solutions that might restore the American identity, primarily by promoting commonalities such as the English language and the Christian religion.  In truth, one lays down the book persuaded that we Americans no longer know who we are and have no clue as to how to regain a sturdy sense of national identity!

* * * * * * * * * * * * * * * * * * * * * *

Victor Davis Hanson was for many years a professor (teaching classics at California State University, Fresno), but he wrote Mexifornia:  A State of Becoming (San Francisco:  Encounter Books, c. 2003) as a life-long farmer tending his family property south of Fresno, still living in a house built by his great-great-grandmother 130 years earlier.  As a child he was part of “a very tiny minority of rural whites at predominantly Mexican-American” schools, and he intimately knows the nuances of the blended peoples and cultures surrounding him.  Setting forth a deeply personal analysis of trends transforming the Golden State, he writes “about the nature of a new California and what it means for America—a reflection upon the strange society that is emerging as the result of a demographic and cultural revolution like no other in our times” (p. xii).  And he provides “the perspective of a farmer whose social world has changed so radically, so quickly that it no longer exists,” a change that comes “entirely because of massive and mostly illegal immigration from a single country:  Mexico” (pp. 1-2).

Mexican immigrants, unlike earlier European immigrants, uniquely challenge the United States because of Mexico’s geographic propinquity.  By virtue of crossing an ocean Irish or Armenian or Chinese immigrants severed themselves from the land of their birth.  But “for the campesino from Mexico there is little physical amputation from the mother country” (p. 21).  And while it is the “poorest and brownest, largely Indian” campesinos who cross the border, the wealthy elites controlling Mexico encourage their movement “northward as a means of avoiding domestic reform” (p. 27).  Once here the campesinos find work eminently suited for young, physically fit men—but work utterly impossible for them a few decades later.  As they age, they most likely turn from appreciating the country enabling them to prosper to resenting their niche in American society.  And even more deeply their children grow up feeling angry and alienated—despite the fact that they are infinitely more prosperous than their relatives still in Mexico.  “If we wonder why the hardest-working alien in California sires sons who will not do the same kind of labor, who have tattoos, shaved heads and prison records rather than diplomas, we need look no further than the bitterness of the exhausted, poor and discounted father” (p. 54).

Hanson has personally witnessed the problems plaguing his community and affecting long-term residents such as himself.  Thieves repeatedly steal his equipment and crops while vandals damage his fields.  The culprits, of course, have no documents and cannot be easily prosecuted even if apprehended.  He can no longer put outgoing mail in his mailbox, and parcels left by the mailman are frequently stolen.  Indeed, “keeping illegal aliens and Mexican gang members off the property is a hopeless task:  in the banter that follows my requests, some trespassers seem piqued that anyone in California should dare to insist on the archaic notion of property rights.  One especially smart teenager told me in broken English, ‘Hey, it’s our land anyway—not yours’” (p. 64).

Consequently, Hanson looks back to the world of his youth, praising “the old simplicity that worked.”  Then the churches (both Catholic and Protestant) promoted personal morality and respect for authority.  The schools inculcated both traditional academics and patriotic citizenship.  Assimilation was mandated through compulsory English in the schools and legal traditions sustained by the courts.  The assumption was simple:  immigrants, wherever they came from, were “here to stay and become an American . . . .  He was to become one of us, not we one of him” (p. 79).  The superiority of America was eminently evident in the fact that immigrants left their native lands and chose to settle here.  “The unvoiced assumption—a formulation of classic know-nothingism—resonated with us:  If it is really so good over there, why don’t you go back?  Was this an exercise in American exceptionalism?  Absolutely” (p. 84).  However politically incorrect it may seem, it worked rather well before 1970, as was evident in the “well-integrated middle-age and middle-class residents of Selma” (p. 120).

Since 1970, however, the assimilation of immigrants from Mexico has largely failed.   Thus whereas the elementary school Hanson attended 40 years ago “turned out skilled and confident Americans, its graduates who enter high school now have among the lowest literacy levels and the most dismal math skills in the state” (p. 123).  In large part this is due to the “multiculturalism, authoritarian utopianism and cultural relativism” that now dominate public institutions, especially the schools.  If there’s any hope for a better future, all such “isms” must be repudiated.  Conversely the very worst “course lies in preserving the status quo and institutionalizing our past failed policies: open borders, unlimited immigration, dependence on cheap and illegal labor, obsequious deference to Mexico City, erosion of legal statutes, multiculturalism in our schools, and a general breakdown in the old assimilationist model” (p. 144). 

* * * * * * * * * * * * * * * *

Providing a current, journalistic assessment of immigration in Melting Pot or Civil War?:  A Son of Immigrants Makes the Case Against Open Borders (New York:  Penguin Publishing Group, c. 2018, Kindle Edition), Reihan Salam writes as the son, “brother, neighbor, and friend of immigrants” who believes we need “a more thoughtful and balanced approach to immigration, including a greater emphasis on skills and a lesser one on extended family ties” (p. 8).  His parents came from Bangladesh to New York, where he was born, part of a tiny Bengali-speaking minority.  “Unlike my parents, who have had to deal with a lot of discrimination over the years, I have been untouched by it” (p. 67).  He plunged into the “melting pot” existing 40 years ago and famously succeeded.  During his lifetime, however, “the number of Bangladeshi-born immigrants in the New York area rose from roughly one thousand to more than seventy thousand” (p. 75).  Unlike Salam,  all too many of these newcomers choose to separate from, rather than assimilate to, the American culture. 

Had Salam been born more recently, he “would not have been the only kid of Bangladeshi origin in my kindergarten.  Rather, my family would’ve been part of an established ethnic community, complete with robust religious and cultural institutions. The presence of tens of thousands of other Bangladeshi immigrants would have changed my parents’ professional lives, too. They might have entered professional niches dominated by their coethnics, and their fellow Bangladeshis would have provided them with a Bengali-speaking customer base.  At the same time, my family would have had fewer interactions with people outside of our ethnic community, and it’s far less likely that I’d have had as many friends from different backgrounds.”  In fact:  “Earlier arrivals have little choice but to make their way in the broader community, as there is no ethnic enclave for them to join.  Later arrivals, in contrast, have the option of joining, and thus replenishing, already-established ethnic enclaves” (pp. 75-76).  So today the vaunted melting pot barely simmers.  But “we need it back, badly” (p. 14).

Demographic data indicate that within a few years non-Hispanic whites will be a minority.  Immigrants have entered the nation in record numbers and are procreating, whereas “Native-born Americans are forming families later in life, if at all, and they’re having fewer children as a result.  America is thus in the middle of a birth dearth.  One consequence is that recent immigrants, with their comparatively healthy birthrates, are having an outsized impact on America’s younger generations.  One in four U.S. children under the age of eighteen has at least one foreign-born parent.  Unless native-born Americans start having many more babies, a prospect that for now seems rather remote, new immigrants and their descendants will account for almost 90 percent of all population growth between now and 2065” (pp. 32-33).  Unfortunately, most of these immigrants are poorly educated, low-income folks whose children will likely remain ill-educated and poverty-stricken.  Thus we need “to recognize an uncomfortable truth.  High levels of low-skill immigration will make a middle-class melting pot impossible” (p. 28).  These immigrants will tend to cling to ethnic or racial distinctives and live in segregated enclaves.

Various countries, ranging from Sweden to Singapore, have devised different ways of dealing with immigrants, who almost everywhere do the menial work disdained by their affluent hosts.  Pro-immigration advocates often urge an “open borders” policy without calculating the cost.  Anti-immigration spokesmen frequently fail to rightly value the contributions immigrants make or the need to help alleviate poverty and injustice around the world.  So how do we devise and implement the best policies for all concerned?  Salam suggests we first grant “amnesty to the long-resident unauthorized population” and then vigorously curtail all illegal immigration.  Second, we should adopt “a skills-based” system, stopping the influx of poorly prepared, impoverished newcomers.  Finally, we must begin “fighting the intergenerational transmission of poverty” so evident in the children and grandchildren of immigrants.  These endeavors, Salam thinks, “will, taken together, help make America a middle-class melting pot” (p. 157).

Melting Pot or Civil War? thoughtfully, dispassionately surveys the turbulence surrounding today’s immigration discussion.  As a pro-immigrant journalist, Salam refrains from reciting the litany of pablum regarding compassion and tolerance.  Instead he helps us better understand and think wisely about the issues confronting us.

310 Medieval Wisdom for Modern Christians

Throughout the past century a number of discerning thinkers have lamented the imminent demise of Western Civilization.  Thus when Jesse Jackson led Stanford University students in chanting “Hey, Hey, Ho, Ho, Western Civ has got to go,” he merely described a fait accompli long in the making.  One of the prescient critics of this cultural trajectory was C.S. Lewis, who in 1954 (after long being denied promotion at Oxford) accepted an appointment to a chair created for him at Cambridge University as Professor of Medieval and Renaissance Literature.  Many of us know Lewis as a Christian apologist, penning such classics as Mere Christianity, or as the author of The Chronicles of Narnia.  But he devoted much of his life to research, writing, and teaching, and one cannot understand his popular works and worldview without appreciating his deep immersion in the Medieval world.  Nor can one understand the Christian Faith he embraced without seeing its Medieval background.  Thus, concluding his inaugural lecture—“De Descriptione Temporum”—he acknowledged he was “becoming, in such halting fashion as I can, the spokesman of Old Western Culture,” and he would treasure his historian’s role, for doing so “does indeed liberate us from the present, from the idols of our own market-place.  But I think it liberates us from the past too.  I think no class of men are less enslaved to the past than historians.  The unhistorical are usually, without knowing it, enslaved to a fairly recent past.”

Lewis then pointed out the great gap separating Cambridge undergrads from the Old Western Culture he represented.  “Wide as the chasm is,” however, “those who are native to different sides of it can still meet, are meeting in this room.”  He confessed to belonging “far more to that Old Western Order than to yours.”  Indeed, he rather resembled a dinosaur or a Neanderthaler!  Yet if one were interested in either species—and if one of them would mysteriously appear and could be tested or even talk—then, “should we not almost certainly learn from him some things about him which the best modern anthropologist could never have told us?  He would tell us without knowing he was telling.  One thing I know:  I would give a great deal to hear any ancient Athenian, even a stupid one, talking about Greek tragedy.  He would know in his bones so much that we seek in vain.  At any moment some chance phrase might, unknown to him, show us where modern scholarship had been on the wrong track for years.  Ladies and gentlemen, I stand before you somewhat as that Athenian might stand.  I read as a native texts what you must read as foreigners.”  Yet because he could speak as a native, he might “yet be useful as a specimen.  I would even dare to go further.  Speaking not only for myself but for all other Old Western men whom you may meet, I would say, use your specimens while you can. There are not going to be many more dinosaurs.”  Speaking thus, C.S. Lewis clearly found much worth heeding in the Medieval World.

That position Lewis made clear in his first scholarly treatise, The Allegory of Love:  A Study in Medieval Tradition (published in 1936)—demonstrating, his biographer George Sayer says, that he “was a great literary critic” who was “without exception, highly praised by reviewers.”  He was subsequently asked to write a volume for The Oxford History of English Literature.  It took him a dozen years to research and write, but he had completed his English Literature in the Sixteenth Century Excluding Drama when he moved to Cambridge.  This is a dense work of scholarship, of interest mainly to literary scholars, revealing Lewis’s amazing mastery of the primary sources he discussed.  But his lengthy Introduction, “New Learning and New Ignorance,” detailed some of the reasons he found the Medieval World proffering perspectives worth recovering.  Certainly there was a “New Learning” evident in the 16th Century—preeminently the oft-celebrated turn to natural science.  But it was not a “new” turn to actually studying Nature, which had been widely done in the Middle Ages by men such as Roger Bacon and Albert the Great.  The “New Learning” was a philosophical turn from wondering at the majesty of Nature to controlling her!  Lewis especially stressed the “dreams of power which then haunted the European mind,” markedly evident in the work of Francis Bacon, who referred to her as “a spouse for fruit” rather than a “courtesan for pleasure.”  The “New Learning” was also distinguished by its humanistic, rather than scholastic, approach to learning.  Thus men such as Erasmus did not so much concern themselves with propositional logic as with literary style, making “eloquence the sole test of learning” (p. 30).  Indeed, at Oxford in 1550 “the works of the [Medieval] scholastics were ‘cast out of college libraries’” and “publicly burned, along with mathematical books, which were suspected of being ‘Popish or diabolical’” (pp. 30-31).

In their hatred of the Middle Ages the Humanists found allies in some English Puritans who adhered to the theology of John Calvin.  They also rejected both the Natural Law and the political philosophy espoused by Aristotle and Aquinas, preparing the soil for the Divine Right of Kings position so evident by the end of the century.  For Aquinas, kingly power “is never free and never originates.  Its business is to enforce something that is already there, something given in the divine reason or in the existing custom” (p. 48).  That view was rejected by William Tyndale, who insisted (in 1528) “that ‘The King is in this world without law and may at his own lust do right and wrong and shall give accounts to God only’” (p. 49).  Basic to the English Reformation, of course, was the autocratic exercise of power by Henry VIII and his daughter Elizabeth I.  In the next century Thomas Hobbes rationalized such autocracy in his Leviathan, a book totally at odds with the Ancient and Medieval Natural Law tradition, making “political power something inventive, creative.  Its seat is transferred from the reason which humbly and patiently discerns what is right to the will which decrees what shall be right.  And this means that we are already heading, via Rousseau, Hegel” and others to “the view that each society is totally free to create its own ‘ideology’ and that its members, receiving all their moral standards from it, can of course assert no moral claim against it” (p. 50).  It will be the deranged world powerfully depicted in Lewis’s dystopia, That Hideous Strength. 

Shortly before he died, Lewis collected his lectures on Medieval and Renaissance literature in a (posthumously published) text titled The Discarded Image (Cambridge:  Cambridge University Press, c. 1964).  In his lectures he tried to portray the Medieval Model of the Universe as a “supreme work of art,” rather like the soaring Gothic cathedrals at Chartres or Cologne.  He admitted to making “no serious effort to hide the fact that the old Model delights me as I believe it delighted our ancestors.  Few constructions of the imagination seem to me to have combined splendour, sobriety, and coherence in the same degree” (p. 216).  Obviously it had serious deficiencies, and there is no going back to that world.  But we may, if we rightly study, find in it wisdom for today.

* * * * * * * * * * * * * * * * * * * * * * * * * * * *

Recently, Chris R. Armstrong, in Medieval Wisdom for Modern Christians: Finding Authentic Faith in a Forgotten Age with C. S. Lewis (Grand Rapids:  Baker Publishing Group, c. 2016; Kindle Edition), takes seriously Lewis’s invitation to study the Medieval World.  Still more:  he shares G. K. Chesterton’s conviction that you could not “be a proper medievalist until you cared deeply enough about today to apply medieval insights to your own life and thinking.”  We moderns actually live in a “tiny windowless universe,” brilliantly described by “G. K. Chesterton’s definition of insanity:  ‘The clean and well-lit prison of one idea.’  Our modern room is well lit by the bare bulb of science.  But of what lies beyond, we see nothing” (p. 66).  To go beyond modernity’s prison requires recovering pre-modernity! 

Providing some personal information, Armstrong (a church historian with a Ph.D. from Duke who edited Christian History for several years and now teaches at Wheaton College) tells of coming to Christ in a “wonderful” charismatic church in Nova Scotia 30 years ago.  It “was one of those modern suburban megachurches with an auditorium-like sanctuary,” and on “Sunday mornings, I would walk in and feel the palpable presence of the all-powerful and all-loving Lord.”  Yet his faith seemed a bit “precarious,” resting “on a foundation made up of the words of our favorite Bible passages (our ‘canon within the canon’), the sermons of our pastors, and a roster of approved visiting evangelists.  There was no sense at all of the whole mystical, historical massiveness of a church that had been around for two thousand years, no sense that our foundation actually stretched down and back through time, resting on such giants in the faith as John Wesley, Martin Luther, Bernard of Clairvaux, and Ignatius of Antioch . . . .   I now see that my early sense of the insecurity of the church stemmed from what J. I. Packer identifies as evangelicalism’s ‘stunted ecclesiology,’ rooted in our alienation from our own past.  Without a healthy engagement with our past, including historical definitions of ‘church,’ we are being true neither to Scripture nor to our theological identity as church!” (p. 46). 

In truth, Armstrong was discovering how the Protestant Reformation had effectively discarded much of “Medieval Wisdom.”  This was, in part, due to the “super-spiritualizing tendency” early evident in “the thought of Ulrich Zwingli,” who tended to denigrate “the ‘outer,’ physical life.  Only the inner and spiritual was to be trusted, not only in worship and devotion but also in the ethic of daily life:  ‘The outer, whether it meant Church-as-institution, the sacrament or ascetic practices was automatically reduced to the role of being no more than an expression (always suspect and dangerous at that) of the inner, or else was condemned outright as materialistic and idolatrous.’”  To refute Zwingli et al., Armstrong wrote this treatise!  For he wants to lead us to “what I have found to be the wisest piece of medieval wisdom:  creation and incarnation are not rote doctrines to be learned, committed to memory, and ignored in our daily practice, but rather are practical linchpins of what it means to lead a good human life in the light of the gospel” (p. 28).  He further believes, in accord with Lewis, that “the scientific revolution and its sequels—such as the Enlightenment—began to sap the material world of its spiritual and moral significance, and that this diminishment has only continued and intensified through today” (p. 22).  

This diminishment was evident when 19th century American Evangelicals embraced the “immediatism” popularized by Phoebe Palmer’s The Way of Holiness.  “In it, she said about the traditional Methodist teaching of sanctification:  ‘Yes, brother, THERE IS A SHORTER WAY!’” (p. 7).  Subsequently, various preachers embraced her “optimistic creed,” declaring:  “No more would Christians have to pursue a fraught and painstaking path to holiness.”  Rather:  “By simply gathering their resolve, making a single act of consecration, and ‘standing on the promises’—certain Scripture texts that seem to hold out entire sanctification as an attainable reality—they can enjoy total freedom from sin.  This message galvanized a generation and set a tone for evangelicalism that continues to ring out today.  It may be fair to say that the teaching of a ‘shorter way to holiness,’ whether in Palmer’s more Wesleyan formulation or in the Reformed-influenced ‘higher life’ variations introduced later in the century, fueled the single most prominent and widespread movement among postbellum and Gilded Age evangelicals.  It swept across the nation’s West and South like a sanctified brushfire, birthed new denominations such as the Nazarenes and Christian & Missionary Alliance, fed the all-consuming fervor of temperance activism, and laid the groundwork for the Pentecostal movement of the following century” (p. 7).  

The roots of this message extended back to the emphasis on “heart religion” promoted by Puritan writers and illustrated in John Wesley’s Aldersgate experience, wherein he felt his “heart strangely warmed.”   It was celebrated in 19th century camp meetings.  Indeed, Palmer’s prescription of immediatism ties into the “syndrome of pressurized pragmatism, which Alasdair MacIntyre has identified as the chief cause of many American ills, militating as it does against careful reflection on accumulated wisdom.”  To Armstrong, Palmer’s formulation truncated the fullness of the Christian faith and fell short of the more demanding and authentic spirituality evident in the Medieval World.  Though a Methodist, Palmer unfortunately neglected some of Wesley’s repeated emphases, for he insisted “that those powerful moments of repentance and coming to faith are just the ‘porch’ or the ‘door’ into the Christian life.  The substance of the Christian life, which lasts as long as we live, is holiness.  Wesley has a favorite phrase to explain holiness.  He says it is ‘having the mind of Christ and walking as he walked.’  Achieving that steady character in ourselves requires, in the motto Eugene Peterson borrows from Friedrich Nietzsche (1844–1900), ‘a long obedience in the same direction’” (p. 222). 

To correct serious deficits in this tradition, Armstrong urges us to return to “the Middle Ages with Lewis’s guidance” and recover the fullness of the Christian Faith.  To do so we must challenge “‘immediatism’ in two ways.  First, we must return the authoritative interpretation of Scripture to the Church, removing it from purely individual reason and experience.  To desire to learn from the cloud of witnesses or ‘church triumphant’—those on whose shoulders we stand—is to shift authority back to the older style, weighting Scripture-read-through-tradition more heavily than the dictates of our own freely exercised reason and experience” (p. 11).  Second, we must return to liturgical worship services conducted by a priestly clergy.  As Lewis aged, he increasingly “turned to the early and medieval catholic traditions revived and preserved in high-church Anglicanism.”  He began going to confession and found “the experience was like a tonic to his soul.”  He came to love the “liturgy, the 1662 Prayer Book, the Daily Office, and praying through the Psalter each month.”  He came to believe the Eucharist is more than a mere memorial and “found himself able to ‘experience Real Presence in the Blessed Sacrament’” (p. 39).

Lewis was, obviously, immersed in and enamored of Tradition!  He loved the “old books” and urged people to read and re-read them.  He “wanted to stand in the gap of cataclysmic cultural loss, to bring ‘the tradition’ back to the people” (p. 56).  He particularly treasured the Medieval emphasis on the Natural Law, as is evident to anyone reading Mere Christianity or The Abolition of Man.  As Armstrong observes:  “Aristotle had assumed it, and Plato.  Cicero had spoken of it when he called it the law that is not written down.  When St. Paul wrote that even the Gentiles knew that certain kinds of behavior were wrong, he was appealing to natural law.  This same idea informed the thought of St. Augustine in the fourth century and St. Thomas in the thirteenth, and influenced Anglicanism at its origin through Richard Hooker’s [scholastically framed] Laws of Ecclesiastical Polity” (p. 51).  Indeed, in the last essay he wrote for publication (“We Have No ‘Right to Happiness’”), Lewis declared the Natural Law “to be basic to civilization.” 

One of the things Lewis loved about Medieval Christian Culture was its celebration of Reason and the life of the mind; he “could not [said his friend Owen Barfield] help trying to live by what he thought” (p. 73).  As Lewis noted in his spiritual autobiography, Surprised by Joy, his “conversion” was almost “purely philosophical” in nature.  Deeply read in Medieval theology, he understood its grandeur and drank deeply from masters such as Thomas Aquinas.  Armstrong argues “that in everything he wrote, whether nonfiction or fiction, Lewis wrote first of all as a Christian moral philosopher.  And I don’t think it’s too much of a stretch to add that he was a medieval Christian moral philosopher” (p. 98).  As he began his Christian journey, Armstrong learned that Luther and many Reformers had severed moral behavior and spiritual discipline from justification by faith alone.  “Luther taught ‘imputed righteousness’:  being covered by the blood of Christ, making up for our complete inability to be good.”  Subsequently “critics said this teaching led to ‘antinomianism,’ a fifty-dollar word for moral lawlessness.”  Four hundred years later Dietrich Bonhoeffer “identified in his Lutheran church this same suspicion of any Christian effort toward righteousness—he called it ‘cheap grace’” (p. 95).  Consequently a “conundrum” persists:  “how to train believers in moral good while also teaching a radical message of grace still plagues evangelical Protestantism.  Protestants have fallen so in love with the message of grace and have so spiritualized their faith that questions of morality—at least the morality of public, communal life—have receded from view.  As the late Dallas Willard described many modern believers, we are ‘not only saved by grace [but] paralyzed by it’” (p. 96).  Or, as Richard Lovelace says, there is a “sanctification gap” in evangelical ranks.  

Armstrong argues we need to recover the “sacramental spirituality” evident in 13th-century luminaries such as Francis of Assisi, Thomas Aquinas and Dante, who all saw the world as theomorphic, or God-shaped.  “Sacramentalism is the concept that the outward and visible can convey the inward and spiritual.  Physical matters and actions can become transparent vehicles of divine activity and presence.  In short, material things can be God’s love made visible” (p. 143).  Sacramentalists think “all creation is in some sense a reflection of the Creator,” for He is everywhere, always present in His world.  Still more, in beholding the beauty of creation, Lewis said:  “‘We do not want merely to see beauty . . . .  We want something else which can hardly be put into words—to be united with the beauty we see, to pass into it, to receive it into ourselves, to bathe in it, to become part of it’” (p. 145).  

On the basis of his examination of Medieval Wisdom in C.S. Lewis, Armstrong concludes there is a tall wall separating “modern Protestants” from Medieval sacramentalism.  We have, he thinks, lost much of our rightful heritage—the “incarnational faith” intrinsic to it.  “I’ve suggested,” he says, “one quite formidable aspect of that wall for evangelicals—our immediatism.  But the barrier stretches back much farther in history.  In a crucial (quite literally) sixteenth-century moment, a central symbol of the incarnation was removed forcibly from the church.  This was the point at which some zealous Reformers went beyond tearing down paintings and smashing statues to take the very body of Christ off of the crucifix—thus (they thought) defending the church against idolatrous images and defending the resurrection.  Left behind was (arguably) only an abstract symbol of a judicial transaction.  The difference between worshiping in a space where there is no body of Christ on the cross and worshiping in a space where there is a body of Christ on the cross is that in the latter space worshipers cannot ignore the humanity of Christ—nor, thus, of themselves.  In that space, our humanity—bodiliness, affectivity, rationality, community, society, culture—always stands (no, hangs) before us in the person of, the body of, the humanity of Jesus Christ the Lord.  In a sense, this entire book tells the story of what happens when we lose our hold on the incarnation” (pp. 208-209). 

Could we regain our hold on the incarnation and put “the ‘body’ back into our understanding of Christ and his church,” we could “recapture the wisdom and truth” in both Tradition and Scripture.  “Tradition is nothing less than wisdom and truth passed down from generation to generation throughout history.  How apt is this?  Christianity is at its core not a list of timeless principles or abstract teachings.  It is uniquely a historical religion, based on a historical person and the words of two ‘testaments’ that are full of historical accounts.  Nineteenth-century liberal theologians liked to talk about the ‘essence of Christianity’—usually little more than a set of ethical teachings summarized under the rubric ‘the Fatherhood of God and the brotherhood of man’—that needed to be extricated from the centuries of errant doctrines and practices of a church that never seemed to get it right. . . .  But there is no ‘essence’ that is not clothed in history, lived out bodily by God incarnate, and then lived out by ‘his body,’ the human beings whom he has constituted ever since as his church.  Christianity is all about the incarnation of God’s Second Person as a first-century Jew from Nazareth, and then the incarnation of his truth in his living, embodied disciples in all ages and places” (pp. 210-211). 

Armstrong’s treatise bears the stamp of a zealous convert, overemphasizing truths he finds crucial.  But in stating his case he helps us become more mindful of great treasures too often neglected by our tradition-less contemporary church culture.  And as always, works focused on Lewis are quite worthwhile! 

* * * * * * * * * * * * * * * * * * * * * * * * * * * *

Armstrong notes that in 1978 conference instigator Robert Webber began his groundbreaking Common Roots: A Call to Evangelical Maturity by throwing down the gauntlet:  “My argument is that the era of the early church (AD 100–500), and particularly the second century, contains insights which evangelicals need to recover” (p. 44).

Lewis cited Aquinas repeatedly in The Allegory of Love and The Discarded Image and concluded his Letters to Malcolm by observing that the “most blessed result of prayer would be to rise thinking,” in accord with Aquinas, who said of all his own theology, “It reminds me of straw” (p. 40).  Yet we who read Aquinas—or Lewis—remain forever indebted to the rigor and clarity of their thought.

Lewis repeatedly critiqued distinctive Reformed positions regarding holiness and the freedom of the will, “a crucial part of Lewis’s anthropology and his case for hewing to the morality of the Western (Christian) tradition. . . .  The choices we make on earth have transcendent, cosmic, and divine (or infernal) consequences” (p. 195), a message wondrously illustrated in The Great Divorce.  “In his early spiritual autobiography, The Pilgrim’s Regress (1933), Lewis shows a ‘Landlord’ (God) who makes rules not just for a particular religious tribe but for all people.  Christianity innovated morally only by teaching that the redemption purchased for us by Christ brings the Writer of the rules into our hearts, thus helping us to keep them”—important “assumptions [that] thoroughly suffused ancient and medieval culture:  ‘Aristotle had assumed it, and Plato.  Cicero had spoken of it when he called it the law that is not written down.  When St. Paul wrote that even the Gentiles knew that certain kinds of behavior were wrong, he was appealing to natural law.  This same idea informed the thought of St. Augustine in the fourth century and St. Thomas in the thirteenth’” (pp. 99-100).  His friend Dorothy Sayers, translating Dante, endorsed the Medieval Wisdom Lewis promoted, saying:  “We must also be prepared, while we are reading Dante, to abandon any idea that we are the slaves of chance, or environment, or our subconscious; any vague notion that good and evil are merely relative terms, or that conduct and opinion do not really matter; any comfortable persuasion that, however shiftlessly we muddle through life, it will somehow or other all come right.”  We must, in other words, truly believe in God’s gift to us of free will, for “The Divine Comedy is precisely the drama of the soul’s choice” (p. 116). 

Lewis clearly recognized “the Christian warrant for traveling the Affirmative Way, encountering the material world as a place rich with sacramental meaning,” and “he very famously taught that our natural desires—our yearning, which is triggered by our experiences of what is good and beautiful in the world—can lead us toward God.  Indeed, he insisted that he came to God in this way, so that he called himself an ‘empirical theist’” (p. 163).  In his sermon “Transposition” he stressed that as physical beings we “finally have no other conduit to the divine besides our bodies and our senses” (p. 203). 

309 Hitler’s Ethics, Philosophers, Doctors

When driven to illustrate utter evil in history, many of us simply point to Adolf Hitler.  Then, trying to explain why he was so depraved, we easily employ therapeutic terms, labeling him an irresponsible “madman” or a puppet dancing to sociological or economic machinations.  But in Hitler’s Ethic:  The Nazi Pursuit of Evolutionary Progress (New York:  Palgrave Macmillan, c. 2009), Richard Weikart, a professor of history at California State University, Stanislaus, endeavors to demonstrate “the surprising conclusion that Hitler’s immorality was not the product of ignoring or rejecting ethics, but rather came from embracing a coherent—albeit pernicious—ethic.  Hitler was inspired by evolutionary ethics to pursue the utopian project of improving the human race.  He really was committed to deeply rooted convictions about ethics and morality that shaped his policies.  Evolutionary ethics underlay or influenced almost every major feature of Nazi policy:  eugenics (i.e., measures to improve human heredity, including compulsory sterilization), euthanasia, racism, population expansion, offensive warfare, and racial extermination.  The drive to foster evolutionary progress—and to avoid biological degeneration—was fundamental to Hitler’s ideology and policies” (p. 2).  Indeed, as Fritz Lenz (an influential geneticist favored by Hitler) explained:  Nazism was simply “applied biology.”  

Though Hitler was hardly a profound thinker, he read extensively and by 1923 began setting forth a coherent political agenda, studding his speeches with references to (and quotations from) significant German philosophers and scientists.  In 1934, at a Nuremberg Party rally, he insisted that “National Socialism is a worldview [weltanschauung]” (p. 28).  (The New Cassell’s German Dictionary says the word Hitler used—weltanschauung—means “philosophy of life, world outlook, creed, ideology”).  He further “posed as a moral crusader gallantly battling the forces of iniquity, corruption, and even deceit” (p. 17).  And he never hesitated to extol traditional—and very Christian—virtues such as duty, loyalty, honesty, sexual purity, etc., when they suited his purposes.  As he garnered support in the 1920s he especially touted himself as a “truth-teller” exposing those whom Schopenhauer had called “the great masters in lying,” the Jews.  (In fact, as was evident in his skillful propaganda, Hitler was himself a masterful liar!)  To him, lying was justifiable if it helped establish his weltanschauung—especially his devotion to evolutionary progress and the ultimate triumph of the German Volk.  Indeed, his “highest priority in life was to improve the human species, to advance evolution” (p. 83).  As Mein Kampf (the autobiography he wrote in prison) asserted, all of life is a biological battle, and only the fittest survive.  Therein he frequently cited some of Darwin’s phrases—“struggle for existence,” “struggle for life,” and “natural selection.”  Such phrases had regularly appeared in The Descent of Man, where Darwin asserted:  “‘Man, like every other animal, has no doubt advanced to his present high condition through a struggle for existence . . . and if he is to advance still higher he must remain subject to a severe struggle’” (p. 35).  In his Table Talks and speeches Hitler celebrated evolutionary theory and “presented biological struggle in the evolutionary process as a central tenet of Nazism” (p. 38).  This particularly applied to the “racial struggle” validating the superiority of Aryan or Nordic peoples.  “Helping Aryans win the struggle for existence against other races was crucial to achieving his vision.  Morality itself was measured by whether or not it benefitted the German people in their struggle” (p. 83).  Popular books such as the Comte de Gobineau’s The Inequality of the Human Races, praised by eminent biologists including Ernst Haeckel, undergirded Hitler’s racist agenda.  Though he certainly despised the Jews, Hitler equally scorned Africans, Asians and American Indians.  As he declared in Mein Kampf:  “‘All who are not of good race in this world are chaff’” (p. 69). 

Hitler’s racism shaped the “national socialism” he championed.  As a socialist he disdained the individualism of capitalist countries such as the United States, seeking to turn the “‘German Volk into a true socialist community’” (p. 104).  Thus, as soon as he took control of the country, he launched annual Winter Relief Drives designed to help poor Germans—but not “asocial” vagrants, prostitutes, criminals et al.  He envisioned and supervised extensive public works, including the celebrated autobahns, intended to benefit everyone.  The Nazis—the National Socialist German Workers’ Party—also endeavored to provide full employment for all Germans.  To Hitler, socialism meant “‘not the solution of the labor question, but rather the ordering of all German racial comrades into a genuine living community; it means the preservation and further evolution of the Volk on the basis of the species-specific laws of evolution’” (p. 111).  He shared the view of August Weismann, a famous Darwinian biologist, who believed that “‘only the interest of the species comes into consideration, not that of the individual’” (p. 114).  

Racist ideology also shaped the sexual ethic Hitler carefully articulated.  In 1937 he declared:  “‘we are laying claim to leadership of the Volk, i.e. we alone are authorized to lead the Volk as such—that means every man and every woman.  The lifelong relationships between the sexes is something we will regulate.  We shall form the child!’” (pp. 121-22).  Ever more Aryan children were needed to populate an ever-expansive Reich!  Consequently, he said:  “‘there is only one holiest human right, and this right is at the same time the holiest obligation, to wit:  to see to it that the blood is preserved pure and, by preserving the best humanity, to create the possibility of a nobler evolution of these beings’” (p. 141).  So he opposed birth control and abortion, celebrated large families, and often portrayed himself as a staunch defender of traditional family values and morality.  Yet, paradoxically, he also approved extramarital sexual affairs and even toyed with the idea of polygamy if such activity birthed more (and genetically better) Germans.  When the war broke out in 1939, Himmler issued a Hitler-approved order which said:  “‘Beyond the boundaries of perhaps otherwise still necessary bourgeois laws and customs it will also outside of marriage be an important responsibility for German women and girls of good blood, not lightly, but rather in profound moral seriousness, to become the mothers of children of soldiers who are going to the front and of whom fate alone knows whether they will return or fall in battle for Germany’” (p. 133).  

Along with breeding more healthy Aryan children, the Nazis targeted the incurably sick and disabled for extermination.  They didn’t deserve to live longer—as, indeed, Karl Binding (a lawyer) and Alfred Hoche (a psychiatrist) had argued in their notorious, but widely-circulated 1920 treatise, Permitting the Destruction of Life Unworthy of Life.  Learned physicians assured Hitler that infants were not fully human, for “‘when a child is born, it is not really fully matured . . .  But if that is so, then the infant does not actually take its place in human society until several months after its birth’” (p. 185).  Germany’s medical personnel ultimately killed 200,000 disabled “patients” in the nation’s care facilities.  Explaining this, the historian Hans-Walter Schmuhl said:  “‘The racial-hygiene paradigm constituted an ethic of a new type, which was ostensibly grounded scientifically in Darwinian biology.’”  By discarding the Judeo-Christian tradition and “‘giving up the conception of humans as the image of God through the Darwinian theory, human life was construed as a piece of property, that—contrary to the idea of a natural right to life—could be weighed against other pieces of property’” (p. 180). 

Killing Jews, though the best-known aspect of Hitler’s racist agenda, actually began two years after the outbreak of WWII, in the final months of 1941.  Prior to that, deportation rather than extermination had been the official Nazi position.  “Hitler’s evolutionary ethic did not require killing.  He could have merely sterilized the disabled and deported the Jews.  This would have accomplished his goals of expanding the German population, strengthening the Aryan race by eliminating ‘inferior’ individuals and races, and expanding German living space.  However, even though killing may not have been required by Hitler’s evolutionary ethic, Darwinism contributed nonetheless to the death of the disabled and Jews.”  As Christopher Hutton, concluding his book on Nazi racism, said:  “‘All the key elements of this [Nazi] world-view had been constructed and repeatedly reaffirmed by linguists, racial anthropologists, evolutionary scientists and geneticists.  Ludwig Plate [a Darwinian biologist at the University of Jena] observed that “progress in evolution goes forward over millions of dead bodies” . . .  For Nazism, survival in evolution required the genocide of the Jews’” (pp. 194-195).    

* * * * * * * * * * * * * * * * * * * * * * *

In Hitler’s Philosophers (New Haven:  Yale University Press, c. 2013), Yvonne Sherratt endeavors to unveil “the sinister past of many German philosophers” which has been effectively buried by their protégés.  Though the book “is a work of non-fiction, carefully researched, based upon archival material, [and] letters . . . which have all been meticulously referenced,” it “is written in a narrative style, which aims to transport the reader to a vivid and dangerous world of 1930s Germany” (p. xx).  

Philosophy occupies a prominent place in German culture, granting professional philosophers celebrity status.  Thus Hitler liked to invoke legendary thinkers such as Goethe, Schiller, Kant, Fichte, Schopenhauer and Nietzsche.  He even sought to portray himself, in Mein Kampf, as the “philosopher Fuhrer.”  For example, a well-known quotation from Schiller’s William Tell—“the strong man is mightiest alone”—served as a chapter title in Mein Kampf and became his motto during his later years as the Fuhrer (p. 21).  He especially claimed to embrace the “critical philosophy” of Immanuel Kant, saying:  “‘Kant’s complete refutation of the teachings which were the heritage of the middle ages, and of the dogmatic philosophy of the church, is the greatest of the services which Kant has rendered to us’” (p. 20).  Though Kant certainly seems to be an implausible figure to indwell the Nazi pantheon, he represented for Hitler a repudiation of the past, with its irrational superstitions and religious prejudices.  And, importantly, Kant disparaged Judaism, “labeling Jews as a body superstitious, primitive and irrational.”  Indeed he declared Judaism was not even a bona fide religion “but merely a community of a mass of men of one tribe” (p. 39).  Jews were innately dishonest and had “no right to an independent existence” (p. 40).  Following Kant, three 19th-century thinkers—Fichte, Schopenhauer and Nietzsche—became “the ‘philosophical triumvirate of national Socialism’” (p. 23).  Schopenhauer effectively extended Kant’s insights and, importantly, “glorified Will over Reason,” as would Nietzsche, the philosopher most frequently cited in Hitler’s speeches.  Hitler claimed he read Nietzsche’s works while in prison, and he “‘often visited the Nietzsche museum in Weimar . . . posing for photographs of himself staring in rapture at the bust of the great man’” (p. 236).  In 1934 he met Nietzsche’s sister Elisabeth, who gave him “one of Nietzsche’s most personal possessions”—his last walking stick (p. 26).  “From that day on the Nietzsche catchphrases were everywhere, Wille zur Macht, Herrenvolk, Sklavenmoral—the fight for the heroic life, against formal deadweight education, against Christian ethics of compassion” (p. 26).  

There’s no mystery as to why Hitler would be drawn to Nietzsche, for his most noted work was “Zarathustra, in which he had coined the idea of the ‘Superman.’”  During the First World War 150,000 copies of the book were “handed out to German soldiers at the front.  A London broadcaster even went so far as to dub the war the ‘Euro-Nietzschean War,’ and the best selling English novelist of the time, Thomas Hardy, claimed in a letter to the Daily Mail that there was ‘no instance since history began of a country being so demoralized by one single writer’” (p. 50).  Nietzsche’s sister selected and published passages from his works, including his “discussion on the possibilities of selective breeding and of educating a ruling caste, ‘the masters of the earth’, ‘tyrants who can work as artists on “man” himself’” (p. 50).  When she met Hitler in 1934, she would likely have shown him such passages, and her portrayal of “Nietzsche seemed to supply just the needs of the Third Reich—there was a zeal for war, a dash of anti-Semitism, the ‘Superman’ and nationalism” (p. 51).  

Hitler claimed to have read “everything he could get hold of,” including Nietzsche and Schopenhauer, though it’s clear he mainly devoured and absorbed racialist and nationalist tomes composed by writers such as Houston Stewart Chamberlain, Heinrich von Treitschke, and Oswald Spengler.  Above all, he found in Charles Darwin one of his “most crucial influences” (p. 53).  Darwin’s evolutionary thought swept through Germany under the guidance of the “enormously influential zoologist and social philosopher Ernst Haeckel,” whose books “vastly outsold Darwin’s” (p. 54).  “Nature is God,” Haeckel declared, and Nature, through natural selection, had elevated the Aryan race.  Following Haeckel, scores of German scholars propounded his version of Social Darwinism, many of them serving as “collaborators” helping the Nazis gain control of universities and various cultural institutions in the ’30s.  

The most prestigious philosopher actively lending his support to Hitler was Martin Heidegger, a Freiburg University professor.  That he was deeply influenced by Nietzsche must be noted, for the two of them have profoundly shaped 20th-century philosophical and literary thought.  To Sherratt, Heidegger was “Hitler’s Superman.”  He had studied with Edmund Husserl, the noted phenomenologist, and enjoyed his patronage as he established his reputation as a world-class philosopher.  Heidegger joined the Nazi Party in 1933 and was named rector of the University of Freiburg soon thereafter.  Though he resigned as rector within a year, he maintained his Party membership until 1945, and his commitment to National Socialism seems inseparable from his philosophy.  He “heralded the Third Reich as ‘the construction of a new intellectual and spiritual world for the German nation,’” adding that the “‘construction of National Socialism has now become the single most important task for the German universities’” (p. 106).  He thought no Christian should be appointed to a university lectureship, for traditional morality needed to be consigned to the trash bin of history.  Students were drawn to him and subsequently spread his version of atheistic existentialism, including a disdain for Christian and humanist ideas.

Though some Jewish (e.g. Walter Benjamin, Theodor Adorno, and Hannah Arendt) and Christian (e.g. Dietrich von Hildebrand) philosophers fled Germany in the 1930s and opposed Hitler, very few who remained in the country did so.  The notable exception highlighted by Sherratt was Kurt Huber, a devout Roman Catholic and popular professor at Munich University, who taught musicology as well as philosophy.  He had been appointed to a prestigious Chair for the Institute of Folk Music at the University of Berlin in 1938, but he refused to toe the Party line and was soon dismissed, though he regained his position in Munich.  He used his lectures on Kant, Spinoza, and Leibniz to subtly criticize der Fuhrer, and he ultimately joined a secret student group (the White Rose resistance society) dedicated to distributing subversive pamphlets.  In time he would be arrested and executed—a “martyr” in Sherratt’s view.  

Almost as soon as the Allies conquered Germany they conducted the Nuremberg Trials and sought to bring leading Nazis to justice.  One of the “criminals” sentenced to death was Alfred Rosenberg, who had played a prominent role in Hitler’s administration, largely because he had written The Myth of the Twentieth Century, which had been, “along with Mein Kampf, the ultimate Nazi bible” (p. 232).  But almost none of the scores of philosophers who had supported Hitler suffered anything more than transient disciplinary measures in their universities.  Though Martin Heidegger was investigated, he successfully “reconstructed his life and career from 1933 to 1945 as one of minimal involvement with the Third Reich” (p. 244).  He managed to reestablish himself as a leader in the European academic world and gained assistance from a most unlikely source—Hannah Arendt, his former student and lover.  Though she was Jewish, had fled to America during the war, and had denounced him as a “murderous monster” for his Nazi views, she revisited him in 1950 and abruptly decided to champion his rehabilitation.  Similarly, Jean-Paul Sartre, deeply reliant upon Heidegger’s philosophical work, ignored his Nazi activities and “firmly helped to reestablish Heidegger on the post-war stage” (p. 248).  Subsequently Heidegger traveled widely, giving lectures, and greatly influenced many currents of contemporary thought, especially “post-modernism.”  In light of all this, Sherratt seriously questions the academy’s adulation of Nietzsche, Heidegger et al.  So too should we doubt the value of much that passes as “Postmodernism.”   

* * * * * * * * * * * * * * * * * * * 

Robert Jay Lifton, in The Nazi Doctors:  Medical Killing and the Psychology of Genocide (New York:  BasicBooks, c. 1986), drew upon interviews with both doctors and survivors, as well as thousands of documents, to probe one of the true mysteries of iniquity—how highly trained and skilled medical doctors (allegedly committed to saving lives) cooperated with the Nazis’ genocidal policies.  Surprisingly, a number of “prisoner doctors” played an important role in running Auschwitz.  These were (generally Jewish and frequently female) medical doctors sent to the camps who assisted the SS doctors.  They often worked as orderlies or nurses, and many of them heroically sought to help other inmates as much as possible, appearing to collaborate while “actually using their position to save as many people as possible” (p. 218).  Still more:  they differentiated between the truly evil and somewhat “better” Nazis who tried to help inmates and were markedly sorrowful as they carried out their orders.  

As one expects from a psychiatrist, Lifton cites many “case studies” and crafts telling illustrations.  He is deeply concerned with medical ethics and confesses that “nothing is darker or more menacing, or harder to accept, than the participation of physicians in mass murder” (p. 3).  Amazingly enough, the doctors he interviewed tried to “present themselves to me as decent people who tried to make the best of a bad situation” and failed (or refused) to make any “clear ethical evaluation” of what they had done (p. 8).  To the extent they did so, it was to rationalize their activities, employing therapeutic language.  They immersed themselves in “medical science” as a “means of avoiding awareness of, and guilt over, their participation in a murderous project” (p. 61).  Thus Dr. Fritz Klein said:  “‘Of course I am a doctor and I want to preserve life.  And out of respect for human life, I would remove a gangrenous appendix from a diseased body.  The Jew is the gangrenous appendix in the body of mankind’” (p. 16).  

The program the Nazis designed to remove gangrenous people was called “euthanasia”:  the elimination of those deemed “unworthy” to live.  They first implemented the coercive sterilization of “defectives,” for “only the healthy” should procreate.  Then they began killing “impaired” children, for it initially seemed easier to eliminate newborns or young children than larger humans.  Next “impaired” adults, whether mentally or physically disabled, were “put to sleep.”  They also culled out undesirable or “morally inferior” inmates in concentration camps.  Finally came the mass killings in camps such as Auschwitz, whose “primary function” was to kill Jews.  Here the doctors were essential.  They decided, as prisoners were unloaded from the trains, which ones would be immediately sent to the killing centers.  They determined when inmates were no longer useful as laborers and were ready to be gassed.  They selected, as did Dr. Josef Mengele (doing “scientific” studies on twins), some who would be momentarily spared and used for medical experiments.  As one Auschwitz prisoner doctor remembered:  “‘They [the SS doctors] did their work just as someone who goes to an office goes about his work.  They were gentlemen who came and went, who supervised and were relaxed, sometimes smiling, sometimes joking, but never unhappy.  They were witty if they felt like it.  Personally I did not get the impression that they were much affected by what was going on—nor shocked.  It went on for years.  It was not just one day’” (p. 193).   

They were not monsters, nor even “sadists” as we understand the term.  They were, in fact, rather “normal” human beings.  But they did enormous evil, and what obviously guided and sustained them was their commitment to Nazi ideology—Hitler’s Ethic—with its racist components rooted in a naturalistic, evolutionary, Darwinian ethos.

308 Scalia Still Speaks

As we witness the struggle surrounding President Trump’s recent Supreme Court nominee (Brett Kavanaugh), we’d be wise to remember what happened in 1987 when President Reagan similarly nominated Robert H. Bork.  As Bork noted in The Tempting of America:  The Political Seduction of the Law (New York:  The Free Press, c. 1990), that event revealed a deepening philosophical cleavage dividing the country.  He set forth a historical overview, describing how the Supreme Court has become increasingly politicized inasmuch as it has abandoned the more limited role assigned it by the Constitution.  He also discussed some of the great legal theorists who have influenced American jurisprudence.  For example, Oliver Wendell Holmes and Learned Hand, two great judges, sharply differed in their understanding of the judiciary.  After the two lunched together and Holmes got into his carriage, Hand ran after him, saying:  “Do justice, sir, do justice.”  Stopping the carriage, Holmes reproved his friend, saying:  “That is not my job.  It is my job to apply the law.”  Holmes’ commitment to simply apply the law, not to dispense “justice,” is the strict constructionism Bork endorsed.  

    Apart from the notorious 1857 Dred Scott decision, whereby Chief Justice Roger Taney first enunciated the now popular notion of “substantive due process,” thereby imposing his own opinion on the Constitution and issuing an egregiously bad pro-slavery decision, the Supreme Court and leading theorists generally upheld a strict constructionist viewpoint throughout the 19th century.  At the beginning of the 20th century, however, progressive tides altered the legal seashore.  Notable, if isolated, judicial decisions, especially during the New Deal era, generated a tidal wave of revisionist judicial activism which characterized the Warren Court in the 1950s.  Soon thereafter (obviously legislating from the bench) Chief Justice Burger and like-minded justices issued edicts such as Roe v. Wade (1973), which illustrated revisionism at its worst.  Without providing “an argument that even remotely begins to justify” the decision, the Court opened a deep division still affecting the nation.  

     The activism of the Court’s justices has been widely approved by professors in the nation’s law schools.  No doubt selecting one of the most egregious examples, Bork wrote:  “Sanford Levinson, of the University of Texas law school, advances an extremely skeptical, indeed nihilistic, theory of ‘constitutional’ interpretation.  Levinson says that ‘The “death of constitutionalism” may be the central event of our time, just as the “death of God” was that of the past century.’  In a major law review article, Levinson explained that ‘for a Nietzschean reader of constitutions, there is no point in searching for a code that will produce “truthful” or “correct” interpretations; instead, the interpreter, in [philosopher] Richard Rorty’s words, “simply beats the text into a shape which will serve his own purpose”’” (p. 217).  

Given his opposition to such revisionism, Bork encountered a steamroller of hostility when he appeared before the Senate for his confirmation hearings.  Just 45 minutes after President Reagan nominated Bork, Senator Ted Kennedy launched a vicious attack, claiming that “‘Bork’s America is a land in which women would be forced into back-alley abortions, blacks would sit at segregated lunch counters . . . and the doors of the Federal courts would be shut on the fingers of millions of citizens for whom the judiciary is often the only protector of the individual rights that are the heart of our democracy’” (p. 268).  Speaking through senators such as Kennedy and Joe Biden, a chorus of special interest groups—the ACLU, NOW, Planned Parenthood, et al.—mounted an anti-Bork crusade.  The charges made against him were virtually all false, deliberate lies employed to orchestrate the emotions of the masses.  

     For example, Gregory Peck, in an advertisement funded by People for the American Way, asserted Bork favored poll taxes and literacy tests, long used to bar blacks from voting in the South.  In fact, Bork had never favored either.  Ohio’s Senator Metzenbaum loudly bellowed that the nation’s women feared Bork, wresting out of context some decisions he had made as a federal judge.  Senator Joe Biden’s “Biden Report” assailed Bork for his judicial errors—but without citing a single case as evidence!  In Bork’s opinion, this campaign against him resulted from the fact that he had dared criticize the revisionist ideology which underlay significant decisions of the Warren and Burger courts.  The fact that he found Roe v. Wade judicially flawed further ignited the flames of opposition to his appointment.  And it is clear that today’s obsession with political correctness makes Supreme Court confirmation hearings an overly politicized arena wherein senators and special interest groups secure a national pulpit for at least a passing moment.

     Though George Will may overstate a bit, I think his assessment still rings true:  “This is Robert Bork’s brilliant report from the front lines in an ongoing cultural war.  At stake is nothing less than constitutional government.  It is a sobering account of the extent to which judicial willfulness has degraded the elegant constitutional system we were given.”

* * * * * * * * * * * * * * * * * * * * * * * * * * * * 

Fortunately for our “elegant constitutional system,” the year before Robert Bork was denied a seat on the Court another Reagan nominee, Antonin Scalia, sailed through the Senate with a 98-0 vote!  And reading some of his writings provides important insights into both the man and his contributions to American jurisprudence.  In Scalia’s Court: A Legacy of Landmark Opinions and Dissents (Washington:  Regnery Publishing, c. 2016, Kindle Edition), Kevin Ring collects some of Scalia’s most important opinions, giving them context and perspective with his “edits and comments.”  Ring emphasizes:  “One trait that nearly everyone praised was Scalia’s brilliant literary style.  His ‘gift for analysis and words,’ one progressive law professor said, made him ‘the best judicial stylist since Oliver Wendell Holmes.’  Through his opinions, he exerted gravitational pull on the law, even when he lost.  Indeed, during his nearly thirty years on the Court, Scalia was its premier conservative, intellectual gladiator, and wordsmith.  To be sure, many important and influential conservative jurists have served on the High Court, and there remain today others who share Scalia’s textualist and originalist philosophy.  Yet it was Scalia who gave life to Aristotle’s injunction that ‘it is not enough to know what to say—one must know how to say it’” (#54-61).  

Words truly matter.  Scalia often told his clerks that “terminology is destiny.”  So he wrote with great care.  One of Scalia’s sons, Christopher, once asked his father if writing was easy for him.  “No,” he said.  “It’s hard as hell.”  Speaking to a group of legal writers (“Writing Well”), Scalia credited “time and sweat” with making the difference between “ordinary” and “good” writing—it’s largely a matter of “writing, revising, rethinking, and writing again.”  Still more, Scalia routinely “put complex arguments about fundamental principles in easy-to-understand terms” (#86).  And he could be quite plain-spoken, expressing “his outrage at the decisions reached and lack of judicial restraint demonstrated by his colleagues on the High Court.  Scalia concluded one opinion, ‘The Court must be living in another world.  Day by day, case by case, it is busy designing a Constitution for a country I do not recognize’” (#95).  In his dissent from the majority in Casey (Justice Kennedy’s convoluted rationale for upholding Roe), Scalia’s words could easily have been written with a wood burner, searing the pages. 

Scalia emphasized “textualism” and “originalism” in writing his opinions.  Differing from some “originalists,” who sought to discern the “intent” of the legislators, Scalia sought to find the “original meaning” of the law as written and understood in its day.  When judges interpret laws, they should do so “reasonably,” looking for the “ordinary meaning” of words as used by the legislators who passed them.  Unwilling to probe into anyone’s “original intent” (as if one could understand the inner thoughts of persons, legislators included), Scalia tried to restrict himself to the words actually written.  (Both positions were almost unique to him when he joined the Supreme Court in 1986, but increasing numbers of young, conservative lawyers and judges have followed his lead.)  We have legislators to implement the “will of the people,” to care for current concerns.  We have a Constitution to secure more permanent things, to establish durable guidelines needed to resist those momentarily fashionable currents so pronounced in democracies.  Thus the notion of a “living Constitution,” continually changing to suit public opinion, reducing the law to whatever judges approve, was anathema to him!  

One proponent of the “living Constitution,” Justice William Brennan, “seemed to think the job of Supreme Court justice was similar to that of Senate majority leader.  Brennan famously remarked, ‘You can do anything around here with five votes.’ But Scalia did not want to do ‘anything,’ he wanted the Court to do the right thing” (#377).  “With bracing political incorrectness,” Ring says, “Scalia said he likes his Constitution ‘dead.’  He argued that only a fixed and enduring charter could keep judges from reading new fads into the Constitution and less popular mandates out” (#207).  It was justices believing in the “living Constitution” who recently conjured up the right to same-sex marriage, whereas a textualist like Scalia insisted such a “right” should be instituted by a constitutional amendment, not decreed by the Court.  Scalia also sought to preserve the federal system and the separation of powers set forth in the Constitution.  The federal government has three separate branches, and the states have important powers reserved to them.  He considered federalism “the most important bulwark against government tyranny, even more important than the Bill of Rights” (#742).  In one of his dissenting opinions he declared:  “While the separation of powers may prevent us from righting every wrong, it does so in order to ensure that we do not lose liberty” (#996). 

Turning to Scalia’s position on crucial social issues, we find him insisting that inasmuch as racial discrimination is wrong, it is wrong even when implemented for allegedly benevolent purposes such as affirmative action.  Thus he disagreed with Justice O’Connor’s view “that, despite the Fourteenth Amendment, state and local governments may in some circumstances discriminate on the basis of race in order (in a broad sense) ‘to ameliorate the effects of past discrimination.’”  However “benign” such strategies appear, they “can no more be pursued by the illegitimate means of racial discrimination than can other assertedly benign purposes we have repeatedly rejected.  The difficulty of overcoming the effects of past discrimination is as nothing compared with the difficulty of eradicating from our society the source of those effects, which is the tendency—fatal to a Nation such as ours—to classify and judge men and women on the basis of their country of origin or the color of their skin.  A solution to the first problem that aggravates the second is no solution at all” (#1242). 

Throughout his tenure on the Court Scalia proved to be a consistent critic of abortion rights, for  nothing in the Constitution or in the American tradition provided for such.  In one crucial case, Webster v. Reproductive Health Services (1989), he argued the Court should overturn Roe.  In Planned Parenthood of Southeastern Pennsylvania v. Casey (1992), wherein Justice Kennedy famously celebrated the “mystery of human life” and the freedom of everyone to manufacture his own morality, Scalia set forth “one of the most caustic opinions ever written by a justice of the Supreme Court” (#1623).  Reaffirming his constitutional stance, he said:  “The States may, if they wish, permit abortion on demand, but the Constitution does not require them to do so.  The permissibility of abortion, and the limitations upon it, are to be resolved like most important questions in our democracy: by citizens trying to persuade one another and then voting” (#1639).  When the Court imposes its position on the public, as was done with Roe, “It is difficult to maintain the illusion that we are interpreting a Constitution rather than inventing one, when we amend its provisions so breezily” (#1751).  Sad to say:  “The Imperial Judiciary lives.  It is instructive to compare this Nietzschean vision of us unelected, life-tenured judges—leading a Volk who will be ‘tested by following,’ and whose very ‘belief in themselves’ is mystically bound up in their ‘understanding’ of a Court that ‘speak[s] before all others for their constitutional ideals’—with the somewhat more modest role envisioned for these lawyers by the Founders” (#1874). 

Throughout American history, the Supreme Court had never dealt clearly with the Second Amendment’s right to “keep and bear arms.”  But in 2008, in District of Columbia v. Heller, Justice Scalia successfully argued (citing definitive historical evidence) the case for an armed citizenry, individually entitled to self-defense.  Responding to those who insisted the Second Amendment applied only to a state-controlled “militia,” Scalia wrote:  “Nowhere else in the Constitution does a ‘right’ attributed to ‘the people’ refer to anything other than an individual right” (#2152).  Careful study of 18th-century state constitutions shows that individuals were guaranteed the right to “bear arms,” and one of the Founding Fathers, Justice James Wilson, upheld a person’s right to defend himself or his home.  Scalia then supported his view with citations from Blackstone’s Commentaries and from St. George Tucker, who endorsed “the Blackstonian arms right as necessary for self-defense.”  Tucker insisted the Second Amendment “‘may be considered as the true palladium of liberty. . . . The right to self-defense is the first law of nature’” (#2446).  Justice Joseph Story, whose commentaries give us great insight into the ways the Founders understood the Constitution, wrote:  “One of the ordinary modes, by which tyrants accomplish their purposes without resistance, is, by disarming the people, and making it an offense to keep arms, and by substituting a regular army in the stead of a resort to the militia” (#2483).

Scalia’s Court contains illuminating sections devoted to the death penalty, religious liberty, illegal immigration, homosexuality, Obamacare, free speech, etc.  Though many of his opinions were dissents, they show a fine legal mind at work, shining light on topics at the heart of America’s current culture wars.  An editorial in the Wall Street Journal commented at his death:  “For some 29 years he defended the original meaning of the Constitution against the legal fads and inventions of more political Justices, bequeathing a judicial legacy even in dissent that will carry long into the future.  Justice Scalia may have been more consequential than any Justice whose jurisprudence so rarely carried a majority of the Court” (#8053).  

* * * * * * * * * * * * * * * * * * * * * * * 

Apart from his judicial work, Antonin Scalia gave speeches to various groups, ranging from his children’s high schools to learned legal societies, addressing topics as diverse as the arts, sports, hunting, education, and good writing as well as legal concerns.  Here we see the man so beloved by almost everyone who knew him—witty, generous, self-deprecating.  In her foreword to Scalia Speaks: Reflections on Law, Faith, and Life Well Lived (The Crown Publishing Group, c. 2017), Ruth Bader Ginsburg (a colleague with whom he continually disagreed) writes affectionately of his talent for making “even the most sober judge smile.”  Serving alongside him, she “occasionally pinched [herself] hard to avoid uncontrollable laughter in response to one of his quips.”  To her:  “This collection of speeches and writings captures the mind, heart, and faith of a Justice who has left an indelible stamp on the Supreme Court’s jurisprudence and on the teaching and practice of law.”  

In some of his speeches he focused on this nation’s citizens.  Though properly proud of his Italian heritage, and encouraging others to embrace their own, he insisted:  “What makes an American is not the name or the blood or even the place of birth, but the belief in the principles of freedom and equality that this country stands for” (p. 14).  To Scalia, one of America’s strong suits, “one of the reasons we really are a symbol of light and of hope for the world, is the way in which people of different faiths, different races, different national origins, have come together and learned—not merely to tolerate one another, because I think that is too stingy a word for what we have achieved—but to respect and love one another” (p. 26).  This results from unique legal traditions, such as the Bill of Rights, but “it is the beginning of wisdom in this area to acknowledge that the Constitution says what it says.  And the fullness of wisdom is to recognize that the crowning achievement of America is not the Bill of Rights (every modern banana republic has one) but rather the structure of government and the democratic tradition that make a Bill of Rights enforceable according to its terms, and not according to the wishes of the ruler—be that ruler a generalissimo or a majority of the electorate” (p. 51).

Thus he was deeply committed to the Constitution as written!  “Unlike any other nation in the world,” he said, “we consider ourselves bound together, not by genealogy or residence but by belief in certain principles; and the most important of those principles are set forth in the Constitution of the United States” (p. 158).  In one of his speeches he quoted a long passage from Benjamin Franklin’s final message to the Constitutional Convention.  Though Franklin confessed he did not fully endorse all its provisions, he noted that “there is no form of Government but what may be a blessing to the people if well administered” and doubted “whether any other Convention we can obtain may be able to make a better Constitution.”   In fact, he was amazed that the proposed document approached as “near to perfection as it does.”  America is indeed blessed, for its founding documents enabled it to become a very special place.  That clearly means that “a judge must be, above all else, a servant of the law—and not an enforcer of his personal predilections” (p. 170).  Thus Scalia strongly objected to the notion of a “living constitution” that would allegedly adjust to what one Court justice labeled “the evolving standards of decency that mark the progress of a maturing society.”  With incisive wit, he concluded:   “The living constitutionalist is a happy fella, because it turns out that the Constitution always means precisely what he thinks it ought to mean” (p. 212).

Scalia never hesitated to defend his deeply-ingrained Christian faith in a culture increasingly hostile to such.  “As one who believes in God, and who believes that those nations who love or at least fear Him, and do His will, will by and large prosper,” he said, “I regret this secularization of our country, or at least of our intellectual classes” (p. 130).  In one of the speeches he gave dozens of times (titled “Not to the Wise:  The Christian as Cretin”), he explained that “‘The Christian as Cretin,’ is meant, of course, to be a play on words.  And it is a wordplay that has some etymological basis. The English word cretin, meaning ‘a person of deficient mental capacity,’ in fact derives from the French word chrétien, meaning ‘Christian,’ which was used in the Middle Ages to refer to the short, often grotesque, severely retarded people who were to be found in some remote valleys of the Alps—perhaps the result of excessive inbreeding.”  Thinking about this, Scalia suggested that “the equivalence of the words Christian and cretin makes a lot of sense.  To be honest about it, that is the view of Christians—or at least of traditional Christians—taken by sophisticated society in modern times” (pp. 107-108).  But in that respect, little has changed since the days of St. Paul, who acknowledged that the world’s elites would generally brand Christians “fools.”  Christians simply are—and ought to be—different from the world!  Such differences clearly appear when sexual questions arise, for “the worldly ideal is not chastity, but safe sex” (p. 121).

While defending his faith, Scalia staunchly upheld the American commitment to the separation of church and state.  “In the last analysis the most important objectives of human existence—goodness, virtue, godliness, salvation—are not achieved through the state; and those who seek them there are doomed to disappointment” (p. 137).  And what his speeches reveal is this:  Scalia modeled not only first-rate jurisprudence but also the character of a good man fully aware of higher principles found not in laws but in permanent things. 

307 Leon Kass: Leading a Worthy Life

“Our society,” said Leon Kass two decades ago, “is dangerously close to losing its grip on the meaning of some fundamental aspects of human existence.” Seventy years ago (when he and I were young), he says, Americans enjoyed a stable culture affording youngsters “authoritative guidance for how to live. Religious traditions and inherited customs and mores pointed the way to a good life. Adults, quite comfortable with their moral authority, were not stingy with their praise and blame, reward and punishment, nor did they neglect the effort to model decent conduct for the young to follow. In the post–World War II years of my boyhood,” he recalls in Leading a Worthy Life: Finding Meaning in Modern Times (New York: Encounter Books, Kindle Edition, c. 2018), “the prevailing culture took pains to turn children into grown-ups. It offered guidance for finding work and vocation, customs of courtship for finding a suitable spouse, and a plethora of vibrant local institutions and associations – religious, fraternal, social, political, charitable, cultural – for finding meaningful participation in civic and communal life. The institutions of higher learning proudly believed in light and truth, and were pleased to initiate the next generation into the intellectual and artistic treasures of the West” (#50-58).

That world has vanished! “Young people are now at sea – regarding work, family, and civic identity. Authority is out to lunch. Courtship has disappeared. No one talks about work as vocation. The true, the good, and the beautiful have few defenders. Irony is in the saddle, and the higher cynicism mocks any innocent love of wisdom or love of country. The things we used to take for granted have become, at best, open questions. The persons and institutions to which we once looked for guidance have ceased to offer it successfully” (#65). Socrates’ probing questions regarding how we should live are rarely addressed, much less answered. We’ve mastered complex computer technologies but failed to find good reasons to live, and this deficit “is perhaps the deepest curse of living in our interesting time” (#67).

Kass has been teaching for half-a-century, mainly dealing with ethics. Reared in a secular Jewish home, primarily distinguished by its socialistic ideology, he has, as an adult, slowly returned to some of Judaism’s ancient wisdom, without becoming a devout practitioner. Following his undergraduate schooling at the University of Chicago, Kass earned an M.D., then completed a Ph.D. in biochemistry and briefly spent time doing research. But his heart was in liberal education, so he returned to his alma mater and was for decades a professor in the Committee on Social Thought, flourishing within that institution’s humanities program. “Although formally trained in medicine and biochemistry – fields in which I no longer teach or practice – I have been engaged with liberal education for forty-five years, teaching philosophical and literary texts as an untrained amateur, practicing the humanities without a license” (#4869). In 2001 President George W. Bush appointed him to chair the President’s Council on Bioethics, and he is widely respected for his expertise and wisdom (quite evident in his commentary on Genesis, titled The Beginning of Wisdom). “In my own case,” he recalls, “it was first the prospect of human genetic manipulation that led me to question my onetime conviction that the progress of science and technology would necessarily go hand in hand with an improvement in morals and society, and second, reflection on my activities as a scientist that led me to doubt the claims of some of my colleagues that the activities of living organisms, including man, could be fully understood in terms of nonliving matter and the laws of physics and chemistry, or even in terms of behaviorist psychology and neuroscience” (#4679).

Such convictions were evident when, in the midst of his tenure as chairman of the President’s Council, he published a collection of essays: Life, Liberty, and the Defense of Dignity: The Challenge for Bioethics (San Francisco: Encounter Books, c. 2002). He was then deeply concerned that “our society has overcome longstanding taboos and aversions to accept test-tube fertilization, commercial sperm banking, surrogate motherhood, abortion on demand, exploitation of fetal tissue, creation of human embryos solely for experimentation, patenting of living human tissue, gender-change surgery, liposuction and body shops, the widespread shuttling of human parts, assisted suicide practiced by doctors and the deliberate generation of human beings to serve as transplant donors—not to speak about massive changes in the culture regarding shame, privacy and exposure.” But beyond his burden for bioethics, Kass stressed: “Perhaps more worrisome than the changes themselves is the coarsening of sensibilities and attitudes, and the irreversible effects on our imaginations and the way we come to conceive of ourselves” (p. 197).

We have unfortunately embraced a “technological way” that finds fuel in “the utopian promises of modern thought” which will ultimately “doom” us to destruction (p. 49). The great issue we face is this: “Everything depends on whether the technological disposition is allowed to proceed to its self-augmenting limits, or whether it can be resisted, spiritually, morally, politically” (p. 49). We must recover our moral compass! We face a grave moral crisis with apparently no notion of what’s at stake. “We are in turbulent seas without a landmark precisely because we adhere more and more to a view of human life that both gives us enormous powers, and, at the same time, denies every possibility of non-arbitrary standards for guiding its use. Though well equipped, we know not who we are or where we are going. We triumph over nature’s unpredictability only to subject ourselves, tragically, to the still greater unpredictability of our capricious wills and fickle opinions” (p. 138).

Still concerned with such issues, in his most recent publication (Leading a Worthy Life) Kass has collected some papers he’s written during the past two decades, hoping they will “shine fresh light on several fundamental and irreplaceable aspects of the good life, as well as on the specific threats they face today and tomorrow: love, family, and friendship; human achievement, human excellence, and human dignity; learning and teaching in search of understanding and wisdom; and fulfilling the enduring human aspirations for the true, the good, and the beautiful, for the righteous and the holy, and for freedom, equality, and self-government.” He begins by citing an essay Irving Kristol wrote 25 years earlier illustrating how “succeeding waves of elitist opposition to our inherited moral, aesthetic, and spiritual norms and sensibilities had issued in a nihilistic anticulture, hostile not only to religion, family, patriotism, and traditional morality, but even to the promise of Enlightenment reason itself” (#393). He examines selected “secular realms,” including “work; love and family; community and country; and the pursuit of truth,” realms connected to “our deepest aspirations: to live a life that makes sense, a life that is worthy of the unmerited gift of our own existence” (#342).

Take first our engagement in work. “That work should be central to life’s fulfillment is a very old idea, and it persists because it is rooted in human nature. Aristotle argued that human flourishing is a life of virtuous or excellent activity, where “activity” translates a word of Aristotle’s own coinage, built from a root meaning “work”: energeia, literally, ‘being-at-work’” (#366). “We need to consider work, as Dorothy Sayers put it, ‘not, primarily, a thing one does to live, but the thing one lives to do.’ Work enables us to utilize and to most fully express our God-given talents, gaining meaning for our lives from fulfilling our nature, from seeing our work well done, and from delighting in the gifts our work provides to a world that needs and appreciates them” (#352).

Then consider conjugal love and family, issues to which he devotes several chapters, thereby indicating their importance. Admittedly, Kass says, “eros can be notoriously fickle in its choice of objects,” but “when disciplined – especially by the vows and practice of a solid marriage – it can provide for a private life whose satisfactions are among the most enduring blessings life has to offer. Living life under a promise, husband and wife enjoy the practice of mutually giving and receiving love, one to the other. Through devotion and care, informed by the pledge and practice of fidelity, everyday life takes on the character of a sacrament” (#387). Such a life seems quite foreign to 21st century youngsters. “Sexually active – indeed, hyperactive – they flop about from one relationship to another.” Too many “young men, nervous predators, act as if any woman were equally good; they are given not to falling in love with one, but to scoring in bed with many. And in this sporting attitude they are now matched by some female trophy hunters. But most young women strike me as sad, lonely, and confused” (#588).

They’re sad and lonely, Kass thinks, because they have lost something essential for women: modesty. “The supreme virtue of the virtuous woman was modesty, a form of sexual self-control, manifested not only in chastity but in decorous dress and manner, speech and deed, and in reticence in the display of her well-banked affections. A virtue, as it were, made for courtship, it served simultaneously (for a man) as a source of attraction and a spur to manly ardor, and (for a woman) as a guard against a woman’s own desires and as a defense against unworthy suitors. A fine woman understood that giving her body (in earlier times, even her kiss) meant giving her heart, which was too precious to be bestowed on anyone who would not prove himself worthy, at the very least by pledging himself in marriage to be her defender and lover forever. Once female modesty became a first casualty of the sexual revolution, even women eager for marriage lost their greatest power to hold and to discipline their prospective mates” (#640).

Years ago Kass and his late wife Amy, distressed by the myriads of failing marriages, began offering a seminar at the University of Chicago to focus on courting and marrying. It occurred to them that universities encouraged many kinds of studies, but rarely focused on the truly central “activities of everyday life” which deeply concerned earlier thinkers such as Aristotle (p. x). “Absent especially is the devoted search for moral wisdom regarding the conduct of life—philosophy’s original meaning and goal, and a central focus of all religious thought and practice—a search that takes help from wherever it may be found and that gives direction to a life seriously lived” (p. ix). Students warmly responded to the course, and the assigned readings have been collected into a sourcebook edited by the Kasses, Wing to Wing, Oar to Oar: Readings on Courting and Marrying (Notre Dame: University of Notre Dame Press, c. 2000). They sought to reverse their students’ apparent lack of interest in getting married and having children, for relatively few had thought seriously about the importance of sharing a lifetime with someone. Since the Kasses had found their marriage right at the heart of what makes life meaningful, they unapologetically took a “pro-marriage” stance and wondered why youngsters failed to crave such a good life! In part, they concluded, the demise of “courtship” helped explain it. As they define it, “courting” means “to pay amorous attention to, to woo, with a view of marriage” (p. 5).

Unfortunately, such “courting and marrying” have nearly disappeared in modern America. In part, they believed, the proliferation of “gender studies” and the influence of militant feminism have deliberately sought to “redefine and recreate the meaning of being man or woman,” alleging that “gender” is little more than a “cultural construction” subject to continuous change. An egalitarian ideology has subverted “the authority of religion, tradition, and custom within families, of husbands over wives and fathers over sons” (p. 13). Against such, the professors Kass urge us simply to study our navels, which unequivocally show we were born of a woman! “Moreover, absent a miracle, each of us owes our living existence to exactly one man and one woman—no more, no less, no other—and thus to one act of heterosexual union. This is no social construction, it is natural fact” (p. 7). So let’s be honest and talk about two sexes, not multiplied genders! Doing so leads us to wonder at the beauty of courtship and marriage.

Since publishing Wing to Wing, Oar to Oar, Kass thinks the “beauty of courtship and marriage” has further decayed. In Leading a Worthy Life he underscores his earlier concerns, setting forth “a partial list of the recent changes in our society and culture that hamper courtship and marriage: the sexual revolution, made possible especially by effective female contraception; the ideology of feminism and the changing educational and occupational status of women; the destigmatization of bastardy, divorce, infidelity, and abortion; the general erosion of shame and awe regarding sexual matters, exemplified most vividly in the ubiquitous and voyeuristic presentation of sexual activity in movies and on television; widespread morally neutral sex education in schools; the explosive increase in the numbers of young people whose parents have been divorced (and in those born out of wedlock who have never known their father); great increases in geographic mobility, with a resulting loosening of ties to place and extended family of origin; and, harder to describe precisely, a popular culture that celebrates youth and independence not as a transient stage en route to adulthood but as ‘the time of our lives,’ imitable at all ages, and an ethos that lacks transcendent aspirations and asks of us no devotion to family, God, or country, encouraging us simply to soak up the pleasures of the present. The change most immediately devastating to wooing is probably the sexual revolution” (#628).

Preeminent among the many harmful aspects of the sexual revolution is divorce, American style. “Countless students” have told Kass “that the divorce of their parents has been the most devastating and life-shaping event of their lives” (#685). Fearing long-term commitments, youngsters now choose to “live together,” getting to “know” each other without going through the process of dating, courtship and marriage. “But such arrangements,” Kass says, “even when they eventuate in matrimony, are, precisely because they are a trial, not a trial of marriage. Marriage is not something one tries on for size, and then decides whether to keep; it is rather something one decides with a promise, and then bends every effort to keep. Lacking the formalized and public ritual, and especially the vows or promises of permanence (or “commitment”) that subtly but surely shape all aspects of genuine marital life, cohabitation is an arrangement of convenience, with each partner taken on approval and returnable at will” (#698). Though often angry at their parents for divorcing, cohabiting couples that marry will likely follow their example! “Given that they have more or less drifted into marriage, it should come as no great surprise that couples who have lived together before marriage have a higher rate of divorce than those who have not” (#704).

Whether or not it is possible, Kass calls for a return to earlier models for courtship and marriage as the only practice suitable for our species. “Real reform in the direction of sanity would require a restoration of cultural gravity about sex, marriage, and the life cycle. The restigmatization of illegitimacy and promiscuity would help. A reversal of recent antinatalist prejudices, implicit in the practice of abortion, and a correction of current antigenerative sex education would also help, as would the revalorization of marriage as both a personal and a cultural ideal” (#917).

His commitment to revitalizing marriage is part of Kass’s broader concern for human dignity, something he has extensively dealt with in his bioethical writings, early evident in Life, Liberty, and the Defense of Dignity. Medical researchers, once committed to enabling patients to recover from diseases, now envision genetic manipulation and computer-chip implants which will improve human nature. “Human nature itself lies on the operating table, ready for alteration, for eugenic and neuropsychic ‘enhancement,’ for wholesale redesign. In leading laboratories, academic and industrial, new creators are confidently amassing their powers, while on the street their evangelists are zealously prophesying a posthuman future. For anyone who cares about preserving our humanity, the time has come to pay attention” (p. 4). Reminding us of C.S. Lewis’s warnings in The Abolition of Man and Aldous Huxley’s prophetic Brave New World (two books fundamental to his intellectual development), he worries that we “are not yet aware of the gravity” of powerful anti-human movements seeking to “transform” the natural world we’ve been given. Around the globe we see folks infatuated with utopian aspirations, enamored of technologies, all singing “loudly the Baconian anthem, ‘Conquer nature, relieve man’s estate’” (p. 4).

From his current vantage point, Kass says: “As I look back over the decades since I left the world of science to reflect on its human meaning, three distinct but related pursuits stand out: First, addressing the conceptual danger (stressed by Lewis) of a soulless science of life, I sought a more natural science, one that is truer to life as lived. Second, addressing the practical danger (stressed by Huxley) of dehumanization resulting from the relief of man’s estate and the sacrifice of the high to the urgent, I sought a richer picture of human dignity and human flourishing. And third, addressing the social and political dangers (stressed by Rousseau) of cultural decay and enfeeblement, I sought cultural teachings that could keep us strong in heart and soul, no less than in body and bank account” (Leading a Worthy Life, #5027).

We simply must think more deeply about such things, and such thinking must be philosophical rather than scientific, discerning and seeking to preserve human dignity. “Both historically and linguistically, ‘dignity’ has always implied something elevated, something deserving of respect. The central notion etymologically, in English as in the Latin root dignitas, is worthiness, elevation, honor, nobility – in brief, excellence or virtue. In all its meanings it has been a term of distinction; dignity is not something to be expected or found in every human being, like a nose or a navel” (#2914). Today, he warns, a “new field of ‘transhumanist’ science is rallying thought and research for the wholesale redesign of human nature, employing genetic and neurological engineering and man-machine hybrids, en route to what has been blithely called a ‘posthuman’ future” (#2825). What should most concern us is not merely technologies such as cloning but “the underlying scientific thought” that sustains them. During the past several centuries, biologists have “reconceived the nature of the organic body, representing it not as something animated, purposive and striving, but as dead matter-in-motion. This reductive science has given us enormous power, but it offers us no standards to guide its use. Worse, it challenges our self-understanding as creatures of dignity, rendering us incapable of recognizing dangers to our humanity that arise from the very triumphs biology has made. What is urgently needed is a richer, more natural biology and anthropology, one that does full justice to the meaning of our peculiarly human union of soul and body in which low neediness and divine-seeking aspiration are concretely joined” (p. 20).

To rediscover Aristotle and the Bible would significantly help us in this endeavor. The wisdom contained in such classic sources far surpasses the reductionistic and frequently irrational pronouncements being uttered by today’s scientists and politicians. So Genesis can tell us “what it means that the earth’s most godlike creature is a concretion combining ruddy earth and rousing breath” (p. 20). Should we be dissatisfied with the reigning mechanistic dogma we could turn to Goethe, “a connoisseur of morphology who . . . explored the immanent creative powers of life and who understood, perhaps better than anyone else, how the purposive yet innovative mind of man might both mirror and embody the purposiveness and creativity of nature itself. And hiding off-stage, but still accessible to us, is that first biologist of nature-in-its-ordinary-course, Aristotle, who emphasized questions of being over becoming, form over matter, purposiveness over moving causes, and wholes over parts; for whom the soul was not an ethereal spirit or a ghost-in-the-machine but an immanent and embodied principle of all vital activity; and for whom science was a refined and ever deepening reflection on the natures and the causes of the beings manifest to us in ordinary experience” (p. 294).

Only thereby will we recover our true sense of human dignity.

306 Assessing a Pontiff

Following his 2013 inauguration, Pope Francis enjoyed widespread adulation, especially in the more progressive circles of the Catholic Church.  The secular media too proved notably fawning inasmuch as he seemed open to modernity and willing to abandon some of the Church’s traditional positions—particularly regarding sexual behavior.  Thus actress Jane Fonda said:  “Gotta love new Pope.  He cares about the poor, hates dogma.”  Actress Salma Hayek, a supporter of abortion rights and gay marriage, asserted, “Pope Francis is the best pope that has ever existed.”  Fonda’s ex-husband, the late Tom Hayden, spoke for fellow 1960s radicals when he called the election of Pope Francis “the greatest moment in empowering spiritual progressives in decades.”  “Francis is on the side of liberation theology, working from within, towards his moment,” he wrote.  “His choice is more miraculous, if you will, than the rise of Barack Obama in 2008” (George Neumayr, Political Pope, p. 25). 

Within a few years, however, conservative Catholics grew concerned with Francis’s off-the-cuff comments, spontaneous phone calls, interviews, and publications.  To understand such concerns, one of the best recent studies is To Change the Church:  Pope Francis and the Future of Catholicism  (New York:  Simon & Schuster; Kindle Edition, c. 2018), by Ross Douthat, a columnist for the New York Times.  “This is a book,” he says, “about the most important religious story of our time: the fate of the world’s largest religious institution under a pope who believes that Catholicism can change in ways that his predecessors rejected, and who faces resistance from Catholics who believe the changes he seeks risk breaking faith with Jesus Christ” (#32).  Douthat thinks we face “a hinge moment in the history of Catholicism, a period of theological crisis that’s larger than just the Francis pontificate but whose particular peak under this pope will be remembered, studied, and argued over for as long as the Catholic Church endures—and, if Catholics are right about their church, for as long as this world endures as well” (#113).  

During the past two centuries, liberalizing currents have transformed Western Civilization by reshaping its political, economic, and cultural institutions, molded by an ever-changing “adaptationist, evolutionist spirit.”  Protestant Liberals and Catholic Modernists have insisted Christians should “adapt or die” and proposed a multitude of changes designed to update the ancient faith by constantly revising and transforming its doctrines, making them “into the equivalent of a party’s platform or a republic’s constitution—which is to say, binding for the moment but constantly open to revision based on democratic debate” (p. 10).  It’s no accident that the most liberal segments within the Church are from Germany, still influenced by the Hegelian notion “that God’s revelation was perpetually unfolding in history, and that therefore it was a mistake to consider Catholicism a closed system in which questions were settled permanently. The liberal Protestant line, ‘Never put a period where God has put a comma,’ was the basic presupposition for this liberal Catholicism as well.  Nothing, save Christ’s divinity and not necessarily even that, could be closed to debate, and the message that the church was called to preach in one era might be very different in the next” (pp. 151-152).  As the decades rolled by, virtually all the popes (most notably Pius IX and Benedict XV) battled such currents, determined to conserve the “faith once delivered to the saints.”  But there has always been a determined faction within the Church working to modernize her—to essentially embrace the Liberalism emergent in mainline Protestant denominations.  To them, invoking “the Spirit of Vatican II,” getting a progressive pope would at last bring the Church into the modern age.  And with the current Pope Francis they may well have found, Douthat believes, the right man “to change the church.”  

Devotees of changing the Church constantly invoke the “spirit of Vatican II.”  Since both John Paul II and Benedict XVI sought to diplomatically implement their understanding of the council’s declarations, they avoided waging “a comprehensive war on modernism,” preferring to exhort the faithful and modestly reform the bureaucracy.  But liberals, waiting in the wings, saw their “era as a kind of temporary conservative coup, in which they had lost the levers of power but hadn’t lost anything permanent. After all, what one coup could accomplish another could eventually undo” (p. 25).  They blithely ignored the obvious:  “In the heartland of ‘spirit of Vatican II’ Catholicism, the Northern European nations whose theologians contributed so much to the council’s liberal voice, the church’s collapse was swift, steep, and stunning” (p. 27).  Scheming to overhaul the Church in modernist ways, they’ve engineered her suicide!  

Their opportunity came when Pope Benedict XVI announced his resignation in 2013.  “That night, by interesting coincidence, a bolt of lightning struck the Vatican” (p. 45).  A small group of progressive German and Belgian clerics (known as the “St. Gallen mafia”) looked for a candidate who would be amenable to their designs.  They found “a plausible candidate” in Jorge Bergoglio, the Jesuit archbishop of Buenos Aires.  Coming of age when liberation theology infatuated many Latin Americans, Bergoglio seemed to share the cabal’s commitment to “a synthesis between gospel faith and political activism, with Jesus’s Sermon on the Mount as a blueprint for social revolution instead of Das Kapital.”  As pope, Francis granted Gustavo Gutiérrez, “one of the godfathers of liberation theology,” a private audience and later featured him “as a key speaker at a Vatican event” (p. 68).  Alleviating poverty, not salvation from sin, became the Gospel!  Unlike Benedict XVI, Bergoglio was an activist.  “Hagan lío!” he liked to say to young people—a colloquial phrase translated as “Shake things up!” or “Make noise!” or “Make a mess!” or even “Raise hell!”  His St. Gallen supporters saw in him “hints of their own worldview in his focus on poverty and social justice, his seeming weariness with certain culture war battles, and his decentralizing instincts” (p. 60).

Following his election, Bergoglio certainly sought to “shake things up.”  He discarded some of the dress, residence, formality and symbolism of the papacy, portraying himself as an accessible, transparent man of the people.   He also sought to transform the image of the Church, frequently saying “that he wanted the church to resemble ‘a field hospital after battle,’ in which the most important thing is to bind up open wounds, to use mercy as a medicine, before offering the patient a meticulous blueprint for full health” (p. 66).  Such mercy was especially needed, he thought, in dealing with sex and marriage, as was evident in the prominence he gave to one of the St. Gallen mafia, Cardinal Walter Kasper, “Ratzinger’s old intellectual rival,” who had “recently written a book on the theology of mercy.”  Though Kasper reinforced many of the Church’s traditional teachings, “in the crucial passages” he proposed a novel “penitential path,” opening the way for “divorced and remarried Catholics to receive communion” (p. 82). 

To consider this issue, Pope Francis called for a synod on “the vocation and mission of the family in the Church and in the contemporary world.”  A preliminary, “extraordinary” synod met in 2014 beginning with an address by Cardinal Kasper.  Discussions ensued which were summarized in an unofficial relatio that gained considerable attention.  “Gone was the language of mortal sin and moral absolutes; gone were phrases like ‘adultery’ and ‘living in sin.’  It seemed to suggest that “the twenty-first-century church would recognize and celebrate the virtues of second marriages and second unions and cohabitation even as it continued to teach that they fell short of Catholicism’s official marital ideal” (p. 107).  The 2015 Ordinary Synod considered the positions advanced in the earlier synod, leading to the publication of the pope’s official teaching in Amoris Laetitia, “The Joy of Love.”  It generally reasserted traditional Church teaching, but   in the section devoted to “Catholics in irregular relationships” it was not at all clear where the pope stood. 

What was clear was Francis’s apparently intentional ambiguity.  Whereas John Paul II had strongly opposed situational ethics, Francis seemed to invoke hard cases calling for relaxation of behavioral rules.  So he “piled up lists of mitigating factors that could make an apparent mortal sin less serious.  Where John Paul II had insisted that even in difficult circumstances the moral law is never impossible to follow, Francis discussed all the ways in which family turmoil and personal psychology and the exigencies of modern life could make the moral law seem either too hard to comprehend or too difficult to obey.”  Turning from John Paul’s Veritatis Splendor to Francis’s Amoris Laetitia, one could conclude “that Francis wasn’t so much developing John Paul’s thought as arguing with it” (p. 130).  

Apparently Francis is reviving the “contextual” or “situation ethics” that John Paul II so sharply repudiated.  (Illustrating this is a Twitter comment posted by one of the pope’s closest advisors, the Jesuit Antonio Spadaro, who declared:  “Theology is no Mathematics.  2 + 2 in Theology can make 5.  Because it has to do with God and real life of people.”)  It all depends!  Who’s to judge?  The controversial sections in Francis’s papal exhortation sometimes read a bit like the pop psychology of the ‘60s.  “Situation ethics is back,” says Thomas Pauken, author of The Thirty Years War.  “Francis was infected by the virus of 1960s liberalism.”  Thus, almost immediately, bishops around the world interpreted the document differently.  Some, such as Robert McElroy in San Diego, authorized remarried Catholics to take communion, leaving the decision in the hands of individuals or priests.  Other bishops invited to communion any remarried Catholics who felt “‘at peace with God’” (p. 136).  More conservative bishops, such as Charles Chaput in Philadelphia, staunchly upheld the Church’s traditional ban.  Given such uncertainty, four prominent cardinals wrote the pope a private letter asking for clarification.  Their letter posed four dubia—simple questions—primarily asking “whether Veritatis Splendor’s declaration that ‘circumstances or intentions can never transform an act intrinsically evil . . . into an act “subjectively” good’ had been superseded, and whether the church now taught, as it had not before, that individual consciences could discern ‘legitimate exceptions to absolute moral norms’” (p. 143). 

Such sharp disagreements regarding marriage and divorce reveal deep fissures within the Church under Pope Francis.  Whether ruptures or schisms develop, only time will tell.  “Other communions have divided very recently over precisely the issues that the pope has pressed to the front of Catholic debates.  And for good reason:  Because these issues, while superficially ‘just’ about sexuality or church discipline, actually cut very deep—to the very bones of Christianity, the very words of Jesus Christ” (p. 182).  The pope is also working to permanently establish his progressive position by eliminating conservatives who dare oppose him.  Thus cardinals George Pell, Gerhard Müller, Robert Sarah and Raymond Burke have been demoted and shunted aside, to be replaced by Francis’s devotees.  

In multifarious ways Francis has indeed changed the Church—most probably for the worse.  Douthat concludes his treatise with a statement from a “sympathetic papal biography” detailing Bergoglio’s Argentine years.  A Jesuit quoted in the book said:  “As provincial [of the Society of Jesus] he generated divided loyalties:  some groups almost worshipped him, while others would have nothing to do with him, and he would hardly speak to them.  It was an absurd situation.  He is well-trained and very capable, but is surrounded by this personality cult which is extremely divisive.  He has an aura of spirituality which he uses to obtain power.  It will be a catastrophe for the Church to have someone like him in the Apostolic See.  He left the Society of Jesus in Argentina destroyed with Jesuits divided and institutions destroyed and financially broken.  We have spent two decades trying to fix the chaos that the man left us.  ‘Hagan lío!’ Francis likes to say. ‘Make a mess!’ In that much he has succeeded” (pp. 207-208).

* * * * * * * * * * * * * * * * * * * * * * * * * 

In Lost Shepherd: How Pope Francis is Misleading His Flock (Washington:  Regnery Publishing, Kindle Edition, c. 2018) Philip Lawler, a seasoned Catholic journalist specializing in Church affairs, duplicates much of the material found in Douthat’s To Change the Church.  But he is more personally distraught, for “every day (I am exaggerating, but only slightly), the pope issues another reminder that he does not approve of Catholics like me” (p. 43).  Though Lawler was initially excited by the prospect of a fresh face in the Vatican, it quickly became evident to him that Francis was a divisive figure.  This is due to both his “autocratic style of governance and the radical nature of the program that he is relentlessly advancing” (#57).  

The very name Bergoglio chose—Francis—suggested his intent to differentiate himself from his predecessors.  Ironically, for a Jesuit, he chose a name best associated with the beloved Francis of Assisi, famed for his “commitment to simplicity, humility, and wholehearted love for all of God’s creation.  At the same time, it called to mind the message that the great saint had received from God in the church of San Damiano:  ‘Francis, go, rebuild my house, which as you see is in ruins.’”  For more than 1000 years popes had selected names used by prior popes.  “So when he chose an entirely new name, Pope Francis indicated that he was prepared to strike out in a new direction” (#262).  Determined to discard some of the regalia associated with the papacy, when “an aide tried to place the traditional mozzetta across his shoulders before his first appearance on the loggia of St. Peter’s, Francis brushed him away testily, declaring that ‘the carnival is over’” (#301).  As such “brash,” ad lib comments proliferated it was initially difficult to discern precisely who he was or what he stood for.  

He publicly presents himself as “kind, soft-spoken, avuncular, uniting rather than dividing. Yet even a cursory reading of the pope’s daily homilies reveals harsh rhetoric, stinging rebukes, and angry denunciations such as we have not heard from a Roman pontiff for generations” (#2578).  He routinely assails “the ‘Pharisees,’ the ‘doctors of the law,’ and all who were ‘rigid’ in their interpretation of Church teaching. In language that no one expected from a Roman pontiff, he denounced the ‘careerist bishop,’ the ‘sourpuss,’ the ‘smarmy, idolater priest,’ the ‘moralistic quibbler,’ and the ‘people without light: real downers.’  Some members of the flock, it became clear, particularly get under the papal skin—the ‘starched Christian,’ the ‘bubble Christian,’ the ‘long-faced, mournful funeral Christian,’ and the ‘parrot Christian.’  In a particularly vivid rebuke, he accused journalists who report on conflicts and scandals of ‘coprophilia’ (an ‘abnormal interest in fecal matter’).  Rarely did the pope identify the objects of his ire by name, but from the frequency of his attacks on ‘rigid’ Christians, it seemed clear that he was talking about those who did not accept his calls for change in the Church” (#2547).  Though many of the cardinals who elected him thought he would initiate needed administrative reforms—e.g. ending the “Vatileaks,” cleaning up the Vatican’s financial problems, confronting the sexual-abuse scandals—Francis did little more than denounce evil-doers.  His words are often impassioned, garnering favorable media coverage, but little action follows.   Indeed he seems to promote and maintain in office some of the very clerics (so long as they are his allies) associated with financial corruption and sexual abuse!  

And as he settled into his office “a pattern emerged of support for causes usually associated with the political Left—environmentalism, disarmament, unrestricted immigration, income redistribution” (#359).  “The Vatican began to organize conferences on immigration reform and climate change.  Twice Francis hosted meetings of ‘popular movements,’ with invitations going out to environmental activists, ethnic separatists, militant feminists, and community organizers—but not to pro-life leaders or defenders of traditional marriage” (#2764).  Throughout the synods on the family, Francis seemed (by virtue of the speakers he chose and the critics he dismissed) to support allowing divorced Catholics to receive communion—though as ever it was hard to know precisely where he stood.  Nevertheless, when bishops in Buenos Aires embraced the liberal position, the pope sent them a private letter, congratulating “his countrymen on their interpretation of his apostolic exhortation, writing that it ‘fully captures the meaning’ of his work. ‘There are no other interpretations,’ he added” (#2071).

Just as there are “no other interpretations” allowed, so too Francis clearly seeks to permanently change the Church.  He has surrounded himself with liberal Jesuits such as Antonio Spadaro and often meets with Adolfo Nicolás, who was for many years the superior general of the Society of Jesus.  Much of his agenda seems designed to placate “the solidly left-leaning majority” in his order.  “For a pope bent on change, the Jesuits would be a bulwark. And Francis was bent on change” (#2535).

* * * * * * * * * * * * * * * * * * * * * * * * * * * 

A far more critical treatise is George Neumayr’s The Political Pope: How Pope Francis Is Delighting the Liberal Left and Abandoning Conservatives (New York:  Center Street, c. 2017).  “The election of Jorge Bergoglio,” he says, “marked the culmination of the left’s long march through the Church.”  Determined to liberalize the Church, Catholic Leftists are generally labeled Modernists.  A century ago Pope Pius X (in his 1907 encyclical Pascendi Dominici Gregis) branded their position heretical, warning that they wished to retool the faith, making it suitable “to the times in which we live” rather than faithful to historical orthodoxy.  “He foresaw a Church that would chase after elite fads, defer to the spurious claims of modern science, bow down to the secularism of the state, treat all religions as equal, cast Jesus Christ as a mere human political activist, reduce priests to social workers, and Protestantize its worship and doctrine” (p. 41).  His fears seem to have come to pass, for they are now manifest in Pope Francis.  

Neumayr begins by reminding readers that both popes John Paul II and Benedict XVI strongly opposed “‘liberation theology,’ a Marxist-inspired ideology disguised as concern for the poor that the Soviet Union’s KGB spies had helped smuggle into Latin America’s Catholic Church in the 1950s.  The movement was born in the KGB, and it had a KGB-invented name:  ‘liberation theology,’ according to Ion Mihai Pacepa, who served as a spymaster for Romania’s secret police in the 1950s and 1960s” (p. 1).  Having suffered Communist oppression in Poland, John Paul II had no illusions regarding its toxicity.  Thus one of his first significant gestures took place when he met Latin American clerics, many of them favoring liberation theology, in Nicaragua in 1983.  Seeing Ernesto Cardenal, “a Catholic priest turned Marxist activist” who had violated his vows by joining the “Sandinista government in Nicaragua,” the pope rebuked him, saying:  “‘You must straighten out your position with the Church’” (p. 1).  

Thirty years later, shortly before leaving office in 2013, Pope Benedict XVI warned against “the destructive liberalism that spread within the Church after the council of Vatican II.”  He lamented liberalism’s impact, producing “‘so many problems, so much misery, in reality:  seminaries closed, convents closed, the liturgy was trivialized’” (p. 14).  He could hardly have anticipated that his successor would, in fact, champion “the very liberal Church he feared” and embody “the very ‘hermeneutic of politics’ he decried” (p. 14).  Thus the current pope—an Argentinian favoring socialism who has declared that “inequality is the root of all evil”—“has generated headlines not for scolding Marxists but for supporting them, not for rebuking liberation theologians but for honoring them.”  He has named “an open socialist,” a Honduran, chairman of the Council of Cardinals.  Francis has said “that liberation theologians have a ‘high concept of humanity’” and publicly praised radicals such as Brazil’s Leonardo Boff and “the founding father of liberation theology, the Peruvian priest Gustavo Gutiérrez” (p. 3).  When the Jesuits selected their general superior in 2016, they chose (with Pope Francis’s blessing) “a Venezuelan, Fr. Arturo Sosa, whose communist sympathies have long been known” and who endorses a “Marxist mediation of the Christian Faith” focused on transforming “the capitalist society into a socialist society” (p. 62).  

In short:  Pope Francis “has emerged as one of the most political popes in the history of the Church.  His left-wing activism is relentless, ranging across causes from the promotion of global warming theory to support for amnesty and open borders to the abolition of lifetime imprisonment.”  And “he is not only championing the radical political agenda of the global left but also subverting centuries-old Catholic teaching on faith and morals” (p. 6).