The Real Anthony Fauci

Dr. Anthony Fauci enjoys an exalted reputation throughout many echelons of society.  His charisma and confidence, omnipresent on TV throughout the COVID-19 pandemic, made him a trusted face for “science.”  There are Tony Fauci fan clubs and donuts, an “I heart Fauci” throw pillow, an “In Fauci We Trust” coffee mug, a “Honk for Dr. Fauci” yard sign, and even a Dr. Fauci prayer candle!  Daring to question Fauci can get one removed from Twitter or Facebook, and books questioning his probity go unlisted by media giants such as Amazon.  Fortunately for Fauci, a fawning media have largely shielded him from serious critiques, dismissing them as conspiracy theories concocted by devious cranks and misfits.  Consequently, we have a thoroughly misinformed public, leaving typical Democrats, for example, thinking that half of those infected with COVID were hospitalized, whereas the real number was less than one percent!

A medical doctor who’s never practiced medicine, Fauci began working in 1968 for the National Institute of Allergy and Infectious Diseases (NIAID, a department within the National Institutes of Health) and ultimately became its director.  He is now the most senior and highest-paid ($417,608 annually) bureaucrat in the federal system.  A consummate manipulator, he’s created a scientific and financial empire that controls an enormous segment of public health policy.  Fauci’s success is rooted in his skillfully snaring federal grants for research in the early 1980s.  Under his guidance, NIAID funding for AIDS mushroomed from $297,000 in 1982 to half a billion dollars by 1992!  Virtually all this money was poured into seeking a vaccine to prevent the HIV virus from developing into full-blown AIDS.  (Such a vaccine has never been found, and many scientists now think HIV is simply present in some AIDS patients without being the actual cause of the disease.)  In time Fauci would become a trusted advisor to presidents and ultimately emerge during the COVID-19 epidemic as the face of the federal government.

Countering all the acclaim Fauci has garnered, Robert F. Kennedy, Jr. insists we should know The Real Anthony Fauci:  Bill Gates, Big Pharma, and the Global War on Democracy and Public Health (New York:  Skyhorse Publishing, c. 2021; Kindle Edition).  In contrast to Fauci’s public persona, Kennedy finds a devious Machiavellian who has orchestrated harmful policies while enriching himself and his loyal cadres.  Sharing this view, Nobel laureate for medicine Luc Montagnier says:  “Dr. Joseph Goebbels wrote that ‘A lie told once remains a lie, but a lie told a thousand times becomes the truth.’  Tragically for humanity, there are many, many untruths emanating from Fauci and his minions.  RFK Jr. exposes the decades of lies.”  Throughout this book it’s clear that Kennedy is writing as a prosecutor rather than an even-handed historian, but the evidence he presents (if true) is truly damning, for the damage done to the world by one man is virtually beyond belief.  “The disturbing story” Kennedy tells “has never been told, and many in power have worked hard to prevent the public from learning it.  The main character is Anthony Fauci” (p. 26).

Robert Kennedy Jr. is himself a lifelong Democrat, a member of perhaps the most prominent and powerful political family in recent American history.  He personally knows scores of politicians and bureaucrats, including Anthony Fauci.  His uncle “Ted” Kennedy helped write many of the regulatory laws overseeing federal departments that now serve as “sock-puppets” for the industries they’re supposed to supervise.  RFK Jr. devoted much of his life to environmental and public health issues, serving as president of Waterkeepers and fighting “Big Oil, King Coal” and assorted corporate polluters.  While so doing, he enjoyed the favor and support of leftists in his party and the media.  He “published regularly in the New York Times and all the major papers” and magazines.  He gave speeches all over the country, deriving a significant income thereby.  But:  “All that changed in 2005, after I published an article, ‘Deadly Immunity,’ about corruption in CDC’s vaccine branch” (p. 327).

He’d dared enter the debate over vaccines, and he soon discovered that “Big Pharma” essentially controlled the nation’s health agencies through incestuous, corrupt schemes.  Federal agencies now receive billions of taxpayer dollars to research diseases.  The researchers getting the money, working in universities or government-supported labs, then propose medicines (most likely vaccines) for the diseases.  Promising meds are then patented by the researchers, who profit handsomely as the public pays for their use.  “The CDC [Centers for Disease Control], for example, owns 57 vaccine patents and spends $4.9 billion of its $12.0 billion annual budget (as of 2019) buying and distributing vaccines.  NIH [National Institutes of Health] owns hundreds of vaccine patents and often profits royally from the sale of products it supposedly regulates.  High level officials, including Dr. Fauci, receive yearly emoluments of up to $150,000 in royalty payments on products that they help develop and then usher through the approval process” (p. 26).

So Kennedy “wrote this book to help Americans—and citizens across the globe—understand the historical underpinnings of the bewildering cataclysm that began in 2020.  In that single annus horribilis, liberal democracy effectively collapsed worldwide.  The very governmental health regulators, social media eminences, and media companies that idealistic populations relied upon as champions of freedom, health, democracy, civil rights, and evidence-based public policy seemed to collectively pivot in a lockstep assault against free speech and personal freedoms.  Suddenly, those trusted institutions seemed to be acting in concert to generate fear, promote obedience, discourage critical thinking, and herd seven billion people to march to a single tune, culminating in mass public health experiments with a novel, shoddily tested and improperly licensed technology so risky that manufacturers refused to produce it unless every government on Earth shielded them from liability” (p. 23).  

Honestly confronting what happened during the past two years should shock us.  Fauci’s policies—masks, quarantines, lockdowns, and social distancing—were all prescribed, if not mandated, and none of them were ever justified by peer-reviewed studies showing their effectiveness.  On the other hand, Fauci et al. discouraged and actually sought to ban doctors from using demonstrably effective therapeutics, resulting “in by far the most deaths, and one of the highest percentage COVID-19 body counts of any nation on the planet.”  Consequently, the United States, “with 4 percent of the world’s population, suffered 14.5 percent of total COVID deaths” (p. 30).  Just recall, for example, the deafening drumbeats demanding everyone wear masks, and then realize:  “There is no well-constructed study that persuasively suggests masks have convincing efficacy against COVID-19 that would justify accepting the harms associated with masks.”  Indeed, “retrospective studies on Dr. Fauci’s mask mandates confirm that they were bootless. ‘Regional analysis in the United States does not show that [mask] mandates had any effect on case rates, despite 93 percent compliance.  Moreover, according to CDC data, 85 percent of people who contracted COVID-19 reported wearing a mask’” (p. 54).  Nor did lockdowns reduce infection rates.  In fact, the less a country followed Fauci’s prescriptions the better it fared!  Kenya, for example, suffered only 97 deaths per 1,000,000 residents whereas the United States suffered 2,107!  Quarantining the healthy has never dealt effectively with infectious diseases and was consequently discouraged by the WHO [World Health Organization], an arm of the United Nations.  That Fauci encouraged it suggests he is both scientifically and morally obtuse. 

Perhaps the most damning accusation Kennedy levels at Fauci’s role in the COVID-19 pandemic is this:  “Medicines were available against COVID—inexpensive, safe medicines—that would have prevented hundreds of thousands of hospitalizations and saved as many lives if only we’d used them in this country.  But Dr. Fauci and his Pharma collaborators deliberately suppressed those treatments in service to their single-minded objective—making America await salvation from their novel, multi-billion dollar vaccines” (p. 52).  One of the nation’s premier epidemiologists, Yale’s Harvey Risch, the author of over 350 peer-reviewed publications, insists you should “‘quarantine and treat the sick, protect the most vulnerable, and aggressively develop repurposed therapeutic drugs, and use early treatment protocols to avoid hospitalizations.’”  Had we aggressively used therapeutic drugs and implemented sensible policies, we “could have easily defanged COVID-19 so that it was less lethal than a seasonal flu.  We could have done this very quickly.  We could have saved hundreds of thousands of lives’” (p. 63).  

Agreeing with Risch, Dr. Peter McCullough (an internist and cardiologist who has published 600 peer-reviewed articles, making him the most published physician in history in the field of kidney disease related to heart disease) says the pandemic should have been handled quite differently.  “‘We could have dramatically reduced COVID fatalities and hospitalizations,’” he says, by “‘using early treatment protocols and repurposed drugs including ivermectin and hydroxychloroquine and many, many others.’”  He has successfully treated 2,000 COVID patients and points out that hundreds of peer-reviewed studies now show that early treatment could have averted some 80 percent of deaths attributed to COVID.  “‘The strategy from the outset should have been implementing protocols to stop hospitalizations through early treatment of Americans who tested positive for COVID but were still asymptomatic.  If we had done that, we could have pushed case fatality rates below those we see with seasonal flu, and ended the bottlenecks in our hospitals.  We should have rapidly deployed off-the-shelf medications with proven safety records and subjected them to rigorous risk/benefit decision-making,’ McCullough continues.  ‘Using repurposed drugs, we could have ended this pandemic by May 2020 and saved 500,000 American lives, but for Dr. Fauci’s hard-headed, tunnel vision on new vaccines and remdesivir’” (p. 65).  Such physicians are Kennedy’s heroes!  More than merely seeking to find the truth, McCullough has spearheaded a “worldwide network of front-line physicians using repurposed drugs to save lives around the globe” (p. 83).

Rather than working with scholarly doctors such as Risch and McCullough, the federal bureaucracy sought to suppress their views.  Research universities, having received millions of dollars from the NIH and Fauci network, toed the line he drew, refusing to do anything but search for a magical vaccine.  Says McCullough:  “‘Not a single medical center set up even a tent to try to treat patients and prevent hospitalization and death.  There wasn’t an ounce of original research coming out of America available to fight COVID—other than vaccines’” (p. 79).  Sadly enough, some of this nation’s most distinguished doctors “believe that Dr. Fauci’s suppression of early treatment and off-patent remedies was responsible for up to 80 percent of the deaths attributed to COVID.   . . . .   The relentless malpractice of deliberately withholding early effective COVID treatments, of forcing the use of toxic remdesivir, may have unnecessarily killed up to 500,000 Americans in hospitals” (p. 80).   

Assessing all this, Kennedy declares:  “There is no other aspect of the COVID crisis that more clearly reveals the malicious intentions of a powerful vaccine cartel—led by Dr. Fauci and Bill Gates—to prolong the pandemic and amplify its mortal effects in order to promote their mischievous inoculations.  From the outset, hydroxychloroquine (HCQ) and other therapeutics posed an existential threat to Dr. Fauci and Bill Gates’ $48 billion COVID vaccine project, and particularly to their vanity drug remdesivir, in which Gates has a large stake.”  If drugs like hydroxychloroquine and ivermectin (which had long enjoyed the approval of the FDA and were both inexpensive and widely available) were to prove effective, the “pharmaceutical companies would no longer be legally allowed to fast-track their billion-dollar vaccines to market under Emergency Use Authorization.  Instead, vaccines would have to endure the years-long delays that have always accompanied methodical safety and efficacy testing, and that would mean less profits, more uncertainty, longer runways to market, and a disappointing end to the lucrative COVID-19 vaccine gold rush.  Dr. Fauci has invested $6 billion in taxpayer lucre in the Moderna vaccine alone.  His agency is co-owner of the patent and stands to collect a fortune in royalties.  At least four of Fauci’s hand-picked deputies are in line to collect royalties of $150,000/year based on Moderna’s success, and that’s on top of the salaries already paid by the American public” (pp. 84-85).

Kennedy’s dark suspicions regarding Fauci are fueled by his examination of Fauci’s treatment of the AIDS crisis in the 1980s.  Thus he devotes several lengthy chapters of the book to detailing and condemning what Fauci did by using his bully pulpit to “terrify millions into wrongly believing they were at risk of getting AIDS when they were not” and thereby getting lavish federal funding for his agency.  He “perfected his special style of ad-fear-tising, using remote, unlikely, farfetched and improbable possibilities to frighten people” (p. 307).  His “first instinct as national AIDS czar had been to stoke contagion terror,” and he wrote an article in 1983 “warning that AIDS could spread by casual contact.”  Though the disease was largely restricted to intravenous drug users and male homosexuals, Fauci deviously declared that anyone, through even casual contact, could catch the disease.  He and his close colleague, Dr. Robert Gallo, focused early on HIV (human immunodeficiency virus) as the “sole cause of AIDS,” but despite enormous expenditures and clinical trials “no one has been able to point to a study that demonstrates their hypothesis using accepted scientific proofs” (p. 401).  In fact, not all AIDS patients have the HIV virus, and millions of folks have HIV but never contract AIDS.  It seems to be yet another fallacious example of confusing correlation with causation!

Taking over the HIV program from the National Cancer Institute, Fauci quickly began promoting AZT, an anti-cancer drug that he thought might successfully treat AIDS.  But AZT itself had proven to be horrendously toxic!  Arguably “the world’s most accomplished and insightful retrovirologist,” Dr. Peter Duesberg, actually accused “Dr. Fauci of committing mass murder with AZT, the deadly chemical concoction that according to Duesberg causes—and never cures—the constellations of immune suppression that we now call ‘AIDS’” (p. 403).  But Fauci zeroed in on AZT as the magic bullet, and for 36 years he funneled all federal grants to researchers vainly determined to make HIV the culprit causing AIDS.  His grants helped fuel the panic regarding HIV-induced AIDS in Africa.  Scary news stories said half of the adults in some African nations were infected with HIV and would soon suffer massive depopulation.  None of these doomsday predictions came true, “and most HIV-infected Africans showed no sign of illness” (p. 415).  But millions of dollars were raised and sent to African research centers, and thousands of Africans suffered in clinical trials allegedly designed to stop the AIDS “epidemic.”  Numerous scholars now believe the African epidemic was simply fabricated!  This sorry story—and Bill Gates’ involvement in it—deserves a fuller discussion than I can afford here, but it should be widely shared!

Fauci also disregarded what seemed to be effective treatments developed by physicians treating AIDS patients with off-the-shelf “therapeutic drugs that seemed effective against the constellation of symptoms that actually killed and tormented people with AIDS” (p. 345).  Despite their successes, Fauci “refused to test any of those repurposed drugs, which had older or expired patents and no Pharma patrons.  . . . .  Big Pharma and its PIs were loath to test any drug with patents they didn’t control” (p. 346).  Sadly, “One of NCI’s top virologists, Dr. Frank Ruscetti . . . recalls of that era, ‘We could have saved millions of lives with repurposed and therapeutic drugs.  But there’s no profit in it.  It’s all got to be about newly patented antivirals and their mischievous vaccines’” (p. 346).  Employing phrases quite familiar to us 40 years later, Fauci blithely asserted “he simply could not recommend a drug until he saw ‘randomized, blinded, placebo-controlled trial’ results.  That was the ‘gold standard,’ he said.  It would be that, or nothing.  When they asked him, ‘Why not?’ he shouted, ‘There’s no data!’” (p. 353).  He has always preferred quantifiable models to practicing physicians’ demonstrations.

Fauci flourished as he perfected the practice of using academic physicians and researchers whose careers depend upon federal grants.  These Principal Investigators (PIs) do the research and clinical trials necessary for licensing new drugs.  “Thanks to NIH’s largesse, and to NIAID in particular, a relatively tiny network of PIs—a few hundred—determines the content and direction of virtually all America’s biomedical research” (p. 315).  And they dare not displease Anthony Fauci!  Still more:  these PIs get lucrative grants and “legalized bribes extended through honoraria, expert witness fees, speaking gigs, and first-class travel to exclusive resorts for conferences” (p. 326).  So they are quite subservient to what Kennedy calls “Big Pharma.”  Fauci’s critics, including Dr. Duesberg, “charge that by stifling debate and dissent, Dr. Fauci milled public fear into multi-billion-dollar profits for his Pharma partners while expanding his own powers and authoritarian control” (p. 405).

Kennedy demonstrates the incestuous nature of all this by telling how PIs worked to develop a vaccine to control HPV (human papilloma virus) and prevent cervical cancer.  In 2006 a distinguished federal panel (ACIP, the Advisory Committee on Immunization Practices) recommended all girls aged nine through twenty-six get the vaccine.  Merck (the pharmaceutical company promoting the shots) acknowledged it had not thoroughly tested the product, “so no one could scientifically predict if the vaccines would avert more injuries or cancers than they would cause.”  But Gardasil was approved and became “the most expensive vaccine in history, costing patients $420 for the three-jab series and generating revenues of over $1 billion annually for Merck.  That year, nine of the thirteen ACIP panel members and their institutions collectively received over $1.6 billion of grant money from NIH and NIAID” (p. 317).

In this long (900+ pages) and meticulously documented treatise, Kennedy provides much to ponder!  This is far more than a book about the coronavirus or Anthony Fauci.  It reveals a great deal about our public health and the agencies designed to promote it.  Prompting us to take action, RFK says:  “We can bow down and comply—take the jabs, wear the face coverings, show our digital passports on demand, submit to the tests, and salute our minders in the Bio-surveillance State.  Or we can say No.  We have a choice, and it is not too late.  COVID-19 is not the problem; it is a problem, one largely solvable with early treatments that are safe, effective, and inexpensive.  The problem is endemic corruption in the medical-industrial complex, currently supported at every turn by mass-media companies.  This cartel’s coup d’etat has already siphoned billions from taxpayers, already vacuumed up trillions from the global middle class, and created the excuse for massive propaganda, censorship, and control worldwide.  Along with its captured regulators, this cartel has ushered in the global war on freedom and democracy” (p. 925).  It’s time for us to act both wisely and courageously!  

I’ve focused on Robert Kennedy’s book because it presents the same evidence and comes to much the same conclusions as several other books I’ve read.  Charles Ortleb, determined to reveal Fauci’s disreputable conduct in dealing with AIDS, wrote Fauci:  The Bernie Madoff of Science and the HIV Ponzi Scheme that Concealed the Chronic Fatigue Syndrome Epidemic (HHV-6 University Press, c. 2021; Kindle Edition).  As the editor of a gay newspaper, the New York Native, Ortleb writes with an outrage fueled by his belief that Fauci gravely mishandled his task and bears responsibility for the suffering and deaths of thousands of AIDS victims.  He has concluded “that the similarities between AIDS science and Nazi science are too obvious for people of conscience to ignore” (p. 8).  Focusing on other Fauci misdeeds, Steve Deace’s Faucian Bargain:  The Most Powerful and Dangerous Bureaucrat in American History (New York:  Post Hill Press, c. 2021; Kindle Edition) delineates the bureaucratic dexterity with which Fauci has harmed the country.

Benjamin Franklin once said:  “Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.”  So Deace says:   “COVID-19 has taught us this history lesson in the harshest of terms.” We did what we were told and watched millions die anyway.  “Lockdowns didn’t work, but they kill.”  We found a bit of momentary safety and lost our liberty.   

# # # 


In 1783, reflecting on his service as commander-in-chief of the colonial armies, George Washington said:  “The establishment of Civil and Religious Liberty was the Motive which induced me to the Field, the object is obtained, and it now remains to be my earnest wish and prayer, that the Citizens of the United States would make a wise and virtuous use of the blessings, placed before them.”  His ideas of freedom would soon be inscribed in the first 10 amendments to the United States Constitution, guaranteeing citizens important rights vis-a-vis their government.  Today, however, when many of those freedoms seem imperiled, we should especially note threats to the very first item therein listed—freedom of religion:  “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof.”

Contrary to those who envision that amendment as a “wall of separation” between church and state, Kenneth Starr insists it actually erected a “‘wall of protection’ so that faith communities can freely chart their own course without disrupting significant public interests” (p. 13).  So he has written Religious Liberty in Crisis:  Exercising Your Faith in an Age of Uncertainty (New York:  Encounter Books, c. 2021, Kindle Edition).  No one could more adequately deal with this subject, for Starr is both a noted legal scholar and a devout Christian.  Appointed to the U.S. Court of Appeals for the District of Columbia Circuit by President Reagan, then serving as Solicitor General of the United States under President George H.W. Bush, he might well have joined the United States Supreme Court in 1990 had not President Bush decided to avoid a Robert Bork-style battle in the Senate by appointing the enigmatic David Souter instead.  Add to his judicial prowess the fact that he was born in Texas and reared in a pastor’s home (his father ministering in the Churches of Christ denomination).  Though he later moved away from his father’s denomination, he says:  “Faith proved to be a pillar of strength in my daily life.”

Starr begins his treatise by asserting:  “Our national DNA contains a dominant freedom gene” (p. 1).  “Ordered liberty” is basic to the American way.  Freedom of religion is simply one important aspect of its flourishing.  But that freedom has been severely limited by the various lockdowns imposed by governments endeavoring to deal with the COVID crisis.  Along with mandated business and school closures, churches and religious institutions had their worship services cancelled.  “Religion, just like virtually every other sector of American society, other than Walmart and the local liquor stores, was facing a deep crisis” (p. 8).  Ignoring the First Amendment, numbers of “American Caesars” simply decreed that health concerns cancelled freedom of religion.  Tellingly, the Caesars deemed some activities absolutely essential—marijuana outlets in Colorado and California, casinos in Nevada!  When the Supreme Court was asked to rule on the Nevada restrictions, the majority of justices upheld them, an outrageous position Justice Neil Gorsuch pilloried in dissent, declaring:  “‘[T]here is no world in which the Constitution permits Nevada to favor Caesars Palace over Calvary Chapel’” (p. 10).

Here and there a few churches chose to defy government edicts but were subject to crushing fines and harsh denunciation.  Most chose to go along to get along and design creative alternatives to corporate worship.  In retrospect, Starr challenges people of faith to seriously consider one essential “question:  when does the government’s authority trump religious assembly and expression?”  His answer:  demand autonomy!  It’s the master “key to religious liberty.”  It is “a special concept for friends of religious liberty.  It is one of what we can call the Great Principles that undergird our system of ordered liberty.  Properly understood, the autonomy principle provides an extra layer of constitutional protection for all faith communities, and it is one that the Supreme Court has vigorously policed and guaranteed” (p. 12).  Autonomy entails self-government.  A free person or institution governs itself.  Importantly:  “Throughout our nation’s history, the idea of autonomy, of leaving churches alone to govern their own affairs, has been deemed fundamental to our constitutional order.  Simply put, faith-bearing Americans have upheld the notion that Caesar should mind his own business and stay out of matters of religion, including matters of church governance.  This allows the faithful to freely exercise the tenets of their religion without fear of government interference, or of discrimination—a founding principle of our constitutional order” (pp. 12-13).

Despite alarming assaults on religious liberties during the COVID panic, Starr wants us to understand how the Supreme Court has increasingly defended the autonomy of churches.  Wisely utilizing these decisions will enable us to better deal with coming threats to religious freedom, for they have declared that, “absent compelling reasons, the government cannot pass laws that target religious institutions in discriminatory ways; and governmental entities cannot interfere with religious institutions, including church schools, in ways that compromise their autonomy to express their beliefs and carry out their faith vision” (p. 18).  And he’s written this book in an eirenic and optimistic way, hoping to show that “the prospects for continuing protection of religious liberty are actually quite good” (p. 18).

Thirty years ago I debated an atheist in San Diego who was determined to remove a cross from atop Mt. Soledad in La Jolla.  The cross had stood since 1913 as a memorial to those killed in military action and was widely esteemed as a landmark in the area.  The atheist’s legal maneuvers, countered by veterans’ and religious organizations, would ricochet through various courts for 25 years—only to be concluded when the Department of Defense sold the property to a private organization that would leave the cross in place.  A similar case recently occurred in Prince George’s County, Maryland (now a suburb of Washington, D.C.), involving what’s known as the Bladensburg Cross.  It was erected following World War I to commemorate the soldiers from Prince George’s County who died doing battle in Europe and was funded and cared for by government entities.  For decades it was quite uncontroversial.  But then the American Humanist Association decided it posed a dire threat to all who would see it and brought a lawsuit to remove it.  The United States Court of Appeals for the Fourth Circuit “agreed with the Humanists.  The court found that the memorial cross was an unacceptable symbol of Christianity physically situated on public property.  It had to come down” (p. 24).  The appeals court cited a famous case, Lemon v. Kurtzman, which decreed that public monuments must have purely secular themes, and reasoned that a cross failed to pass that test.

But when the case was subsequently brought to the U.S. Supreme Court a surprise verdict was rendered!  “In a stunning 7-2 decision, the Supreme Court concluded that the Bladensburg Cross would continue to stand.  The basis of the court’s decision?  The cross was not so much a religious symbol as it was a monument to history and tradition, reflecting and embodying the culture of the people who erected it.  All but two of the nine justices (Justice Ginsburg and Justice Sotomayor, who viewed the monument as a religious statement) accepted that proposition, determining that the long-standing religious symbol could remain in place on public land, maintained and preserved through the expenditure of local taxpayer dollars.  To say it was a blow to the Humanists is an understatement” (p. 25).  Writing for the majority, Justice Samuel Alito appraised and defended the presence and significance of religious symbols—crosses and the 10 Commandments—standing on public property.  Speaking for the Supreme Court, Alito warned that efforts to tear down monuments, rename cities, or otherwise “‘de-Christianize’” American society “would reasonably be perceived as governmental hostility to religion” and thus violate the First Amendment (p. 27).

So too the Supreme Court has moved in the direction of permitting prayer in public facilities, something disallowed for decades by a variety of courts since the 1962 Engel v. Vitale Supreme Court decision.  “In essence, the court concluded that composing prayers for recitation in the public schools, or in any other official activity, constituted government overreach” (p. 54).  Subsequently, all too many “wildly overread” the Court’s opinion, which “did not ban prayer in public schools under any and all circumstances.  Far from it.  To the contrary, what was constitutionally offensive was much more limited, namely, the official sponsorship of prayers that aligned the government with an expression of faith” (p. 55).  School administrators (fearing vocal agitators and costly lawsuits) and lower courts (determined to exclude religion from state-supported institutions) were especially inclined to ban all prayers.

From banning prayers the secularists turned to banning Bible clubs and student-led religious activities in the schools, Christmas displays on public lands, and the phrase “under God” in the Pledge of Allegiance.  But in time Congress responded to these issues by passing the Equal Access Act during the Reagan years.  This bill illustrates “a larger truth,” Starr says.  “Time and again, Congress and the president monitor the religious-liberty landscape, step into the fray and strike mighty blows, often collaboratively, in favor of religious freedom.”  This was particularly evident when in 1993 Congress passed “the most important religious-liberty Congressional reform in the nation’s history:  the Religious Freedom Restoration Act” (p. 73).  “It was an historic first.  Religious liberty had never enjoyed such an overwhelming legislative triumph” (p. 76).  Consequently, in recent years—relying on history and tradition rather than lower courts’ decisions—the Court has reasoned, for example, that inasmuch as chaplains had offered prayers in various legislatures since the Republic’s founding, prayers in civic sites may be allowed.

School vouchers, enabling parents to send their children to private schools, have strengthened religious freedom in the country.  In the 1990s, Wisconsin Governor Tommy Thompson, prodded by Milwaukee’s poorly-performing public-school system, vigorously promoted vouchers (dubbed “school choice”) and recruited Kenneth Starr to orchestrate the legal defense of his endeavors.  Various challenges wound their way through multiple courts until the Supreme Court finally rendered a verdict that granted “an enormous victory for religious liberty” (p. 95).  Speaking for the majority, Chief Justice Rehnquist ignored the tangle of prior decisions dealing with state aid to parochial schools, “cast a wide descriptive net, reviewing an enormous body of Supreme Court precedent,” and highlighted “three earlier high court decisions that had approved different forms of parental choice . . . which resulted in parochial schools or institutions receiving state funds” (p. 94).  

Had Kenneth Starr not been known for his pro-life convictions he might have become a Supreme Court justice!  As a committed Christian he has resolutely defended an unborn baby’s right to life.  Arguing this cause before the Court as Solicitor General, in the infamous Planned Parenthood v. Casey in 1992, he said:  “Roe v. Wade should be overruled,” because it “was wrongly decided twenty years earlier and has been unsparingly criticized over the years for the weakness of its legal reasoning.”  Roe v. Wade had “created a new constitutional right out of whole cloth” and “needed to be overturned” (p. 137).  Though he (and many legal scholars) thought they had a strong case, a plurality of the justices decided to retain the “core holding” of Roe, largely because they thought overturning it would create social unrest.  Invoking a legal precept (stare decisis), they simply declared:  “it’s been decided.”  Abortion rights were enshrined!  

In fact, stare decisis had never proved decisive in Supreme Court decisions—the Court had throughout the years overruled some 200 of its own decisions.  Thus Justice Louis Brandeis explained a century ago:  “‘[I]n cases involving the Federal Constitution, where correction through legislative action is practically impossible, this court has often overruled its earlier decisions.  The court bows to the lessons of experience and the force of better reasoning, recognizing that the process of trial and error, so fruitful in the physical sciences, is appropriate also to the judicial function’” (p. 144).  By ignoring Brandeis’ admonition and insisting Roe v. Wade be firmly established throughout this nation’s judiciary, the justices in Planned Parenthood v. Casey perpetuated what Starr considers an evil and divisive practice.  

As is evident in his discussion of abortion, Starr acknowledges the power of anti-religious forces in America.  While he finds many encouraging elements in Supreme Court decisions during the past 50 years, he warns Christians that secularists are ever on the offensive, seeking to eliminate religion from the public square.  Endless illustrations of this may easily be assembled.  “But it is not only religious liberty that’s under assault in America.  Our entire constitutional order of democratic debate is under challenge” (p. 169).  Finally:  “In this era of open hostility to communities of faith, let’s ‘keep calm and carry on,’ with winsomeness and ‘charity for all,’ fighting the good fight and championing the Great Principles of American liberty.”

                                 * * * * * * * * * * * * * * * * * * * * * * * * * *

Whereas Kenneth Starr, reading recent Supreme Court decisions, found reasons for optimism regarding religious liberty, David Horowitz, in Dark Agenda:  The War to Destroy Christian America (West Palm Beach, FL:  Humanix Books, 2018. Kindle Edition), wrote to decry the rising tide of assaults on it.  Such endeavors are hardly new:  “Since its birth in the fires of the French Revolution, the political left has been at war with religion, and with the Christian religion in particular” (p. 10).  The Jacobins renamed the Notre Dame Cathedral the “Temple of Reason,” sought to suppress the Church, and killed thousands of Christians.  Decades later Karl Marx decried religion as “the opium of the people,” and Communists thereafter tried by every means possible to eradicate it.  And today’s American leftists, given the opportunity, will do precisely the same.

They would do so because the Left is deeply religious, and as Pascal wisely wrote, centuries ago:  “Men never commit evil so fully and joyfully as when they do it for religious convictions.”  Analyzing the work of one of the “new atheists,” his friend Christopher Hitchens (a brilliant journalist who wrote God Is Not Great:  How Religion Poisons Everything), Horowitz found him full of an ill-founded religious faith:  Marxist humanism (Comte’s “religion of humanity”).  His “vision of mankind’s liberation from timeless afflictions—fear, disease, tyranny—is as head-spinning as any flourish in Marx’s writings.  It is in fact an updated version of Christopher’s lifelong romance with Marxism, which he never really abandoned” (p. 19).  Along with Marx, he wanted to “abolish religion” in order to secure mankind’s ultimate happiness. 

Reared in a “red diaper” home, Horowitz was a deeply committed Communist, one of the leaders of the “New Left” in the 1960s.  He personally knows many of the leading leftists and fully understands their agenda.  Generally labeling themselves “progressives,” they continue to pursue their socialistic, utopian dreams—a heaven-on-earth.  Since the collapse of the Soviet Union, they no longer espouse “Communism.  They call it ‘social justice.’  Like Communism, social justice is an impossible future in which the inequalities and oppressions that have afflicted human beings for millennia will miraculously vanish and social harmony will rule” (p. 29).  They fail to acknowledge the great truth proclaimed by Alexander Solzhenitsyn in The Gulag Archipelago: “The line separating good and evil passes not through states, nor between classes, nor between political parties either, but right through every human heart, and through all human hearts.”

Solzhenitsyn’s position—the ancient Christian view—is anathema to the social justice warriors now so prominent in the United States, for they locate good and evil in classes or ethnic groups or genders.  “These opposing visions are the root cause of the war that is the subject of this book. The social redeemers view the Christian concern for the salvation of individual souls as counterrevolutionary, a cause of social oppression.  To them, religious believers are obstacles on the path to the future—and must be removed. That is why progressives have declared war on religious liberty, which is America’s founding principle.  And that is why they seek to silence and suppress its defenders” (p. 32).

This effort to silence Christians is significantly, if surreptitiously, on display in the U.S. Capitol Visitor Center, which was opened a decade ago.  To Horowitz:  “The $621 million center is less a monument to the nation’s founding and institutions than it is to the antireligious left’s vision for America.  When it opened, all references to God and faith had been carefully, deliberately edited out of its photos and historical displays.  One panel in particular claimed that the national motto of the United States is E Pluribus Unum (‘Out of Many, One’).  In fact, the national motto, as established by an act of Congress in 1956, is ‘In God We Trust.’  A replica of the Speaker’s rostrum of the House of Representatives omits the gold-lettered inscription ‘In God We Trust’ above the chair.  Photos of the actual Speaker’s rostrum were cropped to hide the inscription.”  “An enlarged image of the Constitution was photoshopped to remove the words ‘in the Year of our Lord’ above the signatures of the signers.”  It is obvious that:  “The designers of the center had gone to great lengths to alter essential American history” (p. 33).  

Those in charge of the U.S. Capitol Visitor Center rather resemble the “ministry of truth” in George Orwell’s 1984.  As do educators crafting textbooks for public schools!  For example, references to the Pilgrims or the Mayflower have been deleted from elementary school textbooks in several states.  Since the word “Pilgrim” might lead children to think religion was important, they must learn about “early settlers” or “European colonizers.”  If Thanksgiving is mentioned there is no clue as to who is to be thanked.  As well as censoring textbooks, the public schools “prohibit Christian students from reading the Bible, praying, displaying the Ten Commandments, and even mentioning the word ‘God’” (p. 48).  Leftists controlling the public schools refuse to expose students to the fact that America was deeply rooted in Christian thought.  These endeavors rely on a radical reinterpretation of the “establishment clause” of the First Amendment so that:  “Not only is it now unconstitutional to freely exercise religion in a public school, but the freedom of speech is abridged (prayer is speech).  Freedom of the press is shut down (God has been edited out of our history textbooks).  The right of the people to peaceably assemble on school grounds (such as a prayer huddle after a game or a baccalaureate service) is severely restricted” (p. 52).  

What’s transformed the public schools has also transpired in the nation’s sexual ethos.  Horowitz analyzes this process by citing Margaret Sanger’s resolve in the 1920s to “remake the world” by “controlling birth” and granting women “reproductive freedom.”  In time her aspirations flourished as the Supreme Court (in its 1965 Griswold edict) found “a ‘right to privacy’ in the ‘penumbras’ and ‘emanations’ of the Bill of Rights” which “would provide a rationale for a series of new rights that would change the American landscape for generations to come:  in 1972, the right to birth control for unmarried couples; in 1973, a woman’s constitutional right to abortion; in 1977, a right to contraception for juveniles at least sixteen years of age; in 2002, a right to homosexual relations; and in 2015, a right to same-sex marriage” (p. 64).  That all this constituted a “sexual revolution” cannot be denied.  And that it constituted a full-fledged assault on the Christian tradition is also quite clear.  

Repeatedly Horowitz emphasizes that today’s culture wars are, from the Left’s perspective, mere aspects of one ultimate war-to-the-death!  Issues such as abortion are part-and-parcel of a larger agenda—a deadly agenda to destroy Christianity.  “Each victory motivated the leftists to move on to the next item on their expansive agenda.  The issue was never the issue.  The issue was always the revolution.  Each radical victory only inspired more radical aspirations and efforts” (p. 76).  Underlying all the various assaults on religious liberty is “a radical movement whose members are convinced the society-transforming ends justify the undemocratic and extra-constitutional and even racist means” (p. 73).  These radicals threaten to dissolve what the nation’s Civil War once established, “a society that approaches the ideals laid down in our country’s founding documents.”  They control a political party “committed to an identity politics that is the antithesis of the ideas and principles the founding established.”  He warns that:  “A nation divided by such fundamental ideas—individual freedom on one side and group identity on the other—cannot long endure” (p. 126).  

347 Trampling Down Death by Death

When my late wife Marilyn died, my old college friend Keith Walker sent me a book by Julian Barnes:  Levels of Life (New York:  Vintage Books, c. 2013), an elegant depiction of the grief he endured when his wife died.  As is evident in various versions of psychotherapy—and especially in Viktor Frankl’s “logotherapy”—there is much solace and inner healing in rightly naming things, in precisely identifying what’s true about one’s soul and the world, in moving beyond illusory platitudes and accepting how things really are.  So anyone looking for a serious—and at times searing—depiction and diagnosis of grief will find Levels of Life therapeutic.  Consider, for example, Barnes’ perceptive analysis of the temptation to “relish the pain”—to nurture one’s sense of virtue in suffering so heroically the loss of a spouse.  There are, in fact, “many traps and dangers in grief, and time does not diminish them.  Self-pity, isolationism, world-scorn, an egotistical exceptionalism:  all aspects of vanity.  Look how much I suffer, how much others fail to understand:  does this not prove how much I loved?  Maybe, maybe not.”  Amazingly:  “Mourning can also become competitive:  look how much I loved her/him and with these my tears I prove it (and win the trophy).  There is the temptation to feel, if not to say:  I fell from a greater height than you—examine my ruptured organs” (p. 123).  Such passages cannot but challenge readers to carefully examine and honestly evaluate their grief.  Is it really about one’s lost love?  Or is it merely another form of pride seeking to burnish one’s self-esteem?  

The first two sections of the book are devoted to describing the intersecting lives of two 19th century celebrities—Fred Burnaby, a famous balloonist, and Sarah Bernhardt, a noted actress.  Their story provides an artistic prelude for Barnes’ main task:  describing “the loss of depth,” the “grief story” he experienced when his wife of 30 years died quite quickly (37 days) as an aggressive form of cancer sucked away her life.  What he felt was not depression but sadness, a paralyzing sense of lostness, an inability to live with the zest and hopes earlier known.  Once valued things such as money and fame and world-saving political causes lost their luster.  

He did not hope to see her again.  “I believe dead is dead,” he said.  And yet he talked to her continuously!  “This feels as normal as it is necessary” (p. 111).  He dreamed of her regularly and found himself much consoled thereby.  He sensed, for years after she died, her presence.  On the one hand he believed she was dead-dead, but on the other hand he felt she somehow lived on.  Barnes is admittedly trapped in Nietzsche’s godless world.  Consequently:  “When we killed—or exiled—God, we also killed ourselves. . . .  No God, no afterlife, no us” (p. 94).  The atheist creed must be accepted, of course, he acknowledges:  “But we sawed off the branch we were sitting on.  And the view from there, from that height—even if it was only the illusion of a view—wasn’t so bad” (p. 94).  In short:  he acknowledges that what he needs is a religious perspective promising life everlasting, but he cannot embrace it.  For ultimately, as he declares in his final paragraph:  “It is all just the universe doing its stuff, and we are the stuff it is being done to.  And so, perhaps, with grief” (pp. 127-128).  

Ironically, Barnes cites Dr. Samuel Johnson’s diagnosis of his sorrow.  He “well understood the ‘tormenting and harassing want’ of grief; and he warned against isolationism and withdrawal.  ‘An attempt to preserve life in a state of neutrality and indifference is unreasonable and vain.  If by excluding joy we could shut out grief, the scheme would deserve very serious attention.’  But it doesn’t.  Nor do extreme measures, like the attempt to ‘drag [the heart] by force into scenes of merriment;’ or its opposite, the attempt ‘to sooth it into tranquility by making it acquainted with miseries more dreadful and afflictive.’  For Johnson, only work and time mitigate grief.  ‘Sorrow is a kind of rust of the soul, which every new idea contributes in its passage to scour away’” (pp. 117-18).  But what Barnes fails to say is that Samuel Johnson was a deeply devout Christian who could perceptively describe the reality of grief while trusting in the goodness of God to finally wash away all our tears and bring us to eternal life with lost loved ones.  

In A Grief Observed, C.S. Lewis wrote powerfully about what he felt when his wife died.  He wrote the book in a few weeks, and one might think he would be permanently paralyzed by grief.  But we know, from other sources, that he by no means lost his faith in God and faithfully served Him in the years remaining to him.  Reading Barnes’s moving description, one remembers St Paul’s wonderful testimony—we sorrow “not as those who have no hope.”

* * * * * * * * * * * * * * * * * * * * * * * * * * * *

Another good friend from college days, Barth Smith, suggested I read Trampling Down Death by Death by Spyridon Bailey, an Orthodox theologian living in Britain (Great Britain:  FeedARead Publishing, c. 2014).  The book’s title is taken from an Orthodox hymn:  “Christ is risen from the dead, / Trampling down death by death / And on those in the tombs / Bestowing life.”  Bailey not only sets forth his understandings but routinely cites trustworthy authorities from earlier centuries.  (He also makes clear, sometimes with polemical sharpness, why he thinks Roman Catholic and Protestant theologies cannot be fully embraced).  In essence, he says:  “Engaging with the truth of our death,” in accord with the Christian Tradition, “must become a priority if we are to live as we should.”  

We will all die not because we are mere mortals but because Adam failed to embrace God’s plan for us.  By nature, we were designed to live forever, so death is not truly “natural” for us.  But sin entered the human story, and we endure its consequences.  “We must recognize that death violates God’s purpose for us, it sits in opposition to the intended fruitfulness and communion for which we were created.”  Despite our sins, however, a loving God designed a plan whereby we may attain His original will for us.  Unlike the worldly view of most moderns, who find everything ultimately meaningless, Christians know that “the love of God gives meaning to everything we do since God wills that we experience His love for all eternity.”

Given God’s original design:  “The body is not an old set of clothes to be discarded at death.”  Though many people think the soul will be liberated from the body and live forever in a purely spiritual realm, well-instructed Christians look to the reunion of body and soul in the final resurrection, and “we can at least say with certainty that our bodies will be the same ones but made incorruptible.”  We must embrace and delight in the body God has given us, for in eternity “it will be perfected and made the true servant of our soul.”  Consequently Christians have always buried their dead, believing their bodies are “precious” and will play “an important part in our eternal future.”  Cremating the body, as Hindus do, signifies a belief that the body is an “old shell” to be discarded and annihilated in the cremation furnace.  

But the story of Jesus, for Christians, is largely about death and resurrection.  He became incarnate—embodied—for us and died the death we all must die.  “Our physical existence is dignified and the process of renewal began when Christ entered His mother’s womb.”  But on the Cross and in the tomb He “entered into that darkness and overcame the final enemy by trampling down death by death:  He illuminated even the despair of the tomb with the power of His love.”  We were, after all, created in God’s image, which is manifestly evident in our souls, not our bodies.  But the soul forms the body and “does not undermine the body’s importance.”  So death merely interrupts God’s plan for us, and it does not “completely separate the soul from the body, for we “are one being, body and soul,” and death “is an outrage” made tolerable only by the Reality of Christ’s Resurrection, “the central and essential belief of Christianity.”

Bailey also explains the Orthodox position on a phrase in the Apostles’ Creed, sometimes called “the harrowing of hell,” a doctrine that was prominent in the Early Church.  While Jesus’ body lay dead in the grave, His soul sought “to release those who could not yet enter Paradise.”  So His “soul descended into Hades while His body lay broken and dead in the tomb.  Saint Peter tells us that He went and preached unto the spirits in prison (1 Peter 3:19).  He descended so that he could raise them up from captivity with Him when he ascended.  Saint John Chrysostom . . . says:  ‘Hell was taken captive by the Lord Who descended into it. It was laid waste, it was mocked, it was put to death, it was overthrown, it was bound.’”

When Jesus died on the Cross his followers grieved.  So too “we hold and weep over our dead or dying.  But we are called to see that the cross is not simply an instrument of torture and death but a means to our redemption.  We cannot remain fixed at the point of the cross, we must move beyond it to what follows.  And so we must place the cross at the centre of our ordinary lives in order that it may become the prism through which we see life and the world.”  So grief is “the sting of love when the one we love dies,” and “bereavement is a cross most of us must bear at some point in our lives.  It isn’t a Christian duty to believe so much in eternal life that we are untouched by someone’s death; that would be a grotesque distortion of what it means to be a human being.”  

Importantly, the grief felt by Jesus’ followers was soon assuaged by the Joy of His Resurrection.  So too our grief will be healed inasmuch as we look not at the tomb but to the Risen Lord, and we “benefit enormously from staying focussed on the reality of eternal life, and on all that Christ achieved in His resurrection.”  Being healed from grief is not “a betrayal of the one who has died.”  Good grief ultimately helps “us to trust in God.  Only when we accept that our loved one is in the hands of God can we gain peace.”  Finding comfort and peace, having one’s heart healed, “does not diminish the person’s importance to us.” Rather, one finds joy rather than sorrow in many wonderful memories of the departed.

Because Jesus triumphed over death, his followers enjoy a living hope of life everlasting with Him, and the Scriptures make “clear that the souls of the righteous enjoy the blessings of God after death.”  At the moment of death we will face a “particular judgment” that determines whether or not we enter Paradise (an antechamber of the final Heaven).  Following Christ’s return, we will face the “general judgment,” whereby all accounts are finally and rightfully settled.  During our earthly sojourn, the most important thing we do, as is evident in many New Testament passages, is to repent of our sins.  Those who are finally saved are they who were truly penitent.  

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * 

Since I first began studying philosophy I have admired and relied on the works of Josef Pieper—short, incisive models of cogent Christian thinking.  My initial introduction to him came when I read his Leisure:  The Basis of Culture.  In the “Introduction” to this book, T.S. Eliot (who had studied philosophy at Harvard) noted that academic philosophers had little impact upon the 20th century public while Pieper managed to do so.  He did this by “restoring philosophy to a place of importance for every educated person who thinks, instead of confining it to esoteric activities which can affect the public only indirectly, insidiously and often in a distorted form.  He restores to their position in philosophy what common sense obstinately tells us ought to be found there:  insight and wisdom” (p. 14).  Equally laudatory remarks regarding Pieper were given by one of the greatest 20th century theologians, Hans Urs von Balthasar, who said he was a “philosopher, who in Goethe’s words contemplates the ‘holy and manifest mystery’ of Being and its meaning” and effectively employs the “language which always grows out of the wisdom of man as he philosophizes unconsciously” (“Foreword” to Josef Pieper:  An Anthology, p. ix).  

I recently reread (for probably the third or fourth time) Pieper’s Death and Immortality (New York:  Herder & Herder, c. 1969).  He begins his discussion by noting that the subject is “an especially philosophical subject,” contrasting the radically dissimilar declarations of two eminent 20th century thinkers, Jean-Paul Sartre and Pope John XXIII.  The former, an atheistic existentialist, said:  “It is absurd that we are born; it is absurd that we die.”  On the contrary, said the pope:  “Every day is a good day to be born; every day is a good day to die” (p. 8).  It’s neither the times nor the places that shape our attitudes—it’s what we take to be true regarding Reality.  Importantly, as Kierkegaard declared:  “‘Honour to learning, and honour to one who can treat the learned question of immortality in a learned way.  But the question of immortality is no learned question.  It is a question of the inner existence, a question which the individual must confront by looking into his own soul’” (p. 130).  

For us mortals, pondering our own death cannot but prompt the most serious of all thoughts.  Thus St Augustine, following the death of a close friend when he was 19 years old, noted:  “I had become a great question to myself.”  His study of Cicero no doubt deepened this concern, for to Cicero, Pieper says, “philosophizing is nothing else but consideration of death, commentatio mortis” (p. 10).   But while we witness others dying and wonder at the prospects of our own death, we cannot experience it as we do eating and drinking, laughing and crying.  It eludes the kind of analysis we give other human activities.  It is the most certain thing in the world—but precisely what it is remains bewilderingly uncertain.  Nevertheless, “in the shock that is inflicted upon us by the death of a beloved person” we come closest to personally experiencing it.  The great Christian dramatist (and existential philosopher) Gabriel Marcel said, “To love a being is to say, ‘Thou, thou shalt not die!’” (p. 20).  When we profess our love, something in us prompts us to declare its everlasting dimensions.  And if our love is eternal, surely the one we love is equally eternal.  So as a loved one dies we know (inasmuch as one may know) what death means.

We have also developed remarkable euphemisms, designed to evade the harsh reality of death, to “not name the reality of the thing, rather to obscure it, make it unrecognizable and divert our attention to something else” (p. 23).  So we say the person “passed away” or “expired” or “fell asleep.”  An even deeper evasion is “the sophism of not encountering death, which Epicurus seems to have been the first to formulate; ‘Death is nothing to us; for as long as we are, death is not here; and when death is here, we no longer are.  Therefore it is nothing to the living or the dead’” (p. 29).  Skeptics and atheists ever since have repeated this refrain, but something about it always rings hollow.  So to live honestly, Pieper insists, we must consider all the aspects “of the human experience embodied in living speech,” of reality itself, embracing the many paradoxes posed by end-of-life experiences (p. 30).  

We do, instinctively it seems, follow Socrates in his final hours and inevitably speak of the “separation of body and soul.”  Thus Thomas Aquinas said “that the ratio mortis, the ‘concept’ of death, implies that the soul separates from the body” (p. 33).  Precisely what that means, however, defies easy explanation.  It’s obvious, to most of us, that there’s an inner “self” which gives “orders” to the body.  I “tell” my hand to move, my legs to run, my jaw to chew.  Still more, it is obvious that losing a limb—or even most of my limbs—doesn’t really change the nature of my inner self, my soul.  So, as Plato insisted in Alcibiades, “the soul is the man,” and “we are dealing with one of the firmest findings in the history of philosophy—all Christian thinkers before Thomas Aquinas were ‘Platonists’; all defined man as the soul which uses the body as the musician his lute” (p. 34).  

But by nature we are both soul and body.  To be finally separated from the body is inescapably tragic.  Here “the great tradition of Christian theology” speaking through Aquinas, “is unequivocal:  ‘Of all human evils, death is the worst’; it is ‘the most extreme of all human suffering’; by it man is ‘robbed of what is most lovable:  life and being’” (p. 51).   In a perfect world, soul and body would never sever.  So we cannot but wonder if death is a “natural event or a punishment.”  Atheists and Naturalists, of course, deny the reality of the soul and thus see death as a purely natural event, like a leaf falling from a tree.  But Christians, while believing that God made a perfectly good world, take seriously the ramifications of original sin and conclude that as a consequence man became “something different” from his original design.  Thus death, the separation of body and soul, comes as a consequence of Adam choosing to turn away from God, to live life on his own terms.  

Only when death is understood as a punishment for Adam’s and our sins—a punishment well-deserved—will we rightly understand and accept it.  Though the human justice system often fails, rendering unjust punishments of various sorts, the Divine Justice is perfectly calibrated and we will ultimately know and accept how death was justly prescribed for us.  We will “become aware that in this case, and perhaps in this case alone, crime and punishment are in complete accord; that death is not, unlike all human penalties, something imposed more or less without relation to the fault, but is the consequence and fruit already implied in the sin” (p. 68).  “The only honest and clean way not to sweep the scandal of death under the rug and on the other hand not to fall into a state of revolt against Creation consists in coming to see death as punishment, and submitting to that; once more, not death as an ‘idea’ and general phenomenon, but our own death and the death of those we love” (p. 75).  Only when we put the badness of death within the context of the far worse badness of sin will we be able to freely accept it.  But, still more:  there is one death which above all makes our deaths tolerable.  This was the “one single death which was entirely an act of freedom, though it took the form of a cruel execution; . . .  ‘only a Man who . . .  served in our sad regiment as a volunteer . . . could perform this perfect dying’” (p. 82). 

This means, of course, that we are pilgrims rather than permanent residents on planet earth.  Death’s reality constantly reminds us, as Pascal said, that “We are not, we hope to be” (p. 85).  Throughout life’s journey, we make decisions that prepare us for the final moment, the point of transition, the end (meaning both the termination and the purpose) of our endeavors.  “The tradition,” Pieper says, “has coined a formula for this personal sealing of earthly existence.  It is described as the termination of the status viatoris” (p. 84).  A viator is a pilgrim.  The great question, at the end, is what will be his status, his standing, his readiness for what’s to come.  “In death the last decision is passed, for good or ill, upon the life as a whole; henceforth nothing in that life can ever again be undone” (p. 86).  So Kierkegaard confided to his diary:  “’In the moment of death a man is helped by the situation to become as true as he can be’” (p. 93).  

Thoughts of death necessarily awaken questions regarding immortality.  While philosophical materialists have always denied the immortality of the soul, Pieper was astounded by some “modern Protestant” theologians who shared their view!  He argues, reiterating the classic stance of Thomas Aquinas, that:  “Innumerable (infinitae) are the testimonies of Holy Scripture which witness the immortality of the soul” (p. 107).  He finds further support in the oft-misrepresented Plato, who posited immortality mainly in the light of divine judgment and its fearful punishments.  To Plato, only the good, who are right with God, will enjoy the “true bliss” of life everlasting.  In one of his final works, Phaedrus, “when he launches on what seems a wholly fresh approach to the question of ‘in what sense a living being is termed mortal or immortal’, he suddenly ceases to speak of the soul alone.  ‘We think,’ he says, ‘of a living being, spiritual and physical at once, but both, soul and body, united for all time.’  Moreover, he goes on, immortality is not to be regarded as a mere rational concept susceptible of demonstration; rather, we think of it with our minds on ‘the god whom we have never seen, nor fully conceived’” (p. 116).  Plato, Pieper says, “seems to be suggesting:  If ever immortality is conferred upon us, not just the soul but the entire physical human being will in some inconceivable manner participate in the life of the gods; for in them alone is it made real in its original perfection” (p. 116).  Thus for Plato, persons are better termed indestructible or imperishable rather than immortal.  As Aquinas, commenting on Aristotle’s Metaphysics, put it:  “’That is perishable which possibly cannot be; that is imperishable, incorruptible, which cannot possibly not be’” (p. 117).  

# # #

346 HEAVEN (reprint)

As you would expect, I’ve thought much of Heaven since my wife Marilyn’s death on September 8, 2021.  I’m now grieving very much as I did following my first wife Roberta’s death in 2007.  As many of you well know, mourning saps one’s energy—beyond doing what’s absolutely essential for the day there’s little interest in doing things like writing book reviews.  As C.S. Lewis said:  “no one ever told me about the laziness of grief.”  So I’m simply going to reprint my February 2008 “Reedings” which was devoted to books on Heaven.  (Incidentally, my sister Barbara, teaching a course in counseling for Nazarene Bible College, uses this issue as a reference).  Thankfully, there are wonderful written works that provide scriptural, philosophical, and testimonial perspectives—solid sources for Christian belief.  Randy Alcorn’s Heaven (Carol Stream, IL:  Tyndale House Publishers, Inc., c. 2004) certainly provides what Rick Warren considers “the best book on Heaven I’ve ever read.”  Presciently, perhaps, during the summer of 2020, with no awareness that my wife Marilyn would soon be diagnosed with stage four cancer, I explored Alcorn’s book with an Adult Bible Study I led in the Community Fellowship of  Christians in Lake George, Colorado.  It was, and is, a work of great consolation as well as illumination.  

Alcorn prefaces his biblical discussion with a brief reference to history and anthropology, where ample “evidence suggests that every culture has a God-given, innate sense of the eternal—that this world is not all there is” (p. xvii).  He cites St. Cyprian, who said that death “‘sets us free from the snares of the world, and restores us to paradise and the kingdom.  Anyone who has been in foreign lands longs to return to his own native land . . . .  We regard Paradise as our native land’” (p. xviii).  Alcorn then dedicates his book to all who are “burdened, discouraged, depressed, or even traumatized” (p. xx).  Only Heaven can salve our deepest sorrows.  “‘It becomes us,’ wrote the great American theologian, Jonathan Edwards, ‘to spend this life only as a journey toward heaven . . . to which we should subordinate all other concerns of life.  Why should we labor for or set our hearts on anything else, but that which is our proper end and true happiness?’” (p. 5).  Scripture devotes much attention to this ultimate end, though today’s teachers and preachers say little about it.  “What God made us to desire, and therefore what we do desire if we admit it, is exactly what he promises to those who follow Jesus Christ:  a resurrected life in a resurrected body, with the resurrected Christ on a resurrected Earth.  Our desires correspond precisely to God’s plans” (p. 7).

Any lack of interest in Heaven betrays an atrophied imagination.  In his fictional works C.S. Lewis activates this essential human faculty.  Using the imagination, Alister McGrath says, affirms “‘the critical role of the God-given human capacity to construct and enter into mental pictures of divine reality, which are mediated through Scripture and the subsequent tradition of reflection and development’” (p. 15).  Still more, Lewis said:  “‘While reason is the natural organ of truth, imagination is the organ of meaning’” (p. 22).  Alcorn then develops “a theology of heaven,” differentiating between the “present” and “eternal” heavens.  At death the redeemed go immediately to a “present” or “intermediate” Heaven.  At the end of time, following the resurrection of the body and the final judgment, they will enter the “eternal” Heaven—or the New Earth that will be established for them.  Though not yet enjoying their resurrected bodies, residents of the present Heaven occupy a unique space and enjoy a mysteriously embodied existence—as was evident when Moses and Elijah appeared on the Mount of Transfiguration.

Adam was first made a physical being, then given a spirit.  So, Alcorn reasons, “God may grant us some physical form that will allow us to function as human beings while in that unnatural state ‘between bodies,’ awaiting our resurrection” (p. 57).  Since the Resurrected Jesus has a body, “If Christ’s body in the present Heaven has physical proportion, it stands to reason that others in Heaven might have physical forms as well, even if only temporary ones” (p. 59).  “It might be better, then, if we think of the location of the present Heaven as not in another universe but simply as a part of ours that we are unable to see, due to our spiritual blindness.  If that’s true, when we die we don’t go to a different universe but to a place within our universe that we’re currently unable to see” (p. 184).  To envision this possibility, note how contemporary physicists routinely talk about a dozen or so invisible “dimensions” to the universe!

Sifting through the Scriptures, Alcorn finds no fewer than 21 details concerning the saints in the present Heaven.  They are the same persons, conscious of their new place, remember their earthly life, know what’s happening on earth, pray for us, wear robes, have a sense of time, and feel bound to believers on earth.  “There is not a wall of separation within the bride of Christ.  We are one family with those who’ve gone to Heaven ahead of us” (p. 67).  Though Alcorn is firmly Protestant, his position squares precisely with the Catholic tradition regarding the “communion of saints,” declaring that “the union of the wayfarers with the brethren who sleep in the peace of Christ is in no way interrupted, but on the contrary, according to the constant faith of the Church, this union is reinforced by an exchange of spiritual goods” (#955).  At the end of time, Alcorn insists, we will be restored to the Earth God initially envisioned.  The New Earth will be the original Eden Adam lost.  Though scarred by sin, this earth is at least a shadow of the New Earth.  So to think about the eternal heaven it helps to look about us and rejoice in the mountains and streams, the music and sunsets, that daily ennoble our lives.  “It is no coincidence that the first two chapters of the Bible (Genesis 1-2) begin with the creation of the heavens and the earth and the last two chapters (Revelation 21-22) begin with the recreation of the heavens and the earth” (p. 132).

After setting forth his “theology” of heaven, Alcorn turns to answering common questions regarding it.  We will live (sin excepted) much like we live now.  We will eat and drink, read and study and discuss new truths, work creatively, enjoy fellowship with friends and family.  Married ties will be strengthened and the joys of the man-wife relationship intensified.  Animals will be there, occupying their niche in God’s design.  For all these positions Alcorn has texts.  And frequently he cites respected authorities, ranging from Augustine to Wesley to C.S. Lewis, to support his views.  The book is helpful and persuasive—simply the place to begin thinking biblically about the hereafter.  

* * * * * * * * * * * * * * * * * * * *

Equally valuable to me is a philosophical treatise by Boston College’s Professor Peter Kreeft, titled Heaven:  The Heart’s Deepest Longing  (San Francisco:  Ignatius Press, c. 1980).  “This book,” he says, “is the thought-experiment of looking with the eye of the heart and exploring what we see of the deep desire hidden there, the desire for heaven” (p. 39).  He begins by noting that the “question of hope is at least as ultimate as the other two great questions [what can I know?  what ought I do?].  For it means “‘what is the point and purpose of life?  Why was I born?  Why am I living?’” (p. 12).  All of us wonder, at times at least, “Is that all there is?” (p. 46).  Historically, as is evident in everything from Indian burial mounds to Egyptian pyramids, man has above all hoped for heaven, however variously envisioned.  

What all men long for, Aristotle persuasively argued, is happiness.  And this happiness, added Pascal, “is neither outside nor inside us:  it is in God, both outside and inside us” (p. 32).  Pascal further said that the heart has its reasons that reason never knows, and when we honestly look within (listening to our hearts), Kreeft says, there is  “a heavenly hole, a womblike emptiness crying out to be filled, impregnated by your divine lover.  Heaven is God’s body; earth is ours” (p. 35).  Though our minds may open up mathematical means to decipher the universe, our hearts give us different but equally valid and valuable truths regarding ourselves.  

Love also has its ways of knowing, a clairvoyant “X-ray vision” (p. 37), seeing the essence of things.    “Only one who loves you really knows you, and the deeper the love, the deeper the knowledge.  The non-lover may know everything about you, but only the lover knows you” (p. 37).  So thinking about heaven is an exercise of the heart and of love.  Inasmuch as we love God we come to know Him.  Inasmuch as we lovingly long for happiness and heaven we come to know them.  As Malcolm Muggeridge said, in Jesus Rediscovered:   “‘I had a sense, of something enormously vivid, that I was a stranger in a strange land; a visitor, not a native . . . a displaced person.”  Consequently, he concluded:   “The only ultimate disaster that can befall us, I have come to realize, is to feel ourselves to be at home here on earth.  As long as we are aliens, we cannot forget our true homeland’” (p. 63).   

Such musings are prodded by the genuinely strange dimensions of earthly realities such as time.  At times it seems we never have enough time.  At other times we wonder if the clock will ever change.  We think much about the future and the past, wondering how things will be and how things were.  In a moment we can encompass the centuries in our minds.  To Kreeft, our “nostalgia for Eden is not just for another time but for another kind of time” (p. 70).   Underlying all our musings on time, we truly “long for the infinitely old and the infinitely new because we long for eternity” (p. 80).  “Time and death make life precious, but they do not make it eternal.  But that is what we long for (‘thou hast put eternity into Man’s heart’), even if we do not know what it is” (p. 73).  Facing death we long to live on, forever and ever.  

Clues to eternal, heavenly bliss may be found, Kreeft suggests, in our authentically personal relationships.  Consider, for example, how much a person reveals through his face.  There is a “numinous, most magical” quality to the face, for here mind controls matter.  “A human face is more than a part of the body, an object; it is a part of the soul, a subject, an I.  It is the place where soul still transfigures body as its Creator designed it to” (p. 99).  Furthermore, romantic love, like a face, reveals a deep inner reality.  “It is like a sacrament in that way:  a special sign of a general truth, a local reminder of a universal reality” (p. 101).  That’s because:  “As the face is the epiphany of the person, the person is the epiphany of the universe, the universe’s face as seen by the ‘haunt detector’ called romantic love” (pp. 101-102).  Readers of Dante’s Divine Comedy remember that “God appeared to Dante as a Beatrice-shaped glory.  Yet Beatrice was not obliterated by the divine light.  Dante did not merely pass through Beatrice to God; he found God in Beatrice.  He did not love Beatrice less because he loved God more” (p. 102).  “Romantic love is a powerful image of the love of God because, unlike lust, it does not desire a possessable and consumable thing (like a body).  It wants not to possess but to be possessed, not by the beloved but by love itself, the reality in which both lovers stand” (p. 107).

Our world, Kreeft says, is not really distant from heaven.  Rather, “heaven includes earth as the soul includes the body.  My soul includes my body because it is my me, my personhood, and part of this is what I call ‘my’ body” (p. 115).  Earth’s an accurate image, a shadow of Heaven, which is “more real, more substantial than earth,” has “more dimensions than earth, not fewer,” and is “clearer, more detailed and specific than earth, not vaguer” (pp. 116-117).  The Eternal Word indwells both Heaven and Earth.  He is “Christ the Haunter, the incarnate divine Mind, the Logos. . . .  The divine Idea perfectly and completely expressed before the world was created, the divine Word that was the instrument creating the universe, the divine design reflected in all created order, finally focuses at this single point:  a human individual who says ‘I am’, claiming to be the divine I AM ‘before Abraham was’.  All signs lead to him because all signs come from him” (p. 118).  Rightly seeing His world we see his Face!  The God Whose Face was visible in Christ is the great “I AM who says, ‘I am with you always’, and that I AM is the absolute, the unchangeable, the utterly reliable.  Our I is flighty, relative, and unreliable.  But our I can plug into the I Am and then it and its joy become as eternally solid as the joy of I AM.  Faith is that plug” (p. 160).

What we hope for in Heaven is a continuous state of joy, an extension of those moments of joy we experience while on Earth.  Such joy is a truly ecstatic—rooted outside of us, not inside us—reality.  “Just as love is not in us but we are ‘in love’ (‘it’s bigger than both of us’), joy is not in us but we are in it:  ‘Enter into the joy of thy Lord’” (p. 145).  “Heaven is ek-stasis; hell is in-stasis.  Heaven is coinherence; hell is incoherence.  Heaven is aspiration; hell is greed.  Heaven is love; hell is lust” (p. 150).  Joy comes to those who fully, and finally, submit their wills to God’s will.  “In His will,” said Dante, “is our peace,” and “‘Thy will be done,’” echoes Kreeft, “is the infallible road to total joy” (p. 158).

Submission to God’s will here-and-now gives us a foretaste of Heaven, for “Earth is not outside heaven; it is heaven’s workshop, heaven’s womb” (p. 172).  Seen from this perspective, we are not “pie-in-the-sky” daydreamers fantasizing a better world.  Indeed, “Heaven is not escapist because we are already there, just as the fetus in the womb is already in the world because the womb is in the world and subject to its laws, such as the laws of gravity and genetics” (p. 174).  Still more:  “Heaven is not a thing or even a place; it is a Person; that’s why it (he) is present.  Heaven is where God is—God defines heaven, not heaven, God—and God is present in every place” (p. 175).  All who will may enter, for we are justified by faith—and our “faith is in God’s present (gift) of his Present (now) presence (here)” (p. 181).  For “This is the Gospel, the scandalously good news:  that we are guaranteed heaven by sheer gift” (p. 183).  

In a profound appendix, Kreeft cogently develops a philosophical case for C.S. Lewis’s “argument from desire,” which he finds to be one of the most persuasive arguments for the existence of God ever advanced.  Without question “it is far more moving, arresting, and apologetically effective than any other argument for God or for heaven” (p. 201).  He sums it up thus:  1) “The major premise of the argument is that every natural or innate desire in us bespeaks a corresponding real object that can satisfy the desire.”  2) “The minor premise is that there exists in us a desire which nothing in time, nothing on earth, no creature, can satisfy.”  And 3) “The conclusion is that there exists something outside of time, earth, and creatures which can satisfy this desire” (p. 202).

To his knowledge, Kreeft says, agreeing with Lewis, “No case has ever been found of an innate desire for a nonexistent object” (p. 203).  By nature we desire more than nature affords.  We desire a supernatural reality called heaven.  Even better, C.S. Lewis asserted (in The Problem of Pain):  “Your soul has a curious shape because it is . . . a key to unlock one of the doors in the house with many mansions. . . .  Your place in heaven will seem to be made for you and you alone, because you were made for it—made for it stitch by stitch as a glove is made for a hand” (p. 67).  

* * * * * * * * * * * * * * *

Two “testimonial” books, testifying to the reality of “life-after-death,” deserve perusing.  Don Piper’s 90 Minutes in Heaven:  A True Story of Death & Life  (Grand Rapids, MI:  Fleming H. Revell, c. 2004) tells the story of a Baptist pastor who had a terrible automobile accident in 1989, was declared dead at the scene, lay immobile for 90 minutes, and then revived to spend many months recovering from his injuries.  During those 90 minutes he entered heaven, where he met many people he knew who had preceded him.  While there, he says, “My heart filled with the deepest joy I’ve ever experienced” (p. 31).  Marvelous music, praising Christ the King, thrilled him and still resounds in his memory.  Colors were more vivid, people were more wonderful—all was in fact perfect.  “I was home; I was where I belonged,” he says.   “I wanted to be there more than I had ever wanted to be anywhere on earth” (p. 33).  

A more fascinating story, for me, was written three decades ago by George G. Ritchie, M.D., entitled Return from Tomorrow (Waco, TX:  Chosen Books, c. 1978).  His medical training, as well as his philosophical bent, makes the book both a fascinating narrative and a meaningful reflection.  In 1943, aged 20, Ritchie was in Texas, preparing for service in WWII.  While there he was stricken by the flu, which turned into double pneumonia.  Despite the doctors’ efforts, he apparently died; they pronounced him dead and covered his face with a sheet.  From his perspective, however, he simply became immaterial.  He walked down the hospital corridors, but no one saw him.  Then he flew overland, heading toward his home in Richmond, Virginia.  He saw the countryside passing underneath.  Then he alighted in a strange town (Vicksburg, MS) and wandered about a bit, noticing specific details of it.  As in the hospital, he saw things as if he were physically present, but no one could see him—and he could easily move through solid things.

At that point he decided he needed to get back to Texas and recover his embodied state.  He returned to the hospital and “began one of the strangest searches that can ever have taken place:  the search for myself.  From one ward to another of that enormous complex I rushed, pausing in each small room, stooping over the occupant of the bed, hurrying on” (p. 42).  In time he found the room where his body lay—though the face was covered he knew it because of a distinctive ring on his left hand.  He realized that others thought he was “dead” but he really wasn’t!  While trying to get back into his body, the room suddenly turned bright—“it was like a million welders’ lamps all blazing at once” (p. 48).  The light was not an “it” but a “He,” a Man who was clearly “the Son of God” (p. 49).  This was not the Jesus he’d heard about (with considerable disinterest) in Sunday school!  He was powerful.  And He “loved me” (p. 49).  In that moment he envisioned all the details of his 20  years on earth.  And the question from the Light was:  “What did you do with your life?” (p. 52).  He realized that he’d lived, almost exclusively, for himself.  Though he had professed a faith in Jesus as a child, he hadn’t really sought to serve Him.  He realized that a life rightly lived was a life consumed by love.  

Thereafter Ritchie was taken on a journey that exposed him to various places inhabited by those who had died.  He saw self-absorbed people, self-promoting and verbally vicious and vindictive people who had made their own hell—much like the folks in C.S. Lewis’s The Great Divorce.  “With a feeling of sick familiarity I recognized here my own thinking” (p. 65).  With Jesus beside him, he realized that it was not Jesus who had “abandoned them, but they who had fled from the Light that showed up their darkness” (p. 66).  Above all they had failed to see Him.  Then he was taken to another kind of place where people worked peacefully and studiously in an extensive library, engaged in a great project of some sort.  They were “supremely self-forgetful” and thus utterly at peace.  They were not yet in heaven, but “They grew and they have kept on growing” (p. 71).  Finally, though at quite a distance, he was granted a glimpse of heaven itself.  “At this time I had not yet read the book of Revelation.  I could only gape in awe at this faraway spectacle, wondering how bright each building, each inhabitant, must be to be seen over so many light-years of distance.  Could these radiant beings, I wondered, amazed, be those who had indeed kept Jesus the focus of their lives?” (p. 72).

After this incredible journey, Ritchie returned to his body in the hospital, reviving nine minutes after being declared dead.  Alive again, he began to live fully.   He suddenly began to notice and care for others.  There were “no casual events for me since that night in Texas, . . . no ‘unimportant’ encounters with people.  Every minute of every day since that time, I’d been aware of the presence of a larger world” (p. 85).  Equally important, though he naturally feared the physical pain of dying, “as for death itself, I not only felt no fear of it, I found myself wishing it would happen” (p. 105).  

In time he went to medical school and, after practicing medicine for 13 years, further studied to become a psychiatrist.  Through it all, a lesson he learned while treating a Christ-like soldier in Europe remained paramount:  “in losing myself, I had discovered Christ.  It was strange, I thought:  I’d had to die in Texas, too, to see Him.  I wondered if we always had to die, some stubborn part of us, before we could see more of Him” (p. 112).  In retrospect, he reflects upon his “return from tomorrow,” saying:  “Whatever I saw was only—from the doorway, so to speak.  But it was enough to convince me totally of two things from that moment on.  One, that our consciousness does not cease with physical death—that it becomes in fact keener and more aware than ever.  And secondly, that how we spend our time on earth, the kind of relationships we build, is vastly, infinitely more important than we can know” (pp. 15-16).  

345 Return of the God Hypothesis

During the past two decades the dramatic increase of Americans self-identifying as “Nones” (having no religious faith) has concerned thoughtful Christians.  Particularly among the young, it seems, many are indifferent or even hostile to the Christian tradition.  Asked by pollsters to explain their stance, they often say “science” (especially the chemical evolution of life and the biological evolution of species) has disproved it.  Concerned by this development, Stephen Meyer has written Return of the God Hypothesis:  Three Scientific Discoveries that Reveal the Mind Behind the Universe (New York:  HarperOne, Kindle Edition, c. 2021) to show why their position is certainly questionable and probably untenable.

Meyer received his PhD from the University of Cambridge in the philosophy of science and works with the Discovery Institute in Seattle.  He has written two scholarly treatises, Signature in the Cell and Darwin’s Doubt, which I have favorably reviewed in earlier editions of my “Reedings.”  Therein he “argued that certain features of living systems—in particular, the digitally encoded information present in DNA and the complex circuitry and information-processing systems at work in living cells—are best explained by the activity of an actual designing intelligence.  Just as the inscriptions on the Rosetta Stone point to the activity of an ancient scribe and the software in a computer program points to a programmer, I’ve argued that the digital code discovered within the DNA molecule suggests the activity of a designing mind in the origin of life” (p. 10), though  he refrained from making philosophical claims concerning the existence of God.  

With this recent treatise he makes such claims, challenging entrenched scientific dogmas that have shaped the worldview of millions of people.  He begins by acknowledging that today’s scientific worldview is deeply materialistic, asserting that “matter, energy, and/or the laws of physics are the entities from which everything else came and that those entities have existed from eternity past as the uncreated foundation of all that exists.  Matter, energy, and physical laws are, therefore, viewed by materialists as self-existent.”  Without any mental qualities, these entities have randomly assembled themselves into all that exists.  So there cannot be immaterial realities such as God or the human soul.  Varieties of materialism have been propounded for thousands of years by ancient Greeks such as Democritus as well as makers of modernity such as Thomas Hobbes, Charles Darwin, and Francis Crick.  The materialistic position was succinctly summed up by astronomer Carl Sagan:  “The cosmos is all that is, or ever was, or ever will be.”

On the contrary, Meyer seeks to demonstrate that:  “The properties of the universe and of life—specifically as they pertain to understanding their origins—are just ‘what we should expect’ if a transcendent and purposive intelligence has acted in the history of life and the cosmos.  Such an intelligence coincides with what human beings have called God, and so I call this story of reversal the return of the God hypothesis” (p. 19).  This “hypothesis” was basic to the development of modern science, as is evident in the works of Copernicus, Robert Boyle and Isaac Newton.  Such theistic scientists endeavored to “‘Diligently pursue the physical causes of things, for that’s how science is done; but, at the same time, [recognize that] design is sometimes evident in the whole contrivance one is studying’” (p. 54).  Indeed:  “This tradition attained an almost majestic rhetorical quality in the writings of Newton” (p. 72).  “As he explained: ‘How came the Bodies of Animals to be contrived with so much Art, and for what ends were their several Parts?  Was the Eye contrived without Skill in Opticks, and the Ear without Knowledge of Sounds? . . .  And these things being rightly dispatch’d, does it not appear from Phænomena that there is a Being incorporeal, living, intelligent, omnipresent?’” (p. 72).  In short:  Meyer’s views are thoroughly in accord with the West’s greatest scientists.

In the 19th century, however, the “theistic science” of Newton et al. was edged aside by the “scientific materialism” now dominating the West.  Influential philosophers such as David Hume and Karl Marx, as well as scientists such as Pierre Laplace and Charles Darwin, worked to eliminate the need for a Creator/Sustainer, so that:  “By the beginning of the twentieth century, science—despite its theistic beginnings—seemed to have no need of the God hypothesis” (p. 99).  

Yet, as the 21st century begins, some of the guiding assumptions of materialism may be crumbling.  By definition, materialism assumes the eternality of the material world—in one mode or another matter has existed and simply circulated from one thing to another, and no Creator is needed to explain it.  However, sophisticated discoveries by astronomers such as Edwin Hubble led to an avalanche of evidence regarding an expanding universe which pointed to an initial, explosive moment of creation, popularly known as the “big bang.”  (This is the first of Meyer’s three scientific discoveries justifying the return of the God hypothesis.)  One of Hubble’s gifted associates, Allan Sandage, an agnostic for much of his life, found the evidence so persuasive that he ultimately changed his mind.  Speaking at a meeting in 1985, “he not only described the astronomical evidence for the beginning of the universe; he shocked many of his colleagues by announcing a recent religious conversion and then explaining how the scientific evidence of a ‘creation event’ had contributed to a profound change in his worldview.  I recall his looking intently at the audience and gravely stating, ‘Here is evidence for what can only be described as a supernatural event.  There is no way that this could have been predicted within the realm of physics as we know it.’  As he spoke, he paused between the words ‘super’ and ‘natural,’ saying them separately for emphasis.  He went on to explain that ‘science, until recently, has concerned itself not with primary causes but, essentially, with secondary causes.  What has happened in the last fifty years is a remarkable event within astronomy and astrophysics.  By looking up at the sky, some astronomers have come to the belief that there is evidence for a ‘creation event’” (p. 172).
He continued:  “‘I now have to go from a stance as a complete materialistic rational scientist and say this super natural event, to me, gives at least some credence to my belief that there is some design put in the universe.’”  Still more:  “‘I am convinced that there is some order in the universe.  I think all scientists, at the deepest level, are so startled by what they see in the miraculousness of the inner connection of things in their field . . . that they at least have wondered why it is this way’” (p. 172).

What Sandage and contemporary cosmologists recognize as a point of “singularity” indicates the physical universe came into being from nothing physical!  It was, as Christian theologians have always declared:  “creatio ex nihilo—‘creation out of nothing’ (nothing physical, that is)” (p. 186).  Trying to evade such a possibility, various materialists have proposed alternative theories, including the “many universes” hypothesis.  But quite recently some of the world’s finest physicists have ruled out such options, showing why “all cosmological models in which expansion occurs—including inflationary cosmology, multiverses, and the oscillating and cosmic egg models” cannot evade a creation event.  Indeed, the “evidence for a beginning is now almost unavoidable.  As he [Alexander Vilenkin] explains, ‘With the proof now in place, cosmologists can no longer hide behind the possibility of a past-eternal universe.  There is no escape; they have to face the problem of a cosmic beginning’” (p. 203).  Indeed, “in the beginning God!”  

The second scientific discovery Meyer discusses is often called the “Goldilocks Universe.”  From every angle of investigation, the universe seems amazingly fine-tuned.  Four fundamental forces underlie all that is:  gravity; electromagnetism; the strong nuclear force; the weak nuclear force.  The slightest difference in any one of these forces would have made the formation of the universe impossible.  Essential chemicals, most especially carbon, need to be precisely what they are in order for anything to be.  Physicist Fred “Hoyle was stunned by these and other ‘cosmic coincidences’ that physicists began to discover after the 1950s.  Whereas before he affirmed atheism and denied any evidence of design, he began to see fine tuning as obvious evidence of intelligent design.  As he put it in 1981, ‘A common-sense interpretation of the facts suggests that a super-intellect has monkeyed with physics, as well as with chemistry and biology, and that there are no blind forces worth speaking about in nature.  The numbers one calculates from the facts seem to me so overwhelming as to put this conclusion almost beyond question’” (p. 218).  Meyer carefully cites the scientists and provides the mathematical data to show that the universe is incredibly fine-tuned, suggesting the Mind of a Maker at work.

The third scientific discovery Meyer explores is the “origin of life and the DNA enigma.”  Monistic materialists, such as Richard Dawkins, tenaciously upheld the dogma that biology is “‘the study of complicated things that give the appearance of having been designed for a purpose’” (p. 257).  But the more we know of living things the more it seems they have been meticulously designed by an all-knowing Mind.  One of America’s premier biologists, Dean Kenyon, published a rigorously materialistic textbook, Biochemical Predestination, in 1969.  But within a decade he was overwhelmed by the implications of recently discovered realities of DNA and began to question his own positions.  In 1985 he publicly repudiated his earlier theory and “argued that the presence of information in the DNA molecule defied explanation by all current naturalistic theories of the origin of life, not just his own” (p. 263).  The notion that “chance and necessity” would bring living creatures into being appears less and less possible.  Thus “Nobel laureate Christian de Duve, a leading origin-of-life biochemist until his death in 2013, categorically rejected the chance hypothesis precisely because he judged the necessary fortuitous convergence of events implausible in the extreme.  In a memorable passage in his 1995 article ‘The Beginnings of Life on Earth,’ de Duve made explicit the logic by which he rejected the chance hypothesis.  As he put it, ‘A single, freak, highly improbable event can conceivably happen.  Many highly improbable events—drawing a winning lottery number or the distribution of playing cards in a hand of bridge—happen all the time.  But a string of improbable events—drawing the same lottery number twice or the same bridge hand twice in a row—does not happen naturally’” (p. 273).  Nor, says Meyer, could life on earth have happened naturally!
Furthermore, since the discovery of DNA, every materialistic explanation of the information indwelling and shaping biological cells has failed.

Beyond calling into question the materialistic position on the origin of life, Meyer argues Intelligent Design properly explains it.  Drawing upon the notion that new information arises from conscious activity, there might be a “way to formulate a rigorous scientific case for intelligent design as an inference to the best explanation—specifically, the best explanation for the origin of biological information.  The creative action of a conscious and intelligent agent clearly represents a known and adequate cause (one ‘now in operation’) for the origin of specified information.  Uniform and repeated experience affirms that intelligent agents can produce large amounts of functional or specified information, whether in software programs, ancient inscriptions, or Shakespearean sonnets. The specified information in the cell also points to intelligent design not just as an adequate explanation, but as the best explanation. Why?  Experience shows that large amounts of specified information invariably originate from an intelligent source.  This is particularly apparent when the information is expressed in a digital or alphabetic form.  A computer user who traces the information on a screen back to its source invariably comes to a mind, that of a software engineer.  Similarly, the information in a book or newspaper article ultimately derives from a writer—from a mental, rather than a strictly material, cause” (p. 288).

In his final chapters, Meyer shows how these scientific discoveries justify believing the “God hypothesis.”  An Intelligent Being could bring into being all that is, design it meticulously, and create living creatures on planet earth.  The rational process of abduction—inference to the best explanation—makes such belief highly reasonable and persuasive.  Reflecting on his research at Cambridge University, especially devoted to one of its most illustrious professors, Sir Isaac Newton, he thought about “how thinking about science and God had changed since the publication of Newton’s great Principia in 1687, almost exactly three centuries earlier.  In the epilogue to a later edition of that book called ‘The General Scholium’ and in other scientific works, notably the Opticks, Newton articulated a profoundly theological perspective.  Not only did he extol the order and uniformity of nature as a reflection of God’s character and superintending care of creation; he argued for the existence of God based on the design evident in nature—in short, for a God hypothesis” (p. 593).  Newton “also understood that the most fundamental laws of nature either merely describe the observed regularities in nature or they manifest the ‘constant Spirit action’ of a ‘Divine Sustainer’ of the world.  He did not think the laws of physics alone explained the origin of the solar system or, still less, the origin of the universe” (p. 596).  Consequently:  “For Newton, nature not only provided evidential support for belief in God, but his God hypothesis functioned as a hugely productive science starter.  There is no reason to think that updating that hypothesis will threaten scientific advance today. On the contrary, there is good reason to expect that it will inspire deeper interest in discovering more about the intricacy, order, and design of the universe, just as it did for Newton himself” (p. 622).

Though Meyer’s in-depth scientific discussions may challenge general readers, he generally makes his ideas clear and provides personal insights as well as illustrative materials.  For those wanting to understand how an advocate of Intelligent Design applies his scientific expertise to theological positions, Return of the God Hypothesis is a fine presentation.

                              * * * * * * * * * * * * * * * * * * * * * * * * * * 

Richard E. Simmons III worked in the insurance industry for 28 years before establishing The Center for Executive Leadership to counsel and inspire businessmen and professionals.  He recently published Reflections On The Existence Of God:  A Series Of Essays (Birmingham, AL:  Union Hill, c. 2019)—a decidedly non-academic, but deeply serious, presentation of reasons to believe there’s a God.  Simmons has been seriously reading and pondering the issue for nearly three decades, so his short chapters (easily read in 10 minutes) reveal his rhetorical gift for popularization.  They usually focus on a significant person, or a quotation, eliciting commentary by the author.  “This book lays out,” he says, “in short essays, much of the evidence for the existence of God that is available.  We should seek to take the evidence offered and use it to make reasonable conclusions.  What you will find is, as the evidence accumulates, it enables us to come to confident conclusions about God. Who He is.  And, that He truly is” (p. 21).

Simmons deals with the importance of seeking truth, resolving the problem of pain and evil, discerning moral principles, finding meaning in life, understanding science, the importance of Jesus and His Resurrection, etc.  Finding atheism irrational and self-contradictory, he endorses the Christian Way.  He hopes to “help people see how a God-centered worldview makes sense of what we see and experience in life.  I have tried to demonstrate that Christianity is logical, non-contradictory, and more fully true to the facts of human existence than atheism.  It clearly leads to a more dignified and compassionate view of human life.  The bottom line: It has greater explanatory power than atheism. The reason for this is because it is true. God exists. When you live in harmony with His design you will experience a coherence to your life, which will help make sense of the world” (p. 219).  That he found it true for himself is clarified in the book’s final pages.  

                              * * * * * * * * * * * * * * * * * * * * * * * * *

In God?  Very Probably:  Five Rational Ways to Think about the Question of a God (Eugene OR:  Cascade Books, Kindle Edition, c. 2015), Robert H. Nelson, an economist and professor at the School of Public Policy of the University of Maryland, sought to set forth “the record of the recent progress of my thinking, taking me from a long-standing basic agnosticism as recently as about eight years ago to now believing that a god (very probably) exists.”  I’ll examine only two of the “rational ways” he proposes.

One reason to believe is “The Miracle of Mathematical Order in the Natural World.”  Mathematics, as Plato showed, takes form in our minds and seems to inform every aspect of the cosmos.  This mathematical order, guiding physicists in their research, has given birth to an amazing series of discoveries (e.g., those of Isaac Newton and Albert Einstein).  In a 1960 article, “The Unreasonable Effectiveness of Mathematics in the Natural Sciences,” Eugene Wigner referred to the miraculous nature of the intricately mathematical laws of nature and of our mental ability to understand them.  No purely naturalistic process, such as that expounded by Charles Darwin, can begin to explain this phenomenon.  Centuries earlier, one of the most powerful thinkers in human history, Gottfried Wilhelm Leibniz—“the smartest person who ever lived,” some think, who co-developed calculus along with Newton—thought similarly.  To him, “all thought has a ‘fundamentally mathematical structure,’ [and] it follows that ‘God must be a perfect mathematician’” (p. 59).

Inasmuch as atheists routinely cite Darwin as their inspiration, Nelson makes a careful study of naturalistic evolution in a chapter entitled “Darwinism as Secular Fundamentalism.”  There’s no doubt the living world constantly changes, or evolves, in many ways, and Darwin’s careful observations are useful in understanding life’s history.  But “to say that ‘evolution exists’ is to say very little, not much more informative than to say that ‘history exists’—a virtual truism” (p. 99).  So too, to say “natural selection” occurs is little more enlightening than to observe wars have helped shape world history.  In fact, Nelson thinks, the updated version of Darwinism (neo-Darwinism) simply lacks empirical confirmation.  It was, as one might say, “closer to a matter of Darwinist faith than a demonstrated historical fact” (p. 106).  Indeed, it’s significantly different from science—it’s what Mary Midgley says is a religion.  Modern secularists, she says, are clearly “evolution-worshippers,” with Life as something akin to a god.  Thinkers such as Richard Dawkins, Stephen Jay Gould wrote shortly before he died, “are the ‘apostles’ of a new secular fundamentalism that was seeking to win converts to ‘the true Darwinian scripture’” (p. 108).

Discounting the new fundamentalism, Nelson exposes its internal self-contradictions, joining legions of logicians who have doubted the Darwinian story.  C.S. Lewis, for example, famously said:  “Naturalism . . . offers what professes to be a full account of our mental behavior; but this account, on inspection, leaves no room for the acts of knowing or insight on which the whole value of our thinking, as a means to truth, depends.”  Darwinists routinely claim to have discovered the ultimate “truth” that explains all that lives.  But if they are right, explaining the mechanism of “evolution through natural selection,” they cannot claim to know any durable “truth.”  Yet they dogmatically insist that all beliefs, as well as all species, continually evolve through the survival of the fittest.  A “fit” belief may very well survive for a time, but all beliefs, like all species, will eventually be replaced by better ones.  “Hence, to believe that Darwinism is ‘true’ leads to the conclusion that it is ‘not true,’ a direct contradiction.  Applying the method of contradiction as widely employed by mathematicians, we can logically then conclude that Darwinism itself is necessarily ‘not true,’ although it may be evolutionarily useful (but we could not know this as a ‘truth’ either).  In order to find real truth in the world, it requires stepping outside the workings of biological evolution, something which for the true-believing Darwinist is impossible” (p. 112).  Importantly, “Darwin himself was aware of this problem,” confessing in 1881, “that ‘with me the horrid doubt always arises whether the convictions of man’s mind, which has been developed from the mind of the lower animals, are of any value or at all trustworthy.  Would anyone trust in the convictions of a monkey’s mind, if there are any convictions in such a mind?’” (p. 112).

As an eminent economist demanding empirical evidence and employing rigorous logic, Nelson gives readers considerable reason to believe in God.

344 White Too Long or Fault Lines?

When some of my friends informed me the Nazarene Theological Seminary had invited Robert P. Jones, the author of White Too Long:  The Legacy of White Supremacy in American Christianity (New York:  Simon & Schuster, Kindle Edition, c. 2020), to give a series of lectures, I decided to read it and see what he might say.  Jones is the CEO of the Public Religion Research Institute and writes regularly for The Atlantic online.  He holds a PhD in religion from Emory University, briefly taught religious studies at Missouri State University, and earlier published The End of White Christian America, which won the 2019 Grawemeyer Award in Religion.  His sequel has garnered accolades, such as that by Gary Dorrien, Reinhold Niebuhr Professor of Social Ethics at Union Theological Seminary, who says:  “White Too Long is a rich and astute reflection on the role of white churches in creating and sustaining America’s system of racial caste.  Robert P. Jones features his customary skillful blend of journalism, social science, and commentary, adding splashes of illuminating personal memoir, to explicate how churches perpetuated white supremacy for centuries—and still do.”

Jones’ thesis is concisely summed up in the book’s epigraph, a 1968 statement by James Baldwin:  “I will flatly say that the bulk of this country’s white population impresses me, and has so impressed me for a very long time, as being beyond any conceivable hope of moral rehabilitation.  They have been white, if I may so put it, too long; they have been married to the lie of white supremacy too long; the effect on their personalities, their lives, their grasp of reality, has been as devastating as the lava which so memorably immobilized the citizens of Pompeii.  They are unable to conceive that their version of reality, which they want me to accept, is an insult to my history and a parody of theirs and an intolerable violation of myself.”  In brief, to Baldwin and Jones:  this nation’s original sin of slavery has so tarnished its history that nothing short of a massive cultural upheaval could redress the past and establish a truly equitable society.

White Too Long combines copious details regarding Jones’ personal pilgrimage as well as historical anecdotes.  (Indeed, his approach to history is largely a matter of finding illustrations to prove the points he wants to make!  Such, moreover, is the method followed by influential writers such as Robin DiAngelo in White Fragility, and it fits in nicely with a postmodernist commitment to “narratives” rather than traditional “objective” approaches.)  Reared in a pious Southern Baptist environment in Jackson, Mississippi, Jones followed the path of devotion enjoined by his church—numerous weekly services, revivals and youth camps, daily routines of prayer and Bible study.  Following high school he attended Southwestern Theological Seminary intending to enter the ministry.  In time, however, he became critical of his church, with its focus on personal salvation, and determined to make social justice his vocation, working through his research center.

He now takes a decidedly jaundiced look at the history of Southern Baptists, a “convention” organized in the 1840s to defend slavery, though he occasionally notes similar developments in Methodist, Presbyterian, Episcopalian and Catholic churches.  He condemns celebrated Southern Baptist founders, such as Basil Manly, not only for their antebellum activities but also for their prominent roles in supporting the Confederacy, resisting Reconstruction, passing Jim Crow laws, imposing mandatory segregation, and opposing the civil rights movement.  “While the South lost the war, this secessionist religion not only survived but also thrived.  Its powerful role as a religious institution that sacralized white supremacy allowed the Southern Baptist Convention to spread its roots during the late nineteenth century to dominate southern culture.  And by the mid-twentieth century, the SBC ultimately evolved into the single largest Christian denomination in the country, setting the tone for American Christianity overall and Christianity’s influence in public life” (p. 2).  Consequently, white churches have ever led the way in making racism America’s true DNA. 

Today’s evangelicals simply carry on, in more subtle ways, the nation’s pernicious racism.  To Jones, current efforts of Southern Baptists to address racial issues are basically “the white Christian shuffle.”  Thus Richard Land, a prominent denominational spokesman, criticized some Black Lives Matter assertions (bolstered by President Obama) regarding the death of Trayvon Martin.  Then there’s “Al Mohler, president of Southern Baptist Seminary—the oldest SBC seminary, which was founded in 1859 in Greenville, South Carolina, but relocated to Louisville, Kentucky, after the Civil War.  Mohler presents a case study in the limitations of how far even well-intentioned white evangelicals are willing to go to reckon with their white supremacist past” (p. 56).  At times Mohler has spoken boldly about the sinfulness of racism and the need for racial reconciliation, but he ultimately denies “that their legacy requires reparative or costly actions in the present” (p. 56).  Even worse, though Mohler acknowledges the seminary’s founders, James Boyce and John Broadus, “served as chaplains for the Confederate army, he also defends them as ‘consummate Christian gentlemen, given the culture of their day’” (p. 58).

Christians, past or present, Jones suggests, should share his position on things racial or forfeit their claims to the true faith.  Indeed, he wonders if “Christian conceptions of marriage and family, the doctrine of biblical inerrancy, or even the concept of having a personal relationship with Jesus developed as they did because they were useful tools for reinforcing white dominance?  Is it possible that the white supremacy heresy is so integrated into white Christian DNA that it eludes even sincere efforts to excise it?” (p. 71).  Anyone wanting to peruse a voluminous litany of evil deeds orchestrated by white American Christians will find in this book an abundant supply.  Should one want to enter the “woke” world of many modern churchmen, this is a useful text.  

What’s lacking is the balanced perspective of thoughtful historians such as Eugene Genovese, whose many scholarly works (not listed in Jones’ bibliography) afford in-depth nuances missing in White Too Long.  The book is, in fact, an excellent illustration of the “anachronistic fallacy” (imposing current ethical standards upon past persons or institutions) succinctly dispatched by David Hackett Fischer in Historians’ Fallacies.  So too, Jones seems unwilling to imagine that there is such a thing as “invincible ignorance,” blinding people in certain times and places to what seems virtually self-evident in other eras.

                                * * * * * * * * * * * * * * * * * * * * * * * * * * * * 

In Fault Lines: The Social Justice Movement and Evangelicalism’s Looming Catastrophe (Washington:  Salem Books, Kindle Edition, c. 2021), Voddie Baucham Jr. casts a critical look at those promoting most versions of “Critical Social Justice.”  Born in Los Angeles, he was reared by a single mother who “shaped my thinking about who I was and what I was capable of.  She never said or did anything to cause me to believe that my blackness was a curse or a limitation.  She gave me a sense of agency and accountability that remains with me to this day” (p. 14).  Since they lived in a tough neighborhood, she decided to relocate to Texas, where they found a healthier environment.  “Not only would I go to high school, college, and seminary in Texas, but it is also where I met and married my wife, welcomed all nine of my children, and started my ministry.  I often say, ‘I am a Californian by birth, but a Texan by the grace of God!’” (p. 15).

Largely because of his mother’s strict discipline, Baucham flourished in high school, excelling academically as well as athletically.  He fully entered into the life of his school, serving as a leader in many organizations and graduating as a Merit Scholar.  Granted a football scholarship to New Mexico State University, he almost instantly became a starting end and enjoyed a successful year.  More importantly, he met a Crusade for Christ representative and began a process of biblical study and philosophical seeking that led him to faith in Christ.  “I believed the Gospel. I repented of my sin.  And God saved me” (p. 24).  Transferring to Rice University in Houston the next year, he continued playing football while pursuing a pre-law program.  He also met and married a wonderful woman.  After two years at Rice, feeling a call to ministry, he transferred to Houston Baptist University, joined a Southern Baptist church, and “was welcomed into Southern Baptist life” (p. 30).  A gifted preacher, he was soon speaking all over the country and gaining the attention of prominent Baptist leaders such as Al Mohler.

He was, however, at that time more black than Baptist!  His early infatuation with Malcolm X and the “black power” movement had prompted him to assume a defiant stance vis-a-vis white America.  Only time, experience and study—particularly learning the importance of careful exegesis when expounding the Bible—awakened him to the realization that his deep concern for justice actually alienated him from the “social justice warriors” so influential in today’s culture.  In particular, when he examined the celebrated cases of “injustice” (George Floyd, Tamir Rice, Michael Brown, Breonna Taylor), he found Black Lives Matter spokesmen consistently “bearing false witness,” a violation of biblical justice.

While the issues he addresses appear throughout society, Baucham’s concern is with Evangelicals.  He’s concerned that pastors such as Tim Keller and various denominations have embraced the social justice agenda, while other pastors, such as John MacArthur, and organizations have opposed it.  He’s troubled that the historically “mainline” evangelical magazine, Christianity Today, published “nothing less than a full-throated recitation of the ideology of Critical Race Theory” (p. 127).  And he fears “fault lines” preceding an earthquake are appearing.  As a black Southern Baptist minister, now serving as the dean of the School of Divinity at African Christian University in Lusaka, Zambia, he brings to the discussion both rich scholarship and personal perspectives, making his treatise an important source for understanding these issues.  “This book is,” he says, “among many things, a plea to the Church. I believe we are being duped by an ideology bent on our demise.  This ideology has used our guilt and shame over America’s past, our love for the brethren, and our good and godly desire for reconciliation and justice as a means through which to introduce destructive heresies.  We cannot embrace, modify, baptize, or Christianize these ideologies.  We must identify, resist, and repudiate them.  We cannot be held hostage through emotional blackmail and name-calling.  Instead, we must ‘see to it that no one takes you captive by philosophy and empty deceit, according to human tradition, according to the elemental spirits of the world, and not according to Christ’ (Colossians 2:8)” (p. 204).

Of ultimate import, Baucham thinks, is to rightly understand and properly pursue justice.  But there are contentious arguments concerning its true nature.  He has “pursued justice my entire Christian life.  Yet I am about as ‘anti–social justice as they come’” because he thinks “the current concept of social justice is incompatible with biblical Christianity” (p. 5).  Battles are being waged by “two competing worldviews in this current cultural moment.  One is the Critical Social Justice view—which assumes that the world is divided between the oppressors and the oppressed (white, heterosexual males are generally viewed as “the oppressor”).  The other is what I will refer to in these pages as the biblical justice view in order to avoid what I accuse the social-justice crowd of doing, which is immediately casting its opponents as being opposed to justice” (p. 6).  Fault lines have appeared in Evangelicalism, and an earthquake may very well follow.

“At the epicenter of the coming evangelical catastrophe,” Baucham believes, “is a new religion—or, more specifically, a new cult.  While some may consider the term ‘cult’ unnecessarily offensive, it happens to be the most accurate term available to describe the current state of affairs.  John McWhorter was the first observer I am aware of to refer to it as the ‘Cult of Antiracism.’  Others have used similar terms, and I think they are right to do so” (p. 66).  “This new cult has created a new lexicon that has served as scaffolding to support what has become an entire body of divinity.  In the same manner, this new body of divinity comes complete with its own cosmology (CT/CRT/I); original sin (racism); law (antiracism); gospel (racial reconciliation); martyrs (Saints Trayvon, Mike, George, Breonna, etc.); priests (oppressed minorities); means of atonement (reparations); new birth (wokeness); liturgy (lament); canon (CSJ social science); theologians (DiAngelo, Kendi, Brown, Crenshaw, MacIntosh, etc.); and catechism (‘say their names’)” (p. 67).  Missing from the lexicon is soteriology!  That’s because in the antiracist religion there’s no salvation—“only perpetual penance in an effort to battle an incurable disease” (p. 67).

The sin of racism, according to social justice warriors, can be neither forgiven nor eradicated because it’s not tied to individuals’ beliefs or behaviors.  Rather, it is “systemic,” embedded in the amorphous depths of “society.”  It’s evident in economic or educational inequalities which must be eliminated in order for oppressed groups to get justice.  Whites cannot, as persons, confess or repent of their sin because it’s not really theirs—it’s “institutional” or “structural.”  They cannot pray for God’s forgiveness, nor can they plead the blood of Christ.  Instead they’re “told that they must do the unending work of antiracism. And this work must be done regardless of their own actions since the issue at hand is a matter of communal, generational guilt based on ethnicity” (p. 129).  So, ironically:  “today we have ‘racism without racists’” (p. 85).

In 2018 Baucham was one of 15 men, recruited by John MacArthur, who drafted the Dallas Statement on Social Justice and the Gospel.  “What came out of that meeting would,” Baucham thinks, “prove to be a pivotal piece of the puzzle in the contemporary discussion of race, ethnicity, and justice inside and outside the Church” (p. 133).  Though the document failed to stimulate the healthy dialogue its signers hoped for, it elicited a response from the 2019 Southern Baptist Convention, which passed a resolution (carefully guided by SBC President J.D. Greear) supporting Critical Race Theory.  The one man who might have persuasively opposed the resolution was Albert Mohler, “the most respected theologian and cultural apologist in the SBC, who has repeatedly repudiated CRT” (p. 149).  But he “didn’t say a word.  Nor could he.”  To do so would have put him in the position of openly opposing a prominent black delegate and given his critics the opportunity to brand him a racist representing the “white supremacist faction” within the SBC.  Subsequently, however, Mohler and the Council of Seminary Presidents of the Southern Baptist Convention released a statement repudiating CRT and the convention’s controversial resolution.  While they condemned “racism in any form,” they declared that the “affirmation of Critical Race Theory, Intersectionality and any version of Critical Theory is incompatible with the Baptist Faith & Message.”

Baucham supports this critique of CRT because it tries to rectify problems in the black community by blaming whites and demanding reparations of various sorts.  But many blacks such as himself want to challenge their communities to solve their own problems.  Thus many pastors get “standing ovations as they passionately admonish their young members to ‘pull up your pants, get an education, stop dropping babies all over the place, learn to speak proper English, get all that gold out of your mouth.…’  They and their members know that, regardless of what is going on outside the black community, culture matters.  The black family matters.  Education matters.  Decisions and choices matter.  And above all, God’s Word matters” (p. 158).  For example, God’s Word clearly condemns abortion, something generally ignored or carefully nuanced by most social justice proponents.  To Baucham this is a scandal, for the killing of the unborn is devastating the black community.  Christians may genuinely differ when seeking to alleviate poverty or care for immigrants or provide housing for needy families.  But abortion is another matter.  “‘How we will respect and understand the nature of life itself is the overriding moral issue,’” said Jesse Jackson in his pro-life days, “‘not of the black race, but of the human race.’  I could not agree more! That is why I believe the abortion question belongs at the center of any discussion about race and justice” (p. 172).  “Fifteen and a half million black babies have been aborted since 1973.  That means abortion is not only the leading cause of death among black Americans, but it has taken more black lives than heart disease, cancer, accidents, violent crime, and AIDS combined.  Though black women make up less than 13 percent of the population, they account for 35 percent of all abortions.  In major cities like New York, Philadelphia, and Los Angeles, more black babies are aborted than born” (p. 175).  Don’t murder!
Stop killing the innocents!  That’s a fundamental component of living justly.  Yet numbers of “Christians” approve it!  Critical Social Justice spokesmen rarely condemn it.  Indeed, “access” to it is deemed crucial, enabling women to freely choose whether or not to kill the baby.  “‘Abortion is a social justice issue,’ says SafeAbortionWomensRight.org, ‘in that criminalizing, restricting or stigmatising abortion creates barriers that women with unwanted pregnancies face in exercising body autonomy’” (p. 181).

Other than recovering a biblical concept of justice, Baucham has no easy solutions to the divisive racial issues we face.  In part this is because he doesn’t believe there is actually a “racial injustice problem.”  He’s encountered racists and acknowledges racism exists.  But he rejects “the idea that America is ‘characterized by racism,’ or that racism is an unavoidable byproduct of our national DNA.  In fact, I believe America is one of the least racist countries in the world” (p. 201).  Christians must realize there’s a war going on, and:  “If white people need to ‘check their privilege,’ then Christians will soon be asked to do the same.  Make no mistake about it—we are under attack” (p. 209).  That the Black Lives Matter “organization is Marxist, revolutionary, feminist, misandrist, pro-LGBTQIA+, pro-abortion, and anti-family, with roots in the occult” (p. 223) makes it something Christians should resolutely condemn and oppose.

In the book’s final chapter, Baucham sets forth a remarkable testimony.  Living in Africa, reflecting upon the tragic history of slavery, he had a moment of clarity and charity.  His ancestors once lived where he now lives, and “for one reason or another, other Africans sold them into slavery—probably after taking them as slaves themselves.  I thought about the horrors of the Middle Passage and the indignities of bondage in America.  I thought about the fact that slavery had robbed me of so much that I didn’t even know which African country my ancestors had come from, let alone which tribe.  Then I thought about the moment at hand, and something switched.  Suddenly, I realized that I had traveled thousands of miles from the place of my ancestors’ oppression to the place of their betrayal.  And for the first time in my life, I forgave.  I didn’t forgive because I was big enough, or a godly enough man.  Nor did I forgive because anybody asked me to. I forgave because I was overcome by the weight and majesty of God’s providence.  By God’s providence, my ancestors survived their ordeal.  By God’s providence, one of their descendants (me) had returned—not as a slave of men, but as a slave of Christ.  By God’s providence, I was born a free man and a citizen of the greatest Republic in the history of mankind.  By God’s providence, I was numbered among the healthiest, freest, most prosperous people (of any race, not just black people) on the planet.  By God’s providence, I had received the best theological education available in the world.  And by God’s providence, He had brought me back to Africa to bless the descendants of the people who sold my ancestors into slavery.  So I forgave.  I forgave the Africans who took my ancestors’ freedom.  I forgave the Americans who bought and exploited them.  . . . .   I just forgave!  I did not harbor any ill will.  I did not feel entitled to any apologies or reparations.  
By God’s grace, I recognized that Providence had blessed me beyond my ancestors’ wildest dreams—or my own.  I couldn’t help but remember Joseph’s words: “As for you, you meant evil against me, but God meant it for good.”

In the end, it is forgiveness that will heal our wounds. My hope is not that white Christians can feel sorry enough for their past or that ministries and organizations can dig up and grovel over enough historical dirt.  That is not the powerful, life-changing, world-confounding message of the Gospel.  That is the message of the world” (p. 229).

343 Christians & Pagans

In Pagans and Christians in the City:  Culture Wars from the Tiber to the Potomac (Grand Rapids:  William B. Eerdmans Publishing Company, Kindle Edition, c. 2018), Steven D. Smith, a law professor at the University of San Diego, contends that many people share the beliefs of the ancient Romans.  They may be “godless” from a theistic perspective, but they passionately revere such things as racial equity and sexual freedom and environmental purity.  They have sacred spheres, but they are all within the natural world, rather than the supernatural realm dear to Christians and Jews.  They are the “modern pagans” T.S. Eliot described in his 1939 lectures, and they are as likely to shun or persecute Christians today as they were 20 centuries ago.  

Smith begins his discussion by presenting undeniable evidence that we are homo religiosus.  By nature we are as deeply religious as we are rational or tool-making or playful.  Drawing on sources as diverse as Sophocles, Viktor Frankl, Leo Tolstoy, and William James, he illustrates the ancient adage:  “man does not live by bread alone.”  Throughout human history there’s been an insatiable craving for meaning and purpose in life, for answers to the great “why” questions.  Thus, said Ludwig Wittgenstein:  “‘To believe in God means to understand the question about the meaning of life.’  ‘To believe in God means to see that the facts of the world are not the end of the matter.’  ‘To believe in God means to see that life has a meaning’” (p. 45).  To Rabbi Abraham Heschel, religious reflection begins with the “awe” or “wonder” we feel when confronting creation.  Importantly:  awe “‘is more than an emotion; it is a way of understanding.  Awe is itself an act of insight into a meaning greater than ourselves’” (p. 48). 

It’s simply part of who we are—religious, meaning-mongering creatures.  So it’s never a question of whether or not we’ll have religious concerns but rather what form they will take.  Consequently, in the ancient world, both pagans and Christians were deeply religious, but pagans sought meaning in this world, which may include certain invisible realms, while Christians and Jews discerned it in an invisible, metaphysical one.  With all their gods and goddesses, temples and public ceremonies, Romans were in many ways “‘the most religious people in the world’” (p. 87).  In its grandeur Rome certainly possessed military, economic, architectural and cultural riches, but it also featured distinctly religious goods—“meaning, sublimity, and communal connection to the sacred” (p. 105).  These goods were present in the natural world, with its beauty, order, and awe-inspiring fecundity. 

Then Christians boldly challenged this Roman religion with an offensive theology and ethics.  Over the centuries Rome had tolerated various tribal deities and mystery cults, but Christianity was something else.  Above all, it claimed to uniquely possess “the way, the truth, and the life.”  It wasn’t simply one of many ways, which would have been most congenial to the Romans, but it was The Way!  Importantly, whereas the pagan gods inhabited only this world, Christians worshipped “‘the creator of the world, which he guides in its course and maintains in its existence—an invisible, hidden, spiritual god who dwells beyond time and space’” (p. 146).  For pagans, the natural world is our home, and we should settle in and enjoy its goods.  For Christians, however, a heavenly home awaits us as pilgrims.  Beholding the heavens, the pagan “exclaims, ‘How divine!’  The theologically fastidious Christian looks up and says, ‘What a sublime manifestation of the divine!’” (p. 152).  To the pagan, the good life meant good food, casual sex, comforts of various sorts.  But Christians would forego all temporal goods so as to gain “eternal life.”  Pagans disdainfully rejected any notion of the resurrection of the body—death simply ended, once and for all, one’s life.  But St Paul exclaimed:  “O death, where is thy victory?  O grave, where is thy sting?”  In fact, Luc Ferry asserts that “‘the entire originality of the Christian message resides in “the good news” of literal immortality—resurrection, in other words, and not merely of souls but of individual human bodies’” (p. 233).  

Inevitably these two worldviews clashed.  The greatest analysis of this conflict, of course, was St Augustine’s City of God, showing how the earthly city constituted itself by loving self rather than God, whereas the heavenly city was composed of those who disregarded this-worldly matters in pursuit of everlasting well-being.  Though generally tolerant of religious diversity, Rome resorted to persecuting Christians when they too clearly threatened the stability of its “earthly city.”  And when Christians became politically powerful in the fourth century A.D., they often (though generally half-heartedly) sought to eliminate paganism.  Christians, of course, prevailed and subsequently established the Western Christian Civilization that so shaped Europe for a millennium.  

And yet, Smith thinks:  “In a certain sense, the Western world has arguably always remained more pagan than Christian.  In some ways Christianity has been more of a veneer than a substantial reality” (p. 251).  Throughout the Medieval world vestiges of paganism persisted—witchcraft and astrology, names and holidays, philosophy and literature.  Then came the Renaissance, which some historians, such as Jacob Burckhardt, think featured a resurgence of ancient paganism highly evident in the great artistic works of that epoch.  To Paul Johnson the Renaissance incubated “‘the first great cultural war in European history’” (p. 260).  And though Christianity survived the Renaissance, it faced an even more formidable foe in the European Enlightenment.  

“Consequently, in his admired and admiring history of the movement, Peter Gay interprets the Enlightenment as ‘the rise of modern paganism.’  And how exactly were the Enlightenment thinkers ‘pagan’?  Primarily, in Gay’s telling, in their forceful criticism and rejection of Christianity.  ‘The most militant battle cry of the Enlightenment,’ Gay explains, ‘écrasez l’infâme, was directed against Christianity itself, against Christian dogma in all its forms, Christian institutions, Christian ethics, and the Christian view of man.’  The Enlightenment amounted to a ‘great campaign against Christianity’” (p. 264).  Of Voltaire, a prominent exponent of Enlightenment verities, Gay explains that “the torrent of pamphlets that poured out . . . in the last sixteen years of Voltaire’s life reveals a distaste for Christianity amounting almost to an obsession.”  As Voltaire declared:  “Every sensible man, every honorable man, must hold the Christian sect in horror.”  All things Christian he despised—“the Trinity, the chastity of the Virgin Mary, the body and blood of Christ in the Mass, all are cruelly lampooned” (p. 265).  Though less malicious, his counterpart across the Channel was the skeptic “David Hume, ‘the complete modern pagan’” (p. 265).  Such paganism was on full display amidst the furor of the French Revolution, when Notre Dame was rededicated as a “temple of reason,” the clergy vilified and martyred, and Christianity widely denounced.  

Lest we think the Renaissance and Enlightenment were novel epochs, however, Smith thinks that paganism is simply man’s natural condition, and whenever Christianity recedes it resurges.  But what we see emergent today is an irreligious agnosticism—“secular humanism”—which finds nothing sacred, not even human beings.  There is no ultimate purpose or “telos” to anything, just an evolutionary unfolding of a material world.  To a Princeton philosopher, Walter Stace, the triumph of modern science gave us a philosophical (scientistic) naturalism “which is ‘purposeless, senseless, meaningless.  Nature is nothing but matter in motion.’  This new worldview, Stace thought, ‘though silent and unnoticed, was the greatest revolution in human history, far outweighing in importance any of the political revolutions whose thunder has reverberated through the world’”  (p. 286).  It lacks the consolations of paganism as well as theism.  

Yet not all thinkers embrace this revolution, portending the failure of sheer secularism.  Such was one of the “most influential (and thoroughly secular) English-speaking legal scholar and philosopher of recent decades, Ronald Dworkin” (p. 293).  As he aged, Dworkin hungered for something more than mere matter-in-motion.  He longed for a “moral realism” giving some basis for ethics as well as something “sacred” to endow his life with some sort of meaning.  And in his final book he “explicitly embraced ‘religion’—albeit ‘religious atheism,’ as he called it” (p. 296).  There could be nothing transcendent, so he found comfort in the views of Spinoza and Einstein, “whose philosophies he offered as representative of the kind of ‘religious atheism’ he himself advocated.”  To Spinoza, God and the world are one and the same.  And Einstein “did not believe in a personal god, . . . but he did ‘worship’ nature” (p. 298).  In the disenchanted world of modern secularism, Dworkin prescribed re-enchanting it with nature-worship.  It’s a revival of paganism, reclaiming “the city that Christianity wrested away from it centuries ago” (p. 330).

Consequently we have a divisive cultural war, pitting traditional believers committed to a transcendent Authority against a progressive cohort locating all moral authority within this world.  “In short, the conflicting orientations—toward ‘transcendent’ or conversely toward ‘inner-worldly’ sources of moral authority—reflected, and reflect, the competing transcendent and immanent religiosities” seeking to control America.  “In that sense, the condition of contemporary America is comparable to that of fourth-century Rome, when Christianity and paganism, each with its powerful representatives . . . struggled for mastery within the city” (p. 337).  For example, there is a struggle over symbols, traditionalists seeking to retain and progressives to remove Christmas crèches, Ten Commandments monuments, the “under God” addition to the Pledge of Allegiance, etc.  

Then there’s sex!  The past half-century has witnessed a momentous “struggle over a variety of issues connected in various ways with sexuality:  contraception, pornography, abortion, homosexuality, same-sex marriage.”  All were once illegal.  Now they’re all contested, and traditionalists are losing!  Though it may seem that a “new morality” is triumphing, it’s actually nothing new, for in antiquity sexual standards sharply divided Christians from the pagan world.  Contraception and abortion were embraced in the ancient world, as they are by today’s progressives, helping constitute what Mary Eberstadt calls “‘a new, quasi-religious orthodoxy’” (p. 256).  “As in Rome, it may seem, contemporary society ‘find[s] in erotic fulfillment nothing short of salvation’” (p. 356).  The victories progressives have won have come primarily in the nation’s courtrooms, where Christian values have been relentlessly disregarded.  This is vividly evident when considering contraception, “the expressive or symbolic core of the transformation in sexual morality” (p. 361).  Legal restrictions were discarded as the sexual revolution of the ‘60s commenced, and sexual intimacy was effectually severed from “its traditional connections to procreation and marriage” (p. 361).  Consequently, the law has discarded “the Christian norms of sexual morality and marriage that previously were officially recognized in law” and moved “decisively in the direction of a view of sexuality that resonates with the immanent religiosity of both ancient and modern paganism” (p. 368).

Modern pagans endeavor to dismiss freedom of religion along with traditional sexual standards.  Ancient pagans tolerated a variety of religions but routinely persecuted Christians.  That was because all the pagan religions shared a commitment to an immanent metaphysics.  When Christians injected their belief in a transcendent Creator who prescribes ethical absolutes, the pagans turned malevolent.  Two thousand years later, despite their pretense of “tolerance,” modern pagans increasingly illustrate an “intolerant tolerance” flourishing in the halls of Congress as well as social media.  Epithets such as “racist” or “bigot” or “Hitler” are plastered on whoever dares disagree with them.  These modern pagans are determined to repudiate the Supreme Court’s 1892 claim that “we are a Christian nation.”  Rejecting not only the notion that America is a Christian nation but also the 1993 Religious Freedom Restoration Act, today’s pagans want to stamp out any hint of transcendence in the public square.  What you do by yourself in your own home is fine, but you dare not bring your religious commitments into the workplace (bakers, florists, pharmacists, photographers) or school or armed services.  When, for example, the Indiana legislature recently passed a law securing religious freedom, “it was vehemently denounced by a veritable legion of politicians, pundits, government officials, scholars, CEOs, late night talk show hosts, athletic directors, and major corporations.  Boycotts were threatened.  Governors and mayors announced that public officials would not be reimbursed for travel to do business in the Hoosier state” (p. 398).  Hoosiers quickly buckled under the assault, and religious freedom retreated.  

Smith’s amply-documented, cogently argued case brings an insightful perspective on developments in our world.  

* * * * * * * * * * * * * * * * * * * * * * * *

In 1939 T.S. Eliot delivered a series of three lectures at Cambridge University, published the following year as The Idea of a Christian Society (New York:  Harcourt, Brace & World, Inc., c. 1940).  Though noting the “difficulties of the moment,” as Europe stood on the cusp of WWII, he insisted on addressing more urgent, fundamental issues, looking at things with a long lens and seeking to understand them.  He made no pretense to be a scholar or pundit, simply a poet trying to plumb the inner essence of what made Europe Europe—a Christian Society forged in Medieval times that was rapidly being displaced by what some thinkers judged a Pagan Society birthed by modernity.  That had not finally occurred, he thought, but he wondered if the once-prevailing Christian Society had any hope of resurgence.  To be clear, Eliot believed that a real Christian “can be satisfied with nothing less than a Christian organization of society—which is not the same thing as a society consisting exclusively of devout Christians.  It would be a society in which the natural ends of man—virtue and well-being in community—is acknowledged for all, and the supernatural end—beatitude—for those who have the eyes to see it” (p. 27).  

Such a society would feature many factors, synthesizing subsidiary “units of the community,” including family, workplace and church.  But its most notable feature would be education.  Indeed:  “A nation’s system of education is much more important than its system of government; only a proper system of education can unify the active and contemplative life, action and speculation, politics and the arts” (p. 33).  Unfortunately, education had become equated with “instruction,” generally of a utilitarian sort.  The sidelining of the liberal arts so evident in today’s universities was clearly on Eliot’s mind!  Still more, he understood that wherever the state takes control education becomes a tool for indoctrination, making youngsters devotees of the political regime.  But in a Christian Society “education must be religious,” not in that it is controlled by the clergy or committed to doctrinal indoctrination, “but in the sense that its aims will be directed by a Christian philosophy of life” (p. 30). 

A Christian philosophy of life would be maintained not because it was useful, but because it is eternally true.  At its heart is dogma rather than development.  And it would encourage a right “conformity with nature.”  Long before “ecology” became popular, Eliot lamented the mechanization of life following the industrial revolution, leading to both a “deformation of humanity” and “the exhaustion of natural resources,” so that “a good deal of our material progress is a progress for which succeeding generations may have to pay dearly” (p. 48).  He believed that “a wrong attitude towards nature implies somewhere, a wrong attitude towards God, and that the consequence is an inevitable doom.  For a long enough time we have believed in nothing but the values arising in a mechanized, commercialized, urbanized way of life:  it would be as well for us to face the permanent conditions upon which God allows us to live upon this planet.”  Without reverting to revering the primitives, we could well heed their examples in some areas.  Unfortunately, we’ve bowed before the altar of progress and thereby compromised our “spiritual knowledge and power.”  In fact:  “We need to know how to see the world as the Christian Fathers saw it; and the purpose of reascending to origins is that we should be able to return, with greater spiritual knowledge, to our own situation.  We need to recover the sense of religious fear, so that it may be overcome by religious hope” (p. 49).   

This, of course, was not Eliot’s England of a century ago, for it had been largely shaped by a Liberalism that was eroding the Christian Society it merely tolerated.  Committed to an insatiable progressivism, it had carelessly jettisoned religious traditions and dogmas.  “By destroying traditional social habits of the people, by dissolving their natural collective consciousness into individual constituents, by licensing the opinion of the most foolish, by substituting instruction for education, by encouraging cleverness rather than wisdom, the upstart rather than the qualified, by fostering a notion of getting on to which the alternative is a hopeless apathy, Liberalism can prepare the way for that which is its own negation:  the artificial, mechanized or brutalized control which is a desperate remedy for its chaos” (p. 12).  This Liberalism was, however, weakening, and a deeply non-Christian materialistic philosophy appeared poised to replace it.  

Eliot’s hope for a return to a robust Christian Society was, of course, not to be, as was evident in post-WWII Europe.  But his lectures stand as a monument to what once was—and what might be—if the Church could once again provide guidance for our world.

* * * * * * * * * * * * * * * * * * * * * 

In 1942, amidst a war that was “more disastrous than any that Europe had known since the fourteenth century,” the noted historian Christopher Dawson published The Judgment of the Nations (New York:  Sheed and Ward, c. 1942; republished by The Catholic University of America Press, 2011).  Noted for his many works detailing the power of The Faith in shaping Western Christian Culture, he wrote with dismay at the evident “disintegration” of that culture.  The world had changed more in the past century, he thought, than in any other “period in the history of the world” (p. 3).  WWII revealed the accelerating power of evil orchestrating “a spiritual catastrophe which strikes directly at the moral foundations of our society, and destroys not the outward form of civilization but the soul of man which is the beginning and end of all human culture” (p. 10).  

After analyzing the religious origins of European disunity and the failure of both Liberalism and the League of Nations, Dawson discussed “The Secularization of Western Culture.”  On a purely material level, the West has progressed impressively—our standard of living is demonstrably superior to that of our grandparents, not to mention their grandparents!  But:  “This is the greatness and misery of modern civilization—that it has conquered the world by losing its own soul, and that when its soul is lost it must lose the world as well” (p. 67).  The classical 19th century Liberalism of Adam Smith granted religion a small sphere of autonomy and influence—the “freedom of religion” established in America’s Bill of Rights.  “But the progress of mechanization and the social organization which it entails, has steadily reduced this margin of freedom, until today in the totalitarian states, and only to a slightly less degree in the democratic ones, social control extends to the whole of life and consciousness.  And since this control is exercised in a utilitarian spirit for political, economic and military ends, the complete secularization of culture seems inevitable” (p. 72).  Our technological society, it seems, plows ahead as relentlessly as a bulldozer, leveling everything to a purely secular level.  And it cannot but demolish both religion and personal freedom.  

Though Dawson lamented this destruction, he did not despair.  So he crafted a series of chapters calling for the “restoration of a Christian order” to contravene the planned societies espousing Socialism (whether in Stalin’s Russia or FDR’s America).  Mandating equality at the expense of freedom, modern “civilization” seeks to control both the natural world and the persons resident in it.  “A free culture is an unplanned culture” (p. 81).  Consequently, a planned culture is an unfree culture.  A planned culture cannot but destroy the freedoms needed for art, literature, philosophy and religion to flourish.   And so it goes!  


Despite disquieting episodes in the ‘90s—the riots in Los Angeles provoked by Rodney King’s arrest and O.J. Simpson’s acquittal being celebrated in many black communities—for three decades I naively assumed race relations were genuinely improving as Americans endorsed the aspirations and policies adumbrated by Martin Luther King, Jr.  Giving primacy to the “content of one’s character” rather than the “color of one’s skin” seemed the right recipe for racial justice.  But then Barack Obama (the nation’s first black President, who signified racial progress for many of us) helped expose and ignite racial divisions throughout the country.  This was evident as Obama defended Trayvon Martin and then helped inflame passions following the killing of Michael Brown in Ferguson, Missouri.  Thus was born Black Lives Matter, and evident (in pronouncements by Obama and his Attorney General Eric Holder) was our current and rapidly escalating crisis.    

So to re-think what’s happening today I turned to one of the more thoughtful analysts of race in America:  Shelby Steele.  Three decades ago he published The Content of Our Character: A New Vision of Race in America (New York: St. Martin’s Press, c. 1990), setting forth a position differing from the mainline civil rights establishment and its supportive liberal intelligentsia.  In the ‘60s he had been a militant black-power advocate, helping stage demonstrations and demanding instant solutions to racial injustices.  But ultimately he wearied of that and began to think more deeply.  By 1990, as an English professor at San Jose State University, Steele determined to challenge readers to envision new paths for America’s racial minorities, seeking to discern “the human universals that explain the racial specifics” (p. xi).  What’s needed, he thought, is a philosophical realism, thinking about what actually is rather than what ought to be.  “What one is after is the right fit of ideal to reality.  And reality must always have priority, accepting only those ideas that truly illuminate it” (p. xii).  Consequently, Steele thinks carefully and crafts arguments that are most enlightening—as relevant today as when first written.  

At the heart of the racial struggle, Steele thinks, is a “struggle for innocence.”  Trying to locate and blame others for one’s problems is a way of establishing one’s own purity.  Both whites and blacks do it.  Their rationales may differ but their ends are identical—to maintain a saintly impeccability.  Thus lamenting the legacy of slavery enables blacks to manipulate their past victimization into positions of  power.  But victims are by definition passive—and the “power” derived from victimization is a social rather than a personal exercise.  Blaming others actually empowers them inasmuch as they remain responsible for economic poverty or educational failures.  So protesting racism rather than promoting individual responsibility will never actually improve black lives, and promoting affirmative action and racial quotas will forever fail to motivate self-actualization and achievement.  

One path Steele rejects is the endless refashioning of politically correct terms in an elusive quest for self-respect.  Thus the newly-minted “African-American” label was, he thought, “yet another name to the litany of the names that blacks have given themselves over the past century” (p. 47).  While understandable:  “This self-conscious reaching for pride through nomenclature suggests nothing so much as a despair over the possibility of gaining the less conspicuous pride that follows real advancement.  In its invocation of the glories of a remote African past and its wistful suggestion of homeland, this name denies the doubt black Americans have about their contemporary situation in America” (p. 47).  New names change nothing and will never rightly establish one’s identity.   (Incidentally, it’s non-Indians who most strongly insist on calling American Indians “Native Americans”).  

What’s needed is a new way of acting, Steele says, a new (or is it the old Booker T. Washington strategy of patiently working within the American system?) way of taking the initiative to live creatively and well, of accepting responsibility for one’s actions, successes and failures.  In the 1990’s, the antiquated agenda of the 1960’s (appropriate though it was in that decade) no longer suffices. In particular, he argues, it’s time for blacks to stop blaming whites for their problems and get on with the business of personal and cultural achievement.  The book’s title, “the content of our character,” comes from a speech of Martin Luther King, Jr.  “What made King the most powerful and extraordinary black leader of this century,” Steele says, “was not his race but his morality” (p. 19).  What black leaders need today, he believes, is to recover King’s moral stance, to emphasize integrity and responsibility rather than ethnic victimization.  King’s message had power because it transcended race, binding men and women of all kinds together in a common endeavor.  King’s message had power because it was, in fact, not a “black power” message.

Unfortunately, too many black leaders routinely exploit white Americans’ guilt, asking for special treatment, thereby reducing themselves and their followers to inferiors needing a helping hand.  Tragically, “the price they pay for this form of ‘politics’ is to keep blacks focused on an illusion of deliverance by others, and no illusion weakens us more.  Our leaders must take a risk.  They must tell us the truth, tell us of the freedom and opportunity they have discovered in their own lives” (p. 174).  Steele himself represents—and seeks to speak for—the growing middle class black community.  But what distresses him is the persistence of racial sensitivity even in his own circles.  “As a middle-class black I have often felt myself contriving to be ‘black.’  And I have noticed this same contrivance in others—a certain stretching away from the natural flow of one’s life to align oneself with a victim-focused black identity” (p. 106). 

In a way, he argues, blacks choose to see themselves as inferiors, internalizing an inferiority rooted in alleged social discrimination rather than genetic factors, because it allows them to escape responsibility for competing and achieving as individuals.  Blacks lack power in America not simply because prejudice excludes them but because power comes solely to those who live responsibly.  “Personal responsibility is the brick and mortar of power” (p. 33).  The longer a group marches to the drumbeat of victimhood, even though it may elicit sympathy and applause and even reparations from the crowd, the longer it remains subservient and impotent.  This is not to excuse injustice, which abounds in America.  It is to insist that despite obstacles minorities can succeed here.  “Whites must guarantee a free and fair society.  But blacks must be responsible for actualizing their own lives” (p. 34).  Steele nowhere argues American society is fully free and fair!  He’s suffered discrimination.  Prejudice still stains our national life.  But it must be honestly depicted, not exaggerated as an excuse for immobility, not milked to preserve politicians’ positions.  We need not deny the injustices of the past to admit that “when today’s black college students—who often enjoy preferential admission and many other special concessions—claim victimization, I think that it too often amounts to a recomposition of denied doubts and anxieties they are unwilling to bear” (p. 61). 

Illustrating such preferential treatment, Steele cited Penn State University, which had a program which paid “black students for improving their grades—a C to C+ average brings $550, and anything more brings $1,100” (p. 90).  Minority students at Stanford University seized control of the president’s office several years ago, determined to make known their grievances, among which were complaints about their inadequate financial assistance—which for some totaled $15,000 a year!  Though he taught in a large state university, Steele appreciated the value of small black colleges.  Only 16 percent of black students enrolled in them, but they graduated 37 percent of all black graduates.  “Without whites around on campus, the myth of inferiority is in abeyance and, along with it, a great reservoir of culturally imposed self-doubt” (p. 136).  Consequently, black students in black colleges take more responsibility for their studies, work harder, and accomplish more. 

More broadly, if blacks can move beyond their “racial identity struggle” and begin to live as individuals in American society, Steele thinks this nation offers “a remarkable range of opportunity if we were willing to pursue it” (p. 168).  This book is highly personal, both in its style and its interpretations.  It clearly reflects the experience of only one black man in America.  Yet it’s worth reading, for it makes some important observations and offers some positive suggestions, though they do not lend themselves to political action.  Which is Steele’s message:  blacks as individuals must take charge of their lives.

             * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Shelby Steele followed up The Content of Our Character with A Dream Deferred:  The Second Betrayal of Black Freedom in America (New York:  Harper Perennial, c. 1998), bringing together essays explaining why any society racked by shame will fail.  The liberalism that developed in the 1960s embraced as its “all-consuming goal . . . the expiation of American shame rather than the careful and true development of equality between the races.  Shame pushed the post-sixties United States into an extravagant, autocratic, socialistic, and interventionist liberalism that often betrayed America’s best principles in order to give whites and American institutions an iconography of racial virtue they could use against the stigma of racial shame” (p. xiii).  By this time in his life Steele had become a “black conservative” and found himself subject to various forms of abuse, particularly on college campuses.  When he would give a lecture addressing racial concerns, “a virtual militia of angry black students would rush to the microphones and begin to scream” (p. 4).  They had no reasoned arguments, just irrational exclamations, curses hurled at a despised foe.    

They revealed the toxic consequences of a “self-esteem” pedagogy promoted in the ‘80s that made them angry and irrational.  Designed to boost self-esteem, on the assumption that students would perform better once their sense of identity improved, such policies inevitably failed.  For self-esteem follows successful performances.  Real success entails meeting high expectations.  (And it makes sense that blacks have actually excelled in precisely those demanding fields where discipline and hard work establish competence—athletics and music.)  But a large segment of the black community is locked into protecting its victimization status.  All inequalities, all injustices, must be understood as a result of white privilege and power.  Even those who enjoy immense advantages—Ivy League degrees, remunerative jobs, media celebrity—insist they’re victims.  Victimization becomes the only important factor worth considering:  “It leads us to believe that all suffering is victimization and that all relief comes from the guilt and good-heartedness of others” (p. 10).  Conversely, white liberals look “at black difficulties—high crime rates, weak academic performance, illegitimacy rates, and so on—and presume them to be the result of victimizing forces beyond the control of blacks” (p. 13).  And rather than try to rectify actual problems in the black community, white liberals relish lamenting their guilt and dispense hand-outs, effectively paralyzing the “poor” blacks they love to indulge.

Under the rule of what Steele calls “redemptive liberalism, we blacks lost the first chance we ever had in the United States to truly control our own fate.”  Following the abolition of segregation, blacks looked to whites to redeem them.  Lyndon B. Johnson’s “Great Society was the first ambitious expression of redemptive liberalism” (p. 47), and Steele observed the impact of its policies first-hand.  The rivers of cash flowing into black communities looked so wonderful!  But no one envisioned “the price we would pay,” the personal liberty that would be lost.  “Welfare without a time limit or an expectation of work may have shown white America as compassionate, but it also took the problem of poverty away from those who suffer it” (p. 47).  Sadly enough:  “It is not at all an exaggeration to say that the welfare policies of the last thirty years—direct expressions of redemptive liberalism—created the black underclass in America.  This class of husbandless homes, fatherless children, and healthy nonworking adults follows the incentive pattern of welfare policy perfectly” (p. 78).

Consequently the final decades of the 20th century were “possibly the saddest chapter” in black history.  Relying on race rather than accomplishment has sapped the strength of black Americans, for “to be human is to be responsible.  Correspondingly, living without responsibility constitutes a kind of inferiority, even when people are prevented by oppression from carrying responsibility for themselves” (p. 108).  Whatever relieves a person of responsibility—be it slavery or welfare or affirmative action—diminishes him.  Lowering academic standards in the name of racial justice is “the most dehumanizing and defeating thing that can be done to black Americans” (p. 113).  But it’s been done, and the results are clear in the failing schools of the nation’s inner cities.  Here’s the question that haunts Shelby Steele:  “if the Great Society was so good, why did black America produce its first true underclass after it was over?” (p. 124).  And that question—not “white supremacy” or “the legacy of slavery”—should prompt us all to rethink the problem of race in America.

That whites since the ‘60s have facilitated this destructive process—all in the name of racial justice—strikes Steele as a tragic betrayal of this nation’s finest principles.  In order to burnish their own sense of righteousness in “helping” blacks they have lost sight of the truly “first things” most desperately needed.  Rather than requiring everyone, both black and white, to embrace “such timeless American principles as self-reliance, hard work, moral responsibility, sacrifice, and initiative,” whites have resorted to “deference.”  Overlook, explain away, make excuses for black failures!  Be sufficiently remorseful to affirm one’s own goodness without requiring accountability in a “victimized” racial minority.  Treasure empathy and abjure judgment!  “And this deference is always a grant of license—relief from the sacrifice, struggle, responsibility, and morality of those demanding principles that healthy communities entirely depend on.  And virtually all race-related reform since the sixties has been defined by deference.  This reform never raises expectations for blacks with true accountability, never requires that they actually develop as Americans, and absolutely never blames blacks when they don’t develop.  It always asks less of blacks and exempts them from the expectations, standards, principles and challenges that are considered demanding but necessary for the development of competence and character in others” (p. 125).

                 * * * * * * * * * * * * * * * * * * * * * * * * * *

Current discussions of “white fragility” and “white privilege” make Shelby Steele’s White Guilt:  How Blacks and Whites Together Destroyed the Promise of the Civil Rights Era (New York:  HarperCollins Publishers, c. 2006) essential reading for anyone wanting to rightly understand race relations in America.  As George Will said, “Steele is America’s clearest thinker about America’s most difficult problem.  Braiding family memories with an acute understanding of national policies, he demonstrates what went wrong when whites for their reasons, and blacks for theirs, implanted the idea that white guilt explains black problems and can be the basis of policies for ameliorating them.”

A major watershed was passed (Steele thinks) circa 1968, when the stigma of “white guilt” replaced the “white racism” that had persisted for nearly four centuries.  Subsequently a “vacuum of moral authority that comes from simply knowing that one’s race is associated with racism” enervated whites in America, rendering them impotent in the face of “black power.”  Whites decided to rectify past injustices by devising programs of racial preferences, freeing blacks from the task of properly preparing for schools or qualifying for skilled jobs—“clearly implying an inherent and irredeemable black inferiority” (p. 134).  Emboldened as victims, blacks seem ironically intent on resurrecting segregation—black churches, black professional associations, a congressional black caucus, black student associations, etc. “Now in the promised land of freedom we reach for the lost Eden of separatism” (p. 26).  

As a young collegian Steele embraced black power.  Listening to a speech by Dick Gregory, who softly peddled the Marxist notion of social determinism, Steele turned away from the vision of Martin Luther King and his embrace of great American principles, which asked only that they be extended to all persons.  But Gregory’s rhetoric advocated outrage at the nation’s “systemic” or “structural” or “institutional” racism—something so elusive and toxic that it could never be overcome.  King himself would be replaced by “an entirely new kind of black leadership, not selfless men like King who appealed to the nation’s moral character but smaller men, bargainers, bluffers, and haranguers—not moralists but specialists in moral indignation—who could set up a grift with white guilt” (p. 34).  Ultimately, Steele realized that Gregory and his companions were “not fighting to end racism as King had always done; he was giving us the ideas we needed to enlarge it” (p. 35).  So Jesse Jackson would promote not a color-blind society but a “rainbow coalition,” and interpret every racial incident as proof of systemic racism.  “This is why one black man being beaten by police in Los Angeles could trigger a massive riot in which some sixty people were killed” (p. 36).  White guilt, Steele explains, determined O.J. Simpson’s “innocent” verdict, something that would never have happened 20 years earlier.  To black jurors, the race consciousness promulgated by leaders such as Dick Gregory had become more important than judicial fairness.  Embracing this illusion enabled most blacks to persuade themselves that individual initiative and responsibility were less powerful than social determinism.  If you were a “powerless victim of racial oppression, this new morality of social justice meant you could not be expected to carry the same responsibilities as others” (p. 53).  You were also free from “moral constraints, and even the law” (p. 54).  Rioting and looting as “social justice”!

Playing the “race card” enabled blacks to propound “an unwritten law more enforceable than many actual laws:  that no black problem—whether high crime rates, poor academic performance, or high illegitimacy rates—could be defined as largely a black responsibility, because it was an injustice to make victims responsible for their own problems.  To do so would be to ‘blame the victim,’ thereby repeating his victimization.  Thus, in the national consciousness after the sixties, individual responsibility became synonymous with injustice when applied to blacks” (p. 55).  The outraged response to Daniel Moynihan’s The Negro Family illustrated this phenomenon.   It was clear in 1965 that “whites simply could not criticize black life without being seen as racist, no matter what their intentions were.  His fine study immediately became an untouchable document in both government and academia.  He was made an object lesson for America’s intellectual class:  castigation and disregard await all white scholars who see black poverty outside a context of victimization” (p. 121).  Thenceforth has transpired an enduring “culture war between two political and moral cultures, one grounded in principle and values, the other in dissociation [i.e. separating personal and social morality]—the former broadly focusing the right, the latter focusing the left” (p. 174). 

This, however, was not what Steele had learned as a child.  “I had been raised around what might be called the ‘good man’ ethic.  A good man was the one you turned to when work got really tough, when quality counted, when deadlines had to be met.  A good man always finished what he started.  Such men were quiet figures of dignity in my working class neighborhood.  And in the name of this ethic I had continuously held some sort of job since my sixth-grade paper route” (p. 47).  His father was a “good man” who had only a third grade education.  But he worked hard at various tasks, taking advantage of economic opportunities, buying and restoring “three ramshackle homes,” making a good living for his family.  He looked for opportunities rather than complaining about his circumstances.  Young Shelby followed suit, industriously looking for work and trying to get ahead.  Propitiously, he was hired by the Chicago Transit Authority as a bus driver, a job he did proudly and well.  But then, enamored with Dick Gregory’s rhetoric, he abruptly resigned in order to marinate in Black Power resentment!  In time he came to lament that decision—but in microcosm it demonstrates the misdirection that afflicted the civil rights movement, which preferred to indulge in resentment rather than take advantage of opportunities.  Though not in vogue with today’s ruling class, Shelby Steele’s solution to racial strife is the best I know.

341 Sensible Environmentalists

The late novelist-physician Michael Crichton, in his 2003 “Remarks to the [San Francisco] Commonwealth Club,” presciently said:  “The greatest challenge facing mankind is the challenge of distinguishing reality from fantasy, truth from propaganda.  Perceiving the truth has always been a challenge to mankind, but in the information age (or as I think of it, the disinformation age) it takes on a special urgency and importance.”  His admonition now guides Patrick Moore, who in 1971 helped launch and lead Greenpeace for 15 years.  One of the most radical environmental organizations, employing confrontational tactics, Greenpeace challenged various countries and corporations, trying to eliminate atmospheric nuclear testing and halt the killing of whales and baby seals.  He and his “little band of protesters” showed how a few “dedicated people could effect real change at a global level.”  Within a decade Greenpeace became a major movement, bringing in $100 million a year with offices and staff around the world.  

But ultimately Moore grew disillusioned with the organization he’d founded.  He came to believe:  1) “sustainable development” provides the key to preserving the environment; and 2) the increasingly extremist and irrational views of many of his erstwhile allies discredit them.  Rather than protesting problems he wanted to propose solutions.  So he wrote two truly significant books.  The first was Confessions of a Greenpeace Dropout: The Making of a Sensible Environmentalist (Beatty Street Publishing Inc.; Kindle Edition, c. 2010, rev. 2013).  Explaining, he said:  “The truth is Greenpeace and I underwent divergent evolutions.  I became a sensible environmentalist; Greenpeace became increasingly senseless as it adopted an agenda that is antiscience, antibusiness, and downright antihuman.  This is the story of our transformations” (p. 9).  Holding a PhD in ecology, Moore was the only credentialed scientist among Greenpeace leaders!

Moore thinks Greenpeace and similar environmental organizations “have adopted policy after policy that reflects their antihuman bias” and rejected the very scientific and technological innovations that help both people and environment.  Too many of them “are stuck in the 1970s and continue to promote a strain of leftish romanticism about idyllic rural village life powered by windmills and solar panels” (p. 19).  “They oppose forestry even though it provides our most abundant renewable resource.  They have zero tolerance for genetically modified food crops, even though this technology reduces pesticide use and improves nutrition for people who suffer from malnutrition.  They continue to oppose nuclear energy, even though it is the best technology to replace fossil fuels and reduce greenhouse gas emissions.  They campaign against hydroelectric projects despite the fact that hydro is by far the most abundant renewable source of electricity.  And they support the vicious and misguided campaign against salmon farming, an industry that produces more than a million tons of heart-friendly food every year” (p. 16).  In the book’s conclusion, having given ample scientific data to support his views, Moore repeats and expands upon these pivotal points, which are the essence of this treatise.  

Though he discusses, in scholarly detail, many of these issues, he mainly seeks to show how “environmentalism has gone off the rails and has become an apocalyptic religion that is self-defeating and demoralizing” (p. 46).  To illustrate he cites a declaration by Robert Kennedy Jr. that could easily be duplicated by the likes of Joe Biden:  “Our generation faces the greatest moral and political crisis in human history.  Will we take the steps necessary to avert catastrophic global warming or will we doom our children to a new Dark Ages in a world that is biologically and economically impoverished and defined by ever diminishing quality of life? . . .   The scientific debate is over except among a few polluter-financed junk scientists and ideologically blinded flat Earthers” (p. 47).  That none of these assertions is true—and all can easily be disproved—seems not to concern either Kennedy or the ruling class he represents.

A 1984 conference in Kenya prodded Moore to become a sensible, rather than a radical, environmentalist.  Here he understood, for the first time, the importance of helping people work wisely and productively with the natural world.  Sustainable development could be both ecologically attuned and technologically advanced.  “I came away from Nairobi a changed person.  I now realized that as an environmentalist I could either act as if the more than seven billion people didn’t matter (or pretend they didn’t even exist) or I could expand my thinking to include them as part of the challenge.  The latter approach seemed both more honest and more intellectually stimulating.  It got me outside the box of purely environmental thinking and into the real world of recognizing the entire system” (p. 170).  

Leaving Greenpeace in 1986, he returned to his childhood home on Vancouver Island and launched a fish farm, growing salmon.  Rather than depleting the oceans’ wild salmon, growing them in seawater ponds promised both to provide fresh fish for market and to protect the species.  To his amazement his old Greenpeace associates condemned him!  In a few years large corporate fish farms made it impossible for him to compete, but this endeavor opened doors for him to support the forestry business, which “was being unfairly used as a whipping boy.”  His family had, for 75 years, run a sawmill, so when he had an opportunity to work as director of the Forest Alliance in British Columbia he jumped at the chance.  This elicited a “firestorm of public and private invective” from tree-hugging radicals who called him an “eco-Judas.”  Their anger was fueled by documents such as “Forests in Trouble,” published by the World Wildlife Fund, which claimed there was a worldwide “crisis,” irresponsibly repeating “many of the false claims being spread by the anti-forestry” forces.  Trees are an eminently renewable resource, absolutely needed for our well-being, and Moore believes they “are the answer to many questions about the future of human civilization and the preservation of the environment” (p. 244).  Well-tended forests can perpetually provide us with invaluable materials and concurrently recycle carbon dioxide.  In addition to tapping wood energy, using fossil fuels properly suits a “sensible” environmentalism, particularly in the transportation system.  In fact:  “It may turn out to be a very good thing that humans discovered fossil fuels and started burning them for energy,” because CO2 greens the planet and helps counteract harmful cooling.  “This is perhaps my most heretical thought:  that our CO2 emissions may be largely beneficial, possibly making the coldest places on earth more habitable and definitely increasing yields of food crops, energy crops, and forests around the entire world” (p. 462).

* * * * * * * * * * * * * * * * * * * * * * * * * * * *

A decade after publishing Confessions of a Greenpeace Dropout, Patrick Moore updated it in Fake Invisible Catastrophes and Threats of Doom (NP:  Ecosense Environmental, Kindle Edition, c. 2021).  “Awhile back it dawned on me,” he says, “that the great majority of scare stories about the present and future state of the planet, and humanity as a whole, are based on subjects that are either invisible, like CO2 and radiation, or extremely remote, like polar bears and coral reefs.  Thus, the vast majority of people have no way of observing and verifying for themselves the truth of these claims predicting these alleged catastrophes and devastating threats.  Instead, they must rely on the activists, the media, the politicians, and the scientists—all of whom have a very large financial and/or political stake in the subject—to tell them the truth.  This welcomes the opportunity to simply invent narratives such as the claim that ‘CO2 emissions from burning fossil fuels are causing a climate emergency’” (p. 11).  Such propagandists “are definitely a scurrilous and dishonest lot” and honest scientists have an ethical obligation to refute and denounce them.  A prime example of the propaganda scientists should pillory is Al Gore’s “effective piece of misinformation in his film An Inconvenient Truth” (p. 66).

Rather than touch upon all of the issues he considers—e.g. endangered polar bears and walruses, genetically modified seeds, pernicious plastics, etc.—I’ll focus on two chapters.  One addresses the current “climate of fear and guilt.”  Earth’s climate is ever-changing—“sometimes relatively rapidly, sometimes very slowly, but always surely.”  Hoping for a “perfect stable climate” is as futile as hoping the weather will be constantly pleasant, day after day, forever!  Yet alarmists routinely cry out about an “existential threat” to our climate if radical steps are not taken to stop global warming.  They irresponsibly cite both higher and lower temperatures, both more snow and drought, disappearing glaciers and species’ extinction, dying forests and coral reefs, crop failures and acidic oceans, cancer and heart disease.  In fact, Moore insists, “there is no hard evidence that any of these things have been or will be triggered by human-caused emissions of CO2.  It is all conjecture based on the hypothesis that carbon dioxide controls temperature, which itself has never been determined as fact.”  Most of these claims are predictions, not observations, and are too frequently “based on simulations, which are computer-generated models created by authors who decide what they want their model to predict and then build assumptions into the model that provide them with the results they are looking to achieve” (p. 33).

The alleged culprit fueling the global warming hysteria is, of course, CO2.  We humans are allegedly pushing inordinate amounts of it into the atmosphere, thereby endangering life on earth.  This is manifestly untrue!  The actual amount of CO2 has been declining for 150 million years.  Today’s level is “much lower than it had been during the majority of the existence of modern life” and was 15 times higher at the inception of modern life.  Though you’d never imagine it if you listen to the doom-sayers decrying the “climate crisis,” the earth has been cooling for the past 50 million years.  “The irony of what the alarmists are saying concerning the temperature of the Earth and that it is too hot, is that it is actually colder than it has been during most of life’s existence, and that life, historically, has better flourished during the warmer periods than the comparatively colder periods, like we are in today” (p. 59).  We’re now, geologically, in the Holocene Interglacial era, during which there was a 4,000-year Climatic Optimum wherein “the average temperature of the Earth was at least 1ºC (1.8ºF) warmer than today.”  The Sahara Desert was then green and supported “towns and livestock-herders” (p. 69).  It’s called the “Climatic Optimum” for a reason—it was a wonderful time for all kinds of life to flourish.

We’re now in the “Neoglacial” era, wherein temperatures descended “into the coldest period since the early beginnings of the Holocene.  The Little Ice Age, which reached its coldest around 1650-1700 AD, followed the Medieval Warm Period, when the Vikings colonized and farmed southern Greenland” (p. 70).  From the Little Ice Age we are now emerging into the Modern Warm Period, which began around 1700.  “Human emissions of carbon dioxide from 1700 to 1850 were insignificant and yet historical records indicate the Earth warmed at about the same rate during that period as it has since” (p. 71).  The global warming that has actually occurred since 1850 amounts to a 1.2ºC (2.2ºF) rise in temperature, quite typical of the ups and downs throughout the planet’s history.  Unfortunately, ill-informed activists and politicians want to radically change the world’s economy to “fight catastrophic climate change.”  Thus they promote “renewable energy—in particular wind and solar—devices which have nothing renewable in their machinery” (p. 78).  Instead, Moore insists, we should grow more trees, build more hydroelectric dams, promote nuclear energy, and utilize geothermal heat pumps.

The second chapter I’ll examine deals with nuclear energy, “one of the safest, if not the safest technology, for generating electricity on the basis of casualties per unit of energy produced” (p. 147).  In his Greenpeace days Moore mindlessly opposed it.  But having educated himself he now sees it as one of the great goods available to mankind.  It is “the most cost-effective, feasible, and timely” answer to our energy needs.  It “will likely be the most important energy technology for the next 100 years and beyond” (p. 328).  Unfortunately, environmental doom-sayers have misled the public and only a few nations have embraced it.  There have been only three significant accidents, and only one (Chernobyl) wrought fatalities.  Every year 1.35 million people die in roadway accidents, while “there have been no more than 60 nuclear-power-related fatalities from the more than 440 nuclear power plants worldwide; and all of these fatalities were from Chernobyl and the freak accident that occurred because of their poorly designed reactor” (p. 153).  Nothing’s perfectly safe!  But nuclear power comes closest among our means of generating electricity.  On the other hand, he discounts the efficacy of solar and wind power, subjecting these industries to careful analysis and showing how ultimately they can only supplement the more reliable generators of electricity we need.

* * * * * * * * * * * * * * * * * * * * * * * * 

In accord with Patrick Moore, Michael Shellenberger believes sustainable development requires positive “post-environmental” policies designed to wisely use nature’s resources.  He and a colleague were named Time magazine “heroes of the environment” in 2008, and he served as a respected expert reviewer of one of the Intergovernmental Panel on Climate Change (IPCC) reports.  For 20 years he has published extensively in prominent newspapers such as the New York Times, the Washington Post, and the Wall Street Journal.  In Apocalypse Never:  Why Environmental Alarmism Hurts Us All (New York:  HarperCollins, c. 2020; Kindle edition), Shellenberger offers a significant critique of the movement.  Representing much that Shellenberger critiques is Bill McKibben, an influential environmentalist who in 2019 published Falter, arguing that climate change is the “greatest challenge humans have ever faced.”  He writes regularly for influential media such as The New York Times and heads an organization with a $20 million annual budget.  He promotes the secular religion which has replaced God with Nature.  Its devotees have constructed an “apocalyptic environmentalism” which gives them “a purpose: to save the world from climate change, or some other environmental disaster” (p. 264).  Shellenberger believes thoughtful people, rooted in copious evidence, must reject the “apocalypse now” hysteria and champion sane solutions.

He begins his treatise with a 2019 TV interview featuring two spokesmen for “Extinction Rebellion,” a radical group spreading climate warming fears.  Some 6000 of them had blocked bridges in London and they warned that “billions of people are going to die” if draconian measures are not adopted.  They claimed:  “‘Life on Earth is dying.’  And, ‘Governments aren’t addressing it.’”  Their endeavors garnered the praise and support of many journalists and celebrities, and one extensive survey showed that nearly half of the world’s people actually believe “climate change would make humanity extinct.”  Such alarmism is rapidly gaining political momentum—as is evident in the “green new deal” promoted by the Biden administration.  Sadly enough, such activists rarely get the “facts and science right.”  Shellenberger believes “environmental scientists, journalists, and activists” have betrayed their “obligation to describe environmental problems honestly and accurately, even if they fear doing so will reduce their news value or salience with the public.  Much of what people are being told about the environment, including the climate, is wrong, and we desperately need to get it right.  I decided to write Apocalypse Never after getting fed up with the exaggeration, alarmism, and extremism that are the enemy of a positive, humanistic, and rational environmentalism” (p. xi).

Determined to show “it’s not the end of the world,” Shellenberger devotes a number of chapters to defusing the misleading claims environmental activists make as they scheme to panic the public.  He shows why “Earth’s Lungs Are Not Burning.”  Despite much media madness, we need not worry about the Amazon rainforest!  Certainly there are fires in that vast region—and a fear-monger, such as Greta Thunberg, can easily take pictures suggesting the whole region is endangered—but in fact fires are nothing new.  Some of the more arresting photos weren’t taken in the Amazon and others had been taken many years ago.  “In reality, almost everything the news media reported in summer 2019 about the Amazon was either wrong or deeply misleading” (p. 30).  Actually earth’s “forests are returning, and fires are declining.  There was a whopping 25 percent decrease in the annual area burned globally from 1998 to 2015” and “new tree growth exceeded tree loss for the last thirty-five years, by an area the size of Texas and Alaska combined” (p. 32).  The globe, thanks in part to more available carbon dioxide, is greening!  And that’s a good thing!

Shellenberger champions nuclear power, lamenting its unfair, fear-inducing misrepresentations.  He laments the influence of proponents of the “Green New Deal” such as New York’s Rep. Alexandria Ocasio-Cortez, who calls for the abolition of the industry.  Had she even the slightest interest in the truth, she would compare Germany with France.  Heavily invested in nuclear power, France spends half as much for electricity as Germany and emits only one-tenth the carbon!  “Had Germany invested $580 billion into new nuclear power plants instead of renewables like solar and wind farms, it would be generating 100 percent of its electricity from zero-emission sources and have sufficient zero-carbon electricity to power all of its cars and light trucks, as well” (p. 152).  But the Germans congratulate themselves for their “green” commitments!  More broadly, during the past 50 years, “the world spent about $2 trillion for nuclear, and $2.3 trillion for solar and wind” and “received about twice as much electricity from nuclear as it did from solar and wind” (p. 152).  Soon after WWII, President Eisenhower stressed the prospects of “atoms for peace,” providing cheap energy for the world.  If only the world had embraced his vision!  But within a decade environmental activists injected fears of radiation into the public mind.  Folks such as Ralph Nader and Jane Fonda, and groups like the Sierra Club, Greenpeace, and the Natural Resources Defense Council, all “tapped into significant anxieties over nuclear weapons among baby boomers who had been subjected to duck-and-cover drills, where teachers ordered them to prepare for the apocalypse by hiding under their desks as schoolchildren, not to mention both government and Hollywood propaganda films” (p. 162).  They successfully opposed building any nuclear power plants, using the courts to delay and needlessly encumber their construction.  They thereby deprived the world of its best electrical energy source.

Unfortunately, these activists persuaded politicians to build air-polluting fossil fuel plants or invest in patently misguided solar and wind power plants (which he calls the “unreliables”).  Wind and solar sound wonderfully “renewable” until you dig into the data.  Neither is constant, so you must have other power plants (usually natural gas) up and running to supply energy when the wind dies and the sun sets.  Here again Germany is instructive.  For 20 years the Germans have promoted “renewables,” investing nearly a half-trillion dollars in them.  However, “Germany generated just 42 percent of its electricity from wind, solar, and biomass, as compared to the 71 percent France generated from nuclear in 2019.  Wind and solar were just 34 percent of German electricity, and relied upon natural gas as back-up.”  But, ominously, serious problems have surfaced.  “Germany’s electricity grid came close to having blackouts for three days in July 2019.  Germany had to import emergency power from neighboring nations to stabilize its grid” (p. 184).  Still more:  “Renewables contributed to electricity prices rising 50 percent in Germany since 2007.  In 2019, German electricity prices were 45 percent higher than the European average” (p. 184).  So, Shellenberger concludes:  “there is no amount of technological innovation that can solve the fundamental problem with renewables.  Solar and wind make electricity more expensive for two reasons:  they are unreliable, thus requiring 100 percent backup, and energy-dilute, thus requiring extensive land, transmission lines, and mining.  In other words, the trouble with renewables isn’t fundamentally technical—it’s natural” (p. 185).  Few things better illustrate apocalyptic pipe-dreams than building inefficient windmills while disabling nuclear power plants!  But that’s us—not homo sapiens but homo credulous, willing to believe anything!

340 Hitler, Stalin, Roosevelt

Since monstrous despots seem self-evidently inhumane, many of us fail to fathom why millions of generally decent, ordinary folks followed the likes of Hitler and Stalin.  We too easily fancy that had we been there we would have responded quite differently—discerning their deviancies and resisting their allure.  So reading first-hand accounts of folks close to them enables us to better appreciate how easily they fell prey to these despots’ charms and manipulative powers.  Ernst Hanfstaengl, in Hitler:  The Memoir of a Nazi Insider Who Turned Against the Fuhrer (New York:  Arcade. Kindle Edition, c. 1957, 2011), gives us fascinating insights into this phenomenon.  The author’s mother was a blue-blooded New Englander, his father German.  The Hanfstaengls were prominent Bavarians with important political ties who owned a successful art-publishing house in Munich, dealing mainly in high-quality reproductions.

Since his father wanted him to take over the New York branch of the business, it was decided that, after attending school in Germany, Ernst would go to Harvard University to finish his academic work and get fully acquainted with his mother’s country.  In 1905 he did so and there “made friends with such outstanding future figures as T. S. Eliot, Walter Lippmann, Hendrik van Loon, Hans von Kaltenborn, Robert Benchley, and John Reed” (p. 26).  He also made friends with Teddy Roosevelt’s eldest son, who told the President that Hanfstaengl had composed a song the Harvard football team adopted, with Hanfstaengl himself playing it before the games.  (Ironically, this little tune was appropriated by the Nazis, who changed the words “Rah, rah, rah!” to “Sieg Heil, Sieg Heil!”).  Subsequently he was invited to the White House to entertain the household and would spend time with Teddy after he left Washington.  One time they “got to talking about art, literature, and politics, and the ex-President came out with the phrase which has stuck with me ever since:  ‘Hanfstaengl, your business is to pick out the best pictures, but remember that in politics the choice is that of the lesser evil’” (p. 28).

Later, running the family business in New York and dining in the Harvard Club, he made friends with the young Franklin D. Roosevelt, giving him still more important connections.  When FDR won the 1932 election, he knew Hanfstaengl was close to Hitler, so he dispatched a private emissary, urging him to “do my best to prevent any rashness and hotheadedness.  ‘Think of your piano-playing and try and use the soft pedal if things get too loud,’ my visitor quoted.  ‘If things start getting awkward, please get in touch with our ambassador at once.’  The message heartened me enormously, and in due course I was to do just that” (p. 188).  He also developed a deep respect for the United States and her industrial powers.  When World War I broke out an anti-German hysteria gripped America, and the government seized “the assets of the Hanfstaengl firm in the final months of the war.  They were worth half a million dollars and were sold at auction for about $8,000” (p. 30).

Following the war Hanfstaengl stayed in New York for three years, running a small business he established.  But in 1921 he returned to Germany, finding a war-ravaged land with a demoralized populace.  Hoping to support politicians who would help the country recover, he attended a rally in Munich featuring a relatively unknown orator, Adolf Hitler.  Hanfstaengl was overwhelmed by “his gifts,” for “he had a command of voice, phrase and effect which has never been equalled, and on this evening he was at his best” (p. 34).  The crowd responded enthusiastically.  “It sounded like the demoniacal rattle of thousands of hailstones rebounding on the surface of a gigantic drum.  It had been a masterly performance.  I had really been impressed beyond measure by Hitler.  . . . .  With his incredible gifts as an orator he was clearly going to go far, and from what I had seen of his entourage there seemed no one likely to bring home to him the picture of the outside world he manifestly lacked, and in this I felt I might be able to help” (p. 37).

Greeting Hitler after the speech, Hanfstaengl offered his help, thus beginning a complex relationship that endured until 1936.  They saw each other frequently and Hitler often visited the Hanfstaengl home.  Hitler especially liked Hanfstaengl’s young son and seemed to have a special fondness for children.  But he most enjoyed Ernst’s ability to play the piano.  “Probably one of the main reasons why he kept me near him for so many years, even when we began to differ radically over policies, was this particular gift I apparently possessed of playing the music he liked in exactly the orchestral style he preferred.”   Hitler cared little for Bach or Mozart, but he had an insatiable craving for Wagner’s Meistersinger, Tristan and Isolde and Lohengrin.  “I must have played them hundreds of times and he never grew tired of them” (p. 50).  In those days, he confesses:  “I was an idealistic National-Socialist, I make no bones about it. It is a term which meant many things to different men, and I was no politician, but a piano player and art lover with ambitions to become a historian.  I had a better eye for effects than causes.  I had seen Germany degraded and destituted, and wanted to see the return of the comfortable and traditional values of my youth, combined with an honoured and respected position for what were then still called the working classes.  Behind a cloud of words and threats and exaggerations, I thought this was what Hitler wanted.  Above all, in his second surge of political activity, I was convinced again that nothing was going to prevent him from reaching the top.  If only the radicals like Strasser and Goebbels and the crackpots like Rosenberg and Hess could be off-set by people of more cosmopolitan views, in which I included myself, I believed the social revolution he preached would be orderly and beneficial. I was convinced, to use the old phrase, that there was every possibility of this poacher becoming a reliable gamekeeper” (p. 172).

In those early years there was much about Hitler to admire, though his close associates were less attractive.  They were petty-minded and constantly jockeying for power, willing at any time to slander or eliminate their rivals.  Many had immoral, “unsavoury habits” which Hitler disregarded.  His main ideological guide, Alfred Rosenberg, was “intrinsically illiterate, carried along by his ridiculous Nordic race resentments.”  Though Hitler considered him a great philosopher, Hanfstaengl demurred:  “‘It is tripe,’ I insisted, ‘and tripe remains tripe.’  I really did talk to him like this, any number of witnesses will confirm it. . . .  Rosenberg is a dangerous and stupid man and the sooner you get rid of him the better.  As events turned out I might just as well have been talking to a brick wall” (p. 122).  Detailing a decade of Nazi development and showing how Hitler evolved into an increasingly dictatorial leader, Hanfstaengl’s portraits of men such as Himmler, Goering, Goebbels, Hess, et al. reveal his deepening concern for the trajectory of the movement.  When Hitler gained strength in the early 1930s, he needed someone capable of interacting with the world’s press corps, so he arranged a meeting and said:  “‘Herr Hanfstaengl, I have come to ask you to take over the post of foreign press chief of the Party.  Great things are before us.  In a few months, or at the most in a couple of years, we must irresistibly sweep to power.  You have all the connexions and could render us a great service’” (p. 152).  Ever hopeful of injecting some balance and wisdom into the movement, Hanfstaengl thought that “this was my best opportunity of entering on the ground floor on equal terms with the wild men of the Party whose influence I had always feared, so in the end I agreed” (p. 152).

His hopes foundered, however, as Hitler took control of the country in 1933.  Hanfstaengl was effectively pushed aside by the “wild men of the Party” and only occasionally saw Hitler himself.  Any cautionary notes he might sound, any restrained policy he might suggest—particularly if it dealt with the Jews—were quickly disregarded.  Hanfstaengl further detected a shift in Hitler’s rhetoric, as when he said:  “‘Now it is the heroic Weltanschauung which will illuminate the ideals of Germany’s future. …’  I pulled myself together with a start.  What was this?  Where had I read that before?  This was not Schopenhauer, who had been Hitler’s philosophical god in the old Dietrich Eckart days.  No, this was new.  It was Nietzsche” (p. 206).  A few months earlier Hitler had visited Nietzsche’s aged sister, who had given him her brother’s last walking stick.  It was as if something shifted within him, and he soon began spouting “Nietzschean catch phrases” such as “Wille zur Macht, Herrenvolk, Sklavenmoral—the fight for the heroic life, against formal dead-weight education, Christian philosophy and ethics based on compassion” (p. 208).

Sadly, Hanfstaengl confesses:  “Too many of us realized too late that the regeneration of the national life and economy was only part of the goal.  Hitler and a majority of his followers really believed their anticlerical, anti-Semitic, anti-Bolshevist, xenophobic catch-phrases and were prepared to keep the whole country in uproar in order to put them into practical effect” (p. 232).  Mid-way through 1934, after the killing of Ernst Roehm, it became clear to him that Hitler was a “pathological murderer” who must be opposed.  Hanfstaengl had naively helped bring “to power a bunch of dangerous gangsters” who would do incredible harm.  Two years later he slipped across the Swiss border and determined to live in exile so long as the Nazis controlled Germany.  He and his son managed to find refuge in England, but when the war broke out he was placed in various internment camps.  Later he was relocated to Canada.  While there he managed to get a message delivered to President Roosevelt, offering his assistance and was brought to a hide-out near Washington in order to provide intelligence.  But before long the British demanded he be returned to their custody, and after the war he was sent to yet another internment camp in Germany.  So for nearly a decade he suffered rather shabby treatment in various camps.  Finally freed, he wrote his book on Hitler, offering us a unique perspective on the man and his movement.

                                             * * * * * * * * * * * * * * * * * * * * * * * * * *

One of the most distinguished scholars in the post-WWII era, Robert Nisbet, examined an important aspect of that war in Roosevelt and Stalin:  The Failed Courtship (Washington, D.C.:  Regnery Gateway, c. 1988).  It’s a deeply tragic story, showing how the arrogance of an American President harmed millions of innocent people by imagining he could charm and manipulate a devious dictator.  Ignoring the advice of well-informed advisors who actually knew a great deal about the Bolsheviks and their leader, and condescending to Winston Churchill, who represented to him an antiquated imperialism, FDR thought he could win the war and reshape the post-war world through his personal finesse.  Much that he did, the experienced diplomat George Kennan said, grew out of groundless assumptions and a manifest “puerility unworthy of a statesman of FDR’s stature” (p. 6).  But as FDR told a former ambassador to the USSR, William Bullitt:  “‘I think if I give him everything I possibly can, and ask nothing from him in return, noblesse oblige, he won’t try to annex anything and will work with me for a world of peace and democracy’” (p. 6).  To bank on noblesse oblige from a mass-murderer shows the depth of the president’s naïveté!

Soon after the United States entered WWII, Roosevelt wrote Churchill and assured him “‘that I think I can personally handle Stalin better than either your Foreign Office or my State Department’” (p. 15).  He’d never met Stalin, nor did he know much about Russia, but he had no doubts he could handle things!  Throughout the 1930s New Deal liberals such as Harry Hopkins had perennially praised and supported the Bolsheviks, so Nisbet says:  “It is impossible to understand the wartime White House or even Roosevelt’s leadership in the war without reference to Harry Hopkins as friend, adviser, envoy, and trusted confidant to the President” (p. 20).  Hopkins, of course, was a social worker turned bureaucrat with decidedly socialistic propensities.  He had visited Moscow in July, 1941, and returned totally enthralled by Stalin, who had treated him royally, so he continually prodded FDR to treat the USSR as a favored nation and to support its tyrant.

Stalin hated Churchill but developed a superficial rapport with Roosevelt.  He distrusted virtually everyone, including eminent Bolsheviks who occupied prominent positions within the regime, yet he easily learned how to manipulate the American.  Stalin desperately needed supplies only America could provide and wanted the Allies to open a second front in Europe to ease the Nazis’ military pressure on Russia.  So when FDR suggested face-to-face meetings he was happy to oblige.  They first met, along with Churchill, in Teheran, Iran, in November 1943, where Stalin schemed to see FDR three times before the official sessions began, allowing the two of them to make decisions without Churchill’s participation.  In these sessions Roosevelt promised to allow the USSR to exercise control over Poland and the Baltic states when the war was over.  In return, FDR gained support for his vision of a post-war United Nations.  These private talks undermined both Churchill and the Anglo-American military leaders, and throughout the official sessions FDR unfailingly supported Stalin while poking fun at Churchill.  “‘If the tale is true,’ writes Keith Eubank, ‘Roosevelt had insulted Churchill who admired him, and demeaned himself before Stalin who trusted neither man.  In his craving for Stalin’s approval and friendship, Roosevelt imagined the joke had been on Churchill and that Stalin had laughed with him.  More probably Stalin had laughed at the President of the United States for belittling an ally to find favor with a tyrant’” (p. 49).

In Nisbet’s judgment, “Teheran can be compared to Munich in 1938,” when Chamberlain appeased Hitler, and it marked the beginning of the Cold War.  “What would take place at the later Yalta summit meeting would be little more than a formalizing, a moralizing, to cover what had essentially been decided between Roosevelt and Stalin at Teheran” (p. 49).  Admiral King, one of the American chiefs at the meeting, “said at the end:  ‘Stalin knew just what he wanted when he came to Teheran and he got it’” (p. 50).  Unlike FDR, Churchill hated Communism and had gone to Teheran believing Germany would lose the war; so “‘The real problem is Russia.  I can’t get the Americans to see it’” (p. 50).  He left the conference depressed and pessimistic, realizing what would soon come to pass.  Churchill also thought the “total war” strategy of Stalin and Roosevelt, designed to utterly destroy Germany, would prove disastrous.  But FDR’s plan, as promoted by America’s Secretary of the Treasury, Henry Morgenthau, gained authorization; it called for “the complete pastoralization of Germany,” confiscating all of her industrial equipment and permanently occupying the country.  “At Teheran, Roosevelt had agreed with Stalin that Germany must be dismembered and perhaps divided into half dozen or more small and separate states” (p. 55).  As one observer noted, it would replace “factory workers with shepherds and goat herders.”  Though this did not actually happen following the war, since Harry Truman was President, the Morgenthau proposal shows the degree to which FDR and Stalin wanted to radically re-frame the post-war world.

Roosevelt and Stalin also opposed Churchill’s Mediterranean strategy, whereby the Allies would move quickly from Africa through Italy into Germany.  This would potentially defeat the Nazis and simultaneously deter the Soviets from occupying Eastern Europe—a strategy eminent generals such as Mark Clark strongly favored.  Following the war Clark wrote:  “A campaign that might have changed the whole history of relations between the Western world and the Soviet Union was permitted to fade away” because the decision had been made at Teheran to open a second front instead (Calculated Risk, p. 368).  German officers, talking after the war, were mystified by this decision, but Stalin knew it would keep the Allies out of the Balkans, delivering them to the Red Army.  So he and FDR determined the Allies would invade France and move eastward.  “It is safe to say that had Churchill’s vision been allowed to prevail, the postwar history of eastern Europe and also central Europe, not to forget the Cold War against the West, would be somewhat different” (p. 61).  Opening a second front in France gave the Red Army time to march deep into Germany and seize Berlin.  Even then, given the rapidity with which Ike’s Anglo-American troops swept eastward, they might have reached Berlin first, for an American army reached the Elbe River, 60 miles from Berlin.  The Germans were furiously battling the Russians and left a path to Berlin relatively free for the Americans.  General Simpson recalled that he “had six or seven divisions” on the Elbe with sufficient supplies to “have gone right on to Berlin within twenty-four to forty-eight hours easily” (p. 87).  Still more:  “I have the feeling that maybe the Germans would have welcomed us” (p. 87).

However, General Eisenhower stopped him, giving the German capital to the Russians.  “Stalin’s joy must have been intense.  He knew very well the value of Berlin and the crucial importance of being first to reach the bunker that housed Hitler” and others.  “The Soviet capture of Berlin, courtesy of General Eisenhower, would be a crowning completion to a larger Soviet plan to assume hegemony in all of central Europe—Vienna and Prague included.  Stalin knew this; and he knew that Churchill had been working against its possibility from early in the war.”  Stalin also knew that Ike would have made his military decisions in accord with the East-West policies of FDR.  “Stalin might well have considered it another generous gift from the President, in accord with their private discussion at Yalta” (p. 84).  Churchill protested, knowing full well what would follow, but he could do little about it.    

Though the truly major decisions had already been made at Teheran, “Roosevelt’s courtship of Stalin proceeded apace at Yalta.  Of all the episodes of the Second World War, the Yalta summit in early February 1945 probably has the worst odor” (p. 69).  As ever, FDR scoffed at the “experts” who cautioned against trusting Stalin and charted his own course, granting “moral legitimation” to the Soviet occupation of territories conquered by the Red Army by issuing the Declaration on Liberated Europe.  Under its provisions, Timothy Garton Ash says, the peoples of “liberated” East Central Europe would be “‘compelled to abandon their hopes of Democracy, Sovereignty, Independence, Representative Government—to use Churchill’s own list’” (p. 71).  Churchill later termed the document “fraudulent” inasmuch as it served only one purpose:  to justify Soviet control over East Central Europe.  In private conversations FDR granted Stalin’s every request.  Anxious to involve the USSR against Japan, America’s President promised to give Stalin the southern half of Japan’s Sakhalin Island and the Kurile Islands.  Amazingly:  “If Churchill is to be trusted, Roosevelt’s faith in Stalin even reached the point where he expressed intent to share the secret of the atom bomb with the Soviet leader” (p. 74).

The iron fist of Stalin (a name taken from the Russian word for “steel”) appeared wherever Soviet troops prevailed.  A month after the Yalta accords “mass arrests were taking place in Cracow, with whole trainloads of Polish intellectuals, priests, professors, and labor union leaders being taken to a huge prison-work camp” (p. 78).  Similar things happened in the Baltic states and Rumania.  Churchill wrote FDR a long letter informing him of such developments, stressing that “‘we are in the presence of a great failure and an utter breakdown of what was settled at Yalta’” (p. 79), but Roosevelt would not join the British Prime Minister in opposing Stalin, for appeasing the Soviets shaped FDR’s policies.  He rejected not only Churchill’s advice but that of his own ambassador to Moscow, Averell Harriman, who wrote, just a month after Yalta:  “‘I feel the time has come to reorient our whole attitude, and our method of dealing with the Soviet government.  Unless we wish to accept the 20th century barbarian invasion, with repercussions extending further and further in the East as well, we must find ways of arresting the Soviet domineering policy.’  In a separate message, Harriman wrote:  ‘We must come to clearly realize that the Soviet program is the establishment of totalitarianism, ending personal liberty and democracy as we know it’” (p. 81).

FDR died on April 12, 1945, just two months after Yalta.  Would he have reconsidered his relationship with and promises to Stalin had he lived longer?  Probably not, because one of his deepest desires was to accomplish what Woodrow Wilson failed to do—remaking the world.  FDR wanted, Sir John Wheeler-Bennett says, not only to “‘establish the United Nations but to superimpose upon it an American-Soviet alliance which should dominate world affairs to the detriment of Britain and France, and to this end he made copious concessions to Marshal Stalin’” (p. 95).  In quest of that goal he consigned millions to misery for four decades.  Courting a monster inevitably entails falling prey to his machinations.

# # #