324 Untethered Minds

As a prototypical, optimistic “progressive,” believing the world was getting better and better, and after devoting his life to celebrating biological and societal evolution, H.G. Wells in 1945 wrote a final, deeply pessimistic book, entitled Mind at the End of Its Tether, grieving that everything seemed to be flying apart and nothing made sense.  Decades earlier the great Irish poet W.B. Yeats had written an equally doleful poem, “The Second Coming,” lamenting the shape of things to come:

 Turning and turning in the widening gyre
The falcon cannot hear the falconer;
Things fall apart; the center cannot hold;
Mere anarchy is loosed upon the world,
The blood-dimmed tide is loosed, and everywhere
The ceremony of innocence is drowned;
The best lack all conviction, while the worst
Are full of passionate intensity. 

Such passionate intensity is routinely visible in today’s campus protests, fueled by angry, profanity-spouting youngsters determined to prohibit controversial speakers from speaking.  Sure enough:  “Mere anarchy is loosed upon the world”!  Their adolescent incoherence is thoughtfully analyzed by Mary Eberstadt in Primal Screams:  How the Sexual Revolution Created Identity Politics (West Conshohocken, PA:  Templeton Press, c. 2019).  Screaming youngsters, she insists, are indirectly asking a deeply personal, perennial, and now unanswered question:  Who Am I?  In the past, living in families and surrounded by stable communities, answering that question was relatively simple.  (For example, I could say I am my father’s son, reared on the high plains, and immersed in the life of a local Church of the Nazarene.)  Today, however, increasing numbers of folks cannot really find roots in such communities and turn to various groups wherein they seek to anchor their identities.  When this turn takes on political dimensions, they embrace “the desires and agendas” of aggrieved factions, providing a base whereby “human beings outside those chosen factions are treated more and more not as fellow citizens, but as enemies to be eliminated by shame, intimidation, and, where possible, legal punishment” (p. 7).

Allan Bloom had earlier discerned this development in his widely-discussed The Closing of the American Mind, wherein he described students as reared in accord with Rousseau’s prescriptions in Emile, “in the absence of any organic relation between husbands and wives and parents and children.”  Consequently, Bloom said:  “That is it.  Everyone has ‘his own little separate system.’  The aptest description I can find for the state of students’ souls is the psychology of separateness.”  Bloom blamed divorce as the primary reason for such separateness.  Now, thirty years later, we must factor in the astonishing increase of out-of-wedlock births, altogether resulting in what Eberstadt calls “The Great Scattering.”  Fractured families frequently mean not only missing fathers but fewer (if any) siblings and cousins and grandparents who are part of one’s life.  Still more (a point made in Eberstadt’s How the West Really Lost God):  youngsters without stable families have difficulty believing in and worshipping God.  Without family or faith to tether them to abiding realities, growing numbers of people seek to find their identity in self-selected groups.

Thus we witness the emergence of “identity politics.”  To answer the question “Who Am I?” when traditional ways have collapsed, millions of moderns have lapsed into “one of the most revealing features of identity:  its infantilized expression and vernacular” (p. 64).  To speak personally, I have been utterly perplexed while witnessing irrational behavior on university campuses as well as in congressional committee meetings!  Allegedly educated persons are, in fact, screaming rather than speaking coherently.  There are now “safe spaces” as well as “tiny ersatz treehouse stuffed with candy, coloring books, and Care Bears” on the campuses of the nation’s most prestigious universities (p. 66).  Apparently taking their cues from university professors, thousands of alienated youngsters take solace in identity groups, including feminism, androgyny, the #MeToo movement, etc., etc.

“‘Destroying the family life of highly social, intelligent animals leads inevitably to misery among individual survivors and pathological misbehavior among the group,’ J. M. Coetzee, recipient of the Nobel Prize for Literature, has explained.  He was speaking of elephants, of course” (p. 103).  But it’s also true of humans.  Consequently:  “Identity politics is not so much politics as a primal scream.  It’s the result of the Great Scattering—our species’ unprecedented collective retreat from our very selves.”  Indeed:  “Anyone who has ever heard a coyote in the desert, separated at night from its pack, knows the sound.  The otherwise unexplained hysteria of today’s identity politics is nothing more, or less, than just that:  the collective human howl of our time, sent up by inescapably communal creatures trying desperately to identify their own” (p. 109).

* * * * * * * * * * * * * * *

An important aspect of our current culture is diagnosed by Douglas Murray in The Madness of Crowds:  Gender, Race and Identity (London:  Bloomsbury Publishing, Kindle Edition, c. 2019), explaining that “we have been living through a period of more than a quarter of a century in which all our grand narratives have collapsed” (#36).   Consequently, “We are going through a great crowd derangement.  In public and in private, both online and off, people are behaving in ways that are increasingly irrational, feverish, herd-like and simply unpleasant” (#31).  As Yeats lamented, “Things fall apart, the center cannot hold; / Mere anarchy is loosed upon the world.”  For many centuries the West was nourished by some “grand narratives,” including the heritage of the classical world of Greece and Rome as well as the religious traditions of Judaism and Christianity.  For half a century now those narratives have been shunted aside in favor of a “new religion” best evident in various versions of “‘social justice’, ‘identity group politics’ and ‘intersectionalism’” (#60).  Consequently, “identity politics” provides “the place where social justice finds its caucuses.  It atomizes society into different interest groups according to sex (or gender), race, sexual preference and more” (#66).  The identity groups Murray describes are gays, women, non-white races, and transsexuals.  Much of the book is devoted to detailing illustrations of these four groups, and anyone wanting a very up-to-date journalistic accounting of what’s taking place throughout our world can glean ample information from perusing its pages.

But the real worth of The Madness of Crowds is the philosophical analysis Murray provides.  All these identity groups share common intellectual roots that are manifestly Marxist (updated by fashionable, academic postmodernists such as Foucault and Gramsci), and all feel they are engaged in a great class struggle.  There are the haves and the have-nots, but today’s exploiters are not so much capitalists as patriarchs.  So:  “At the top of the hierarchy are people who are white, male and heterosexual.  They do not need to be rich, but matters are made worse if they are.  Beneath these tyrannical male overlords are all the minorities: most noticeably the gays, anyone who isn’t white, people who are women and also people who are trans.  These individuals are kept down, oppressed, sidelined and otherwise made insignificant by the white, patriarchal, heterosexual, ‘cis’ system.  Just as Marxism was meant to free the labourer and share the wealth around, so in this new version of an old claim, the power of the patriarchal white males must be taken away and shared around more fairly with the relevant minority groups” (#975).   Thus when we hear about “toxic masculinity” or “white privilege” or “rape culture” we need to remember such slogans are all lethal weapons in our culture war.

Many of these phrases are manifestly nothing more than slogans which frequently contradict each other.  But “Marxists have always rushed towards contradiction. The Hegelian dialectic only advances by means of contradiction and therefore all the complexities – one might say absurdities – met along the way are welcomed and almost embraced as though they were helpful, rather than troubling, to the cause” (#1099).  Those of us perplexed by declarations of men claiming they are women—as irrational as any statement could possibly be—are simply dismissed by the cultural Marxists as “logocentric” (i.e., thinking logically).  As Harvard psychologist Steven Pinker “wrote in 2002, ‘Many writers are so desperate to discredit any suggestion of an innate human constitution that they have thrown logic and civility out the window . . . The analysis of ideas is commonly replaced by political smears and personal attacks . . . The denial of human nature has spread beyond the academy and has led to a disconnect between intellectual life and common sense.’  Of course it had.  . . . The purpose had instead become the creation, nurture and propagandization of a particular, and peculiar, brand of politics. The purpose was not academia, but activism” (#1119).

This essentially Marxist narrative has been recently amplified, Murray argues, by social media.  In the blink of an eye the world has been transformed by “a communications revolution so huge that it may yet make the invention of the printing press look like a footnote in history” (#2030).  Thoughtful books and wisely-edited newspapers have smaller audiences today, for Twitter and Facebook postings have superseded them.  “It is there that assumptions are embedded.  It is there that attempts to weigh up facts can be repackaged as moral transgressions or even acts of violence.”  Social media enables anyone “to address everything, including every grievance.  And it does so while encouraging people to focus almost limitlessly upon themselves – something which users of social media do not always need to be encouraged to do” (#2056).

And not only can one say anything about anything—everything he has ever sent into cyberspace is there forever.  Using words or espousing positions which were once quite acceptable may be used to assail folks.  Something one may have tweeted a decade ago as an adolescent can be uncovered and weaponized to destroy him through excoriation and “public shaming.”  Social media “appears able to cause catastrophes but not to heal them, to wound but not to remedy” (#3280).   Especially absent is any possibility of forgiveness.  We face “the question that the internet age has still not begun to contend with:  how, if ever, is our age able to forgive?  Since everybody errs in the course of their life there must be – in any healthy person or society – some capacity to be forgiven.  Part of forgiveness is the ability to forget.  And yet the internet will never forget” (#3320).  Even the words of one’s father may be resurrected to punish a person, as the race car driver Conor Daly discovered when he lost a sponsorship because, ten years before he was born, his father had used a racially inappropriate word in a radio interview.

To cope with the madness of crowds, the kind of common-sense reasoning Murray provides must be recovered in all segments of our society.

* * * * * * * * * * * * * * *

A significant and largely unintended consequence of the sexual revolution is elucidated by Warren Farrell and John Gray in The Boy Crisis:  Why Our Boys Are Struggling and What We Can Do About It (Dallas, TX:  BenBella Books, Inc., c. 2018; Kindle Edition).  Older men, such as myself, grew up in a time when “masculinity came with a built-in sense of purpose of being the provider-protector (e.g., warrior; sole breadwinner)” (p. 10).  As boys we wanted to grow up and assume the responsibilities of mature adults.  We had a reason for being.  But young men today frequently fail to find one.

To prove there is in fact a crisis, Farrell considers boys’ mental, physical, and economic health as well as their educational success.   Men kill other men and themselves far more frequently than do their female counterparts.  Indeed, the data are depressing!  Though “only 6 percent of the overall population, black males make up 43 percent of murder victims.   More black boys between ten and twenty are killed by homicide than by the next nine leading causes of death combined” (p. 16).  As many white men have killed themselves as have died of AIDS.  As soon as they enter puberty, boys’ suicide rates soar:  “between ten and fourteen, boys commit suicide at almost twice the rate of girls.  Between fifteen and nineteen, boys commit suicide at four times the rate of girls; and between twenty and twenty-four, the rate of male suicide is between five and six times that of females” (p. 16).  Indeed, “the male-female suicide gap in the United States has tripled since the Great Depression” (p. 273).  “Women cry, men die!”  Men also fill the nation’s jails and prisons, where “93 percent are male and are disproportionately young” (p. 18).

Though women were in the distant past called the “weaker” sex, that is certainly not true if one considers longevity as a marker of physical well-being, for men and boys are twice as likely to die as their female counterparts of the same age, making for “a greater life-expectancy gap than at any time since World War II” (p. 20).  Indeed:  “Being male is now the single largest demographic factor for early death,” says Randolph Nesse, Director of the Center for Evolution and Medicine at Arizona State University (p. 20).  Young men are alarmingly overweight and unfit.  As another indicator of declining physical well-being, researchers have tracked an alarming drop in sperm counts:  “Boys today have sperm counts less than half of what their grandfathers had at the same age” (p. 20).  Economically, the picture is equally dreary, especially for men who don’t go to college.  “Over the last forty years, the median annual earnings of a boy with just a high school diploma dropped 26 percent.”  He is 20 percent more likely to be unemployed for significant periods.  And if a young man lives “in an urban area, he’ll likely live in one of the 147 US cities in which young women under thirty haven’t just caught up to their male peers, but now outearn them (by an average of 8 percent)” (p. 26).  If he had a university degree things would be different, of course, but many men are failing to pursue higher education.

“Worldwide, reading and writing skills are the two biggest predictors of success. These are also the two areas in which boys fall the most behind girls.  In the United States, by eighth grade, 41 percent of girls are at least ‘proficient’ in writing, while only 20 percent of boys are.  Many boys used to ‘turn around’ in about their junior or senior year of high school.  Anticipating the need to become sole breadwinner, and therefore gain familial pride, peer respect, and female love, they got their act together.  The expectation of becoming sole breadwinner became his purpose.  No longer.  In one generation, young men have gone from 61 percent of college degree recipients to a projected 39 percent; young women, from 39 percent to a projected 61 percent” (p. 28).  And these well-educated young women almost always refuse to consider less-educated men as potential husbands!

Digging more deeply into the boy crisis, Farrell identifies a lack of purpose as one of its primary reasons.  “The Japanese call it ikigai, or ‘a reason for being.’  Japanese men with ikigai are less likely to die of heart disease.  And both sexes with ikigai live longer.   Whether we call it ikigai or sense of purpose, when we pursue what we believe gives life meaning, it gives us life.  Historically, a boy’s journey to prove himself is what gave him that sense of purpose” (p. 46).  Protecting and providing for a wife and family has, throughout human history, given men ikigai.  But today, in Japan as well as much of the modern world, boys struggle to find it.  Much of this results from men being less and less needed to provide food and shelter for their families.  They also have far fewer heroes to emulate.  “What is a hero?  The word hero derives from the root ser, from which we also get the word “servant” (think “public servant”), as well as slave, and protector.  In Japan and China, the word samurai also derives from the word for servant, saburai.  Billions of boys throughout history have embraced the opportunity to serve and to protect in the hope of being labeled a hero or samurai.  Though the fiercer the enemy, the greater their chance of death, boys were willing to exchange their lives for the label.  They were, in a sense, slaves to the potential honor they might receive if they served and protected their families, villages, or countries” (p. 62).  Occasionally our youngsters see such heroes in action.  Consider the first responders on 9/11—99% were males!  In fact 76% of the firefighters in the country are volunteers, virtually 100% men!  So there are heroes in our midst, but too often young boys are fed anti-hero messages in feminist-run schools and popular culture.

Thus parents need to strategically prepare their sons for adulthood, and that requires preparing them for employment in our digital age.  If they do well in school, opportunities abound for them if they persevere and find a well-paying slot in the economy.  If they’re not academically inclined, it’s important to help them train for well-paying blue-collar jobs—welding, plumbing, and the like.  Participating in athletics is often crucial in helping boys become men.  Farrell provides lots of practical tips for parents (and grandparents) wanting to help their boys mature.  Above all, in a culture celebrating instant gratification and victimization:  “The discipline of postponing gratification is the single most important discipline your son needs” (p. 98).   But practical advice may mean little unless we face “the most important single crisis in developed countries:  dad-deprived children, and especially dad-deprived boys” (p. 102).  Boys reared without an attentive father are inevitably harmed.  If their dads die, boys do OK, for they have memories of good men.  But when they lose their dads through divorce or never even know them because they were born out of wedlock, their stories frequently end poorly.  For those concerned, Farrell provides an appendix listing “some seventy ways that children benefit from significant father involvement—or put another way, seventy-plus ways in which dad-deprived children are more likely to suffer” (p. 117).  They are more likely to fail in school, to join gangs, to go to prison, to lapse into various addictions, to fail in marriage.  To cite only one painful fact:  “Prisons are the United States’ men’s centers (93 percent male).  A staggering 85 percent of youths in prison grew up in a fatherless home.  More precisely, prisons are centers for dad-deprived males—boys who never became men” (p. 120).  In short:  boys without dads do poorly!

But Farrell does more than alert us to problems.  He sets forth quite detailed ways in which dads can help rear healthy boys.  Simply being present in a boy’s life is hugely significant.  Merely interacting with a father boosts a boy’s IQ, strengthens his ability to trust others, reduces aggressive behavior, and enables him to develop rightly.  Stepfathers, unfortunately, have less (if any) positive influence.  Nor do same-sex parents!  Only biological fathers can fill the crucial role of fathering.  Beyond being present, good dads should preside over routine family dinners—a remarkably important ritual for children.  They can also enforce behavioral boundaries, whereas moms often set but fail to enforce them.  One boy half-joked:  “My mom warns and warns; it’s like she ‘cries wolf.’  My dad gives us one warning, and then he becomes the wolf” (p. 136).  Still another illustration:  women are more likely than men to give underage teenagers alcohol, admitting “that their desire to please trumped what they knew was right” (p. 140).  Dads normally roughhouse with and tease their kids—teaching them important lessons they never learn from their mothers.  They can lead them on wilderness excursions, camping trips, adventures of various sorts demonstrably valuable for youngsters.  They challenge their kids to accomplish things (whether in sports or school) and allow them to deal with defeats.

Farrell devotes many pages to the problem of divorce—and to ways to cope with it.  He also suggests legal changes to enable men to be better fathers.  But the main message of The Boy Crisis is just that:  it’s a crisis and it’s devastating our culture.  Though wildly overstating the case, Jed Diamond claims:  “The Boy Crisis is the most important book of the 21st century.  Farrell and Gray are absolutely brilliant,” showing “why our sons are failing.”  Indeed:  “If you care about the very survival of humankind, you must read this book.”

323 “Social Justice” Casualties

In Why Meadow Died: The People and Policies That Created The Parkland Shooter and Endanger America’s Students (Post Hill Press, c. 2019, Kindle Edition), Andrew Pollack maintains that his daughter, Meadow, died not because of guns or NRA deviousness but because permissive school district policies enabled the killer (Nikolas Cruz) to escape proper treatment and unleash his fury on the students of Marjory Stoneman Douglas High School, located in Parkland, Florida.  According to Meadow’s brother, Hunter:  “If one single adult in the Broward County school district had made one responsible decision about the Parkland shooter, then my sister would still be alive.  But every bad decision they made makes total sense once you understand the district’s politically correct policies, which started here in Broward and have spread to thousands of schools across America” (#111).

This is not, of course, to diminish the responsibility of Nikolas Cruz!  He was, as is detailed in three lengthy chapters, a troubled young man.  Indeed:  “There was something profoundly dark and disturbed at the core of Nikolas Cruz’s soul.  Even his mother, Lynda, described her son as ‘evil’” (#1816).  His kindergarten teachers worried about his aggressiveness and fantasies.   He was known to enjoy torturing animals as well as threatening other students, and wherever he went he misbehaved and wreaked havoc.  In and out of special schools designed to help disturbed youngsters, he was frequently identified as a threat to both himself and others.  His teachers and counselors feared him.  “They knew about his obsession with guns and dreams about killing people.  They were so frightened that they took the extremely rare step of contacting his private psychiatrist. Yet not only did they return him to a traditional high school at an unprecedented speed, they also enrolled him in JROTC, a course in which he would learn to shoot using an air gun that resembled an AR-15” (#2187).  While the killings were taking place many staff and students suspected Cruz was the killer.  Sheriff’s officers had over the years responded to calls at Cruz’s home a total of 45 times.  But nothing was done to deal effectively with him.  They were all committed to following “the philosophy of the Broward school district, as expressed by Superintendent Runcie: ‘We are not going to continue to arrest our kids’ and give them a criminal record” (#2554).

Following the Parkland shooting, many Americans demanded action, and politicians quickly began posturing, promising, and endlessly pontificating.  Responding to the outrage, President Trump set up a “listening session” and invited Parkland parents, including Andy Pollack, to attend.  He spoke briefly and urged that practical steps be taken to prevent further tragedies.   Subsequently the president talked with him and his son, discussing how to make the nation’s schools safer.  Returning to Florida, Pollack determined to memorialize his daughter by building a playground in her memory (Princess Meadow’s Playground) and establishing a nonprofit, Americans for Children’s Lives and School Safety (CLASS).  Yet his efforts were barely noticed amidst the massive national publicity generated by a group of Parkland students who organized a “March For Our Lives” focused solely on gun control.

But Pollack knew guns were not the real problem.  So he began an intensive investigation, determined to understand why his daughter had died, and he concluded the main culprit was a pernicious political correctness that pervaded Broward County bureaucracies:  school district officials, mental health providers, and law enforcement officers all failed.  “The only man who could have stopped him, School Resource Officer Scot Peterson, refused to enter the building and actively prevented other officers from entering” (#839).  He drew his gun—and stood still, safely hiding for 50 minutes!  “Ever since Columbine, police have been trained to immediately confront a school shooter,” but Peterson stayed safe!  Five other deputies arrived, donned bulletproof vests, and listened (from safe distances, hiding behind cars or trees) to the gunfire killing kids.   Eleven long minutes passed before some of the deputies dared enter the building, long after the shooter had fled the scene.  Two courageous teachers died trying to protect the students, but law enforcement officers lacked their resolve.

Especially culpable, in Pollack’s view, was the school superintendent, Robert Runcie, who meticulously followed federal guidelines issued by Barack Obama’s Secretary of Education, Arne Duncan.  He hewed carefully to the agenda promoted by “social justice activist groups” which insisted schools serve “as laboratories for social justice engineering and force politically correct policies into our schools based on the assumption that teachers are too prejudiced to be trusted to do the right things.  One policy is known as ‘discipline reform’ or ‘restorative justice.’  Activists and bureaucrats worried that minority students were being disciplined at higher rates than white students, and rather than recognize that misbehavior might reflect bigger problems and inequities outside of school, they blamed teachers for the disparity.  They essentially accused teachers of racism and sought to prevent teachers from enforcing consequences for bad behavior.  They thought that if students didn’t get disciplined at school, if instead teachers did ‘healing circles’ with them or something, then students wouldn’t get in trouble in the real world.  Superintendents then started pressuring principals to lower the number of suspensions, expulsions, and school-based arrests.  All that actually happened was that everyone looked the other way or swept disturbing behavior under the rug, making our schools more dangerous” (#227).

To personalize his presentation Pollack portrays a number of folks intimately involved in the event.  There’s a math teacher, Kimberly Krawczyk, who was almost killed and quickly became disillusioned with the school district’s cover-up endeavors.  And there’s an immigrant father, Royer Borges, who “moved his family from Venezuela to America in 2014 to keep them safe” (#756).  His son was shot and seriously injured, so he hired an attorney to represent him.  Doing research, the attorney found an essay that linked the shooting with Parkland’s progressive educational policies, especially the district’s PROMISE program, which had been heavily funded by wealthy leftists such as George Soros.  PROMISE was proposed and implemented to help “the victims of institutional racism” by refusing to arrest and punish public school students.  When Royer Borges learned about PROMISE’s permissive prescriptions, he “was furious.  He couldn’t believe that public officials had decided that the law shouldn’t apply in schools.  And he couldn’t believe that no one was going after Broward’s leaders for rolling the dice with children’s lives. It made no sense to Royer why instead of going after these local officials, everyone was marching on Washington, D.C. for gun control.  Venezuela had total gun control.  That’s how the government and the colectivos were able to terrorize the citizens” (#831).

Adding scholarly heft to the book is its co-author, Max Eden, a senior fellow at the Manhattan Institute for Policy Research, who had long researched and written about education and its needed reforms, concluding that a “‘social justice industrial complex’ had taken hold of American education” (#1010).  Using money from Obama’s 2009 stimulus bill, Arne Duncan sought “to incentivize (some might say bribe) states to follow DCPS’s policy lead on test-based teacher evaluations and the new (and much-hated) Common Core academic standards.”   Educrats from across the country, attending “‘woke’ conferences and training programs, . . . learned that the fastest path to career advancement is to fake statistical progress for minority students while passionately decrying privilege and institutional racism” (#1015).  Florida’s “Broward County was the standard-bearer for the new approach to school discipline: an aggressive push for leniency on the grounds that racially biased teachers were unfairly punishing minority students” (#1017).  Eden was also deeply distressed by the conduct of the Broward County Sheriff’s Department, which, determined to end the school-to-prison pipeline, joined the school district and its PROMISE program in refusing to arrest adolescents.  Despite multiple calls to Cruz’s house and repeated warnings regarding his conduct, he was given a “free pass” that ultimately enabled him to launch his killing spree.

An unexpected hero in Why Meadow Died is a 19-year-old homeschooler named Kenny Preston, who proved to be the most tenacious and perceptive “journalist” writing about the shootings.  When he recognized two of the students killed by the shooter, Preston began studying the incident and was soon appalled by the reactions of Broward County authorities.  He spotted Superintendent Runcie’s instant concern to deflect attention from himself rather than mourn the victims’ deaths and was distressed by Sheriff Israel’s callous response to questions.   “That’s when something inside of Kenny flipped.  The bodies of children who had been murdered under Runcie’s leadership were still lying on the schoolhouse floor directly behind him, and he had already started politicking” (#1315).  So Kenny Preston dug into the documents he could access online and interviewed a number of persons, including “Robert Martinez, a recently retired school resource officer, who told him, ‘We all knew some sort of tragedy like this was going to happen in Broward.  You can’t just stop arresting kids without expecting something like this.  As officers, our hands were tied.’  More alarming still, Martinez told Kenny that district officials had explicitly told school resource officers not to arrest students for felonies, in addition to the official PROMISE misdemeanors” (#1419).  Kenny’s online articles proved more perceptive than the mainstream media’s coverage, which could do little more than repeat anti-NRA bromides.  In fairness, some of the local Florida papers did more honest work, but the story detailing Why Meadow Died remained largely for her father to tell!

Concluding that the Broward County school board needed to change, Andy Pollack and a group of activists motivated by the Parkland shootings decided to challenge its entrenched power structures.  So they ran candidates who mounted a vigorous campaign.  But all was for naught!  Broward County reelected the seasoned politicians aligned with Robert Runcie, and little was done to address the real problems in the district.  Though he had been nonpolitical (never even voting) before the shootings, Pollack finally realized:  “This happened in a Democrat county with a Democrat sheriff, a Democrat superintendent, and a Democrat school board, implementing Democrat ideas on criminal justice, Democrat ideas on special education, and Democrat ideas on school discipline.  And after Democrat voters gave all these Democrats a resounding vote of confidence in the school board election, the Democrat teachers union president, Anna Fusco, wrote in a Facebook group about our campaign for accountability:  ‘Now you can all shut up!’  Meanwhile, at the national level, Democrat organizers swooped in and weaponized my daughter’s murder for their Democrat agenda and to fund-raise to elect more Democrats” (#6220).

* * * * * * * * * * * * * * * * * * * * * * * * * * *

In Stand Down: How Social Justice Warriors Are Sabotaging America’s Military (Washington:  Regnery Gateway Editions, Kindle Edition, c. 2019), James Hasson explains:  “The Army that I entered as a second lieutenant during President Obama’s initial years in office was nothing like the Army I left [as a captain] in late 2015” (#7) because of an “eight-year social engineering campaign against our armed forces” (#15) waged by “hard-left ideologues” such as Ray Mabus, Brad Carson, Deborah Lee James and Eric Fanning, who occupied “some of the most influential national security positions” (#23).  They were all committed to radical feminist and LGBT ideologies and implemented “gender equality” programs, following President Obama’s orders.  He had famously promised to fundamentally transform the country, and Hasson believes he certainly did so in the one realm “over which he would exercise nearly complete control,” the military.   Illustrating such changes, a 2012 article in Stars and Stripes described how, in one Washington state post:  “The Army is ordering its hardened combat veterans to wear fake breasts and empathy bellies so they can better understand how pregnant soldiers feel during physical training” (#2182).   Then, “in 2015, Army ROTC cadets at multiple universities participated in ‘Walk a Mile in Her Shoes’ events on campuses.  The events—‘designed to raise awareness about sexual violence against women’—had male Army cadets replace their combat boots with bright red high-heeled shoes” (#2183).  So Stand Down “is the story of what will be President Obama’s enduring legacy:  the sacrifice of the combat readiness of our armed forces to the golden calves of identity politics and progressive ideology” (#94) shaped and driven by homosexual and radical feminist activists.

Such golden calves were installed in the nation’s military academies, which have substantially changed during the past 25 years as increased numbers of civilian professors have been hired.  Indeed, Hasson “interviewed academy graduates of all ranks who raised serious concerns about the cultural changes imposed upon the academies from above” (#490), all of whom were alarmed by the incursions of political correctness in these schools.  Symptomatic of the problem is a letter written by Robert Heffington, a retired Army lieutenant colonel who had taught at West Point.  He said:  “I firmly believe West Point is a national treasure and that it can and should remain a vitally important source of well trained, highly educated Army officers and civilian leaders.  However, during my time on the West Point faculty . . . I personally witnessed a series of fundamental changes at West Point that have eroded it to the point where I question whether the institution should even remain open.”  He charged that “standards at West Point are nonexistent” and lamented “the academy’s failure to enforce the honor code and its lax enforcement of conduct and disciplinary standards.”  Changes in West Point’s curriculum particularly distressed Heffington:  “The plebe American History course has been revamped to focus solely on race and on the narrative that America is founded solely on a history of racial oppression.  Cadets derisively call it the ‘I Hate America Course.’  Simultaneously, the plebe International History course now focuses on gender to the exclusion of many other important themes.  On the other hand, an entire semester of military history was recently deleted from the curriculum . . . at West Point!” (#502).

Turning to the other academies, Hasson finds equally disturbing phenomena, even extending to concerns for “microaggressions”!  Training warriors by worrying about microaggressions seems at best counterproductive, but one finds “safe space” placards adorning office doors of both military and civilian professors at the United States Naval Academy.   “If the signs were stripped of identifying features, you would be hard pressed to distinguish them from those marking the offices of Yale gender studies professors” (#645). There’s even a “Safe Spaces Faculty Rep” entrusted with making sure no midshipman is offended by insensitive words.  At the Air Force Academy, a visiting psychology professor taught a course on “Interdisciplinary Perspectives on Men and Masculinity.”  The professor styles himself as a feminist and once wrote an article saying, “I challenge you to tell me one way in which the sexes are opposite” (#670).  The academy also deleted the phrase “so help me God” from the oath of enlistment in its cadet handbook as well as the cadet Honor Oath.

Illustrating the harm political correctness has done to the military is the “real” story of females graduating from the Army’s Ranger School—considered by many “the hardest combat course on the planet.”   “For the sixty-two days of the course, candidates train for up to twenty hours a day and subsist on little more than a thousand daily calories” (#1300).  Only a few would-be male Rangers actually make it.  But in 2016 the Army celebrated two women for completing the course.  Then a journalist, Susan Katz Keating, decided to investigate the story and found the women were granted special exemptions and treatment, receiving special “individualized training” and additional time “in the ‘pre-Ranger’ screening course despite failing critical tests.  And the instructors felt intense pressure to make sure the women passed, pressure that led to sharp departures from normal Ranger School standards” (#1213).  For example, whereas men were given only 48 hours to recover from stage one of the training before moving on, the women were given two to three months “to regain lost sleep, allow taxed muscles to recuperate, and otherwise recover physically” (#1355).   On-site instructors (speaking anonymously for fear of retaliation) universally commended the women’s efforts but lamented “how systematic political pressure forced changes to the legendary Ranger course, damaging its integrity, just as political pressure forced detrimental changes at every level of the military during the eight years of the Obama administration” (#1454).

In 2013 the Obama administration determined to allow women to serve in ground combat units.  Asked to study the issue, the various services prepared reports.  The Marines devised a meticulous study designed to record “injury rates, the speed at which the companies evacuated casualties on the ground, marksmanship scores, and dozens of other measurements.”  They assumed that if all-male units proved superior, the administration would preserve their traditions.  A combat veteran of Afghanistan, former Marine Captain Jude Eden, “summarized what they found:  ‘[A]ll-male units outperformed coed units in 69 percent of the 134 combat tasks. . . . If the figure had been even a mere five percent difference it would have been ample reason to maintain women’s exemption, since five percent is easily and frequently the difference between life and death in offensive ground combat.  But in fact the figure was 69 percent!’” (#1615).

University of Pittsburgh researchers “conducted a comparative analysis of all-male and mixed-sex infantry units’ performance in critical battle drills and corroborated the findings of other teams.  In a thorough analysis of the injury reports from each of the training exercises, the researchers also discovered that the injury rate for female Marines during weight-carrying exercises was more than twice that of their male counterparts” (#1675).  The issue was never whether or not women could fight and die but whether they could “walk up to fifteen miles a day, carry eighty pounds of equipment, and often sleep and tend to bodily functions in austere environments with little to no privacy?”  (#1933).  In fact, virtually none can!  But the Obama administration cared little for facts.  Senior military officers soon learned “that the administration had no interest in military readiness or lethality.  Instead, it was waging an ideologically driven campaign with an end goal of creating an equal number of male and female generals and the first female chair of the joint chiefs” (#1988).

All available evidence merely confirms common sense:  “there are real and substantial physiological differences between men and women” (#1629).  But who cares?  The Administration, imposing its agenda upon the Marine Corps, was determined to “crack the glass ceiling” by placing women in combat units, opening for them important opportunities for promotion.  Secretary of the Navy Ray Mabus was especially determined to sexually integrate combat units because he was pursuing an “ideologically driven quest for a ‘genderless’ Navy and Marine Corps.”  He directed senior naval commanders to “ensure [that job titles] are gender-integrated . . . removing ‘man’ from their titles.”  “Traditional naval and Marine Corps job titles such as ‘yeoman’ and ‘rifleman’—titles that date to the founding of our republic—apparently needed to be changed to reflect a ‘gender-integrated’ force” (#1875).  Though the job titles were not actually changed, Mabus did manage to redesign uniforms to better fit women, and his broader agenda was enacted, fully in accord with radical feminist dogma.

Concluding his case, Hasson insists the United States still has the finest military in the world, but its strength is eroding.   Political correctness is “hurting our ability to retain talented officers and enlisted troops who entered the military for all the right reasons but find they spend their days acting as bureaucrats implementing societal change” (#2727).  To rectify the problem the author thinks we must immediately reverse many of the Obama-era policies.  Doing so would enable us to follow the prescription of Revolutionary War hero “Light-Horse Harry” Lee, who “said he could not ‘withhold my denunciation of the wickedness and folly’ of a government that sent its soldiers ‘to the field uninformed and untaught.’  Such a government, he believed, was ‘the murderer of its citizens’” (#2816).

322 The Myth of the Dying Church

Several weeks ago I began the Sunday school class I teach during the summer by referencing a recent article in First Things entitled “Belief Limbo,” by Ronald Dworkin, lamenting the growing number of folks in America “who are unsure, uninterested, undecided, or just too busy for religion, and who live in ‘belief limbo.’”  Since his concerns regarding the decline of religion had been widely echoed throughout the religious media, I took his pessimism seriously, and we discussed how churches might better evangelize the nation.   Literally a few days later I read an article referencing a recent book that refutes many of these notions—Glenn T. Stanton’s The Myth of the Dying Church:  How Christianity is Actually Thriving in America and the World (New York:  Worthy Publishing, c. 2019)—so I acquired and rapidly read it.  Stanton is the director of Global Family Formation Studies at Focus on the Family, where he has worked since 1993, and he reminds us that Theodore Beza, John Calvin’s successor in Geneva, said:  “[L]et it be your pleasure to remember that the Church is an anvil which has worn out many a hammer.”

Stanton begins by acknowledging the influence of various “Chicken Littles” who have persuaded the public that Christianity is declining.  A headline in the Washington Post asserted:  “Christianity Faces Sharp Decline as Americans Are Becoming Even Less Affiliated with Religion.”  Similarly, Newsmax declared:  “Christianity Declines Sharply in US, Agnostics Growing:  Pew.”   An article posted on BeliefNet lamented:  “Declining Christianity:  The Exodus of the Young and the Rise of Atheism.”  National Public Radio, in a celebratory note, said:  “Christians in U.S. on Decline as Number of ‘Nones’ Grows, Survey Finds.”  And that depository of all things properly liberal, the New York Times, intoned:  “Big Drop in Share of Americans Calling Themselves Christian.”  And as if the secular doomsayers were not enough, trusted Christian sources often affirm the litany of woe.  One leading Christian author declared that “young people are leaving the church in droves,” reflected in the “staggering numbers” of those who say they no longer believe.  An advertisement in a Christian magazine said:  “This generation of teens is the largest in history— and current trends show that only 4 percent will be evangelical believers by the time they become adults.  Compare this with 34 percent of adults today who are evangelicals.  We are on the verge of a catastrophe.”  Then a parachurch organization declared:   “Up to 90 percent or more of Christian kids will leave the church by the time they reach adulthood” and a youth ministry publication warned: “86% of evangelical youth drop out of church after graduation, never to return.”  Unfortunately, many of the folks circulating bad news are in organizations selling books or programs designed to address the problem!  Apologetics is an important discipline, but its practitioners frequently overstate the threats the church faces in order to elicit support.

Given such a plethora of pessimism, many of us may well have despaired of the prospects for the church!   But Stanton urges us to reconsider:  “I have good news for you:  IT’S SIMPLY NOT TRUE!” (p. xx).  There’s certainly little good news for mainline Protestant churches, for they have sustained significant losses.  “Pew’s America’s Changing Religious Landscape states that between 2007 and 2014, mainline Protestant churches declined by 5 million adult members; taking into account margin of error, that number could be as high as 7.3 million lost members.  Regardless, the loss is massive.  But here is the part you didn’t hear.  Churches in Pew’s ‘evangelical’ category continued to grow in absolute numbers by about 2 million between 2007 and 2014” (p. 26).  Stanton finds this good news because he has delved into serious scholarly literature—in-depth analyses by renowned professors, scholarly articles in trustworthy journals, and data-packed studies “from leading mainstream organizations that track church growth and decline numbers” (p. 12).

The percentage of Protestants and Catholics who say their faith “is very important” to them has “increased two percentage points since 2007” (p. 37); they pray daily, join small groups for Bible study, and fully believe the Bible is God’s inspired Word.  Stressing the vitality of the faith in today’s America, Greg Smith, for example, “has long worked as the associate director of research for the Pew Research Center, one of the most trusted and respected institutions on this topic.  In an interview with Christianity Today a few years ago, Smith was asked by Dr. Ed Stetzer of Wheaton College if evangelicalism was dying.  He said simply, ‘Absolutely not,’ and went on to explain, ‘There’s nothing in these data to suggest that Christianity is dying.  That Evangelicalism is dying.  That Catholicism is dying.  That is not the case whatsoever’” (p. 13).  In fact, Evangelicalism is, “if anything, growing.”  Its growth is substantiated by an Indiana University/Harvard study finding that it increased from 18 percent of the population in 1972 to 28 percent in 2016.  Much of this growth is taking place in nondenominational, independent churches, many of them of the “megachurch” variety.

Turning to the oft-cited “nones,” Stanton says it’s just a new name for an irreligious or nominally Christian group which has always been part of American culture.  “Let me put it directly,” he says:  “The rise in these much talked about and fretted-over nones are not people leaving their faith or the church.  They are not a new kind of unbeliever.  They are not actually a new group at all.  These are folks who are simply being more honest and accurate in their description of where they have always been in terms of their belief and practice.  This is who the nones are.  Their rise is not because of some great secularizing upheaval in America’s faith beliefs and practices.  They are simply reporting their actual faith practices in more candid ways, largely due to new ways in which polling questions have been asked in the last ten years or so.”  Wheaton’s Ed Stetzer “has given one of the best clarifying explanations of this phenomenon that I’ve seen.  In USA Today, he wrote that ‘Christianity isn’t collapsing, it’s being clarified’” (pp. 53-54).

Still more:  rather than multitudes of young people rejecting their parents’ faith, “nearly 90 percent of kids coming from homes where they were taught a serious faith retain that faith into adulthood” (p. 56).   That collegians may for a time turn irreligious is an old, old story, for young people often demonstrate their independence by rejecting the faith of their fathers.  But, Rodney Stark says:  “‘That [young adults] haven’t defected from the church is obvious from the fact that a bit later in life, when they have married, especially after children arrive, they become more regular attenders. This happens every generation’” (p. 99).  Moreover, Ed Stetzer says the University of Chicago’s universally respected General Social Survey (GSS) reveals:  “‘If you look at young [evangelical] adults, eighteen to twenty-nine years old, we are at the highest reported levels since 1972 of regular church attendance among this group.  That’s a pretty big deal’” (p. 100).  And the reason these young people adhere to the faith is equally big:  parents!  Authentically devout parents enable their children to become devout adults.

Professor Christian Smith, one of the nation’s finest sociologists, has for years overseen the National Study of Youth and Religion (NSYR), and he asserts that “‘parents are huge—absolutely huge—nearly a necessary condition’ for a child to adopt a living and lasting faith.  He concludes, ‘Without question, the most important pastor a child will ever have in their life is a parent’” (p. 113).  In fact, “fully 85 percent of teens raised by parents who took their faith very seriously, and lived in a home with consistent faith practices, became young adults who not only had a serious faith, but had the highest levels of religious belief and practice among their peers!” (p. 114).  The data show that effective parents:  1) “take their faith very seriously and live it out in meaningful ways;” 2) establish warm relationships with their kids; 3) encourage them to pray regularly and do so themselves;  4) engage them in and exemplify Bible reading; 5) routinely attend church and take part in its various ministries; 6) celebrate “miracles in their own lives and the lives of others;” 7) encourage children to deal honestly with their doubts and difficulties; 8) stand alongside them when teachers or classmates ridicule or persecute them for their faith; 9) enlist “satellite adults” to model and help them in living out the “family’s faith and convictions” (pp. 133-134).

Looking beyond the United States, the state of Christianity around the world is even more encouraging, particularly in the “Global South”—Latin America, Africa, and Asia.  “In terms of sheer numbers, Christianity is flowering around the world and doing so soundly, even dominantly,” and probably will do so throughout this century.  “Specifically, the coming two decades will see the world’s population of Christians grow from today’s 2 billion to a remarkable 3 billion adherents, making Christianity the world’s largest faith for at least the next eighty years” (p. 74).  In stark contrast to Europe and the mainline churches in America, churches in the Global South are almost universally “strongly conservative in their theology, ecclesiology, and sexual teachings” (p. 80).

And, most importantly, what’s evident in this worldwide church growth is the power and presence of the Holy Spirit, for He has empowered “Christ’s church across time and throughout the nations.  He is unstoppable, unquenchable, and inherently life-giving.  He is not nodding off, sickly, or on vacation.  The work of His heart and very character will not be thwarted.  He is God.  To believe the church is dying is to deny these truths and judge God either confused or a liar” (p. 191).  As was evident at Pentecost, “God’s Word will not return void.  What the Bible says of the church on its first day will also be true of these churches today:  ‘And the Lord added to their number day by day those who were being saved’ (Acts 2:47).  Church, be of good cheer.  God is true.  Aslan is on the move.  Chicken Little is mistaken.  God’s future is bright.  It cannot be otherwise” (p. 193).

* * * * * * * * * * * * * * * * * * * * * * * * * *

For many years Rodney Stark, a professor at Baylor University, has been trying to correct some pernicious errors regarding Church history.  In Bearing False Witness:  Debunking Centuries of Anti-Catholic History (West Conshohocken, PA:  Templeton Press, Kindle Edition, c. 2016), he sought to rectify the record—not to defend the Catholic Church (since he is a Protestant) but to defend history.  To do this he first addresses various “distinguished bigots” (such as Edward Gibbon) posing as scholars who have maliciously slandered Catholics.  “It all began with the European wars stemming from the Reformation that pitted Protestants versus Catholics and took millions of lives, during which Spain emerged as the major Catholic power.  In response, Britain and Holland fostered intense propaganda campaigns that depicted the Spanish as bloodthirsty and fanatical barbarians.  The distinguished medieval historian Jeffrey Burton Russell explained, ‘Innumerable books and pamphlets poured from northern presses accusing the Spanish Empire of inhuman depravity and horrible atrocities…. Spain was cast as a place of darkness, ignorance, and evil.’  Informed modern scholars not only reject this malicious image, they even have given it a name: the ‘Black Legend.’ Nevertheless, this impression of Spain and of Spanish Catholics remains very much alive in our culture—mere mention of the ‘Spanish Inquisition’ evokes disgust and outrage” (#68).

Since much of the “Black Legend” is patently untrue, other allegations regarding the ignorance and crimes of Roman Catholics likewise need to be debunked.  This includes rightly portraying the Spanish Inquisition, long a whipping boy for cynical critics.  For years Stark had believed the Inquisition illustrated the depravity of the Catholic Church, so “when I first encountered the claim that not only did the Spanish Inquisition spill very little blood but that it mainly was a major force in support of moderation and justice, I dismissed it as another exercise in outlandish, attention-seeking revisionism.  Upon further investigation, I was stunned to discover that in fact, among other things, it was the Inquisition that prevented the murderous witchcraft craze, which flourished in most of Europe during the sixteenth and seventeenth centuries, from spreading to Spain and Italy. Instead of burning witches, the inquisitors sent a few people to be hanged because they had burned witches” (#128).

Without question the “Spanish Inquisition” is routinely included in anti-Catholic polemics, generally written by zealous Protestants or cynical secularists.  Best-selling books by historians such as Will Durant easily fueled prejudices by declaring that “‘we must rank the Inquisition … as among the darkest blots on the record of mankind, revealing a ferocity unknown in any beast’” (p. 110).  Shocking stories about Torquemada’s brutality, with estimates of victims killed ranging from hundreds of thousands to millions (including 300,000 burned at the stake), contributed much to the “Black Legend” so beloved by many.  However, Stark says:  “The standard account of the Spanish Inquisition is mostly a pack of lies, invented and spread by English and Dutch propagandists in the sixteenth century during their wars with Spain and repeated ever after by the malicious or misled historians eager to sustain ‘an image of Spain as a nation of fanatical bigots’” (p. 111).  Contemporary scholars, scouring Spanish archives, have actually read the “records made of each of the 44,674 cases heard by these two Inquisitions between 1540 and 1700” as well as diaries and letters written in those years.  During the first 50 years, perhaps 1500 people may have been executed, though the records are sparse.  But during “the fully recorded period, of the 44,674 cases, only 826 people were executed, which amounts to 1.8 percent of those brought to trial.  All told, then, during the entire period 1480 through 1700, only about ten deaths per year were meted out by the Inquisition all across Spain, a small fraction of the many thousands of Lutherans, Lollards, and Catholics (in addition to two of his wives) that Henry VIII is credited with having boiled, burned, beheaded, or hanged” (p. 114).

Dealing with the “Sins of Anti-Semitism,” Stark provides an important historical context, showing how Jews have frequently suffered in various historical epochs.  Long before Christianity flourished, influential Romans such as Cicero, Seneca, and Tacitus manifested anti-Semitism.  In fact:  “The Jews were expelled from Rome in 139 BCE by an edict that charged them with attempting ‘to introduce their own rites’ to the Romans and thereby ‘to infect Roman morals’” (p. 4).  Then, in 70 A.D., the Romans brutally suppressed a Jewish rebellion, destroyed the Temple, and inaugurated a massive Jewish diaspora throughout the Empire.  As the Early Church developed, Jews often played a major role in denouncing and persecuting it.  Thus we find, in both the NT and subsequent Christian writings, many anti-Jewish statements.  But as Christianity triumphed there was relatively little persecution of Jews.  Throughout the Early Middle Ages they enjoyed considerable toleration within Christian communities, but things changed rather dramatically in the 11th century when the Islamic threat precipitated attacks on Jews.

Unfortunately, in the 11th century many Christians became almost morbidly concerned with heresies of various sorts, and Jews often suffered alongside the heretics.  “Unlike Christian heretics such as the Cathars, Waldensians, Fraticelli, and similar groups,” however, “the Jews were the only sizeable, openly nonconformist religious group that survived in Europe until the Lutherans did so by force of arms” (p. 19).  Indeed, “no pope in the Middle Ages ever undertook a campaign to convert the Jews,” and the distinguished historian Steven T. Katz wrote:  “Though Christendom possessed the power, over the course of nearly fifteen hundred years, to destroy that segment of the Jewish people it dominated, it chose not to do so … because the physical extirpation of Jewry was never, at any time, the official policy of any church or of any Christian state” (p. 19).

One of the widespread myths was popularized by Edward Gibbon when he declared Christianity prevailed in the Roman Empire because emperors and prelates ruthlessly imposed the Faith by persecuting pagans.  Consequently, as Peter Brown said:  “‘From Gibbon and Burckhardt to the present day, it has been assumed that the end of paganism was inevitable, once confronted by the resolute intolerance of Christianity; that the interventions of the Christian emperors in its suppression were decisive.’  But it isn’t true.  As Brown continued, large, active pagan communities “continued to enjoy, for many generations, [a] relatively peaceable … existence.”  All that really happened is that they “slipped out of history” (p. 46).  Solid historical work now shows pagans peacefully coexisted with Christians following Constantine’s Edict of Toleration.  Indeed we read, in the Code of Justinian:  “‘We especially command those persons who are truly Christians, or who are said to be so, that they should not abuse the authority of religion and dare to lay violent hands on Jews and pagans, who are living quietly and attempting nothing disorderly or contrary to law’” (p. 47).  As one of the finest contemporary historians, Ramsay MacMullen, emeritus professor of history at Yale University, cited by the American Historical Association as “the greatest historian of the Roman Empire alive today,” put it:  “‘The triumph of the church was not one of obliteration but of widening embrace and assimilation’” (p. 61).

Intermingled with misinformation regarding the Inquisition are charges of multitudes of witches being burned.  Feminist “historians” have been particularly aggressive in making such accusations, part and parcel of their assault on evil patriarchs!  “Perhaps no historical statistics have been so outrageously inflated as the numbers executed as witches during the craze that took place in Europe from about 1450 to 1700.”  It is sometimes alleged that some nine million witches were burned, often at the hands of Catholic Inquisitors.  But it’s all “vicious nonsense,” for solid scholarship now shows that perhaps 60,000 witches were actually executed, most of them in Protestant rather than Catholic countries.  Indeed, Henry C. Lea (no friend of Catholics) “agreed that witch-hunting was ‘rendered comparatively harmless’ in Spain and that this ‘was due to the wisdom and firmness of the Inquisition’” (p. 116). 

As is evident in the New York Times’ recent determination to date America’s founding to 1619, when slaves first landed in Virginia, slavery provides formidable fodder with which to attack one’s cultural foes.  So too, various historians have asserted that the Catholic Church legitimated and supported slavery.  But in fact slavery had slowly disappeared in the Early Middle Ages as Christianity extended its influence.  Furthermore, the Church’s greatest theologian, Thomas Aquinas, said slavery is a sin, and his position “has guided papal policy ever since” (p. 162).  Thus Pope Paul III declared that American Indians “and all other peoples—even though they be outside the faith—… should not be deprived of their liberty or their other possessions … and are not to be reduced to slavery, and that whatever happens to the contrary is to be considered null and void” (p. 164).  Unfortunately, other popes occasionally departed from this policy, and it had little influence in Spanish and Portuguese colonies, where monarchs defied the popes and slavery flourished for centuries.  “The problem wasn’t that the Church failed to condemn slavery; it was that few heard it and most did not listen” (p. 165).  Stark carefully examines the French Code Noir and Spain’s Código Negro Español, showing how historians have selectively quoted the documents to disparage the Catholic Church, when in fact the codes set forth much more humane practices than could be found in Protestant colonies.  In America, these two codes helped shape the treatment of slaves in Louisiana when France (and briefly Spain) controlled the colony, so in 1830 “a far higher percentage of blacks in Louisiana were free (13.2 percent) than in any other slave state” (p. 171).  In fact, in “New Orleans, 41.7 [percent] of the blacks were free in 1830,” whereas in Charleston, South Carolina, only 6.4 percent were free (p. 172). 

Stark’s approach to various “myths” is evident in his chapter titles, which set forth the errors he endeavors to expose:  1. Sins of Anti-Semitism; 2. The Suppressed Gospels; 3. Persecuting the Tolerant Pagans; 4. Imposing the Dark Ages; 5. Crusading for Land, Loot, and Converts; 6. Monsters of the Inquisition; 7. Scientific Heresies; 8. Blessed Be Slavery; 9. Holy Authoritarianism; 10. Protestant Modernity. 

What Stark helps us do is read history more carefully, remaining especially vigilant whenever historians deal with Catholicism—or Christianity for that matter.  Just as Jesus warned, our enemies will “utter all kinds of evil against you falsely” (Mt 5:11).  That such is done is eminently evident in history books!

321 Readable Histories

Beginning with the “father of history,” Herodotus, most historians crafted interesting stories designed to appeal to the reading public.  Then, in the 19th century, German historians determined to make their craft more scientific and fact-focused, writing increasingly for others in the profession who were compiling (in the manner of positivistic scientists) a tapestry of information.  Diligently scouring archives and seeking “objectivity” was certainly admirable and useful, but what was too often lost was the literary skill needed to interest general readers.  Fortunately, there are still many histories written (often by journalists as well as scholars with literary skills) that deserve being considered “readable” histories.  One was given to me by a good friend, Dr. Dean Nelson, who as a journalism professor appreciates effective writing and is a friend of one of its authors (Lynn Vincent).  In Indianapolis:  The True Story of the Worst Sea Disaster in U.S. Naval History and the Fifty-Year Fight to Exonerate an Innocent Man (New York:  Simon & Schuster, c. 2018), Vincent and Sara Vladic tell the story of a flagship of the World War II Pacific fleet, basing their presentation on extensive interviews as well as library research.  When she was sunk, literally in the final days of the war, it “was the greatest sea disaster in the history of the American Navy” (p. 2).

Rather than write a strictly chronological account, the authors weave together technical details regarding the USS Indianapolis, personal anecdotes regarding her officers and crew, explanations regarding naval strategy and policy, insights from Japanese sources, and political perspectives regarding America’s efforts in WWII.  They thus provide highly detailed, accurate information in an engrossing manner.  For example, they describe the 610-foot craft herself—construction materials, physical appearance, guns, 250-ton turrets, bulkheads, armor, etc.  They also provide pictures.  Christened in 1932, “Indy was grand but svelte.  Franklin Delano Roosevelt made her his ship of state and invited world leaders and royalty to dance under the stars on her polished teak decks” (p. 1).  One of the Navy’s 18 “Treaty Cruisers,” the Indianapolis was built to meet “treaty displacement limitations that produced thinly armored vessels shipbuilders referred to as ‘tin clads.’”  The men who manned them, however, “often fell in love with their speed and grace,” and one of the Indy’s sailors, 19-year-old Seaman Second Class L.C. Cox, simply “stood and gawked” when he saw the ship.  “She was colossal.  Sleek.  Magnificent.  He could hardly wait to get aboard” (p. 27). 

If the authors can make the details of a warship interesting, it is no surprise that they are even more winsome when describing the men who manned her.  They provide vignettes of admirals and captains, cooks and gunnery sergeants, Japanese submariners and naval officers.  They not only portray the men but tell about their girlfriends and wives, their local backgrounds and personal proclivities.  A central figure in the story, Captain Charles McVay III, was the son of a Navy officer who’d fought in the Spanish-American War and WWI, wherein he commanded two battleships.  Unconcerned with his son’s self-esteem, he routinely unleashed “a steady stream of sharp-tongued verdicts on the younger McVay’s Navy performance and demanded that he cover himself with glory befitting an admiral’s son” (p. 28).  The younger man, like his father, graduated from the Naval Academy, was commissioned in 1920, and rapidly rose through the ranks.  During WWII, he received a Silver Star for gallantry for his action in the Solomon Islands.  He was sensitive to the needs of his men and worked hard to maintain morale.  

During her activities in the Pacific Theater of WWII, the Indianapolis took part in many battles, including the Battle of Okinawa, where the Japanese fought tenaciously and sent hundreds of suicide bombers to attack the American fleet.  The conflict was costly:  “36 ships sunk, 368 damaged, 763 aircraft lost, more than 12,000 soldiers, sailors, and Marines dead, drowned, or missing” (p. 72).  One of the ships struck was the Indianapolis, leaving her with “damage too serious for repairs at sea” (p. 35).  So she limped back to San Francisco to undergo repairs, and her captain, Charles McVay, assumed she’d see no more combat since the war seemed to be quickly approaching its end.  But as soon as the repairs were done McVay was called to a highly secret meeting and ordered to take an important shipment across the Pacific.

Though Japanese forces were retreating, it had become evident to Admiral Nimitz and President Truman that any invasion of the home islands might incur ghastly casualties.  Serious discussions at the very highest levels ensued, and it was debated whether or not to use the recently tested atomic bomb to force Japan to surrender.  Using an airplane to transport the bomb across the Pacific was considered imprudent, so it was decided to use Indy for the task.  “The contents of the shipment were not to be revealed to anyone aboard Indianapolis, even McVay” (p. 68).  On July 16, 1945, her voyage began.  The cruiser was built for speed, and McVay ordered her to move fast, since he’d “been told that every day we take off the trip is a day off the war” (p. 95).  Ten days later Indy anchored at Tinian Island and unloaded her secret cargo.  It would then be loaded aboard the Enola Gay, which would on August 6 make history by dropping an atomic bomb on Hiroshima. 

Having completed her mission, the Indianapolis sailed from Tinian to Guam, preparing to sail on to Leyte, in the Philippines.  No naval officers thought this journey would be particularly risky since it would be on the periphery of the combat zone.  As Commodore “Jimmy” Carter said, “The Japs are on their last legs and there’s nothing to worry about” (p. 116).  Some intelligence reports indicated a handful of enemy submarines were prowling about in the area, but they were thought to be some distance from the Indy’s projected route.  The ship sailed on July 28 and planned to arrive in Leyte on July 31.  Following established procedures, the ship zigzagged during daylight but resumed base course when it became completely dark.  Then, just before midnight, July 30, a Japanese submarine commanded by Mochitsura Hashimoto launched six torpedoes toward the Indianapolis.  Built for speed, Indy had rather thin armor plates, and some authorities had speculated that even one torpedo could sink her.  The Japanese torpedo “carried a huge explosive payload designed to mortally wound battleships and cruisers” (p. 151).  Two of them struck the Indianapolis, and she rapidly began sinking, disappearing in just 12 minutes. 

Facing the inevitable, Captain McVay ordered the men to abandon ship.  Many had died in the explosions, of course, but some 800 managed to escape and survive the initial disaster.  Yet their trials had only begun!  Clinging to life rafts and debris, they assumed rescue forces (planes and ships) would soon arrive to save them.  Indeed one of the great questions the authors raise is this:  “How was it possible that no one had known Indianapolis was missing?” (p. 262).  But day after day the survivors scoured the horizon and saw no one coming to their assistance.  The merciless sun seared their bodies and there was little food or water to sustain them.  Then came the sharks!  Since the authors interviewed many of those who survived, their description of these days renders the men’s suffering palpable.  Many of them had been injured by the torpedo blasts and quickly expired.  Day by day, hundreds of them disappeared.  The men prayed fervently.  Many of them behaved courageously.  And, of course, some behaved abominably.  After three days in the water they were spotted by an airplane and help began to arrive, with Captain McVay and the last survivors being rescued on August 3.  They were “emaciated and shark-bitten.  Some had lost as much as forty pounds.  Their skin looked like burned bacon and was pocked with oozing sores.  Many were delirious” (p. 273).  In all, of the 1200 crewmen manning Indy, only 311 survived (a handful only for a brief time).  Sadly enough, hundreds more would have survived if rescue efforts had come more quickly. 

Almost as soon as the tragedy transpired, it was necessary to blame someone!  Trying to escape personal accountability, high-ranking Navy officers (preeminently Fleet Admiral King) sought to blame Captain McVay and ordered him brought to trial for the disaster!  Amazingly, he was the only captain of the hundreds of American ships sunk in the war to be brought to trial.  But he was, at the end of 1945, court-martialed, and his naval career effectively ended in an insignificant posting in New Orleans.  At the time—and increasingly as the years passed—many observers thought McVay was punished to protect some of his superiors who were actually responsible.  Thus the authors devote a significant section of the book to describing “the Fifty-Year Fight to Exonerate an Innocent Man,” for “to a man, the survivors believed McVay was innocent” (p. 292).  Meticulous researchers and courageous witnesses were able, in time, to provide irrefutable evidence that McVay had been railroaded and was in fact innocent of the charges leveled against him.  But it would not be until George W. Bush was President that McVay was finally exonerated, so in the end his reputation (if not his career) was redeemed. 

* * * * * * * * * * * * * * * * * * * * *

Now and then I read a book I wish I could have written.  This is particularly true of John Sedgwick’s Blood Moon: An American Epic of War and Splendor in the Cherokee Nation (New York:  Simon & Schuster, c. 2018; Kindle Edition), since it deals with the same material I detailed in my 1976 PhD dissertation, entitled Brother Brother Slew:  Factionalism in the Cherokee Nation, 1835-1865.  (In fact, Sedgwick references my dissertation several times in his footnotes, demonstrating how exhaustively he researched the subject!)  The factionalism I discussed is portrayed by Sedgwick as a conflict between two men who personified it:  “The Ridge—short for He Who Walks on Mountaintops—was a big, imposing, copper-skinned Cherokee, a fearsome warrior turned plantation owner, whose voice quieted any room, and whose physique awed anyone who crossed his path.  Smaller, almost twenty years younger, [John] Ross was descended from Scottish traders and looked like one:  a pale, unimposing half-pint who wore eastern clothes, from laced shoes to a top hat.  If The Ridge radiated the power of a Cherokee who could drop a buck at a hundred paces, Ross could have strolled into an Edinburgh dinner party without receiving undue attention.  Tellingly, The Ridge spoke almost no English, and Ross almost no Cherokee” (p. 3).  Ross and Ridge were one-time friends and allies who fell out under the pressure for removal applied by President Andrew Jackson in the 1830s.  During those years there erupted a “blood feud” which morphed “from personal vendetta to clan war to a civil war that swept through the entire Cherokee Nation before it got caught up in the even greater cataclysm of the American War Between the States” (p. 4).

Sedgwick devotes the first section of his book (“Paradise Lost”) to the history of the Cherokees from 1770 to 1814.  (Invoking the word “paradise” to describe that world reveals the author’s rather romantic approach to the natives, inasmuch as my reading of the primary sources certainly unveils a great deal of violence, blood feuds, revenge killings, superstition, insecurity, factionalism, etc. that made the Cherokees something less than Edenic peoples!  Anyone wanting a more realistic depiction of Native Americans would do well to read the multi-volume Jesuit Relations, or Kevin Siepel’s Conquistador Voices, or Bernard Bailyn’s The Barbarous Years.)  During these years the Cherokees watched their territory shrink as a result of conflicts with first the English and then the Americans.  Military conflicts and treaties and trade brought the Indians and Anglo-Americans together in disparate ways, leading to the emergence of a unique Cherokee embrace of the white man’s “civilization.” 

Embodying this transition was The Ridge, formerly known as a great warrior and hunter, who declared:  “‘The hunting is almost done & we must now live by farming, raising corn & cotton & horses & hogs & sheep.  We see that those Cherokees who do this live well” (p. 69).  He and his wife discarded “the habits of their race” and took up “Christian employments.”  Intrinsically industrious, The Ridge soon built a fine house and oversaw a thriving plantation.  He supported the political organization of the tribe, beginning in 1808 with a tribal council passing its first “law.”  He sent his children to a nearby Moravian school, insisting they become literate and ready to prosper in the emerging Cherokee Nation.  Initially uninterested in the missionaries’ Gospel message, he was in time drawn to “the fall and salvation of man” story they shared.  When the great Tecumseh (a Shawnee from Indiana) came visiting in 1812, he implored the Cherokees to join in his conspiracy, averring “that the Great Spirit was furious to see the Cherokee with the whites’ gristmills, cotton clothes, liquor, featherbeds, and house cats” (p. 93).  But the Ridge and most Cherokees declined to join him. 

In fact, when the War of 1812 broke out and Andrew Jackson launched an expedition to punish the “Red Stick” Creeks in 1813, The Ridge and many Cherokees joined him, enthusiastically killing and scalping their ancient foes at the Battle of Horseshoe Bend.  For his warrior skills and courage, The Ridge was made a major—thenceforth to be called, much to his pleasure, “Major Ridge.”  In a critical phase of the battle, he killed six knife-wielding Red Stick warriors, and Jackson’s ultimate success in the battle at New Orleans was facilitated by his Cherokee allies.  Sadly enough, while the Cherokees were helping Jackson, the Tennessee militia had charged through their lands “like an avenging army, stealing horses, slaughtering hogs and cattle, destroying corncribs, tearing down fences, seizing private stores of corn, maple sugar, and clothing and what few possessions the Cherokee could call their own” (p. 115).  Helping Jackson would not, in the long run, help them!  In fact, when they asked him for compensation for these “spoliations,” he confiscated some of the Cherokee lands!  As Major Ridge’s son John began to realize, “Old Hickory” was also a “snake in the grass.”

To establish themselves in the face of the advancing frontier—some 14,000 Indians confronting hundreds of thousands of Anglo-Americans (mainly Georgians) forging westward—the Cherokees rapidly formed a government following the pattern of the U.S. Constitution, the first to be crafted by any Indian tribe.  They launched a national newspaper, The Cherokee Phoenix, thanks to the phenomenal work of Sequoyah, an illiterate genius who single-handedly designed a Cherokee syllabary that enabled adult Cherokees to become literate in a few days.  In fact, “it was so easy to learn that schools didn’t bother to teach it, since children could pick it up on their own.  Remember the eighty-six symbols, sound them out, and you had it.  More than a Gutenberg, Sequoyah was a Leonardo, an inventor who created not just an invention, but modernity.  It is hard to find in all of recorded history as dramatic a transformation of a people in such a brief period of time.  It unleashed an outpouring of notes, letters, essays, records, reports, newspapers, Bible translations, books” (p. 146).  Using the weapons of the press and petitions and lawsuits and delegations to Washington, the Cherokees (led by Principal Chief John Ross) endeavored to deal with the white man on his own terms, vowing:  “Not one foot of land in cession.”  And Ross was fully supported by Major Ridge, elected as “first counselor to the principal chief, a post that made him, after Ross, the second most powerful man in the nation” (p. 170).

But they faced an implacable foe in Andrew Jackson, who was elected President in 1828.  One of his first concerns, following his inauguration, was passing and implementing the Indian Removal Act.  All Indians east of the Mississippi were to be driven from their homes and resettled in the West.  In resistance, the Cherokees won important legal victories (most notably in the Supreme Court in Worcester v. Georgia) and found much support throughout the United States, especially in religious sectors.  Jackson, however, cared little for courts or public opinion.  “Incredible as it seemed to Ridge and Boudinot, Jackson had indeed decided that this epic Supreme Court ruling was merely John Marshall’s opinion, nothing more. ‘John Marshall has made his decision,’ Jackson was said to have declared, and rather idly.  ‘Let him enforce it’” (p. 193).  Seeing the writing on the wall, some Cherokees decided it would be wiser to remove to the West (where there was already a settlement of Western Cherokees) on their own rather than wait for the U.S. to force them.  Thus Major Ridge and his extended family, supported by a largely mixed-blood faction, formed what would be known as the Treaty Party—Cherokees willing to negotiate the best removal treaty possible.  They signed a treaty offered them by Jackson’s envoy (Rev. John F. Schermerhorn, a former missionary), taking $4.5 million for their eastern lands and getting lands in Indian Territory.  Though John Ross and the national assembly staunchly rejected the spurious “treaty,” President Jackson claimed it was legitimate and submitted it to the U.S. Senate for approval.  There Jackson “prevailed by just one vote beyond the two-thirds needed.  So, on May 23, 1836, the New Echota Treaty became the law of the United States—and this was one law that Andrew Jackson had every intention of enforcing” (p. 248). 

The Treaty Party (numbering about 1000, including many slaves), led by Major Ridge, his son John, Elias Boudinot (the editor of The Cherokee Phoenix), and his brother Stand Watie, removed on their own.  In what is today northeastern Oklahoma, they settled amongst the “Old Settlers”—Cherokees (including Sequoyah) who had on their own migrated west in the previous two decades.  “‘It is superior to any country I ever saw in the U.S.,’ John Ridge declared after he’d had a chance to ride about the territory.  ‘In a few years it will be the garden spot of the United States’” (p. 269).  But when the U.S. Army rounded up and drove west the bulk of the tribe, they were understandably bitter and disillusioned, blaming both the United States and the Treaty Party.  “Of the 15,000 Cherokee who undertook the journey that became universally known as the Trail of Tears, roughly 2,000 died, and countless more simply disappeared en route.  Two thousand more died after they arrived from disease, starvation, and the misery that comes with such suffering” (p. 188).

Their misery prompted thoughts of revenge, and the tribe’s “blood law” justified them.  “No one was to sell Cherokee land without official permission.  No one.  John Ridge had written this law himself, at his father’s instigation.  The fury had been smoldering for some time, possibly from the moment the ink was dry on the page just after Christmas 1835, now almost three and a half long years before.  Did they not know that the land was not theirs to sell?” (p. 297).  Thus a well-orchestrated plot targeted the leaders of the Treaty Party, and assassins killed Major Ridge, John Ridge, and Elias Boudinot.  The nation descended into bitter factional strife, punctuated by clandestine killings and political turbulence, sometimes rendered dormant by peaceful interludes supported by all sides.  “With each side cloaked in righteousness, the killing went on and on.  Murder became so common, said one Cherokee, that it was like hearing ‘of the death of a common dog.’  From the end of 1845 to the end of 1846 [for example], thirty-four killings were recorded, nearly all of them political” (p. 338).

Nevertheless, when the American Civil War broke out, the old Treaty Party folks supported the Confederacy and followed Stand Watie, who was commissioned a colonel, leading a corps known as the Cherokee Mounted Rifles.  Ultimately Watie became a Brigadier General in the Confederate Army (the highest rank attained by any Indian during the Civil War), fighting a series of battles in Arkansas and Indian Territory.  “Watie, fighting to the end, was the last Indian commander” to lay down arms “just north of the Texas border.  It was there that Watie officially surrendered, the very last Confederate officer to give up the fight” (p. 392).  The Ross Party, led by John Ross, joined the Union as soon as possible, and many of his followers took up arms and battled for the North in the Cherokee Nation.  “In the Cherokee Nation, the ravages of the American Civil War had been compounded by the internal equivalent. Six thousand Cherokee, a quarter of the population, had died in the battles that occurred in every corner of the nation, or from the terrible starvation and rampant disease that followed them.  It turned 7,000 more out of their homes to roam the landscape in search of sustenance and shelter.  It widowed a third of all Cherokee wives, orphaned a quarter of the children, killed or scattered 300,000 head of cattle, and drove virtually everyone to depend on the federal government dispensing scant aid from the major forts, chiefly Fort Gibson” (p. 394). 

In short:  a Blood Moon shone on the tribe for 30 years wherein “brother brother slew.”

320 New Conversations on Faith and Science

Two months ago I attended a conference hosted by Faith Bible Church, a large church in The Woodlands, Texas, titled “Reasons 2019:  New Conversations on Faith and Science.”  Four speakers were featured, so I read books by each of them before joining others to hear their presentations.  The first presenter was Michael J. Behe, a professor of biochemistry at Lehigh University and one of the leading thinkers advocating the superiority of “Intelligent Design” over “Natural Selection” as the key to understanding the living world.  More than two decades ago Behe published Darwin’s Black Box:  The Biochemical Challenge to Evolution, arguing “that life was designed by an intelligent agent,” a conclusion he found fully evident in his own study of biochemistry, which reveals the intricacies of molecular life, the actual basis of life on planet earth.

A decade later he developed his position in The Edge of Evolution, wherein he noted that current orthodoxy in the scientific community defends a Darwinism composed of “random mutation, natural selection, and common descent.”  Of the three, random mutation is most crucial for understanding the emergence of novel life forms, but “except at life’s periphery, the evidence for a pivotal role for random mutations is terrible.”  In fact, we need the kind of precise, empirical data evident in engineering and anatomy.  For this we must plumb the mysterious realms of tiny molecules, proteins, and DNA.  To do so Behe focused on malaria—“the single best test case of Darwin’s theory.”  Because of its widespread devastation, malaria has been carefully studied for a century, and we can see, in 100 years of malaria parasites’ development, what has taken 100 million years in other species.  Amazingly, “the number of malarial parasites produced in a single year is likely a hundred times greater than the number of all the mammals that have ever lived on earth in the past two hundred million years.”  And though mutations have occurred, rendering us less susceptible to the disease, only minor molecular changes distinguish the parasites.  The Darwinian theory simply cannot explain one of the best-documented stories in biology.

Behe has recently published Darwin Devolves: The New Science about DNA that Challenges Evolution (New York: HarperOne, Kindle Edition, c. 2019).  He begins by reflecting on the philosophical questions he began asking as a boy—Where did we come from?  Why are we here?  In a simple but deeply profound way there are only two possible worldviews addressing such questions, for “the enigma of where nature came from goes back as far as there are written historical records and, with a few lulls, has continued strongly up to the present.”  And despite many variations, “all particular positions on the topic can be considered to be elaborations on either of just two general mutually exclusive views:  (1) contemporary nature, including people, is an accident; and (2) contemporary nature, especially people, is largely intended—the product of a preexisting reasoning mind” (p. 1).  Though the two positions were debated in the Greco-Roman world, the “epitome of science” in antiquity “was arguably the work of the second-century Roman physician Galen, who had a very definite point of view on the origin of nature.  In his book On the Usefulness of the Parts of the Body, . . . Galen concluded that the human body is the result of a ‘supremely intelligent and powerful divine Craftsman,’ that is, the result of intelligent design” (p. 3).  That ancient insight, Behe holds, has been confirmed by the latest scientific understandings of DNA, and he has written this book “to give readers the scientific and other information needed to confidently conclude for themselves that life was purposely designed” (p. 20).

He begins illustrating his case by discussing polar bears, which have evolved, Darwinists claim, from brown bears and are uniquely adapted to the stark polar landscape.  “Yet,” he says, “a pivotal question has lingered over the past century and a half: How exactly did that happen?” (p. 16).  Just recently new research techniques have revealed the polar bear’s genetic heritage, and the “results have turned the idea of evolution topsy-turvy” (p. 16).  Scrupulous studies have shown that the genetic mutations differentiating polar bears from nearby relatives were “likely to be damaging—that is, likely to degrade or destroy the function of the protein that the gene codes for” (p. 17).  “It seems, then, that the magnificent Ursus maritimus has adjusted to its harsh environment mainly by degrading genes that its ancestors already possessed.  Despite its impressive abilities, rather than evolving it has adapted predominantly by devolving.  What that portends for our conception of evolution is the principal topic of this book” (p. 17).

Only recently have scientists been able to study life on a molecular level, where “it turns out that, as with the polar bear, Darwinian evolution proceeds mainly by damaging or breaking genes, which, counterintuitively, sometimes helps survival.  In other words, the mechanism is powerfully devolutionary.  It promotes the rapid loss of genetic information.  Laboratory experiments, field research, and theoretical studies all forcefully indicate that, as a result, random mutation and natural selection make evolution self-limiting.  That is, the very same factors that promote diversity at the simplest levels of biology actively prevent it at more complex ones.  Darwin’s mechanism works chiefly by squandering genetic information for short-term gain” (p. 38).  Rather than developing new, more vibrant life-forms, natural selection degrades those that already exist.

To Behe, this points out a fatal flaw in the Darwinian dogma.  In fact, Darwin never showed how “purposeful systems could be built by natural selection acting on random variation.  Rather, he just proposed that they might.  His theory had yet to be tested at the profound depths of life.  In fact, no one then even realized life had such depths” (p. 155).  But now we know something about such depths!  And the more we ponder the mysterious inner workings of molecular life the more we’re prompted to discern a Mind at work informing it.

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Following Behe’s evening presentation, the next morning’s session featured a Brazilian biochemist, Marcos Eberlin, who had just published Foresight: How the Chemistry of Life Reveals Planning and Purpose (Seattle: Discovery Institute, c. 2019).  Eberlin is an internationally acclaimed scientist, and his treatise has garnered “endorsements” from an impressive variety of world-class scientists.  Thus Sir John B. Gurdon, who won the Nobel Prize in Physiology or Medicine in 2012, recommends Foresight to anyone “interested in the chemistry of life.  The author is well established in the field of chemistry and presents the current interest in biology in the context of chemistry.  I am happy to recommend the work.”  One of his Brazilian colleagues, Rodinei Augusti, says the “book demonstrates that the currently available scientific knowledge increasingly points to the existence of a supreme being who carefully planned the universe and life.  This breakthrough will revolutionize science in the years to come.”

Listening to Eberlin speak, I was both pleased and amazed at his enthusiasm for his work.  As he described some of the intricate, complex processes evident in the natural world, he had an almost childlike joy in showing just how wonderful it all is.  On a cosmic level, it’s amazing that earth, a tiny planet amidst two trillion galaxies, each containing some 100 billion stars, is perfectly placed to nourish life.  Our sun is perfectly sized and exudes just the right amount of energy for earthlings.  Our atmosphere perfectly protects us from harmful radiation, allowing just the right amount of sunlight to reach the earth and promote life.  The earth’s magnetic shield perfectly protects us from solar winds.  The moon stabilizes the earth’s rotation, preserving the regular seasons so needed for life to flourish.  Water itself is a most amazing substance, possessing some 74 unique properties.  As Eberlin touched on these topics he obviously rejoiced to be alive and well—and able to understand a bit—on this wonderful planet. 

Much of Eberlin’s speech emphasized how much scientists have learned in the past decade!  Current technology enables them to probe both the vastness of the universe and the intricacies of the cell in novel and illuminating ways.  In Foresight, he begins by saying, “as plainly as I can:  This rush of discovery seems to point beyond any purely blind evolutionary process to the workings of an attribute unique to minds—foresight” (#118).  Such is evident, for example, in cell membranes, which must protect the cell from external threats, allow nutrients to enter, and expel waste.  “Selective channels through these early cell membranes had to be in place right from the start.  Cells today come with just such doorways, specialized protein channels used in transporting many key biomolecules and ions.  How was this selective transport of both neutral molecules and charged ions engineered?  Evolutionary theory appeals to a gradual, step-by-step process of small mutations sifted by natural selection, what is colloquially referred to as survival of the fittest.  But a gradual step-by-step evolutionary process over many generations seems to have no chance of building such wonders, since there apparently can’t be many generations of a cell, or even one generation, until these channels are up and running.  No channels, no cellular life.  So then, the key question is: How could the first cells acquire proper membranes and co-evolve the protein channels needed to overcome the permeability problem?” (#144).

Were you to try and hire the best engineers in the world to make such a membrane, they “might either laugh in your face or run screaming into the night.  The requisite technology is far beyond our most advanced human know-how.  And remember, getting two or three things about this membrane job right—or even 99% of the job—wouldn’t be enough.  It is all or death!  A vulnerable cell waiting for improvements from the gradual Darwinian process would promptly be attacked by a myriad of enemies and die, never to reproduce, giving evolution no time at all to finish the job down the road” (#164).  Membranes merely protect the cell, of course, and when you study the inner workings of the cell itself you behold wonders within wonders.  Scientists receive Nobel Prizes for describing tiny bits of cellular life, but it’s obvious to Eberlin that “if Nobel-caliber intelligence was required to figure out how this existing engineering marvel works, what was required to invent it in the first place?” (#272).  More Nobel Prizes were recently given to scientists who discovered how cells repair damaged DNA by making tiny nanomachines.  Their incredible “research and engineering sophistication thoroughly deserved” world acclaim.  But, asks Eberlin:  “Are we then to believe that the marvels of engineering that these brilliant scientists discovered were themselves produced by a mindless process?  If discovering the function of these engineering marvels took genius, how much more genius would be needed to create them?” (#836).

Foresight contains Eberlin’s explorations through a multitude of revealing details—“the code of life,” “bacteria, bugs and carnivorous plants,” birds, “the human form,” etc.  He understands his subject and writes clearly, helping readers share his awe at the wonders of the world we live in, and it is clear to him that it’s all here because of foresight, which by nature requires intelligence.  “The need to anticipate—to look into the future, predict potentially fatal problems with the plan, and solve them ahead of time—is observable all around us.  It is clear from the many examples in this book that life is full of solutions whose need had to be predicted to avoid various dead-ends.  Put another way, many biological functions and systems required planning to work.  These features speak strongly against modern evolutionary theory in all its forms, which remains wedded to blind processes” (#2066).

So Eberlin concludes his book thus:  “Nobel laureate J. J. Thomson—one of the giants of early modern physics, the discoverer of the electron, and the father of mass spectrometry, my field of expertise—beautifully conveyed this optimistic, open-ended view of science.  I can think of no better words for concluding a book about a world filled with evidence of foresight, words as true today as when Thomson penned them in the early twentieth century:  ‘The sum of knowledge is at present, at any rate, a diverging, not a converging, series.  As we conquer peak after peak we see in front of us regions full of interest and beauty, but we do not see our goal, we do not see the horizon; in the distance tower still higher peaks, which will yield to those who ascend them still wider prospects, and deepen the feeling, the truth of which is emphasized by every advance in science, that “Great are the Works of the Lord”’” (#2149).

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

The third speaker at the “Reasons 2019” conference was Melissa Cain Travis, a young professor of apologetics at Houston Baptist University.  She has recently published Science and the Mind of the Maker: What the Conversation Between Faith and Science Reveals about God (Eugene, Oregon: Harvest House Publishers, c. 2018, Kindle Edition).  Citing Oxford philosopher Richard Swinburne, she says: “‘I do not deny that science explains, but I postulate God to explain why science explains. The very success of science in showing us how deeply orderly the natural world is provides strong grounds for believing that there is an even deeper cause of that order’” (#102).  She thus argues, developing what she calls a “Maker Thesis,” that “Christian theism uniquely provides a well-rounded account of both the findings and the existence of the natural sciences.  I will argue that not only do scientific discoveries have positive implications for the existence of a Mind behind the universe, they strongly suggest that this Mind intended for human beings to take up the noble project of rational inquiry into the mysteries of nature.  In other words, Christian theism, unlike atheism, offers a sufficient explanation of the observable features of the natural world as well as mankind’s impressive scientific achievements” (#107).

Travis thus begins by critiquing the philosophical naturalism and scientism so evident in many modern scientists’ worldview, citing as evidence evolutionary biologist Richard Lewontin, who dogmatically declared:  “‘It is not that the methods and institutions of science somehow compel us to accept a material explanation of the phenomenal world, but, on the contrary, that we are forced by our a priori adherence to material causes to create an apparatus of investigation and a set of concepts that produce material explanations, no matter how counterintuitive, no matter how mystifying to the uninitiated. Moreover, that materialism is absolute, for we cannot allow a Divine Foot in the door’” (#222).  Lewontin nicely illustrates what G.K. Chesterton noted: “I never said a word against eminent men of science.  What I complain of is a vague popular philosophy which supposes itself to be scientific when it is really nothing but a sort of new religion” (#132).

Countering this new religion, Travis directs us back to an ancient insight famously crafted by Virgil in The Aeneid:  “The moon’s bright globe, the sun and stars are nurtured / By a spirit in them.  Mind infuses each part / And animates the universe’s whole mass.”  And Virgil was just poetically phrasing Cicero’s philosophical position, set forth in The Nature of the Gods:  “What can be so obvious and clear, as we gaze up at the sky and observe the heavenly bodies, as that there is some divine power of surpassing intelligence by which they are ordered?”  Centuries later St Augustine would blend these Roman thinkers’ views into Christian theology, saying creation is “a great book” we should read carefully.  Indeed: “‘Look carefully at it top and bottom, observe it, read it.  God did not write in letters with ink but he placed what is created itself in front of you to recognize him in; he set before your eyes all these things he has made.  Why look for a louder voice?  Heaven and earth cries out to you:  God made me’” (#510).

Augustine’s philosophical position, Travis thinks, is forever true and can be sustained amidst all the details of modern science.  Thus she conducts a knowledgeable tour of both historical and contemporary astronomy, physics, chemistry, biology, etc.  It’s important to note that “Copernicus, Kepler, Galileo, Newton, and Boyle were key players in the scientific revolution, and all five of them saw the attributes of the cosmos as indicators of a wise Creator in whose image we are made” (#1210).  Their position, however, was significantly undermined by Charles Darwin’s evolutionary naturalism, which effectively removed divine design from the universe.  But a growing number of contemporary scientists—such as Michael Behe and Marcos Eberlin—have effectively set forth evidence favoring Intelligent Design.

Thus there took place a “Revival of the God Hypothesis” in 20th century physics and cosmology.  For example, Max Planck (sometimes called the “father of quantum physics”), “was particularly fascinated by the congruence between the mathematical, law-governed structure of the material world and human rationality; he saw this correspondence as indicative of a designing Mind” (#2256).  Travis finds mathematicians particularly interesting, for they often conclude that inasmuch as the cosmos follows mathematical laws it makes sense to posit a Divine Mind responsible for it all.  To her: “The Maker Thesis has no difficulty explaining the objectivity of mathematical truth, how beautifully mathematics applies to physical reality, and mankind’s corresponding intellectual capacities.  If the cosmos is the creation of a rational Mind in whose image we are made, a Maker who desires our awareness of him, this deep interconnection makes perfect sense.  As Oxford mathematician John Lennox has said, it is ‘not surprising when the mathematical theories spun by human minds created in the image of God’s Mind find ready application in a universe whose architect was that same creative Mind’” (#2816).

* * * * * * * * * * * * * * * * * * * * * * * * * * * *

The fourth and final speaker at the “Reasons 2019” conference was a historian, Michael Newton Keas, who provided a brief survey of his treatise, Unbelievable: 7 Myths About the History and Future of Science and Religion (Wilmington, Delaware: Intercollegiate Studies Institute, c. 2019, Kindle Edition).  As serious scholars know all too well, many “historical truths” are anything but true!  In part that’s because the past is a “story” made up of multiple stories.  And stories can be either true or imaginary!  Thus when scientists tell their stories, detailing their endeavors to understand the cosmos, their renditions “sometimes have implications for belief or disbelief in God or a spiritual heaven.  Too often, however, these stories are false.  They are nothing but myths.  And yet many leading scientists and science writers offer these stories as unassailable truth.  The myths make their way into science textbooks—which is a useful measure of a myth’s influence, as we will see in this book” (p. 2).  The myths further shape minds through popular television and science fiction, including Carl Sagan’s Cosmos series and the wildly popular Star Trek franchise.  In fact: “The executive producer of Cosmos 2014 says that he has spent most of his professional life creating myths for the greater truth of atheism” and celebrated “his part in creating ‘atheistic mythology’ in more than 150 episodes of Star Trek: Next Generation” (p. 153).

The myths Keas endeavors to deconstruct involve baseless stories about such things as the “Dark Ages” and pre-modern thinkers’ failures to understand the immensity of the universe and the global shape of planet earth.  Following Carl Sagan’s example, Neil deGrasse Tyson routinely invokes the “Dark Ages” to mock Christians, asserting they believed in a “flat earth” for five centuries.  Doing so, “Tyson, probably the world’s most influential public voice for science, is spreading misinformation about medieval views of the earth’s shape” (p. 43).  Tyson et al. tell demonstrably untrue but emotionally evocative stories about the alleged persecution of scientists such as Bruno, Galileo, and Copernicus.  Keas describes how these misleading stories made their way into school textbooks and thence into the public consciousness.  He is particularly persuasive when pointing out how textbooks—so often taken as the final word on whatever they cover—serve as propaganda devices for the regnant worldview.  In doing so he provides needed clarity, refuting many of the allegedly “scientific” certainties pervasive in our culture.

319 Suspect “Science” – the Low-Fat Diet

Fifty years ago I regularly read Organic Gardening and Prevention Magazine—back-to-the-earth publications urging readers to embrace a simpler, more natural approach to life.  The articles contained much dietary advice, some of which I embraced and still follow.  I also heeded the advice set forth in Runner’s World, loading up on carbohydrates as the best fuel for active athletes.  Then when the leading “experts” in nutrition began promoting a “low fat” diet and my primary care physician urged me to embrace it, I more or less ate in accord with its dictates.  Now and then I heard of folks embracing a “low carbohydrate” rather than “low fat” regimen (as in the Atkins diet), but I assumed they were food faddists or kooks of some sort.  After all, the US Department of Agriculture had issued its “food pyramid” that supposedly summed up the nutritional experts’ evidence—the last word on it all.  Eating lots of carbs, following a near-vegetarian diet, and exercising, I believed, was the sure way to good health.  

But recently my curiosity was piqued when my step-son and his family embraced the “Keto” diet.  Then I met a man who’d lost nearly 30 pounds following a “low carb” diet, and he suggested I read Nina Teicholz’s The Big Fat Surprise: Why Butter, Meat and Cheese Belong in a Healthy Diet (New York:  Simon & Schuster, Kindle Edition, c. 2014).  I did so and found it both provocative and persuasive.  It’s a very personal book, since Teicholz had for many years devoutly followed the low-fat prescriptions promoted almost everywhere.  But then she moved to New York and found work writing restaurant reviews—and getting free meals!  “Suddenly,” she says, “I was eating gigantic meals with foods that I would have never before allowed to pass my lips:  pâté, beef of every cut prepared in every imaginable way, cream sauces, cream soups, foie gras—all the foods I had avoided my entire life.  Eating these rich, earthy dishes was a revelation. They were complex and remarkably satisfying.  I ate with abandon.  And yet, bizarrely, I found myself losing weight” (p. 2).  That led her to seriously question what she’d been told, and in time she came to believe “that all our dietary recommendations about fat—the ingredient about which our health authorities have obsessed most during the past sixty years—appeared to be not just slightly offtrack but completely wrong. . . .  Finding out the truth became, for me, an all-consuming, nine-year obsession.  I read thousands of scientific papers, attended conferences, learned the intricacies of nutrition science, and interviewed pretty much every single living nutrition expert in the United States” (p. 2).

She discovered that a small coterie of nutritionists, trying to explain skyrocketing numbers of heart attacks, had “hypothesized that dietary fat, especially of the saturated kind (due to its effect on cholesterol), was to blame.  This hypothesis became accepted as truth before it was properly tested” (p. 3).  Critics of the hypothesis—and there were many—found themselves ostracized and effectively silenced, “cut off from grants, unable to rise in their professional societies, without invitations to serve on expert panels, and at a loss to find scientific journals that would publish their papers” (p. 4).  The dissenters, however, have been proven right!  For what Teicholz discovered “was not only that it was a mistake to restrict fat but also that our fear of the saturated fats in animal foods—butter, eggs, and meat—has never been based in solid science.  A bias against these foods developed early on and became entrenched, but the evidence mustered in its support never amounted to a convincing case and has since crumbled away.  This book lays out the scientific case for why our bodies are healthiest on a diet with ample amounts of fat and why this regime necessarily includes meat, eggs, butter, and other animal foods high in saturated fat” (p. 7).

Teicholz begins her presentation with some telling illustrations, including the story of Vilhjalmur Stefansson, who lived for several years with the Inuit natives in the Canadian Arctic a century ago.  They ate virtually nothing but fatty meat and enjoyed good health.  Subsequently he and another man volunteered to duplicate the Inuit diet and thrived for a year (under medical supervision) eating nothing but meat.  Then comes evidence from Africa, where the Masai and other tribal peoples eat little except meat and dairy products without experiencing significant heart disease.  And in India, “Sir Robert McCarrison, the British government’s director of nutrition research in the Indian Medical Service and perhaps the most influential nutritionist of the first half of the twentieth century, wrote that he was ‘deeply impressed by the health and vigour of certain races there.  The Sikhs and the Hunzas,’ notably, suffered from ‘none of the major diseases of Western nations such as cancer, peptic ulcer, appendicitis, and dental decay.’  These Indians in the north were generally long-lived and had ‘good physique[s],’ and their vibrant health stood ‘in marked contrast’ to the high morbidity of other groups in the southern part of India who ate mainly white rice with minimal dairy or meat” (p. 14).

Given such evidence, one wonders why meat and dairy products suddenly became the great culprits to be avoided!  In large part the notion “that saturated fat causes heart disease was developed in the early 1950s by Ancel Benjamin Keys, a biologist and pathologist at the University of Minnesota” (p. 19).  He and his colleagues zeroed in on cholesterol as the primary culprit causing heart disease.  Though it “is a vital component of every cell membrane, controlling what goes in and out of the cell,” it also helped form the “atherosclerotic plaques” which clogged “the arteries until it cuts off blood flow, [and it] was thought at the time to be the central cause of a heart attack” (p. 21).   Keys himself ran tests that involved giving volunteers massive amounts of cholesterol-rich foods without affecting the cholesterol levels in their blood.   “He found that ‘tremendous’ dosages of cholesterol added to the daily diet—up to 3,000 milligrams per day (a single large egg has just under 200 mg)—had only a “trivial” effect” (p. 23).  Disregarding his own research, Keys simply, instinctively knew better—eating fat must make you fat; cutting calories would cut down weight.  “‘No other variable in the mode of life besides the fat calories in the diet is known which shows anything like such a consistent relationship to the mortality rate from coronary or degenerative heart disease,’” he declared in 1954.  “If people just stopped eating eggs, dairy products, meats, and all visible fats, he argued, heart disease would ‘become very rare’” (p. 32). 

Soon thereafter, in 1955, President Dwight Eisenhower had the first of several heart attacks, and his personal doctor, Harvard Medical School professor Paul Dudley White, strongly endorsed Keys’ position.  Speaking to the nation from Ike’s bedside, White explained why heart attacks occurred and urged everyone to stop smoking and eat less saturated fat and cholesterol-laden foods.  Writing a front-page New York Times article regarding the President’s health, White cited Ancel Keys’ “brilliant” work and urged the nation to follow his advice.  In fact, the President had no family history of heart disease and had quit smoking a decade earlier.  He exercised, had normal blood pressure, and his total cholesterol (167) was considered normal.  After his heart attack, however, Ike became “obsessed with his blood-cholesterol levels and religiously avoided foods with saturated fat; he switched to a polyunsaturated margarine, which came on the market in 1958, and ate melba toast for breakfast” (p. 33).  He rarely ate meat or eggs or cheese, but by the end of his presidency his cholesterol registered 259—just days after Ancel Keys appeared on Time magazine’s cover, urging everyone to embrace the heart-healthy diet Ike had been so diligently following.  In Ike’s case, sadly enough, the more he dieted the more cholesterol flooded his system!  He died of heart disease in 1969.

To prove his diet-heart hypothesis, Ancel Keys orchestrated a “Seven Countries” study that seemed to do so.  Yet though frequently cited as evidence, his study at best established “an association between a diet low in animal fats and minimal rates of heart disease; it could say nothing about whether that diet caused people to be spared the disease” (p. 42).  Carefully examined, Keys’ study was full of flaws—nearly fraudulent in some aspects.  But though he had little demonstrable (i.e., clinical) proof, he managed to enlist the American Heart Association in his cause and persuaded the National Institutes of Health to subsidize his research.  Time magazine celebrated him as “Mr. Cholesterol” and he enjoyed virtually unanimous media support, urging folks to eat less meat, drink less milk, and eschew fats of all sorts. 

Columnists such as New York Times health writer Jane Brody relentlessly promoted Keys’ diet-heart hypothesis, and everyone of consequence agreed!  Brody urged everyone to follow a low-fat diet and in 1990 published a seven-hundred-page manifesto:  The Good Food Book: Living the High-Carbohydrate Way.  The message was crystal clear:  dietary fat elevated blood cholesterol which “would eventually harden arteries and lead to a heart attack.  The logic was so simple as to seem self-evident.  Yet even as the low-fat, prudent diet has spread far and wide, the evidence could not keep up, and never has.  It turns out that every step in this chain of events has failed to be substantiated:  saturated fat has not been shown to cause the most damaging kind of cholesterol to go up;  total cholesterol has not been demonstrated to lead to an increased risk of heart attacks for the great majority of people, and even the narrowing of the arteries has not been shown to predict a heart attack” (p. 53).

In fact, while Keys was promoting his hypothesis a multitude of careful, clinical studies—“some of the biggest and most ambitious trials of diet and disease ever undertaken in the history of nutrition science”—disputed it (p. 57).  Triglycerides, not cholesterol, looked like a more probable culprit.  Total cholesterol apparently has little significance, for HDL-cholesterol (the “good” kind) contributes to good health whereas LDL-cholesterol (the “bad” kind) proves deleterious.  Consuming vegetable oils, not animal fats, appeared closely linked to the increased incidence of heart disease.  And carbohydrates, not fat, seemed to actually cause obesity.  One of the most celebrated studies—the Framingham Study—at first seemed to substantiate Keys’ position, but in 1992, a study leader admitted:  “‘the more saturated fat one ate . . . the lower the person’s serum cholesterol . . . and [they] weighed the least’” (p. 67).  More alarmingly, many studies revealed “an extremely uncomfortable fact for the promoters of the diet-heart hypothesis:  people who eat less fat, particularly less saturated fat, appear not to extend their lives by doing so. Even though their cholesterol inevitably goes down, their risk of death does not” (p. 74).  “Another study in Israel followed ten thousand male civil service and government employees for five years and found no correlation between heart attacks and anything they ate.  (The best way to avoid a heart attack, according to the study, was to worship God, since the more men identified themselves as being religious, the lower was their risk of having a heart attack.)” (p. 98).

Yet such dissenting studies failed to register with the American public.  In large part this was because the federal government threw its massive weight into promoting the low-fat diet.  In 1977 Senator George McGovern issued a committee report—“Dietary Goals”—which declared that Americans’ diet was harming their health.  Eating too much meat and eggs and dairy products was responsible for “heart disease, cancer, diabetes and obesity,” whereas eating grains, fruit, and vegetables would improve the nation’s health.  Though the Dietary Goals came out of a typically brief Senate hearing—not a rigorous scientific study—it had enormous impact.  “We cannot afford to await the ultimate proof before correcting trends we believe to be detrimental,” said the senators.  “So it was that Dietary Goals . . . without any formal review, became arguably the most influential document in the history of diet and disease.  Following publication of Dietary Goals by the highest elective body in the land, an entire government and then a nation swiveled into gear behind its dietary advice” (p. 120).  Thereafter the Dietary Guidelines for Americans was published, including the USDA food pyramid, which was widely endorsed as a guide to good health.  “Here, then, was the new reality:  a political decision had yielded a new scientific truth” (p. 125).   As of 2010 the USDA was still promoting a plant-based diet—grains, vegetables, fruits and nuts.

Yet the USDA had no good evidence for its edict!  In fact, “the largest and longest trial of the low-fat diet ever undertaken” (the Women’s Health Initiative) demonstrably failed.  “A review in 2008 of all studies of the low-fat diet by the United Nations’ Food and Agriculture Organization concluded that there is ‘no probable or convincing evidence’ that a high level of fat in the diet causes heart disease or cancer.  And in 2013 in Sweden, an expert health advisory group, after spending two years reviewing 16,000 studies, concluded that a diet low in fat was an ineffective strategy for tackling either obesity or diabetes. Therefore, the inescapable conclusion from numerous trials on this diet, altogether costing more than a billion dollars, can only be that this regime, which became our national diet before being properly tested, has almost certainly been a terrible mistake for American public health” (p. 173).  Unfortunately:  “Despite the original good intentions behind getting rid of saturated fats, and the subsequent good intentions behind getting rid of trans fats, it seems that the reality, in terms of our health, has been that we’ve been repeatedly jumping from the frying pan into the fire. The solution may be to return to stable, solid animal fats, like lard and butter, which don’t contain any mystery isomers or clog up cell membranes, as trans fats do, and don’t oxidize, as do liquid oils.  Saturated fats, which also raise HDL-cholesterol, start to look like a rather good alternative from this perspective” (p. 285).

That “good alternative,” Teicholz believes, is conveniently set forth in the Atkins Diet.  Robert Atkins was a cardiologist who helped tens of thousands of patients lose weight.  “Based on his experience treating patients, Atkins believed that meat, eggs, cream, and cheese . . . were the healthiest of foods.  His signature diet plan was more or less the USDA pyramid turned on its head, high in fat and low in carbohydrates.  Atkins believed that this diet would not only help people to lose weight but also fight heart disease, diabetes, and possibly other chronic diseases as well” (p. 287).  As an active physician, however, he had no “scientific studies” to bolster his claims.  He urged academic “experts” to look at his files, but none was interested.  Though many considered Atkins a faddish innovator, his dietary prescriptions actually had a long, impressive history, beginning with William Banting’s 1863 pamphlet, Letter on Corpulence, Addressed to the Public, which sold thousands of copies around the world and enabled him personally to shed unwanted pounds.  Then:  “In the United States, Sir William Osler, a worldwide medical authority in the late nineteenth century and one of the founders of Johns Hopkins Hospital, promoted a variation of the diet in his seminal 1892 medical textbook.  And a London physician, Nathaniel Yorke-Davis, used a version of the low-carbohydrate diet to treat the obese President William Taft from 1905 on, helping him lose 70 pounds” (p. 293).

Scores of other researchers reached the same conclusion, for during the first half of the 20th century researchers discovered how profoundly insulin affects body weight.  “The body secretes insulin whenever carbohydrates are eaten.  If carbs are eaten only occasionally, the body has time to recover between the surges of insulin.  The fat cells have time to release their stored fat, and the muscles can burn the fat as fuel.  If carbohydrates are eaten throughout the day, however, in meals, snacks, and beverages, then insulin stays elevated in the bloodstream, and the fat remains in a state of constant lockdown.  Fat accumulates to excess; it is stored, not burned.” However, “the absence of carbohydrates would allow fat to flow out of the fat tissue, no longer held hostage there by the circulating insulin, and this fat could then be used as energy.  A person would lose weight, theoretically, not because they necessarily ate less but because the absence of insulin was allowing the fat cells to release the fat and the muscle cells to burn it” (p. 296).  A small group of scholarly researchers have been compiling compelling evidence regarding the advantages of a low-carb, high-fat diet, though they have as yet failed to dislodge the dominant “consensus” regarding healthy diets. 

Yet to Teicholz:  “The sum of the evidence against saturated fat over the past half-century amounts to this:  the early trials condemning saturated fat were unsound;  the epidemiological data showed no negative association; saturated fat’s effect on LDL-cholesterol (when properly measured in subfractions) is neutral;  and a significant body of clinical trials over the past decade has demonstrated the absence of any negative effect of saturated fat on heart disease, obesity, or diabetes.  In other words, every plank in the case against saturated fat has, upon rigorous examination, crumbled away.  It seems now that what sustains it is not so much science as generations of bias and habit—although, as the latest 2013 AHA-ACC guidelines show, bias and habit present powerful, if not impenetrable, barriers to change” (p. 326).

So the low-fat mantra is dutifully repeated in most sectors, and Americans have obediently reduced their consumption of red meat, eggs, and butter.  “Americans continue to avoid all fats:  the market for ‘fat replacers,’ the foodlike substances substituting for fats in processed foods, was, in 2012, still growing at nearly 6 percent per year, with the most common fat replacers being carbohydrate-based” (p. 330).  Yet what they’ve believed lacks credibility.  Ancel Keys declared, in 1952, that heart disease would “become very rare” if folks followed his low-fat diet.  In fact, while following his prescription, “Americans have experienced skyrocketing epidemics of obesity and diabetes, and the CDC estimates that 75 million Americans now have metabolic syndrome, a disorder of fat metabolism that, if anything, is ameliorated by eating more saturated fat to raise HDL-cholesterol.  And although deaths from heart disease have gone down since the 1960s, no doubt due to improved medical treatment, it’s not clear that the actual occurrence of heart disease has declined much during that time” (p. 327).

Teicholz concludes her presentation with these sobering words:  “If, in recommending that Americans avoid meat, cheese, milk, cream, butter, eggs, and the rest, it turns out that nutrition experts made a mistake, it will have been a monumental one.  Measured just by death and disease, and not including the millions of lives derailed by excess weight and obesity, it’s very possible that the course of nutrition advice over the past sixty years has taken an unparalleled toll on human history.  It now appears that since 1961, the entire American population has, indeed, been subjected to a mass experiment, and the results have clearly been a failure.  Every reliable indicator of good health is worsened by a low-fat diet.  Whereas diets high in fat have been shown, again and again, in a large body of clinical trials, to lead to improved measures for heart disease, blood pressure, and diabetes, and are better for weight loss.  Moreover, it’s clear that the original case against saturated fats was based on faulty evidence and has, over the last decade, fallen apart.  Despite more than two billion dollars in public money spent trying to prove that lowering saturated fat will prevent heart attacks, the diet-heart hypothesis has not held up. In the end, what we believe to be true—our conventional wisdom—is really nothing more than sixty years of misconceived nutrition research” (p. 330).

* * * * * * * * * * * * * * * * * * * * * * * * * *

Much that Nina Teicholz says in The Big Fat Surprise was earlier set forth, with much more detail and scholarly erudition, by Gary Taubes in Good Calories, Bad Calories (New York:  Knopf Doubleday Publishing Group, Kindle Edition, c. 2007).  “The reason for this book is straightforward,” he says:  “despite the depth and certainty of our faith that saturated fat is the nutritional bane of our lives and that obesity is caused by overeating and sedentary behavior, there has always been copious evidence to suggest that those assumptions are incorrect, and that evidence is continuing to mount. ‘There is always an easy solution to every human problem,’ H. L. Mencken once said—‘neat, plausible, and wrong’” (#216).  That easy solution—the low-fat diet—was promoted by universities and federal bureaucracies, but its adoption has not particularly improved death rates or overall health because total cholesterol—the big bogeyman in dietary circles—has little to do with heart disease!   But Ancel Keys, relying on statistical data, had insisted on the contrary.  And he won the day, making the low-fat diet virtually mandatory for folks desiring to live well.  Yet he may well have been wrong!  And we’ve all paid the price for his error! 

318 Christian Ethics

As is evident in the New Testament, from the beginning Christians have been concerned with both theology and morality—what we believe and how we behave.  Commending Wayne Grudem’s just-published Christian Ethics:  An Introduction to Biblical Moral Reasoning (Wheaton, IL:  Crossway, c. 2018, Kindle edition), Al Mohler Jr., President of The Southern Baptist Theological Seminary, says:  “Insightful, encyclopedic, biblical, and distinctively evangelical, this new book from Wayne Grudem is a massive contribution to Christian ethics. It will stand as one of the most important and definitive works of this generation.  Readers should engage it chapter by chapter, and then keep it close at hand for continuing consultation.”  High praise from an esteemed evangelical scholar!  As the book’s title indicates, Grudem endeavors to set forth a biblical ethic, saying:  “I have written this book for Christians who want to understand what the Bible teaches about how to obey God faithfully in their daily lives.  I hope the book will be useful not only for college and seminary students who take classes in Christian ethics, but also for all other Christians who seek, before God, to be ‘filled with the knowledge of his will in all spiritual wisdom and understanding,’ with the result that they will live ‘in a manner worthy of the Lord, fully pleasing to him: bearing fruit in every good work and increasing in the knowledge of God’ (Col. 1: 9– 10)” (#334).

Grudem follows the pattern he established 25 years ago in another valuable work, Systematic Theology, “asking what the whole Bible says about various topics” (#560).  Thus every chapter ends with questions to consider, books to consult, a verse to memorize, and a hymn to sing.  He’s little interested in historical or philosophical or theological or natural law ethics inasmuch as he’s persuaded that “only Scripture has the final authority to define which actions, attitudes, and personal character traits receive God’s approval and which ones do not, and therefore it is appropriate to spend significant time analyzing the teaching of Scripture itself” (#573).  Thus he sets forth what ethicists label a “divine command” ethical system.  As embedded in the Bible, God’s moral standards illuminate His moral character, and He “could not have made other moral standards for us than the ones that he made” (#1425).  

God’s moral standards are clear, for Grudem believes one of Scripture’s hallmarks is clarity.  Though recorded only in the Bible, they are applicable to all men at all times.  They are absolutes—and they never conflict.  Though they may demand careful thought to implement, we never need to choose the “lesser of two evils” as some Christians, such as Norman Geisler (who thinks there are times when we must choose, for example, to save a life while destroying property), have averred.  Grudem champions what he calls “the nonconflicting biblical commands view” and insists “that God requires us to obey every moral command in the entire Bible that rightly applies to us in our situations” (#4715). 

He believes that problematic passages—such as Rahab helping the Hebrew spies in Jericho—can be fully explained without justifying sinful behavior (such as lying in Rahab’s case).  So if a Christian were asked by Nazi soldiers if he was hiding some Jews, he must always tell the truth—for in such situations “there are always other options besides lying or divulging where the Jews are hidden.  Silence is one option.  Inviting the soldiers to come in and look around for themselves is another option.  In a comparable situation, several other possible responses might present themselves, including offering hospitality and refreshments to the soldiers” (#4508).  Here Grudem joins the Kantians who insist one must always both intend to do what’s right and then do it, regardless of the consequences, since you cannot control them.  In a frequently cited and debated passage Immanuel Kant declared that if your friend takes refuge in your home, fleeing from a man intent on murdering him, you cannot lie to save your friend’s life.   

Some apparent dilemmas find resolution in Grudem’s Old Testament hermeneutic.  The material included in Genesis 1 to Exodus 19, he says, is relevant to all people at all times.  Thus the “Noahide Covenant” with its seven “laws” is ever and everywhere valid.  Obeying these laws, as prescribed by the Jewish tradition, all of us must (1) refrain from denying God’s Oneness or (2) cursing Him.  We are not to (3) murder, (4) eat the flesh of a living animal, or (5) steal.  We must (6) rightly channel our sexual drive and thereby maintain the family.  And we are to (7) establish civil laws and authorities to maintain justice.  These OT moral imperatives, Grudem says, are for all mankind.  But, he contends:  “The Mosaic covenant, which began when God gave the Ten Commandments at Mount Sinai (Exodus 20), was terminated when Christ died, and Christians now live instead under the provisions of the new covenant.  Nevertheless, the Old Testament is still a valuable source of ethical wisdom when it is understood in accordance with the ways in which the New Testament authors continue to use the Old Testament for ethical teaching and in light of the changes brought about by the new covenant” (#5034).  Thus the sacrificial, ceremonial, dietary, and civil laws of the Hebrew scriptures carry no mandate for Christians.  Nor do the historical, wisdom, and prophetic books, as part of the Mosaic Covenant, provide normative laws for Christians.  So too for the Decalogue:  “Although the Mosaic covenant was terminated at the death of Christ, the Ten Commandments (Ex. 20: 1– 17; Deut. 5: 6– 21) still provide a useful summary of ethical topics.  However, all of these commandments (except the Sabbath commandment) are reaffirmed in the New Testament and should be thought of as part of the ‘law of Christ,’ which should guide the lives of Christian believers in the new covenant” (#6068).

Consequently, Grudem structures his Christian Ethics in accord with “protecting” (1) God’s honor, (2) human authority, (3) marriage, (4) property, and (5) purity of heart, seeking guidance from the Ten Commandments, which provide “a useful framework for studying all ethical topics” (#6113).  Thus, for example, the first commandment, “Thou shalt have no other gods before me,” can be followed by urging moments of prayer in public schools, and the second commandment, prohibiting “graven images,” can be applied to deviant notions of God.  “Taking the Lord’s name in vain” should prompt us to treasure “purity of speech” and avoid cursing and obscenity as well as misusing God’s name itself.  Bearing false witness includes lying, so one should ever speak truly.  Even a lack of punctuality, Grudem thinks, constitutes a kind of lying and merits censure.  Thus one can find hundreds of ethical questions answered by bringing to bear appropriate biblical texts. 

Few folks—other than ethics teachers such as myself—would read right through Grudem’s Christian Ethics, but it has real value as a reference work.  Should one want to deal with any number of issues—ranging from parental authority, capital punishment, war, self-defense, abortion, suicide, birth control, divorce, vacations, to borrowing and lending—he will likely find them helpfully addressed in this book.   

* * * * * * * * * * * * * * * * * * * * * *

Whereas Wayne Grudem espouses a divine command version of Christian ethics, David Haines and Andrew Fulford take a different approach in Natural Law:  A Brief Introduction and Biblical Defense (The Davenant Press, Kindle Edition, c. 2017).  The Davenant Institute, which published this guide, “supports the renewal of Christian wisdom for the contemporary church.  It seeks to sponsor historical scholarship at the intersection of the church and academy, build networks of friendship and collaboration within the Reformed and evangelical world, and equip the saints with time-tested resources for faithful public witness” (p. 126).  Clearly written, brief without being superficial, the book accomplishes its purpose and could easily be used in either college or Sunday school classes. 

The authors begin by listening carefully to the great declaration found in both Bible and Creeds, where we read that “God is the source of all creation, and that all created things were, in their divinely instituted natural states, good.  As we will see, the very fact of divine creation seems to point towards what has been traditionally called natural law:  the notion that there is, because of the divine intellect, a natural order within the created world by which each and every created being’s goodness can be objectively judged, both on the level of being (ontological goodness), and, for human-beings specifically, on the level of human action (moral goodness).  Ontological goodness is the foundation of moral goodness” (Introduction).  The best biblical passage clarifying this is Romans 2:14-15, and to great theologians, including John Calvin (whom the authors generally follow), this passage says all persons have “engraved on their hearts, a warning and judgment by which they discern between right and wrong, between honesty and villainy.”  Still more, Calvin continues:  “Men, therefore, have a certain natural knowledge of the law, which teaches them and tells them, in themselves, that one thing is good, and the other detestable.”

In accord with Calvin and other Christians, Haines and Fulford say:  “By natural law, then, we mean that order or rule of human conduct which is (1) based upon human nature as created by God, (2) knowable by all men, through human intuition and reasoning alone (beginning from his observations of creation, in general, and human nature, in particular), independent of any particular divine revelation provided through a divine spokesperson; and, thus (3) normative for all human beings” (p. 5).  It is not human (or positive) law but God’s law revealed in His handiwork—and primarily in us.  He designed us in His own image, giving us an ability to think and understand our own essence, and “if there is a natural law, then there is a Being which is superior to Human-beings, which is rational, and which is powerful enough to enforce the standard He has imposed upon the beings He governs” (p. 13).  Divinely ordered, the natural law rests upon a “combination of metaphysical and epistemological Realism which we will call Moderate Realism” (p. 23). 

Most fully developed in the “common sense” or philosophia perennis tradition shaped by Aristotle and Thomas Aquinas, Moderate Realism holds that all things are what they are because they were designed to be so—and their designs can be rationally known by us.  As Étienne Gilson said, when we know a thing we simply grasp its “nature,” a reality “‘situated in an existence which is not that of the knower, the ens of a material nature’” (p. 41).  As human beings (i.e. rational animals) we are able “to love sacrificially, communicate through language, laugh,” and choose to sacrifice our own “good” in order to help another person.  “These distinguishing features of Human nature are what we should call essential attributes, that is, attributes which result from human nature. The word rational refers to, among other things, the capacity to reason, to consider abstract concepts for the sake of knowing, to deliberate about means to ends, etc. We propose, therefore, that humans are rational animals” (pp. 32-33).  Thus we have developed the natural sciences, such as astronomy, physics, and chemistry, wherein we seek to rigorously describe and fully understand the essence (or nature) of such things as gravity or water or osmosis.  There are real things that have their own essence, quite apart from our observations, that can be truly known when we study them, enabling us to discern their material, efficient, formal, and final causes.

The natural law tradition easily synthesizes with a biblically-based ethic, for the Bible “everywhere assumes, and in some places explicitly appeals to, natural law. The written book of God constantly bears witness to God’s other book, the book of nature” (p. 50).   Consequently, Haines and Fulford take us on a tour of the Hebrew scriptures, extra-canonical Jewish literature, and the Christian scriptures, showing how frequently they rely on natural law thinking.  For Christians, of course, the locus classicus for such is Romans 1 and 2.   Thus “Christians who believe in Scripture ought to be defending the existence and visibility of natural law, both to other Christians and to the world at large” (p. 107).

* * * * * * * * * * * * * * * * * * * * * * * * *

To the great Roman philosopher and orator Cicero:  “The absolute good is not a matter of opinion, but of nature.”  A good knife has a sharp blade and cuts effectively—it’s good because it attains its end, not because we like the way it looks.  A good surgeon removes a diseased kidney—he’s good because he does what he’s equipped to do, not because he’s a congenial guy.  Cicero’s words mark him as an advocate of the “natural law,” which is not a “law of nature” such as gravity but the right reading of human nature, both in its essence and end.  Embracing this approach in An Introduction to Ethics:  A Natural Law Approach (Eugene, OR:  Cascade Books, Kindle Edition, c. 2018), a young philosophy professor now teaching at Ohio Dominican College, Brian Besong, endeavors “to explain clearly and briefly to a non-philosophical audience the principles of ethics that dominated moral thinking in the West at least until the so-called ‘Age of Enlightenment’ that began in late seventeenth-century Europe” (#53).  For all too many intellectuals, little value can be found in the ancient and medieval eras, so most modern ethicists embrace systems such as existentialism, utilitarianism, pragmatism, etc.  However, as Besong insists, the “natural law approach” was for many centuries simply called ethics since “most of the major philosophers in the West [Plato; Aristotle; Cicero; Augustine; Aquinas] endorsed views that fell within this tradition” (#106). 

He begins by looking at “foundational” issues, showing why he finds many positions (such as rational egoism, moral relativism, and logical positivism) inadequate.  Then he begins building his own case by citing Boethius’s The Consolation of Philosophy:  “Nature has implanted in the minds of men a genuine desire for the good and the true, but misled by various delusions they often reach the wrong goal.”  Most deeply and by design we all desire happiness—to flourish in a fully human way—and the only way to do so is to live rightly, to be morally good.  Natural law philosophers, following Aristotle and St. Thomas Aquinas, identify “happiness” as our final end, the ultimate goal of the good life.  But such “happiness” cannot be confused with pleasures, all of which “are cheap thrills, which Aristotle suggests are mainly pursued by the ‘most vulgar type’ of people, who prefer a life ‘suitable to beasts’” (#1160).  So too neither prosperity nor fame provides lasting happiness, for happiness, rightly defined, is a state of completeness and contentment, a spiritual reality attained through right reasoning, a contemplative activity transcending worldly endeavors.  Though Aquinas largely embraced Aristotle’s position, he insisted the true happiness for which we hunger cannot be attained on earth, where we are limited and mortal.  Real happiness can come only in Heaven, where we may enjoy the beatific vision.  Though agreeing with Aristotle that contemplative thought makes one happy, Aquinas insisted God Himself is Truth and Goodness, so rightly understanding God most fully makes one happy.

Joining Cicero as an advocate of the natural law, another Roman philosopher, Seneca, said:  “True wisdom consists in not departing from nature and in molding our conduct according to her laws and model.”  There are rights and wrongs.  Good and evil truly exist and can be known.  Moral goodness resides in acts that contribute to our flourishing, making us happy as we realize our true end.  Rightly understanding our nature as human beings gives us objective guidelines, enabling us to live well.  It is thus demonstrably true that we need to drink water, not gasoline, and doing so satisfies our thirst; we need love, not neglect, so children thrive in loving families; and we need to see how things really are rather than believe illusions or “social constructions,” so students must learn to think realistically.  Analyzing moral action, C.S. Lewis, a 20th century natural law proponent, suggested, in Mere Christianity, that we understand it in nautical terms, thinking how a naval fleet needs to function.  First, you need sea-worthy ships, capable of sailing across the waters; then you need to adjust your trajectory in light of other vessels, seeking to sail with rather than bump into them; and finally you need to know where you’re going, using maps and instruments to guide your journey.  Morality is thus personal (a solid ship), social (cooperating with other vessels), and teleological (knowing where to go). 

Natural Law ethicists hold people responsible for their behavior, though they always realize, with Velleius Paterculus, a Roman historian, that:  “It is customary for people to excuse all their own faults but never the faults of others, and to blame the affairs of others always on the person’s will rather than attendant circumstances.”  We don’t hold accountable tornadoes or three-year-old children or passengers in an out-of-control car.  We understand that one must be in full possession of his faculties (being mature and rational).  Thus we are not culpable for instinctive acts, be it breathing or screaming, and likewise “spontaneous thoughts, being characteristically both unfree and unintentional, are not something we are morally responsible for.  We need not feel guilty if we randomly and uncontrollably find ourselves feeling jealous of another person, for instance, or feeling an inappropriate attraction for someone, among many others.  To the degree that these were unintentional and unavoidable, they are not our fault, and we are not to blame for them” (#2642).  Good actions are fully rational, intentional, and volitional. 

Indeed:  “Intentionality, freedom, and knowledge are the three requirements for having an act that is morally evaluable, at least in the normal way” (#2331).  Professor Besong carefully explains these criteria, making helpful distinctions and developing a meaningful position capable of dealing with many of the true complexities of moral reasoning.  There is, for example, the “principle of double effect.”  One may primarily want to act rightly, but in fact there may be (honestly understood) bad as well as (equally well understood) good consequences.  You might, for example, decide to rescue a child from a burning house while leaving his pet dog (or even an elderly, obese quadriplegic) to die.  That would be a difficult but good act.  And unlike the disciples of Immanuel Kant (or Christians such as Wayne Grudem), natural law ethicists recognize important ethical distinctions we need to make.  “Consequently,” when wondering if it was right for Christians to mislead Nazis hunting Jews, “it is not always wrong to intend to deceive. It is always wrong to intend to lie, but lying and deception are not the same thing” (#2966). 

“To be mindful of his duty is true honor to an upright man,” said the Roman playwright Plautus (in Trinummus).  Thus “we must act (or avoid acting) in a certain way to be good and achieve happiness” (#3143).  By nature we are duty-bound to live rightly, that is, righteously.  Doing one’s duty requires revering the rights of others.  This begins by defending everyone’s right to life, the most basic of all rights.  Murder—the deliberate taking of an innocent person’s life—is always wrong.  It logically follows that we also “have a right to health and bodily integrity” (#3380), so poisoning or punching a person harms him and should not be done.  Not all killing, however, is murder.  At times we may need to kill an aggressor—in self-defense or just war—because it is right to stop an aggressor intent on harming innocent persons.  Other nature-based rights include such things as a right to education (since we are rational beings who need to develop our minds) and private property (since we are material beings needing material goods to live well).  Such basic rights “do not depend upon an individual’s degree of maturity or physical growth, least of all upon an individual’s ability to physically defend himself. Instead, rights depend upon an individual’s potential to pursue happiness” (#3586).

To properly pursue happiness, the natural law tradition almost always emphasizes the importance of the classical virtues.  So Besong devotes an illuminating chapter to analyzing the four “cardinal” virtues—prudence (practical wisdom, knowing how to act); temperance (finding balance in all we do, avoiding extremes); fortitude (patiently enduring, courageously holding course); and justice (giving everyone what is due him).   Living virtuously makes us happy, so rather than deliberately trying to find what makes us happy we need to discover that happiness comes as a side effect of living righteously.

317 Facts, Not Fear

Two decades ago, Bjorn Lomborg insisted (in The Skeptical Environmentalist:  Measuring the Real State of the World) that if more of us were more skeptical fewer of us would fear environmental catastrophes.  He provided a densely documented repudiation of the environmentalist litanies which then orchestrated world opinion and political action.  As “an old left-wing Greenpeace member,” it was difficult for Lomborg to entertain second thoughts about the movement he’d supported, but reading an interview with Julian Simon prodded him “to put my beliefs under the statistical microscope” (p. xix).  The results—displayed in charts and graphs on almost every page as well as 2,930 footnotes and 1,800 bibliographical entries—undermined the worldview he’d too easily championed.  Lomborg devoted 68 double-column pages to global warming, easily the most emotionally-charged current environmental issue.  He emphasized that many factors point to a slowly warming planet.  But the data are not totally persuasive.  And even worst-case scenarios will not dramatically change life on earth.  Many of the headline-grabbing projections are little more than “computer-aided storytelling.”  Frantic efforts to retard the warming trend would do little to alter the process.  We could easily expend enormous sums and slightly reduce the amount of global warming, but in a century such efforts will make little difference!  So Lomborg urged us to invest in more realistic endeavors and deal with the consequences of global warming when and if—when and if!—they transpire.  He further cautioned that if an estimated $100 trillion were spent during this century to curtail global warming it would reduce global temperature by only one-sixth of a degree Celsius!

Inasmuch as Congresswoman Alexandria Ocasio-Cortez recently rolled out a “Green New Deal” that has garnered endorsements from a variety of Democratic politicians (many of them running for President), it’s obvious we need to treat such proposals with considerable skepticism.  That’s especially true when we’re told we have only 12 years to save the planet!  (I’m reminded of Mark Twain’s quip in Life on the Mississippi:  “There is something fascinating about science.  One gets such wholesale returns of conjecture out of such a trifling investment of fact.”)  Agitators such as Ocasio-Cortez know that bad news makes the news, and popular publications and public school teachers pick up on alarming announcements, so that all too quickly extreme cases become accepted as basic norms.  Fears fly faster and further than facts!  Ungrounded alarms send folks scurrying for shelter when nothing has happened!  Scientists, as prone to vanity as any other professional group, enjoy the spotlight as lonely prophets and feed the frenzy.

So it’s helpful to peruse Gregory Wrightstone’s recent work, Inconvenient Facts:  The Science Al Gore Doesn’t Want You to Know (Silver Crown Productions, Mill City Press, c. 2017) for a readable (and well-illustrated) update to Lomborg’s Skeptical Environmentalist.  Introducing the book, England’s Viscount Monckton of Brenchley says:  “The Roman poet Virgil wrote of the scientist:  ‘Felix qui potuit rerum cognoscere causas:  Happy the one who finds the why of things.’  Science was originally known in the West as philosophia naturalis—the love of the nature of wisdom that is love of the wisdom of nature.  The noble philosophical mission of ‘the seeker after truth’, as the Iraqi mathematician and empiricist al-Haytham beautifully described the scientist, was to discern what is so in nature and why it is so, and to answer the question of the Greek philosopher Anaximander:  how to distinguish what is from what is not?” (Kindle #43).  Still more:  Science, to al-Haytham, “is not done by mere head count:  ‘The seeker after truth does not put his faith in any consensus, however venerable or widespread.  Instead he questions what he has learned of it, applying to it his hard-won scientific knowledge, and he inspects and inquires and investigates and checks and checks and checks again. The road to the truth is long and hard, but that is the road we must follow.’” 

“Gregory Wrightstone,” Viscount Monckton continues, “is a man of true science, firmly in the tradition of al-Haytham.  His mission in this book is not to prop up some failed Party Line willy-nilly, nor—on the other hand—unthinkingly to oppose that Party Line merely on the basis that it is as scientifically disagreeable as it is histrionically hysterical.  His mission is to distinguish what is from what is not in the climate debate.  He has splendidly succeeded” (#65).  Wrightstone himself says he writes “to provide non-scientists with well-documented, easily understood data on the basics of the science, while spotlighting the many glaring flaws in the climate-catastrophe arguments.”  The facts on display easily equip us to evaluate claims set forth by “climate change catastrophe” devotees who indulge in scare tactics to advance their political agendas.  “The inconvenient facts presented here show that the threat to humankind is not climate change or global warming, but a group of men (and women) intent on imposing an agenda based on severely flawed science” (#271). 

Wrightstone is a “geoscientist who has dealt with various aspects of the Earth’s processes for more than 35 years” who knows “that the brief hundred or so years of recorded temperatures—and the even shorter time frame since the first satellite was launched—is just a blink of a geologic eye.  It is too brief a period to evaluate the data adequately” (#258).  The planet has been much cooler—and much warmer—in the past, and carbon dioxide levels have oscillated wildly.  Indeed:  “Our current geologic period (Quaternary) has the lowest average CO2 levels in the history of the Earth” (#448).  Still more:  we also know that there were several previous eras (Minoan, Roman, Medieval) “that all were warmer than today, even though CO2 concentration was only 70% of today’s” (#581).  “It was warmer than today for 6,100 of the last 10,000 years,” and “the current warming trend is neither unusual nor unprecedented,” so it’s obvious CO2 levels had little to do with it (#858).  Contrary to alarmist articles, water vapor, not carbon dioxide, is mainly responsible for greenhouse warming.  Indeed:  “Nearly 99% of the atmosphere consists of nitrogen and oxygen.  The remaining 1% consists of several trace gases, including CO2, whose current concentration represents just 0.04% of the atmosphere, or 400 molecules out of every million” (#375). 

Understanding elementary geoscience frees us from “climate apocalypse myths” popularized by National Geographic and environmental groups.  Fortunately the world is not “careening toward planetary doom because of our excesses.”  In fact:  “Humanity and the Earth are prospering wildly, not in spite of rising temperatures and increasing carbon dioxide, but because of them” (#1119).  Nor does anything like an overwhelming “consensus” regarding global warming exist in the scientific community.   A 2016 survey of 4,000 members of the American Meteorological Society “found that 33% believed that climate change was not occurring, was at most half man-made, was mostly natural, or they did not know.  Significantly, only 18% believed that a large amount—or all—of additional climate change could be averted” (#1221).  Amazingly:  “Only 0.3% of published scientists stated in their papers that recent warming was mostly man-made” (#1223). 

Wrightstone presents data and charts to show that, contrary to apocalyptic myths, during the past several decades there have been fewer droughts, forest fires, famines, heat waves, tornadoes, and hurricanes.  Polar bear populations have increased, rather than decreased, and sea levels have been rising for 15,000 years with no dramatic increase in that rate during the past century.  Though a small peninsula of Antarctica has been losing its ice cover, “Most of Antarctica is cooling and gaining ice mass” (#2006).  In sum:   “The inconvenient facts in this book support quite a different narrative from that offered by proponents of apocalyptic human-driven climate change.  On every key topic examined, the evidence, supported by voluminous peer reviewed studies, reveals that the ‘consensus’ opinion promoted by climate-apocalypse proponents is consistently at odds with reality” (#2026). 

For the reader’s convenience, there’s a list of the 60 “inconvenient facts” appended to the text. 

* * * * * * * * * * * * * * * * * * * * * * * * *

In Landscapes & Cycles:  An Environmentalist’s Journey to Climate Skepticism (c. 2013) Jim Steele, a biology professor at San Francisco State University, explains why he, as an active field scientist, came to distrust alarmist statements when they ran counter to his careful experiments.  He ever emphasizes:  “Although it is wise to think globally, all wildlife reacts locally” (p. 1).  Rather than endorse warnings issued by “authorities” (such as James Hansen—the “father of modern global warming theory”), he looked at the evidence he best knew, and “to my great surprise and great relief, when I examined 100 years of local climate observations throughout California, I found they contradicted the global models.  Global warming was not global and the local perspective suggested wildlife was not being harmed by climate change” (p. 3).  Though alarmist articles blamed rising CO2 levels for wildlife extinction, Steele found that “local temperatures never warmed or were never examined” (p. 15).  Instead, the slightly warmer temperatures during the past 60 years seemed to have benefitted rather than harmed the state’s wildlife.   Yet when he dared state this obvious truth he was attacked as a “denier” and accused of helping Big Oil (p. 9). 

Steele carefully examines celebrated environmentalist horror stories, beginning with Dr. Camille Parmesan’s research on butterflies.  She “is considered one of the leading figures in climate-change research” and was “ranked as the second-most cited author in papers devoted expressly to global warming and climate change” (p. 19).  She is one of the “global” thinkers, looking for “overall patterns” rather than specifics.  Importantly, her views garnered the endorsement of “one of the most prestigious scientific journals with one of the highest rejection rates, Nature” (p. 21).  Though she had become a scientific celebrity—even invited to speak in the White House—her claims triggered questions for Steele since they “contradicted the butterfly’s well-established biology” and “blamed ‘global’ warming even though local maximum temperatures had cooled.  Although butterfly experts and scientists dedicated to saving the butterfly from extinction had pointed to habitat destruction as the culprit and sought habitat restoration, Parmesan argued for reduced carbon emissions” (p. 20). 

Since one of the nation’s best butterfly experts, Paul Opler, finds no evidence bolstering Parmesan’s position, Steele contacted her and “asked for the locations of her research sites.”  She refused!  “More than three years later,” he says, “I am still waiting” (p. 25).  And the butterflies she said were going extinct have, in fact, “been recovering” nicely, but one would never know it since “there have been no press releases to celebrate the good news” (p. 26).  After checking the facts in one of her famous papers—“Impacts of Extreme Weather and Climate on Terrestrial Biota”—Steele declares it “egregious.  Her conclusions are based on deceptive half-truths and grave sins of omission, yet it mesmerized the nation’s top climate scientists, who rapidly adopted her as blindly as the ants adopted a Large Blue [butterfly species]” (p. 85). 

Turning to another example, Steele examined the much-publicized decline of Emperor penguins—the largest of all Antarctic penguins.  Recent satellite surveys indicate that there are probably 600,000 of them, but the media persist in referring to “old data from a single colony that had suddenly declined during the 1970s to create a model demonstrating that rising CO2 will cause the Emperors to soon go extinct” (p. 52).  On-site data show “there has been absolutely no local warming,” yet climate scientists still issue warnings that the “Emperors are on the precipice of collapse, when in reality there are more penguins and more Antarctic sea ice now than has ever been observed before” (p. 16).  The same is true of Adelie penguins, cited by Al Gore as a sure indicator of climate change.  He, along with the World Wildlife Fund, focused on “one small area where 80% of the penguins have been lost” while withholding data showing that elsewhere the “Adelies are thriving” (p. 174), unfazed by global warming! 

Polar bears are likewise alive and well!  As are golden toads, pikas, walruses, and gray whales!  Unlike the computer-generated alarmist declarations of species doomed to extinction, careful local studies often show them doing quite well, easily adjusting to changing environmental conditions.  Reading Steele’s essays provides a healthy antidote to the frenzy animating the “climate change” movement.

* * * * * * * * * * * * * * * * * * *

A decade ago Lawrence Solomon was working for Energy Probe, one of Canada’s oldest and largest environmental organizations.  For years he’d helped sound the alarm on global warming.  He also wrote a weekly column for the National Post.  Learning that there were a few distinguished scientists who disputed the global warming forecasts, he decided to devote a few of his columns to them.  One name led to another and in time he “profiled some three-dozen scientists, all recognized leaders in their fields, many of them actually involved in the official body that oversees most of the world’s climate-change research” (p. 6).  Ultimately he collected his columns in a highly readable and informative book—The Deniers:  The world-renowned scientists who stood up against global warming hysteria, political persecution, and fraud, and those who are too fearful to do so (Richard Vigilante Books, c. 2008).  One of the striking  truths he discovered was that the scientists he interviewed almost always said:  “‘I’m sure global warming exists.  All the science from all the different scientific disciplines say so.  But there is one exception—my particular area of expertise has found no compelling evidence of manmade global warming’” (p. 46).   So there seems to be a pervasive pattern:  “Affirmers in general.  Deniers in particular” (p. 46). 

For his first column he interviewed Dr. Edward J. Wegman, a professor at George Mason University who is considered one of the world’s finest statisticians.  He’d become involved in the global warming issue when asked by the House of Representatives to evaluate the famous hockey stick graph, the “poster child” of the Intergovernmental Panel on Climate Change and frequently featured “in the global warming debate” (p. 10).  Constructed by Professor Michael Mann, the graph showed global temperatures dramatically increasing throughout the 20th century, in tandem with increasing amounts of atmospheric CO2.  (Mann’s graph simply ignored long-accepted evidence regarding the “Medieval Warming Period” wherein temperatures were much warmer than they are today and during which the civilization of the “High Middle Ages” flourished.) 

Mann’s methodology had been subjected to intensive scrutiny by Stephen McIntyre, who concluded it was inaccurate at best and devious at worst!  So Professor Wegman was asked to decide who was right.  He assembled “an expert panel of statisticians” to help him, and pronounced that Mann’s “hockey stick” was rooted in an erroneous methodology.  “Wegman argued not only that Mann was wrong but also that the mistakes he made were those that would have been fairly obvious to a top-notch statistician” (p. 18).  That Mann’s work had been “peer reviewed” also distressed Wegman, for he discovered that the “peers” evaluating it all belonged to a small, tightly-bound circle of men committed to the global warming agenda.  In light of Wegman’s devastating critique, the IPCC dropped the hockey stick from its publications.

As a committed environmentalist, Solomon had long believed the UN climate-change scientists who linked hurricanes, such as Katrina, with global warming.  So he was “dumbfounded” when he found that the leading expert on Atlantic hurricanes, Dr. Christopher Landsea, denied any correlation, much less causation.  Summing up, he said:  “‘There are no known scientific studies that show a conclusive physical link between global warming and observed hurricane frequency and intensity’” (p. 31).  He wrote IPCC officials, protesting:  “‘Where is the science, the refereed publications, that substantiates these pronouncements?  . . . .  As far as I know there are none’” (p. 33).  He then resigned from his IPCC position, lamenting that “‘I personally cannot in good faith continue to contribute to a process that I view as both being motivated by preconceived agendas and being scientifically unsound’” (p. 35).  Subsequently, “the IPCC quietly banished hurricanes as cover-story material.  Also like the Mann hockey stick, the hurricane fears have done their work” (p. 36).

Of particular importance is the stance of Dr. Richard Lindzen, a professor of meteorology at the Massachusetts Institute of Technology and one of the world’s most acclaimed climate scientists.  He authored a chapter in an IPCC report, only to find it seriously misrepresented in its “Summaries for Policymakers.”  To counteract the distortions therein, he wrote a short piece for the Wall Street Journal titled “Climate of Fear.”  People’s fears were being stoked by alarmists who spread “‘model results we know must be wrong’” and predict “‘catastrophes that couldn’t happen even if the models were right’” (p. 50).  Testifying before the U.S. Senate, Professor Lindzen condemned both the media and politicians such as Al Gore who spread untruths.  “‘How is it that we don’t have more scientists speaking up about this junk science?’ he asks.  His grim answer:  carrots and sticks.  Those who toe the party line are publicly praised and have grants ladled out to them from a funding pot that overflows with more than $1.7 billion per year in the United States alone.  Those who don’t are subject to attack’” (p. 52). 

When the White House asked the National Academy of Sciences to appoint a panel on climate change independent of the IPCC, Professor Lindzen was one of eleven American scientists asked to assess the evidence.  After careful study, the panel issued a finely-nuanced statement:  “‘Because there is considerable uncertainty in current understanding of how the climate system varies naturally and reacts to emissions of greenhouse gases and aerosols, current estimates of the magnitude of future warming should be regarded as tentative and subject to future adjustments (either upward or downward)’” (p. 56).  So how was this careful assessment conveyed to the public?  “CNN, in language typical of other reportage, stated it represented ‘a unanimous decision that global warming is real, is getting worse, and is due to man.  There is no wiggle room’” (p. 56).  No wonder Richard Lindzen despairs! 

Dr. Vincent Gray is more than despairing.  He’s angry!  So he wrote The Greenhouse Delusion:  A Critique of “Climate Change 2001.”  He is one of the “2,500 top scientists” the IPCC cites as endorsing the global warming agenda.  He did, indeed, serve as a reviewer of the organization’s reports, submitting some 1,900 comments on one of them.  But Gray has become “aghast at what he sees as an appalling absence of scientific rigor in the IPCC’s review process” (p. 58).  In fact he thinks the whole thing may be little more than a “swindle”!  He even challenges what’s taken for granted by many scientists—the claim that the earth is excessively warming.  He notes that global temperature records may well be flawed since temperature stations “are disproportionately located near cities and towns, which are heat sources, rather than out in the country,” and “many stations that were once in the country have had cities grow up around them, affecting temperature trends” (p. 59).  And even to focus on one century’s trends belies a mental myopia, for over the millennia earth’s climate has dramatically changed.  We have, in fact, recently emerged from the Little Ice Age, for which we should be grateful!  

Central to the “climate change” hysteria is the claim that CO2 has dramatically increased during the past few decades.  Professor Zbigniew Jaworowski, a famed Polish scientist, has protested the IPCC’s reliance upon ice-core data to prove the CO2 threat.  “‘These ice cores are a foundation of the global warming hypothesis,’” he says, “‘but the foundation is groundless—the IPCC has based its global warming hypothesis on arbitrary assumptions and these assumptions, it is now clear, are false’” (p. 98).  In fact:  “‘Scientists have been studying and measuring CO2 since the beginning of the 19th century, and they have left behind a record of tens of thousands of direct real-time measurements.  These measurements tell a far different story about CO2—they demonstrate, for example, that CO2 concentrations in the atmosphere have fluctuated greatly and that several times in the past 200 years CO2 concentrations have exceeded today’s levels’” (p. 107).  Despite such facts, desperate politicians still stoke the fears of an ignorant populace! 

316 Solzhenitsyn’s Witness

Since becoming aware of Aleksandr Solzhenitsyn 50 years ago I’ve read—and read about—him.  He has remained for me a powerful witness, revealing important truths regarding Communism, 20th century history, the importance of writers, and the durability of “permanent things.”  I’ve recently read the first three volumes of The Red Wheel novels:  August 1914, which focuses on the pivotal first month of Russia’s engagement in WWI, showing why and how the Tsarist state failed to rightly respond to the conflict; November 1916, which deals with the disintegrating home front (and is the most explicitly Christian of the novels); March 1917, which shows the government beginning to dissolve amidst the collapse of traditional authorities.  In the judgment of David Walsh:  “There is no doubt that The Red Wheel is one of the masterpieces of world literature, made all the more precious by its relevance to the tragic era through which contemporary history has passed.  Moreover, the impulse of revolutionary and apocalyptic violence associated with the age of ideology has still not ebbed.  We remain confronted by the fragility of historical existence, in which it is possible for whole societies to choose death rather than life.”

In toto, The Red Wheel constitutes what Solzhenitsyn considered “the chief artistic design of my life.”  He believed the two Russian revolutions in 1917 were the crucial events of the 20th century—the cauldron of destruction still defacing the globe.  Unfortunately, few Americans would plough through these massive (thousands of pages!) tomes since they deal, in intricate fashion, with figures and events in Russian history unlikely to interest them.  But to Solzhenitsyn, “there is always only one right path:  to tackle the main job.  That job will lead you to the right path of its own accord.  Tackling the job meant seeking out, for myself and for the reader, how, through our past, we can conceive of our future” (#5054).  Russian history merits our attention since it provides an important lesson regarding the fatal consequences of embracing any utopian, socialist vision for society.  Especially enlightening are passages such as one finds in November 1916, surveying the leftist movement that would finally prevail in the Bolsheviks’ triumph.  Largely responsible for that triumph was a “hapless Russian liberalism, prostrating itself, dropping its spectacles, raising its head again, throwing up its hands, urging moderation, and generally making itself a laughingstock” (p. 59).  Though feigning impartiality, Russian liberals unfailingly aligned themselves with leftist ideologies.  “Educated Russian society, which had long ago ceased to forgive the regime for anything, joyfully applauded left-wing terrorists and demanded an amnesty for all of them without exception” (p. 59).  As the 20th century dawned, the democratic liberals issued angry fulminations against the Tsar and his government while refraining from any critique of “the revolutionary young” who had gained control of the universities and “knocked their lecturers down and prohibited academic activity.”  Students—then and now—unleash what seems to be an effluence of adolescence—“the normal sympathy of the young for the left” (p. 495).   

In a passage I’ve pondered many times—for it speaks as directly to the United States of 2019 as to the Russia of 1917—Solzhenitsyn said:  “Just as the Coriolis effect is constant over the whole of this earth’s surface, and the flow of rivers is deflected in such a way that it is always the right bank that is eroded and crumbles, while the floodwater goes leftward, so do all the forms of democratic liberalism on earth strike always to the right and caress the left.  Their sympathies always with the left, their feet are capable of shuffling only leftward, their heads bob busily as they listen to leftist arguments—but they feel disgraced if they take a step to or listen to a word from the right” (p. 59).  Indeed, as Lenin and other revolutionaries realized:  “The wind always blows from the far left!  No Socialist in the world could afford to ignore that fact” (p. 485).  In a 1979 interview with the BBC Russian service, Solzhenitsyn lamented the 1917 failures of liberals and moderate socialists in the Duma and the Provisional Government.  They lacked the courage needed to oppose the hard left—a pattern of “weakening and self-capitulation” that would be “repeated on a world-wide scale since those days.”

Russian revolutionaries, many of them representing the “hard left,” also relied on a supportive Russian press.  As the main protagonist of The Red Wheel, Colonel Georgi Vorotyntsev, mused:  “It’s always leftist.  All destructive.  Vilifies the Church, vilifies patriots—they narrowly avoid mentioning the throne directly, they’ve learned to yap about what they call the regime.  Every fly-by-night journalist speaks in the name of Russia.  They shower us with sewage but never print our denials, that’s their idea of freedom.  And any newspaper that stands up for the government is called reptilian or said to be on the government payroll” (p. 902).  Joining the left-leaning press were the nation’s teachers.  “There is in Russia some sort of ‘education league,’ teeming with hundreds and thousands of teachers.  But what does ‘education’ mean to them?  To them there is nothing sacred in Russia, it has no historic rights, no natural foundations.  They hate everything Russian, everything Orthodox, everything that goes back into the depths of time.  Education, to them, means revolution” (pp. 903-904).  Thereafter, as Solzhenitsyn notes in his recently published, autobiographical Between Two Millstones, at the beginning of the 20th century Russia witnessed “a powerful student movement” whose “consequences . . . were horrific.  Everything they did was from the purity of their hearts, but they lacked any civic experience and ended up being engulfed by theories of revolution and violence” (Millstones, #2165). 

It became evident, with the publication of August 1914, that Solzhenitsyn was a conservative—both a Russian patriot and an Orthodox Christian—who treasured much about the “old Russia,” despite its deeply-flawed Tsarist authoritarianism.  So he soon lost support in liberal circles in both Russia and the West.  Even Secretary of State Henry Kissinger “for a long time prevented the Voice of America from broadcasting me, and the BBC and Radio Free Europe were also beginning to avoid me as an ‘authoritarian figure,’ which was how I was being portrayed after my Letter to the Soviet Leaders” (#4165).  In particular he defended Pyotr Stolypin, the Russian prime minister from 1906 to 1911 who had sought to bring into being a “solid class of peasant proprietors,” convinced that they could support and preserve a constitutional monarchy of some sort.  Unfortunately, Stolypin was assassinated in Kiev in 1911; he was probably the last hope for a “conservative liberal” regime that might have avoided the revolutionary chaos that subsequently ruined the nation. 

For the many Americans interested in Solzhenitsyn but uninterested in his lengthy, ponderous, history-laden literary works, Edward E. Ericson, Jr. and Daniel J. Mahoney have assembled The Solzhenitsyn Reader:  New and Essential Writings, 1947-2005 (Washington:  ISI Books, c. 2006) and provided helpful editorial comments on all of his works.  “The purpose of this book,” they say, “is to make the broad sweep of Solzhenitsyn’s remarkable oeuvre available to English-speaking readers” (p. xv), emphasizing that “Amid the exceptional flux of his life, one thing remained constant:  He remained committed to exploring the subject he had chosen in youth as the topic of his magnum opus, namely, the Bolshevik Revolution and its causes” (p. xvi).  Importantly, The Solzhenitsyn Reader also contains non-fictional pieces, including some poetic, deeply religious musings such as his autobiographical Acathistus:  “When, oh when did I scatter so madly / All the goodness, the God-given grains? / Was my youth not spent with those who gladly / Sang to You in the glow of Your shrines? / Bookish wisdom, though, sparkled and beckoned / And it rushed through my arrogant mind, / The world’s mysteries seemed within reckon, / My life’s lot like warm wax in the hand.  / My blood seethed, and it spilled and it trickled, / Gleamed ahead with a multihued grace, / Without clamor there quietly crumbled / In my breast the great building of faith.  / Then I passed betwixt being and dying, / I fell off and now cling to the edge, / And I gaze back with gratitude, trembling, / On the meaningless life I have led.  / Not my reason, nor will, nor desire / Blazed the twists and turns of its road. / It was purpose-from-High’s steady fire / Not made plain to me till afterward.  / Now regaining the measure that’s true, / Having drawn with it water of being, / Oh great God!  I believe now anew! / Though denied, You were always with me. . . .”  (p. 21). 

Solzhenitsyn came to the world’s attention with the publication of One Day in the Life of Ivan Denisovich (a fictional depiction of his experience in Stalin’s prison camps) during the “Khrushchev thaw.”  He was briefly a celebrity in his native land and welcomed by the state-controlled literary establishment.  In time he would be awarded the Nobel Prize for Literature, something he deeply appreciated simply because it enabled him to survive as a writer.  Before long, however, he encountered mounting governmental opposition.  So he began recording his struggles in a work entitled The Oak and the Calf:  Sketches of Literary Life in the Soviet Union (New York:  Harper Colophon Books, c. 1975).  He began by confessing:  “For the writer intent on truth, life never was, never is (and never will be!) easy:  his like have suffered every imaginable harassment” (p. 1).  Knowing this, Solzhenitsyn “entered into the inheritance of every modern Russian writer intent on the truth:  I must write simply to ensure that it was not all forgotten, that posterity might someday come to know of it” (p. 2).  He committed himself to writing because he believed “the Soviet regime could certainly have been breached only by literature.”  No military coup or political movement could begin to challenge Stalin’s brutal dictatorship.  “Only the solitary writer would be able” to effectively oppose it, simply because “one word of truth outweighs the world.”  Thanks to his international status, Solzhenitsyn continued working for several years, though little he wrote would be published in Russia.  But when he documented Stalin’s massive slave labor system in the three-volume The Gulag Archipelago (first published only in the West) he was expelled from Russia in 1974.  As he departed, he left behind a short, memorable message to his people:  “Live Not by Lies!” 

After spending some time in Switzerland, Solzhenitsyn ultimately settled in Vermont’s mountains, near the village of Cavendish, in 1976.  Here he tried, inasmuch as possible, to create a little Russian outpost wherein he could continue his artistic/historical work.  He also granted interviews and delivered lectures, many of them reprinted in his Warning to the West.  Surprisingly he was not delighted by all things Western!  In 1978 he delivered the commencement address at Harvard University, entitled A World Split Apart.  He began his speech abrasively, noting that though Harvard’s motto is Veritas, graduates should know that “truth seldom is sweet; it is almost invariably bitter” (p. 1).  But he resolved to speak Veritas anyway!  And his words proved “bitter” to many who heard him!  After assessing various developments around the world, he questioned the resolve of the West to deal with them.  Unfortunately, he said, “A decline in courage may be the most striking feature that an outside observer notices in the West today.  The Western world has lost its civic courage, both as a whole and separately, in each country, in each government, in each political party, and, of course, in the United Nations.  Such a decline in courage is particularly noticeable among the ruling and intellectual elites, causing an impression of a loss of courage by the entire society” (pp. 9-11).  This decline, “at times attaining what could be termed a lack of manhood,” portended a cataclysmic cultural collapse.  

He especially upbraided the media.  Granted virtually complete “freedom,” journalists in the West used it as a license for irresponsibility.  Rather than working hard to discover the truth, they slipped into the slothful role of circulating rumors and personal opinions.  Though no state censors restricted what was written, “fashionable” ideas got aired and the public was denied free access to the truth.  Fads and fantasies, not the illumination of reality, preoccupied the mainstream media.  “Hastiness and superficiality—these are the psychic diseases of the twentieth century and more than anywhere else this is manifested in the press” (p. 27).  Consequently, “we may see terrorists heroized, or secret matters pertaining to the nation’s defense publicly revealed, or we may witness shameless intrusion into the privacy of well-known people according to the slogan ‘Everyone is entitled to know everything’” (p. 25). 

Politicians who appeased Communism especially elicited Solzhenitsyn’s scorn.  Appraising America’s recent withdrawal from Vietnam, he declared the antiwar agitators were “accomplices in the betrayal of Far Eastern nations, in the genocide and the suffering today imposed on thirty million people there.  Do these convinced pacifists now hear the moans coming from there?  Do they understand their responsibility today?  Or do they prefer not to hear?  The American intelligentsia lost its nerve and as a consequence the danger has come much closer to the United States.  But there is no awareness of this.  Your short-sighted politician who signed the hasty Vietnam capitulation seemingly gave America a carefree breathing pause; however a hundredfold Vietnams now looms over you” (p. 41).  He envisioned an imminent “fight of cosmic proportions,” a battle between the forces of Good and Evil.  Two years before Ronald Reagan was elected President, Solzhenitsyn insisted that only a moral offensive could turn back the evil empire. 

Cowardice had led Americans to retreat in Southeast Asia.  Indeed, democracies themselves, Solzhenitsyn feared, lack the soul-strength for sustained combat.  Wealthy democracies, especially, become flaccid.  “To defend oneself, one must also be ready to die; there is little such readiness in a society raised in the cult of material well-being.  Nothing is left, in this case, but concessions, attempts to gain time, and betrayal” (p. 45).  More deeply, the “humanism” that has increasingly dominated the West since the Renaissance largely explains its weakness.  When one believes ultimately only in himself, when human reason becomes the final arbiter, when human sinfulness is denied, the strength that comes only from God will dissipate.  Ironically, the secular humanism of the West is almost identical with the humanism of Karl Marx, who said:  “communism is naturalized humanism” (p. 53). 

Consequently, he said, “If the world has not approached its end, it has reached a major watershed in history, equal in importance to the turn from the Middle Ages to the Renaissance.  It will demand from us a spiritual blaze; we shall have to rise to a new height of vision, to a new level of life, where our physical nature will not be cursed, as in the Middle Ages, but even more importantly, our spiritual being will not be trampled upon, as in the Modern Era” (pp. 60-61).  The Harvard address ended Solzhenitsyn’s speaking career in the United States.  The nation’s elite newspapers—the New York Times and Washington Post—thenceforth ignored him.  Prestigious universities, such as Harvard, slammed shut their doors.  He became something of a persona non grata and spent his remaining years in America living as a recluse, working industriously on manuscripts devoted to Russian history. 

He also wrote a personal memoir—much like The Oak and the Calf—recording his observations while living in the West.  Entitled Between Two Millstones, Book 1:  Sketches of Exile, 1974–1978 (Notre Dame, IN:  University of Notre Dame Press, c. 2018), it provides us, says Daniel J. Mahoney, “one of the great memoirs of our time.”  To Donald Rumsfeld, it “is an indispensable part of history,” a “lasting testimony to his unbending moral courage, his persistence, and his persuasiveness—all of which helped bring down Communism.”  The two “millstones” grinding away at him refer to the dictatorial, dehumanizing regime in Russia and the vapid Western “freedom” that proffered little meaning for mankind.  In America, Mahoney says:  “He had a new tension-ridden mission:  to write with force, clarity, and artfulness about the Russian twentieth century while doing his best to warn the West about the pitfalls of a free society caught up in the cult of comfort and increasingly unwilling to defend itself against the march of evil” (Kindle, #158).

This march of evil, he thought, gained considerable impetus from the media, which pounced on him as soon as he arrived in Zurich.  In the USSR the press was rigorously censored and thus untrustworthy.  In the West, the press was “free” but irresponsible and thus also untrustworthy!  Consequently, “from the very outset the Western media and I were not to be friends, were not to understand one another” (#332), for he “was completely aware of how careful one had to be not to throw oneself into the arms of the press, though I did not know how to take cover from their relentless siege” (#443).  Bewildered by their indifference to his privacy and message, he thought “their fly-by-night trade” consisted mainly in trying “to outdo one another in snooping, conjecturing, and snatching at whatever they can.”  Having just published The Gulag Archipelago, “my book about the perishing of millions,” he found journalists nastily “nipping at some puny weeds” regarding passing remarks he had recently made.  Angrily he declared:  “‘You are worse than the KGB!’  My words instantly resounded throughout the world.  So from my first days in the West I did much to ruin my relationship with the press; a conflict that was to continue for many years had begun” (#527). 

To his bewilderment, Western elites reacted negatively to his letters defending Orthodox Christianity and the war in Vietnam.  Consequently, “in the wake of all the recent enthusiasm came a flood of abuse from the Western press, an about-turn in just three weeks!  If they had at least read the letter carefully!  From the reviews and the invective, it quickly became clear that these newspapermen had not taken the trouble to read the letter in its entirety.  It was the first time that I had encountered such a thing, but dishonesty of this kind quickly proved to be a steadfast characteristic of the press.  The New York Times, which had refused to print my letter, was among the most violent critics” (#907).  Pressured to sit down for a TV interview, Solzhenitsyn reluctantly agreed to do so with Walter Cronkite on CBS.  “They came to our house with a noisy, well-equipped crew of about ten, the only shortcoming being that they had not brought with them competent translators.  I, too, was poorly prepared, not realizing who Walter Cronkite was, how left his leanings were, his questions bristling with hidden jabs, all about the Western media and my attitude toward it (which by now had become common knowledge), and also about the Russian émigré community” (#1324).  He was especially angered by articles claiming to be based on interviews with him that never happened—and by reporters who cherry-picked statements from interviews to advance their own views rather than truthfully report his. 

Yet another source of evil Solzhenitsyn discerned in the West was the harm being done by industrialization.  He anticipated Anthony Esolen’s recent defense of “people in the modern world struggling against the Leveling force of a technocratic and culture-dissolving state.”  Thus Solzhenitsyn:  “Today’s prosperous world is moving ever further from natural human existence, growing stronger in intellect but increasingly infirm in body and soul” (#581).  On a personal level, he craved pastoral regions.  Though he found some cities such as Zurich charming, he preferred to work in the Swiss countryside, where the industrious farmers’ work “strengthened the peace within my soul.”  Looking at the alpine scenery “every day, every morning—somehow cleanses the soul and clarifies one’s thoughts.  The simple act of standing and looking is already labor for the soul and the mind.  The task of evaluating one’s past and tracing out the future becomes easier” (#1484).  Struggling to settle into his writing routine, he found that “the grandeur and wisdom of this mountain place (almost as if a high mountain altar . . .) were soon to put me back in form” and he could resume his work on Russia (#1548).  Thus in time he found in Vermont a suitable environment wherein he could continue his literary work.  Returning to Russia in 1994 and dying in 2008, he did little to influence developments in his native land.  And he is today largely ignored by most Westerners.  But to those with ears to hear, he remains a lasting witness to what ought ultimately to concern us.

315 Culture of Fear

One of the more amazing contemporary phenomena—despite our very evident safety and comfort—is the pervasive insecurity and fragility identifiable in various segments of the West.  As the Norwegian philosopher Lars Svendsen says:  “a paradoxical trait of the culture of fear is that it emerges at a time when, by all accounts, we are living more securely than ever before in human history.”  Aware of this, Pope John Paul II frequently encouraged believers to “fear not”—for that biblical phrase, reiterated by the angels announcing Jesus’s coming, indicates the importance of courage in the Christian tradition.  (Indeed, some 365 times the Bible says “be not afraid!”)  But with the waning of Christendom, courage too seems sidelined.  Thus Alexander Solzhenitsyn said (in his 1978 Harvard Commencement Address):  “A decline in courage may be the most striking feature that an outside observer notices in the West today.  The Western world has lost its civic courage. . . .”  Prophetically, he warned:  “Must one point out that from ancient times a decline in courage has been considered the first symptom of the end?” 

Courage, traditionally understood, enables one to conquer his fears, and most of us admire it—at least in theory.  “But in everyday practice,” Frank Furedi says in How Fear Works, “we have become estranged from this ideal and do very little to cultivate it.”  It has frequently, in fact, been “downsized” and even extended to assorted self-help endeavors!  Rather than a moral virtue best evident on the field of battle, it has turned into a therapeutic suggestion.  Thus we commend the “courage” of suffering poor health or recovering from romantic distress or speaking in public.  “The classical virtue of courage rooted it within moral norms that emphasized responsibility, altruism and wisdom.  The twenty-first-century therapeutic version is not based on an unshakable normative foundation; it has become disassociated from moral norms and is adopted instrumentally as a medium for achieving wellness” (#3040).

This cultural shift is generally justified by the necessity of “worst-case thinking” and the “Precautionary Principle,” a philosophical rationale “systematically outlined in the works of the German philosopher Hans Jonas, whose influential 1979 text The Imperative of Responsibility advocated the instrumental use of fear—what he calls the ‘heuristic of fear’—to promote the public’s acceptance of a dreadful view of the future.  Jonas offers what he perceives to be an ethical justification for promoting fear, which is that through its application, this emotion ought to be used to avoid humankind’s infliction of an ecological catastrophe on the planet” (#2749).  Jonas propounded “a teleology of doom based on the premise that modern technology threatens the world with an imminent threat of disaster” (#2756).  Among the intelligentsia he is “something of a philosophical saint” revered for his ecological sensitivities.  However, Furedi warns:  “his promotion of the principle of fear, his elitist contempt for people, and his advocacy of deception and tyranny, are rarely held to account” (#2798).

Inasmuch as courage is rooted in moral convictions, the increased fear in our society indicates a loss of moral certitude.  This phenomenon was diagnosed by Frank Furedi in his 2016 work, What’s Happened to the University?  A Sociological Exploration of Its Infantilisation.  The author began his academic life as a student in 1965 and is now Emeritus Professor of Sociology at the University of Kent in the UK.  In his own student days universities were open to new ideas and touted the virtues of free speech and debating ideas.  As the decades passed, however, they became “far less hospitable to the ideals of freedom, tolerance and debate than in the world outside the university gate.”  They became fearful!  Students now seek to ban books that threaten their vulnerable psyches and protest speakers who might offend a spectrum of sexual and ethnic groups.  The free speech mantras of the ‘60s have morphed into speech codes; the former devotees of free speech have frequently become, as tenured professors, enforcers of censorship.  Many teachers forego the use of red pens to mark papers lest they damage fragile students’ egos, and “safe spaces,” “trigger warnings,” “microaggressions” and “chill out rooms” (replete with play dough and “comfort” animals to relieve anxieties) indicate how many universities have in fact become infantilized.

Two decades ago Furedi published Culture of Fear:  Risk Taking and the Morality of Low Expectations, arguing that moral confusion had hollowed out Western culture, making persons both increasingly less able to deal with risk and uncertainty and less positive about human nature and man’s ability to aspire and adventure.  Now he has revisited the subject in How Fear Works:  Culture of Fear in the Twenty-First Century (London:  Bloomsbury Publishing, c. 2018; Kindle Edition).  To illustrate his thesis Furedi notes:  “Even an activity as banal as forecasting the weather has been transformed into a mini-drama through adopting a rhetoric that inflates the threat posed by relatively normal conditions.  Routine occurrences like storms, heavy snowfall or high temperature have been rebranded as extreme weather by the media.”  Indeed:  “The term ‘extreme weather’ is a paradigmatic culture of fear expression” and is, strangely enough, “often interpreted through a moralistic narrative that presents it as the inevitable outcome of irresponsible human behaviour” (#338).  Summing up his study, he says “society has unwittingly become estranged from the values—such as courage, judgement, reasoning, responsibility—that are necessary for the management of fear” (#580).

In the past, many of our fears were restrained by religious faith, the confidence that some things were eternally true and worth risking—or even giving—one’s life to secure.  “Religion has always been interwoven with guidelines about what and what not to fear.  Secular fear appeals concerning health, the environment, food or terrorism continue this tradition and are also often conveyed through a moral tone.  However, in the absence of a master-narrative that endows the unknown and the threat it poses with shared meaning, people’s response to threats has acquired an increasingly confusing and arbitrary character” (#1875).  Thus as we enter the 21st century “a pessimistic teleology of doom pervades the public deliberations on this subject” (#1202).  Every hurricane elicits warnings regarding climate change—as do arctic cold fronts, volcanic eruptions, and earthquakes!  No solid evidence or logical analysis is required to stoke the fears of folks immersed in our media world.  Think for a moment about the current Socialist superstar in Congress, Alexandria Ocasio-Cortez, who solemnly says we only have 12 years to save the planet!  Such somber predictions of environmental collapse (following the pattern set by Rachel Carson 50 years ago in Silent Spring) are often accompanied by warnings of a global demographic time bomb (confidently sounded by Paul Ehrlich in his now thoroughly discredited The Population Bomb). 

Consider the outlandish rhetoric of many social justice warriors!  Former President Jimmy Carter, for example, recently published a book entitled A Call to Action: Women, Religion, Violence and Power and grandly declared that right now (today!) slavery is “a ‘serious problem in the US’” and is even “‘more prolific now than during the eighteenth and nineteenth century.’”  It is, however, invisible!  Somehow Carter just knows it’s there, unseen and insidious.  “Like the hidden toxins ‘playing their tricks’ . . . modern slavery is not visible to the eye.  Typically, its hidden victims are said to be invisible and, therefore, the number of cases that have been actually detected are only the ‘tip of the iceberg.’”  To the former president “the transatlantic slave trade, which was responsible for the brutal enslavement of 12 to 15 million Africans, is merely a less prolific version of the ‘modern’ variety of the twenty-first century” (#1932).   This mantra is also recited by Jeff Nesbit, a former White House communications director, who said:  “‘No one knows the numbers.  That’s what’s so scary!’” (#1940).  To which Furedi retorts:  what’s scary is the fact that highly influential men such as Carter and Nesbit knowingly spread baseless falsehoods!

Then we’re fed alarming reports of rampant obesity and of children facing a barrage of threats to their well-being.  “In most Western societies, the population is healthier and lives longer than in previous times.  The latest generation of young people is likely to live 20 years longer than their grandparents.  Yet there has never been so much propaganda warning the public about yet another danger to its health” (#1736).  It’s apparently even risky to drink tap water!  “There was a time when people did not walk around holding different brands of bottled water in their hands; they drank tap water unless they lived in areas where tap water was considered to be unsafe, in which case water was boiled.”  But we now see people everywhere “clutching their bottles of water,” gripped by fears of contaminants of some sort.  “In 2016, bottled-water consumption in the US reached 39.3 gallons per person.”  This is done despite the fact “that the fears directed at tap water are not based on an objective evaluation of the risks of drinking it.  From a health perspective, the consumption of bottled water makes little sense.  Unfortunately, the sensible message that tap water is in most places safe to drink and that paying for the bottled variety is unnecessary is often distorted through a narrative of fear.  Instead of merely stating ‘Let’s get real and drink tap water’, opponents of the bottled-water fad frame their argument through the perspective of fear.”

As one might expect from a sociologist, Furedi is most helpful when compiling data and describing problems.  He clearly demonstrates the pervasive fears stalking contemporary society.  And he clearly shows how the lack of courage contributes to their currency.  But while he recognizes the need for the moral virtues, courage included, he fails to acknowledge the deeper philosophical or theological foundations necessary to form courageous persons. 

      * * * * * * * * * * * * * * * * * * * * * * * * * *

In The Coddling of the American Mind:  How Good Intentions and Bad Ideas Are Setting Up a Generation for Failure (New York:  Penguin Publishing Group, c. 2018; Kindle Edition), Greg Lukianoff and Jonathan Haidt stress the harm done children by teachers and parents excessively fearful for their safety.  The authors had become increasingly distressed by the onerous “speech codes” hindering free thought and expression on university campuses.  “Something began changing on many campuses around 2013, and the idea that college students should not be exposed to ‘offensive’ ideas is now a majority position on campus” (p. 48).  The “rationale for speech codes and speaker disinvitations,” once limited to racist or sexist declarations, “was becoming medicalized:  Students claimed that certain kinds of speech—and even the content of some books and courses—interfered with their ability to function.  They wanted protection from material that they believed could jeopardize their mental health by ‘triggering’ them, or making them ‘feel unsafe’” (p. 6).  To address their concerns Lukianoff and Haidt first wrote a widely-discussed article for The Atlantic and then, subsequently, this book to unmask three fashionably propagated “Great Untruths”:  1) “The Untruth of Fragility”—the notion that stress or discomfort harms you; 2) “The Untruth of Emotional Reasoning”—the injunction to disavow reason and “always trust your feelings”; and 3) “The Untruth of Us Versus Them”—the warning that evil people continually seek to damage you.  Consequently the authors say:  “We will show how these three Great Untruths—and the policies and political movements that draw on them—are causing problems for young people, universities, and, more generally, liberal democracies” (p. 4). 

To illustrate the falsity of fragility, Lukianoff and Haidt point out how parents trying to protect their youngsters from peanut allergies actually endanger them by preventing children’s powerful immune systems from properly developing.  A careful study revealed:  “Among the children who had been ‘protected’ from peanuts, 17% had developed a peanut allergy.  In the group that had been deliberately exposed to peanut products, only 3% had developed an allergy.  As one of the researchers said in an interview, ‘For decades allergists have been recommending that young infants avoid consuming allergenic foods such as peanut to prevent food allergies.  Our findings suggest that this advice was incorrect and may have contributed to the rise in the peanut and other food allergies’” (p. 21).  Indeed, as Nassim Nicholas Taleb says in Antifragile:  “Just as spending a month in bed . . . leads to muscle atrophy, complex systems are weakened, even killed, when deprived of stressors.  Much of our modern, structured, world has been harming us with top-down policies and contraptions . . . which do precisely this:  an insult to the antifragility of systems.  This is the tragedy of modernity:  as with neurotically overprotective parents, those trying to help are often hurting us the most” (p. 23). 

That human beings—homo sapiens—should renounce reason and trust their feelings is similarly untrue.  Though pop psychologists and media personalities may urge it, trusting your feelings flagrantly contradicts “much ancient wisdom.”  Whether we ponder Epictetus or Buddha, Shakespeare or Milton, the best thinkers have insisted we think rather than feel.  Consult, for example, Boethius’ The Consolation of Philosophy, once one of the basic texts for the liberal arts, wherein he praises “Lady Philosophy,” who “chides him gently for his moping, fearfulness, and bitterness at his reversal of fortune” before helping “him to reframe his thinking and shut off his negative emotions.  She helps him see that fortune is fickle and he should be grateful that he enjoyed it for so long.  She guides him to reflect on the fact that his wife, children, and father are all still alive and well, and each one is dearer to him than his own life.  Each exercise helps him see his situation in a new light; each one weakens the grip of his emotions and prepares him to accept Lady Philosophy’s ultimate lesson:  ‘Nothing is miserable unless you think it so; and on the other hand, nothing brings happiness unless you are content with it’” (p. 35).  Wise words for all ages!

The “us vs. them” untruth has gained currency to a large degree because of identity politics.  When race becomes the key to your identity you easily suspect racism in anyone who differs from you.  When sex defines you, you easily accuse others of sexism when you feel dissatisfied.  A widely-discussed incident at Yale illustrated this.  Erika Christakis, a lecturer at the Yale Child Study Center, responded to an administrative edict regarding Halloween costumes.  She approved concerns for “avoiding hurt and offense,” but “she worried that ‘the growing tendency to cultivate vulnerability in students carries unacknowledged costs.’”  Rather than issue behavioral rules, she suggested:  “‘Free speech and the ability to tolerate offense are the hallmarks of a free and open society’” (p. 57).  Her rather mild email aroused angry students who protested and denounced her for racial insensitivity.  The university president sided with the aggrieved students, and in time Erika resigned from her position.  So goes “academic freedom” in modern America!

As was evident at Yale, intimidation and violence are manifestations of the coddling of the American mind!  Once speech students find objectionable is defined as “hate speech,” it is easy to insist it is a form of violence.  And in response to violence self-defense is justified.  So conservative speakers on university campuses are not only shouted down but physically attacked.  Witch-hunts are employed to root out dissenters on campus.  When a liberal biology professor at Evergreen State College refused to approve a campus shutdown to show solidarity with people of color, students demanded he be fired.  Successfully intimidating the college president, “students chanted, ‘Hey hey/ho ho/these racist faculty have got to go’” (p. 117).  “President Bridges, who at the beginning of the school year had criticized the University of Chicago for its policy protecting free speech and academic freedom, agreed to many of the protesters’ demands.  He announced that he was ‘grateful’ for the ‘passion and courage’ the protesters displayed, and later, he hired one of the leaders of the protests to join his Presidential Equity Advisors” (p. 119).  Most everything that’s wrong with the modern university stands starkly revealed at Evergreen State College!

Having described the “coddling of the American mind,” the authors turn to explaining how it came to be and set forth “six interacting explanatory threads,” beginning with “rising political polarization and cross-party animosity.”  Political positions no longer reflect a positive agenda, rooted in tradition and reflection; rather they are too often fueled by angry disdain for perceived enemies.  Secondly, they point out the importance of “rising levels of teen anxiety and depression.”  An alarming, and very recent, increase in teenage depression and suicide clearly constricts the passage from adolescence to adulthood.  Data recently collected from 139 colleges indicate that “half of all students surveyed reported having attended counseling for mental health concerns” (p. 156).  Importantly, some persuasive studies especially stress the negative role electronic devices play in the lives of our young. 

Thirdly, “changes in parenting practices” or “paranoid parenting” clearly contribute to the malady.  The “permissive parenting” associated with Dr. Spock has morphed into the “intensive parenting” now dominant.  Responding to perceived threats to their children—such as being abducted by strangers, something that happens fewer than 100 times a year—parents overreact.  Though seat belts and bicycle helmets have certainly made children’s lives safer, “efforts to protect kids from risk by preventing them from gaining experience—such as walking to school, climbing a tree, or using sharp scissors—are different.  Such protections come with costs, as kids miss out on opportunities to learn skills, independence, and risk assessment” (p. 169).  Fourthly, there has been a “decline of free play,” something absolutely necessary for childhood development.  A child’s brain needs “thousands of hours of play—including thousands of falls, scrapes, conflicts, insults, alliances, betrayals, status competitions, and acts of exclusion—in order to develop.  Children who are deprived of play are less likely to develop into physically and socially competent teens and adults” (p. 183).  Unfortunately, school children are less likely to have physical education classes or recess.  And rather than learning to play ball with neighborhood kids—and to choose teams and referee the game—kids are shoved into organized leagues with uniforms and trophies and assorted adult paraphernalia irrelevant to healthy personal development. 

Fifthly, once in the university students face a burgeoning “campus bureaucracy” devoted to ensuring their comfort and security.  Thus we find the president of Louisiana State University declaring:  “‘Quite frankly, I don’t want you to leave the campus ever.  So whatever we need to do to keep you here, we’ll keep you safe here.  We’re here to give you everything you need’” (p. 199).  Such protective “safetyism” increasingly extends to emotional as well as physical well-being.  Students must be shielded from “microaggressions,” given “trigger warnings” when scary subjects are to be broached, and supplied with “safe spaces” suitable for children.  Finally, students are immersed in “a rising passion for justice in response to major national events, combined with changing ideas about what justice requires” (p. 125).  They then become “social justice warriors” determined to eliminate inequalities and inequities wherever possible.  Little concerned with distributive or procedural notions of justice, they are increasingly devoted to “equal-outcomes social justice,” even if they trample on important concepts such as “innocent until proved guilty.”

Concluding their treatise with a section titled “wising up,” Lukianoff and Haidt first proffer advice for parents who want to rear “wiser, stronger, and antifragile” kids who will become self-reliant adults.  Giving them lots of time for “free play,” encouraging them to walk or bike to school, and placing limits on the time they spend with electronic devices, including television, are important aspects of their prescription.  And for “wiser” universities they urge a return to the vigorous pursuit of truth once considered essential for liberal arts education.  Rather than promoting “social justice,” universities should urge persons to think and speak freely, embracing Benjamin Franklin’s vision in founding the University of Pennsylvania:  “‘Nothing is of more importance to the public weal, than to form and train up youth in wisdom and virtue.  Wise and good men are, in my opinion, the strength of a state:  much more so than riches or arms, which, under the management of Ignorance and Wickedness, often draw on destruction, instead of providing for the safety of a people’” (p. 269).