337 Pandemic Panic

Anyone saying “listen to the science” actually means “listen to the scientists I endorse,” for it’s scientists who speak and they often do so discordantly.  Thus we must discern which of them best explains what actually is—what’s the truth of things.  It’s up to us to check their data and evaluate their logic.  Doing so means we’ll frequently find courageous dissenters more persuasive than the proponents of a reigning consensus.  For nearly a year we’ve endured what’s arguably the worst pandemic since the Spanish flu, watching COVID-19 kill multiplied thousands of people and upend economies and polities around the world.  Getting accurate information, attending to responsible authorities, and coming to personal conclusions regarding it have been, to say the least, quite challenging.  But there is no good reason, when evaluating hydroxychloroquine as a therapeutic medication, to trust a bureaucrat such as Anthony Fauci rather than 6,000 practicing physicians effectively using it to treat stricken patients.  Why not take seriously a distinguished Yale epidemiologist, Harvey Risch, with scores of scholarly publications, who thinks Fauci’s refusal to approve hydroxychloroquine caused many thousands of deaths?  Why not follow the 43,000 epidemiologists and health care professionals who signed “The Great Barrington Declaration,” proposing we aggressively protect the most vulnerable groups while allowing the rest of the country to return to normalcy?  Thus I’ll review several publications which all, to one degree or another, affirm the gravity of the pandemic while questioning the policies crafted by our public health and political leaders.

Alex Berenson, an experienced investigative reporter who worked 10 years for the New York Times, was initially persuaded the country “might face an outbreak that would kill millions of Americans and potentially destabilize the nation” (p. 1).  So he stockpiled food and purchased N95 masks, preparing to survive the perilous times the experts predicted.  Then, as an investigative reporter curious concerning dubious data, he began checking the evidence and detailing his findings in a series of self-published booklets, beginning with Unreported Truths about COVID-19 and Lockdowns: Part 1: Introduction and Death Counts and Estimates (Kindle Edition, 2020).  (One benefit of publishing his findings in an ebook is this:  virtually every paragraph links the reader to the current, scholarly, in-depth, chart-studded, on-line publications he cites.)  Berenson quickly found that London’s Imperial College, in concert with the World Health Organization, had early on “terrified politicians around the world and spurred what became a nearly universal lockdown.”  When he carefully perused their “research,” however, its quality shocked him, and he began publishing his findings.  Initially the only outlet he had was his Twitter account, with just 10,000 followers.  But his posts attracted the attention of billionaire Elon Musk, who retweeted one of them.  “Suddenly I found myself as one of the few people with any journalistic standing challenging the apocalyptic reporting that dominated media outlets like the [New York] Times.”

The more scholarly studies he read the more skeptical he became—not regarding the virulent virus but the policies enacted to curtail it.  He quickly discovered the coronavirus “was more than 100 times as likely to kill people over 80 than under 50.  Yes, 100 times.  People under 30 were at very low risk.”  The median age of those dying was in the low 80s.  Still more, as is true of pneumonia, the elderly who were dying would quite probably have died within another year because of their other ailments.  So why, he wondered, enact shutdown policies harming whole populations rather than protect the most vulnerable?  Why shut down schools when children were almost never harmfully infected?  It was also clear, early on, that the virus was significantly less deadly than advertised—far less than the scare-mongering media proclaimed!  He saw how wildly exaggerated were the forecasts rendered by both medical “experts” and the politicians who quoted them.  They also changed their stories!  We were first told we needed to take extreme measures in order to “flatten the curve” and then informed even that was not sufficient—we needed to “stop the spread” of the disease!  The allegedly infallible officials had crafted simulations that dramatically failed in virtually every way!  Nevertheless, shutdowns were mandated and masks prescribed.

Berenson followed up his initial publication with Unreported Truths about COVID-19 and Lockdowns: Part 2: Update and Examination of Lockdowns as a Strategy (Blue Deep, Inc., Kindle Edition, c. 2020).  As COVID-19 cases spiked last summer, many governors decreed draconian lockdowns of all but “essential” activities.  “What went all-but-unnoticed in the push for lockdowns was the fact that major public health organizations had for decades rejected them as a potential solution to epidemics.”  Both the Centers for Disease Control and the World Health Organization had earlier published lengthy guides dealing with influenza, citing ample “laboratory studies, clinical trials, and real-world evidence.”  They had counseled against lockdowns because they consistently proved ineffective!  Nothing in the past had effectively throttled, much less stopped, the spread of influenza epidemics.  So why would anyone think COVID-19 would be different?  Just because!  The “experts” just claimed it must be!  What happened, Berenson thinks, is this:  “Faced with a risk of hundreds of thousands or millions of deaths, the public health experts who for decades had counseled patience and caution flinched.  They found they could not live with acknowledging how little control they or any of us had over the spread of an easily transmissible respiratory virus.  They had to do something—even if they had been warning for decades that what they were about to do would not work and might have terrible secondary consequences.”  And this, I think, is the heart of the issue:  we’ve grown so accustomed to controlling our environment—and relying on the government to do things for us—that we cannot acknowledge some things are beyond our control!

Just recently Berenson has issued Unreported Truths About Covid-19 and Lockdowns: Part 3: Masks (Blue Deep, Inc., Kindle Edition).  He sincerely wishes masks worked!  They would, indeed, afford significant relief from the pandemic killing so many of us.  “But they don’t.  Not the ordinary cloth and surgical masks that nearly everybody wears, anyway.  Despite everything the media and public health experts have told you, they don’t work.  More accurately, we have no real evidence they do—and plenty of evidence they don’t.”  The World Health Organization had once stated it clearly:  the “WHO stands by recommendation to not wear masks if you are not sick or not caring for someone who is sick.”  Yet these same health experts insist we wear them and Joe Biden proposes to require them of everyone.  Massive numbers of us have mutely complied!  Why did 85 per cent of those infected insist they either always or nearly always wore masks?  As the virus still spreads we’re entitled to ask:  “How can that be, if masks work?”

“The answer is,” Berenson says, “that the evidence that face coverings do any good turns out to be even more porous than masks themselves.”  To understand why, we need to delve into the details regarding droplets, aerosols, and viruses.  A mask may well arrest the movement of a droplet (which may carry a virus) but is much too porous to stop an aerosol (which also may carry a virus).  Only what is called a respirator (the N95s used in medical facilities) effectively stops aerosols and the viruses they carry.  Most of the particles we breathe in and breathe out are tiny.  Inasmuch as “80–90% of droplets were smaller than 1 μm [micron],” “masks have almost no chance of catching most of the particles we exhale.”  One of the scholarly studies Berenson cites “combined the results of the 10 trials into a single ‘meta-analysis’—a review that looks at each study and figures out what they say as a whole.  Their conclusion—published in Emerging Infectious Diseases, a Centers for Disease Control journal—was straightforward:  ‘We did not find evidence that surgical-type face masks are effective in reducing laboratory-confirmed influenza transmission, either when worn by infected persons (source control) or by persons in the general community to reduce their susceptibility.’”

Just before Berenson published his booklet there appeared a “large randomized controlled trial that specifically examined whether masks protected their wearers from the coronavirus.”  It was published on Nov. 18 and covered almost 5,000 people in Denmark last spring.  “The trial was carefully designed and executed, with half the participants told to wear high-quality surgical masks . . . .  The other half were not asked to wear masks.  Participants were followed for a month to see if they had been infected with Sars-Cov-2.”  The study’s conclusion?  “Mask wearing ‘did not reduce, at conventional levels of statistical significance, the incidence of Sars-Cov-2 infection.’”  So why are we shamed (or bullied through threats of fines) into wearing masks?  Rather cynically Berenson suggests they help fuel the contagion of fear and sustain the illusion our rulers are doing something significant to save us.  “Masks are warnings none of us can escape.  This virus is different.  This virus is dangerous.  This virus is not the flu.  We had better hunker down until a vaccine is ready to save us all.  But the worst reason of all is that mask mandates appear to be an effort by governments to find out what restrictions on their civil liberties people will accept on the thinnest possible evidence.  They are the not-so-thin edge of the wedge.  Today, we must wear masks.  Tomorrow we’ll need negative Covid tests to travel between countries.  Or vaccines to go to work.”

                        * * * * * * * * * * * * * * * * * * * * * * 

In The Price of Panic: How the Tyranny of Experts Turned a Pandemic into a Catastrophe (Washington, D.C.:  Regnery Publishing, Kindle Edition, c. 2020), Jay Richards (a business professor at the Catholic University of America), Douglas Axe (a biology professor at Biola University), and William Briggs (an economist who’s published over a hundred scholarly papers) collaborate to evaluate the evidence and analyze the repercussions of the COVID-19 pandemic.  They endeavor “to sift prudence from propaganda.”  And they have, George Gilder says, “written the definitive account of the most egregious policy blunder in the history of American government.”  The authors acknowledge the lethality of the pandemic but are astounded at the concurrent, worldwide, destructive panic of medical experts and national leaders.

It all began, of course, when a deadly virus spread from China.  Then came doomsday forecasts, largely propounded by the World Health Organization, the Imperial College London, and the Institute for Health Metrics & Evaluation at the University of Washington.  The WHO relied on the Imperial College work, which “predicted the new coronavirus would be about as deadly as the Spanish flu of 1918 (which killed between 18 and 58 million).  They predicted the coronavirus would claim 40 million lives worldwide, including 2.2 million in the U.S., if nothing were done to slow the spread.  Forty million deaths?  Terrifying!” (p. 78).  “We now know these models were so wrong they were like shots in the dark.  After a few months, even the press admitted as much.  But by then vast damage had been done” (p. xiv).  Then the models’ proponents, rather than confessing and correcting their errors, “began to massage the data” and rationalize their declarations.  In this they were aided by a “gullible, self-righteous, and weaponized media that spread their projections far and wide.  The press carpet-bombed the world with stories about impending shortages of hospital beds, ventilators, and emergency room capacity.  They served up apocalyptic clickbait by the hour and the ton” (p. xv).  And social media websites promoted the fears by hyping the threats and censoring dissenting evidence concerning the pandemic’s true lethality.

For context, the authors provide a historical record of pandemics—running from the ancient world to the present.  Placed in perspective, the current pandemic is rather routine—something we could have absorbed as part of life and addressed aggressively with every medical resource.  They also remind us of the sheer inevitability of death.  Every day 1,700 people die of cardiovascular disease, 1,600 die of cancer, and nearly 700 die just from medical mistakes.  We’re accustomed to people dying—but dying of the new virus somehow became different!  That difference was the contagion of fear ignited by statistical projections!  Most of them predicted some 50 million deaths and such scary numbers naturally alarmed us all!  So we granted “emergency powers” to various authorities not because of “a catastrophe that had just happened, but rather a prediction about what might happen” (p. 17).  In New York, one of the very worst sites, experts predicted the city would need 140,000 hospital beds, yet “only about 18,500 were in use.  Many thousands of field-hospital beds that had been brought in by ship or laid out in temporary shelters sat empty” (p. 111).  Predictions failed astronomically!  What really happened was “the first pandemic of panic.”  (In our postmodern era, wherein we’re assured we “construct” rather than “discover” truth, such irrationality might be expected!)

Some of the panic was spurred along by semantic equivocations.  For instance, it was decided that anyone dying with the virus would be reported as dying from the virus!  The CDC reported that in only 7 percent of the victims was the virus the sole cause of death!  An Italian study of 355 COVID-19 victims showed that they “averaged 79.5 years of age and were in poor health.  More than a third had diabetes, and just under a third had ischemic heart disease.  A quarter had atrial fibrillation.  A fifth had active cancer, and over a sixth had either dementia or a history of stroke.  Of the 355 people, only three were in good health before catching the coronavirus” (p. 57).  Inflating numbers proved popular in the media, so positive tests were called “cases” and easily conflated with significant infections.

Richards, Axe, and Briggs carefully examine public health policies (i.e. lockdowns, distancing, masks) and show how problematic and potentially harmful they all are.  Countries or states that refused to lock down fared as well as those that did.  Copious charts and graphs fill the book, citing evidence and insisting we think logically.  Unfortunately, when we panic the “thinking parts of the brain stop functioning well” (p. 140).  We have no evidence the policies decreed by politicians actually helped curtail, much less vanquish, COVID-19.  Yet we have ample evidence of how they harmed great numbers of people (students and middle-aged adults who were hardly at risk of dying).  And the harms were enormous!  For example, though you’d never know it by watching the evening news, a United Nations agency warns disruptions in the world’s food supply chains could cause 300,000 deaths per day!  “In other words, more people around the world could die every two days from our response to the pandemic than those who died from the entire pandemic itself” (p. 139).

The unintended consequences of the lockdowns will soon become clearer as we understand the follies of the small group of narrowly-focused “experts” who misled us.  We failed to consider a basic economic precept:  “Highlighting unintended consequences is perhaps the greatest gift economics has given to humanity.  ‘There is only one difference between a bad economist and a good one,’ wrote French economist Frédéric Bastiat.  ‘The bad economist confines himself to the visible effect; the good economist takes into account both the effect that can be seen and those effects that must be foreseen.’  He explains:  ‘Yet this difference is tremendous; for it almost always happens that when the immediate consequence is favorable, the later consequences are disastrous, and vice versa. Whence it follows that the bad economist pursues a small present good that will be followed by a great evil to come, while the good economist pursues a great good to come, at the risk of a small present evil’” (p. 168). 

Though it’s little consolation for us now, we can learn from a small number of countries (Japan, Taiwan, Sweden) and states (South Dakota, Arkansas, Utah) which followed Bastiat’s prescription, thinking about the “great good to come” rather than the “small present evil.”  More epidemics are sure to follow, as the past decades show, so let us hope we will more wisely respond to the next one.  

* * * * * * * * * * * * * * * * * * * * * * * * 

  John Schroeter’s COVID-19: A Devil’s Choice (John August Media, Kindle Edition, c. 2020) garnered high marks from distinguished scientists who applauded its commitment to dealing carefully with scientific data.  He believes Anthony Fauci and other public health “experts” unwisely, “under the pretext of public health and safety, advocated extreme social isolation measures, i.e., near-universal lockdown, to forestall the spread of infection until a vaccine can be found.  This strategy has had the two-fold effect of 1) precluding natural herd immunity, and 2) devastating the life-sustaining economy, and thus imperiling the health and wellbeing of vastly larger numbers of persons than a coronavirus could ever conceivably inflict” (p. 8).  Differing from Fauci, many distinguished epidemiologists “have publicly stated that had we opted for herd immunity at the outset, the pandemic would already be behind us.  Instead, we remain trapped in an open-ended nightmare scenario that not only promulgates fear and misery, but actively seeks to silence dissenting voices.  These responses not only have nothing to do with public health and safety, they actually exacerbate the crisis, deepening its effect in both the short- and long-term via widespread collateral damage.  Could there, then, be another agenda at work?  Come now, let us reason together” (p. 9).

The book presents 100 “data-points”—short, succinct, factual arguments—grouped into sections.  To think rightly we must first put things in perspective, and we know that people constantly die of health problems (hypertension, obesity, diabetes, respiratory weaknesses, heart problems) that are exacerbated by COVID-19 infections.  Smokers are especially susceptible, and folks breathing filthy air “are twice as likely to die” as those who aren’t.  Then too, unfortunately, many Americans “live unhealthy lifestyles.”  Consequently:  “Dying with COVID-19 (correlation) is not the same as dying from COVID-19 (causation).  And yet, health officials are making no such distinction” (p. 21).  Inasmuch as nearly half of the American people are deficient in Vitamin D, encouraging them to take ample amounts of it would have been a small, preventive step toward helping the vulnerable cope with the virus.

We also need to acknowledge that epidemics never end until “herd immunity” is attained.  Any other measures are, frankly, illusions.  And you develop herd immunity by allowing the healthy to get infected, not by quarantining them!  So all the rhetoric about “flattening the curve” created an aura of managing the unmanageable.  It was little more than propaganda—much like that set forth by bureaucrats publishing “fire management policies” while the forests burn.  Flattening the curve, in fact, simply means delaying the dying.  Had we let the virus run its course “the pandemic would already be behind us, and the life-sustaining economy would be intact” (p. 17).  Indeed:  “Prior to the lockdown, according to antibody testing, herd immunity to COVID-19 was already well underway, and on its way to the necessary infection rate threshold in key populations.  Sadly, this process was interrupted by the ill-advised lockdown policies” (p. 19).  In fact, “more than two-thirds of newly reported COVID-19 cases are for those who have been sheltering in place!” (p. 29).  

Turning to the efficacy of masks, Schroeter cites “the declaration by the US National Academy of Sciences:  ‘Face masks are not designed or certified to protect the wearer from exposure to respiratory hazards.’  And yet, they are now being mandated for that very purpose.  Moreover, a number of studies have shown the inefficacy of the surgical mask to prevent transmission of viruses.”  This is because, as Dr. Rashid Buttar (who maintains a website on face masks:  https://www.askdrbuttar.com/facemask/) explains, “the viral particles we’re trying to keep out of our bodies are so much smaller than the smallest pores of these masks.  ‘It’s like using a chain link fence to prevent a fly from entering your house,’ he says, ‘or a split-rail fence to keep mice out.  If our goal is to make people healthy, the first thing we should be doing is telling them to not wear a mask.’”  Schroeter cites other scientific studies showing that masks (unlike respirators) simply don’t work to prevent respiratory influenza-like illnesses transmitted by droplets and aerosol particles.  “Dr. Lisa Brosseau, a nationally recognized expert on respiratory protection and infectious diseases and professor (retired), University of Illinois at Chicago, agrees.  ‘I don’t necessarily discourage the public from wearing them if it makes them feel comfortable, but I hope they don’t think that they’re protecting themselves’” (p. 57).

So it goes!  We’ve panicked at the pandemic and are living in virtuality, not reality.  

336 Leaving the Plantation

For three decades, supporting the civil rights movement, I believed significant advances had been made in both black and white communities, leading to a more just and harmonious society.  My hopes were fueled by evangelicals such as John Perkins (recently feted as World Magazine’s “Daniel of the Year”) who tenaciously embraced Martin Luther King’s teachings on non-violence and loving one’s enemies.  To Perkins, the fruit of the Spirit is gentleness—though it seems harder and harder to find in the streets or in Congress these days!  And he still holds, unlike the advocates and devotees of Black Lives Matter, that:  “There is no black race—and there is no white race.  So the idea of ‘racial reconciliation’ is a false idea.  It’s a lie.  It implies that there is more than one race.”  There may be ethnic groups with diverse traditions and perspectives, but they all have the same blood, human blood!  Yet my hopes in the path proposed by Perkins suffered a setback during the O.J. Simpson trial and acquittal, when I was shocked by both the verdict and the enthusiastic approval it garnered from the black populace.  Rooted in the black power movement and identity politics (earlier propounded by W.E.B. Du Bois and various Marxist-leaning activists), a different, markedly revolutionary vision seemed triumphant.  Whereas I’d thought blacks were committed to integration and pursuing the American dream, I then realized that many (if not most) of them still think, as did Martin Luther King in 1963:  “The Negro is still not free.”

Uneasy with many aspects of this revolutionary movement, I began to wonder if the optimistic laws and policies enacted in the ‘60s—the Civil Rights Act of 1964 and the Voting Rights Act of 1965—had unexpectedly harmed a clearly disadvantaged community.  At about the same time a guest speaker named Star Parker came to a Point Loma Nazarene University chapel.  She’d become something of a celebrity in southern California for both her personal story and political positions, radically challenging the civil rights establishment.  She had written an autobiographical book—Pimps, Whores, and Welfare Brats (c. 1997)—which I read and found illuminating.  Soon thereafter she published Uncle Sam’s Plantation:  How Big Government Enslaves America’s Poor and What We Can Do About It (Nashville:  Thomas Nelson Inc., c. 2003, rev. 2010), setting forth her public policy positions.

Parker grew up troubled and rebellious, stealing and doing drugs and engaging in promiscuous sex  leading to four abortions.  Her irresponsible lifestyle was subsidized by the maze of welfare programs which abetted her destructive behaviors.  In those years she routinely blamed racism for her predicament—though in time she came to identify her laziness as the real culprit.  Then, in her mid-20s, she encountered a pastor who challenged her to think and act differently, to take charge of her life and live responsibly.  She found a job and later became a self-employed publisher and radio host.  In the process she began to think through and share her markedly conservative, Christian views, thereby eliciting an unexpected tirade of abuse, including death threats from leftists (both black and white) who found her a threat to the status quo.  That various social support systems (welfare, affirmative action, public schools, health care, etc.) were failing meant nothing to her foes so long as they preserved what she came to see as “Uncle Sam’s Plantation.”  She saw that the material “poverty” addressed by various and infinitely-expanding federal programs (77 and counting!) was actually a symptom of the real poverty, which is preeminently spiritual and cultural.  “Government programs cannot help the broken poor because their poverty is in their heart and spirit” (p. 33).  Big Government agencies, allegedly helping the poor, have in fact reduced them to a passive dependency akin to slavery.  “Uncle Sam has developed a sophisticated poverty plantation, operated by a federal government, overseen by bureaucrats, protected by media elite, and financed by the taxpayers.  The only difference between this plantation and the slave plantation is perception” (p. 77).  On the governmental plantation the family has collapsed, the educational system failed, personal freedom eroded, and personal responsibility often disappeared.  

Rather than pour more trillions of dollars into this failing system, Parker calls for blacks to take charge of their lives and—above all—find what she found:  a vibrant faith in the living Lord Jesus Christ.   She’s left the plantation and believes her decision is the only realistic solution to racial problems in this country.   Hers may be the straight and narrow road that is the only way to the good life.  

* * * * * * * * * * * * * * * * * * * * * * * 

Very much in the mold of Star Parker, Candace Owens has recently written Blackout: How Black America Can Make Its Second Escape from the Democrat Plantation (New York:  Threshold Editions, Kindle Edition, c. 2020).  Owens has gained a sizable following for her YouTube podcasts, so she put her views in print in Blackout.  She insists the real problems in black America are moral rather than economic, and the Democrat Party, by singularly focusing on material factors, effectively suppresses what’s most needed—a recovery of personal responsibility and discipline, especially regarding the family.  But, unfortunately:  “If you are a black person in America today, your identity is as much defined by your skin color as it was more than a hundred years ago and quite similarly, for all the wrong reasons.  To be a black American means to have your life narrative predetermined:  a routine of failure followed by alleged blamelessness due to perceived impotence.  It means constant subjection to the bigotry of lowered expectations, a culture of pacifying our shortcomings through predisposition” (p. 2).

Consequently, Owens endeavors to “detail just why I believe the Democrat Party’s policies have led to the erosion of the black community by fostering a persistent victim mentality.  I will explain how a radicalized push for feminism is both emasculating and criminalizing men who are needed to lead strong families, and I will reveal the fallacy of socialism, in its inherent argument for the very same government that crippled black America in the first place.  Lastly, I will expose the inefficiency of the left-leaning public education system and tackle the media’s role in the collective brainwashing of our youth” (p. 11).  Her case is well-argued, filled with data as well as personal perspectives (emphasizing the positive role her hard-working, self-reliant grandparents played in her life), and worth reading simply to better understand a position that seems to be gaining some traction among younger blacks.  

* * * * * * * * * * * * * * * * * * * 

Vince Everett Ellison, following a career as a correctional officer, recently became involved in both politics and Christian ministry.  After supporting the Democrat party for most of his life he has recanted, persuaded that:  “Too many people become democrats because they want ‘FREE-STUFF’.  I’m a conservative because I demand ‘FREE-DOM’” (p. 47).  Subsequently he wrote The Iron Triangle: Inside the Liberal Democrat Plan to Use Race to Divide Christians and America in their Quest for Power and How We Can Defeat Them (Outskirts Press, Inc., Kindle Edition, c. 2020).  As is evident in the book’s subtitle, Ellison takes a jaundiced look at the Democrat Party, which he thinks has internalized the perverse prejudices of President Lyndon B. Johnson, who sought to give blacks (via the Civil Rights and Voting Rights Acts and the Great Society programs) “a little something, just enough to quiet them down.”  Consistently thereafter blacks (by a 90 per cent margin) have supported Democrat politicians, and, as Nancy Pelosi cynically noted while lofting a glass of water:  “This glass of water would win with a D next to its name in those districts.”  Liberal Democrats have, Ellison charges, deliberately betrayed Black America so as to get and maintain control of the nation.  They accomplished this “by infiltrating and then compromising the three foundational institutions of the Black community:  the Black preacher, Black civic organizations, and the Black politicians.  I call this trifecta:  ‘the Iron Triangle’” (p. 7).  Consequently the black community now looks as if it’s ruled by totalitarian socialists, featuring “one-party rule, forced compliance, extreme poverty, government dependency, and dictator worship” (p. 7).  This was an amazing tradeoff:  “Blacks gained the right to eat and use the bathroom beside Southern Whites while White Liberals gained control of a nation and potentially the world” (p. 7).  And they’re supporting a political party which, according to its platform, supports “government funding of child murder, the legality of sexual perversions and pornography, the legalization of illegal drugs, and the promotion of atheism” (p. 138).

In accord with Star Parker and Candace Owens, Ellison thinks the main problem in the Black community is the failure of individuals to take personal responsibility for their lives.  In part this is because black preachers, organizations, and politicians forever claim it’s “someone else’s fault” and that blacks are inevitably victims of racism (whether overt or covert) and worse off than they were decades ago.  Rather than helping blacks learn to thrive in America, their leaders have done little more than teach them how to protest.  Ironically:  “If protest brought about desired change, Black people would be the most successful race in the country and Asians would be the least.  Instead the converse is true” (p. 24).  Members of the Iron Triangle profit mightily while ordinary Blacks suffer poverty, crime-ridden neighborhoods, failing schools, drug abuse, and fractured families.  For fifty years trillions of dollars, multiplied marches, riots, and mayhem have accomplished nothing!

Ellison, a committed Christian, especially condemns the many black preachers who sell their souls for a bowl of porridge (i.e. money generously distributed by Democratic functionaries in pre-election periods).  Thereby, he thinks:  “The Democratic Party controls most Black preachers.  The Black preacher controls the Black church.  And the Black church is the spout that pours the Black community into the Democratic Party” (p. 87).  Before getting into politics, running for a congressional seat in South Carolina, he talked with his father, who had reared his family in the church and orchestrated a family singing group that produced several gospel albums.  His dad cautioned his aspiring adult son to trust bootleggers before preachers!  In fact:  “There is a common saying in the Black community:  most ‘Black preachers talk Black, live White, and think green.’  I was going to find that it was more than a saying” (p. 95).  Many of them, like Barack Obama’s pastor, Jeremiah Wright, embrace racist versions of liberation theology, pitting victimized blacks against oppressive whites—“a heresy that has polluted the ‘Black’ Church since the early 1970s” (p. 115).  To Ellison:  “Preachers that lead their congregation into the wretched ideology of Marxism, with its hatred of GOD, victimization, and murderous past, and away from the love, forgiveness, and reconciliation of Jesus Christ have committed the highest form of treason” (p. 122).

Ellison’s text is repetitious, badly needing editing.  His tone is strident and at times off-putting, but   his accusations merit consideration.  He certainly illustrates a significant voice in the burgeoning conservative black community.  

* * * * * * * * * * * * * * * * * * * * * * * 

A refreshingly upbeat approach to living as a black man in America comes from Rick Rigsby in Lessons From a Third Grade Dropout:  How the Timeless Wisdom of One Man Can Impact an Entire Generation (Nashville:  Thomas Nelson, Kindle Edition, c. 2006).  He was for several years the Life Skills Coordinator for the Texas A&M football team when it was still coached by the legendary R.C. Slocum, and later worked under Dennis Franchione as A&M’s team chaplain.  Understandably he laces his presentation with fascinating athletic anecdotes.  He warmly remembers Coach Slocum saying, dozens of times:  “My value as a head football coach will not be based on how many football games I won or lost.  My value will be directly related to the quality of the lives of the men I coached.  Are my former players productive citizens, good employees, good husbands and fathers?  The quality of their lives is the standard by which I will be judged as a head coach” (p. 123).  Rigsby spoke at Point Loma Nazarene College while I was the school’s chaplain, and I well remember the joyous, uplifting message he brought.  (He may have come, in part, because one of his best friends was Paul Holderfield, Jr., senior pastor of Friendly Chapel Church of the Nazarene in North Little Rock, Arkansas).

Reading his book I now understand why he was so upbeat and infectiously uplifting.  He had a dad who made a difference.  His father, Roger Rigsby, was black and poorly educated, but he never blamed others for anything, so this book has little about “racial” injustice in it.  Unlike the privileged black students in today’s elite universities, he repudiated all versions of victimhood.  Consequently, his timeless wisdom is precisely what both black and white folks most need to learn.  His wisdom looks identical to that of Amy Wax, a courageous law professor at the University of Pennsylvania, who endured enormous abuse from the “cancel culture” when she dared suggest affirmative action actually harms blacks (who need to succeed by cultivating healthy habits).  She set forth her views in Race, Wrongs, and Remedies, insisting blacks, like everyone else, need to recognize the problems generated by poor educational accomplishments and work habits, exacerbated by drug abuse, criminality, and fatherless homes.  Government programs, however well-intended, nurture these pathologies.  Rather than choosing dependencies of various sorts blacks must take charge of their personal and communal lives.  Nothing else will succeed.

Amy Wax’s tough love prescriptions gain affirmation in Rick Rigsby’s encomium to his father, for he wrote Lessons From a Third Grade Dropout “to re-acquaint readers with the wisdom—the common sense that was practiced simply and unwittingly by those who represent a generation gone by.   This was an era of individuals who worked hard without complaining.  They committed to doing whatever was necessary to help the company and support their families.  They took pride in doing a good job.  They worked without ceasing.  And they maintained high standards—they had high expectations for themselves and the others they were responsible for” (p. xxxiv).  Such a man was Roger Rigsby.  “This man never ever hid behind an excuse.  He never allowed his problems to determine his present or affect his future.  He realized that destiny was a choice and not a chance” (p. xxxii).  And his son wants to share his wisdom with his world.  “It’s the kind of wisdom that is rare in society today.  It’s the kind of wisdom that will cause you to be a better person, a greater leader, a more effective worker.  It’s the kind of wisdom that will cause you to make an impact . . . rather than just an impression” (p. xxxiv).   

Rick Rigsby consistently celebrates his father’s character.  “I have never met anyone like him.  He simply lived character” (p. 121).  His “father believed to the core of his being that a man was not worth much if he could not be trusted to do the right thing at the right time in the right way.”  He lived uprightly, telling the truth, keeping his word, treating others well, loving his family, honoring the Lord.  Few younger folks, smothered in feel-good therapeutic babble, collecting meaningless “participation” trophies, endlessly looking at cell phone screens, encounter men like Roger Rigsby.  They rarely encounter members of a generation that prized doers rather than viewers.  Doers are, he constantly stressed, kind.  “My dad often said, ‘Son, it does not cost a dime to be kind’” (p. 34).  He taught his son to do good things, such as saying “thank you” or “yes, please” or “yes, sir,” and opening doors for others.  His dad urged him to encourage people you know and smile at folks you meet.

Doers look for opportunities to help others.  “I can hear my dad’s voice ringing in my mind with a piercing familiarity, ‘Son, always put yourself in a position to help somebody else’” (p. 76).  It took his son “over four decades and three college degrees to understand” that his dad was saying:  “Son, you have a marvelous opportunity to build value in those around you by looking for ways to help humanity.  Remember, no job is beneath you, no task is too unimportant to be left incomplete.  Look for those you can help, and your life will be rich with exhilarating experiences, fond memories, and boundless energy from the satisfaction of assisting others.  . . . .  My son, there is no higher calling than to reach down and pull another up.  Helping is biblical, practical, and in great demand today.  Always make time to help another person!” (p. 82).  

Doers are, furthermore, disciplined—the “essence” of his father.  Doers find work and actually do it!  They show up on time—Roger Rigsby repeatedly stressed that it’s better to be an hour early than a minute late!  And in his 30 years of working as a cook for California Maritime Academy in Vallejo, California, he never once failed to be on time.   “Dad would leave home at 3:45 A.M., arriving at CMA one full hour ahead of his shift.  For years I thought the value of Dad’s behavior was the obvious.  But the real genius of my father’s discipline was what was produced as a result.  The quality of endurance was a hallmark in Dad’s life.  He never quit.  It was Dad’s lifestyle” (p. 58).  That explained “his incessant proclivity for excellence and his undeniable intolerance for mediocrity.  To this day, I hear his voice with a piercing familiarity: ‘Son, if you’re going to do a job, do it right!’  Nothing further needed to be mentioned.  Dad did not believe in slothful, lazy, mediocre, average, or adequate performance.  If you do something—he would say—you must take pride in it.  And how can a man take pride in something that is not his absolute very best?  There was no compromise here.  There was no shortcut here.  There was no gold medal for just getting by or special ribbon for finishing first.  At the very least, a good job was expected.  And if we did not do our best, we repeated the task until it met my father’s standard of excellence” (p. 85).  In this he was evoking the words of Martin Luther King, spoken a month before he was killed:  “‘All labor has value.  If you’re a street sweeper, sweep streets the way Michelangelo painted pictures.  Sweep streets the way Beethoven composed music.  Sweep streets the way Shakespeare wrote poetry.  Sweep streets in such a profound way that the Host of Heaven will say, ‘There goes a great street sweeper!’” (p. 86).

Such discipline enables one to remain standing amidst adversity and sorrow.  “This book,” Rigsby says, “began as I stood at my wife’s casket.  Flanked by two young sons and a host of relatives and friends, our lives were over, our dreams dashed, our future bleak” (p. 135).  Before she died, he’d led what seemed to be a charmed life—teaching at a university, prospering, enjoying the good life.  But he was unprepared for the overwhelming sorrow that engulfed him as he watched his wife slip away.  “I never knew the pain of a broken heart could hurt so deeply.  I never knew loneliness so profound it could paralyze your life” (p. 149).  Fortunately for him, his father faithfully served as a wise counsellor.  “He would say things like, ‘Son, now is the time to be a man.  Your wife needs a man, not a boy.  And I have not raised a boy.  I have raised a man.  And I am proud of you.  And with God’s help, you will make it through.  And you begin right now by making a commitment—every day—to just stand’” (p. 148).  Just stand!  That’s a great prescription for right living.  “Son, just stand”—those words, spoken over his wife’s casket following her funeral, proved to be the lifeline for Rick Rigsby.  “Just stand.  The best lesson I have ever received.  The most profound lesson I have ever been taught.  The best job training course I have ever taken.  The best life coaching I have ever gained.  The best—absolute best—advice I have ever received.  My father’s life was speaking to me.  His life’s experiences were telling me a story.  It was a story that had two basic truths: 1) You can depend on God no matter what happens, and 2) If you can keep standing in the middle of hell, you will learn to walk again” (p. 121).

A year later Roger Rigsby died of cancer at the age of 77.  His son treasured the time he spent with his dad, for “even though he was leaving us slowly, the essence of my father was just as strong in that hospital bed as if he were standing on a podium.”  “Dad, are you scared?” he asked.  “‘Heavens no, Son.  God has blessed me with two wonderful sons, a wonderful wife, and an amazing life.  And now I get to go home.  You boys carry on.  You carry on, Ricky.  Carry on.’  Even on his deathbed, he was teaching me to be a man.  Especially on his deathbed, he was teaching me to be a man.  Carry on.  Stay the course.  Hold your position.  Keep standing” (p. 157).  So Roger Rigsby offers wisdom for us all!

335 Fortitude & Rules for Living

A great 19th century Princeton theologian, A.A. Hodge, lamented:  “It is easier to find a score of men wise enough to discover the truth, than to find one intrepid enough, in the face of opposition, to stand up for it.”  That’s still true, for as Aleksandr Solzhenitsyn noted:  “A decline in courage may be the most striking feature that an outside observer notices in the West today.  . . . .  Such a decline in courage is particularly noticeable among the ruling and intellectual elites, causing an impression of a loss of courage by the entire society.”  Calling for a renewal of courage (one of the four cardinal virtues), Texas Congressman Dan Crenshaw wrote Fortitude:  American Resilience in an Age of Outrage (New York:  Grand Central Publishing, Kindle Edition, c. 2020).

Crenshaw begins by recounting a recent incident in the halls of Congress, featuring a group of protesters wearing “shirts that simply read ‘stay outraged,’ along with a matching assortment of signs and buttons that appeared to be professionally crafted from an established vendor, not purchased hastily from some ragtag print shop” (p. 2).  They were obviously embracing the posture of Vermont Senator Bernie Sanders, who once tweeted:  “Never lose your sense of outrage,” relying on the fact that “the most effective political manipulation is achieved by raw emotion” (p. 2).   Rather than a thoughtful land of self-control and discussion, America has descended into an irrational culture of outrage which is “the latest threat to our American story” because of “the victimhood ideology that it elevates.  The threat is born of small beginnings, as big threats so often are.  It starts with toxic personal narratives wrapped in the cheap cloth of victimhood, always looking to an external culprit to blame for real or perceived injustices” (p. 221).  Instead,  Crenshaw insists we need something much better than unfettered emotions!  To him, a former Navy SEAL Lieutenant Commander who lost an eye on the battlefields of Afghanistan:  “Outrage is weakness.  It is the muting of rational thinking and the triumph of emotion.”  It’s not a virtue and “rarely is it productive, virtuous, or useful.  It is an emotion to overcome, not accept, and overcoming it requires mental strength. This book is about acquiring that necessary mental fortitude” (p. 4). 

Crenshaw’s “basic message is this:  If you’re losing your cool, you are losing.  If you are triggered, it is because you allowed someone else to dictate your emotional state.  If you are outraged, it is because you lack discipline and self-control.  These are personal defeats, not the fault of anyone else.  And each defeat shapes who you are as a person, and in the collective sense, who we are as a people.”  It is crucial that we Americans build “a society of iron-tough individuals who can think for themselves, take care of themselves, and recognize that a culture characterized by grit, discipline, and self-reliance is a culture that survives.  A culture characterized by self-pity, indulgence, outrage, and resentment is a culture that falls apart.  It really is that simple, and it is a truly existential choice.  We must make that choice.  And it must be a choice to be more disciplined, mentally tougher, and convinced of the fact that we control our own destiny.  The next chapter of our American story depends on it” (p. 10).

To provide perspective Crenshaw tells us that as a child he wanted to be a SEAL and ultimately fulfilled his aspirations on the battlefields of Afghanistan, where he was seriously injured by an exploding roadside bomb.  The doctors thought he would lose sight in both eyes, but he believed (and prayed) that one of them would, with appropriate surgery, heal.  “Though I am not one for overt expressions of faith, I will say this:  I genuinely believe God’s strength was working through me then.  He was allowing me to believe something impossible.  I prayed, and my family prayed, and we believed.  We believed that military surgeons would pick through a pierced and shrapnel-ridden eye, remove the most minuscule shards and debris, and restore my sight.  We did not have good reason to believe it.  But we did” (p. 25).  Choosing to be positive, to hope for the best, was something his parents taught him.  He knew he could embrace either hope or despair, and he realized, as Aristotle taught, that “habit defines us.  Before we pursue our higher purpose, before we have quality of character, we have habit.  My habit was to never quit.  My habit was to avoid self-pity and believe in a better future, albeit with a bit less vision.”  Importantly:  “Those habits were forged by lessons from a dying mother; her grit, her humor, her grace.  They were shaped in lessons from a loving father who gave us a decent life and refused to be beaten by the loss of the woman he had planned to spend his life with” (p. 34).

The lessons learned at home were reinforced by his BUD/S SEAL training (“the most effective screening process in the United States Armed Forces”) and subsequent service.  Taking note of his best officers, he saw that they were not necessarily the strongest or best shooters.  Indeed:  “The qualities that made SEAL leaders great were rarely physical in nature” (p. 46).  The legendary SEAL toughness turns out to be more mental than physical!  Above all, they were calm, self-controlled, thoughtful.  They also insisted everyone be responsible for himself and his team.  “It was why Commander Jocko Willink, one of my mentors in the teams, wrote an entire book about the subject called Extreme Ownership.  The premise of the book is quite simple:  Everything is your fault.  Be accountable.  Take ownership.  Take responsibility.  From this responsibility you will find freedom” (p. 148).  Do every job—even small tasks like making your bed—as duty demands.  “The mantra ‘If you are going to do it, you might as well be the best at it’ is repeated constantly.  We live by it” (p. 150).  SEAL officers encourage their men to follow the SLLS prescription:  Stop; Look; Listen; Smell.  Before charging into battle make sure everything’s right.  Stay still before acting.  Silently study and think before moving.  “Don’t overreact, don’t let your emotions drive your action, think before you act.  In other words, stop and count to ten.  Like your mom and dad taught you.  This is stillness in the Stoic sense” (p. 80).

Following Crenshaw’s rehabilitation from his battlefield injuries, he studied at Harvard’s Kennedy School of Government, earning a master’s degree in Public Administration.  While there he encountered youngsters who were amazingly gifted and ambitious.  But few of them had a deep sense of duty—as was evident in their disregard for doing small things well.  “I was amazed by how few people actually showed up on time to class, for instance.  I was amazed how many people typed away on their laptops—sending iMessages, not taking notes—while the professor tried to lecture.  It struck me because it was so normalized in college culture.  This lack of politeness and lack of basic manners was the norm, not the exception.”   He “couldn’t help but think, ‘You are going straight into the job market after this.  Who on earth will hire you if you can’t show up on time?’” (p. 176).  Such youngsters would easily become “vocal members of the outrage mob” haunting the halls of Congress—or staging protests in the streets—because they had lived remarkably easy lives.  “Few places on earth are as sheltered, and accommodating, and insulated from adversity as an American college campus” (p. 193), which almost necessarily produces angry protesters and self-pitying victims.  

What these students lacked was anything resembling the SEALs’ Stoic ethos.  Reading the ancient Stoics (Marcus Aurelius, Epictetus, Seneca) provides SEALs such as Crenshaw a philosophical perspective that’s invaluable for a warrior—and for a congressman countering the angry, “woke,” outrage culture shredding the nation’s fabric.  This gives Fortitude a depth one rarely encounters in politicians’ electioneering boilerplates!  Concluding his treatise with a thoughtful analysis of American history and contemporary culture, Crenshaw says:  “I told you before about the SEAL Ethos.  Perhaps we now need an American Ethos.  Perhaps it goes something like this:  I will not quit in the face of danger or pain or self-doubt;  I will not justify the easier path before me.  I decide that all my actions, not just some, matter.  Every small task is a contribution toward a higher purpose.  Every day is undertaken with a sense of duty to be better than I was yesterday, even in the smallest of ways.  I seek out hardship.  I do not run from pain but embrace it, because I derive strength from my suffering.  I confront the inevitable trials of life with a smile.  I plan to keep my head, to be still, when chaos overwhelms me.  I will tell the story of my failures and hardships as a victor, not a victim.  I will be grateful.  Millions who have gone before me have suffered too much, fought too hard, and been blessed with far too little, for me to squander this life.  So I won’t.  My purpose will be to uphold and protect the spirit of our great republic, knowing that the values we hold dear can be preserved only by a strong people.  I will do my part.  I will live with Fortitude” (p. 244).  Would there were more of his kind in Congress!  Would there were millions of us who would join him!

* * * * * * * * * * * * * * * * * * * * * * * * 

Robert Jeffress, pastor of the First Baptist Church Dallas, provides us valuable insights for living with integrity in Courageous:  10 Strategies for Thriving in a Hostile World (Grand Rapids:  Baker Publishing Group, c. 2020).  The book’s ten chapters are no doubt re-worked sermons, filled with biblical texts and illuminating examples, woven together to make a unified text.  Living in the increasingly anti-Christian “enemy-occupied territory” C.S. Lewis described, we need to cultivate important facets of courage.  

First:  “Don’t panic.”  Accept the fact that life is difficult, filled with trials and temptations, challenging in various ways.  When facing unexpected challenges, only a few of us take action—fight or flight; 80 percent freeze and fail to do anything.  But with God’s Grace we can, like Joshua, rise to the challenge and act wisely and well, for “The LORD your God is with you wherever you go.”  We can own the words the LORD spoke to Joshua:  “Be strong and courageous! Do not tremble or be dismayed” (Joshua 1:9).  Next:  “Gain Situational Awareness.”  Courage is neither rash nor cowardly.  It requires thoughtful assessment of what’s actually happening and how one should react, “learning to see and call things as they truly are, not as they wish them to be” (p. 44).  Embrace the example of the sons of Issachar “who understood the times, with knowledge of what Israel should do” (I Chron 12:32).  Third:  “Take Inventory.”  Be sure you are well-prepared, equipped with the “full armor of God” described by Paul in Ephesians.  Fourth:  “Develop a Victor, Not a Victim Mindset.”  In Christ, we’re called to overcome, not succumb to, the wiles of the devil!  Fifth:  “Trust Your Training.”  Virtues get stronger, Aristotle insisted, as good habits are cultivated.  Live out the lessons you learned in Bible studies.  Sixth:  “Bend, Don’t Break.”  Seventh:  “Beware of Celebrating the Summit.”  Even in apparent victories remember the war is never over.  Eighth:  “Learn from the Past.”  Learning from history and Scripture will fortify your soul and provide invaluable guidance.  Ninth:  “Help Others.”  Be a Barnabas.  Even if we must risk our lives (or fortunes or reputation) we must sacrificially seek to protect and care for others.  And, Tenth:  “Do the Next Right Thing.”  In accord with an ancient precept:  do all the good you can, where you can, while you can.

                  * * * * * * * * * * * * * * * * * * * * * * * * 

Several years ago a Canadian psychologist, Jordan B. Peterson, became something of an internet sensation for speaking out against some of the politically-correct corruptions of the academic world.  He seemed to especially attract young men who were looking for a model of wisdom and strength.  In part this is because he is not a typical academic but a man who has worked with and understands the hard-working people he grew up with in Fairview, Alberta—very much a frontier settlement 400 miles distant from the nearest city.  Peterson set forth his central ideas in 12 Rules for Life:  An Antidote to Chaos (Toronto:  Random House Canada, c. 2018).  Though hewing to an agnostic secular perspective, routinely invoking Darwinian biology to explain both animal and human behavior, he nevertheless finds a wealth of insight in various religious traditions.  Much of what he says is compatible with Christian philosophy.  This is especially evident when he repeatedly deals seriously with the reality of Original Sin and our need of discipline.

He begins by explaining why he sets down “rules.”  As is evident in Exodus, God gave Ten Commandments, not Suggestions!  So too Peterson insists there are in fact given rules to follow if we are to avoid both the internal and external chaos of contemporary life.  As humans we simply need a “shared cultural system” that prescribes a code wherein some behaviors are accepted as true and valuable, prescribing goals worth celebrating and pursuing.  We are furthermore called to live rightly—“to shoulder the burden of Being and to take the heroic path.  We must each adopt as much responsibility as possible for individual life, society and the world.  We must each tell the truth and repair what is in disrepair and break down and recreate what is old and outdated” (p. 6).  So Peterson’s “12 Rules” prescribe ways to walk the straight and narrow way, avoiding soul-shrinking, soul-shredding chaos.  Therein one finds sufficient guidance to live the good life.  

“Rule 1:  STAND UP STRAIGHT WITH YOUR SHOULDERS BACK.”  He means this literally—attend to your posture!  Stand tall!  We’re  too often too easily defeated in life’s struggles, and when we cave in or lie down we slip into a dysfunctional state that easily begets depression and lethargy.  But if we do battle with the malevolent persons and powers we encounter we’ll become stronger.  Our nervous system will strengthen.  We’ll discover we’re braver than we feared.  We’ll discover we’re “not only a body” but “a spirit, so to speak— a psyche— as well.  Standing up physically also implies and invokes and demands standing up metaphysically.  Standing up means voluntarily accepting the burden of Being” (p. 41).  It means accepting the “terrible responsibility of life” as an adult, “accepting the end of the unconscious paradise of childhood,” and “willingly undertaking the sacrifices necessary to generate a productive and meaningful reality (it means acting to please God, in the ancient language)” (p. 42).

“Rule 2:  TREAT YOURSELF LIKE SOMEONE YOU ARE RESPONSIBLE FOR HELPING.”  If you’re caring for a sick child you insist he follow the doctor’s prescriptions.  Ironically, adults are far less likely to follow the doctor’s orders for themselves!  If you’re a good parent you want your children to become independent, self-reliant persons, strong rather than safe.  But all too many adults fail to do so themselves!  If you’re a wise person you also recognize and respect the ancient, inescapable differences between men and women, clearly evident in men’s drive to establish order in their world—building houses and towns, establishing hierarchies, serving as policemen and soldiers, risking their lives to defend what they hold precious.  So too:  “Order is God the Father, the eternal Judge, ledger-keeper and dispenser of rewards and punishments” (p. 54). 

“Rule 3:  MAKE FRIENDS WITH PEOPLE WHO WANT THE BEST FOR YOU.”  After telling personal stories, showing why it’s important to terminate toxic friendships, Peterson insists endless loyalty to another person is never wise.  “Friendship is a reciprocal arrangement.  You are not morally obliged to support someone who is making the world a worse place.  Quite the opposite.  You should choose people who want things to be better, not worse.  It’s a good thing, not a selfish thing, to choose people who are good for you.  It’s appropriate and praiseworthy to associate with people whose lives would be improved if they saw your life improved” (p. 105).

“Rule 4:  COMPARE YOURSELF TO WHO YOU WERE YESTERDAY, NOT TO WHO SOMEONE ELSE IS TODAY.”  No matter what you do there’s almost always someone better at it!  Rather than compare yourself with others, seek daily to develop your own unique self in your own unique setting.  Ask yourself:  “‘What could I do, that I would do, to make Life a little better?’”  “Aim high.  Set your sights on the betterment of Being.  Align yourself, in your soul, with Truth and the Highest Good” (p. 136).  And that Highest Good, Peterson says, is found in the Bible and especially in the Sermon on the Mount! 

“Rule 5:  DO NOT LET YOUR CHILDREN DO ANYTHING THAT MAKES YOU DISLIKE THEM.”  Rather than pamper and develop “a little God-Emperor of the Universe,” seek to shape him into an admirable adult.  Unfortunately, Peterson thinks, today’s parents want to be loved, fear their kids, and aspire to be their friends.  They’re simply following the poor advice doled out by the “adolescent ethos of the 1960s, a decade whose excesses led to a general denigration of adulthood, an unthinking disbelief in the existence of competent power, and the inability to distinguish between the chaos of immaturity and responsible freedom” (p. 148).  But children need guidance, correction, discipline—and only parents can do this well.  “It is an act of responsibility to discipline a child” (p. 153).  Echoes of the much-maligned James Dobson!

“Rule 6:  SET YOUR HOUSE IN PERFECT ORDER BEFORE YOU CRITICIZE THE WORLD.”  Start small.  Do what’s possible for you right now where you are—going to work, caring for your children, treating others rightly, taking responsibility for things you can personally control.  Stop fretting about—and scheming to remedy—the ills of the world.  “Rule 7:  PURSUE WHAT IS MEANINGFUL (NOT WHAT IS EXPEDIENT).”  Most folks do what’s pleasurable, choosing to focus on transient goods rather than permanent things.  But making momentary sacrifices to gain long-term goals is the only way to live wisely.  Deferred gratification is the key to happiness.  “Rule 8:  TELL THE TRUTH— OR, AT LEAST, DON’T LIE.”  Talk straight to yourself and to others.  Your well-being, and the welfare of your world, depend upon it.  Lying is particularly the province of the ideologues so prominent in politics and media, and “oversimplification and falsification is particularly typical of ideologues.  They adopt a single axiom:  government is bad, immigration is bad, capitalism is bad, patriarchy is bad.  Then they filter and screen their experiences and insist ever more narrowly that everything can be explained by that axiom.  They believe, narcissistically, underneath all that bad theory, that the world could be put right, if only they held the controls” (p. 258).  Truth-telling, as Alexander Solzhenitsyn found, is the only way to resist them.  

“Rule 9:  ASSUME THAT THE PERSON YOU ARE LISTENING TO MIGHT KNOW SOMETHING YOU DON’T.”  Have the humility to acknowledge your limits and forego trying to shape the world in accord with your personal notions.  Fresh, radical, novel, “creative” ideas—especially your own—are likely to be wrong!  Furthermore, know that thinking involves seriously listening to yourself!  “People think they think, but it’s not true.  It’s mostly self-criticism that passes for thinking.  True thinking is rare— just like true listening.  Thinking is listening to yourself.  It’s difficult” (p. 293).  “Rule 10:  BE PRECISE IN YOUR SPEECH.”  Words have meanings, so use them respectfully, thoughtfully.  There’s good reason for grammar, so learn to follow its prescriptions.  “Rule 11:  DO NOT BOTHER CHILDREN WHEN THEY ARE SKATEBOARDING.”  Kids (especially boys) need to take risks, to flirt with danger—it’s the best way for them to attain competence and maturity.  Let boys be boys—and resist every effort of the radical feminists to cram them into their skewed ideology.  (Peterson’s willingness to challenge feminist pieties is one reason he has such a large male YouTube audience.)  “Rule 12:  PET A CAT WHEN YOU ENCOUNTER ONE ON THE STREET.”  Find joy in the sheer goodness of creation.  Avoid the nihilistic impetus to despise and destroy the goodness in things.  

Summing up his central insights, Peterson says his studies led him to some “fundamental moral conclusions.  Aim up.  Pay attention.  Fix what you can fix.  Don’t be arrogant in your knowledge.  Strive for humility, because totalitarian pride manifests itself in intolerance, oppression, torture and death.  Become aware of your own insufficiency— your cowardice, malevolence, resentment and hatred.  Consider the murderousness of your own spirit before you dare accuse others, and before you attempt to repair the fabric of the world.  Maybe it’s not the world that’s at fault.  Maybe it’s you.  You’ve failed to make the mark.  You’ve missed the target.  You’ve fallen short of the glory of God.  You’ve sinned.  And all of that is your contribution to the insufficiency and evil of the world.  And, above all, don’t lie.  Don’t lie about anything, ever.  Lying leads to Hell. It was the great and the small lies of the Nazi and Communist states that produced the deaths of millions of people” (p. 242).  Enough said!  So let’s shape up and live responsibly!  

334 SCOTUS Justices

Justices of the United States Supreme Court have wielded extraordinary power throughout the nation’s history.  Unlike presidents and prominent legislators, however, they frequently remain rather unknown to the general public.  But for anyone interested, there are some fine treatises giving us insight into the lives and personalities of the jurists.  Sandra Day O’Connor, the first female justice (appointed by President Ronald Reagan), set down her memories of ranch life in Arizona in Lazy B:  Growing Up on a Cattle Ranch in the American Southwest (New York:  Random House, c. 2002; Kindle Edition).  She prefaced her work with a statement by Wallace Stegner:  “There is something about living in big empty space, where people are few and distant, under a great sky that is alternately serene and furious, exposed to sun from four in the morning till nine at night, and to a wind that never seems to rest—there is something about exposure to that big country that not only tells an individual how small he is, but steadily tells him who he is.”  And O’Connor obviously learned who she was by understanding the big country around her.  

This “big empty space” certainly helped shape O’Connor, giving her a strong, nature-based frontier ethic.  Growing up unchurched, she once asked her father why they didn’t “ever go to church on Sunday?”  He responded:  “It’s too far to go to town.  Besides, most of the local preachers aren’t very good.”  “Do you believe in God?” [she asked]  “Yes, I do.  I know some people question whether God exists and whether all those Bible stories are true.  I don’t know about the stories, but when you watch the world around us . . . and see the laws of nature work, you have to believe that some power beyond us has created the universe and has established the way nature works.  . . . .  It is an amazing, complex, but orderly universe.  And we are only specks in it.  There is surely something—a God if you will—who created all of this.  And we don’t have to go to church to appreciate it.  It is all around us.  This is our church” (p. 143).  So to the extent she had a moral compass, it came from the Natural Law and her parents’ frontier ethos.  

The Lazy B Ranch was located west of Lordsburg, New Mexico, in the “sparse, open high desert country south of the Gila River on the border of Arizona and New Mexico.”  It was land described by Kit Carson as “so desolate, desert, and God-forsaken that a wolf could not make a living on it” (p. 14).  It’s “high desert country—dry, windswept, clear, often cloudless” (p. 6).  “Water was scarce and hard to find.  Every drop counted.  We built catchment basins and dirt tanks to catch and store it.  We pumped it from underground.  We measured it and used it sparingly.  Life depended on it” (p. 7).  Her father said:  “‘Keep the grass healthy, keep adequate water reserves, take care of the land, and it will take care of us’” (p. 33).  Their cattle needed to graze on public lands (the “open range”), so the Lazy B Ranch controlled some 160,000 acres and measured “roughly 250 square miles, an area about 16 miles across and 15 miles long” (p. 19).  It enabled the Days to graze some 2,000 cattle.

The Lazy B “was the largest and most successful ranch in the region” due to the hard work of her parents, Harry and Ada Mae Day.  They took up residence there in 1927, and they “thought there was no better life anyone could live than on the Lazy B.”  They lived there 50 years and reared their children there, including the first-born, Sandra, who developed an especially strong bond with her father.  “As the first child, I was always the darling of my daddy’s eye.  . . . .  I loved the ranch and adored my father.  I loved [Mother] MO, too, but the bond between a little girl and her father is often something special.  How lucky I felt to be able to share as much of his life as I did!” (p. 96).  She also developed “a love for the land and for the way of life on the ranch that has stayed with me.  Spending hours each day at the dinner table discussing ranching, politics, or economics is a treat that many young people don’t experience” (p. 29).  Sandra’s mother brought cultural refinement and a commitment to education to the family.  She taught Sandra to read at the age of four and provided a variety of magazines and books for her to devour.  “MO was a tidy package of good looks, competence, and charm.  She could fit in at a gathering of Arizona ranch wives or at an elegant party in Washington, D.C.  She was the only female role model we had,” and she “made a hard life look easy.  In a harsh environment where weather, the cowboys, and the animals were all unpredictable, she was unfailingly loving and kind.  She created an appealing and delightful life for her family all her days.  While some of the cowboys taught us that only the toughest survive, MO taught us that kindness and love can also produce survivors, and in a happy atmosphere” (p. 49).  

In addition to her family, O’Connor fondly remembered valuable ranch “hands,” a few of whom spent much of their lives as employees of the Lazy B.  “The cowboys did whatever  job was required.  They met the unexpected as though they’d known about it all along.  They never complained, and they made the best of everything along the way” (p. 124).  She devotes many pages to describing life on the ranch—round-ups, horses, routine tasks so essential for its operation.  Less happy memories include the growing federal government’s role in controlling the ranch, especially as environmentalists successfully pursued their agendas.  After her parents’ deaths, her brother Alan ran the ranch for a few years before selling it.  He and Sandra “worry about the future of the federal and state lands in the greater Southwest. We agree on one thing:  the land is better protected from destruction by off-road vehicles and people out for target shooting when it is occupied by responsible ranchers.” 

When it was time for her to go to school Sandra went to the Radford School for Girls in El Paso, Texas, where she lived with her grandmother while attending classes.  She acquired a fine education and made life-long friends.  In due time she attended another El Paso school, Austin High School.  When she was 16 she enrolled at Stanford University, graduating with distinction and later attending its law school.  Graduating in 1952 she married a fellow law school classmate and soon began her legal career in Phoenix.  Thirty years later President Reagan appointed her to the Supreme Court, something O’Connor found almost incredible.  “It did not seem possible that a ranch girl would grow up to serve on our nation’s highest court” (p. 199).  Concluding her account, O’Connor said:  “The power of the memories of life on the Lazy B is strong.   It surges through my mind and my heart often.  . . . . We know that our characters were shaped by our experiences there,” where the “value system we learned was simple and unsophisticated and the product of necessity.  What counted was competence and the ability to do whatever was required to maintain the ranch operation in good working order—the livestock, the equipment, the buildings, wells, fences, and vehicles.  Verbal skills were less important than the ability to know and understand how things work in the physical world.  Personal qualities of honesty, dependability, competence, and good humor were valued most. These qualities were evident in most of the people who lived and worked at the Lazy B through the years” (p. 315).

And these were the qualities that sustained O’Connor throughout her years, entitling her to our respect for her years of public service on the Supreme Court of the United States.  

                                                * * * * * * * * * * * * * * * * * * * * * *

One of the finest autobiographies I’ve read is Clarence Thomas’s My Grandfather’s Son:  A Memoir (New York:  HarperCollins Publishers, c. 2007).  Looking back over his life, he says:  “All you can do is put one foot in front of the other and ‘play the hand that you’re dealt,’ as my grandfather so often said.  That’s what I did:  I did my best and hoped for the best, too often fearing that I was getting the worst.  In fact, though, I got everything I needed.  Much of it came from two people, my grandfather and grandmother, who gave me what I needed to endure and, eventually, to prosper” (p. x).  His grandparents were hardly famous or important to the world-at-large, but they meant everything to young Clarence.  

Reared for a few years by his mother (he almost never saw his father), Thomas was locked into  her poverty-stricken, dysfunctional world before her parents agreed to take care of Clarence and his brother.  Consequently:  “In every way that counts, I am my grandfather’s son.  I even called him ‘Daddy,’” and he was “determined to mold me in his image.  . . . .  He was the one hero in my life.  What I am is what he made me” (p. 2).  “Daddy” had a third-grade education and could barely read, but he had a strong work-ethic and determined to discipline his grandsons.  “‘The damn vacation is over,’ Daddy had told us on the morning we moved into his house.”  He declared that “while our mother had allowed us to come and go as we pleased, there would be ‘manners and behavior’ and ‘rules and regulations’ from now on” (p. 12).  That meant a great deal of hard work, which included helping Daddy deliver fuel oil when Clarence got out of school!  And at the age of 10 he was informed he was expected “to pull my load on the farm” (p. 23). 

“The family farm and our unheated oil truck became my most important classrooms, the schools in which Daddy passed on the wisdom he had acquired in the course of a long life as an ill-educated, modestly successful black man in the Deep South.  Despite the hardships he had faced, there was no bitterness or self-pity in his heart” (p. 26).   “He never praised us, just as he never hugged us” (p. 26).  In return, however, they “lived a life of luxury” compared to their early years, for they had a comfortable home with modern appliances,  plenty of food, and the security of knowing they were cared for.  As a child, Thomas often resented his grandfather’s severity.  But later in life he “came to appreciate what I had not understood as a child:  I had been raised by the greatest man I have ever known” (p. 28).

His grandfather had earlier joined the Roman Catholic Church, admiring her orderly rituals and disciplined clergy.  He also wanted a quality education for his grandsons and enrolled them in the Catholic grammar and high schools in Savannah, Georgia, where the nuns were “far more demanding” than the public school teachers.  Importantly, the nuns “taught us that God made all men equal, that blacks were inherently equal to whites, and that segregation was morally wrong” (p. 15).  Serving as an altar boy for mass, young Thomas contemplated becoming a priest and entered a Catholic seminary, where he studied hard and excelled in his course work.  He also profited from the discipline it afforded.  In time he decided to drop out of the seminary and go to college—a decision that precipitated an unexpected confrontation with his grandfather, who ordered him to leave home and survive on his own!  He left home in 1968 and easily slipped into an angry state, railing against various injustices in the country.  A friend of his introduced him to Marxism and Students for a Democratic Society, so when he went off to study at the College of the Holy Cross in Massachusetts he was “an angry black man.”  Indeed:  “Racism had become the answer to all my questions, the trump card that won every argument” (p. 52).  He joined anti-war protests, chanting “Ho, Ho, Ho Chi Minh,” and endorsed all the then-hip radical causes.  But one morning, after risking his academic career attending a protest, he “stopped in front of the chapel and prayed for the first time in nearly two years.  I promised Almighty God that if He would purge my heart of anger, I would never hate again” (p. 60).  Soon thereafter he “began to suspect that Daddy had been right all along:  the only hope I had of changing the world was to change myself first” (p. 60).  

Thenceforth he studied ever more diligently, gaining entrance to the honors program at Holy Cross and then to Yale Law School.  He was also rethinking his worldview, doubting the efficacy of  the “affirmative action” policies that were bringing unqualified blacks into universities where they were bound to fail.  He began to critique the burgeoning welfare system’s impact  on African Americans.  Following graduation from Yale, he joined the staff of John Danforth, then serving as Missouri’s attorney general.  Thomas and his wife settled into Jefferson City, Missouri, finding acceptance and happiness in both his work and their social life.  Continuing to rethink his views on race, he found a helpful guide in Thomas Sowell, an erudite black economist.  When Danforth was elected to the United States Senate, Thomas soon followed him to Washington, D.C., and in 1980 he registered as a Republican and voted for Ronald Reagan!  “It was a giant step for a black man, but I believed it to be a logical one.  I saw no good coming from an ever-larger government that meddled with incompetence if not mendacity, in the lives of its citizens” (p. 130).  Now moving in Republican circles, he was appointed assistant secretary for civil rights in the Department of Education and later chairman of the Equal Employment Opportunity Commission.  He was turning more conservative, and as he took public stands at odds with the liberal agenda of journalists and civil rights leaders he suffered constant criticism and calumny.  “The only good things about these attacks was that they encouraged me to return to the faith that had sustained me in my youth” (p. 184).  He began praying and attending church, confessing that by “running away from God, I had thrown away the most important part of my grandparents’ legacy” (p. 184).  

When George H. W. Bush was elected President in 1988, he decided to nominate Thomas to the U.S. Court of Appeals for the District of Columbia Circuit.  Once confirmed by the Senate in 1990, Thomas found the position much to his liking.  The next year, when Justice Marshall retired, President Bush nominated Thomas to replace him on the Supreme Court.  While he was making the obligatory visits to senators on Capitol Hill, the barrage of slanderous attacks in the media rendered his “once-cheerful home . . . a joyless hermit’s cell” (p. 225).  Having earlier witnessed the devious and dishonest way senators Ted Kennedy and Joe Biden had treated Robert Bork, Thomas braced himself for the barrage of abuse to come.  But he never imagined that one of the women he had helped in his prior positions would become “my most traitorous adversary” (p. 230).  That woman, of course, was Anita Hill.

Appearing before the Judiciary Committee, Thomas encountered an agenda crafted by its chairman, Joe Biden.  Privately, Biden had seemed cordial and supportive, but in public it became clear that his “smooth, sincere promises that he would treat me fairly were nothing but talk” (p. 236).  Capping his duplicity, Biden called Anita Hill to testify, and she made virulent allegations of sexually offensive behavior by Thomas.  The media mob took her every word as gospel while disregarding Thomas’s explanations and defense.  Deeply wounded and frustrated by the process, Thomas finally erupted in a memorable verbal torrent:  “This is a circus.  It is a national disgrace.  And from my standpoint, as a black American, as far as I am concerned, it is a high-tech lynching for uppity blacks who in any way deign to think for themselves, to do for themselves, to have different ideas, and it is a message that, unless you kowtow to an old order, this is what will happen to you and you will be lynched, destroyed, caricatured by a committee of the U.S. Senate rather than hung from a tree” (p. 271).  Members of the committee were clearly stunned.  Public opinion instantly shifted.  And Clarence Thomas would become a justice of the United States Supreme Court, joining Antonin Scalia in rendering consistently originalist opinions.  

Throughout those difficult days Thomas relied on his deepening Christian faith.  He was also given invaluable strength by his faithful wife and by the enduring support of (and times of prayer with) Senator Danforth.  And he finally saw how wonderfully his Daddy had lived out the wisdom he desperately needed in those trying days.  My Grandfather’s Son is an illuminating autobiography, because it reveals how one’s spiritual life and gratitude for family make life ultimately good. 

* * * * * * * * * * * * * * * * * * * * * * *

In Justice on Trial: The Kavanaugh Confirmation and the Future of the Supreme Court (Washington:  Regnery Publishing, Kindle Edition, c. 2019), Mollie Hemingway and Carrie Severino provide a detailed account of one of the more disgraceful episodes in American history.  President Trump nominated Brett Kavanaugh to replace Justice Anthony Kennedy in 2018.  Justice Kennedy had recommended six of his former clerks, including Kavanaugh, whom he considered simply “brilliant.”  Kavanaugh had served on the D.C. Circuit Court for 12 years, written some 300 opinions, and was widely applauded for his judicial acumen.  (By contrast, Obama appointee Elena Kagan “had no judicial opinions to her name” but was easily confirmed by the Senate.)  Kavanaugh didn’t quite fit the anti-establishment profile Trump wanted, but he seemed to be a safe, “moderate” choice who could easily survive the confirmation process.  Adding to his judicial accomplishments, he regularly attended a Catholic church and had a sterling reputation as a devoted husband and father.  Qualifications, however, meant nothing to powerful Democrats and their militant (pro-abortion) supporters.  As soon as it was known Trump had named his nominee, “a large crowd gathered outside the Supreme Court in a protest organized by the Center for American Progress (CAP), funded by George Soros and founded by John Podesta, a close aide to Barack Obama and the Clintons.  As they waited to find out who it was, they chanted ‘Hey, hey! Ho, ho!  The patriarchy has got to go!’” (#1131).  Partisans of the “resistance,” Senate Democrats had used every possible parliamentary procedure to delay every Trump cabinet nomination, and they were even more determined to frustrate his judicial nominees.  

When the Judiciary Committee scheduled the Kavanaugh hearing, “Democrats considered staging a mass walkout or not showing up.  Fearing that such an action might backfire, however, they came up with a different plan:  disruption” (#1620).  The committee room was packed, while representatives of the “NAACP and NARAL wore shirts of various colors and lined the walkways, forming a rainbow of protesters” (#1625).  Highly disciplined and meticulously scripted, disrupters in the room “shrieked and were arrested, a pattern that would continue throughout the hearings” (#1637).  As protesters (flown in from all parts of the country and funded by Planned Parenthood) were arrested and removed, their seats were immediately filled by others awaiting their turn to interrupt the proceedings.  The Democrat Senators disrupted in their own way.  As soon as Chairman Charles Grassley opened the hearing, Senator Kamala Harris interrupted him, demanding more time to examine the 42,000 just-released documents dealing with Kavanaugh’s judicial records.  “It took nearly an hour and a half, with dozens of interruptions, for Grassley to get through his ten-minute opening statement” (#1651).  “A major source of the hearings’ drama was political ambition.  Ever since Joe Biden’s grandstanding during the [1987] Bork hearings, senators have been powerfully tempted to exploit a perch on the Senate Judiciary Committee for public attention” (#1871).  So Senators Cory Booker, Kamala Harris, and Amy Klobuchar tried to outdo each other in posturing for the public at Kavanaugh’s expense.  

Then Dianne Feinstein, defying procedural rules, released copies of a letter alleging Kavanaugh had sexually assaulted a woman while they were in high school.  Kavanaugh couldn’t remember the woman, Christine Blasey Ford, since they’d attended different schools and moved in different social circles.  Importantly, those who’d known her insisted her “behavior in high school and college were dramatically at odds with her presentation in the media” (#2215).  On the other hand, eighty-seven women who had known Kavanaugh for many years held a press conference to make clear their support of him and validate his probity.  Senator John Cornyn of Texas put it plainly:  “‘The problem is, Dr. Ford can’t remember when it was, where it was, or how it came to be. There are some gaps there that need to be filled.’  Cornyn had simply stated the facts.  Those were enormous gaps in an accusation of sexual assault that was intended to keep one of the nation’s most distinguished judges off the Supreme Court.  But the media responded as if Cornyn were maliciously sowing doubt about an account that anyone of sound mind must regard as unimpeachable” (#2331).  With no concern for legal traditions, Senate Democrats and the media seized upon Ford’s words as the capstone of their ferocious attacks on Kavanaugh.  “Normally, the burden of proof is on the accuser, but the media were not even paying lip service to that principle” (#2213).  

Hemingway and Severino carefully document all the developments in this disgraceful episode, making it clear how maliciously Democrats and media sought to destroy a good man.  That Kavanaugh survived the hearings and was finally approved as a Supreme Court justice bears witness to his courage and the constant support of the president who nominated him. 

333 “Socialism Sucks”

It’s increasingly evident that many Americans now embrace socialism.  A 2016 Harvard survey “found that a third of eighteen- to twenty-nine-year-olds supported it,” and another survey “reported that millennials supported socialism over any other economic system.”  They apparently favor what David Horowitz calls “morally-sanctioned theft” and are either unaware of or deliberately deny the appalling reality of socialism’s genocidal history.  Add to this the amazing left-turn of the current Democrat Party, now espousing a Sanders-style socialistic agenda!  So it behooves us who treasure freedom to better understand what awaits us should we follow the pattern discernible in the “unfree world” examined by two Texas economists.  Having established themselves as bona fide academicians by publishing scholarly articles and books, Robert Lawson (a professor at Southern Methodist University) and Benjamin Powell (a professor at Texas Tech University) decided to take a light-hearted (anecdote-studded) but deeply serious (data-laden) tour of socialist utopias to experience first-hand their reality.  This resulted in their Socialism Sucks: Two Economists Drink Their Way Through the Unfree World (Washington:  Regnery Publishing, Kindle edition, c. 2019).  Lawson and Powell launched their travels in Sweden, an allegedly “socialist country” which “is not a socialist country.”  Certainly there are generous welfare and entitlement programs, but that does not make a country socialist, since our economists insist the abolition of private property and a state-owned means of production are necessary components of a truly socialist regime.  In time they also visited China and concluded that it too is not economically socialistic (although it is, for sure, politically dictatorial).  In fact, “China’s economic development since 1978 is one of the greatest successes of its kind in human history” (#939).  It is, they concluded, a “fake socialism.”  

So they decided to visit three countries that remain starkly socialist:  Venezuela, Cuba, and North Korea.  In Venezuela they encountered a “democratic socialism” under which the country has plunged, almost overnight, from prosperity to poverty.  “At least until recently, it was the model that Western intellectuals admired and held up for emulation as a socialist paradise.  Now things are falling apart, but the apologists still insist the country’s problems have nothing to do with socialism” (#203).  Hugo Chávez “won the 1998 presidential election” promising to eliminate economic injustices, and he secured much popular support.  Then (following the model of Stalin and Mao) he “confiscated more than ten million acres of private farmland” which led to a 60 percent collapse of food production and the skyrocketing of food prices.  Consequently:  “Venezuelans lost an average of twenty-four pounds in 2017.  Venezuela’s socialist policies are literally starving the country” (#338).  Of the 800,000 booming businesses in 1998, only 230,000 were afloat in 2016.  In fact, everywhere the authors looked in Venezuela they found suffering and sorrow—the dividends of little more than a decade of the “democratic Socialism” so praised by “useful idiots” such as Sean Penn, Oliver Stone, Michael Moore, and Bernie Sanders.  

From there Lawson and Powell went to Cuba to study a “subsistence socialism” that has persisted under the Castro brothers for 60 years.  Before the 1959 “revolution, Cuba had a thriving urban middle class, along with widespread rural poverty” (#455).  Fidel Castro promised to build a utopia featuring prosperity and equality.  Instead, he imposed poverty and dictatorial oppression.  Sitting in Havana’s Hotel Tritón they saw its “decaying edifice” as “a crumbling tribute to Cuba’s central-planning problems.”  No longer subsidized by the Soviet Union, Cuban authorities eliminated funding for the hotel and it “was rotting, inside and out.  And nobody cared because nobody owned it.”  Almost immediately Lawson said:  “‘This place sucks.’  ‘Socialism sucks,’ said Bob as he drained his beer” (#475).  Thus they found a title for their book!  Researching by walking about the city, they became increasingly sad as they saw how poorly a state-run economy functions.  The food in restaurants was tasteless (ironically, “Cuban cuisine is excellent—just not when it’s served in Cuba”), the buildings were decaying, and the people (70 percent of whom work for the state and get twenty-five dollars a month) were pitifully poor.  

But Cubans are not as poor as North Koreans!  Our economists next flew to Seoul, South Korea, and found that the “Korean peninsula is a rare natural experiment where capitalism and socialism can be compared side by side.  The comparison is particularly informative because North and South Korea share a common history, language, culture, and, before they split, level of economic development.  . . . .  At the end of World War II, North Korea had about 80 percent of Korea’s industry, 90 percent of its electrical power, and 75 percent of its mines” (#761).  Following the war and the division of the peninsula, GDP per capita was basically the same.  Yet today South Korea is an economic powerhouse.  Seoul’s “economic output ranks it fourth in the world among metropolitan areas (behind Tokyo, New York, and Los Angeles)” (#780).  But thirty-five miles to the north there’s an entirely different country, North Korea.  Lawson and Powell were not allowed to cross the border, but they learned that the people earned little more than $1000 a year and in the 1990s “up to three million North Koreans died of starvation and related diseases” (#831).  Nothing better proves the fundamental goodness of free enterprise than the dramatic contrast between the two Koreas.

Concluding their journeys, our two freedom-loving economists visited “hungover socialisms” in Russia and Ukraine, witnessing the ravages resulting from Marx’s flawed theories.  Standing near a statue of him in Moscow, Bob said:  “I bet there’s never been a guy who has been so wrong about every major thing he wrote about and who still has as many followers as Marx” (#1046).  Bob’s right.  Profits don’t represent exploitation, because the labor theory of value is wrong.  Instead, at least in a free market, profits represent created value.  Capitalism can’t be the cause of alienation because workers inevitably do better under capitalism than under socialism, and market prices provide a higher standard of living and more economic opportunity.  Finally, industries haven’t become more concentrated and wages haven’t been pushed down under capitalism.  Instead, capitalism has been the engine of prosperity, innovation, new industries, and rising wages, while socialist economies have stagnated or even regressed.  “‘Yeah, there’s only one great Marx,’ I said.  ‘Groucho.’  Groucho’s definition of politics is Marxism in a nutshell:  ‘Politics is the art of looking for trouble, finding it everywhere, diagnosing it incorrectly, and applying the wrong remedies’” (#1053).  In Russia and Ukraine, those “wrong remedies” resulted in millions of deaths as Lenin and Stalin imposed their version of “scientific socialism,” including the savage liquidation of the Cossacks and Kulaks and others resisting the imposition of a communal paradise.  Lawson and Powell’s loathing for Lenin and Stalin oozes from every page.  So too they loathe Walter Duranty, the New York Times reporter who lied on their behalf, garnering for himself a Pulitzer Prize, claiming that rumors of famine in Ukraine were “mostly bunk” and penning “columns with titles like ‘Soviet Is Winning Faith of Peasants,’ ‘Members Enriched in Soviet Commune,’ and ‘Abundance Found in North Caucasus’” (#1139). 

The bright exception to the general malaise in the former USSR is the tiny republic of Georgia (Stalin’s birthplace), where a “new capitalism” has virtually overnight (following the 2003 “Rose Revolution”) generated prosperity.  “I [Bob] love Georgia—the people, the food, the beer, the wine, and of course the economic reforms that have taken a Soviet backwater and given it new life” (#1244).  Reformers eliminated many superfluous government workers—reducing the Ministry of Agriculture from 4,374 to 600, Tbilisi City Hall employees from 2,500 to 800, the Ministry of Environmental Protection from 5,000 to 1,700.  In 2004 they sold government-owned factories, hospitals, and apartments—“everything was for sale except Georgia’s honor.”  In that year “Georgia ranked fifty-sixth on the economic freedom index.  In the 2017 edition of the index, Georgia ranked eighth in the entire world, ahead of the eleventh-ranked United States” (#1330).  The capital city, Tbilisi, “has better-paved streets than Dallas. The once dark city now gleams like Paris at night. Tourists come from all over Europe and the Middle East to enjoy Georgia’s famous food, wine, and other attractions, including . . . the redeveloped medieval section of town with its quaint shops and hip restaurants” (#1338). 

Returning to the United States, Lawson and Powell “infiltrated” a 2018 convention of American socialists.  “After traveling the unfree world and witnessing the economic stagnation, starvation, poverty, and political tyranny imposed by socialist regimes, Bob and I came to the Socialism Conference to answer our own question:  How can so many Americans, particularly millennials, view socialism so favorably?  We wanted to hear what these self-described young socialists had to say, and there were plenty of millennials to ask” (#1438).  But when asked, they had only the haziest notion of what they supported!  Even those who knew that socialism calls for the abolition of private property and government control of the means of production talked little about it.  Instead, they relished the exhilaration of shouting slogans, such as “Free abortion on demand.  We can do it.  Yes, we can.”  Abortion rights and environmental activism seemed to be the real hot-button issues at the conference, and this largely explains why so many young people were attending.  When speakers deigned to mention failing socialist states like Venezuela, they inevitably said they weren’t “real” socialisms.  “When socialists, democratic and otherwise, held up Venezuela as a great socialist experiment in the 2000s, the message was, ‘See, we told you so; socialism works!’  But when the failure happened, the message changed to, ‘No, wait—that’s not real socialism!’” (#1532).   

But our two economists have seen “real socialism” in various parts of the world, and they believe no one who thinks honestly and reasons clearly could support a system which inevitably fails and causes horrendous human suffering.  

* * * * * * * * * * * * * * * * * * * * *

One of the finest essayists currently writing in English is Theodore Dalrymple (the pen name of Anthony Daniels), a psychiatrist who has traveled widely and worked in medical facilities and prisons in some of the most impoverished realms of Africa and England.  Ever empathetic with the poorest of the poor, he writes to illuminate various social ills and enlist the reader’s concern.  Consequently, in The Wilder Shores of Marx: Journeys in a Vanishing World (Monday Books, Kindle Edition, c. 2012; first published in 1991), he recounts a tour of communist lands devastated by a singularly pernicious ideology, Marxism.  “Individually unimportant as the countries might be in world history, collectively they tell us much about one of the central political currents of the twentieth century” (#179).  Visiting some of “the peripheral countries of the communist world, then in the process of dissolution,” he determined “to pre-empt the nostalgia for what was an anti-human system in the likely event that the transition to something more normal would be difficult and unsatisfactory.  Apart from the massacres, deaths and famines for which communism was responsible, the worst thing about the system was the official lying:  that is to say the lying in which everyone was forced to take part, by repetition, assent or failure to contradict” (#151).  So he seeks simply to tell the truth—to tell it as it was.  

Early in his life Dalrymple had studied and interacted with several young socialists, and he saw that the “fons et origo” of the “appeal to intellectuals” is “snobbery.  Left to themselves, people invariably display bad taste (a crime for which Lukacs, the Hungarian Marxist luminary who was also a murderer, thought they should be punished).  Therefore, they must not be left to themselves” (#3233).  Elite intellectuals must guide the masses, ruling (whether in economics or culture) by decrees!  “It didn’t take me long to conclude that communism was dismal, and that the words of Marx and Lenin betrayed an infinite contempt for men as they were, for their aspirations, their joys and sorrows, their inconsistencies, their innermost feelings, their achievements and failings.  Below the surface of their compassion for the poor seethed the molten lava of their hatred, which they had not enough self-knowledge to recognize.  I make no claim, therefore, to have travelled in a neutral frame of mind.  But neutrality is not a precondition of truth, which itself is not necessarily the mean between two extremes.  One does not expect neutrality of someone investigating Nazism, and would be appalled if he affected it;  why, then, expect it of someone investigating a different, but longer-lasting, evil?” (#198).  

Marx once wrote:  “Communism begins where atheism begins,” so when Dalrymple visited Albania and encountered an aggressively atheistic state he declared:  “Where religion is compulsory, I am an atheist; but where religion is forbidden, I am a believer.”  All public worship ceased in Albania in 1967, when churches and mosques were closed.  The regime was following Marx’s malice, implemented by Lenin, who declared:  “‘. . . any religious idea, any idea of god at all, any flirtation even with a god, is the most unutterable foulness . . .   It is the most dangerous foulness, the most shameful ‘infection’” (#226).  To gain access to the country Dalrymple joined a group of English tourists (most of them devout socialists), who were assigned a hotel in the nation’s capital, Tirana.  Walking about the city streets, he noted that “not even the firmest of Enver Hoxha’s partisans would maintain that Tirana is an exciting or vibrant city, but it is safe” (#392).  Safe, but dead!  As dead as the innumerable museums, inevitably devoted to the dictator Hoxha and the nation’s grandeur, which the tour guides insisted the group visit.  Thereafter, “the very idea of a museum induced in me a faint sensation of nausea – still I cannot enter one without being overcome momentarily by a feeling of profound gloom” (#437).  He left one of these museums, “this cathedral of untruth, with a strange knot in my stomach.  The idea that Hoxha should have gone to his grave triumphant filled me with rage.  I felt I should have screamed ‘Lies, lies, lies!’ and trampled on the red carpet leading to his statue, just to let everyone know that I, at any rate, did not acquiesce in this elevation of mendacity to the status of religion” (#556).  

Following his trip to Albania Dalrymple joined a British delegation attending the World Festival of Youth and Students in Pyongyang, North Korea, where “thousands of young people” assembled “to dance, sing and denounce the United States.  The festivals, which last two weeks, are the Olympics of propaganda” (#914).  Other than himself the delegation consisted of English socialists, most of whom identified with various victimized groups and effusively aired their grievances.  They’d locked into an identity “that obviated the need for consideration of others.  Persecution, real or imagined, was sufficient warrant for the rightness of their behaviour.  The trouble was, of course, that the majority of the delegates considered themselves persecuted, whether as women, members of splinter communist parties, vegetarians, homosexuals, Irish by descent, proletarians, immigrants, or any combination of these.  Hence almost everyone acted more-persecuted-than-thou” (#939).  Needless to say, Dalrymple found his companions difficult to stomach and often disgusting!  But in Korea they suddenly became a “people of consequence,” rightly esteemed for their “manifest talents” (#957).

The delegates were allowed to see nothing but what the regime prescribed.  Thus they could see modern highways without automobiles.  They were marched through museums celebrating the Great Leader.  But beneath the veneer of grandeur Dalrymple “rapidly became convinced – absolutely and unshakeably convinced – that one day stories would emerge from North Korea that would stun the world, of cruelties equal to or surpassing those of Kolyma and the White Sea Canal in Stalin’s time” (#1108).  He was in fact face-to-face with one of the most inhumane nations ever established.  In due time the delegates joined a great throng assembled in a stadium seating 150,000 to witness a parade of the representatives from the world’s nations.  In one section of the stadium 20,000 Korean children with colored cards created a variety of portraits and slogans.  The children, Dalrymple learned, had not attended school for six months in order to practice these routines daily!  Rather than being impressed Dalrymple was angered:  “Here was a perfect demonstration of Man as a means and not an end; of people as tiny cogs in an all-embracing machine” (#1308).  

When at last Kim Il Sung (“the Great Leader”) appeared “a kind of controlled pandemonium broke out instantaneously all around the stadium” (#1355).  Since only 15,000 of the attendees were foreigners, the stadium was basically packed with Koreans following orders, and Dalrymple “recalled a passage from Vaclav Havel:  ‘Each person somehow succumbs to a profane trivialisation of his or her inherent humanity . . .   In everyone there is some willingness to merge with the anonymous crowd and to flow comfortably along with it down the river of pseudo-life.  This is much more than a simple conflict between two identities.  It is something far worse: it is a challenge to the very notion of identity itself’” (#1318).  At that moment Dalrymple determined to remain seated, “even if I were to be threatened with torture or death itself.  I was so appalled by the sight and sound of 200,000 men and women worshipping a fellow mortal, totally abdicating their humanity, that I do not think I am exaggerating when I say I should rather have died than assent to this monstrous evil by standing (my mother was a refugee from Nazi Germany)” (#1361).  Clearly he had “glimpsed the terror that underscores the tombstone orderliness of North Korea” (#1503).  And he could hardly wait to escape the prison of Kim Il Sung’s Democratic People’s Republic of Korea.

Subsequently Dalrymple visited Romania, Vietnam, and Cuba.  Inevitably he found the same physical drabness and societal decay characteristic of socialist nations.  Once-magnificent cities such as Bucharest and Havana had decayed into ruins.  Havana, “the pearl of centuries of exploitation, is an inhabited ruin; the inhabitants are like a wandering tribe that has found the deserted metropolis of a dead civilisation and decided to make it home” (#3270).  Shortages of virtually everything weighed down the people.  Totalitarians inevitably impose shortages, for the “perpetual queuing for the bare necessities of life is the best guarantee against subversion” (#3414).  People were depressed.  Whenever he could talk privately with persons who knew they would not be reported to the authorities, he found deep dissatisfaction with anything socialistic.  He asked a Vietnamese man whether he thought Ho Chi Minh had deceived the people—and received an emphatic “Yes!”  The incessant state propaganda, the omnipresent quotations promoting either the Great Leader or his agenda, he came to see, were not designed to persuade but to humiliate.  “From this point of view, propaganda should not approximate to the truth as closely as possible:  on the contrary, it should do as much violence to it as possible.  For by endlessly asserting what is patently untrue, by making such untruth ubiquitous and unavoidable, and finally by insisting that everyone publicly acquiesce in it, the regime displays its power and reduces individuals to nullities” (#2041).  So too history must be destroyed.  “To put an end to the past: to begin again, the dream of adolescent revolutionaries everywhere” (#3990).  George “Orwell grasped intuitively but with astonishing precision the importance to a totalitarian regime of control over the past” (#2047).  So whether reporting the news or writing history, truth is irrelevant, for it “does not depend on correspondence to reality; it depends merely on who propounds it, and when” (#2093).  

In the book’s “Afterword,” Dalrymple scoffs at the utopian fantasies of socialists everywhere.  In fact:  “It was never a utopia, of course. The extraordinary deadness of communist countries, detectable even at their airports, is simply the deadness of communist prose transferred to life itself.  The schemes of communist dictators to reform the whole of humanity, to eradicate all vestiges of the past, to build a new world with no connection to the old, are not the whims of despots made mad by the exercise of arbitrary power, but the natural outcome of too credulous a belief in a philosophy which is simple, arrogant, vituperative and wrong.  When men reach power who believe that freedom is the recognition of necessity, is it any surprise that tyranny ensues?” (#4077). 

332 “Stripping the Altars”–The Anglican Reformation

Growing up in the Church of the Nazarene I learned we were Wesleyans—a theological position demonstrably different from both Catholicism and Calvinism.  In time I also learned that John Wesley was, throughout his life, a priest in the Church of England, so Nazarenes derive their heritage not from Luther and Calvin but from the church brought into being by King Henry VIII in the 1530s.  In graduate school I studied ancient and medieval history, but my knowledge of the English Reformation was largely derived from textbooks—and they generally cast a positive light on the English Reformation and its established Protestant church.  However, my understanding of that era was significantly challenged and changed by reading Eamon Duffy’s deeply-informative reassessment of Reformation historiography:   The Stripping of the Altars:  Traditional Religion in England 1400-1580 (New Haven:  Yale University Press, c. 1992).  Born in Ireland and a “cradle Catholic,” Duffy is a professor of history at Cambridge University, and he describes (drawing almost exclusively from primary sources) and illustrates (providing extensive photographs) the rich and vibrant religious life in late Medieval England before the Reformation—or what is more accurately labeled the “Anglican Schism.”  The bulk of the book is devoted to describing the laity’s religious life in the late Medieval period, whereas the final third of the book is devoted to the changes wrought in the church by Henry VIII and his children.  It is, as Jack Scarisbrick said:  “A mighty and momentous book . . . which re-orders one’s thinking about much of England’s religious past.”  

“It is the contention of the first part of the book,” Duffy writes, “that late medieval Catholicism exerted an enormously strong, diverse, and vigorous hold over the imagination and the loyalty of the people up to the very moment of Reformation.  Traditional religion had about it no particular marks of exhaustion or decay, and indeed in the whole host of ways, from the multiplication of vernacular religious books to adaptations within the national and regional cult of the saints, was showing itself well able to meet new needs and new conditions” (p. 4).  Documenting the various realms of religious activity—liturgical practices, mass attendance, seasonal feasts, pilgrimages, educational materials, devotional materials and practices, corporal and spiritual acts of mercy, etc.—Duffy shows how surprisingly literate and spiritually satisfied were these Medieval English believers.  

Consider, for example, the many “prayers of late medieval English men and women” which survive “in huge numbers, jotted in the margins or flyleaves of books, collected into professionally commissioned or home-made prayer-rolls, devotional manuals, and commonplace books, all gathered into the primers or Books of Hours (Horae) [scriptural prayer books], which by the eve of the Reformation were being produced in multiple editions in thousands, in formats ranging from the sumptuous to the skimpy, and varying in price from pounds to a few pence” (p. 209).  Remarkably, almost all of these primers were written in Latin, indicating how widely it was used and understood by large numbers of laymen.  Handwritten entries in these primers indicate “a minimal competence” in both English and Latin and show “a wide spectrum of lay people using and supplementing the Latin devotions of the primers with familiarity and freedom” (p. 225).  

This flourishing spirituality ended abruptly in the 1530s when the “violent disruption” of Henry’s Reformation (the “Henrician religious revolution” as Duffy terms it) quickly and effectively demolished “traditional religion” in England.  Though Henry VIII stoutly denounced Luther’s reformation in its first decade, retaining an allegiance to Catholic liturgy and (to a degree) Catholic doctrine, his determination to divorce his first wife (Catherine) and marry his mistress (Anne Boleyn) led him to create a new church, the “Church of England,” with himself as head.  From that position he appointed utterly amoral men, most notably Thomas Cromwell, to carry out his edicts, including the rapid dismantling of hundreds of monasteries that were a vital part of the Catholic world.  Monastic lands were then given to powerful nobles, supporters of the king, who thenceforth staunchly supported the revolution and its dividends.  Churches, too, were despoiled, losing great quantities of gold and silver reliquaries, jewels and tapestries—anything of monetary value.  Revealingly, when a devout man entered the despoiled shrine of Our Lady of Worcester, he lamented:  “‘Lady, art thou stripped now?  I have seen the day that as clean men hath been stripped at a pair of gallows as were they that stripped thee’” (p. 403).  Within a handful of years, Henry’s “stripping of the altars” eliminated 1,000 years of English piety and worship.  

With Cromwell’s “Injunctions” in 1536 and 1538, the radical dimensions of the Henrician revolution became clear.  Popular devotional practices, including processions, pilgrimages, lighting candles before saints’ statues, praying for the dead, and reciting the rosary, were outlawed.  Even the shrine of St Thomas Becket in Canterbury was pillaged and his bones scattered.  Pilloried as “a maintainer of the enormities of the Bishop of Rome, and a rebel against the King,” he was declared a persona non grata and his name was “to be erased from all liturgical books and his Office, antiphons, and collects to be said no more” (p. 412).  But then, in 1539, a scant five years after his divorce, the king paused the process and promulgated the Act of Six Articles, which “marked a decisive turning-point for the progress of radical Protestantism” (p. 424) as he tried to reverse some of Cromwell’s endeavors.  Indeed, Cromwell himself would soon fall from favor and lose his head to the busy executioner.  Many traditional ceremonies and devotional practices were restored to favor, and the “unauthorized reading of the scriptures”—especially Tyndale’s New Testament—was forbidden since it threatened to undermine royal authority.  

In the 1540s Thomas Cranmer replaced Thomas Cromwell as Henry’s chief overseer of the reformation.  Skillfully trimming to the wind, Cranmer managed to stay in the king’s good graces while continuing to implement certain aspects of his own radical Protestant agenda.  Whereas Cromwell had employed violence in the extreme, Cranmer (a gifted scholar) relied on education and ecclesiastical pressure, setting forth new primers for devotions and prescribing revised liturgies for the Church of England.  When Henry VIII died in 1547 his nine-year-old son, Edward VI, succeeded him.  Edward would be clay in the hands of a “Council” (powerful nobles who had been enriched by the dissolution of church properties), which supported Cranmer as he moved quickly to advance the reformers’ iconoclastic agenda—destroying, for example, images in stained glass church windows as well as statues in the walls.  Even images and pictures in private homes were outlawed.  He composed and issued a Book of Homilies and demanded that they be read every Sunday in every church.  In 1549 Cranmer issued a prayer-book which sought “to transform lay experience of the Mass, and in the process eliminated almost everything that had till then been central to lay Eucharistic piety” (p. 464).  In 1553 he issued his epochal Book of Common Prayer and made clear his intent “to break once and for all with the Catholic past, and to leave nothing in the official worship of the Church of England which could provide a toehold for traditional ways of thinking about the sacred” (p. 473).  

Young Edward died in 1553 and his half-sister Mary succeeded him.  As the daughter of Henry’s first wife, Catherine, she was quite like her mother—a kindly, devout Catholic.  And she sought to bring England back to the Catholic fold, a move which was widely welcomed throughout the countryside.  (Inasmuch as Duffy has written a treatise on her which I’ll review a bit later, I’ll not deal with the “Marian restoration” here.)  Following Mary’s death in 1558, her half-sister Elizabeth (Anne Boleyn’s daughter) succeeded her as Queen of the realm.  Since she was considered illegitimate by the Roman Catholic Church and thus ineligible to rule, Elizabeth naturally re-imposed a Henrician/Edwardian Protestantism upon England, issuing an Act of Uniformity in 1559 that abolished the Mass and required identical rites in all parishes.  She moved decisively to control all aspects of religious life.  

Throughout these tumultuous decades, Duffy says:  “The picture that emerges from them is unmistakably that of a slow and reluctant conformity imposed from above, with little or no evidence of popular enthusiasm for or commitment to the process of reform” (p. 573).  Many rebellions and widespread resistance showed the people’s attachment to the traditional religion, but the ruthlessness with which Henry, Edward, and Elizabeth responded finally established the new church throughout England.  By the 1570s it was clear that the Church of England had become not only the established church but the accepted authority now shaping religious life.  “But for most of the first Elizabethan adult generation,” Duffy concludes, “Reformation was a stripping away of familiar and beloved observances, the destruction of a vast and resonant world of symbols which, despite the denials of the proponents of the new Gospel, they both understood and controlled.  The people of Tudor England were, by and large, no spartans, no saints, but by the same token they were no reformers.  They knew themselves to be mercenary, worldly, weak, and they looked to religion, the old or the new, to pardon these vices, not to reform them” (p. 591).  

                                             * * * * * * * * * * * * * * * * * * * * * * *

More than a decade after publishing The Stripping of the Altars, wherein he noted that there was no reliable study of Mary Tudor, Eamon Duffy sought to rectify the deficit by writing Fires of Faith:  Catholic England under Mary Tudor (New Haven:  Yale University Press, c. 2013).  Rather recently “a good deal of scholarly work” has led historians to move away from the “Bloody Mary” epithet and judge her more positively.  In 1553 the deeply Catholic queen inherited a church deeply wounded by the “reforming” endeavors of her father and half-brother.  Their regimes “had bulldozed away centuries of devotional elaboration, and had stripped bare the cathedrals and parish churches of England.  The most devastating impact had probably been in music, since the heavy emphasis of reformed protestantism on the intelligibility of the written or spoken word in worship left no place for Latin word-setting and elaborate polyphony.  . . . .  But, after music, it was architecture and its attendant arts—paintings, statuary, stained glass—that suffered most.  Virtually all the altars had been pulled down, their consecrated table-slabs or mensal often deliberately broken up, or profaned by use” in various construction projects (pp. 3-4).  Desperate for money, Edward’s government had “carried through the largest government confiscation of local property in English history” (p. 4).  

Queen Mary and Cardinal Reginald Pole moved to undo all this and make England Catholic again.  Taking issue with many historians, Duffy insists their endeavors were (briefly) effective examples of the Counter-Reformation, restoring the ancient Catholic faith in many contested places in Europe.  Cardinal Pole, especially, had a vision and commitment that, had Mary reigned longer, might well have accomplished their goals.  Having played a prominent role in the initial sessions of the Council of Trent, Cardinal Pole returned to his native land following Mary’s accession to the throne.  And he was, Duffy says, “in charge” of the movement to make England Catholic again.  Addressing Parliament in 1554, Pole made a “remarkable speech” that endeavored to “reconcile England to the Holy See” (p. 43).  He condemned the overturning of traditional religion and the parallel erosion of civic justice, lamenting:  “‘Neither was any man so sure of his goodes and possessions, but he stood continually in abject danger and hazard of his life too’, and ‘the best sorte, and the most innocente’ had the most to fear” (p. 44).  To Duffy, “Pole’s long-pondered analysis of the English reformation, indeed of the whole sweep of English religious history, provided a rationale for theological renewal that was stark, clear and uncompromising, and that endorsed the conservative instincts of the majority of the population, while shaking itself free of the intellectual and moral compromises of Henry’s church” (p. 46).  Following Pole’s intellectual guidance, the clergy under Mary largely returned to the ancient Catholic traditions.  

They were aided by a number of pro-Catholic literary works that clearly emphasized “the real presence of Christ in the eucharist and the doctrine of the sacrifice of the Mass; the spiritual primacy of the pope; the antiquity, unity and holiness of the visible catholic church, embodied in European Christendom generally, and specifically in the restored church of England; the sole authority of the church to interpret scripture; the value of penance and good works for salvation; the freedom of the human will” (p. 62).  Conversely, they decried much of the Reformation, including:  “the novelty, contradictions and confusions of protestant teaching; the lust, licentiousness and avarice of its founding fathers, from Luther to Henry VIII; the arrogance and ignorance of rank-and-file protestant believers; the singularity and lack of charity in their withdrawal from the parish and its ceremonial round; and the wedge that protestantism drove between its followers and the rest of society” (p. 62).  

But the Marian counter-reformation entailed force as well as persuasion and led to the execution (mainly by burning) of 284 Protestants.  Given the prominence of this phenomenon in the public mind, Duffy devotes a significant, deeply-detailed section of his book to it.  These were the men and women celebrated in John Foxe’s famous and thoroughly polemical Actes and Monuments (popularly known as The Book of Martyrs), and they are the main reason for labeling the queen “Bloody Mary.”  In fact, her father, half-brother, and half-sister executed dissidents and “heretics” in equal numbers, for religious persecution was widely practiced in that era and was endorsed by eminent Protestants such as Thomas Cranmer and esteemed Catholics, including Cardinal Pole.  In fairness to Pole, he energetically tried to “convert rather than punish heretics.”  But when necessary, he thought capital punishment acceptable.  One of the most prominent Protestants, the Duke of Northumberland, who’d helped Edward VI pursue his radical agenda, renounced his earlier views on the scaffold and “attributed the ruin of England and his own corruption to the heresy into which the country had been led” for the past 16 years “by seditious preachers and teachers of new doctrine.”  He called on those present “‘to remember the ould learning’ and return to the faith and unity of the catholic church” (p. 88).  

Most of the “martyrs,” however, died courageously, staunchly defending their reformation views.  Bishops Ridley and Latimer were, in John Foxe’s view, the finest exemplars of their faith, with Latimer famously saying:  “‘Be of good comfort master Ridley, and play the man:  we shall this day light such a candle by Gods grace in England, as (I trust) shall never be put out’” (p. 155).  Ever the diligent researcher, Duffy concludes this speech was a “pure invention,” added by Foxe in the 1570 edition of his book.  But it certainly entered the hagiography of Reformation lore and, to a degree at least, certifies the courage and commitment of the dying faithful.  Less resolute than Ridley and Latimer, Archbishop Thomas Cranmer first renounced his reformed beliefs but then recovered them shortly before his execution, holding his hand in the fire to signify his repudiation of his earlier recantation.  

Within five years the Marian counter-reformation, by mixing persuasion and force, had largely succeeded and was widely embraced throughout the realm; indeed, Duffy contends that “the spirit of the counter-reformation was in fact alive and well in Marian England” (p. 190).  Had Mary lived longer, England might very well have returned to the Catholic fold.  Though England ultimately remained in the Reformation world, Duffy thinks the work of Queen Mary and Cardinal Pole laid out the agenda for a counter-reformation that famously succeeded in many European lands.  Their Catholic “reform programme, embodied in the published acts of Pole’s synod, would help shape one of Trent’s most momentous innovations—the seminary.  The revived populism that was Pole’s legacy was the inspiration for what can fairly be described as the heroic stand made by those most unexpected heroes, the bishops and dignitaries of the English church.  Marian catholicism inspired the generation of ardent activists who would provide Elizabethan catholicism with its core convictions, its best writers, its most characteristic institutions and its martyrs.  It set adrift in mainland Europe a diaspora of talented academics and administrators whose interest and convictions merged seamlessly into those wide movements for reform that we call the counter-reformation, and who would themselves contribute to its creative ferment.  The Latin term ‘Inventio’ is a very rich one:  it carries the meanings to devise or create, as well as to find or discover.  In both senses, the Marian church ‘invented’ the counter-reformation” (p. 207).  

                                      * * * * * * * * * * * * * * * * * * * * * * *

Historians, whenever possible, endeavor to thoroughly research “primary sources” when probing the past.  Providing much valuable first-hand information on the English Reformation is Nicolas Sander’s The Rise and Growth of the Anglican Schism (Rockford, IL:  Tan Books and Publishers, c. 1988), first published in Latin in 1588.  The book’s English editor says it is “the earliest and most trustworthy account which we possess of the great changes in Church and State that were wrought in the reign of Henry VIII,” and it became the primary source for Roman Catholic historians writing about the period.  An English priest, born to a distinguished family, and educated at Oxford during King Edward VI’s reign, Sander emerged under Queen Mary as an expert in canon law and enjoyed the respect and support of the Catholics trying to reclaim England for the ancient faith.  When Elizabeth came to power he sought refuge on the Continent and spent most of his remaining days in exile, working to reestablish the Catholic faith in his native land.  

He began his treatise by noting that “The Britons are said to have been first converted to the faith of Christ by Joseph of Arimathia, then confirmed therein by Eleutherus, the Roman Pontiff” in the second century.  Thus for well over a thousand years “none other than the Roman Catholic faith prevailed in England” before King Henry VIII established his own Church with himself as a Protestant Pope.  Marrying his deceased brother Arthur’s wife Catherine in 1509, Henry wed a saintly woman considerably older than himself who bore him several children, including Mary, but no boys.  Unlike his godly wife, Henry disdained chastity and soon “was giving the reins to his evil desires, and living in sin, sometimes with two, sometimes three of the queen’s maids” (p. 8).  After two decades, when Catherine had failed to provide him a male heir, he determined to divorce her and marry Anne Boleyn.  

Many pages of the book are devoted to describing the assorted maneuvers Henry launched to get papal approval to divorce Catherine, and his associates realized that he was prepared to “renounce the faith together with his wife, rather than live without Anne Boleyn” (p. 50).  He offered bribes to distinguished professors in various universities to support his cause but found most of them upholding the validity of his first marriage—though in Germany Luther’s associate, Philip Melanchthon, simply urged Henry to stay married to Catherine and “treat Anne Boleyn as a concubine” (p. 84).  Then Thomas Boleyn suggested the king ask the Pope to make his chaplain, Thomas Cranmer, Archbishop of Canterbury, for “he will do whatever may be asked or even desired, for any subject” (p. 87).  When eminent counselors, including Bishop John Fisher and the Lord Chancellor Thomas More, refused to accept his break with the Catholic Church, they were imprisoned and executed.  The swordsman beheading More, Sander said, “struck off the head of justice, of truth, and of goodness” (p. 126).  

Though far more polemical than Eamon Duffy’s historical works, Sander anticipated (by 400 years) his conclusions:  the Anglican Schism not only birthed a new Protestant denomination but in the process destroyed a vibrant religious society, leading to a land despoiled of its cultural legacy and charitable economic structures—described in elaborate, statistical detail by a celebrated modern historian, W.G. Hoskins, in The Age of Plunder:  The England of Henry VIII, 1500-1547.

331 Heather Mac Donald

During the past several decades no journalist, said George Will, “has produced a body of work matching that of Heather Mac Donald.”  With degrees in literature from Yale and Cambridge universities, plus a law degree from Stanford, she brings unique credentials and scholarly depth to her essays (generally dealing with poverty and education) published in New York’s City Journal.  She also has a rare quality in today’s journalists—courage!  She seeks to uncover and disclose truths in America the ruling elite find unpalatable.  Thus, when she published The War on Cops:  How the New Attack on Law and Order Makes Everyone Less Safe (New York:  Encounter Books, c. 2016; Kindle), she became a regular target for leftist anger.  Reading her treatise in the light of riots and destruction in the summer of 2020, moreover, reveals how presciently she read the signs of the times, for she looked at crime in the streets as more than a simple criminal matter.  Murders and mayhem certainly do much harm and take thousands of lives, but “it is not, in itself, the greatest danger in today’s war on cops.  The greatest danger lies, rather, in the delegitimization of law and order itself” (#120).

For 20 years, following 1994, city mayors and police would generally follow New York City Mayor Rudolph Giuliani’s prescriptions and “crime would fall 50 percent nationwide, revitalizing cities across the country” (#81).  Cops actively engaged in “Broken Windows” policing—stopping criminals engaged in misdemeanors before they moved on to felonies.  But by 2016 things had changed and crime was “shooting up in cities across the United States.  Homicides in the country’s 50 largest cities rose nearly 17 percent in 2015, the greatest surge in fatal violence in a quarter-century” (#57).  Under President Barack Obama—who campaigned promising “change” and “repeatedly charged that the criminal-justice system treats blacks differently from whites” (#93)—one of the most dramatic changes was in crime.  “Fueling the rise in crime in places like Baltimore and Milwaukee is a multi-pronged attack on law enforcement.  Since late summer 2014, a protest movement known as Black Lives Matter (a fraudulent, thuggish organization in Mac Donald’s judgment) has convulsed the nation.  Triggered by a series of highly publicized deaths of black males at the hands of the police, the Black Lives Matter movement holds that police officers are the greatest threat facing young black men today.  That belief has spawned riots, ‘die-ins,’ and the assassination of police officers.  The movement’s targets include Broken Windows policing and the practice of stopping and questioning suspicious individuals, both of which are said to harass blacks” (#89).

Sensitive to media-fueled criticism, inner-city police understandably did less policing.  Arrests plummeted.  And as darkness follows dusk “a bloodbath ensued, and its victims were virtually all black.  When the cops back off, blacks pay the greatest price.  That truth would have come as no surprise to the legions of inner-city residents who fervently support the police and whose voices are almost never heard in the media” (#106).  The virulent anti-cop movement gained impetus from the August 2014 police shooting of Michael Brown in Ferguson, Missouri.  A white police officer, Darren Wilson, shot an 18-year-old black man—supposedly a “gentle giant” who had his hands raised, saying “Hands Up, Don’t Shoot,” when he was gunned down in cold blood.  Soon thereafter, rioters burned and looted Ferguson businesses.  When a grand jury exonerated the policeman, more riots erupted, and “Black Lives Matter protests grew ever more virulent as a second myth took hold:  that the American criminal-justice system is rigged against blacks” (#137).

Promoting this myth—and while looters were ravaging Ferguson—President Obama “betrayed the nation” by condemning the grand jury’s failure to indict Darren Wilson.  “Obama had one job and one job only in his address that day:  to defend the workings of the criminal-justice system and the rule of law.  Instead, he turned his talk into a primer on police racism and criminal-justice bias.  In so doing, he perverted his role as the leader of all Americans and as the country’s most visible symbol of the primacy of the law” (#154).  The president “left no doubt that he believed the narrative of the mainstream media and race activists about Ferguson.  That narrative held that the shooting of Brown was a symbol of nationwide police misbehavior and that the August riots were an ‘understandable’ reaction to widespread societal injustice” (#178).  He and his Attorney General Eric Holder toured the country reciting this incendiary litany.  This narrative has absolutely no factual basis, but that deterred neither the president nor the press.  

Soon after Obama spoke, the New York Times pontificated on the Ferguson riots.  To Mac Donald, “A more perfect example of what the late Daniel Patrick Moynihan called ‘defining deviancy down’ would be hard to find.”  Revealingly:  “The Times could not bring itself to say one word of condemnation against the savages who self-indulgently destroyed the livelihoods of struggling entrepreneurs and their employees in Ferguson, Missouri” (#260).  Blaming the grand jury for failing to indict the policeman, the Times proceeded to assert “that ‘the killing of young black men by police is a common feature of African-American life and a source of dread for black parents from coast to coast.’  A ‘common feature’?” Mac Donald asks.  In fact:  “This is pure hysteria” promoted by “the media frenzy that follows every such police killing, rare as they are, compared with the silence that greets the daily homicides committed by blacks against other blacks” (#305).  Indeed, only a handful of unarmed blacks are annually killed by police—about half the number of whites!  “Blacks made up 60.5 percent of all murder arrests in Missouri in 2012 and 58 percent of all robbery arrests, though they are less than 12 percent of the state’s population.  Such vast disparities are found in every city and state in the country” (#485).  Unfortunately for this nation’s well-being, “America’s elites have talked feverishly about police racism in order to avoid talking about black crime” (#532).

In time the Justice Department issued an official report on the Ferguson killing, “eviscerating virtually every aspect of the pro-Brown, anti-Wilson narrative,” and demolishing “the incendiary story that had fueled the riots in Ferguson, Missouri—that a teenaged “gentle giant” was gunned down by a trigger-happy cop who feared black people—and made it clear why the department would not be bringing civil rights charges against Officer Darren Wilson” (#378).  The report also explained that Brown’s body was left on the site for four hours because the police wanted to carefully examine the evidence and were hindered by protesters chanting “Kill the police.”  (This became a theme song for Black Lives Matter, whose members chanted while marching in protests:  “What do we want?  Dead cops.”)  But the report was largely ignored by our politicians and journalists, who were determined to push the anti-police narrative, a “lie” that flooded much of “the country and grew into a kind of mass hysteria.  That lie holds that the police pose a mortal threat to black Americans—indeed, that the police are the greatest threat facing black Americans today.  Several subsidiary untruths buttress that central myth:  that the criminal-justice system is biased against blacks; that there is no such thing as a black underclass; and that crime rates are comparable between blacks and whites, so that disproportionate police action in minority neighborhoods cannot be explained without reference to racism” (#628).

The riots in Ferguson were followed by riots in Baltimore and other cities.  The pattern was set.  And as a result, Mac Donald believes, our legal system has begun to “fray.”  Police officers—illustrating the “Ferguson effect”—are less willing to confront lawbreakers lest they be accused of “racial profiling.”  The twenty-year decline in crime has been reversed as violent crimes have surged.  “There are signs that the legal order itself is breaking down in urban areas.  ‘There’s a total lack of respect out there for the police,’ says a female sergeant in New York.  ‘The perps feel more empowered to carry guns because they know that we are running scared.’  The lawful use of police power is being met by hostility and violence, which is often ignored by the press” (#1033).  When then-FBI Director Jim Comey admitted the evidence substantiated this, President Obama charged him with “shoddy, biased analysis.”  “We do have to stick with the facts,” Obama said, but “what we can’t do is cherry-pick data or use anecdotal evidence to drive policy or to feed political agendas.”  Mac Donald replies:  “The idea that Obama knows more about crime patterns and policing than the FBI director is ludicrous; the one with a ‘political agenda’ is Obama, who has spent the last two years disseminating the dangerous lie that the criminal-justice system is racially biased” (#1092).

In the book’s final section, Mac Donald turns to analyzing some of the fundamental realities fomenting crime.  Unsurprisingly:  “A straight line can be drawn between family breakdown and youth violence.  In Chicago’s poor black neighborhoods, criminal activity among the young has reached epidemic proportions.  It’s a problem that no one, including the Chicago Police Department, seems able to solve.  About 80 percent of black children in Chicago are born to single mothers.  They grow up in a world where marriage is virtually unheard of and where no one expects a man to stick around and help raise a child” (#1896).  For four years Barack Obama worked as a “community organizer” on Chicago’s South Side, promoting Saul Alinsky’s agenda of “change” and creating “mass organizations to seize power and give it to the people.”  As president, Obama routinely mouthed “Alinskyite bromides about school spending, preschool programs, visiting nurses, global warming, sexism, racial division, and income inequality” (#2177).  Throughout his years as an organizer, Obama ignored “the disappearance of the black two-parent family,” illustrating a “myopia” that “continues today, guaranteeing that the response to Chicago’s current youth violence will prove as useless as Obama’s activities were a generation ago” (#1911).  Various governmental initiatives have sought to deal with Chicago’s children, spending billions of dollars without demonstrable effect.  If these programs could have compensated “for the absence of fathers,” Mac Donald thinks, “the black violence problem would have ended years ago” (#2057).  Yet:  “The official silence about illegitimacy and its relation to youth violence remains as carefully preserved in today’s Chicago as it was during Obama’s organizing time there” (#2144).  

Though published four years ago, The War on Cops could easily have been published in 2020.  Urban details have changed—Minneapolis instead of Ferguson, LA instead of Baltimore—but the issues remain much the same.  

                                        * * * * * * * * * * * * * * * * * * * * * * * * * * 

For an explanation of the anti-cop rioting in American cities, an examination of American universities provides plenteous clues.  In The Diversity Delusion:  How Race and Gender Pandering Corrupt the Universities and Undermine Our Culture (New York:  St. Martin’s Press, c. 2018), Heather Mac Donald begins by noting that English majors in our universities no longer study Chaucer, Spenser, Shakespeare and Milton because they might offend “students of color.”  The dismantling of the traditional canon gained currency, in a dramatic fashion, when Jesse Jackson led Stanford students chanting “Hey, hey, ho, ho, Western Civ has got to go.”  And it is largely gone!  Zealously seeking victim-status, students now demand “safe spaces” where they will suffer no racial or sexual micro-aggressions.  They reveal the changing face of higher education, wherein “human beings are defined by their skin color, sex, and sexual preference; that discrimination based on those characteristics has been the driving force in Western civilization; and that America remains a profoundly bigoted place, where heterosexual white males continue to deny opportunity to everyone else” (p. 2).  UCLA English majors no longer study classic writers, but are required to take courses in “Gender, Race, Ethnicity, Disability, and Sexuality Studies; Imperial, Transnational, and Postcolonial Studies; genre studies, interdisciplinary studies, and critical theory; or creative writing.  In other words, the UCLA faculty was now officially indifferent as to whether an English major had ever read a word of Chaucer, Milton, or Shakespeare, but was determined to expose students, according to the course catalog, to ‘alternative rubrics of gender, sexuality, race, and class’” (p. 211).  

A primary plank in this endeavor is “affirmative action,” eminently evident in California.  Though a 1996 initiative supposedly made it illegal, the state’s elites found clever ways to circumvent it under the umbrella of “diversity,” an ideology which routinely trumps the law.  Admitting blacks and Hispanics to the state’s elite universities despite their poor qualifications reveals among administrators a bigotry of low expectations as pernicious as that of Southerners before the civil rights movement.  They “relied on wildly unequal double standards to achieve its smattering of ‘underrepresented minorities,’ especially at Berkeley and UCLA, the most competitive campuses.  The median SAT score of blacks and Hispanics in Berkeley’s liberal arts programs was 250 points lower (on a 1600-point scale) than that of whites and Asians.  This test-score gap was hard to miss in the classroom.  Renowned Berkeley philosophy professor John Searle, who judges affirmative action ‘a disaster,’ recounted that ‘they admitted people who could barely read’” (p. 38).  In 2002 UC Berkeley admitted 374 applicants “with SATs under 1000—almost all of them ‘students of color’—while rejecting 3,218 applicants with scores above 1400” (p. 45).  Such admitted students, as one would imagine, rarely survived the rigors of the university and routinely dropped out.  But the elites in the system cared not for graduation rates—only “diversity” in admissions counts! 

Equally harmful is the “micro-aggression farce” making university life fearful.  Casual comments in class discussions easily lead to accusations of racism or sexism or whatever “ism” you fancy.  Even demanding that students write grammatical English may elicit protests.  One Teaching Assistant said:  “‘Asking for better grammar is inflammatory in the school.  You have to give an A or you’re a racist’” (p. 66).  A UCLA law professor arranged a softball game for his students, who decided to get T-shirts with whimsical lettering.  Minority students, however, discerned a covert “white privilege” racial message and claimed to feel “triggered” by the shirts as well as traumatized by some “‘racist/classist/sexist comments made inside and outside of the classroom’” (p. 72).  Rather than defend the eminently defensible professor, administrators equivocated and appeased the protestors, making life miserable for a highly esteemed scholar.  And it is not only UCLA!  Mac Donald provides persuasive examples from a variety of places to show how micro-aggressions harm university education.  

Turning from race to gender, Mac Donald shows the great harm being done to universities by radical feminist ideology.  For example, contrary to the “rape-culture” atmosphere feminists lament, actual interviews revealed that when asked if they’d been raped “very few women” assented.  In one notorious incident at Columbia University, the “victim” took “six months to decide that she had been raped” (p. 145).  Few campus “rapes” are reported to the local police, “because the accuser and her counselors know that most cases wouldn’t have a chance in court” (p. 146).  What’s actually harming women, unfortunately, is the “hook-up” culture spawned by feminists themselves.  “While there are thankfully few actual rape victims on college campuses, there are thousands of girls feeling taken advantage of by partners who walk away from casual sex with no apparent sense of thwarted attachment” (p. 145).  Yet the “rape culture” has migrated from the university to the workplace, styling itself as the “Me Too” movement, egregiously evident in the Justice Brett Kavanaugh hearings.  

After looking at the devastation demonstrably evident on university campuses, Mac Donald concludes by recommending alternative forms of education, such as the phenomenally successful “Great Courses.”  She pleads for a return to traditional liberal arts studies and responsible campus behavior.  Given all the evidence she presents in her essays, however, the university (or at least the elite university) seems almost ruined beyond redemption.  

                                           * * * * * * * * * * * * * * * * * * * * * * * *

Twenty years ago Heather Mac Donald collected a series of essays in The Burden of Bad Ideas:  How Modern Intellectuals Misshape Our Society (Chicago:  Ivan R. Dee, c. 2000).  Therein she documented the harm done to the recipients of social engineering.   “These essays record,” Mac Donald said, “my travels through institutions that have been perverted by today’s elite intellectual orthodoxy, from an inner city high school that teaches graffiti-writing for academic credit . . . to the Smithsonian Institution, now in thrall to a crude academic multiculturalism; from New York’s Dantean foster care system to Ivy League law schools that produce ‘scholarship’ urging blacks to view shoplifting, and pilfering from an employer, as political expression” (p. xi).  

In “The Billions of Dollars That Made Things Worse” Mac Donald explored the impact of philanthropic foundations such as Carnegie and Ford which long ago abandoned their founders’ aspirations (e.g. Carnegie libraries) and now see themselves as agents of social change, funding radical “community activists” around the country, seeking to transform “a deeply flawed American society” (p. 4).  “When,” for example, “McGeorge Bundy, former White House national security advisor, became Ford’s president in 1966, the foundation’s activism switched into high gear.  Bundy reallocated Ford’s resources from education to minority rights” and “created a host of new advocacy groups, such as the Mexican-American Legal Defense and Educational Fund” and “the Native American Rights Fund, that still wreak havoc on public policy today” (p. 9).  These foundations have routinely provided the funds to establish social justice centers on university campuses devoted to race, class, and gender.  They also have subsidized public interest litigation, enabling legions of lawyers to push for bilingual education, voter rights, racial quotas, sexual equality, prisoners’ rights, etc., all designed to “establish in court rights that democratically elected legislatures have rejected” (p. 20).  No one should be surprised that the Ford Foundation recently gave $100 million to Black Lives Matter, giving it ample funds with which to destabilize our republic.  

Paralleling the changes in powerful foundations have come similar changes in powerful media, preeminently evident in the New York Times.  Whereas the paper Adolph Ochs bought in 1896 was devoted to sound money, low taxes, and “‘no more government than is absolutely necessary to protect society, maintain individual and vested rights, and assure the free exercise of a sound conscience’” (p. 39), a century later it championed precisely the opposite positions.  Charting the ways poverty has been portrayed in the Times, Mac Donald shows how appeals for individual charity early in the 20th century shifted to demands for an ever-expanding welfare state.  With the passing decades, “elite opinion came to see the cause of poverty not in individual character and behavior but in vast, impersonal social and economic forces that supposedly determined individual fate” (p. 26).  No longer were individuals (including the poor) held accountable to moral standards, which were discarded in favor of a psychoanalytic model.  Distinctions between the “undeserving” and “deserving” poor disappeared from the pages of the Times.  Bad luck rather than bad character explained the plight of the city’s burgeoning welfare recipients. 

The varied titles of the essays indicate the scope of Mac Donald’s authorial lens, and she successfully pillories many of the conventional liberal ideas that so shape public policy not only in New York but throughout the country.  Refuting the “bad ideas” of the intelligentsia are the realities of a world wherein three things seem clear.  “First was the depth of the dysfunction that I often saw—the self-destruction wrought by drugs and alcohol and promiscuity, the damage inflicted on children by a world from which the traditional family had largely disappeared (though throughout the most troubled neighborhoods I found individuals of extraordinary moral strength fighting for order).  Second was the extent to which government programs shaped life in the ghetto, influencing the choices that individuals made and distorting the forms the social interaction took.  Finally, I was continually amazed by the trenchancy with which those I interviewed could judge their situations and the policies that had gone into making them.  If you want to know how well social policies are working, I learned, ask the poor—when their advocates weren’t around” (pp. vii-viii).

330 Reclaiming Common Sense

Dr. Ben Carson, the distinguished Secretary of Housing and Urban Development, recently sought to calmly assess the COVID-19 pandemic and the equally virulent panic paralyzing the nation.  Concluding his remarks, he said we really need some “common sense” to deal with the crisis.  In the light of his Christian faith, he probably shares the view of G.K. Chesterton, who (in one of his Father Brown stories, “The Oracle of the Dog”) said:  “the first effect of not believing in God is that you lose your common sense.”  The need for such common sense is urged in Reclaiming Common Sense:  Finding Truth in a Post-Truth World (New York:  Encounter Books, c. 2019), by Robert Curry, who seeks to follow the example of George Orwell, of whom Lionel Trilling said:  “The medium of his thought is common sense, and his commitment to intellect is fortified by an old-fashioned faith that the truth can be got at, that we can, if we actually want to, see the object as it really is.”  Curry wants to introduce Americans to a way of thinking that was quite common a century ago, for it was “the coin of the realm in American thought.”

Unfortunately, such common sense realism has been largely supplanted by ideologies (e.g. Romanticism or Progressivism or Freudianism or Postmodernism or Transgenderism) of various sorts.   For example, powerful elites in America deny the self-evident differences between men and women.  “Today, academic and cultural elites as well as government officials insist that ‘gender identity’ is more real than biology” (p. 23).  Thus they claim to discern 63 or more “genders,” and we’re asked to embrace the dogma that “marriage” can describe unions of alternative sorts.  As Bruce Fleming says:  “‘The dogma of the intellectual upper classes today is a bedrock belief in what I call “linguistic realism”  . . . .  If I say I am a woman, I am a woman, whatever others think.  If I say I feel myself to be oppressed, I am.  If I say that I was the victim of what we call sexual assault, I am—even if a court later decides there was no assault and hence no victim’” (p. 95).  To have asked a farmer in Kansas in 1890 about such “genders” or “assaults” would have elicited from him, most probably, sheer speechlessness!  One could not even imagine such silliness.  This farmer’s reaction would be simple common sense!  

The Kansas farmer would not likely have attended school very long, but such schooling would have been rooted in a common sense philosophy then widely embraced by the schools.   Common sense was, the historian Arthur Herman said, “virtually the official creed of the American Republic.”  This faculty, he says, “‘belongs to everyone, rich or poor, educated or uneducated; indeed, we exercise it every day in hundreds of ways.’”  It’s not infallible, of course, but many things are simply self-evident—“‘the existence of the real world and basic moral truths’” that “‘are no sooner understood than they are believed’” because they “‘carry the light of truth itself’” (p. 27).  In leading universities, such as Princeton, this “common sense realism” held sway for much of the 19th century.  It was deeply shaped by an important 18th century Scottish philosopher, Thomas Reid, a Professor of Moral Philosophy at the University of Glasgow, who said:  “‘If there are certain principles, as I think there are, which the constitution of our nature leads us to believe, and which we are under a necessity to take for granted in the common concerns of life, without being able to give a reason for them; these are what we call the principles of common sense; and what is manifestly contrary to them, is what we call absurd’”  (p. 29).

Curry seeks, in this treatise, to simply explain and defend the thought of Thomas Reid.  So doing he rejects many modern philosophers, starting with Rene Descartes, who doubted everything other than their own subjective selves.  Descartes, the “father of modern philosophy,” famously declared “Cogito ergo sum—I think, therefore I am.”  He began with himself and erected a philosophical system, setting forth an approach largely followed by hundreds of other less astute thinkers.  But Reid and his epigones thought we are equally, if not indeed more, certain that the world and other people really exist.   “There is no need to prove that the world and other people exist, just as there is no need to prove that tables can’t sing arias or that carafes cannot at the same time be women.  Stated, put into language, they are self-evident truths, which cannot and need not be proved.”  And these are the truths we must assert.

                                    * * * * * * * * * * * * * * * * * * * * * *

The United States was founded, Robert Curry believes, by thoughtful men who took their bearings from the Scottish Common Sense tradition (“one of the most remarkable developments in the history of the world”), and he defends this thesis in Common Sense Nation: Unlocking the Forgotten Power of the American Idea (New York:   Encounter Books, c. 2015; Kindle Edition).  Bearing witness to this position he cites Alexander Hamilton, who said:  “The sacred rights of mankind are . . . written, as with a sunbeam, in the whole volume of human nature, by the hand of Divinity itself, and can never be erased or obscured by mortal power.”  His frequent adversary, Thomas Jefferson, put the same truth more famously:  “We hold these Truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”  Trusting the perspicacity of such Founders, Curry dedicates his book “to the proposition that we need to understand the language of the Founders if we want to understand the ideas of the Founders.  It will also tell the story of the systematic effort to bury the ideas of the Founders ” (p. xvii).

The Founders’ ideas were grounded in those self-evident truths they considered foundational.  They thereby embraced the Scottish common sense philosophy shaped by eminent thinkers such as Thomas Reid and Adam Smith.  It came to America via emigrating scholars (preeminently clergymen such as John Witherspoon) and young Americans such as Benjamin Rush who studied in Scotland—major figures in the “American Enlightenment,” which was something quite different from the French Enlightenment.   Consequently, historian Allen Guelzo says:  “‘Before the Civil War, every major [American] collegiate intellectual was a disciple of Scottish common sense realism’” (p. 7).  The Founders certainly cited Montesquieu and John Locke, but it was the Scots who showed them ways “to fashion government along unprecedented lines—and to find a hitherto undreamed of way to realize Locke’s revolutionary claim that the supreme political power in every commonwealth is the people.  When it came time to lay the foundation for the new nation and its government, the Founders went to work thoroughly grounded in the philosophical arguments the Scots advanced.  It was those arguments that showed the Founders a way forward. It enabled them to go beyond the idea of a monarchy with its power somewhat limited by a Bill of Rights, and to make the American experiment in government by, for, and of the people” (p. 14).

They did so, in part, because the Founders were largely educated by Scots such as William Small, “by far the most brilliant member of the faculty at William and Mary,” who taught Jefferson.  At Princeton Madison was quite influenced by President John Witherspoon, who had studied with Adam Smith and Thomas Reid in Scotland.  Witherspoon himself would sign the Declaration of Independence, and his influence was prodigious, for his “students by one count included, among many others, five delegates to the Constitutional Convention, twenty-eight U.S. senators, forty-nine U.S. representatives, twelve governors, three Supreme Court Justices, eight U.S. district judges, three attorneys general, and many members of state constitutional conventions and state ratifying conventions.  Is it any wonder that the ideas and arguments of Reid and Smith and their Scottish colleagues are everywhere in the writings of the Founders?  Witherspoon’s course in moral philosophy, which he dictated year after year in largely unchanging form and which his students copied down faithfully, is almost certainly the most influential single college course in America’s history” (p. 19).  Hamilton was tutored at King’s College (now Columbia) by Robert Harpur, who had also studied at Glasgow, absorbing the Scottish Common Sense perspective.  “The ideas of the Scottish Enlightenment were studied and hotly debated just about everywhere in colonial America.”  Throughout the land, in all the colonial colleges, says Douglass Adair, “‘the young men who rode off to war in 1776 had been trained in the texts of Scottish social science’” (pp. 16-18).

Importantly, Common Sense thinkers believed we have an innate “moral sense” providing important ethical precepts.  There are some things we can’t not know!  Recently advocating this perspective, Aleksandr Solzhenitsyn said: “We are born with a sense of justice in our souls; we can’t and don’t want to live without it!”  Similarly, in 1787, writing to his nephew, Thomas Jefferson said:  “Man was destined for society . . . He was endowed with a sense of right and wrong merely relative to this.  This sense is as much a part of his nature as the sense of hearing, seeing, feeling; it is the true foundation of morality . . . The moral sense, or conscience, is as much a part of a man as his leg or arm.  It is given to all human beings in a stronger or weaker degree . . . It may be strengthened by exercise, as may any particular limb of the body.”  Twenty-eight years later he said virtually the same thing in a letter to John Adams.  These words clearly reflect the views of Francis Hutcheson, the leading spokesman for Scottish moral sense philosophy.

Textbook treatments of the Declaration of Independence routinely credit John Locke for the views set forth by Jefferson.  Curry, however, wants us to see it in the light of Scottish thinkers who constantly critiqued Locke.  Consider the famous “self-evident truths” Jefferson cited.  In an 1825 letter to Henry Lee, he said that while writing the Declaration he sought:  “Not to find out new principles, or new arguments, never before thought of . . . but to place before mankind the common sense of the subject . . . it was intended to be an expression of the American mind.”  This is not a Lockean notion!  Rather, it was Thomas Reid who had “made self-evident truths the foundation of his philosophy, the philosophy of common sense realism.  Reidian common sense is the human faculty by means of which we can grasp self-evident truths. It is a power like Hutcheson’s moral sense or the sense of sight or of hearing.  Therefore, common sense is the power that makes human understanding possible” (p. 55).  Reid at times equated “self-evident truths” with “first principles,” which are “‘propositions which are no sooner understood than they are believed . . . [having] the light of truth in itself.’”  Similarly, Alexander Hamilton said, in Federalist 31, that “there are certain primary truths, or first principles, upon which all subsequent reasonings must depend.  These contain an internal evidence which, antecedent to all reflection or combination, commands the assent of the mind.”  Saying so, Hamilton could easily have been quoting Reid!

After analyzing the Declaration of Independence Curry turns to the Constitution, and its defense in the Federalist Papers, which obviously leads to citing their primary authors, James Madison and Alexander Hamilton.  Unlike the utopian romantics, such as Rousseau, who helped shape the French Revolution, American revolutionaries took a realistic approach to human nature and politics.  In important ways their views resembled Scottish Presbyterianism; indeed King George III called the Americans’ revolt “a Presbyterian Rebellion.”  The church in Scotland, with its uniquely representative form of government, illustrated the popular sovereignty invoked by the Founders.  At the Constitutional Convention, Madison’s “Virginia Plan” closely resembled the Presbyterian system.  And so did Madison’s understanding of human nature.  Thus:  “Madison was fighting for a radical re-conception of the relationship of mankind and the state” based upon natural rights, given by God and not the state, nor by any “contract” established by earlier generations.  “‘The rights were there all along.’  That is to say, our rights are inherent, part of our nature as human beings, unalienable.  In order to understand the Founders, we need to recognize their intent:  to design America’s government guided by this new understanding of the nature of our rights, and, insofar as possible, to design government so as to protect and preserve those rights” (p. 85).  

The Founders’ common sense philosophy, however, was abandoned by the Progressives who gained power at the beginning of the 20th century.  Emblematic of this change is Woodrow Wilson, who embraced an evolutionary worldview that justified replacing the written Constitution with constantly changing edicts and laws designed to meet current demands.   He scoffed at any notions of “self-evident” truths or “inalienable rights” as fantasies of earlier times.  Wilson embraced the philosophy of G. W. F. Hegel, which was in vogue when he went to graduate school at Johns Hopkins, where virtually all the professors had secured their Ph.D.s in Germany.  Hegel celebrated the powerful state rather than personal freedom.  “For Hegel, the movement of the state through time was the ‘march of God on earth’” (p. 164).  Successive progressive presidents, including Teddy Roosevelt, Franklin D. Roosevelt and Lyndon B. Johnson, followed Wilson in rejecting “the Constitutional safeguards of individual liberty in favor of the government’s ability to bring about social change,” favoring “an ever expanding and activist role for government in society, such as government control of health care, government intervention in the economy and so on” (p. 150).

* * * * * * * * * * * * * * * * * * * * * * *

In my many years teaching introductory philosophy classes I never discussed Thomas Reid.  And I’d never read any of his books.  Recently prompted by Robert Curry’s discussion of common sense philosophy I secured and read Reid’s An Inquiry into the Human Mind, on the Principles of Common Sense (Kindle Edition).  He begins with a simple declaration:  “Wise men now agree, or ought to agree, in this, that there is but one way to the knowledge of nature’s works—the way of observation and experiment” (#65).  Throughout history man has reasoned from observable events to their explanations, employing “the same method by which Newton discovered the law of gravitation and the properties of light.  His regulæ philosophandi are maxims of common sense, and are practiced every day in common life; and he who philosophizes by other rules, either concerning the material system or concerning the mind, mistakes his aim.  Conjectures and theories are the creatures of men, and will always be found very unlike the creatures of God.  If we would know the works of God, we must consult themselves with attention and humility, without daring to add anything of ours to what they declare.”   

Descartes, uttering his famous Cogito ergo sum, “resolved not to believe his own existence till he should be able to give a good reason for it.”   If, in fact, he could actually doubt his own existence, Reid thought, “his case would have been deplorable,” for anyone who “disbelieves his own existence” is as deranged as one who “believes he is made of glass.”  Nevertheless Descartes sought to move from certain internal truths to “prove the existence of a material world:  and with very bad success.”  He (along with John Locke and George Berkeley and David Hume in different ways) invoked “philosophy to furnish them with reasons for the belief of those things which all mankind have believed, without being able to give any reason for it” (#174).  Without intending to, they all espoused positions that would inevitably lead to an “abyss of skepticism” as was demonstrably evident in the works of David Hume.   To refute the skeptics Reid devoted successive chapters to describing and analyzing the five physical senses—smelling; tasting; hearing; touching; seeing.  When we smell the scent of a rose, we simply testify to the undeniability of an existing reality.  We can do no more than observe that something smells.  Likewise we have memories of scents in the past.  “Sensation and memory, therefore, are simple, original, and perfectly distinct operations of the mind, and both of them are original principles of belief.  . . . . Sensation implies the present existence of its object; memory its past existence” (#362).  From our sensations we can infer that we have a mind capable of knowing the world around us.  We know the things we sense—not simply the ideas in our minds concerning these things.  But the “wisdom of philosophy” sought to demonstrate the primacy of sensations or ideas in the mind rather than assume the reality of the material world.  Implausibly, to Reid, “wisdom” philosophers such as Descartes and Locke “maintained, that colour, sound, and heat, are not any thing in bodies, but sensations of the mind.”  Denying color resides in things, Reid contends, is “nothing else but an abuse of words,” capriciously changing the “meaning of a common word” (#1342).  He sided with ordinary folks who simply assume that colors and sounds reside in things independent of the mind.

If Descartes and Locke are “wise men,” Reid declared, “let me be deluded with the vulgar.”  To him, “Common sense and reason have both one Author, —that Almighty Author, in all whose other works we observe a consistency, uniformity, and beauty, which charm and delight the understanding:  there must therefore be some order and consistency in the human faculties, as well as in other parts of his workmanship.”  Obviously the “belief of a material world is older, and of more authority, than any principles of philosophy.”  So too is our awareness of “first principles” given us by the Author of our being, our human nature.  “Such principles are parts of our constitution, no less than the power of thinking: reason can neither make nor destroy them; nor can it do any thing without them . . .  A mathematician cannot prove the truth of his axioms, nor can he prove any thing, unless he takes them for granted.  We cannot prove the existence of our minds, nor even of our thoughts and sensations.  A historian, or a witness, can prove nothing, unless it is taken for granted that the memory and senses may be trusted.  A natural philosopher can prove nothing, unless it is taken for granted that the course of nature is steady and uniform.  How or when I got such first principles, upon which I build all my reasoning, I know not; for I had them before I can remember: but I am sure they are parts of my constitution, and that I cannot throw them off” (#1060).

Concluding his essay, Reid explains, with compelling clarity:  “when I feel the pain of the gout in my toe, I have not only a notion of pain, but a belief of its existence, and a belief of some disorder in my toe which occasions it; and this belief is not produced by comparing ideas, and perceiving their agreements and disagreements; it is included in the very nature of the sensation.  When I perceive a tree before me, my faculty of seeing gives me not only a notion or simple apprehension of the tree, but a belief of its existence, and of its figure, distance, and magnitude; and this judgment or belief is not got by comparing ideas, it is included in the very nature of the perception.”  Such “original and natural judgments are therefore a part of that furniture which nature hath given to the human understanding.  They are the inspiration of the Almighty, no less than our notions or simple apprehensions.  They serve to direct us in the common affairs of life, where our reasoning faculty would leave us in the dark.  They are a part of our constitution, and all the discoveries of our reason are grounded upon them.  They make up what is called the common sense of mankind, and what is manifestly contrary to any of those first principles, is what we call absurd.  The strength of them is good sense, which is often found in those who are not acute in reasoning.  A remarkable deviation from them, arising from a disorder in the constitution, is what we call lunacy; as when a man believes that he is made of glass. When a man suffers himself to be reasoned out of the principles of common sense by metaphysical arguments, we may call this metaphysical lunacy; which differs from the other species of the distemper in this, that it is not continued, but intermittent: it is apt to seize the patient in solitary and speculative moments; but when he enters into society, common sense recovers her authority.  A clear explication and enumeration of the principles of common sense, is one of the chief desiderata in logic” (#3841).  

* * * * * * * * * * * * * * * * * * * * * * * * * * 

Paul A. Boer, Sr., has edited a helpful book:  Selections from the Scottish Philosophy of Common Sense (Veritatis Splendor Publications, c. 2002; Kindle Ed.) and included writings by Thomas Reid, Adam Ferguson, James Beattie, and Dugald Stewart, who were all alarmed by the skepticism of David Hume.  Reading them reminds one of the perduring wisdom of the philosophical realism variously espoused by Aristotle and Aquinas and of the fact that “the Philosophy of Common Sense was the dominant philosophy in the American Universities,” and left its imprint on this nation, both theologically (in the Princeton school) and politically (in the thought of the Founders).

329 Humanitarian Woes

During the past two centuries, Man has replaced God in various quarters (including many “modernist” churches).  Consequently humanitarianism—the abstract love for mankind—has increasingly replaced charity as the ultimate mark of righteousness.  The course was set in the 19th century when the highly influential Bible critic David Strauss called for “the carrying forward of the Religion of Christ to the Religion of Humanity.”    Yet, as Feodor Dostoevsky insightfully noted in The Idiot:  “In abstract love of humanity one almost always only loves himself.”  Thus it is not surprising that folks who have given their lives in purely humanitarian endeavors end up disillusioned if not deeply jaundiced.  With that in mind one can learn much from Travesty in Haiti:  A true account of Christian missions, orphanages, fraud, food aid, and drug trafficking (Smashwords, c. 2012; Kindle) by Timothy T. Schwartz.  In 1995 Schwartz went to Haiti to finish his Ph.D. in cultural anthropology by doing “field work” in a clearly needy place.  He had no religious bent, but he did hope to make a difference by studying and understanding the country.  And he “was enthusiastic.  My enthusiasm and belief that I could make a contribution kept me returning despite the hardships, the violence, the coups, and the embargoes.  But ten years later I was a different person.  Perhaps I was simply burned out” (p. 228).  He illustrates rather nicely the downside of humanitarianism all over the world—without a transcendent perspective, trying to help hurting people drives one to despair.  So while reading his books one must always remember Schwartz sees his world through rather jaundiced glasses!   His animus doubtless distorts his presentation, but his woeful data still deserve consideration.

Schwartz says he “was supposed to do what is called participant observation, meaning that I was to live in the community, take part in the lives of the people there, live as they live, interfering as little as possible so that I could learn about their culture and how impoverished Haitians deal with problems of daily survival.”  Thereafter, he hoped “to join the ranks of foreign aid experts who work for charitable organizations such as CARE International, experts who design and carry out farm, commerce, and health projects meant to help the poor in their struggle to overcome hunger and disease” (p. 6).  For a year he lived in a rather remote fishing “hamlet” wherein his naive presuppositions and aspirations quickly vanished—in part because he “made the mistake that so many blan [whites] make in Haiti:  I started giving” (p. 19).  Regardless of their status, his neighbors incessantly begged, asking him to part with virtually everything he owned.  And when they didn’t beg they stole.  In time Schwartz simply left his personal property locked away with a missionary family some distance from the hamlet.  

After successfully completing his Ph.D., Schwartz found employment with a number of NGOs (Non-Government Organizations) that had begun arriving in Haiti in the 1950s.  This included CARE (the Cooperative for Assistance and Relief Everywhere), the most prestigious of them all.  He mainly conducted surveys to document educational, economic, and medical conditions in the country.  But as he looked at the data and  roamed about seeking to better understand Haiti, he found, to his dismay, that many of the humanitarian “aid” programs harmed the very people they were designed to help.  This was due primarily to their lack of accountability for the distribution of massive amounts of money collected from sincere donors who want to do something to “help” the needy.  Food aid, dumped in great quantities, inevitably harmed Haiti’s farmers and encouraged widespread theft and graft.   Easily stolen from the distribution sites and sold in the markets, aid parcels brought a tidy profit for the thieves.  A country that was exporting food in 1950 had become impoverished as “food aid” from rich countries overwhelmed it.

Technological assistance, often in the form of machinery (generators, tractors, etc.) sent to impoverished rural areas, did little good simply because Haitians could not effectively use it.  And, since it was designed for advanced economies, it was fundamentally unsuited for the country.  Illustrative of the problem, Schwartz points to five wind generators looming high on a hill near Baie-de-Sol, a provincial capital.  They were put there in the 1990s by the German embassy at the cost of several million dollars.  Each generator could produce 50,000 kilowatts of electricity, but within six months they were all destroyed by vandals tearing out their electrical wiring.  Copper brings cash in the market!  No one on the site knew anything about them, much less had the ability to get them working.  To Schwartz, “it is the typical story regarding development all over Haiti:  ‘It is broken, can’t be fixed, and nobody knows anything else about it.’  And that was the whole point.  To me the wind generators epitomized foreign aid.  Their guts ripped out, never having functioned for longer than a blan sat watching and caring for them, they are a summary statement of international development efforts in Haiti” (p. 66). 

The “missions” in the book’s subtitle do not refer to the churches established to preach the Gospel.  Though emphatically not a Christian, Schwartz has no criticism of these evangelistic endeavors, frequently led by native preachers.  What he finds appalling are the many humanitarian ventures, almost always focused on helping Haitian children, under Christian auspices.  Virtually all of these are “orphanages” featuring impoverished children that collect enormous sums from sincere supporters in the United States.  In fact, many of the “orphans” have at least one parent.   And they are mainly in the facility to receive a quality education unavailable elsewhere.  Still more:  most “orphans” have several sponsors in America sending monthly checks to support them, enabling entrepreneurs to nicely profit thereby.  In all the establishments Schwartz investigated, operators were spending “only a fraction of the money they raised for the children and pocketing the rest.  Orphanages in the area were a business” (p. 134).  After visiting “every single orphanage” in one province, he concluded:  “They all look like scams to me” (p. 148).

Sadly enough this indictment ultimately held for the mission he had most trusted, run by an American family that portrayed itself as altruistic Christians devoted to the Haitians.  Visitors were inevitably impressed by their piety and charitable work.  To Schwartz, initially, they were bona fide good folks.  He “respected them, admired their honesty, their good works, the closeness of their family.  I had gone to their church services, stood with them holding an open bible in my hand as the Reverend read the words” (p. 215).  But then he learned the truth.  The Reverend was sleeping with the servants!  And the funds they raised supported a lavish lifestyle.  In the end:  “It was like CARE, a perversion of American charitable ideals, with its false claims to be aiding the “poorest of the poor” when what it was really doing was throwing exquisite banquets at plush hotels while carrying out U.S. political policy in the interest of international venture capitalists and agro-industrialists”  (p. 216). 

Summing up his book Schwartz says:  “This is the inside story of those projects and the impact on the people they were meant to help.  It is largely a story of fraud, greed, corruption, apathy, and political agendas that permeate the industry of foreign aid.  It is a story of failed agricultural, health, and credit projects; violent struggles for control over aid money; corrupt orphanage owners, pastors, and missionaries; the nepotistic manipulation of research funds; economically counterproductive food relief programs that undermine the Haitian agricultural economy; and the disastrous effects of economic engineering by foreign governments and international aid organizations such as the World Bank and USAID and the multinational corporate charities that have sprung up in their service, specifically, CARE International, Catholic Relief Services, World Vision, and the dozens of other massive charities that have programs spread across the globe, moving in response not only to disasters and need, but political agendas and economic opportunity.  It is also the story of the political disillusionment and desperation that has led many Haitians to use whatever means possible to better their living standards, most recently drug trafficking; and how in the service of international narcotraffickers and money launderers, Haiti has become a failed State” (p. 2).

                                   * * * * * * * * * * * * * * * * * * * * * * * * *

In The Great Haiti Humanitarian Aid Swindle (n.p., Kindle, 2017), Timothy T. Schwartz extends the exposé he began in Travesty in Haiti.  He dedicates the book “to the millions of people who have donated money to help impoverished Haitians,” to the “tens of thousands of rescue workers and sincere aid employees who have gone to Haiti to help,” and “to the impoverished Haitians who are meant to benefit from aid, but the many, if not most, who do not benefit.”  Importantly, all these folks “deserve explanations for the wasted aid and they deserve explanations for the exaggerations, misrepresentations and outright lies about the Haitian people that came both after the 2010 earthquake and for decades before it” (frontispiece).

Lest the reader suspect otherwise, Schwartz fully endorses charity as such.  Loving others and giving them aid is a fully admirable thing.  But too many “charities,” however well-intended, ultimately do much harm and become sophisticated forms of stealing.  “Millions of people are engaged in ripping off the neediest people on the planet.  They participate in duplicity, exaggeration, and outright lying.   . . . they publish images of what they claim are enslaved children and raped women.  They invent or exaggerate statistics.  They seek out the most horrid stories of abuse.  They insinuate themselves into the stories or the statistics as saviors who are rescuing those in dire need.  And then, of course, they ask us for money” (p. 3).

Yet they often do little to actually help needy people!  “Instead they spend the bulk of the money, not on the needs of the desperately poor or wretched and distressed, but on themselves.  They use the money to pay for their own homes, to pay school tuitions for their own privileged children, to pay their pension plans and vacations” (p. 3).  Even worse, Schwartz thinks, they are aided and abetted by a compliant media that gets us to “believe the stories, the radically inflated numbers, and the twisted statistics.”  Anyone reading Schwartz quickly realizes how “fake news” oozes from “those bastions of supposedly credible news, such as The New York Times, London’s The Guardian, wire services such as the Associated Press and United Press International, Agence France-Presse and prime time news shows such as CBS’s 60 Minutes and CNN’s Anderson Cooper 360°” (p. 3).

The focus of this book is the 2010 earthquake which devastated Haiti and ignited an incredible humanitarian response.  Schwartz was living in the Dominican Republic when the earthquake occurred and drove immediately to Port-au-Prince, hoping to help as an interpreter as well as observe humanitarian endeavors.  Though highly-trained and well-equipped, the rescue teams generally arrived too late to really help and then failed to enter the areas most devastated by the quake.  They failed, it seems, because they feared the violent, knife-wielding rioters featured by the media!  Indeed, he says:  “Anyone who read the headlines would have been afraid.  The disgrace was the press; those professionals we count on to tell us what’s happening.  They were fomenting the fear” (p. 31).  But the fears were utterly groundless.  In fact, even the most devastated areas in Port-au-Prince were much safer than usual.  Most (90 percent or more) of the folks actually rescued were saved within eight hours by friends and neighbors and even looters, digging through the rubble with their hands and simple tools in the hours immediately following the quake.  “In the years since the earthquake, dozens of people have told me how looters dug them out.  I have never met anyone saved by an official rescuer” (p. 27).  A total of 67 search and rescue teams managed to rescue only 137 people—a number widely celebrated by the press.  “And it cost a fortune.  The total cost was 243 million U.S. dollars, about 1.84 million dollars for each of the 137 to 147 rescues that were, fairly or unfairly, attributed to international rescue teams” (p. 65).  Meanwhile, hundreds of seriously-injured Haitians desperately needed medical attention.  Rescue teams (flush with skilled paramedics) drove by hotels and shelters housing hundreds of injured Haitians in order to dig through rubble vainly seeking survivors.  “If, instead of devoting their time to the rescue efforts . . . the 1,918 paramedics and doctors assigned to the rescue squads had been treating just ten people per day per paramedic, they would have treated 134,260 people in the first week” (p. 67).  But dramatic rescues make better TV!  And raise more money!

Money, as well as rescue teams, began almost immediately flowing into the country.  Corporations and individuals sent $3.1 billion and foreign governments would give another $10 billion.  The Red Cross made an “emergency flash appeal” for $10 million, but when funds began arriving the amount was raised to $100 million and ultimately reached $1.2 billion.   Save the Children first asked for $9.8 million and quickly raised $20 million.  By the year’s end the amount was $87 million.   “World Vision asked for $3.8 million.  But they then kept asking for more, and more, and more, until they had collected a total of $191 million.  UNICEF originally called for $120 million.  When they brought in $229 million in six months—almost double what they requested—they decided they needed another $127 million.  . . . .   The NGOs and UN agencies were as a rule insatiable.  In all post-earthquake Haiti, only Doctors Without Borders told donors they had enough money, and that was after bringing in a whopping $138 million” (p. 9).   Then the  “squandering and waste began almost immediately” (p. 5).  “The stories go on and on.  . . . Food for the Poor was building permanent houses in Haiti before the earthquake for $2,000 per home.  After the earthquake, the U.S. government partnered with Food for the Poor to build 750 of what were essentially the same houses, but at a cost of $38,000 per house, 19 times the pre-earthquake costs.   Red Cross CEO Gail McGovern said that $100 million of the $500 million given to the Red Cross would go to ‘provide tens of thousands of people with permanent homes.’  Five years later NPR would report that the charity had built six permanent homes” (p. 7). 

Ever the conscientious scholar, Schwartz meticulously documents his assertions with extensive notes and appendices, though this was made difficult by the charities’ failures to disclose their finances.   Of the 196 organizations the Disaster Accountability Project examined, only six provided up-to-date accounting.  “Only one provided what DAP considered ‘complete and factual information.’  The majority—128—did not have factual situation reports available on their websites, relying instead upon anecdotal descriptions of activities or emotional appeals.  Many claimed to provide details of their activities on their blogs, but the blogs were almost entirely ‘appeals to emotion, pictures of children, and purely anecdotal accounts about touching moments during a particular delivery of relief’” (p. 8).  They told anecdotes because they had little data demonstrating how they helped respond to the “disaster.”  Numbers were inflated as well as difficult to discern.  Take Cassandra Nelson, who worked for Mercy Corps.   She flew into Haiti and said:  “‘it is like opening a window on unprecedented levels of ruin . . . by far the worst devastation that I’ve ever seen.’”  Flying home 16 days later, she declared:  “‘Literally everything is destroyed.’”  On the contrary, Schwartz, who actually knows the country quite well, says there were remarkably few scenes such as Nelson described.  Yet, “wherever you were in the world, you could have turned on the television, logged on to the internet, or opened a newspaper and found pictures that made you think that Port-au-Prince was like that.  But if you were actually in Port-au-Prince at the time, to see those scenes you would have had to search them out” (p. 83).  In short, things were not nearly so bad in Haiti following the earthquake as we were led to believe!  Many buildings collapsed, but over 90 percent of them did not!  Journalists lamented the lack of electricity and running water but never checked to learn that such was the daily reality Haitians faced long before the earthquake!  They also aired astronomical figures for the lives lost, untruths given them by humanitarian aid agencies who knew they would increase their revenues thereby. 

Schwartz was employed by USAID (the United States Agency for International Development) to document how many people returned to their homes following the earthquake.  This necessarily involved ascertaining how many people actually died.  When he presented his report, however, a USAID official unleashed a tirade against him since he didn’t accept the official Haitian government’s number of 316,000—the number cited by most NGOs soliciting donations around the world.  “Where the figures were coming from nobody knew” (p. 92).  In fact, Schwartz believes, only around 60,000 people died.  But neither the government nor the press nor the NGOs were interested in the truth.  They just wanted inflated figures.  “The disturbing thing about all this, and what really suggests that regarding the number of people killed there was indeed a type of Great Haiti Humanitarian Aid Swindle complete with falsified data at the highest levels of the government and cover-ups at the highest level of the press, is that the press knew from the beginning that the government was inflating the figures.  And by corollary, U.S. government bureaucrats knew” (p. 94).  It took Schwartz five years to fully figure it out, but he concluded that “the executives at humanitarian agencies, such as Steve McAndrew of the Red Cross or Sophie Perez of CARE International” demanded high numbers.  “The more people dead, the more the good-hearted people of the world would be inclined to give donations.  It’s a no-brainer.  For the press it was obvious too.  The bigger the tragedy, the more horrific the scenes and the more harrowing the tales, the more people would buy newspapers, log onto their internet sites or turn on their televisions and watch the news” (p. 111).

Beyond exaggerating death statistics, child protection workers and orphanage owners cleverly massaged the images of homeless Haitian children.  “With UNICEF and Save the Children leading, orphanages fanning the flames, and the press publishing almost anything anyone said—no matter how scant the facts—the scramble to save Haiti’s children took on apocalyptic dimensions.  They told us that there were over 1 million lost, separated or abandoned children, conjuring up images of little children aimlessly wandering through the ruins of Port-au-Prince.  As time went on the experts added images of sexual predators and slave hunters prowling the rubble in search of the children.  They told us that people were selling children for $50.  It came to be known around the world as the ‘Haiti Orphan Crisis.’  Almost none of it was true.  As will be seen, the number of orphaned, lost or separated children was inflated by factors that ran into the hundreds and perhaps thousands.  No network of slave hunters or perverts was ever verified.  Nor was there ever a confirmed case of someone selling a child” (p. 129).  It was all a scam!  Certainly less than 1,000 children were separated from their parents—and the number was probably around 100.  Yet UNICEF celebrated its work of reuniting families and collected some $100 million from donors.  Precisely how many families were reunited?  Twenty! 

“Whatever their intentions, it was a massive swindle. The world’s largest child protection agencies, UNICEF, Save the Children, World Vision, Compassion International and others, together with the orphanages and the world’s three largest news services, Agence France-Press, Reuters and the Associated Press, used untruths and exaggerations about children to precipitate a media hysteria that sustained an avalanche of donations from concerned citizens in almost every country on earth.  The success of that swindle is not only in the money they brought in.  Nor is the success of the swindle limited to the fact that more than 90 percent of the money went to internal expenses, including pension plans, salaries, school tuitions for the children of UNICEF staff and the staff of those organizations to which UNICEF distributed money” (p. 170).  Sadly enough:  “The most outstanding mark of swindle is that when it was all over, after having never apologized or even publicly acknowledged the duplicity, UNICEF officials were still looking into cameras, gushing with heartfelt sincerity, and asking for more money to help the Haitian children.  And they were getting it” (p.172).

For his efforts to rightly report such facts, Schwartz was roundly assailed.  He was called “a spiteful piece of garbage,” a “criminal,” a “liar,” a “despicable vampire” responsible for Haitian woes!  He was, for sure, a threat to highly-paid employees of humanitarian agencies.  “USAID-Washington would go on to blacklist me,” though he’s one of the best-informed scholars of Haiti (p. 96).   (The fact that Schwartz has self-published these works may very well indicate how he violates the modern humanitarian credo!)

328 Post-WWII America

Few of us, having lived through the last half of the 20th century, would discount the massive cultural changes that have transpired during our lifetimes.  But understanding these phenomena, digging into the real causes of the transformation, proves rather daunting.  Given the nature of historiography, no one has the capacity to fully describe, much less fully understand, the past.  Every thoughtful historical monograph, as Alfred North Whitehead said in his Adventures of Ideas, is “a sort of searchlight elucidating some of the facts, and retreating the remainder into an omitted background.”  A highly-readable, descriptive narrative of important developments during one decade (from the mid-’60s to the mid-’70s) is provided by Amity Shlaes in Great Society:   A New History (New York:  Harper, Kindle ed., c. 2019).  The “great society” was a phrase appropriated by Lyndon Baines Johnson to represent his aspirations as president, and it became one of the most ambitious social engineering endeavors in American history.

Shlaes begins with a telling vignette of Michael Harrington, the author of The Other America:  Poverty in the United States—a 1962 treatise widely discussed in the final year of the John F. Kennedy administration.  Semi-humorously, Martin Luther King quipped:  “You know, we didn’t know we were poor until we read your book’” (p. 73).  Harrington was a self-identified socialist who had been briefly involved in the formation of Students for a Democratic Society.  When Lyndon B. Johnson became president in 1963 he and many in his administration (most especially Sarge Shriver, JFK’s brother-in-law and LBJ’s poverty czar) were quite taken by Harrington’s ideas.  Given an office in the White House, Harrington noted:  “‘the abolition of poverty would require a basic change in how resources are allocated.’”  Shriver mentioned this to LBJ, an aspiring Franklin D. Roosevelt, who “told him that if serious economic redistribution was necessary to realize the long-delayed completion of the New Deal, then redistribution might be worth it” (p. 3).  

Whether or not LBJ’s endeavors would bring about the “great society”—great because it is good—Amity Shlaes seeks to show.  So she begins with JFK’s “New Frontier,” brought into being by the election of 1960.  The nation was then prospering, amply illustrating The Affluent Society described by Harvard economist John Kenneth Galbraith.  Businesses such as GE and GM were fiscally sound and most working men made good money.  The president himself was notably pro-business, sending “his progressive advisor” Galbraith off to India as an ambassador rather than embracing his socialist ideals.  But he also made clear overtures to labor unions, issuing an executive order enabling federal employees to unionize.  However, when he gave a speech indicating his admiration for Britain’s National Health Service, the stock market plunged and he quickly retreated into the security of the status quo.  JFK was no FDR, seeking to engineer societal change.  There were, to be sure, pockets of poverty, but by-and-large the Ozzie and Harriet world of the ’50s fostered considerable optimism about the coming years.

Cynically discounting such optimism, however, a group of students met in 1962 near Port Huron, Michigan, in a camp developed by Walter Reuther, the president of the United Auto Workers, and named “Four Freedoms”—the freedoms FDR had famously enumerated in his 1941 State of the Union address.  Styling themselves the “New Left” and led by the likes of Tom Hayden, they felt “it was like God was sending us a message.”  Many of the youngsters imagined they were attending something of a “participatory democracy,” but in fact their input was unimportant, for the real message had been carefully crafted months before by Hayden, Harrington, and operatives funded by the UAW.  Harrington and Hayden were “Catholic activists” and were also “drinking buddies” (p. 77).  One of Reuther’s union officials considered the students “our kind of youngsters,” and his brother Victor provided ample funding for the group’s endeavors by helping distribute the “Port Huron Statement,” the word “statement” being substituted for “manifesto” in order to distance it from the Communist Manifesto!  Much of the “Statement” had been earlier incubated in Reuther’s UAW “propaganda mills” which constantly decried income inequality and the fact that “‘the wealthiest 1 percent of Americans own more than 80 percent of all personal shares of stock’” (p. 78).  Indeed, Walter Reuther was determined to distribute the wealth by nudging the nation toward a “social democracy.”  And for that he needed “an American president to lead his redistribution revolution” (p. 62).

FDR, of course, had earlier moved “the country toward socialism while sustaining democracy” (p. 63).   So Walter Reuther needed another FDR.  But he knew JFK’s New Frontier would not update the New Deal.   When John Kennedy was killed, however, Lyndon Baines Johnson proved more amenable to the Reuther agenda.  Indeed, one of the first persons LBJ called was Walter Reuther.  “‘I’ll need your friendship more than I ever did in my life,’ Johnson said.  Reuther promised ‘every possible help I can offer’” (p. 87).  Within a few months Johnson declared “unconditional war on poverty” in his State of the Union address, and a new national tilt toward “social democracy” was underway.  This was evident in a 1964 speech at the University of Michigan, wherein LBJ set forth “a vision as fantastic as the vision of Port Huron, as transformative as that of Reuther” (p. 97).  Poverty must end, civil rights must be ensured, and a “Great Society” must be brought into being.

Thenceforth came a cascade of legislation and federal programs, launched without concern for financial accountability, justified simply as what “ought” to be done by compassionate Americans! The list is almost interminable—Medicare; Medicaid; civil rights injunctions; minimum wage edicts.  LBJ was on a roll and his triumph in the election of 1964 apparently illustrated the people’s support for his programs;  the “Great Society” was an effective expansion of the New Deal.  But implementing the agenda proved far more difficult than passing legislation!  Take, for example, a rather simple prescription, the minimum wage.  Intended to help low-wage workers, it in fact increased unemployment!  “Black and white youth unemployment had run about the same until the middle of the 1950s, 8 to 11 percent.  But when Congress raised the federal minimum wage by a third in 1956, unemployment rose far higher among black teenagers than among whites, to 25 percent” (p. 183).  The War on Poverty flooded communities with money that counterproductively encouraged irresponsibility, enabling men to avoid work.  When you could get $200 a month from welfare, why work hard to earn the same amount?

Equally vain were the Great Society’s housing programs.  Facing depressed sections in the nation’s great cities, progressives pressed for federally-funded housing projects.  After all, Walter Reuther had declared:  “The choice before the people of every major urban center is simple and clear.  It is build or burn.”  Government housing for the needy had long been a progressive ideal, and its projects revealed a distinctive architectural aesthetic.  Consider what was erected in Washington, D.C. to house the newly-created Department of Housing and Urban Development.  It was, architecturally, a monument to “Brutalism,” a movement celebrating massive, concrete, featureless, geometric structures.  But to most Americans it signified a “brutalist” bureaucratic obsession.  No matter what experts said, “brutalist” had to mean what it sounded and looked like, possessing brute power (p. 230).  To deal effectively with city slums, old neighborhoods were razed and replaced with soaring, sterile concrete structures—“projects” designed to improve living conditions for the impoverished.  Yet, with a rapidity impossible to imagine, these “projects” in St. Louis, Chicago, and elsewhere became cages of squalor and crime.  They would be, in a rather short time, simply demolished.

But unlike the brutalist housing projects, Great Society programs persisted.  President Nixon tinkered a bit with some of them but dared not seek to reverse them.  Indeed, he pursued policies, such as wage and price controls in 1971, that were flagrantly socialistic!  Ronald Reagan, both as Governor of California and President of the United States, spoke frequently and passionately against some of them, but Democrats successfully obstructed almost all of his proposals.  Half-a-century later, Shlaes says, with trillions of dollars expended, one can only look back at the Great Society and lament its many failures.

* * * * * * * * * * * * * * * * * * * * *

In The Age of Entitlement:  America Since the Sixties (New York:  Simon & Schuster, c. 2020; Kindle Edition), Christopher Caldwell provides a helpful lens with which to understand current developments in America.  He begins by noting how deeply the ‘60s shaped subsequent decades.  Indeed:  “For two generations, ‘the sixties’ has given order to every aspect of the national life of the United States—its partisan politics, its public etiquette, its official morality.  This is a book about the crises out of which the 1960s order arose, the means by which it was maintained, and the contradictions at its heart that, by the time of the presidential election of 2016, had led a working majority of Americans to view it not as a gift but as an oppression” (p. 3).  This was because many of the “reforms” pushed through in that decade “came with costs that proved staggeringly high—in money, freedom, rights, and social stability” (p. 6).

Caldwell’s disillusionment provides a stark contrast to the ‘60s utopian optimism.  Following the traumatic assassination of John F. Kennedy, the welfare state rapidly expanded—Medicare, Medicaid, Civil Rights and Voting Rights acts—and was expected to fulfill the aspirations of the “best and the brightest” who engineered it.  Most importantly, Caldwell argues:  “Civil rights ideology, especially when it hardened into a body of legislation, became, most unexpectedly, the model for an entire new system of constantly churning political reform” (p. 5).  Here the law of unexpected consequences held true, for the “changes of the 1960s, with civil rights at their core, were not just a major new element in the Constitution.  They were a rival constitution, with which the original one was frequently incompatible,” and we are in the midst of a titanic struggle which will determine which will prevail:  “the de jure constitution of 1788, with all the traditional forms of jurisprudential legitimacy and centuries of American culture behind it; or the de facto constitution of 1964, which lacks this traditional kind of legitimacy but commands the near-unanimous endorsement of judicial elites and civic educators and the passionate allegiance of those who received it as a liberation.  The increasing necessity that citizens choose between these two orders, and the poisonous conflict into which it ultimately drove the country, is what this book describes” (p. 6).

In particular, the march toward desegregation, launched by the Supreme Court’s Brown v. Board of Education of Topeka ruling in 1954, inevitably eroded the constitutionally guaranteed freedom of association.  Equality, rather than freedom, became imperative!  Inevitably, the “sanctity of private property” was softened whenever racial discrimination called for correction.  Though some legislators, debating the civil rights laws, feared unanticipated consequences (e.g. mandated school busing, lowering school admission standards, hiring quotas, etc.), they were dismissed as devotees of an antiquated social system.  Nevertheless, many of their fears materialized, and lawmakers “who opposed the legislation proved wiser about its consequences than those who sponsored it” (p. 22).  Rather quickly civil rights leaders and federal bureaucrats moved from eliminating segregation to calling for widespread social and economic changes.  Then, only days after LBJ signed the Voting Rights Act, the Watts neighborhood in Los Angeles exploded in a deadly race riot, revealing that more than “civil rights” was at stake.

In fact, the progressives now governing the nation envisioned far more than delivering justice to the black population.  “Not just excluded and exploited Southern blacks but all aggrieved minorities now sought to press their claims under this new model of progressive governance.  The civil rights model of executive orders, litigation, and court-ordered redress eventually became the basis for resolving every question pitting a newly emergent idea of fairness against old traditions:  the persistence of different roles for men and women, the moral standing of homosexuality, the welcome that is due to immigrants, the consideration befitting wheelchair-bound people.  Civil rights gradually turned into a license for government to do what the Constitution would not previously have permitted. It moved beyond the context of Jim Crow laws almost immediately, winning what its apostles saw as liberation after liberation” (p. 34).  So “women’s liberation” hitched its wagon to the civil rights movement, demanding “equality” for the sexes.  Consequently, whereas in 1960 married and unmarried women shared similar attitudes regarding most everything, today they differ in almost all things!  Feminists vigorously promoted contraception, abortion, and full equality in the marketplace.  But they also unleashed “irresistible demands for further sexual freedoms.  Just as Americans were getting comfortable with the things feminism had meant to Betty Friedan and her followers (liberation from household drudgery and loneliness, a fair shake in the workplace, equal dignity elsewhere), feminism began showing signs of what it would blossom into half a century later (gender studies, queer theory, a questioning of all rules about sex)” (p. 56).  Such “freedoms” deeply changed the culture.  

Another culture-changer was the war in Vietnam, beginning with “an act of presidential deceit,” the Gulf of Tonkin Resolution.  But within four years the war had proved so unpopular that everyone running for president in 1968 promised to extricate the country from what seemed to be a quagmire.  Militarily the war might have been won, but politically it was lost—particularly among the younger elites.  Thus a Harvard anti-war student said:  “On the one hand we were angry about the war, about racism, about the countless vicious acts we saw around us.  But on the other hand, we viewed America as one great wasteland, a big, monstrous, mechanized, air-conditioned desert, a place without roots or feeling.  We saw the main problem, really, as:  THE PEOPLE—the ways they thought and acted towards each other.  We imagined a great American desert, populated by millions of similar, crass, beer-drinking grains of sand, living in a waste of identical suburban no-places. What did this imagined ‘great pig-sty of TV watchers’ correspond to in real life?  As ‘middle-class’ students we learned that this was the working class—the ‘racist, insensitive people.’  Things already going on at the time of the Vietnam War inclined privileged people to look on ‘average’ Americans as the country’s problem” (p. 78).  

The counterculture evident in this student’s lament asserted itself and would spread its tentacles throughout every crack in America.  An alienated elite would ultimately dominate virtually all important institutions (schools, media, churches) and demand societal transformation funded by the taxpayer.  Endless funding of proliferating anti-poverty, anti-racist, anti-sexist bureaucracies continued, and not even Ronald Reagan could arrest it.  “Having promised for years that he would undo affirmative action ‘with the stroke of a pen,’ lop the payments that LBJ’s Great Society lavished on ‘welfare queens,’ and abolish Jimmy Carter’s Department of Education, he discovered, once he became president, that to do any of those things would have struck at the very foundations of desegregation. So he didn’t” (p. 110).  Reagan tacitly complied with the “second constitution” created by the civil rights movement, which led, by the end of the century, to increasingly strident racial politics.  

This was manifestly evident in the metastasizing power of  “affirmative action” and “political correctness”—important planks in the nation’s new constitution, largely shaped by judicial decrees.  It is now clear that by passing the 1964 civil rights laws Americans “had inadvertently voted themselves a second constitution without explicitly repealing the one they had” (p. 172).  In fact:  “Affirmative action was deduced judicially from the curtailments on freedom of association that the Civil Rights Act itself had put in place.  Political correctness rested on a right to collective dignity extended by sympathetic judges who saw that, without such a right, forcing the races together would more likely occasion humiliation than emancipation.  As long as Americans were frightened of speaking against civil rights legislation or, later, of being assailed as racists, sexists, homophobes, or xenophobes, their political representatives could resist nothing that presented itself in the name of ‘civil rights.’ This meant that conflict, when it eventually came, would be constitutional conflict, with all the gravity that the adjective ‘constitutional’ implies” (p. 172).

One of the ultimately disastrous consequences of this shift surfaced in 1992 when President George H.W. Bush, following race riots in Los Angeles, signed a Housing and Community Development Act.  “It inaugurated the process we have seen at many junctures in this book:  the sudden irruption of civil rights law and diversity promotion into an area from which it had been mostly absent, in this case mortgage finance” (p. 178).  This act opened the gates to “the financial crisis that, in the following century, would nearly destroy the world economy under the presidency of Bush’s even more reckless son” (p. 179).  Sandwiched between the two presidents Bush, Bill Clinton manipulated the mortgage finance system, denouncing “the dearth of private housing credit in poor, black, urban neighborhoods” fomented by racist white bankers, and demanding low mortgage rates for blacks buying homes.  In Caldwell’s judgment:  “Sometime between the passage of Lyndon Johnson’s civil rights laws and the long Bush-Clinton march through the country’s financial institutions, the victims’ perspective had won. Now any inequality was an injustice, and one did not need a clear account of what had caused it to demand redress from the system” (p. 180).

Another realm that changed dramatically and unexpectedly was the institution of marriage.  Other than a few gay activists, no one imagined it possible that same-sex marriage would ever be legally imposed on the nation by a Supreme Court mandate (Obergefell v. Hodges) in 2015.  But homosexuals adroitly fused their “liberation” agenda with the “radical feminist cause of delegitimizing” traditional, heterosexual marriage “and the traditional idea of masculinity” it implied (p. 216).  Gay activists wanted “not just tolerance but a conferral of dignity. . . .  Civil rights was always this way:  dignity was an integral and non-negotiable part of what was demanded, and a government interested in civil rights must secure it, no matter what the cost in rights to those who would deny it” (p. 217).  “As Rosa Luxemburg had written of the Russian Revolution, ‘The real dialectic of revolution stands the parliamentary cliché on its head:  The road leads not through majorities to revolutionary tactics, but through revolutionary tactics to majorities’” (p. 225).

Justice Antonin Scalia saw this clearly, dissenting from Obergefell, declaring it to be undemocratic.  “‘A system of government that makes the People subordinate to a committee of nine unelected lawyers,’ Scalia wrote, ‘does not deserve to be called a democracy.’  He called the decision an upper-class ‘putsch,’ noting that every single member of the Supreme Court had gone to either Harvard Law School or Yale Law School, and concluded:  ‘The strikingly unrepresentative character of the body voting on today’s social upheaval would be irrelevant if they were functioning as judges, answering the legal question whether the American people had ever ratified a constitutional provision that was understood to proscribe the traditional definition of marriage.  But of course the Justices in today’s majority are not voting on that basis; they say they are not’” (p. 229).  Writing for the majority, Justice Kennedy had “explicitly repudiated certain conceptions of democracy that had until recently been sacrosanct.  ‘It is of no moment whether advocates of same-sex marriage now enjoy or lack momentum in the democratic process,’ he wrote.  Unless someone was expecting the Court to apologize for Brown v. Board of Education, this thwarting of majority rule in the name of civil rights was what the Supreme Court was for” (p. 229).  Kennedy, of course, was enforcing the “second constitution”—the living constitution of Al Gore, not the original constitution of Antonin Scalia.  

With amazing rapidity the practical ramifications of Obergefell became evident.  Bakers were brought to trial for refusing to bake cakes for gay weddings.  Transgender students insisted they should use restrooms of their choice or compete as athletes in accord with their self-definition.  “A terrible irony of civil rights, obvious from the very outset but never, ever spoken of, was making itself manifest . . . .  The civil rights approach to politics meant using lawsuits, shaming, and street power to overrule democratic politics.  It encouraged—no, it required—groups of similarly situated people to organize against the wider society to defend their interests.  Now it became clear that the members of any group that felt itself despised and degraded could defend its interests this way” (p. 232).