293 What’s Happened to the University?

In What’s Happened to the University:  A Sociological Exploration of Its Infantilisation (New York:  Routledge, c. 2017), Frank Furedi appraises developments during the past 50 years in institutions of higher learning.  He began his academic life as a student in 1965 and is now Emeritus Professor of Sociology at the University of Kent in the UK.  In his student days, universities were open to new ideas and touted the virtues of free speech and challenging ideas.  Subsequently, however, they became “far less hospitable to the ideals of freedom, tolerance and debate than in the world outside the university gate.  Reflecting on how this reversal of roles has come about is the principal objective of this book” (p. vi).  Furedi is distressed that students now seek to ban books that threaten their vulnerable psyches and protest speakers who might offend a variety of sexual and ethnic groups.  The free speech mantras of the ’60s have turned into speech codes; the former devotees of free speech have frequently become, as powerful professors, enforcers of censorship.  “Safe spaces,” “trigger warnings,” “microaggressions” and “chill out rooms” (replete with play dough and “comfort” animals to relieve anxieties) indicate how many universities have in fact become infantilized.  Thus:  “Harvard Medical School and Yale Law School both have resident therapy dogs in their libraries” (p. 27).  

In some ways this culminates a project educators launched in the 1980s, making “self-esteem” their summum bonum.  Feelings, above all, must be massaged and potential hurts (e.g. poor grades or athletic defeats) eliminated.  Protecting children became a parental obligation easily transferred to the schools.  Parents now accompany and hover over children entering the university.  Administrators serve in loco parentis, not as they did a century ago, by regulating campus behavior, but by protecting students’ feelings, especially if they self-identify as members of certain “vulnerable groups.”  Wellness clinics, counseling services, ethnic and same-sex study centers all cater to psychological or emotional rather than intellectual needs.  Treating students as “biologically mature children, rather than young men and women, marks an important departure from the practices of the recent past” (p. 7).  As one might anticipate, “the more resources that universities have invested in the institutionalization of therapeutic practices, the more they have incited students to report symptoms of psychological distress” (p. 46).  

An incident at Yale University in 2015 illustrates this.  A university committee issued guidelines regarding appropriate Halloween costumes.  One faculty member, Erika Christakis, posted an email suggesting “that ‘if you don’t like a costume someone is wearing, look away, or tell them you are offended’ and concluded that ‘free speech and the ability to tolerate offense are the hallmarks of a free and open society’” (p. 17).  Students then denounced Christakis and her husband (a sociology professor who defended her) for racial insensitivity.  Yale’s President, Peter Salovey, promptly met with tearful undergraduates and shared their felt distress.  Though not dismissed from their positions, Erika and Nicholas Christakis soon left Yale, casualties of the raging intolerance now widespread in academia.  Another incident further illustrates campus conditions.  “Caroline Heldman, a professor in Occidental College’s politics department, recalled that some of her students began experiencing PTSD-related episodes in her classes:  ‘there were a few instances where students would break down crying and I’d have to suspend the class for the day so someone could get immediate mental health care.’  Her antidote to this problem was to introduce a trigger warning on her course” (p. 42).  

What really matters these days is one’s racial or sexual identity.  “Universities are singularly accommodating to the objectives of cultural crusaders” (p. 65).  To identify as an African-American or Native American or gay man or lesbian woman grants one status and authority quite apart from whatever one may think or say.  In addition, it’s especially important to stress the “victim” status of one’s group, even if the only obvious victims were ancestors who lived decades if not centuries ago.  Doing so enables one to invoke “social justice” and demand preferential treatment of some sort.  “Social justice” increasingly means protesting historic policies and personalities.  So students at the University of Missouri demanded that a statue of Thomas Jefferson be removed from campus because he owned slaves.  Schools must be renamed if they memorialize anyone tainted with racist or sexist traits.  Selected cultures must be sacrosanct, making intolerable any “appropriation” of their dress, music, or food.  So many campus cafeterias dare not feature Mexican or Asian food lest students remonstrate!  And, importantly, only women can speak for women, only blacks for blacks, only Indians for Indians!  Authority comes purely from one’s ancestry, not from any scholarly expertise.  Consequently:  “The reverential and self-righteous tone of cultural crusaders echoes the voice of traditional religious moralists” (p. 64).  

To provide “safe space” for culture groups leads to self-segregated dormitories, and there are now dorms reserved for blacks and other minorities at elite schools such as UC Berkeley and MIT!  These “safe spaces” protect students from psychic and emotional hurts, shoring up their fragile self-esteem.  No debates are allowed, lest someone be judged wrong!  On many campuses, the notion that “criticism is violence” has gained traction, so teachers are warned to avoid even evaluating their students!  “It is an article of faith on campuses that speakers who espouse allegedly racist, misogynist or homophobic views should not be allowed to speak” (p. 103).  Challenging speakers, such as Heather Mac Donald and David Horowitz, are shouted down or prevented from appearing on campuses, for they might distress the feelings of some groups.  Advocates of safe spaces insist that “tolerance, affirmation and respect” therein provide a good environment for learning, though no empirical studies demonstrate as much.  In fact, from Socrates onward it’s been assumed that learning advances when one is forced to examine his beliefs and test his presuppositions with a commitment to embracing even uncomfortable truths.  

Conjoined with “safe spaces” are the efforts to censor free speech, which have accelerated since 1980.  Certain words simply cannot be uttered!  Though profanity (as traditionally understood) flourishes in dormitories and classrooms, legions of taboo words are now forbidden.  Thus one may no longer refer to his “wife”—though “partner” is allowed.  In elite universities one may proudly be a “Native American” but never an “Indian.”  “Censorship, which was once perceived as an instrument of authoritarian attack on liberty, is today often represented as an exercise in sensitive behavior management” (p. 102).  Even threatening ideas must be policed, with professors issuing “trigger warnings” that exempt sensitive students from exposure to them!  Classic texts, ranging from Sophocles’ Oedipus the King to Mark Twain’s Huckleberry Finn to J. D. Salinger’s The Catcher in the Rye, are now suspect!  Feminists especially object to reading classic texts they brand misogynist.  

Thus “microaggressions,” even though unintentional and even unconscious, cannot be tolerated.  Lurking behind hurtful words there must be gravely immoral thoughts!  “You can’t think that” is now an acceptable policy on some campuses.  According to one influential theorist:  “Many racial microaggressions are so subtle that neither target nor perpetrator may entirely understand what is happening.”  But this may well make them “more harmful to people of color than hate crimes or the overt and deliberate acts of White Supremacists” (p. 119).  Generally speaking, only the ones who suffer from these verbal assaults really understand their evil.  An offense is in the eye of the beholder!  Students on many campuses are now demanding the right to anonymously inform on their professors’ microaggressions, and “Bias Response Teams” have been formed to enforce proper discipline on them.  To prevent hurt feelings, for example, UCLA now “prohibits people from asking Asian-Americans the question ‘Where are you from or where were you born?’” lest they feel non-American.  Nor can you say “America is a land of opportunity” lest someone feel that such is not true for him (p. 109).  Correcting a student’s grammar may lead to complaints of “white privilege” and racial bias.  

The culmination of these developments, Furedi says, is “the quest for a new etiquette.”  Traditional ways, including chivalrous conduct, have generally dissolved.  To replace them we find what Jürgen Habermas described as “the juridification of everyday life” (p. 125).  Yet exactly what kinds of behavior may now be condemned or approved and enacted into law remains undecided.  Administrative decrees, more psychological than philosophical in justification, seek to regulate activities but lack deeply moral (and especially religious) justification, so they quickly change and often defy common sense.  “The rhetoric of campus guidelines tends to avoid the language of right and wrong or good and evil, appealing instead to the therapeutic language of feelings” (p. 128).  To make sure feelings are protected, universities employ legions of sensitivity experts, trainers, and workshop “facilitators” to raise “awareness,” enforce speech codes, and punish microaggressions.  Millions of dollars are yearly expended to deal with “sexual harassment” complaints.  Students must be properly acculturated to the modern ethos, so Cambridge University now promotes “events ‘to celebrate Lesbian, Gay, Bisexual and Transgender (LGBT) History Month, Black and Ethnic Minority (BME) History Month, International Women’s Day (IWD), International Day of Persons with Disabilities (IPDP) and Holocaust Memorial Day (HMD)’” (p. 135).  No victim groups may be ignored lest someone’s self-esteem decay!  

None of us really knows where this will all end.  But Umberto Eco was certainly prescient when he said that “even though all visible trees of 1968 are gone, it profoundly changed the way all of us, at least in Europe, behave and relate to one another.”  He added that “relations between bosses and workers, students and teachers, even children and parents, have opened up,” and that therefore “they’ll never be the same again” (p. 134).  If Furedi’s right, universities have wasted their patrimony and may never regain their rightful place in modern culture.  

* * * * * * * * * * * * * * * * * * *

During spring break in 2006, the captains of Duke University’s lacrosse team hired two strippers, including twenty-seven-year-old Crystal Mangum, to perform at an off-campus party.  Such events were not particularly notable, since 20 or so had occurred at the university that year.  But Mangum subsequently claimed to have been raped, provoking a sensational series of events carefully recorded by Stuart Taylor Jr. and KC Johnson in Until Proven Innocent:  Political Correctness and the Shameful Injustices of the Duke Lacrosse Rape Case (New York:  St. Martin’s Press, c. 2007).  Beyond the incident itself, which illustrates the sexual tone of today’s universities, it’s a disturbing story laced with racial tensions, political aspirations, faculty prejudices, administrative cowardice, and media malpractice.  Though dense with details, the book fully engages the reader, alerting him to some of the troublesome aspects of 21st century culture.

Though best known for its basketball prowess, Duke’s lacrosse team was a perennial powerhouse, routinely competing for national championships.  Accordingly, the team featured many fine athletes, often graduates of elite prep schools where lacrosse was emphasized.  These athletes were, moreover, generally outstanding students, bound for the graduate and professional schools which train the doctors and lawyers their parents envisioned.  The stripper, on the other hand, had a checkered background, marked by a failed marriage, illegitimate children, prostitution, and mental problems.  But she was poor and black, born and reared in Durham.  And the lacrosse players, with one exception, were white, the scions of wealthy families.  The two strippers’ performance lasted all of four minutes, in part because Mangum was apparently too drunk to stand, much less dance.  She and her colleague, Kim Roberts, departed the house, though Mangum passed out on the back stoop.  She said nothing to Kim about being raped, nor did she say anything to a security guard who subsequently called 911 and tried to help her, nor to the police who responded.  Taken to the hospital, she was examined by doctors and nurses, who found no signs of rape.  Finally, however, a feminist nurse who considered herself an advocate for rape victims filed her own report, and it became the basis for later rape accusations.  Throughout the process Mangum’s story continually changed, so it was never clear exactly what had transpired with the lacrosse team. 

When police received information regarding the incident, a Durham detective well-known for his antipathy to Duke students took charge of the investigation.  He had no interest in interviewing Kim Roberts, who best knew what actually happened.  When another policeman interviewed her, six days after the alleged rape, Roberts declared the sexual assault story “a crock,” and her handwritten statement “contradicted Mangum on all important points” (p. 57).  The lead detective also refused to consider any data regarding Mangum’s career as a prostitute, though in time one of her associates testified to “taking her to jobs in three hotels with three different men” on the nights preceding the lacrosse party.  The detective also failed to interview the doctor who had actually performed the pelvic exam when Mangum was admitted to the hospital.  When shown pictures of all the lacrosse players, Mangum could not identify any of them with certainty, and one of those she fingered was nowhere near the party that night.  

Taking an even greater interest in her case was District Attorney Michael Nifong, who envisioned it leveraging his political career in the coming election.  He needed the support of the black community in Durham as well as the liberal professors at Duke, so he quickly discerned how supporting Crystal Mangum’s rape accusations would ultimately enable him to win the upcoming election.  In a series of inflammatory press releases, Nifong branded the Duke athletes “rapists” fully deserving the vigorous prosecution he would pursue.  Many of his statements were demonstrably false, but newspapers and media outlets across the nation soon picked up on the case, almost unanimously assuming the guilt of the accused players.  Virtually everywhere there was a simple objective:  “Lynch the privileged white boys.  And due process be damned” (p. 121).  Writers for the New York Times and TV personalities such as Nancy Grace and Joe Scarborough cheered the mob of outraged folks determined to punish the “rapists.”  Few journalists cared to find the truth!  (Amazingly, the most balanced publication dealing with the case was Duke’s student newspaper!)  Inevitably, Jesse Jackson showed up, trumpeting his support for an abused black woman, and the local NAACP applauded Nifong’s every move!  

So too the Duke administrators (most especially President Richard Brodhead) and professors (especially from the African-American and women’s studies programs) began to loudly denounce the lacrosse players, apparently committed to the notion that any woman claiming to have been raped must be telling the truth.  Here was an illustration of the “morality tale” of “virtuous black women brutalized by white men” (p. 66).  The Duke faculty launched hysterical attacks on the lacrosse team.  (Many professors simply resented the fact that many thought of Duke in terms of its athletes, while they wanted the institution to bask in an aura of academic excellence.)  Many had a deep commitment to the feminism on display in the yearly “Take Back the Night” rallies.  And virtually all of them wanted to publicly bear witness to their racial sensitivities and liberal proclivities.  To some teachers, the players should be punished for rape “‘whether it happened or not’” since it would help compensate “‘for things that happened in the past’” (p. 170).  Even as evidence proving the athletes’ innocence steadily mounted, Duke’s professors “served as enthusiastic cheerleaders for Nifong,” and “for many months not one of the more than five hundred members of the Duke arts and sciences faculty—the professors who teach Duke undergraduates—publicly criticized the district attorney or defended the lacrosse players’ rights to fair treatment” (p. 105).  The more radical the professor (e.g. Houston A. Baker, a past president of the Modern Language Association) the more the mainstream media loved to interview him!  Long before the trial, these professors simply assumed the men were guilty—and, of course, took the case as an illustration of how America is a racist, sexist society!  Only one lonely professor, a chemist, dared stand up and defend his friend, the lacrosse team’s coach!  
In the judgment of Thomas Sowell:  “‘The haste and vehemence with which scores of Duke professors publicly took sides against the students in this case is but one sign of the depth of moral dry rot in even our prestigious institutions’” (p. 117).  

Fortunately for the lacrosse athletes, several had parents with the means and connections to assemble a strong legal defense team.  These lawyers early saw the flaws in Nifong’s accusations and found solid evidence (especially DNA) upholding the innocence of their clients.  All of the players cooperated with the police, submitting to lie detector exams and volunteering the blood samples requested for DNA tests, which proved to be the “biggest defense bombshell,” since the State Bureau of Investigation reported that “no DNA material from any young man tested was present on the body of this complaining witness” (p. 162).  Then the athletes’ attorneys demonstrated “the staggeringly conclusive evidence of innocence, and of probable Nifong misconduct” (p. 302).  Violating an operating rule for prosecutors, Nifong had refused to even look at evidence collected by defense attorneys, something “unheard of” in legal circles, while pushing his case through a grand jury and bringing it to trial.  But in time the evidence did, in fact, become public, and the athletes were vindicated.  

Cracks in the prosecution’s case began with blogs such as Liestoppers dissecting the mainstream media’s presentations.  Articles in the New York Times were shown to be filled with egregious errors, deliberately omitting crucial evidence countering Nifong’s claims.  Then a few TV programs—most notably Sean Hannity’s—questioned the assumed guilt of the lacrosse athletes.  Students on the Duke campus—many resenting the malicious role their professors played in the process—increasingly sided with the team and believed that Crystal Mangum had lied.  Ultimately CBS’s 60 Minutes, after a lengthy investigation, declared “the rape claim was a fraud and Nifong was guilty of outrageous misconduct” (p. 282).  When Nifong faced the defense attorneys in a preliminary hearing, his case quickly unraveled.  It became clear that he and one of his expert witnesses had conspired to hide evidence, and he dropped the rape charge.  He “had engaged in grossly unethical—perhaps criminal—misconduct, and the case against the lacrosse players was a travesty” (p. 317).  He lost face, soon resigned his office, and would finally be disbarred.  His effort to punish the innocent “may well have been the most egregious abuse of prosecutorial power ever to unfold in plain view” (p. 356).  In sum, Nifong was guilty of “demonizing innocent suspects in the media as rapists, racists, and hooligans; whipping up racial hatred against them to win an election; rigging the lineup to implicate them in a crime that never occurred; lying to the public, to the defense, to the court, and the State Bar; hiding DNA test results that conclusively proved innocence; seeking (unsuccessfully) to bully and threaten defense lawyers into letting their clients be railroaded” (p. 356). 

But even more shameful than the district attorney were the Duke faculty and administration!  Even when the evidence proved the lacrosse athletes innocent, activist professors remained belligerent and unrepentant!  Eighty-seven professors published a letter repudiating any efforts to make them retract or apologize for their slanders.  Instead, they attacked the bloggers, students, and journalists who defended the athletes.  So too the NAACP, The New York Times, and other powerful organizations refused to retract their slanders or seek to do justice to the maligned men.  Even if they did no wrong, it seems, they represent what’s wrong in this nation’s racist/sexist/classist society!  Anyone concerned with justice in America needs to know what happened at Duke—and is still happening in other sectors of the USA.

292 “Lo, the Poor Indian”

Reflecting a pervasive Enlightenment perspective—and presaging Jean-Jacques Rousseau’s Romantic admiration for America’s “Noble Savage”—Alexander Pope, in his Essay on Man, declaimed: 

Lo, the poor Indian!  whose untutored mind
Sees God in clouds, or hears him in the wind;
His soul proud Science never taught to stray
Far as the solar walk or milky way;
Yet simple nature to his hope has given,
Behind the cloud-topped hill, an humbler heav’n.

Neither Pope nor Rousseau knew much about the New World’s indigenous inhabitants, but that didn’t dissuade them from making authoritative pronouncements, and similar ignorance has infected much that’s been written or portrayed about Indians ever since.  Thus today many folks imagine they understand them  as a result of watching a TV special on the Dakota Access Pipeline or listening to alleged “Native American” spokesmen leading protests in various locales.    

Illustrating this ignorance is the widespread circulation (especially in environmentalist circles) of an alleged statement made by Chief Seattle in 1851.  The quotation declared:  “Every part of this earth is sacred to my people.  Every shining pine needle, every sandy shore, every mist in the dark woods, every clearing and humming insect is holy in the memory and experience of my people.”  Seattle’s words, duplicated in many books and displayed on schoolroom posters, effectively persuaded many Americans that the First Americans were the First Environmentalists, carefully husbanding the natural world, walking softly on Mother Earth.  In fact, the speech was written in 1972 by a Texas scriptwriter working on a film produced by the Southern Baptist Radio and Television Commission!  It fit the mood of the moment, whether or not it had any historical veracity, and became part of the nation’s folklore!  Certainly it helped establish one of the many misleading stereotypes that in the long run serve to harm Indian people.  

Endeavoring to better root us in reality, Naomi Schaefer Riley recently toured the United States and Canada gathering material for her insightful The New Trail of Tears:  How Washington Is Destroying American Indians (New York:  Encounter Books, c. 2016).  To understand anything we need first to describe it and then think clearly to explain it.  So Riley proffers careful descriptions accompanied by reasoned analysis.  Her descriptions remind us of similar accounts through the centuries—tribal peoples beset by a multitude of problems (including the highest poverty rate and lowest life expectancy of any racial group, shocking suicide numbers, alcohol and drug abuse, rape, sexual abuse, and widespread gang activity).  Her analysis, however, invites us to think hard about the glaring failures of the latest in a long list of “saviors”—the federal government.  Rapacious frontiersmen and ruthless armies harmed Indians in the past, but today the primary culprit responsible for their predicament is the government, the pretentiously benevolent Welfare State.  

“As you’ll see in this book,” Riley says, “the problems American Indians face today—lack of economic opportunity, lack of education, and lack of equal protection under law—and the solutions to these problems require a different approach from the misguided paternalism of the past 150 years.  It’s not the history of forced assimilation, war, and murder that have left American Indians in a deplorable state; it’s the federal government’s policies today” (#149).  More troubling:  Indians provide us a “microcosm of everything that has gone wrong with liberalism,” caused by “decades of politicians and bureaucrats showering a victimized people with money and sensitivity instead of what they truly need—the autonomy, the education, and the legal protections to improve their own situations” (#149).  

Consider this:  the federal bureaucracies charged with responsibility for the nation’s one million reservation Indians, the Bureau of Indian Affairs (BIA) and the Bureau of Indian Education (BIE), employ 9,000 employees—roughly one bureaucrat for every 100 Indians.  The feds’ funding “for education, economic development, tribal courts, road maintenance, agriculture, and social services” was almost $3 billion in 2015.  Consequently:  “Tribal leaders only demand more money from Washington to fix their problems.  And the senators and congressmen who represent them are only too glad to oblige in return for the votes of the populations” (#2910).  Yet extraordinary unemployment rates, coupled with tribal ownership of land and reliable welfare payments, leave virtually all reservations poverty-stricken.  Lacking private property rights, reservation Indians (whose lands are tribally owned but held “in trust” by the federal government) almost inevitably suffer what economists call “the tragedy of the commons.”  Theoretically, everyone owns the land, but no one owns any actual parcel, so no one takes responsibility for any of it.  But everyone gets annuities (and in many areas, per capita dividends from tribal casinos) that provide subsistence without needing to work—and therein lies much that’s wrong with the reservations.  

Still more:  endless federal regulations dictate how reservation lands may be used—and make it virtually impossible to use them productively!  Entrepreneurs and venturesome economic projects inevitably run afoul of a nanny state determined to ensure that Indians will always be the “Indians” suitable to bureaucrats who often operate in accord with sentimental myths rather than observable realities.  Thus, for example, Michelle Obama could tell a gathering of Indian youngsters that “‘on issues like conservation and climate change, we are finally beginning to embrace the wisdom of your ancestors’” (#567).  Had she simply driven through most any reservation she could have seen how little ancestral wisdom regarding the “sacred land” may be found in Indian country!  Here the results of the Obamas’ antipathy to developing natural resources can be demonstrated.  Reservations sit on enormous coal, uranium, oil and gas reserves, but “86% of Indian lands with energy or mineral potential remain undeveloped because of Federal control of reservations that keeps Indians from fully capitalizing on their natural resources if they desire” (#450).  Even a superficial assessment of Indian affairs should persuade one that the money expended on behalf of the Indians hardly helps (and probably harms) them.

The greatest natural resource, of course, is people, and children must be well educated in order to develop their potential.  The Bureau of Indian Education (BIE), a notably inefficient bureaucracy,  expends about $850 million providing for its “42,000 students (most children on reservations don’t attend BIE schools), which amounts to about $20,000 per pupil, compared with a national average of $12,400” (#87).   Only half of the students in high school graduate, and those who do frequently have less-than-adequate skills.  Providing details, Riley sets forth a sobering assessment of the schools under federal jurisdiction.  In one school on the Crow reservation in Montana, for example, $27,304 per pupil was expended—compared with $10,625 in non-Indian state schools.  Yet the graduation rate was 39%!  There, and everywhere you look, Indian schools are “among the worst in the nation” (#1562).   In stark contrast, the Saint Labre Catholic schools in southeastern Montana serve 800 Crow and Cheyenne children.  These Catholic schools take no federal monies and do nicely, enjoying a dropout rate of only one percent!  And large numbers of their graduates go on to study in college.  (There are some bright lights in Indian country, but they’re rare.)  

Aware of the educational failures of reservation schools, distraught parents and students usually blame the lack of discipline and qualified teachers, as well as nepotism-infected tribal administrations, though they also point to the breakdown of the family as the primary culprit.  Some youngsters who graduate high school then attend one of the 32 federally-funded tribal colleges, where they often study tribal traditions or arts and crafts.  Rarely do they graduate and attend a university, nor do they learn much they can use apart from the reservation.  Sadly:  “Every school on the [Pine Ridge, Sioux] reservation is scrambling for teachers.  But the tribal school—Oglala Lakota College—doesn’t even offer a degree in secondary education” (#2258).  Rather than training youngsters to effectively help their people, most colleges cater to personal proclivities, often traditional arts and crafts.  Thus  “‘The Tribal Institute of American Indian Arts in New Mexico,’ according to the Atlantic, ‘spends $504,000 for every degree it confers . . . more than Harvard or MIT’” (#1578).  

The Indians doing the best these days are the ones whose ancestors lost their lands in the 19th century—or individuals who leave the reservation and find their way in the broader culture.  Descendants of the Five Civilized Tribes in eastern Oklahoma are certainly prospering nicely when compared with the reservation-rooted Sioux and Navajo.  So too the Lumbees (in the Lumberton, North Carolina, area), lacking language, chiefs, and tribal land, blended into the area’s population.  Fully assimilated, they supported a decent school system and also embraced the “passionate Baptist faith that, to a person, they today profess” (#1292).  In one Lumbee’s opinion, their success resulted from the “tribe’s independence from the federal government.  ‘Indians had to pay for everything themselves here.  They had pride in the people who built it’” (#1300).  They could also own and develop, buy and sell land.  

The Lumbees weren’t wealthy, but they were doing okay.  Then politicians in Washington D.C. decided to help them!  Overwhelmed with liberal guilt following WWII, the feds decided to allow landless tribes to “reconstitute” themselves.  In 1975 President Gerald Ford signed the Indian Self-Determination and Education Assistance Act, opening the coffers for grants to law enforcement, education, and environmental programs.  Increasingly, Indians could qualify for generous welfare programs.  As a result, increasing numbers of younger Lumbees ceased working and now waste their days doing drugs.  Today’s schoolchildren are notably less well-educated than their grandparents!  Whereas churches used to help the needy, the government now hands out money and enables them to idly self-destruct.  An older Lumbee, Ronald Hammonds, a successful cattle farmer, laments:  “‘Women are encouraged to have babies.  It’s economic development.  You get a check.  We’ve got more illegitimate kids than ever, and it’s getting worse.’  He calls the local housing project a ‘breeding ground’ and says that the children are mostly being raised by their grandmothers.  ‘They’ve got no responsibility.  They’re looking for the government as the solution to all our problems’” (#1419).  The only answer to the many problems the Lumbees now face, Hammonds thinks, is to get the government out of their lives.  

And that’s basically the solution Riley recommends:  eliminate the dependency engendered by the reservations!  That would, of course, mean much anguish in Indian communities—and in the non-Indian liberals who empathize with them.  But it may be the only “tough love” way to free the most impoverished peoples in America.  Indicating how little things have actually changed in 150 years, read carefully the final paragraph in Our Wild Indians:  Thirty-Three Years’ Personal Experience Among the Red Men of the Great West (Hartford, CT:  A. D. Worthington and Company, c. 1883).  Colonel Richard Irving Dodge, who knew the Indians as well as any 19th century writer and described them with a relentless honesty, harbored no romantic or humanitarian illusions regarding either them or their cultures.  “The only hope for the Indian,” he wrote, “is in the interest and compassion of a few men, who, like the handful of ‘Abolitionists’ of thirty years ago, have pluck and strength to fight, against any odds, the apparently ever losing battle.  These in turn must rely upon the great, brave, honest human heart of the American people.  To that I and they must appeal.  To the press; to the pulpit; to every voter in the land; to every lover of humanity.  Arouse to this grand work.  No slave now treads the soil of this noble land.  Force your representatives to release the Indian from an official bondage more remorseless, more hideous than slavery itself.  Deliver him from his pretended friends and lift him into fellowship with the citizens of our loved and glorious country” (#9377).  

* * * * * * * * * * * * * * * * * * * * * * * 

Inasmuch as the main focus of my graduate study at the University of Oklahoma was Western American History—writing my master’s thesis and doctoral dissertation on Cherokee history—I for many years often taught a class entitled “The First Americans.”  One of the books I either required or recommended was Dee Brown’s Bury My Heart at Wounded Knee:  An Indian History of the American West, though I warned students to take it as more of a pro-Indian polemic than balanced history.  Despite its bias, it presented the post-Civil War Indian wars in a very readable way and alerted readers to the mistreatment of tribal peoples.  Were I still teaching today, however, I’d have a better source that covers the same terrain—and basically comes to the same conclusions—with more effort to understand both white and Indian perspectives, to see both good and evil in each group of people.  

It’s Peter Cozzens’ The Earth Is Weeping:  The Epic Story of the Indian Wars for the American West (New York:  Alfred A. Knopf, c. 2016).  Fortunately, says Cozzens, there are primary sources that were unavailable to Dee Brown, and he can “tell the story equally through the words of Indian and white participants and, through a deeper understanding of all parties to the conflict, better address the many myths, misconceptions, and falsehoods surrounding the Indian Wars” (#351).  He provides in-depth descriptions and interesting details regarding Indian warriors’ training and skills as well as those of the U.S. Army recruits who opposed them.  Still more:  he effectively shows how Indians themselves (through intra-tribal rivalries and conflicts as well as inter-tribal animosities) contributed to their defeat.  In many ways the book simply fills in the details contained in a succinct statement made by General George Crook, who fought many a battle with them:  “I do not wonder, and you will not either, that when Indians see their wives and children starving and their last source of supplies cut off, they go to war.  And then we are sent out there to kill them.  It is an outrage.  All tribes tell the same story.  They are surrounded on all sides, the game is destroyed or driven away, they are left to starve, and there remains but one thing for them to do—fight while they can.  Our treatment of the Indian is an outrage” (#318).  

After setting the stage with a discussion of United States developments and policies, as well as Indians’ tribal traits and migrations onto the Great Plains, Cozzens turns to Red Cloud’s War in 1866.  Determined to halt the movement of miners into Montana’s gold camps, Red Cloud (leading Oglala and Miniconjou Sioux warriors) prevailed, defeating an army detachment at the Fetterman “massacre,” and subsequently signed the second Treaty of Fort Laramie in 1868, closing the Bozeman Trail and securing for the Lakotas the “Great Sioux Reservation” (today’s South Dakota west of the Missouri River), to be maintained for their “absolute and undisturbed use and occupation.”  Red Cloud’s “victory” was a rare Indian triumph—but it hardly arrested the westward movement of American pioneers.  

The Lakota and Northern Cheyenne further enjoyed two brief victories in 1876—the battles at the Rosebud and the Little Bighorn in southeastern Montana.  At the Rosebud, General Crook was repulsed by warriors following Crazy Horse.  Days later, Lieutenant Colonel George Armstrong Custer led his Seventh Cavalry into the Little Bighorn region, where he encountered one of the largest encampments of Sioux and Cheyenne (7,000 Indians; 1,800 warriors) ever assembled.  He’d bragged that his Seventh Cavalry could “whip all the Indians in the Northwest,” but at the Little Bighorn he proved himself a poor prophet.  Following Crazy Horse, Sitting Bull and Gall, the Indian warriors slew 258 troopers, losing only 31 of their own.  After the battle, Sitting Bull said:  “‘I feel sorry that too many were killed on each side.  But when Indians must fight, they must’” (#5249).  Custer’s last stand, however, was the northern tribes’ last stand, for the army thereafter sent column after column (frequently in winter, burning their lodges and food supplies) after the hostiles and effectively broke their will within a few years.  With the surrender of Crazy Horse, the last renegade Lakotas came to terms with the United States and accepted their lot as reservation Indians.  After taking refuge in Canada for a few years, Sitting Bull too surrendered early in 1881.  “‘Nothing but nakedness and starvation has driven this man to submission,’ concluded a sympathetic army officer, ‘and not on his own account but for the sake of his children, of whom he is very fond’” (#6070).  

On the Southern Plains, at the same time, the Cheyennes and Arapahoes were defeated (in part by Custer’s massacre of Black Kettle’s peaceful village of Southern Cheyennes on the Washita River) and confined to a reservation in the western part of Indian Territory, to be joined soon thereafter by the Kiowa and Comanche (finally defeated in the Red River War of the 1870s).  Adding to the relentless might of the military, the Indians further faced the loss of the buffalo—the enormous herds that supplied their every need in 1865 were simply gone by 1875.  Buffalo hunters, killing the animals for their hides, nearly wiped out the species!  Hide hunters, Phil Sheridan said, did “more to settle the Indian Problem in two years than the army had done in thirty.  For the sake of lasting peace, let them kill and skin until the buffalo are exterminated” (#3100).  And without the buffalo, the Indians either starved or begged for rations at army forts.  

In the Far West, the Modocs were defeated in northern California.  The Nez Perces, led by Chief Joseph, were forced from their Oregon homeland and conducted an epic struggle, coursing through 1,700 miles of Idaho and Montana before surrendering near the Canadian border.  The Utes of the Rocky Mountains were defeated and relocated to reservations in Utah and southern Colorado.  In the Southwest, the Apaches under Cochise and Victorio waged some resourceful guerrilla wars, but with the defeat of Geronimo’s small band in 1886 that region was pacified.  At the end, some 5,000 troops were involved in corralling eighteen warriors led by Geronimo and Naiche!  Though there is a certain aura around Geronimo, those who knew him best generally disliked him.  One Apache leader said:  “‘I have known Geronimo all my life up to his death and have never known anything good about him.’”  The daughter of Naiche agreed:  “‘Geronimo was not a great man at all.  I never heard any good of him’” (#7348).  Significantly, the troops who most effectively hunted down the Apache bands were other Apaches, equally skilled in tracking and surviving in harsh environs.  General George Crook, one of the officers engaged in the Indian wars for three decades, said:  “‘In warfare with the Indians it has been my policy—and the only effective one—to use them against each other’” (#7566).  

The post-Civil War conflicts in the American West were consummated in a massacre at Wounded Knee, South Dakota, in 1890.  Hundreds of despairing Lakotas had been captivated by a new religious movement, the Ghost Dance.  A Paiute medicine man in Nevada, Wovoka, meshed native and Christian traditions and urged followers to dance incessantly to usher in a wonderful world devoid of white men and their oppression.  Though most Indians disdained the movement, fervent practitioners worried officials in the Indian Bureau, whose agents insisted the army suppress it.  In a convoluted chapter of the ferment, Sitting Bull was arrested and killed by Indian policemen.  Then a 65-year-old Miniconjou chief named Big Foot decided to lead his band to safety on the Oglalas’ Pine Ridge Reservation.  Confusion and misunderstanding led to a violent confrontation along Wounded Knee Creek, and at least 150 Sioux (mainly women, children, and old men) died.  

Thirty years of Indian wars had ended.  And Peter Cozzens provides the most readable, accurate account of them I’ve read.  

# # # 

291 The War on Humans

When, during the last presidential debate, Hillary Clinton defended all forms of abortion (the deliberate taking of an unborn, innocent human being’s life at any time in a woman’s pregnancy), she graphically illustrated her party’s position in this nation’s decades-long cultural war.  Though the defenders of life have won some important battles, pro-abortion forces still occupy commanding positions on the battlefield.  That truth is powerfully illustrated in Ann McElhinney and Phelim McAleer’s investigative treatise—Gosnell:  The Untold Story of America’s Most Prolific Serial Killer (Washington:  Regnery Publishing, c. 2017).  Four things struck me while reading the book:  1) the sheer barbarity of the late-term abortions performed by Kermit Gosnell, M.D., in his Philadelphia Women’s Medical Society clinic, wherein an estimated “40 percent of the babies aborted . . . were over the gestational age limit for legal abortion in Pennsylvania” (#2318); 2) the utter indifference and dereliction of state officials required to inspect and regulate abortion clinics; 3) the lock-step commitment of the nation’s media to ignore, obscure, or at least minimize Gosnell’s crimes; and, 4) the irony of some abortions (late-term) qualifying as murder whereas others (the million or so done yearly in Planned Parenthood facilities) have absolute legal protection.  As Kirsten Powers said, “‘whether Gosnell was killing the infants one second after they left the womb instead of partially inside or completely inside the womb—as in routine late-term abortion—is merely a matter of geography.  That one is murder and the other is a legal procedure is morally irreconcilable’” (#157).  

McElhinney and McAleer are Irish journalists who were drawn to the story by its intrinsic merit rather than because of any pro-life convictions.  Indeed, Ann McElhinney had “never trusted or liked pro-life activists” (#127).  Then, as she began covering Gosnell’s trial, she realized that “pro-abortion advocates tend to avoid any actual talk of how an abortion is done and what exactly it is that is being aborted.”  But now she knows!  And she also now knows that “what is aborted is a person, with little hands and a face that from the earliest times has expression.  The humanity in all the pictures is unmistakable, the pictures of the babies that were shown as evidence in the Gosnell trial—first, second, and third trimester babies, in all their innocence and perfection” (#140).  While researching and writing she “wept at my computer.  I have said the Our Father sitting at my desk.  I am no holy roller—I hadn’t prayed in years—but at times” she could do nothing else.  Even more profoundly, she sensed “the presence of evil,” the sheer lack of conscience, pervading the pro-abortion establishment.

The Gosnell case began with a drug investigation launched by a Philadelphia undercover narcotics investigator, Jim Wood, who was getting drug peddlers to reveal the sources of illegal prescriptions for drugs like OxyContin.  A tangled web of informants led Wood to Dr. Kermit Gosnell, who turned out to be “one of the biggest suppliers in the entire state of Pennsylvania,” operating out of his Women’s Medical Society clinic (#302).  Therein investigators discovered far more than a drug emporium!  They found “a filthy, flea-infested, excrement-covered” abortion clinic almost impossible to describe.  Urine and blood discolored the floors; trash, cat excrement and hair littered the facility.  They found “semi-conscious women moaning in the waiting room.  The clinic’s two surgical procedure rooms were filthy and unsanitary,” featuring rusty equipment and non-sterile instruments (#452).  Unqualified, unlicensed staff members had administered sedatives and cared for the patients—one worker had an eighth-grade education and a phlebotomy certificate!  Another liked being paid in cash (and given free Xanax, OxyContin, Percocet, etc.) because it enabled her to continue drawing fraudulent disability benefits from the Veterans Administration.  “The basement was filled with bags of fetal remains that reached the ceiling” (#527).  In a cupboard there were jars filled with little baby feet—apparently something of a fetish for Gosnell.  Dead babies were found in various containers, stored in refrigerators and freezers.  “Investigators found the remains of forty-seven babies in all” (#632).  It was truly a house of horrors!

Evidence collected from the clinic and Gosnell’s house, as well as testimony from his staff and patients, was presented to a grand jury, which spent a year combing through it.  “The final report, published on January 14, 2011, is a complete page-turner, a chronicle of how America’s biggest serial killer got away with murder for more than thirty years.  In its gruesome 261 pages, the grand jury named and shamed—and in some cases recommended charging—the doctor, his wife, and most of his staff, along with officials in numerous state government agencies, all the way up to the governor” (#798).  Indeed, it became clear that multiple complaints had been filed against the clinic over 30 years and “the incompetents in Harrisburg, Pennsylvania’s state capital, knew or should have known that, even by their own lax rules, Gosnell should not have been carrying out abortions—but they didn’t care” (#1201).  Their dereliction was facilitated by the 1995 election of a “pro-abortion Catholic Republican,” Tom Ridge, whose policies proved “catastrophic for the many women and the hundreds of live babies who were injured and killed in Gosnell’s clinic” (#1473).  To one Philadelphia-area reporter, Ridge was “‘Gosnell’s chief enabler’” (#1480).  

Given the laws in Pennsylvania, to be charged with murder it was necessary to prove that some of the babies had been born alive and subsequently killed by Gosnell.  The case also had to be made before an openly pro-abortion judge who “was keen to draw attention away from the abortion establishment closing ranks, protecting one of their own and protecting abortion, regardless of the harm done on the way” (#1685).  Even the “partial-birth” procedure, whereby the baby’s head remained in the birth canal while the torso and legs were outside the mother, could not be labeled “murder” since the law allowed it.  Gosnell’s staff, however, testified to seeing many babies born alive and then killed (snipping their necks with scissors) by the doctor.  Importantly, for the trial, they had also taken some pictures of the slain babies that would provide vital evidence for the prosecution.  Ultimately he would be “charged with seven counts of first-degree murder and two counts of infanticide, and conspiracy to commit murder.  But from the evidence, it’s fair to assume that he murdered hundreds—perhaps thousands—over the course of his career” (#2684).  

Refusing a plea deal that would have led to his incarceration but spared his wife, Gosnell stood trial confident that he would be found innocent of all charges.  “His desire to appear as the smartest guy in the room overpowered all reason and good sense” (#2952), and he even fantasized about serving as his own defense attorney!  His actual attorney, considered by many the best in Philadelphia, portrayed him “as a hardworking, selfless man—a pillar of the community with a virtually unblemished record who ran afoul of an overzealous prosecutor” (#3308).  Given a jury cleansed of pro-life persons and a pro-abortion judge who was a drinking companion of the defense attorney, Gosnell thought he could escape punishment by appealing to the pro-abortion ethos prevalent in progressive circles.  Nevertheless, as the evidence was presented and the expert testimony given, showing graphically what takes place in “late term” abortions as well as the killing of born-alive infants, the jury found Gosnell guilty as charged, and he was sentenced to life in prison.  

What the jury saw, however, went largely unreported by the nation’s media.  “If it hadn’t been for a committed group of bloggers, new media journalists, pro-life activists, and Twitter users, the Kermit Gosnell trial very likely would not have made national news” (#3983).  If a journalist mentioned the case it was usually to stress how virtually all other abortions were different from those performed in the Philadelphia clinic.  But then Kirsten Powers wrote a piece for USA Today, harshly condemning the press for neglecting the trial.  She’d found that none of the major TV networks mentioned the case during its first three months.  Nor did President Obama, who had “worked against the Born-Alive Infants Protection Act” while he was in the Illinois Senate, make any comments or face any questions dealing with his position on Gosnell.  Only Fox News covered “the story from the beginning of the trial” (#4189).  The book’s reception further illustrates the media’s pro-abortion bias.  Within days of its release, it sold out at Amazon and Barnes and Noble, outselling all but three non-fiction titles.  But the New York Times refused to put it on its best-seller list, and no mainstream media reviewed it.  

Ann McElhinney and Phelim McAleer have written a fully documented, compelling treatise.  They obviously read everything relevant to the case, sat day after day witnessing the trial procedures, and later interviewed Gosnell in prison.  Though the surfeit of details—minutely describing the dead babies found in the clinic, investigating the police and prosecutors responsible for bringing the case to trial—may put off readers wanting a short synopsis, Gosnell:  The Untold Story of America’s Most Prolific Serial Killer merits the attention of everyone committed to the sacred “right to life” guaranteed by the Constitution as well as proclaimed in the Scriptures.  

* * * * * * * * * * * * * * * * * * * * * * *  

In the name of Nature, human nature is being denied and degraded in many venues.  Under the guidance of secular humanism, anti-human forces have been unleashed and radical “trans-human” proposals entertained.  As the astute Mortimer Adler long ago predicted (in The Difference of Man and the Difference It Makes), once the clear distinction between human beings and the rest of creation is denied, no reason remains for granting man any special standing (i.e. “human exceptionalism”).  The Great Chain of Being has dissolved, leaving nothing but randomly scattered and essentially equal beings.  For several decades Wesley J. Smith has researched and written about precisely this development, and in The War on Humans (Seattle:  Discovery Institute Press, c. 2014), he challenges some of the growing anti-human (misanthropic) currents in contemporary culture—most notably within an environmentalism “that is becoming increasingly nihilistic, anti-modern, and anti-human” (#107).  This is clear when one confronts the philosophical aspects of the Deep Ecology Movement, which serves for many as a “neo-Earth religion” that considers human beings as no more than technologically sophisticated, consumerist parasites destroying Mother Earth.  Consequently, reducing the human population and giving other species unrestricted opportunities to thrive and multiply becomes the goal.  

This “green misanthropy” denies any moral difference between flora and fauna and human beings, whose numbers need reducing in order to enable other species to flourish.  To Paul Watson, head of the Sea Shepherd Conservation Society, humans are the “AIDS of the Earth.”  Only radical surgery, reducing man’s presence and activity on earth, can save the planet.  Similarly, Eric R. Pianka, a biology professor at the University of Texas who was named the Distinguished Texas Scientist of 2006, suggested it would be good if an ebola pandemic would kill 90% of the human population.  To Pianka:  “Humans are no better than bacteria, in fact, we are just like them when it comes to using up resources. . .  We are Homo the sap, not sapiens (stupid, not smart)’” (#318).  Needless to say, such activists enthusiastically promote abortion, euthanasia, eugenics and genetic engineering.  

Conveniently, today’s green misanthropists have found in the hysteria regarding global warming (or climate change) a useful tool with which to promote their agenda.  Smith claims no expertise in dealing with climate change claims, but he does clearly discern the anti-human tone permeating the discussion.  He also notes that atmospheric carbon levels have steadily increased while no significant increase in world temperatures has been detected in 20 years.  Fearsome predictions abound—as in former Vice President Al Gore’s feverish warnings—but only minor changes have actually occurred.  Polar bears still flourish, ice still forms in the Arctic, snow still falls, crops still grow, hurricanes and earthquakes continue as usual—life on earth continues much as before.  

Yet numbers of schoolchildren fear they will not live into adulthood!  A U.S. senator introduced legislation to punish anyone daring to question the reality of climate change!  NASA’s James Hansen urged “the jailing of oil executives for committing ‘crimes against nature’ for being global warming ‘deniers’” (#811).  An editor at the Los Angeles Times says the paper will no longer print letters to the editor that doubt global warming!  Something has happened.  Vast numbers of folks have succumbed to green propaganda.  “Illustrating just how wacky global warming Malthusianism can become, the Mother Nature Network published an article lauding Genghis Khan—the killer of millions of people—for wonderfully cooling the planet during his years of conquest” (#690).  The author claimed that the Mongol invasions eliminated enough people (ca. 40 million) to keep 700 million tons of carbon from fouling the atmosphere!  So the planet cooled for a century and Khan’s genocide should be praised!  To Smith:  “Only when the new Earth religion reigns can a vicious barbarian like Khan be canonized a saint” (#697).  

In 1972, Christopher Stone, a University of Southern California law professor, published an article, “Should Trees Have Standing?—Toward Legal Rights for Natural Objects,” arguing that trees, as well as humans, should enjoy legal standing.  Subsequently, courts have increasingly granted environmentalists’ claims that legislation (most notably the Endangered Species Act) should be broadly construed so as to guarantee the preservation of all sorts of creatures and environments.  So now we face a new Earth religion that insists all of Nature has inalienable rights, including the right to exist—i.e. to be respected, to procreate, to have access to water.  Laws in nations such as Ecuador, Switzerland (with its “plant dignity” agenda), and New Zealand (declaring the Whanganui River to be a person) now protect such rights.  “In the 1970s,” Smith says, summarizing his presentation, “the values of Deep Ecology were anathema to most.  Ten years ago, granting ‘rights’ to nature would have been laughed off as a pipe dream.”  Yet, as we have witnessed in the rapid acceptance of such innovations as same-sex marriage, “in contemporary society very radical ideas often gain quick acceptance by a ruling elite growing ever more antithetical to human exceptionalism” (#1305).  “The triumph of anti-humanism within environmental advocacy threatens a green theocratic tyranny.  Like eugenics, the misanthropic agendas discussed in this book are all profoundly Utopian endeavors, meaning that the perceived all-important ends will come eventually to justify coercive means.  Indeed, the convergence of human loathing, concentrated Malthusianism, and renewed advocacy for radical wealth redistribution—all of which are now respected views within the environmental movement, and each of which is dangerous in its own right—threatens calamity.

“Don’t say you weren’t warned” (#1380).  

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * 

In A Rat Is a Pig Is a Dog Is a Boy:  The Human Cost of the Animal Rights Movement (New York:  Encounter Books, c. 2010), Wesley J. Smith carefully distinguishes between “animal welfare” (treating animals rightly) and “animal rights” (treating animals as man’s equal) and seeks to show the dangers posed by the latter.  He writes to alert readers to animal-rights radicals who plant bombs, destroy property, burn buildings, condemn all medical research involving animals, urge rigorous forms of vegetarianism, and even justify murder.  He also writes to inform us of the philosophical implications evident in statements such as Richard Dawkins’ declaration that we are not only like apes—“we are apes,” though differently evolved in minor ways!  

Foundational to the animal rights movement is Peter Singer’s 1975 book, Animal Liberation, wherein the first chapter is titled “All Animals Are Equal.”  Fifteen years later he could boast of having launched “a worldwide movement” that would continue to shape men’s minds, extending the rights of “personhood” to whales and dolphins, dogs and cats, cattle and sheep.  Millions now embrace his assumptions and promote his objectives—“rescuing” various animals, halting animal research, throwing paint on fur coats, etc.  “Meanwhile, tens of millions of human beings would be stripped of legal personhood, including newborn human infants, people with advanced Alzheimer’s disease, or other severe cognitive disabilities—since Singer claims they are not self-conscious or rational—along with animals that do not exhibit sufficient cognitive capacity to earn the highest value, such as fish and birds.”  Working out the implications of his position, Singer concluded:  “‘Since neither a newborn infant nor a fish is a person the wrongness of killing such beings is not as great as the wrongness of killing a person’” (p. 28).  Contending newborn infants are not yet persons, he notoriously justifies infanticide until the baby attains “personhood” as he defines it.  

Though Singer speaks more plainly, many other distinguished academics and activists share his opposition to “speciesism,” the notion that humans are intrinsically superior to other forms of creation.  Thus they suggest that “Animals Are People Too,” positing “a moral equality between humans and animals,” making it “immoral for humans to make any instrumental use of animals” (p. 35).  All creatures capable of feeling pain are declared full-fledged members of the moral community.  Thus the People for the Ethical Treatment of Animals (PETA) once orchestrated a campaign “called ‘Holocaust on Your Plate,’ which compared eating meat to the genocide perpetrated by the Nazis against Jews” (p. 36).  

Ever alert to the opportunity of pushing their agenda through the judicial system, animal-rights activists work relentlessly to establish the “personhood” of animals in the courts.  Steven M. Wise, a law professor who heads the Center for the Expansion of Fundamental Rights, contends “that all animals capable of exercising what he calls ‘practical autonomy’ are entitled to ‘personhood and basic liberty rights,’ based on mere ‘consciousness’ and ‘sentience’” (p. 62).  Cass Sunstein, one of the regulation “czars” appointed by President Obama, thinks animals should be granted legal standing, and Harvard Law School’s Professor Laurence Tribe (one of Obama’s instructors) “has spoken in support of enabling animals to bring lawsuits” (p. 67).  To this point, the main success enjoyed in the courts has been in cases restricting or halting medical research using monkeys or on behalf of “endangered species” such as the spotted owl in the Pacific Northwest.  But there is a powerful movement pushing our legal system to grant full equality to all creatures, great and small.

In addition to the courts, animal-rights advocates are working to proselytize children, primarily through the public schools.  Given their childish affection for bunnies and puppies, children easily respond to emotional appeals on behalf of mistreated animals.  PETA comics portray hunters and fishermen as evil people in publications such as “Your Daddy KILLS Animals!”  Young readers are then warned:  “‘Until your daddy learns that it’s not ‘fun’ to kill, keep your doggies and kitties away from him.  He’s so hooked on killing defenseless animals that they could be next!’” (p. 104).  PETA provides teachers with free curriculum materials and guest speakers espousing vegetarianism as well as condemning all forms of animal mistreatment.  High school students are promised legal assistance should they refuse to dissect frogs or dead animals in biology classes.  One organization, Farm Sanctuary, provides schools with materials promoting the “rescue” of animals imprisoned in “factory farms.” 

Smith makes his case by citing an impressive number of sources and presenting arresting illustrations, alerting us to the problems posed by the animal-rights movement.  He also rightly emphasizes “the importance of being human,” caring for animals without elevating them to a sacred status. 

290 Some Polish Perspectives–Legutko & Kolakowski

Though Poland as a nation has frequently suffered occupation and exploitation, Polish artists (Chopin) and thinkers (Pope John Paul II) have blessed the world with their works.  Ryszard Legutko’s recent The Demon in Democracy:  Totalitarian Temptations in Free Societies (New York:  Encounter Books, c. 2016) adds another name to the list of writers dealing with the nature of the modern world.  When communists controlled his country, Legutko edited an underground philosophy journal espousing the principles of Solidarity—the movement that liberated the nation 30 years ago.  “Solidarity,” he says, “stood up in defense of human dignity (in its original and not the corrupted sense), access to culture, respect for the truth in science and for nobility in art, and a proper role given to Christian heritage and Christian religion.  It seemed that suddenly those great ideas at the root of Western civilization—which this civilization had slowly begun to forget—were again brought to life and ignited as a fire in the minds of the members of a trade union” (#819).  Sadly, following the collapse of the Iron Curtain these values quickly evaporated within the liberal-democracy that replaced communism in Poland and now dominates in much of Europe.  

As a practicing philosopher Legutko has, for many years, pondered developments within this "liberal democracy" and has concluded it contains some of the same flaws that made communism so pernicious.   Leftism, even under a "democratic" banner, is still collectivist and authoritarian.  He makes clear that the "liberal-democracy" he critiques is not the classic system espoused by Thomas Jefferson or Winston Churchill but the modern system evident in both the social democracies supporting the European Parliament and America's Democrat Party.  After watching with amazement how easily former communists became champions of liberal-democracy, Legutko argues that the two systems "proved to be all-unifying entities compelling their followers how to think, what to do, how to evaluate events, what to dream, and what language to use.  They both had their orthodoxies and their models of an ideal citizen" (#152).  The European Union (EU) increasingly dictates to rather than represents the people of the continent's nations.  "Even a preliminary contact with the EU institutions allows one to feel a stifling atmosphere typical of a political monopoly, to see the destruction of language turning into a new form of Newspeak, to observe the creation of a surreality, mostly ideological, that obfuscates the real world, to witness an uncompromising hostility against all dissidents, and to perceive many other things only too familiar to anyone who remembers the world governed by the Communist Party" (#174).  

What the two systems share, most deeply, is a commitment to change the world through technology—to "modernize" everything, to bring into being both a new human being and a perfect world.   The past provides neither things worth preserving nor guidance for the future inasmuch as it followed superstitious, medieval, old-fashioned notions.  Neither cultural traditions nor churches nor traditional families nor written constitutions matter, for what's imperative is the construction of a totally new, modern world.  Rather than accepting the givenness of things as created, both communists and liberal-democrats endeavor to transform them; rather than dealing with reality they propose to construct it.  "In both systems a cult of technology translates itself into acceptance of social engineering as a proper approach to reforming society, changing human behavior, and solving existing social problems" (#233).   "In one system [the U.S.S.R.] this meant reversing the current of Siberia's rivers, in the other [the U.S.], a formation of alternative family models; invariably, however, it was the constant improvement of nature, which turns out to be barely a substrate to be molded into a desired form" (#240).  

Legutko devotes a chapter to the shared communistic and liberal-democratic perspective on history—what astute thinkers such as C.S. Lewis condemned as "historicism."  This is the notion derived from Hegel that there is a predetermined, irresistible evolutionary force shaping human events.  To swim with its progressive current is to embrace and champion all things modern.  To be on the "right side of history" is to be altogether wise and righteous.  To oppose, to react against this course of events demonstrates stupidity and misanthropy.  For communists, forcefully establishing an egalitarian socialism is the goal; for liberal-democrats, the same end must be attained through peaceful, electoral means.  Both deeply believe they are the change-agents entrusted with perfecting both human nature and the world in general.  "A comparison between the liberal-democratic concept of history and that of communism shows a commonality of argument as well as of images of the historical process" (#412), generally drawn from Marxist sources:  1)  the triumphant march of freedom, vanquishing tyrannies of various sorts (monarchies; churches); 2)  the liberation of various victim (class; race; gender) groups; and, 3) the ultimate, thoroughly scientific enlightenment of homo sapiens.  

To make his case persuasive, Legutko suggests we imagine the differences between an old man and a youngster.  By virtue of his experience, the old man fears change, knowing it often stems from immaturity and ignorance.  The old man knows much about what has happened, including the tragedies and misfortunes resulting from well-intended, imprudent decisions.  But the youngster thinks he and his companions rightly envision a better world and need only to act quickly to achieve it.  "The old man is balanced in his reactions and assessments, looking for the appropriate courses of action in the world which, according to him, was founded on human error, ignorance, poor recognition of reality, and premature ventures; the youngster has an excitable nature, moving from desperation to euphoria, eagerly identifying numerous enemies whose destruction he volubly advocates, and equally happy to engage in collaborative activities with others because—he believes—the world is full of rational people.  The old man says that, given the weaknesses of the human race, institutions and communities (families, schools, churches) should be protected because over the centuries they have proven themselves to be tools to tame human's evil inclinations; the young man will argue that such institutions and communities need to be radically exposed to light, aired out, and transformed because they are fossils of past injustices.  The old man is a loner who believes that only such an attitude as his can protect the integrity of the mind; the youngster eagerly joins the herd, enjoying the uproar, mobilization, and direct action" (#536).  

Obviously the modern mind is that of a youngster, full of technical information and lofty aspirations, optimistically envisioning "the promise of a great transformation" that has enraptured so many intellectuals since the Renaissance.  Such intellectuals envision themselves as leaders on the "cutting edge of history,"  and they endlessly engage in "a favorite occupation of the youngster:  to criticize what is in the name of what will be, but what a large part of humanity, less perceptive and less intelligent than himself, fails to see" (#565).   A century ago, the U.S.S.R. served as a lodestar for "youngsters" such as John Reed, who sought therein the realization of their dreams.   More recently, the "youngsters" took to the barricades in Paris in 1968 or marched in America's streets in support of Ho Chi Minh.  The '60s revolutionaries chanted "a medley of anarchist slogans, a Marxist rhetoric of class struggle and the overthrowing of capitalism, and a liberal language of rights, emancipation, and discrimination.  Capitalism and the state were the main targets, but universities, schools, family, law, and social mores were attacked with equal vehemence" (#1580).  One need only study carefully the rhetoric and policies of Bernie Sanders and Elizabeth Warren to note the empowerment of those '60s revolutionaries.  

One indication of the adolescent aspect of the modern mind is the importance of entertainment—a point persuasively made three decades ago by Neil Postman in Amusing Ourselves to Death.  In earlier, more religious times, entertainment was understood to be a non-consequential activity designed to provide a brief break from the serious work assigned us.  But, Legutko says:  "In today's world entertainment is not just a pastime or a style, but a substance that permeates everything:  schools and universities, upbringing of children, intellectual life, art, morality and religion" (#753).  Modern entertainment resembles the divertissement so acutely diagnosed by Pascal at the beginning of the modern era:  it's an activity "that separates us from the seriousness of existence and fills this existence with false content" (#753).  We don't escape reality for a few hours—we immerse ourselves in an imaginary world.  "By escaping the questions of ultimate meaning of our own lives, or of human life in general, our minds slowly get used to that fictitious reality, which we take for the real one, and are lured by its attractions" (#760).  

Rivaling historicism in its importance for both communists and liberal-democrats is utopianism, generally flying the multicolored flag of social justice.  "Utopia is thus not a political fantasy but a bold project, bolder than others because it aims at a solution to all the basic problems of collective life that humanity has faced since it began to organize itself politically.  Utopia is—I beg the reader's pardon for such a vile-sounding phrase—the final solution" (#931).  Beginning in the Renaissance, various utopians proposed political solutions to man's ancient ills and aspirations, insisting "man can achieve greatness and be equal to God, because he has unlimited creative potential" (#931).  The republic envisioned by America's Founders was not utopian, but the egalitarian liberal-democracy promoted by 20th century progressives—from Richard Ely and Woodrow Wilson to John Rawls and Barack Obama—certainly is.    

Counterintuitively, the "classical liberalism" that began with Adam Smith and Thomas Jefferson celebrating individualism slowly became "a doctrine in which the primary agents were no longer individuals, but groups and the institutions of the democratic state.  Instead of individuals striving for the enrichment of social capital with new ideas and aspirations, there emerged people voicing demands called rights and acting within the scope of organized groups."  Special interest groups, working within a relentlessly expanding state, orchestrated legislative enactments and judicial decisions, "demanding legal acceptance of their position and acquired privileges.  In the final outcome the state in liberal democracy ceased to be an institution pursuing the common good, but became a hostage of groups that treated it solely as an instrument of change securing their interests" (#1205).  Ironically, today's liberals (most notably homosexuals and feminists) are hardly liberal, inasmuch as they strive to regulate virtually every aspect of life, including "language, gestures, and thoughts" (#1284).  They're just Leftists intent on imposing their agenda.  

The political system shaped by both communists' and liberal-democrats' historicist-utopianism becomes all-intrusive, ever intent on removing all vestiges of property or class distinctions.  Leftist ideologies of the '60s now dominate the liberal-democratic academic and media complex.  And the Christian churches, sidelined by pernicious church-state separation decrees, have largely accommodated themselves to the deeply anti-Christian ways of modernity.  Consequently, many churches have tailored their teachings to fit "the requirements of the liberal-democratic state and, consequently, to revise their doctrines substantially, sometimes beyond recognition" (#2885).  Having successfully marched through our cultural institutions, triumphant liberals have "managed to silence and marginalize nearly all alternatives and all nonliberal views of political order" (#1536).  

Reading Legutko's provocative and deeply informative analysis of these realms both clarifies and challenges our understanding of our world.   I share my good friend John Wright's strong endorsement of this work.  It is, as John O'Sullivan says in his Introduction, a "culturally rich, philosophically sophisticated, and brilliantly argued book" that deserves our attention if we're concerned about our civilization. 

* * * * * * * * * * * * * * * * * * * * * *

Fortunately for the general reader, first-rate philosophers often write accessible essays, addressing both current issues and perennial truths.  Thus Leszek Kolakowski, a Polish thinker rightly renowned for his magisterial, three-volume Main Currents of Marxism, published a score of short essays in Modernity on Endless Trial (Chicago:  The University of Chicago Press, c. 1990) that offer serious readers valuable insights into some main intellectual currents of the 20th century.  Whenever an erstwhile Marxist casts a favorable glance at Christianity it makes sense for believers to consider his reasons.   

One set of essays focuses "On Modernity, Barbarity, and Intellectuals."  Strangely enough, a corps of intellectuals has orchestrated the barbarism that has emerged during the last three centuries—an era labeled "modernity."  Since Kolakowski cannot see how "postmodern" differs from "modern," he discerns the loss of religion (and loss of taboos) as the primary current in modern (and postmodern) times, leading to "the sad spectacle of a godless world.  It appears as if we suddenly woke up to perceive things which the humble, and not necessarily highly educated, priests have been seeing—and warning us about—for three centuries and which they have repeatedly denounced in their Sunday Sermons.  They kept telling their flocks that a world that has forgotten God has forgotten the very distinction between good and evil and has made human life meaningless, sunk into nihilism" (pp. 7-8).  A series of influential, secularizing skeptics prepared the way for the destructiveness of "Nietzsche's noisy philosophical hammer" crafted to re-order the world (p. 8).  The "intellectuals" responsible for this process were not the scholars—scientists or historians—who "attempt to remain true to the material found or discovered" (p. 36) apart from themselves.  A barbarizing "intellectual" is someone who wishes not "simply to transmit truth, but to create it.  He is not a guardian of the word, but a word manufacturer" (p. 36).  Invariably, such intellectuals are seductive, spinning wondrous tales of utopian vistas.  

To Nihilists such as Nietzsche, truth is illusory.  Consequently, various cultures’ “truths” are equally “true” even if they are obviously contradictory!  Such cultural relativism—declaring all cultures are equal, praising the Aztecs as well as the Benedictines—easily embraces an admiration for various forms of what was once judged barbarism.  The sophisticated, scholarly “tolerance” so mandatory in elite universities and journals ends by granting “to others their right to be barbarians” (p. 22).  What we are witnessing is the Enlightenment devouring itself!  In Kolakowski’s judgment:  “In its final form the Enlightenment turns against itself:  humanism becomes a moral nihilism, doubt leads to epistemological nihilism, and the affirmation of the person undergoes a metamorphosis that transforms it into a totalitarian idea.  The removal of the barriers erected by Christianity to protect itself against the Enlightenment, which was the fruit of its own development, brought the collapse of the barriers that protected the Enlightenment against its own degeneration, either into a deification of man and nature or into despair” (p. 30).  

Another set of essays deals with "the Dilemmas of the Christian Legacy," for modernity's secularizing process has significantly, if indirectly, shaped much of the Christian world "through a universalization of the sacred," sanctifying worldly developments as "crystallizations of divine energy" (p. 68).  The "Christianity" rooted in process theology—as propounded by Teilhard de Chardin for example— envisions universal salvation and unending evolutionary progress.  "In the hope of saving itself, it seems to be assuming the colors of its environment, but the result is that it loses its identity, which depends on just that distinction between the sacred and the profane, and on the conflict that can and often must exist between them" (p. 69).  Kolakowski detects and dislikes what he finds in these circles—"the love of the amorphous, the desire for homogeneity, the illusion that there are no limits to the perfectibility of which human society is capable, immanentist eschatologies, and the instrumental attitude toward life" (p. 69).  Losing their sense of the sacred, this-worldly philosophies and religions fail to provide any basis for culture.  Indeed:  "With the disappearance of the sacred, which imposed limits to the perfection that could be attained by the profane, arises one of the most dangerous illusions of our civilization—the illusion that there are no limits to the changes that human life can undergo, that society is 'in principle' an endlessly flexible thing and that to deny this flexibility and this perfectibility is to deny man's total autonomy and thus to deny man himself" (p. 72).  A rejection of the sacred invites the denial of sin and evil.  

Though not overtly Christian, Kolakowski himself rejected the atheistic Marxism of his early years, found Christianity the best hope for the world, and became a cheerleader for, if not a devotee of, the Faith.  "There are reasons why we need Christianity," he argues, "but not just any kind of Christianity.  We do not need a Christianity that makes political revolution, that rushes to cooperate with so-called sexual liberation, that approves our concupiscence or praises our violence.  There are enough forces in the world to do all these things without the aid of Christianity.  We need a Christianity that will help us move beyond the immediate pressures of life, that gives us insight into the basic limits of the human condition and the capacity to accept them, a Christianity that teaches us the simple truth that there is not only a tomorrow but a day after tomorrow as well, and that the difference between success and failure is rarely distinguishable" (p. 85).   

Given his critique of modernity, Kolakowski has little patience with the modernist (or liberal) Christianity that focuses on "social justice," peace, and ephemeral earthly progress—the this-worldly political agenda so routinely proclaimed in some quarters.  "Christianity is about moral evil, malum culpae, and moral evil inheres only in individuals, because only the individual is responsible" (p. 93).  To even speak of "a 'morally evil' or 'morally good' social system makes no sense in the world of Christian belief" (p. 93).  The vacuous "demythologization" project of modernists such as Rudolf Bultmann elicits Kolakowski's erudite disdain, for it was merely a fitful gasp of the irrational skepticism launched centuries ago by William of Occam and the nominalists, then subtly advanced by David Hume and the 18th century empiricists.  In truth, "there is no way for Christianity to 'demythologize' itself and save anything of its meaning.  It is either-or:  demythologized Christianity is not Christianity" (p. 105).  

Demythologized Christianity contradicts itself.  In this respect it's simply another utopian political ideology.  Having early advocated the Marxist version of utopia, Kolakowski easily detects the many currents of such blissful imagining—popularly expressed in John Lennon's song "Imagine."  Consider the fantasies of folks who envision a world wherein fraternity is realized, where equality prevails in every realm.  They "keep promising us that they are going to educate the human race to fraternity, whereupon the unfortunate passions that tear societies asunder—greed, aggressiveness, lust for power—will vanish" (p. 139).  Inevitably they establish dictatorships designed to enforce the mirage of equality.  Allegedly admirable goals—caring for the impoverished and weak—require the abolition of private property and a state-controlled economy, the abolition of the free market.  However noble the intentions, "the abolition of the market means a gulag society" (p. 167).  

In the name of compassion, giving preferential treatment to various disadvantaged groups, societies easily "retreat into infantilism" (p. 173).  Citizens become dependent, childlike welfare recipients.  The State assumes more and more responsibility to care for everyone's needs, and we "expect from the State ever more solutions not only to social questions but also to private problems and difficulties; it increasingly appears to us that if we are not perfectly happy, it is the State's fault, as though it were the duty of the all-powerful State to make us happy" (p. 173).  The State, of course, cannot possibly do this.  Yet this blatantly utopian longing drove some of the most powerful mass movements of the 20th century, most of them Marxist to some degree.  Marx, of course, didn't envision the gulags that would result from the implementation of his socialistic ideas!  But Lenin and Trotsky were, in fact, faithful to his precepts, installing a "dictatorship of the proletariat" that could not but violently pursue its agenda.  Reducing ethics to "fables" and doing whatever was necessary to advance his cause, Lenin simply implemented his Marxist principles. 

289 Notable Conversions–Andrew Klavan & Sally Read


When I review books I hope some readers find bits of valuable information and perhaps pick up a copy if it interests them.  But some books I not only read and relish but wish everyone could enjoy the enlightenment and beauty they afford.  Such is Andrew Klavan's The Great Good Thing:  A Secular Jew Comes to Faith in Christ (Nashville:  Nelson Books, c. 2016), wherein a gifted writer speaks persuasively, reaffirming the perennial allure of the Incarnate Savior, our Lord Jesus Christ.  Klavan is well-known in the literary world, considered by Stephen King a "most original American novelist of crime and suspense."  But rather than keeping us in suspense Klavan, in The Great Good Thing, tells us about his conversion, culminating with his Christian baptism at the age of 49.  "No one could have been more surprised than I was," he says.  "I never thought I was the type.  I had been born and raised a Jew and lived most of my life as an agnostic.  I believed in the fullest freedom of thought into the widest reaches of fact and philosophy.  I believed in science and analysis and reasonable explanations.  I had no time for magical thinking of any kind.  I couldn't bear solemn piety.  I despised even the ordinary varieties of willful blindness to the tragic shambles of life on earth."  In short, for half-a-century he'd been a hard-boiled realist—"a worlding by nature" (p. xiii).  

Flourishing as a writer, Klavan "was one of the men of the coasts and cities, at home among the snarks and cynics of these postmodern times" (p. xvi).  Yet here he was, confessing "that Jesus Christ was Lord" and accepting "the uniquely salvific truth of his life and preaching, death and resurrection—this, it seemed to me even in the moment, was to renounce my natural place in the age, to turn against my upbringing and my kind.  It felt, so help me, as if I were flinging myself off the deck of a holiday cruise ship, falling away from its lighted ballrooms and casinos, from the parties and the music and sparkling wine of Fashionable Ideas, to go plunging down and down and did I mention down into a wave-tossed theological solitude" (p. xv).  In a sense it made no sense!  But in a deeper sense, it was a coming together of the central themes of his novels wherein his "heroes were always desperately on the run, desperately trying to get at a truth that baffled their assumptions and philosophies" (p. xvi).  They wanted to make sense of the world but couldn't find the key.  

Slowly, through much reading and writing and personal experience, he discovered the key—the answer to Pilate's question, "What is truth?"— could be found only in the message proclaimed by The Gospel According to St. John!  Jesus Is the Truth!  Klavan's spiritual journey, rather like C.S. Lewis's, took place over a number of years wherein he moved from agnosticism to belief in God.  He'd begun praying and found his life improved by the discipline.  He'd "become like a character in one of my own stories, desperately trying to unknit the fabric of fact and perception, to separate the warp of psychology from the weft of objective truth, before time ran out" (p. xix).  He fully understood the risks entailed—a successful Jewish writer, safely ensconced in an up-scale Santa Barbara suburb, daring to declare himself a Christian.   What would that mean?  "'Oh, God,' I prayed fervently more than once, 'whatever happens, don't let me become a Christian novelist!'" (p. xx).  "Would I descend into that smiley-faced religious idiocy that mistakes the good health and prosperity of the moment for the supernatural favor of God?" (p. xx).  And in becoming a Christian he determined not to forsake his Jewish ancestry and culture.  Could it happen?  

Well, it did.  He found Christ—or, in that paradoxical mystery of redemption, Christ found him!  Consequently, he found himself “rejoicing.  I was convinced and fully convinced:  my mind was God’s, my soul was Christ’s, my faith was true.  How had that happened and why?  Given the spiritual distance I’d traveled, given the depths of my doubts, given the darkness of my most uncertain places, and given, most of all, the elation and wonder I felt at the journey’s end, it seems to be a story worth telling” (p. xxv).  

It’s a story worth telling—and for us it’s a story worth reading!  Klavan recounts his early years in Great Neck, New York, “a wealthy town, a well-tailored suburban refuge from the swarming city,” where he was immersed in an upper-middle-class, secularized Jewish community, the son of a successful New York morning drive radio personality.   But as a child he was inwardly unhappy and spent much time daydreaming, constructing elaborate fantasies featuring himself as the invariably tough-guy hero.  Much of his school-time was devoted to fantasizing rather than studying.  He seemed to be a good student, “but it was all fraud.  I could read well and write well and talk glibly and even figure out math problems in my head.  So I could bluff my way through subjects I knew nothing about, and neither my teachers nor my parents, nor even my friends, were aware that I was hardly doing any schoolwork at all” (p. 28).   In fact he learned nothing—“no historical facts, no mathematical formulas, no passages from the books we were supposed to have read” (p. 28).  

Nor did he learn much about Judaism.  His family's Jewishness was purely cultural, extending to only a few traditions.  "God was not a living presence in my home" (p. 45), and his required attendance ("suffocating torture") at Hebrew school in the local synagogue left no impression on him.  "My father used to say, 'You can't flunk out of being Jewish.'  But man, I tried" (p. 48).  Forced to submit to his Bar Mitzvah, he ad-libbed his way through it and was startled by the "fortune in gifts" he received.   But inwardly he felt only "rage and shame" at participating in a ritual mouthing words he disbelieved.  He knew he was a hypocrite and hated himself for it.  "With great pomp and sacred ceremony, they had made me declare what I did not believe was true—and then they had paid me for the lie with these trinkets!  I felt that I had sold my soul" (p. 55).  

Though his family's Judaism hardly affected young Klavan, a brief exposure to a thoroughly Christian family did!  A woman, Mina, who worked for his family and became virtually a family member, gave him "a substantial portion of what mothering I had" (p. 61).  With neither husband nor children of her own, "she just took care of people, that's all" (p. 62).  Though he didn't really understand it, Mina "was a true Christian.  Religious, I mean, even devout" (p. 64).  She never mentioned "Jesus to me, but he was alive and real to her" (p. 64).  Allowed to go to her gaily decorated, music-filled house one Christmas, Klavan felt himself in a "wonderland" surrounded by cheerful, caring people.  Many years later, preparing for his baptism, he marveled that "Jesus had first entered my consciousness" at that first Christmas at Mina's house (p. 68).  Thenceforth, even in his most agnostic, secular stages, he retained a deep fondness for Christmas, even celebrating the season with a sincerity lacking in some Christian circles!  But:  "It was Christmas we loved, the bright tradition, not Christ, never Christ" (p. 74).  

Nor did he love schooling of any kind!  Early on he’d determined to become a writer, and he thought only personal experience could teach him what he wanted to know.  So he entered a program designed to enable youngsters to finish high school early and launched out on his own, at the age of 17, to enter “the world of Experience behind the walls” (p. 119).  He worked at various jobs, traveled hither and yon, and certainly experienced many things.  In the midst of his wandering, for reasons he cannot recall, he applied for admission to the University of California at Berkeley and was accepted.  But he was late on the scene.  “The radical years were over.  The riots and mayhem I’d been hoping to see had passed like a storm” (p. 123).  Though nominally a student, he mainly drank and slept and tried to teach himself how to write.  

When he did go to class, he "went through all the usual razzle-dazzle shenanigans:  bluff and fakery.  I read none of the books.  I conned and wrote my way to passing grades" (p. 124).  But for some reason he always bought the books required for the courses and kept them.  Back then, when postmodernism was just beginning its onslaught, some university professors still assigned the "classics" and encouraged students to engage in the "Great Conversation, an interchange carried on across the centuries by the major thinkers and artists of the Western canon.  The idea was that by studying this conversation you could move closer to the Truth and so find a fuller wisdom about reality and what made for the Good life" (p. 134).  So Klavan's growing library contained the works of the masters and one day, lying listlessly in bed, he picked up a William Faulkner novel.  Suddenly he discovered what literature was all about!  And he began to read, on his own, the classic literary works of Western Civilization.  "Without knowing it, I had joined the Great Conversation" (p. 138).  

Not only did he discover the classics at Berkeley—he found a wife, Ellen, the daughter of the chairman of the Berkeley English department.  Ellen's parents embraced him, and he managed to graduate from the university as well as marry her and make a lasting marriage of invaluable worth.  Though he'd been "a fool in many ways," by marrying her "somehow—and not for the last time in my life—I had managed to stumble into the great good thing" (p. 158).  The young couple then moved to New York and managed to survive, working at various jobs while he tried to become the writer of his dreams.  Amidst his manifest lack of success as a writer he began struggling with mental and emotional issues that led him to enter a five-year stint of psychoanalysis, wherein a gifted therapist greatly helped him to get mentally healthy.  In the midst of his depression (both mental and financial), however, he decided to write a suspense novel—using a pseudonym, since such was not the genre of "serious" writers.  He and his brother quickly wrote—and sold—the book, which then won the Edgar Award for best paperback mystery.  Better yet, they also got a movie deal.  He not only made money but discovered "that telling such stories was my gift" (p. 170).  In time he would become a highly acclaimed and financially successful suspense novelist.  

Among the many events that opened his mind to God was the birth of his first child.  "Sex, birth, marriage, these bodies, this life, they were all just representations of the power that had created them, the power now surging through my wife in this flood of matter, the power that had made us one:  the power of love.  Love, I saw now, was an exterior spiritual force that swept through our bodies in the symbolic forms of eros, then bound us materially, skin and bone, in the symbolic moment of birth."  Watching the baby emerge from the womb, Klavan experienced a truly mystical moment.  "I became not one flesh with my wife but one being beyond flesh with the love I felt for her.  My spirit washed into that love and became part of it, a splash in a rushing river.  In that river of love, I went raging down the plane of Ellen's body until the love I was and the love that carried me melded with the love I felt for the new baby we had made together and I became part of the love as well," and he saw he "was about to flow out into the infinite.  I saw that, beyond the painted scenery of mere existence, it was all love, love unbounded, mushrooming, vast, alive, and everlasting.  The love I felt, the love I was, was about to cascade into the very origin of itself, the origin of our three lives and of all creation" (p. 191).  

In time he realized:  “You cannot know the truth about the world until you know God loves you, because that is the truth about the world” (p. 236).  Tasting the reality of love, he sought Love!  He began slipping into churches and even attending services—and then met an engaging Episcopal rector.  He appreciated the music as well as the messages and began to “realize there was a spiritual side to life, a side I had been neglecting in my postmodern mind-set” (p. 195).  Intuitively, he knew morality itself requires a transcendent foundation.  “An ultimate Moral Good cannot just be an idea.  It must be, in effect, a personality with consciousness and free will” (p. 205).  “In the chain of reasoning that took me finally to Christ, accepting this one axiom—that some actions are morally better than others—is the only truly nonlogical leap of faith I ever made.  Hardly a leap really.  Barely even a step.  I know it’s so.  And those who declare they do not are, like Hamlet, only pretending” (p. 206).  

Coming to faith in Christ proved momentous:  “My personality was so transformed I hardly recognized myself” (p. 211).  Filled with joy, Klavan flourished as a writer and father.   His written works reveal his mind’s journey, refuting the postmodernism firmly entrenched in the nation’s intelligentsia and working through the anti-Semitism obvious in the Western Christian culture he’d come to love.   He saw the truth revealed in the words of one of his own characters in True Crime:  “‘You want to believe in God,’ the pastor says, ‘you’re gonna have to believe in a God of the sad world’” (p. 225).  Sin has shattered our world, and it’s full of evil—including the Holocaust.  But the Savior has saved us from sin!  “In this new mental freedom, I came to see that the dilemma I had been wrestling with—my love of a culture that had done so much evil and yet produced such lasting beauty—was only my personal portion of the greater human paradox.  We are never free of the things that happen.  Even evil weaves itself into the fabric of history, never to be undone.  Yet at the same time—at the very same time—each of us gets a new soul with which to start the world again.”  Jesus “offered a spiritual path out of the history created by Original Sin and into the newborn self remade in his image.  It is the impossible solution to the impossible problem of evil.  All reason says it can’t be so.  But it’s the truth that sets us free” (p. 229).  

This book is so good that (as is evident in the long quotations) it’s tempting just to duplicate the entire text!  So let me just share Eric Metaxas’ encomium:  “Andrew Klavan’s superb new book deserves to become a classic of its kind.  Klavan’s immense talents as a writer are on full view in what must certainly rank as his most important book to date.  Tolle lege [take up and read—the words Augustine heard prior to his conversion].”  

* * * * * * * * * * * * * * * * * * * * * * * * * * * * 

Inasmuch as all heaven rejoices when a single sinner repents, all converts are equally treasured.  But inasmuch as every person is unique, each conversion story adds depth and texture to the ways of God with man. Thus one of the most recent conversion accounts, Night’s Bright Darkness:  A Modern Conversion Story (San Francisco:  Ignatius Press, c. 2016), by Sally Read, a contemporary English poet, merits our attention.  

Reared in a militantly atheist home—her father a vociferous Marxist journalist—Sally Read represents much about today’s secularism.  “At ten I could tell you that religion was the opiate of the masses; it was dinned into me never to kneel before anyone or anything.  My father taught me that Christians, in particular, were tambourine-bashing intellectual weaklings.  As a young woman I could quote Christopher Hitchens and enough of the Bible to scoff at.  My father would happily scoff with me” (#83).  There is neither God nor soul.  Matter alone exists, she thought.  Yet as she began working as a psychiatric nurse she met patients whose sufferings and dyings gave her pause.  And amidst the intemperate drinking and casual sex that punctuated her work-week routine, she occasionally felt strangely drawn to old churches in the neighborhood.  

Then her father’s death at the age of fifty-six distressed her.  “I felt as if a god had died.  The creator of my world and my protector had gone” (#231).  Feeling abandoned and inwardly empty, she felt as if she were in hell and wondered what, indeed, life is all about.  She lost weight and hair.  She “even considered, in a desperate and vague way, invoking God.”  Perhaps some kind of faith would make life “liveable.  But it seemed entirely unfeasible to believe in any God; I thought I could never lower myself to that degree of self-delusion” (#246).  Looking back, she now considers that desolate phase of her life a blessing, for God was mysteriously working therein to bring her to Himself.  “His absence was so painfully loud it seems, now, to prove his existence,” for He “reaches us wherever we are, even if we are so far from knowing him that we mistake him completely.  His infinity always contains our finitude” (#253).  

In the six years following her father’s death, Sally Read became a published poet, married an Italian and gave birth to a baby girl, Florenzia.  She had to “battle” for both a wedding and a child in a world which welcomed neither, but in her heart she just knew such things are right.  Then ultimate, metaphysical questions began to haunt her, and while pregnant she wondered, “What was it I was creating?” (#313).  Having successfully published two volumes of poetry, she envisioned writing a more journalistic work designed to help women nourish their emotional and reproductive health.  Personally, she “had suffered debilitating physical and psychological effects while taking oral contraceptives” and wanted to explore that issue as well as “abortion and its effects on women” (#364).  So she interviewed various women while researching the book and in Rome encountered some devout American Catholics whose husbands were studying at pontifical universities.  Though they lived by standards utterly unlike hers, she was strangely warmed by their zest for life and constant awareness of God in their daily activities.  

Consequently, she began visiting churches and got acquainted with a godly guide, Father Gregory, a Ukrainian studying for the priesthood, whose gentle counsel and literary references helped nudge her to belief in God.  He encouraged her to pray, though she had no idea how.  But she picked up T.S. Eliot’s Four Quartets and sensed almost immediately an inner peace, an acceptance of a Reality around her that unleashed a torrent of tears.  When she told Father Gregory of that moment, he sent her a poem by St. John of the Cross which further inspired her.  Entering a nearby church she felt:  “The strange calm that had come upon me that night the week before had settled into a new longing to know what to do.”  Looking up, she saw an icon of Christ’s face in a window and said:  “‘If you’re there, you have to help me’” (#650).  And He did!  “I felt almost physically lifted up.  My eyes stopped crying instantly, my face relaxed.  It was like being in the grip of panicked amnesia, when suddenly someone familiar walked into the room and gave me myself back—a self restored to me more fully than before.  It was a presence as entirely fixed on me as I was on it, and it both descended toward me and pulled me up.  I knew it was him.  This was the hinge of my life; this compassion and love and humility so great it buckled me as it came to meet me.  Later, I would read in Simone Weil’s writings what seemed a very similar experience—how, as she prayerfully recited George Herbert’s poem ‘Love,’ ‘Christ himself came down and took possession of me’” (#658).  

Thenceforth she freely prayed, reciting the Our Father, knowing she was safe in His arms and feeling “as if the Birth, the Crucifixion, the Resurrection were plunged into my being in one gorgeous blow—this is how it is to all of a sudden know the meaning of reality:  the heart kick-started to sense its intrinsic architecture of logic, love, and reason” (#658).  She intimately sensed the Presence of the Living Lord.  “There was a feeling of being known in every cell.  My aloneness was taken away from me; and though it has often since returned, I know that loneliness is the illusion and Christ beside me the reality.  This was my earliest prayer:  being attuned to Christ’s presence, which by grace I perceived in those early days as strongly as my daughter’s breathing or the sound of the blackbird singing at night in the garden.  Prayer became essential,” and she sensed “being touched—if so pale a word can describe the sensation of being broken and healed—touched that he had come to me when I had rejected him and spoken against him and published lies about him in my books” (#680). 

At that point, though Catholics had certainly guided her, she had no interest in Catholicism, with all its dogmas and rules.  But she began reading the Gospels and found the Jesus revealed therein quite unlike the “gentle Jesus, meek and mild” proclaimed in liberal churches.  She came to see that the Truth was something given to us, not something we fashion, something best established in the Church of the Apostles.  And the Truth she encountered led her, step by step, into the Catholic Church.  Night’s Bright Darkness reveals a poet determined to discern and beautifully describe Reality entering into its fullness.  

# # #

288 Debunking Utopia

 In Dostoyevsky’s The Brothers Karamazov, Father Zossima tried to counsel a distraught woman by encouraging her to embrace an “active love” by helping others.  She unfortunately failed to follow his advice, settling into an abstract “love for humanity.”  Father Zossima called hers a “love in dreams” and noted that “love in action is a harsh and dreadful thing compared with love in dreams,” for when daydreaming, or imagining how we can help “humanity,” we slip into a non-existent future that helps no one.  Consequently we label “utopian” those societies designed for beings quite unlike our species.  Nima Sanandaji, in Debunking Utopia:  Exposing the Myth of Nordic Socialism (Washington, D.C.:  WND Books, c. 2016), reminds us, with ample facts, that socialism forever fails simply because it cannot succeed in the real (as opposed to the imaginary) world.  He makes three important points:  1) post-WWII Scandinavia’s economic success results from the region’s cultural roots rather than socio-political structures; 2) trying to duplicate Nordic economic structures elsewhere cannot but fail; and 3) “democratic socialism” is now collapsing even in Scandinavia due to its intrinsic flaws—indeed, “the days when the Nordic countries could actually be socialist are long gone” (#2333).  

Sanandaji’s family emigrated from Iran to Sweden in 1989, and he personally enjoyed all the benefits of that nation’s generous (socialistic) welfare state, ultimately finishing a Ph.D. in economics, publishing 20 books and scores of scholarly papers.  After years of carefully examining the “democratic socialism” of Nordic countries (Sweden, Norway, Finland, Iceland, and Denmark), he understands why it’s seriously flawed and now imploding.  Indeed, though American socialists (e.g. Bernie Sanders) and progressives (e.g. President Obama) naively laud them, today “only one of the five Nordic countries has a social democratic government” (#107).  In various ways Scandinavians seem to be moving away from a failed model.  

Without question the Nordics enjoy many good things—longevity, education, health care, women’s rights, generous vacations, etc.  But the good life evident there, Sanandaji insists, results mainly from Nordic culture rather than socialist structures.  They all “have homogenous populations with nongovernmental social institutions that are uniquely adapted to the modern world.  High levels of trust, a strong work ethic, civic participation, social cohesion, individual responsibility, and family values are long-standing features of Nordic society that predate the welfare state.  These deeper social institutions explain much of the success of the Nordics” (#337).  Imagining the United States—or African or South American countries—could duplicate the Nordic model without the Nordic culture is simply wishful (and extraordinarily frivolous) thinking.  Still more:  it’s important to acknowledge that many of the world’s finest places, enjoying the highest level of well-being, are places like Switzerland and Australia which differ markedly from the Nordics.  

For instance, using one of the main criteria for national well-being—longevity—we find Japan, Switzerland, Singapore, and Australia at least as good as the Nordics.  “Instead of politics, the common feature seems to be that these are countries where people eat healthily and exercise” (#423).   Rather than thinking welfare states make everyone healthy through universal health care, we should understand the life-style ingredients that truly matter.  Then consider the celebrated economic “equality” praised by the likes of Bernie Sanders.   Sanandaji’s brother Tino (also an economist) notes:  “‘American scholars who write about the success of the Scandinavian welfare states in the postwar period tend to be remarkably uninterested in Scandinavia’s history prior to that period.  Scandinavia was likely the most egalitarian part of Europe even before the modern era’” (#521).  

In part this grew out of the region’s agrarian roots.  For centuries hard-working farmers had survived in an unusually difficult environment.  Necessarily they forged a culture “with great emphasis on individual responsibility and hard work” (#630).  They also secured property rights and embraced a market system that enabled them to thrive as independent yeomen committed to the “norms of cooperation, punctuality, honesty, and hard work that largely underpin Nordic success” (#659).  These norms were then brought to the United States by Scandinavian immigrants in the 19th century, and we find transplanted Swedish-American and Norwegian-American communities distinguished by conscientious, law-abiding, hard-working people.  Consequently they thrived and easily entered the mainstream of their new nation.  Today the eleven million Americans who identify themselves as Nordic are doing even better than their kinsmen still living in Scandinavia and “have less than half the average American poverty rate” (#830).   Culture, not economics, explains the difference!   

Rather than helping improve Scandinavia, Sanandaji says, socialism has actually harmed the region!  As he notes:  “The Economist explains:  ‘In the period from 1870 to 1970 the Nordic countries were among the world’s fastest-growing countries, thanks to a series of pro-business reforms such as the establishment of banks and the privatization of forests.  But in the 1970s and 1980s the undisciplined growth of government caused the reforms to run into the sands.’  Around 1968 the Left radicalized around the world . . . .  The social democrats in Sweden and other Nordic countries grew bold, and decided to go after the goose that lay the golden eggs:  entrepreneurship” (#1104).  Implementing “democratic socialism” they targeted and taxed the “rich”—the businessmen, the wealth-creators, the very folks responsible for their nations’ prosperity.  Though Scandinavian countries enjoyed remarkable prosperity immediately following WWII, by becoming welfare states they struggled for the next half-a-century to preserve it.  “Third Way socialist policies are often upheld as the normal state of Swedish policies.  In reality, one can better understand them as a failed social experiment, which resulted in stagnating growth and which with time have been abandoned” (#1127).  

Rather than celebrating the glories of socialism, the Nordics have learned a sad lesson and recently turned toward a more free-market economy.  They grew “rich during periods with free-market policies and low taxes, and they have stagnated under socialist policies.  Job growth follows the same logic” (#1250).  Small government and low taxes spell prosperity; intrusive government and high taxes make for slow (or no) growth.  Recognizing this—and retreating from state-run monopolies—educational and health care facilities have been “opened up in Sweden as voucher systems, allowing for-profit schools, hospitals, and elderly care centers to operate with tax funding” (#1458).  Such moves “drove up wages, evident by the fact that these individuals gained 5 percentage points’ higher wages than similar employees whose workplaces had not been privatized” (#1485). 

Sanandaji has written this book to warn Americans who look favorably on “democratic socialism” in a nation “only very marginally more economically free than Denmark” (#2348).  Noting that Franklin D. Roosevelt was the “architect of the American welfare state,” he then reminds us that FDR also warned:  “‘The lessons of history, confirmed by the evidence immediately before me, show conclusively that continued dependence upon relief induces a spiritual and moral disintegration fundamentally destructive to the national fibre.  To dole out relief in this way is to administer a narcotic, a subtle destroyer of the human spirit.  It is inimical to the dictates of sound policy.  It is in violation of the traditions of America’” (#1512).  Recently a German scholar, Friedrich Heinemann, has validated FDR’s concern regarding “moral disintegration” in welfare states.  Data from the World Values Survey indicate “a self-destructive mechanism exists” in them which dissolves norms.   This is sadly evident, Sanandaji says, in Nordic lands, whose celebrated “work ethic” has dissipated.   Many young Scandinavians work less diligently than their parents, fail to form solid families, and falsely claim to be “sick” to avoid work when it suits them.  

Debunking Utopia should give us Americans additional pause as we endeavor to shape our nation’s immigration policies as well as its economy, for Sanandaji dolefully describes the mounting problems Sweden faces as a result of opening the border to refugees from Muslim nations.  “Sweden is in a ditch because many politicians, intellectuals and journalists—on both the left and the right—have claimed that refugee immigration is a boon to the country’s economy and that large-scale immigration is the only way of sustaining the welfare state. . . .  But of course, serious research has never shown that refugee immigrants boost the Swedish economy.  The truth is quite the opposite” (#2216).    In truth, poverty and crime and educational problems have accelerated as waves of immigrants have washed over the country.  

In fact, Americans should look seriously at their own traditions and seek to revive them rather than fantasizing about any form of “democratic socialism.”  As an old country song declares:  “there ain’t no living in a perfect world,” and “the true lesson from the Nordics is this:  culture, at least as much as politics, matters.  If the goal is to create a better society, we should strive to create a society that fosters a better culture.  This can be done by setting up a system wherein people are urged to take responsibility for themselves and their families, trust their neighbors and work together.  The Nordic countries did evolve such a culture—during a period when the state was small, when self-reliance was favored.  For a time these societies prospered while combining strong norms with a limited welfare state, which was focused on providing services such as education rather than generous handouts.  Then came the temptation to increase the size of the welfare state.  Slowly a culture of welfare dependency grew, eroding the good norms” (#2395).  Only by resurrecting those good norms—and abandoning the failed welfare state—can Scandinavians or Americans truly prosper.  

* * * * * * * * * * * * * * * * * * * * * * * * * * 

“Two decades ago, New Zealand went through a dramatic transformation, from a basket case welfare state saddled with crushing public debt, rampant inflation, and a closed and moribund economy, to what is today widely regarded as one of the freest and most prosperous countries in the world.  This is the story of how that happened.”  So Bill Frezza sums up his New Zealand’s Far-Reaching Reforms:  A Case Study on How to Save Democracy from Itself (Guatemala, Guatemala:  The Antigua Forum, c. 2015).  Following WWII—and before such reforms—as more and more money was printed to fuel more and more welfare programs, New Zealand graphically illustrated H. L. Mencken’s quip:  “Every election is a sort of advance auction sale of stolen goods.”  Indeed, Robert Muldoon won election as prime minister in 1975 by running on a platform described by critics as a “denial of economic reality accompanied by bribery of the voters.”  

But all this changed through reforms orchestrated by two politicians (Roger Douglas and Ruth Richardson) representing opposing parties, who made “common cause in a fight against their own party leaders” to save their nation from corrosive inflation and ultimate bankruptcy in the late 1980s and early 1990s.  They both demanded “honest and open accounting” wherewith to tell the truth regarding the nation’s condition.  “If private businesses kept their books the way most governments keep their books, our jails would be full of CEOs” (#361).  But the reformers determined:  “The country was going to be run more like a successful business than a public piggy bank” (#554).  Richardson especially focused on reforming the educational system, turning it over to charter schools answerable to the parents.  Douglas worked to reduce corporate and personal income taxes, eliminate inheritance taxes, and establish a consumption tax.  Since they eliminated government-sponsored enterprises (the equivalents of Fannie Mae and Freddie Mac in the U.S.), New Zealand didn’t suffer the housing bubble that burst and devastated America in 2008.  

Consequently, “New Zealand enacted its most lasting reforms when advocates for efficient government, free markets, free trade, sound money, and prudent fiscal policy came together” and legislative acts were passed that “forever changed the way New Zealand’s government did business” (#248).  Frezza explains the important acts and shows how they changed the country.  Government agencies were privatized and compulsory union membership eliminated.  “All civil service employees were moved from job-for-life union contracts and a seniority based advancement regime to individual employment contracts and a merit based regime” (#331).  Opposition to such changes was inevitably intense.  “As a poster child for the bitter medicine being administered, Ruth became the most hated politician in New Zealand.  Effigies were burned in the streets, protesters poured a pot of blue paint on her (she saved the ruined dress for a charity auction), and police had to protect her on her jog to work every morning” (#549). 

But the reforms worked and made a lasting difference.  The 2014 Index of Economic Freedom ranks “New Zealand fifth in the world, behind Hong Kong, Singapore, Australia, and Switzerland with ratings of 90 percent or higher for rule of law, freedom from corruption, business freedom, and labor freedom” (#623).  The nation’s GDP increased fourfold while the national debt shrank from $25 billion in 1993 to $15 billion in 2007.  Trade, especially with China, has flourished.  In retrospect, Frezza says (wondering if the New Zealand story can be duplicated):  “The list of success factors required for a democracy to flourish economically is not long:  honesty, integrity, transparency, accountability, efficiency, thrift, prudence, flexibility, freedom, leadership, and courage.  Does anyone care to stand up and deny that these are virtues not just of good government but of a good life?  Although universally acclaimed by economists, philosophers, and theologians, why are these virtues so hard to find in governments and politicians?” (#668).  Unfortunately, politicians such as Roger Douglas and Ruth Richardson, willing to risk losing elections and incurring criticism, rarely appear.  But without them majoritarian democracies will, it seems, sadly enough, generally follow a destructive path.

* * * * * * * * * * * * * * * * * * * * * * * *

In The Problem with Socialism (Washington, D.C.:  Regnery Publishing, c. 2016), Thomas DiLorenzo notes that a recent poll showed “43 percent of Americans between the ages of eighteen and twenty-nine had a ‘favorable’ opinion of socialism” and preferred it to capitalism (#85).  Another poll indicated 69 percent of voters under 30 would support a socialist for President—as Bernie Sanders’ near victory in the Democrat Party primaries certainly illustrated.  Misled by a multitude of educators, these young folks fail to realize G.K. Chesterton’s insight:  “the weakness of all Utopias is this, that they take the greatest difficulty of man [i.e. original sin] and assume it to be overcome, and then give an elaborate account [i.e. redistribution of wealth] of the overcoming of the smaller ones.  They first assume that no man will want more than his share, and then are very ingenious in explaining whether his share will be delivered by motorcar or balloon” (Heretics).   Or, as Lady Margaret Thatcher famously quipped:  the ultimate problem with socialists is they finally run out of other people’s money to spend.

Though socialism in the 19th century meant the “government ownership of the means of production,” in the 20th century it morphed into redistributive measures designed to eliminate all sorts of inequalities through progressive taxes and regulatory edicts.  Inevitably socialists want government to control as many industries as possible (e.g. health care), confiscate as much land as possible (e.g. national forests), and destroy capitalism “with onerous taxes, regulations, the welfare state, inflation, or whatever they thought could get the job done” (#127).  Also inevitably, nations embracing socialism impoverish themselves.  Africa bears witness to the fact that 40 years of socialistic experiments left its nations “poorer than they had been as colonies” (#183).  Indeed, one of DiLorenzo’s chapters is entitled:  “socialism is always and everywhere an economic disaster.”  A glance at American history shows how socialistic endeavors in colonial Jamestown utterly failed.  But, Matthew Andrews says:  “‘As soon as the settlers were thrown upon their own resources, and each freeman had acquired the right of owning property, the colonists quickly developed what became the distinguishing characteristic of Americans—an aptitude for all kinds of craftsmanship coupled with an innate genius for experimentation and invention’” (#255).  Socialism, whether of the dictatorial or majority-rule democratic variety, is all about planning.  It’s preeminently “the forceful substitution of governmental plans for individual plans” (#588).  Planned economies always look wonderful to the planners.  But the plans inevitably founder when implemented because they run counter to human nature.

Socialists further violate human nature by seeking to dictate economic equality (e.g. “free” education, health care, housing, food, etc.) which “is not just a revolt against reality; it is nothing less than a recipe for the destruction of normal human society,” as became brutally evident in Russia and China (#374).  By eliminating capitalism’s division of labor and freeing each person to cultivate his own talent as well as his own garden, socialism (Leon Trotsky believed) would enable the perfection of our species so that the “human average will rise to the level of an Aristotle, a Goethe, a Marx.”   That such never happens—indeed could never happen—effectively refutes such utopianism.  “How remarkable it is that to this day, self-proclaimed socialists in academe claim to occupy the moral high ground.  The ideology that is associated with the worst crimes, the greatest mass slaughters, the most totalitarian regimes ever, is allegedly more compassionate than the free market capitalism that has lifted more people from poverty, created more wealth, provided more opportunities for human development, and supported human freedom more than any other economic system in the history of the world” (#697).  

The intrinsic deficiencies of socialism are also on display in those “islands of socialism in a sea of capitalism—government-run enterprises like the U.S. Postal Service, state and local government public works departments, police, firefighters, garbage collection, schools, electric, natural gas, and water utilities, transportation services, financial institutions like Fannie Mae, and dozens more” (#500).  Though there may very well be practical reasons for their existence, they are “vastly more inefficient, and offer products or services of far worse quality than private businesses” (#508).   Economists generally hold “that the per-unit cost of a government service will be on average twice as high as a comparable service offered in the competitive private sector” (#508).  That privately owned and operated firms like UPS and FedEx prosper, while the USPS needs abiding subsidies, surprises no economist.  Nor does it surprise anyone that USPS employees “earn 35 percent higher wages and 69 percent greater benefits than private industry employees” (#558).  

This problem ultimately led to recent changes in Scandinavia, where free-market reforms are currently reversing decades of “democratic socialism.”  The Swedish Economic Association recently reported “that the Swedish economy had failed to create any new jobs on net from 1950 to 2005.”  Consequently, Sweden is actually “poorer than Mississippi, the lowest-income state in the United States” (#880).  Within a half-century, the nation slipped “from fourth to twentieth place in international income comparisons.”  It has simply proved “impossible to maintain a thriving economy with a regime of high taxes, a wasteful welfare state that pays people not to work, and massive government spending and borrowing” (#855).  Of Denmark’s 5.5 million people, 1.5 million “live full-time on taxpayer-funded welfare handouts” (#890).  One Swedish economist, Per Bylund, says that by giving out “benefits” and thereby “‘taking away the individual’s responsibility for his or her own life, a new kind of individual is created—the immature, irresponsible, and dependent.’”  Thus the celebrated, carefully planned Swedish “welfare state” has unintentionally created multitudes of “psychological and moral children” (#872).  

Sadly enough, DiLorenzo concludes, socialism ultimately harms the very folks it’s designed to help—the poor.  It’s a “false philanthropy.”  And it should be resisted wherever possible. 

287 The Kingdom of Speech; Undeniable; Evolution 2.0

 For five decades Tom Wolfe has remained a fixture atop the nation’s literary world—helping establish the “new journalism,” publishing essays and novels, credibly claiming to discern the pulse and diagnose the condition of America.  His most recent work, The Kingdom of Speech (New York:  Little, Brown and Company, c. 2016), finds him entering (with his customary wit) the somewhat arcane worlds of biological evolution and linguistics, finding therein much to question and pillory while educating us in the process.  He was prompted to research the subject when he read of a recent scholarly conference where “eight heavyweight Evolutionists—linguists, biologists, anthropologists, and computer scientists” had given up trying to answer “the question of where speech—language—comes from and how it works” (#18).   It’s “as mysterious as ever,” they declared!  Amazingly, one of the eight luminaries was Noam Chomsky, for 50 years the brightest star in the linguistics firmament!  Now for academics such as Chomsky this is no small admission, for:  “Speech is not one of man’s several unique attributes—speech is the attribute of all attributes” (#36).  When the regnant Neo-Darwinian theory of evolution fails to explain language, it fails to explain virtually all that matters!   

To put everything in historical context, Wolfe guides us through some fascinating developments in evolutionary theory, including deft portraits of Alfred Wallace and Charles Darwin (who maneuvered to sideline Wallace and claim for himself the title of singular architect of the theory of biological evolution of species through natural selection).  While styling himself an empirical scientist, Darwin subtly propounded a cosmogony that closely resembles the creation stories of many American Indians.  In fact, Darwin’s story, with its “four or five cells floating in a warm pool somewhere” developing into a world teeming with remarkable creatures was, rightly understood, a “dead ringer” for that of the Navajos!  “All cosmologies, whether the Apaches’ or Charles Darwin’s, faced the same problem.  They were histories, or, better said, stories of things that had occurred in a primordial past, long before there existed anyone capable of recording them.  The Apaches’ scorpion and Darwin’s cells in that warm pool somewhere were by definition educated guesses” (#281).  They were all “sincere, but sheer, literature” (#293).  

  While telling his story, however, Darwin recognized that speech “set humans far apart from any animal ancestors.”  Other traits he might passably explain, but he utterly failed to show how “human speech evolved from animals” (#205).  “Proving that speech evolved from sounds uttered by lower animals became Darwin’s obsession.  After all, his was a Theory of Everything” (#215).  Critiquing this theory was England’s most prestigious linguist, Max Muller, who insisted there is a radical difference in kind between man and beast—and that difference is language.  “Language was the crux of it all.  If language sealed off man from animal, then the Theory of Evolution applied only to animal studies and reached no higher than the hairy apes.  Muller was eminent and arrogant—and he made fun of Darwin” (#860).  And then, just when Darwin mustered up the nerve to publish The Descent of Man, and Selection in Relation to Sex, declaring apes and monkeys evolved into human beings, the pesky Alfred Wallace (who had been busily writing trenchant biological treatises) wrote an article, “The Limits of Natural Selection as Applied to Man,” pointing out certain uniquely human traits, including language, impossible to explain through natural selection.  “No, said Wallace, ‘the agency of some other power’ was required.  He calls it ‘a superior intelligence,’ ‘a controlling intelligence.’  Only such a power, ‘a new power of definite character,’ can account for ‘ever-advancing’ man” (#694).  But this Darwin could not allow!  All must be the result of purely material, natural processes!  “He had no evidence,” Wolfe says, but he told a good “just so” story that captured much of the public mind.  
Yet his followers, for 70 years, gave up trying to explain the origin of language and turned to simpler evolutionary matters, upholding the Darwinian standard and insisting, with Theodosius Dobzhansky:  “Nothing in Biology Makes Sense Except in the Light of Evolution.”  But not even Dobzhansky ventured to suggest precisely how speech evolved!  

Then came Noam Chomsky, who (as a graduate student at the University of Pennsylvania) set forth a revolutionary theory of linguistics, a “radically new theory of language.  Language was not something you learned.  You were born with a built-in ‘language organ’” (#1000).  Along with your heart and liver, you’re given it—a biological “language acquisition device” (routinely referred to as the LAD in the “science” of linguistics).  Chomsky summed it all up in his 1957 Syntactic Structures and thereby became “the biggest name in the 150-year history of linguistics” (#1012).  But what, precisely, was this LAD?  Was it a free-standing organ or an organ within the brain?  Like all else in the evolutionary scheme, it had to be something material.  But where could it be found?  Take it by faith, Chomsky said—in time empirical scientists would find it!  

After 50 years of absolute preeminence in the field of linguistics, however, Chomsky suddenly faced an antagonist!  Daniel L. Everett, having spent 30 years living with a small tribe in the Amazon jungle—the Piraha, arguably the most primitive tribe on earth—dared to challenge the Master!  He declared Chomsky’s theory falsified by the Indians he studied.  They “had preserved a civilization virtually unchanged for thousands . . . many thousands of years” (#1313), and no “language organ” or “universal grammar” could explain how they spoke.  When Everett presented his findings to the public a decade ago—declaring they provided “the final nail in the coffin for Noam Chomsky’s hugely influential theory of universal grammar” (#1393)—a “raging debate” ensued.  In fact, it was total war, with Chomsky and his epigones determined to destroy Everett!  They questioned his integrity, discounted his credentials, and schemed to ostracize him from the academic community.  

Fighting back, in 2008 Everett published Don’t Sleep, There Are Snakes, summarizing his 30 years among the Piraha.  Amazingly, for a linguistics treatise, it became something of a best-seller!  “National Public Radio read great swaths of the book aloud over their national network and named it one of the best books of the year” (#1637).  Dismissing Chomsky’s celebrated theory, Everett argued:  “Language had not evolved from . . . anything.  It was an artifact” (#1631).  He followed this up with Language:  The Cultural Tool, insisting “that speech, language is not something that had evolved in Homo sapiens, the way the breed’s unique small-motor-skilled hands and . . . or its next-to-hairless body.  Speech is man-made.  It is an artifact . . . and it explains man’s power over all other creatures in a way Evolution all by itself can’t begin to” (#1675).  Soon he found some distinguished defenders, including Michael Tomasello—co-director of the Max Planck Institute for Evolutionary Anthropology.  In an article entitled “Universal Grammar Is Dead,” Tomasello opined:  “‘The idea of a biologically evolved, universal grammar with linguistic content is a myth’” (#1663).  Then Vyvyan Evans published The Language Myth and simply dismissed the innate “language instinct” notion.  Still others soon joined the growing condemnation of the Chomsky thesis!  

Chomsky of course responded, defending himself—but subtly retracting some of his earlier hypotheses.  Then, in a long, convoluted article, we find him confessing:  “‘The evolution of the faculty of language largely remains an enigma’” (#1734).  An enigma, no less!  Fifty years of feigning The Answer!  (It seems Chomsky knows less than Aristotle, who concluded that humans have a “rational soul” enabling them to function in uniquely human ways.)  And to Tom Wolfe, this at least became crystal clear:  “There is a cardinal distinction between man and animal, a sheerly dividing line as abrupt and immovable as a cliff:  namely, speech” (#1890).  He thinks:  “Soon speech will be recognized as the Fourth Kingdom of Earth.”  In addition to the mineral, vegetable, and animal worlds, there is “regnum loquax, the kingdom of speech, inhabited solely by Homo Loquax” (#1938).  How interesting to find Wolfe affirming what an earlier (and deeply Christian) literary giant, Walker Percy, identified (in Lost in the Cosmos) as the “delta factor”—the symbolic language unique to our species.  There’s an immaterial dimension to language, rendering it impossible to reduce to (or explain by) mere matter.  

* * * * * * * * * * * * * * * * * * * * * * * * * 

While practicing their craft, scientists cannot but ask philosophical questions.  The empirical details of their discipline may very well prove interesting to certain scholars, but the deeper philosophical implications of their findings constantly press for examination and explanation.  Thus, in Undeniable:  How Biology Confirms Our Intuition that Life Is Designed (New York:  HarperOne, c. 2016), Douglas Axe, a highly-credentialed biologist (degrees from Cal Tech and Cambridge University; research articles published in peer reviewed journals) notes:  “The biggest question on everyone’s mind has never been the question of survival but rather the question of origin—our origin in particular.  How did we get here?” (#195).  We cannot but wonder:  “What is the source from which everything else came?  Or, to bring it closer to home:  To what or to whom do we owe our existence?  This has to be the starting point for people who take life seriously—scientists and nonscientists alike.  We cannot rest without the answer, because absolutely everything of importance is riding on it” (#275).  

Axe mixes many enlightening personal anecdotes—struggling to survive within an antagonistic academic establishment while entertaining serious questions concerning the dogmas espoused therein—with an expertise honed in laboratories (most notably Cambridge University) and through interactions with both eminent biologists and cutting-edge publications.  But he urges us not to rely upon prestigious authorities.  We should trust our common sense, believing what we see and intuitively know rather than what we’re told to see and believe.  He shares St. Paul’s probing conviction that “the wrath of God is revealed from heaven against all ungodliness and unrighteousness of men, who suppress the truth in unrighteousness, because what may be known of God is manifest in them, for God has shown it to them.  For since the creation of the world His invisible attributes are clearly seen, being understood by the things that are made, even His eternal power and Godhead, so that they are without excuse, because, although they knew God, they did not glorify Him as God, nor were thankful, but became futile in their thoughts, and their foolish hearts were darkened.  Professing to be wise, they became fools” (Ro 1:18-22).  

At an early age children (even if reared in atheist homes) prove St. Paul’s point, sensing there’s an ultimate God-like source responsible for a world that seems to function in accord with certain regularities and principles.  This Axe labels the universal design intuition that recognizes an intelligent dimension to all that is.  Thus children “innately know themselves to be the handiwork of a ‘God-like designer,’” only to suffer schools wherein they’re generally “indoctrinated with the message that they are instead cosmic accidents—the transient by-products of natural selection” (#843).  To refute that materialistic dogma, philosophical rather than scientific, Axe presents in-depth scientific information pointing to intelligent design as the answer to our deepest questions.  He’s particularly adept at showing how the latest findings in molecular biology (in particular the tiny and incredibly complex proteins he examines in the laboratory) and cosmology make purely naturalistic explanations truly improbable.  Fortunately, for the general reader, Axe explains things in simple, intelligible ways while demonstrating his mastery of the materials.  And he insists:  “What is needed isn’t a simplified version of a technical argument but a demonstration that the basic argument in its purest form really is simple, not technical” (#898).  We don’t need a Ph.D. in science to understand the common sense science basic to the question of origins.  

Axe’s argument actually takes us back to the Aristotelian metaphysical tradition (though he doesn’t overtly align himself with it), for the world we observe contains real beings (what he calls “busy wholes”) innately orientated to discernible ends.  There’s more to Reality than mindless matter—there’s information, reason, a Logos giving shape to that matter.  “When we see working things that came about only by bringing many parts together in the right way, we find it impossible not to ascribe these inventions to purposeful action, and this pits our intuition against the evolutionary account” (#1264).  Consider such amazingly complex creatures as spiders, salmon, and whales, each of which “is strikingly compelling and complete, utterly committed to being what it is” (#1117).  The utter inescapability of the material, formal, efficient, and final causes necessary to understand and explain them cannot be denied!  Thus life doesn’t just happen as a result of atoms randomly bouncing through space.  And to imagine life originated in a primordial pond of inorganic compounds violates both the empirical evidence of science and the laws of thought.  To anyone with eyes to see, “life is a mystery and masterpiece—an overflowing abundance of perfect compositions” that cannot be explained in accord with Darwin’s natural selection (#1129).  

Having presented his case, Axe says:  “The truth we’ve arrived at is important enough that we have a responsibility to stand up for it.  Think of this as a movement, not a battle.  When a good movement prevails, everyone wins” (#2835).  He further believes that Darwinian devotees are now on the defensive, retreating on many fronts.  They know Darwin himself understood his theory’s vulnerability, admitting:  “If it could be demonstrated that any complex organ existed, which could not possibly have been formed by numerous successive, slight modifications, my theory would absolutely break down.”  But though he credibly described the survival of the species he simply failed—as have his successors—to explain their arrival!  A century of intensive research leaves unanswered the truly fundamental questions:  How did organic life (e.g. the first cell, containing the genetic instructions for making proteins) arise from inorganic materials?  Why are humans uniquely conscious and marked by distinctively non-utilitarian traits such as altruism?  

Unlike many advocates of Intelligent Design, who insist they are not making an argument for the existence and power of God, Axe forthrightly moves from his scientific data and philosophical arguments to “naming God as the knower who made us.  I see no other way to make sense of everything we’ve encountered on our journey” (#3096).  The material world can only be—and be understood—because of an immaterial world, the spiritual and supernatural world.  “In the end,” he says, “it seems the children are right again.  The inside world is every bit as real as the outside one.  Consciousness and free will are not illusions but foundational aspects of reality, categorically distinct from the stuff of the outside world.  Following the children, if we allow ourselves to see the outside world as an expression of God’s creative thought, everything begins to make sense” (#3190).  

* * * * * * * * * * * * * * * * * * * * * * * * * * * * 

An electrical engineer by training and profession, fully immersed in cutting-edge computer developments, Perry Marshall has a consuming interest in evolutionary theory that prompted him to publish Evolution 2.0:  Breaking the Deadlock Between Darwin and Design (Dallas:  Benbella Books, Inc., c. 2015) in hopes of bringing about a rapprochement between folks primarily committed to Scripture and those strongly rooted in evolutionary Science.  Neither a “God of the Gaps” young earth creationism nor a mindless matter-in-motion Old School Darwinism will suffice, given the best available evidence.  Reared in a young-earth creationist milieu but educated as a scientist, Marshall long struggled with seemingly unanswerable questions.  But:  “One day I had a huge epiphany:  I suddenly saw the striking similarity between DNA and computer software.  This started a 10-year journey that led me down a long and winding road of research, controversy, and personal distress before I discovered a radical re-invention of evolution” (#333).  This combination of a divinely-engendered creation and evolutionary process he calls Evolution 2.0 and urges it upon his readers.  

Codes provide patterns for both computers and biological organisms.  Though we struggle to understand mysterious powers such as gravity and thermodynamics, we fully understand how to create computer codes.  And we absolutely know that codes do not write themselves!  Codes exist because minds devise them!  Without intelligent coders there could be no codes!  He supports both creationists (believing God encodes creation) and Darwinists (believing much of the evolutionary account).  Independently wealthy, Marshall even offers a multi-million dollar prize to the “first person who can discover a process by which nonliving things can create code” (#416).  But:  “Nobody knows how you get life from nonlife.  Nobody knows how you get codes from chemicals.  So nobody gets to assume life just happens because you have some warm soup and a few billion years; we want to understand how and why” (p. 178).  He stresses, in bold print:  “Codes are not matter and they’re not energy.  Codes don’t come from matter, nor do they come from energy.  Codes are information, and information is in a category by itself” (p. 187).  

Marshall begins this very personal book by confessing that his childhood cosmological beliefs were seriously challenged by the data he faced as a scientist.  He sincerely wanted to discover the truth and resolved to find it, whatever the cost.  “At the core of my being, I knew I could not live apart from integrity; I could not somehow make myself believe something that was demonstrably untrue” (p. 6).  Trusting his engineering training, he resolved to let it guide him, fully aware it might lock him into atheism.  The electrical engineering he’d mastered is highly mathematical—everything works precisely.  And as he delved into biological science he found tiny living organisms working just as precisely, following sophisticated instructions.  He also found that evolutionary theory nicely explained much that is evident in living creatures, confirming Darwin’s insights.  But one of the Darwinists’ core beliefs—that random mutations fully explain the information necessary for living beings—he found untenable.  There must be some intelligent source for the information markedly present in all that lives.  

It soon dawned on him that computer codes and biological DNA are remarkably alike.  It’s information that enables them both to work in such wonderful ways.  Prior to any evolving organisms there must be information, precisely coded in the DNA, that enables them to function.  “The pattern in DNA is a complex, intricate, multilayered language.  An incredibly efficient and capable language at that” (p. 165).  It’s not “natural selection” but “natural genetic engineering” that best explains the living world.  Marshall carefully discusses important things such as “transposition,” “horizontal gene transfer,” “epigenetics,” “symbiogenesis,” and “genome duplication,” to illustrate the wonderful ways cells function.  “One cell can hold a gigabyte of data; plant and animal tissues have a billion cells per cubic centimeter” (p. 102).  A simple cell has information equivalent “to about a hundred million pages of the Encyclopedia Britannica” (p. 106).  Indeed:  “As amazing as Google is, a single cell possesses more intelligence than a multibillion-dollar search engine” (p. 235).  And within each cell there are the true building blocks of all organisms—tiny, information-laden proteins that enable the cell to thrive.  “No human programmer has ever written software that does anything even close to this.  Could randomness do it?  No chance” (p. 142).

Only God could have written the software evident in our world.  That many of the greatest scientists of the past—Newton, Kepler, Kelvin, Mendel, Planck—believed in such a God should encourage us to follow their example.  For himself, Marshall says:  “after years of research, expense, scrutiny, and debates, my conclusion is:  Not only is Evolution 2.0 the most powerful argument for a Designer that I’ve ever seen (!), but people of faith were on the cutting edge of science for 900 of the last 1,000 years.  The rift between faith and science might heal if everyone could see how evolution actually works” (p. 270).  Marshall has read widely and provides helpful bibliographic materials.  Though not a trained philosopher, he clearly understands sophisticated arguments and logic, and his scientific preparation enables him to both grasp and explain current data.  While not as conclusively persuasive as he might like, he does provide a valuable treatise on this absolutely crucial issue.

286 Clinton Cash; Armageddon; Stealing America

 Peter Schweizer is an investigative journalist with a muckraker’s penchant for pursuing the darker dimensions of American politics, looking for scoundrels whose behavior needs exposing.  So in Architects of Ruin he detailed governmental corruption underlying the 2008 financial collapse; in Makers and Takers he highlighted the many faults of the welfare state; and in Throw Them All Out he brought to light the many suspicious stock trades enriching members of Congress.  Just recently, in Clinton Cash:  The Untold Story of How and Why Foreign Governments and Businesses Helped Make Bill and Hillary Rich (New York:  Harper Collins, c. 2015), he documents the extraordinary number of questionable ties linking the Clintons and their foundation to wealthy foreign governments and businessmen.  Almost all of his critical findings present only circumstantial evidence.  Demonstrable quid pro quo transactions are by their very nature enshrouded in secrecy and rarely leave overt proof.  But Schweizer’s evidence leads the reader to suspect the Clintons of massive corruption and malfeasance in office.  Legally, there’s a Latin phrase—res ipsa loquitur (the thing speaks for itself)—that fully applies to Schweizer’s evidence.  When first published, the book was attacked and dismissed by the Clinton-supporting mainline media.  Thus ABC’s George Stephanopoulos (without disclosing the fact that he personally had contributed $75,000 to the Clinton Foundation) glibly assured viewers that nowhere did Schweizer establish any “direct action” taken by Hillary “on behalf of the donors.”  Thus, he declared, there were no quid pro quo deals.  However, subsequent Congressional and FBI investigations make Schweizer’s case increasingly credible.  Res ipsa loquitur!  

Admittedly there has always been considerable dishonesty in American politics.  But the Clintons have been unusually close to wealthy foreigners, raking in millions of dollars for speeches and garnering contributions for the Clinton Foundation.  Indeed, “the scope and extent of these payments are without precedent in American politics.  As a result, the Clintons have become exceedingly wealthy” (#167 in Kindle).  Indeed:  “No one has even come close in recent years to enriching themselves on the scale of the Clintons while they or a spouse continued to serve in public office” (#201).  “Dead broke” in 2001, Hillary claimed, they quickly prospered (accumulating $136 million within a decade) by circumventing the law which prohibits foreign interests from contributing to political campaigns.  Lavish speaking fees and gifts to the Clinton Foundation (which employed friends and covered lush “expense” accounts for the inner circle) were the “legal” (in fact the only discernible) ways whereby the Clintons became inordinately wealthy.  “The issues seemingly connected to these large transfers are arresting in their sweep and seriousness:  the Russian government’s acquisition of American uranium assets; access to vital US nuclear technology; matters related to the Middle East policy; the approval of controversial energy projects; the overseas allocation of billions in taxpayer funds; and US human rights policy, to name a few” (#236).  

Symptomatic of things to come was President Bill Clinton’s pardon (just before leaving the White House in 2001) of billionaire fugitive Marc Rich, who was living abroad to avoid facing a variety of charges.  One of the world’s richest men, he’d been indicted for illegal trading practices and tax evasion.  His “business ties included a ‘who’s who’ of unsavory despots, including Fidel Castro, Muammar Qaddafi, and the Ayatollah Khomeini.”  Rich “owed $48 million in back taxes that he unlawfully tried to avoid and faced the possibility of 325 years in prison,” earning him a place on the FBI’s Most Wanted List.  A federal prosecutor, Morris Weinberg, said:  “The evidence was absolutely overwhelming that Marc Rich, in fact, committed the largest tax fraud in the history of the United States.”  Rather than risk imprisonment, Rich fled the country in 1991.  Fortunately for him, he had a charming former wife, Denise, who ingratiated herself with the Clintons, making lavish contributions—moving $1.5 million into their coffers.  President Clinton then pardoned Marc Rich soon after Denise donated “$100,000 to Hillary’s 2000 Senate campaign, $450,000 to the Clinton Library, and $1 million to the Democrat Party” (#270).  These transactions were helped along by Rich’s lawyer (and former White House counsel) Jack Quinn, who pled Rich’s case with Bill and Hillary.  Informed of the pardon, Mayor Rudolph Giuliani, the U.S. attorney who spearheaded the Rich investigations, refused to believe it.  Surely it was “impossible” that a president would pardon him.  But Clinton did.  Ever mindful of the letter of the law, he evaded clear quid pro quo connections, but what rational person could deny it!  Res ipsa loquitur!  Rich’s was merely one of Clinton’s many last-minute pardons—crooks, con men, relatives, ex-girlfriends, former cabinet members and congressmen.   

Such suspicious Clintonian behavior persisted—indeed escalated—during the following years as Bill and Hillary established the Clinton Foundation and erected the Clinton Library, soliciting funds from various donors and negotiating huge fees (often amounting to $500,000 or more) for speaking engagements around the world—especially in developing nations such as Uzbekistan and Kazakhstan where despots flush with cash sought to multiply it.  Skeptical journalists such as Christopher Hitchens wondered:  “why didn’t these third world oligarchs ‘just donate the money directly [to charities in their own countries] rather than distributing it through the offices of an outfit run by a seasoned ex-presidential influence-peddler?’” (#300).  Their activities caused the Obama team to voice significant concern regarding Hillary’s financial ties when she was appointed Secretary of State, so she promised to fully disclose their financial activities and secure State Department approval before accepting gifts to the foundation from foreign sources.  But she quickly broke these promises:   “Huge donations then flowed into the Clinton Foundation while Bill received enormous speaking fees underwritten by the very businessmen who benefited from these apparent interventions” (#395).  Interestingly enough, while ex-presidents’ speaking fees gradually decline once they’re out of office, Bill Clinton’s dramatically escalated when his wife became Secretary of State.  

One of the businessmen most frequently involved in the Clintons’ financial endeavors was a Canadian mining tycoon, Frank Giustra, who first connected with them in the 1990s and frequently provided a luxurious private jet for Bill to fly around the world (or to campaign for Hillary) after he left the White House.  It was Giustra who arranged a meeting between Bill and the dictator of Kazakhstan that led to an involved uranium deal, helped along by then-Senator Hillary Clinton’s activities in Congress.  This “deal stunned longtime mining observers,” and soon thereafter “Giustra gave the Clinton Foundation $31.3 million” (#593).  Yet another uranium deal involved a Canadian company and Russian investors who sought to gain control of “up to half of US uranium output by 2015” (#751).  This move was set in motion by Vladimir Putin, who personally discussed various issues, including trade agreements, with Secretary of State Clinton in 2010.  Monies then flowed into the Clinton Foundation, thanks to significant gifts from folks invested in the uranium industry.  “Because uranium is a strategic industry, the Russian purchase of a Canadian company holding massive US assets required US government approval.  Playing a central role in whether approval was granted was none other than Hillary Clinton” (#821).  Though a number of congressmen protested the deal, it was duly authorized by the Committee on Foreign Investment in the United States—a select committee that included the Secretary of State and other Obama Cabinet members.  Coincidentally, Salida Capital, one of the Canadian companies involved in the transaction and possibly a “wholly owned subsidiary of the Russian state nuclear agency,” would give “more than $2.6 million to the Clintons between 2010 and 2012” (#875).  Ultimately, “Pravda hailed the move with an over-the-top headline:  ‘RUSSIAN NUCLEAR ENERGY CONQUERS THE WORLD’” (#969).  

Since most of the millions flowing through the Clintons’ hands goes to (or through) the Clinton Foundation, Schweizer devotes many pages to probing its activities as well as providing fascinating portraits of its denizens.  Though its “window-display causes” portray the foundation as admirably charitable, helping victims of AIDS, poverty, obesity, etc., it’s more probably both “a storehouse of private profit and promotion” (#1326) and a generous employer for a number of Clinton associates, advisers, and political operatives.  (A recent review of the foundation’s 2014 IRS report reveals that of the $91 million expended only $5 million actually went to needy causes; the rest was devoted to employees, fundraising, and internal expenses.)  In fact, the foundation has virtually no infrastructure and does very little to actually help those in need.  Rather, it seeks to broker deals between “government, business, and NGOs” (#1349).  That some good is ultimately done cannot be denied, but it’s not actually done by the foundation itself.  “While there are plenty of photos of Bill, Hillary, or Chelsea holding sick children in Africa, the foundation that bears their name actually does very little hands-on humanitarian work” (#1356).  When Hillary became Secretary of State, she utilized a special government employee (SGE) rule that allowed her to appoint aides, including Huma Abedin, to her department while they simultaneously garnered a salary from the Clinton Foundation.  “Abedin played a central role in everything Hillary did” (#1589), and according to the New York Times “‘the lines were blurred between Ms. Abedin’s work in the high echelons of one of the government’s most sensitive executive departments and her role as a Clinton family insider’” (#1595).  

The Clintons’ approach to “charitable” work was manifest following the devastating 2010 earthquake in Haiti which killed some 230,000 people and left 1.5 million more homeless.  Days after the earthquake, Secretary of State Hillary Clinton flew to the island, as did husband Bill.  “With a cluster of cameras around him, Bill teared up as he described what he saw” (#2497).  “The Clintons’ close friend and confidante, Cheryl Mills, who was Hillary’s chief of staff and counselor at the State Department [recently granted immunity for telling the FBI what she knew about the thousands of Hillary’s deleted emails] was assigned responsibility for how the taxpayer money, directed through USAID, would be spent” (#2497).  A special committee, with Bill as cochair, was appointed to distribute these funds, and he made speeches describing how Haiti would marvelously recover under his guidance.  But little construction actually took place!  For example, in “December 2010 Bill and Hillary approved a ‘new settlements program’ that called for fifteen thousand homes to be built in and around Port-au-Prince.  But by June 2013, more than two and a half years later, the GAO audit revealed that only nine hundred houses had been built” (#2712).  

Rather than rebuilding the nation’s infrastructure, the money was spent on “worthless projects,” and “in several cases Clinton friends, allies, and even family members have benefited from the reconstruction circumstances” (#2521).  Consider the story of Denis O’Brien, an Irish billionaire who studiously curried the Clintons’ favor (often making available his Gulfstream 550) while successfully promoting his mobile phone company, Digicel.  The firm profited enormously from its Haitian programs and O’Brien himself collected $300 million in dividends in 2012.  O’Brien invited Bill to speak three times in three years in Ireland, and almost simultaneously his company was granted profitable positions in Haiti.  Then there’s Hillary’s brother, Tony Rodham, who had absolutely no background in the mining industry but became a member of the board of advisors for a mining company that secured a “gold mining exploitation permit”—a “sweetheart deal” that outraged the Haitian senate.  Meanwhile, Bill’s brother Roger collected $100,000 for promising builders he’d arrange a sweet deal with the Clinton Foundation.  “In sum, little of the money that has poured into Haiti since the 2010 earthquake has ended up helping Haitians.  And how that money was spent was largely up to Hillary and Bill” (#2770).  

In conclusion:  “The Clintons themselves have a history of questionable financial transactions” (#2806).  They neither follow the same rules nor receive the same treatment as most Americans, yet they have famously flourished within modern American politics.  That they have succeeded, despite the record of questionable activities detailed in Clinton Cash, should give us pause!  

* * * * * * * * * * * * * * * * * * * * * *

Few political insiders know Bill and Hillary Clinton better than Dick Morris, the architect of Bill’s “triangulation” strategy which enabled him to coast to re-election in 1996.  Morris’s Armageddon:  How Trump Can Beat Hillary (West Palm Beach, FL:  Humanix Books, c. 2016), co-written with his wife, Eileen McGann, offers a unique perspective on this year’s election.  Given Morris’s checkered history, his pronouncements must always be considered with significant reservations!  Much of his life he’s worked as a “hired gun” and shown little ethical concern when involved in the rough and tumble world of partisan politics.  But inasmuch as he was one of Bill Clinton’s most trusted consultants in the 1990s he certainly provides information worth pondering as we consider Hillary’s presidential aspirations.  Morris also discusses Donald Trump’s prospects, but it’s his knowledge of the Clintons that most interests me.  

As the book’s title indicates, Morris writes as an alarmist:  “The ultimate battle to save America lies straight ahead of us:  It’s an American Armageddon, the final crusade to defeat Hillary Clinton” (#138).  Her election, he says, listing a litany of fears, will consign us to “four long years of another bizarre Clinton administration, featuring the Clintons’ signature style of endless drama, interminable scandals, constant lies, blatant cronyism and corruption, incessant conflicts of interest, nepotism, pathological secrecy, hatred of the press, his and her enemies lists, misuse of government power, inherent paranoia, macho stubbornness, arrogant contempt for the rule of law, nutty gurus, and thirst for war.  Those will be the disastrous and unavoidable hallmarks of a Hillary regime” (#246).  With a cast of characters including Bill and Chelsea Clinton, Sidney Blumenthal, David Brock, Terry McAuliffe, et al.—“unqualified and greedy cronies and her money-grubbing family members” roaming Hillary’s White House—the nation will suffer gravely.  When we think of the Clinton scandals, we usually focus on Bill’s sexual escapades, but Morris declares “that almost every single scandal in the Bill Clinton White House was caused by Hillary:  Travelgate, Whitewater, Filegate, her amazing windfall in the commodities futures market, the Health Care Task Force’s illegal secrecy, the household furniture and gifts taken from the White House to Chappaqua, Vince Foster’s suicide, Webb Hubbell’s disgrace—all Hillary scandals” (#412).  

In his first chapter Morris lists “A Dozen Reasons Hillary Clinton Should Not Be President.”  These include:  1) her dismal failure to respond well to the terrorist attack in Benghazi; 2) her compulsive, life-long lying about almost everything; 3) her penchant for hawkish, pro-war pronouncements; 4) her ties to the Muslim Brotherhood, evident in her closeness to Huma Abedin, whose parents (and she herself) fervently supported the organization; 5) her easily documented record of flip-flops on a variety of issues (e.g. gay marriage, free trade) during the course of her life; 6) her manifest corruption—a “way of life” most evident in her multifaceted financial deals, e-mails, and Clinton Foundation; 7) her obsessive concern for secrecy; 8) her queen-like ignorance regarding ordinary Americans; 9) her economic vacuity; 10) her reliance on disreputable “gurus” such as Sidney Blumenthal; 11) her stubbornness; and 12) her notorious nepotism.  

Clearly Dick Morris dislikes and distrusts Hillary Clinton.  How seriously you take his warnings naturally depends upon how much you trust him.  But when placed in proper context, and compared with other accounts corroborating his data, he’s persuasive.  

* * * * * * * * * * * * * * * * * 

With the election of 2016 approaching, Dinesh D’Souza published two clearly polemical treatises designed to warn America about Hillary Clinton and the Democratic Party:  Stealing America:  What My Experience with Criminal Gangs Taught Me About Obama, Hillary, and the Democratic Party (New York:  Broadside Books, c. 2015), and Hillary’s America:  The Secret History of the Democratic Party (Washington, D.C.:  Regnery Publishing, c. 2016).   For many years D’Souza has espoused conservative principles, shaped in part by his unique story as an immigrant (from India) feeling deeply blessed to thrive in his adopted country.  For me his treatises serve to elicit thought, not to chart a course!  

In 2012 D’Souza gave a friend running for a state office in New York $10,000 and persuaded two others to donate the same amount, for which he reimbursed them.  He knew he was skirting the campaign finance limit but didn’t think he was breaking the law.  Soon thereafter, however, he was pursued by the Justice Department and (unlike virtually all other violators) found himself paying half-a-million dollars in legal fees and serving eight months of nights in a confinement center in San Diego.  That he’d just produced an anti-Obama film (2016) was, he believed, anything but coincidental!  Commenting on his case, Harvard law professor Alan Dershowitz said:  “‘What you did is very commonly done in politics, and on a much bigger scale.  Have no doubt about it, they are targeting you for your views’” (p. 14).  In confinement D’Souza “understood, for the first time, the psychology of crookedness.  Suddenly I had an epiphany:  this system of larceny, corruption, and terror that I encountered firsthand in the confinement center is exactly the same system that has been adopted and perfected by modern progressivism and the Democratic Party” (p. 26).  He came to see the party of Obama and the Clintons not simply as “a defective movement of ideas, but as a crime syndicate” (p. 26). 

Pursuing this thesis—however preposterous it might seem—makes for interesting reading.  In particular, one learns much about the criminal underclass populating America’s prisons and its utter cynicism regarding the political system.  The murderers and thieves with whom D’Souza lived noted that most politicians enter “office with nothing and leave as multimillionaires.  So how did this happen?  It just happened?” (p. 47).  If nothing else they understood crime—and they knew criminality undergirds this process!  D’Souza soon grasped the truth of St. Augustine’s famous observation in The City of God:  “What are kingdoms but gangs of criminals on a large scale?  What are criminal gangs but petty kingdoms?”  Translating that truth into contemporary America, D’Souza concludes that “the ideological convictions of Obama, Hillary, and the progressives largely spring out of those base motives and that irrepressible will to power.  The progressives have unleashed a massive scheme for looting the national treasury and transferring wealth and power to themselves, and their ideology of fairness and equality is to a large degree a justification—a sales pitch—to facilitate that larceny.  Previously I didn’t see this very clearly; now I do” (p. 50).  

The same basic message characterizes D’Souza’s Hillary’s America, the book behind the widely-viewed documentary bearing the same title.  He clearly believes Hillary is a threat to the republic, but more basically he argues the Democratic Party has (since its inception under Andrew Jackson) supported a variety of evil endeavors running from stealing Indians’ lands to enslaving Africans to endorsing Jim Crow laws and racist eugenics.  To D’Souza the “progressive narrative” of American history is “a lie” and the Democratic Party must be held accountable for its sordid past.  Hillary Clinton is merely the current representative of a movement that has “brutalized, segregated, exploited, and murdered the most vulnerable members of our society.”  As such Hillary and the Democrats must, he insists, be defeated!  

# # #

285 Deadly Notions

Given our rationality, ideas inevitably have consequences and deeply shape human history.  In The Death of Humanity:  And the Case for Life (Washington:  Regnery Faith, c. 2016), California State University historian Richard Weikart helps explain the “culture of death” so pervasive throughout the past century—during which both belief in the dignity of man and the actual lives of millions of men demonstrably perished.  Consider the case of the serial killer and cannibal Jeffrey Dahmer:  following his arrest in 1991, he said that he believed “‘the theory of evolution is truth, that we all just came from the slime, and when we died . . . that was it, there was nothing—so the whole theory cheapens life.’  With this vision, he saw no reason not to kill and eat other men.  As he confessed, ‘If a person doesn’t think there is a God to be accountable to, then what’s the point in trying to modify your behavior to keep it in acceptable ranges?’” (#224).  Similarly, Eric Harris, one of the killers at Columbine High School in 1999, confessed (in his journal) to loving Thomas Hobbes and Friedrich Nietzsche; furthermore, he wore a T-shirt declaring “Natural Selection” when he launched his killing spree.  

Having survived Auschwitz, the great Austrian psychiatrist Viktor Frankl analyzed the intellectual currents he held responsible for the Holocaust:  “If we present a man with a concept of man which is not true, we may well corrupt him.  When we present man as an automaton of reflexes, as a mind-machine, as a bundle of instincts, as a pawn of drives and reactions, as a mere product of instinct, heredity, and environment, we feed the nihilism to which modern man is, in any case, prone.  I became acquainted with the last stage of that corruption in my second concentration camp, Auschwitz.  The gas chambers of Auschwitz were the ultimate consequences of the theory that man is nothing but the product of heredity and environment—as the Nazis liked to say, of ‘Blood and Soil.’  I am absolutely convinced that the gas chambers . . . were ultimately prepared not in some Ministry or other in Berlin, but rather at the desks and in the lecture halls of nihilistic scientists and philosophers” (The Doctor and the Soul, xxvii).  

In many ways, Weikart’s work is an extended commentary on the anthropological ideas Frankl held responsible for genocide, holding man to be “an automaton of reflexes, as a mind-machine, as a bundle of instincts, as a pawn of drives and reactions, as a mere product of instinct, heredity, and environment.”  During the past 200 years a multitude of thinkers have embraced varieties of philosophical materialism and rejected the traditional Christian “sanctity-of-life” ethic.  Some of them, beginning with Julien Offray de La Mettrie in 1747, imagined humans in terms of Man a Machine.  If man is but a machine running within a mechanistic universe utterly devoid of meaning or purpose, it follows that he is no more responsible for his behavior than is the moon circling the earth.  Picking up on this notion, Ludwig Feuerbach famously said “Man is what he eats” (Der Mensch ist, was er isst!) and Karl Marx drank deeply from this fountain of atheism as he began his revolutionary career.  In our day, Francis Crick, celebrated for his DNA discoveries, “has probably done as much as anyone to promote the idea that humans can be reduced to their material basis” (#807) and speaks for many eminent academics.  Since we’re nothing but genes and molecules, i.e. matter-in-motion:  “‘No newborn infant should be declared human until it has passed certain tests regarding its genetic endowment and that if it fails these tests it forfeits the right to life’” (#826).  

Other philosophical materialists reduce man to a highly-evolved animal, following Darwin’s dictum that he was “created from animals” and has no soul.  That view, as adumbrated by People for the Ethical Treatment of Animals’ Ingrid Newkirk, holds:  “A rat is a pig is a dog is a boy.  They are all mammals.”  PETA enthusiasts, of course, elevate animals to human status, demanding they be treated tenderly.  But others lower humans to animals, treating them as disposable if deemed worthless.  Following Darwin, human life must be devalued, since all animals share a common ancestry and natural selection requires the denial of any purpose to life.  Necessarily there can be no moral standards—might-makes-right as the fittest survive and individuals struggle for supremacy in life-and-death competition.  

Weikart traces the powerful trajectory of Darwinism (a theme he earlier documented in From Darwin to Hitler:  Evolutionary Ethics, Eugenics, and Racism in Germany and Hitler’s Ethic:  The Nazi Pursuit of Evolutionary Progress), culminating in some of the pronouncements of Peter Singer—Princeton University’s Professor of Bioethics.  Studying Singer—famed for his promotion of “animal liberation,” infanticide, bestiality, and “unsanctifying human life”—one realizes how much of what’s wrong with our world can be traced to Charles Darwin, whose moral relativism justified any behavior which increased an individual’s survival potential (“even killing one’s own offspring”) if advantageous.  “Singer admits that Darwinism informs his own position that humans are not special or uniquely valuable.  He claims that Darwin ‘undermined the foundations of the entire Western way of thinking on the place of our species in the universe’” (#1054).  Without those Christian foundations, there is no reason to condemn the might-makes-right victors in the struggle for existence.  

Differing somewhat from Darwin (who stressed environmental factors and understood little of what we label genetics) are biological determinists such as Harvard University’s Steven Pinker.  In The Blank Slate:  The Modern Denial of Human Nature, he justifies infanticide inasmuch as one’s “genes made me do it.”  Pinker labels just-born humans “neonates,” whose killing he calls “neonaticide” rather than murder.  Since the neonate lacks “morally significant traits” and is not demonstrably a full person, he has no more right to life than a mouse.  Hard-wired by our genes, we have no free will and simply follow what’s prescribed for us.  Criminals are thus not to be held responsible for their crimes—a position increasingly held by criminologists and judges who attack this nation’s incarceration policies.  

Biological determinism was strongly asserted, more than a century ago, by Charles Darwin’s cousin, Francis Galton, who eagerly embraced the theory of natural selection and applied it to eugenics—a “‘new religion’ that would humanely improve humans biologically” (#1792).   In Galton we encounter “Social Darwinism” in its purest form.  Enthusiasts for this endeavor promoted a blatantly-racist agenda in Europe and America, passing laws and influencing a variety of academic disciplines.  Tarnished by its association with the Nazis, eugenics faded following WWII, but it has recently revived under the rubric of “sociobiology” and “evolutionary psychology.”  Sociobiology’s architect, Harvard University’s E. O. Wilson, restricted reality to “chance and necessity” and insisted “that everything about humans—behavior, morality, and even religion—is ultimately explicable as the result of completely material processes” (#1955).  Given this assumption, virtually any behavior may be “good” as long as it contributes to the evolutionary process.  If one finds animal species engaging in suicide or infanticide or incest, numbers of evolutionary devotees declare such behavior may very well be appropriate for humans as well.  

Environmental determinists glibly declare “my upbringing made me do it”—as did Clarence Darrow (the famed defense attorney at the Scopes monkey trial) when he defended Nathan Leopold and Richard Loeb in a celebrated Chicago case a century ago.  The two young men, both brilliant and wealthy, had murdered a 14-year-old boy simply to carry out the “perfect” crime.  Darrow (working out the implications of the Darwinism he would defend a year later at the Scopes trial) declared they were simply acting out what had been programmed into them and were thus guiltless of any crime!  Weikart traces the genealogy of this position across 200 years, running from Helvetius through Robert Owen and his socialist supporters to Marx and his 20th century revolutionaries such as Stalin and Mao.  Prominent American psychologists, led by John B. Watson and B.F. Skinner, declared that one can prescribe any kind of behavior by applying the right stimulus.  Thus criminals ought not be held responsible for their acts—society shapes them and they do what they cannot but do.  

Yet another powerful component in the “culture of death” is the “love of pleasure” most sharply evident in the works and influence of the Marquis de Sade, who embraced any kind of behavior (including sadism) that “feels good.”  He and other Enlightenment thinkers recovered and promoted Epicurus and Lucretius—ancient writers clearly at odds with the Christian tradition.  Subsequently, Jeremy Bentham and John Stuart Mill constructed an ethical system—Utilitarianism—reducing all moral questions to a pleasure/pain calculus.  There are no “natural” rights, only more or less pleasurable experiences.  Maximizing pleasure becomes the sole “good,” whether one considers an individual or a society.  Later, Sigmund Freud set forth his highly-influential psychology, reducing almost every question to its sexual implications and satisfactions.  His case for sexual liberation had enormous influence, particularly as the counterculture of the ‘60s worked out its hedonistic ethos.  

Some of the 20th century’s most toxic deadly notions flow from existential and postmodern philosophers.  One of the chief sources for both movements is Friedrich Nietzsche, who declared, in Also sprach Zarathustra:  “Die at the right time; thus teaches Zarathustra . . . .  Far too many [people] live and hang much too long on their branches.  May a storm come to shake all these rotten and worm-eaten ones from the tree.”  When Clarence Darrow mounted his defense of Leopold and Loeb he invoked Nietzsche as well as Darwin to explain away their responsibility for murder.  Nietzsche certainly shared Darwin’s view of human origins, writing:  “‘You have made your way from worm to man, and much of you is still worm’” (#3307).  When one carefully studies the careers of Mussolini and Hitler it becomes evident that many of the most murderous regimes were influenced by the atheistic existentialism of Nietzsche, including his contempt for the less fit in life’s struggle.  A 1938 photograph shows Hitler contemplating a bust of Nietzsche; its caption proudly claims “Nietzsche was a forerunner of Nazism” (#3300), and Hitler certainly wanted to move “beyond good and evil” in his will-to-power ambitions.  Traditional ethical notions, such as opposing suicide and infanticide, were to be discarded in an endeavor to purify and elevate the race.  

Having looked at the many thinkers responsible for our culture of death, Weikart documents how suicide, euthanasia, infanticide, and abortion have become increasingly acceptable in much of our world.  Thus we find two medical ethicists, in 2012, proposing we re-conceptualize infanticide as “after-birth abortion” to ensure its social acceptability.   “Death-With-Dignity” initiatives have succeeded in Washington, Oregon, and California and promise to succeed elsewhere as secularism replaces Christianity as the nation’s moral foundation.  Many secularists (including the famous “situation ethicist” Joseph Fletcher) insist that mere human beings are not fully “persons” and have no right to life.  Persons, Fletcher asserted, “must have certain qualities, such as the ability to make moral decisions, self-awareness, self-consciousness, and self-determination” (#4078).  Similarly, Peter Singer says, neither an unborn “fetus” nor a newly-born baby can be considered a “person.”  Nor do severely handicapped individuals or terminally ill comatose patients qualify as “persons.”  

In his “Conclusion,” Weikart says:  “Humans on display in zoos.  Comparing farm animals in captivity to Holocaust victims.  ‘After-birth abortion.’  Physicians killing patients, sometimes even when they are not sick or in pain.  Accusing fetuses of assaulting their mothers, just because they are living peaceably in utero.  Promoting ‘love drugs’ to make us more moral.  Granting computer programs moral status.  These are just a few examples that powerfully illustrate how sick our society is.  As many intellectuals have abandoned the Judeo-Christian sanctity-of-life ethic in favor of secular philosophies, we have descended into a quagmire of inhumanity.  Some today view humans as nothing more than sophisticated machines or just another type of animal.   For them, humans are nothing special—just another random arrangement of particles in an impersonal cosmos” (#4936).  

The Death of Humanity deserves careful study and reflection.  J. Budziszewski, one of today’s finest Christian philosophers, says:  “So often I have heard the question, ‘How did we ever become so muddled in this twenty-first century?  What happened?’  This is a question for a historian, who can weave a single coherent story about a great many sources of confusion.  Richard Weikart is that historian, and I will be recommending his sane and lucid book often.”  As will I—and am so doing with this review!   

* * * * * * * * * * * * * * * * * * * * * * 

In Architects of the Culture of Death (San Francisco:  Ignatius Press, c. 2004), Donald De Marco and Benjamin Wiker provide brief vignettes of 23 thinkers, grouped together in seven sections, who bear responsibility for the dehumanizing “culture of death” facilitating the killing of innocent persons.   De Marco is a philosopher; Wiker is a biologist; both are committed Catholics who write to promote the “Personalism” associated with Pope John Paul II and deeply embedded in two millennia of Christian thought.  “It is precisely because of the infinite value of each human person, as revealed especially in the great drama of Jesus Christ, that truly Christian culture must be a Culture of Life, a culture that sees the protection of persons and their moral, intellectual, and spiritual development as the defining goals of society.  Whatever contradicts these goals can have no place in the Culture of Life” (p. 14).  

Clearly at odds with the Culture of Life is Friedrich Nietzsche, one of the “will worshippers” who celebrated a “Will to War, a Will to Power, a Will to Overpower” (p. 41).   His heroes were “Supermen” like Julius Caesar who imposed their will on others, using whatever (frequently violent) means necessary.  In 1940 an American historian, Crane Brinton, diagnosed the impact of Nietzsche’s literary works:  “Point for point he preached . . . most of the cardinal articles of the professed Nazi creed—a transvaluation of all values, the sanctity of the will to power, the right and duty of the strong to dominate, the sole right of great states to exist, a renewing, a rebirth, of German and hence European society. . . .  The unrelieved tension, the feverish aspiration, the driving madness, the great noise Nietzsche made for himself, the Nazi elite is making for an uncomfortably large part of the world” (p. 52).   

Though not connected with them in any formal way, Nietzsche certainly shared much with eminent eugenicists of his era, who all embraced Charles Darwin’s notions of evolution through “natural selection” and the “survival of the fittest.”  Though Darwin himself evaded the implications of his theory for human beings for much of his life, it became clear in 1871, with the publication of his Descent of Man, that he was a eugenicist.  And he was also “a racist and a moral relativist” (p. 76).  Thus his cousin, Francis Galton, enthusiastically worked out the social implications of Darwinism by promoting eugenic measures designed to improve the race.  Just as we can breed better dogs we can breed better babies.  Inferior members of the species are best left to die off or forced to embrace celibacy.  Private correspondence between cousins Galton and Darwin proves how totally the latter endorsed the work of the former, so the two share responsibility for what we term “Social Darwinism.”  Embracing some of the deadlier aspects of this movement, the German zoologist Ernst Haeckel championed a rather ruthless form of evolutionary philosophy he called Monism, “drawing out the full implications of Darwinism” (p. 107).  He fervently espoused “eugenics and racial extermination” and “abortion, infanticide, and euthanasia as well” (p. 107).  Haeckel’s books were widely read at the turn of the 20th century and demonstrably influenced many of the policies crafted by Adolf Hitler.   

“Secular utopianists,” preeminently Karl Marx, prepared the way for mass-murderers such as Stalin and Mao.  Though his devotees religiously absolve Marx from any responsibility for the behavior of Communist regimes—asserting all efforts to implement his teachings strayed from the founder’s intent—there is clearly a deadly dimension to all efforts to establish a perfectly egalitarian world.  In fact, “Marx could not be more limpid in his call for violence.  He advocated hanging capitalists from the nearest lampposts” (p. 125).  Aligned with Marx (sharing both his atheism and communism) was the French Existentialist Jean-Paul Sartre, whose “philosophy leads logically and directly to despair and suicide. . . .  His world of atheism is a kingdom of nothingness plunged into intellectual darkness, convulsed with spiritual hate and peopled by inhabitants who curse God and destroy each other in their vain attempt to seize his vacant throne” (p. 175).  (There is thus some warrant for Paul Johnson to suggest, in his biography of Darwin, that Pol Pot, the genocidal Cambodian Communist, derived some of his murderous ideas both from Sartre, who introduced him to Darwin, and from Darwin himself!)

While the “pleasure seekers” might not seem to promote the culture of death, at least indirectly they do!  Thus Helen Gurley Brown, who made Cosmopolitan magazine a stellar success (especially on college campuses), singularly promoted “feel-good sex.”  Her “Sex and the Single Girl [was] a ‘shameless, unblushing, runaway, unmitigated’ manual advising and instructing women on how to seduce men and enjoy their inalienable right to have as much sex as humanly possible” (p. 237).  Her message helped shape the enormously successful television show, “Sex and the City,” mainstreaming her ideas.   Inevitably she approved adultery, contraception, and abortion—anything that gave pleasure was fine.  

So too “sex planners” added their notions to the anti-life brew.  Margaret Mead, named “Mother of the World” by Time magazine in 1969, was certainly one of the most influential anthropologists of the 20th century and reached a broad women’s audience through her regular columns (1961-1978) for Redbook magazine, helping “bring the twentieth-century sexual revolution to its culmination” (p. 250).  As a young woman she published Coming of Age in Samoa (1928) and instantly became an academic superstar.  Though her misleading portrayal of the sexually libertine Samoans was “autobiography disguised as anthropology,” the book would be required reading in hundreds of university classes and help undermine the Christian tradition’s commitment to chastity and opposition to abortion.  Joining Mead as a spurious “scholar” was Alfred Kinsey, who sought to justify his own covert homosexuality and pedophilia with his allegedly statistical studies, Sexual Behavior in the Human Male and Sexual Behavior in the Human Female.  The Kinsey Reports lent an aura of respectability to deviant behaviors simply by falsely stating large numbers of Americans actually practiced them.  

Finally, there are the “death peddlers”—Derek Humphry, who in Final Exit championed suicide; Jack Kevorkian, the pathologist who bragged about his “mercy-killing” activities and “personifies the Culture of Death” as vividly as anyone; and Peter Singer, the Princeton philosopher who seeks to discard the “traditional Western ethic” which for 2,000 years has promoted the “sanctity of life.”  “Taking Darwinism to its ultimate conclusions” (p. 363), Singer denies significant differences between humans and other animals.  He also believes a “person” is a human being with certain capacities and thus not all humans qualify as “persons” with a right to life.  His books—and his international prestige as one of the preeminent ethicists in the world—bear witness to the triumph, in many sectors, of a noxious ideology.  

284 Hillary’s History

In a court of law, eyewitness testimony is highly privileged, considered “first-hand” evidence most worthy of consideration.  So too historians relentlessly seek out “primary” sources—eyewitness accounts showing “how it actually was.”  Eyewitnesses may, of course, render skewed accounts—shaped by personal biases or faulty memories or delimited vision.  They may very well be a bit inarticulate and disjointed in telling their stories.  So juries and historians take such things into account and try to put everything in its proper context.  But in the final analysis eyewitness testimony and primary sources provide us our surest route to historical truth.

One recent eye-witness account meriting attention is Gary J. Byrne’s Crisis of Character:  A White House Secret Service Officer Discloses His Firsthand Experience with Hillary, Bill, and How They Operate (New York:  Center Street, c. 2016).  After serving in the Air Force, Byrne realized his vocational aspirations and became “an elite White House Secret Service officer, a member of its Uniformed Division,” entrusted with guarding the President, his family and staff.  He began his assignment when George H.W. Bush (affectionately referred to as “Papa Bush”) was still President.  “I assumed every president would follow Papa Bush’s example,” Byrne says.  “The work ethic, love of country, work environment, and respect for the people serving would be constant, and politics would never matter” (p. 36). 

But his high regard for the Bush family turned to anguish as he watched the Clintons occupy the White House and witnessed firsthand—among other things—the Monica Lewinsky affair.  In addition:  he “saw a lot more.  I saw Hillary, too.  I witnessed her obscenity-laced tirades, her shifting of blame” (p. ix) and other traits disqualifying her from most any high office, much less the presidency.  He and his fellow officers “were measured by the highest of ethical requirements” while “[t]hose at the very pinnacles of power held themselves to the very lowest standards—or to none whatsoever” (p. x).  “The Clintons are crass.  Papa Bush is class” (p. 277).  To Byrne, Hillary “simply lacks the integrity and temperament to serve in the office.  From the bottom of my soul I know this to be true.  So I must speak out” (p. xi).   

Byrne’s critical comments are confirmed and underscored by other agents, who provided Ron Kessler the information recorded in In the President’s Secret Service:  Behind the Scenes with Agents in the Line of Fire and the Presidents They Protect—a historical narrative of the agency.  In the chapter devoted to the Clintons, Kessler says that Bill was charming, if utterly undisciplined, but “Hillary Clinton could make Richard Nixon look benign.  Everyone on the residence staff recalled what happened when Christopher B. Emery, a White House usher, committed the sin of returning Barbara Bush’s call after she had left the White House.  Emery had helped Barbara learn to use her laptop.  Now she was having computer trouble.  Twice Emery helped her out.  For that Hillary Clinton fired him” (p. 146).  He would then be unemployed for a year, thanks to the vindictive First Lady!  One agent said:  “‘When she’s in front of the lights, she turns it on, and when the lights are off and she’s away from the lights, she’s a totally different person.’”  Off stage she was “‘very angry and sarcastic and is very hard on her staff.   She yells at them and complains.’”  Though publicly she pretended to adore the agents assigned to protect her, she “‘did not speak to us.  We spent years with her.  She never said thank you’” (p. 147).  That other agents share Byrne’s disdain for Hillary lends his account considerable credibility!  

Agent Byrne first encountered the Clintons in 1992 when he worked at some of the candidate’s campaign rallies.  Chatting with a sheriff from Arkansas, he mentioned the many rumors then revolving around the Clintons.  The sheriff “gave me a thousand-yard stare.  ‘Let me tell you something, Gary.  Everything—everything they say about them is true.  The Clintons are ruthless.  And [the media-led public] don’t even know the half of it’” (p. 39).  The next six years amply proved to Byrne the truth of that sheriff’s assertion.  The polite, orderly White House deteriorated into “helter-skelter” chaos as the Clinton crew failed to “focus, pace themselves, or even delegate.  Staff wore jeans and T-shirts and faced each problem with grand ideological bull sessions” (p. 50).  Hillary Clinton’s “doting, barely post-adolescent staffers resembled enabling, weak-willed parents.  She threw massive tantrums” (p. 56) which only intensified as the years passed.  Her friendly, empathetic public facade belied the private fury evident in “antics [that] made my job interesting.  She’d explode in my face without reservation or decorum, then confide in some visiting VIP, ‘This is one of my favorite officers, Gary Byrne’” (p. 60).  

Byrne provides important details regarding various scandals and insights into personalities in the Clinton White House, but he is best known for his testimony regarding the Monica Lewinsky affair that figured largely in the impeachment of the president.  She was what the Secret Service called a “straphanger” or “loiterer”—a young volunteer intern with political connections, wandering around the White House seeking access to powerful persons.  Lewinsky clearly stalked President Clinton, doing everything possible to frustrate the agents who tried to shield him from her advances.  But rather quickly it became an open secret that she and Clinton were having an affair—one of many such trysts the president engaged in while living in the White House, including sessions with Eleanor Mondale, the daughter of the former vice president.  Still more, a fellow agent told Byrne:  “‘You have no idea what it’s like on the road’” (p. 107), where women regularly traipsed in and out of Clinton’s quarters.  He “had difficulty managing where he saw his many mistresses, whether it was at the White House or on the road.  It baffled the Uniformed Division as to how he could manage all these women without any of them realizing there were so many others.  We wondered how he got any work done and joked that he would have been better at running a brothel in a red-light district than the White House” (p. 127).  

After encountering Lewinsky, President Clinton put her on the White House payroll and gave her his top-secret phone number so they could have intimate talks.  To Byrne:  “paying a mistress with taxpayer funds and giving her security clearance?  These were new lows” (p. 111).  Ultimately the semen-stained blue dress would prove the president guilty of perjury and lead to his impeachment.  Then when independent counsel Ken Starr learned of the Lewinsky affair through the Paula Jones lawsuit, he brought the weight of the Justice Department to bear on Byrne, seeking information helpful to his investigation.  So very much against his will Byrne was subpoenaed and forced to tell what he had observed in the White House.  Testifying via videotape before a grand jury, he would soon be seen by the nation on C-SPAN—though he had been promised his testimony would remain sealed.  As a Secret Service agent he had vowed to protect the president—committed to never revealing “information that might jeopardize [his] safety and security”—so he refused to discuss certain things.  But as a citizen he had to reveal certain details relevant to the Starr inquiry.  Consequently, he became one of the most important under-oath witnesses regarding the Clintons’ behavior in the White House.

Now safely removed from that crisis-ridden epoch, Byrne can look back and assess it.  While testifying, he remembered that Arkansas sheriff’s words regarding the Clintons’ ruthlessness, and he confesses to fearing them and what might happen to him and his family because of his testimony.  Still more, he’s outraged:  “I was compelled to tell the truth, but why the hell was neither the president nor Mrs. Clinton ever really compelled to tell the damn truth?” (p. 165).  Bill Clinton misbehaved and lied and easily moved on virtually unscathed while many “little people” had their lives ruined by his behavior and his wife’s machinations.  “This is the man I was protecting?  That’s what I tolerated?  I had tried and tried to prevent harm to this president, but he failed us all!” (p. 177).  

Two decades later, Byrne says:  “Our collective amnesia about the Clinton White House is dangerous because it could happen again—maybe with a different Clinton dealing the cards, but with the same stacked deck” (p. 273).  So he has written this book to dissuade us from electing Hillary, particularly in light of her careless handling of classified materials and suspicious work with the Clinton Foundation.  He “was there with the Clintons.  I could not keep silent then, and I can’t keep silent now” (p. 274).  

* * * * * * * * * * * * * * * * * * * * *

In the early ‘60s David Schippers led the Justice Department’s Organized Crime and Racketeering Unit, successfully prosecuting mobsters such as Sam Giancana.   A lifelong Democrat who twice voted for Bill Clinton, he was renowned for his skills as a prosecutor and trial attorney.  More importantly:  he was known as a man of integrity.  As the House of Representatives began the inquiries which led to the impeachment of President Clinton, Schippers was brought to Washington to lead an oversight investigation of the Justice Department and ultimately became Chief Counsel of the House Managers entrusted with pursuing evidence for the president’s impeachment.  In Sellout:  The Inside Story of President Clinton’s Impeachment, Schippers provided an “insider’s account” of what happened nearly 20 years ago.  

In light of evidence he probably knew better than anyone else, Schippers believed Clinton should have been removed from his office for his “high crimes and misdemeanors.”  Though the president claimed to be “proud of what we did” during the impeachment process—declaring he “saved the Constitution”—Schippers thought him demonstrably guilty of “some of the most outrageous conduct ever engaged in by a president of the United States” (p. 3).  He quickly learned to detect and deeply abhor the Clintons’ guiding modus operandi:  do anything to avoid the truth.  White House spin-masters manipulated the media (portraying the president as a victim) and glossed over his incessant lies, which were obvious to skilled lawyers who saw through his legalistic obfuscations.  To Schippers, Clinton’s real “high crimes and misdemeanors” were perjury and obstruction of justice.  But he and his media accomplices successfully reduced the whole inquiry to nothing more than questions of lamentable sex with Monica Lewinsky.  “The White House never ceased to astound and dismay me in the extent to which it demonstrated its utter contempt for the Judicial Branch, the Legislative Branch, and the American people” (p. 171).  

As much as anyone, then, David Schippers understands the Clintons’ duplicitous behavior.  So when he commends a recent book by Dolly Kyle we may assume he validates much of her account in Hillary:  The Other Woman:  A Political Memoir (Washington, D.C.:  WND Books, c. 2016).  Schippers says the book “is as timely as tomorrow’s newspaper” inasmuch as it contains “Ms. Kyle’s firsthand knowledge obtained over many years” (#56).  Acutely aware of the investigations he conducted 20 years ago, he affirms the truth of Kyle’s memoir since she’s known the Clintons for half a century and occupies “a unique position to reveal the truth about Billy and Hillary that no one else can tell” (#178).  She wrote this book because “Hillary Rodham Clinton is running for president.  She is morally and ethically bankrupt” (#144).  From Kyle’s perspective:  “The average person cannot comprehend that two politicians could have managed to get where they are with so many crimes in their wake, and so little reporting about it” (#1028).  The Clintons are, to be candid:  “lying, cheating, manipulative, scratching, clawing, ruthlessly aggressive, insatiably ambitious politicians . . . and nothing about them has changed in the past forty-plus years, except that they have deluded more and more people” (#1034).  

Dolly Kyle met Bill Clinton in 1959 in Hot Springs, Arkansas, when she was eleven years old.  They both graduated from Hot Springs High School in 1964, and she provides many details and insights into the community and families that help us better understand “Billy” Clinton.  She was immediately attracted to him and “a liaison . . . evolved from puppy love to dating to friendship to a passionate love affair” (#209) that lasted, off-and-on, for 40 years.  Their affair was pretty much an “open secret” in Arkansas, though it attracted little media attention.  “I’m not proud (and have repented) of having that decades-long affair with Billy Clinton, but it is a fact” (#448).  They became lawyers, married other persons, had children, and repeatedly interacted with each other.  Sadly enough, for too many years she simply thought of him as a lovable rascal, indulging his appetites with a series of willing women.  “I didn’t realize until many years later, that Billy was a serial sexual predator and a rapist” (#1886).  Nor did she then understand Hillary’s role in suppressing any evidence of his philandering.  

When Hillary Rodham moved to Arkansas and married Bill Clinton, she necessarily had contact with her husband’s Arkansas friends, including Dolly Kyle.  Though Dolly retains a lingering affection for “Billy” (despite his wayward ways he’s “a charming rogue who was sexually addicted”), she clearly dislikes Hillary.  In her opinion, Bill moved in with Hillary while they were students at Yale in order to share her wealth and the two have simply used each other to advance their respective careers ever since.    During their early years, it was “generally Hillary’s job to make the money and provide the financial base from which she and Billy could maneuver their way to the White House” (#2424).  “Even their decision to have a child was a calculated political maneuver to make them appear to be a normal couple” (#1533).  

Meeting Hillary for the first time in Little Rock in 1974, Kyle was shocked at the “dowdy-looking woman who appeared [at a distance] to be middle-aged” (#590), wearing thick glasses, a shapeless dress and sandals; she clearly cared little for style or personal appearance.  When Bill introduced them, Dolly “smiled and extended my right hand in friendship,” but Hillary “responded only with a glare at me.  Finally, seeing my hand still extended, she managed a grudging nod.  She did not condescend to shake my hand” (#608).  Obviously there would be little love lost between these two women in Bill Clinton’s world!  But their encounters were minimized as Bill usually attended events (such as high school reunions) without Hillary and could easily engage in various liaisons to his liking.  At the 30th reunion there occurred “the infamous scene between the two of us that was immortalized under oath in the impeachment investigation” (#935).  

Ultimately, when he was president, Bill Clinton’s sexual affairs came under increased judicial scrutiny, and Kyle (under oath in a deposition in the Paula Jones v. Clinton lawsuit) disclosed the nature of their relationship.  She had earlier discovered first-hand the malice and vindictiveness with which Hillary pursued any woman who might endanger her aspirations.  In fact, when an English journalist was about to disclose her affair with Bill during his 1992 presidential campaign, her own brother, speaking for Billy, had warned her:  “If you cooperate with the media, we will destroy you’” (#3291).  So in time she concluded:  “While proclaiming himself to be the champion of women’s rights, Billy Clinton has continually betrayed the woman he married, the girl he fathered, and the untold numbers of women he used for his sexual gratification.  Meanwhile, proclaiming herself to be the champion of women’s rights, Hillary Clinton has been behind the threats and intimidation of the women her own husband abused and molested” (#1579).  

In addition to providing details regarding Billy’s sexual misconduct, Kyle shares what she knows about the Clintons’ multifaceted adventures in Arkansas and the White House.  She discusses important personalities such as Webb Hubbell and Vince Foster (one of Bill’s childhood friends and a partner with Hillary at the Rose Law Firm).  She cynically notes that Hillary was first hired and later became a partner of the Rose Law Firm at precisely the same moments her husband became attorney-general and then governor of Arkansas!  Vince Foster “knew the facts about Hillary’s double-billing practices that had enabled her to receive questionable foreign money with strings attached” as well as the “FBI files that had been taken illegally for illegal purposes and would later be found with Hillary’s fingerprints on them” (#3525).  He knew all the details regarding the Clintons’ financial adventures.  In time, Kyle thinks, he committed suicide simply because he could not handle all the stress he experienced as a result of his work with the Clintons, dying under the weight of being betrayed by his friends.  

Dolly Kyle also conveys—as she documents the evils done by the Clintons—a deep sense of betrayal.  She feels personally betrayed, but in a larger sense she’s persuaded they have betrayed an enormous number of others and this nation itself.  Though distressingly disorganized and open to criticism for its author’s personal animosities, Hillary:  The Other Woman certainly gives us first-hand insights into the character (or lack of it) of two of the most prominent politicians of our era.  

* * * * * * * * * * * * * * * * * * * * * 

            Perhaps the best-known victim of the terrorist attacks on September 11, 2001, was Barbara Olson, the wife of the nation’s Solicitor General, Ted Olson.  Like her husband, she was a lawyer, and had served as both a prosecutor for the Department of Justice and as counsel to a House committee that investigated some of the Clintons’ scandals.  She died aboard the hijacked airplane that smashed into the Pentagon two days before her long-awaited book—ironically titled Final Days—was to be published.  She concluded that book with a solemn reminder and a warning regarding the deeply radical views of Bill and Hillary Clinton which she had earlier catalogued in Hell to Pay:  The Unfolding Story of Hillary Rodham Clinton (Washington:  Regnery Publishing, Inc., c. 1999).

          Olson’s eyes opened while investigating allegations regarding missing FBI files and the firing of White House Travel Office employees in order to give the jobs to some of the Clintons’ Arkansas friends.  Immersing herself in the witnesses’ evidence, Olson came “to know Hillary as she is—a woman who can sway millions, yet deceive herself; a woman who has persuaded herself and many others that she is ‘spiritual,’ but who has gone to the brink of criminality to amass wealth and power” (p. 2).  Olson had “never experienced a cooler or more hardened operator,” a more singularly calculating public figure, whose “ambition is to make the world accept the ideas she embraced in the sanctuaries of liberation theology, radical feminism, and the hard left” (p. 3).  Machiavellian to the core, Hillary proved herself to be “a master manipulator of the press, the public, her staff, and, likely, even the president” (p. 3).

          Intellectually gifted, Hillary attended Wellesley College in the late ‘60s.  Awash in the currents of the counterculture, she gradually embraced its radical agenda, participating in antiwar marches, defending a Black Panther murderer, and enlisting fellow students to change the world.  She was selected to speak at her commencement following an address by Massachusetts’ Republican Senator Edward Brooke.  Rather than give her prepared speech, however, Hillary “‘gave an extemporaneous critique of Brooke’s remarks’” (p. 41), rudely reproving him.  “We’re not interested in social reconstruction,” she shouted; “it’s human reconstruction” (p. 42).  Nothing less than the Marxist “new man” would satisfy her.  

          That youthful obsession, Olson argues, persisted.  Hillary found Western Civilization bankrupt, needing more than reform.  Only “remolding,” only radical new structures, can bring about the “social justice” she pursues.  Such can come only “from the top—by planners, reformers, experts, and the intelligentsia.  Reconstruction of society by those smart enough and altruistic enough to make our decisions for us.  People like Bill and Hillary Clinton.  Hillary, throughout her intellectual life, has been taken by this idea, which is the totalitarian temptation that throughout history has led to the guillotine, the gulag, and the terror and reeducation camps of the Red Guard”  (p. 311).  Overstated?  Well, Olson knew Hillary well!