153 Deconstructing The Da Vinci Code




          In an era when large numbers of people take seriously the propaganda promoted by filmmakers such as Oliver Stone and Michael Moore, I guess it’s inevitable that spurious works of fiction, such as Dan Brown’s The Da Vinci Code (New York:  Doubleday, c. 2003), could be passed off as a source for historical information about Christianity.  For over a year the book has remained at the top of best-seller lists, selling some 7 million copies in a year.  Rave reviews in the New York Times and Library Journal provided a cover of credibility for it, and many critics applauded its “impeccable” research and historical accuracy.  It’s been translated into dozens of languages, and it’s increasingly evident that its popularity resides in its appeal to people’s religious hungers as well as their thirst for an entertaining mystery. 

          As a story, The Da Vinci Code includes murder, mystery, romance, and action–necessary ingredients for a best-selling novel.  The book begins with the murder of a curator at the Louvre in Paris, Jacques Sauniere, who leaves clues regarding his killer for his granddaughter (Sophie Neveu, a police cryptologist) and a friend (Robert Langdon, a Harvard professor of religious symbology) to pursue.  In time they discover revelations in Leonardo’s “The Last Supper” and hidden “truths” in the legend of the Holy Grail and traditions preserved by the Priory of Sion.  They learn that Jesus married Mary Magdalene, and their offspring have preserved and transmitted the great truths that will infuse the new wisdom (Sophie means wisdom; Neveu means new) proclaimed by the Mother Earth paganism Brown promotes. 

          Were it merely a mystery story, it would not deserve careful scrutiny.  Indeed, many works of fiction begin with a disclaimer, indicating that all characters and incidents are sheer fictions.  But the first page of The Da Vinci Code asserts:  “Fact:  The Priory of Sion–a European secret society founded in 1099–is a real organization.  In 1975 Paris’s Bibliotheque Nationale discovered parchments known as Les Dossiers Secrets, identifying numerous members of the Priory of Sion, including Sir Isaac Newton, Botticelli, Victor Hugo, and Leonardo da Vinci.  The Vatican prelature known as Opus Dei is a deeply devout Catholic sect that has been the topic of recent controversy due to reports of brainwashing, coercion, and a dangerous practice known as ‘corporal mortification.’  Opus Dei has just completed construction of a $47 million National Headquarters at 243 Lexington Avenue in New York City.  All descriptions of artwork, architecture, documents, and secret rituals in this novel are accurate.” 

Dan Brown obviously invites readers to take his novel as a depository of historical truth.  He reinforced this in several interviews, such as the one he gave on NBC’s Today Show, where he asserted that “absolutely all” of the book’s historical data are true.  “Obviously,” he said, “Robert Langdon is fictional, but all of the art, architecture, secret rituals, secret societies–all of that is historical fact.”  On ABC’s 20/20 Brown explained his breakthrough to a new understanding about Christianity and acknowledged his sense of mission to share it with the world.  He’s a propagandist for a new faith–one that replaces “patriarchal Christianity” with an ancient “matriarchal paganism.”  Enamored with “The Age of Aquarius,” he speaks for the ’60s generation, which has promoted anti-traditional views of sex and marriage, education, ethics, religion, and Reality. 

          Inasmuch as it is a work of propaganda, one should preface any reading of Brown’s work with a warning:  Reader Beware!  The book is riddled with inaccuracies, fraudulent claims, subtle misrepresentations, and blatant lies.  We should heed Paxton Hood’s old warning:  “Be as careful of the books you read as of the company you keep, for your habits and character will be as much influenced by the former as the latter.”  So beware:  The Da Vinci Code is a propaganda piece, written by a man seeking to destroy Christianity and replace it with a religion more attuned to the feminist fantasies and postmodern prejudices he favors.  It’s a popularization of esoteric notions found in books such as Elaine Pagels’s The Gnostic Gospels, Lynn Picknett and Clive Prince’s The Templar Revelation, and Margaret Starbird’s The Goddess in the Gospels:  Reclaiming the Sacred Feminine.  Brown longs for the reestablishment of a pagan cult devoted to Mother Earth, a sexually libertine and morally permissive autonomous individualism. 

          Taking the claims set forth in The Da Vinci Code seriously, several first-rate Christian scholars have published critiques of it.  Of those I’ve read, perhaps the most thoroughly researched and blow-by-blow factual refutation is The Da Vinci Hoax:  Exposing the Errors in The Da Vinci Code, by two Catholic scholars, Carl E. Olson and Sandra Miesel (San Francisco:  Ignatius Press, c. 2004).  Brown virtually equates Christianity with Roman Catholicism and errs egregiously in many of his denigrations of that body.  For example, he often refers to the Early Church as “the Vatican,” long before that administrative center even existed!  As committed Catholics, Olson and Miesel are particularly adept at providing accurate and appropriate responses to his “central concerns, which are ideological” (p. 33). 

          Brown clearly promotes the revival of Gnosticism, a perennial ideology that promises an individualistic, generally antinomian autonomy in discerning religious truth and following one’s inner light.  Gnostics, ancient and modern, often envision God as androgynous–an amorphous blend of masculine and feminine traits, who is frequently addressed as “Mother.”  Modern Gnostics, like Elaine Pagels, celebrate some “hidden gospels,” such as the Gospel of Thomas, which they insist was embraced by significant sectors of the Early Church.  They further argue (with virtually no documentary evidence) that the Early Church was fully egalitarian, led by female as well as male bishops, until patriarchal “orthodoxy” imposed its fetters upon all claiming the name Christian.  In accord with Pagels and her cadre of disciples, Dan Brown denounces the Christian Church for suppressing the “sacred feminine” and hopes for its recovery. 

          This is most evident, Brown says, in the Catholic Church’s “smear campaign” against Mary Magdalene, who, according to his novel, is the Holy Grail.  According to Brown, Mary Magdalene was Jesus’ first and most important apostle.  They married and had children.  Following Jesus’ death, she went to France, and her descendants clandestinely transmitted the message of the real Jesus.  But as Olson and Miesel make clear, all Brown’s “facts” about Mary Magdalene are sheer fabrications, largely dreamed up by a small group of feminists chatting with each other at the Harvard Divinity School.  Throughout Church history, Mary Magdalene has in fact enjoyed high standing as a loyal disciple of Jesus, but Brown’s portrayal of her derives from a few references in apocryphal works and spurious speculations that have emerged only in recent centuries. 

          Brown misrepresents Jesus as well as Mary.  He asserts, for instance, that Christ was never considered divine until the Council of Nicaea (325 A.D.) declared Him such by a “relatively close vote.”  In fact, New Testament documents amply indicate a confidence that Jesus, the Incarnate Christ, was God’s Son.  The earliest Christian records we have, subsequent to the NT, shared the view of Ignatius of Antioch (ca. 50-117 A.D.), who wrote:  “There is one Physician who is possessed both of flesh and spirit; both made and not made; God existing in flesh; true life in death; both of Mary and of God; first passible and then impassible, even Jesus Christ our Lord” (Letter to the Ephesians, 7).  The bishops at the Council of Nicaea, by an overwhelming majority (only two of more than 200 bishops dissented, which is hardly the “relatively close” vote Brown claims) simply affirmed the deeply embedded faith of the Church.  Still more:  just as Brown misleads readers regarding the Council of Nicaea, so he maligns Constantine, the emperor who called for it.  According to the novel’s “historian,” Teabing, Constantine was a lifelong pagan who manipulated the Church to attain his own ends.  In the process he made Sunday the Christian holy day, established the NT canon to exclude rival “gospels,” and imposed the new notion that Jesus was fully divine.  Few of Brown’s assertions regarding Christ have historical merit, though many naïve readers apparently take them as true. 

          Olson and Miesel carefully investigate one of the book’s main themes, the secret messages of the Priory of Sion, obviously based upon a 1982 book, Holy Blood, Holy Grail, “co-authored by Michael Baigent, Richard Leigh, and Henry Lincoln.  So fundamental is this book to The Da Vinci Code that Dan Brown borrowed two of the authors’ names for his character Leigh Teabing” (p. 223).  Two of the authors are Masonic historians, and they promote the story of Mary Magdalene, whose alleged descendants became part of France’s Merovingian dynasty and then the Knights Templar, whose secretive operations have continued over the centuries.  Much of this material depends upon Les Dossiers Secrets, a collection of documents in the Bibliotheque Nationale purporting to establish the existence of the Priory of Sion.  In fact, Olson and Miesel show, the Priory of Sion is “a modern hoax conjured up by a Frenchman named Pierre Plantard and his associates” (p. 234).  Plantard wrote some books and appeared on major TV networks as an alleged Templar expert.  Holy Blood, Holy Grail relies extensively upon his works.  In time, however, Plantard was exposed and forced to admit that his “history” was a bundle of lies.  Dan Brown, of course, knows this.  But he perpetuates the lies because they serve his cause.

          Olson and Miesel carefully, persuasively document their refutations of The Da Vinci Code.  Footnotes, an extensive bibliography, and an index make this a most useful critique of Dan Brown’s hoax.

          A more engaging and philosophically astute critique of Brown is provided by James L. Garlow and Peter Jones in Cracking Da Vinci’s Code (Colorado Springs:  Victor, c. 2004).  Garlow earned a Ph.D. in historical theology from Drew University, and Jones has a Th.M. from Harvard Divinity School and a Ph.D. from Princeton Theological Seminary.  They are fully qualified to dissect the historical and theological claims set forth in Brown’s novel.  (By way of full disclosure, I know the authors and was part of a team that helped them prepare to write the book).  They understand that The Da Vinci Code, on a deeply spiritual level, is a cunning attack upon Christianity.  It’s designed to destroy the very foundations–Scripture, Tradition, Christ’s historical Incarnation and Resurrection–that have supported the Faith for two millennia.  Responding to fiction with some fiction of their own, Garlow and Jones skillfully involve the reader by beginning each chapter with an episode involving a modern university student struggling with some of Brown’s allegations, taught as fact in her women’s studies classes. 

          The authors particularly address “the sacred feminine,” embedded in the worship of Mother Nature, one of The Da Vinci Code’s central themes, in a chapter entitled “God’s Second Best Idea.”  Brown’s novel is, in fact, deeply sexual in its message, for it “is ultimately–when pressed to its not-so-logical conclusion–an appeal for free sex, separate from the parameters established by God” (p. 35).  The novel’s popularity, one suspects, relates to its rationalization of sex under the guise of “spirituality.”  Indeed, one of Brown’s main criticisms of the Christian Church involves her historic opposition to sexual sins.  Shamelessly misrepresenting the Church, he says she equates sex with “original sin” and thereby renders all sexual behavior shameful.  On the contrary, Garlow and Jones argue that sex is “God’s Second Best Idea” and defend the view that the very best sex is monogamous and heterosexual, gloriously in accord with the ways of creation. 

          They further argue that it is the very pagan religions celebrated by The Da Vinci Code, not biblical Christianity, that have devalued women.  The matriarchal pagan cultures Brown celebrates never existed.  They’re sheer figments of feminist fantasies.  And pagan religions, for all their goddesses and priestesses, were marked by temple prostitution, sex-selective infanticide, foot binding in China, and suttee (the burning of widows in India).  The alleged authorities Brown cites when making wild assertions, for example concerning the number of witches burned by the Inquisition, have been totally discredited by careful research.  Brown parrots the radical feminist claim that the Catholic Church killed five million female witches–a monstrous “gendercide.”  In fact, perhaps 50,000 witches (one-fourth of them male) were executed in 300 years.  Christianity, Garlow and Jones insist, has done more for women’s rights and dignity than any other religion, and both Scripture and Church history reveal how women have flourished in Christian cultures. 

          Years ago Peter Jones attended a graduate seminar at Harvard that included Elaine Pagels, and he understands her real agenda:  to reconfigure Christianity in accord with Gnostic thought.  What Harvard professors were saying 30 years ago now informs a novel read by millions!  They, as well as Pagels and Dan Brown, consider the Bible a purely human construct, not the inspired Word of God.  Thus Cracking Da Vinci’s Code contains some careful apologetics in defense of the traditional canon.  Jones and Garlow point out the remarkable similarities between the ancient heretic, Marcion, and modern thinkers like Robert Langdon in Brown’s novel.  Marcion discarded the Old Testament as well as “legalistic” sections of the New Testament and promoted a lawless spirituality that permitted the sexual license he personally relished.  In response, Tertullian denounced him as “the Pontic mouse who nibbled away the Gospels . . . abolished marriage . . . and tore God almighty to bits with [his] blasphemies” (Against Marcion). 

          Whether or not Marcion was a Gnostic we’ll never know for sure, but he certainly shared many Gnostic views.  Peter Jones has written a fine monograph on Gnosticism, The Gnostic Empire Strikes Back, and this book’s chapter comparing the Gnostic and New Testament Gospels is quite illuminating.  Jones remembers how Elaine Pagels at Harvard immersed herself in the Nag Hammadi Gnostic texts, whose most common message “is the rejection of the Genesis creation account” (p. 166).  She then published The Gnostic Gospels and vaulted into academic superstar status as a professor at Princeton University.  She considers Gnostic Christianity a viable alternative to orthodoxy, and she portrays the Gnostics as victims of a power play by the patriarchal bigots who established the Catholic Church and insisted on doctrinal conformity. 

Though once an evangelical, Pagels has recently “found a spiritual home in the Church of the Heavenly Rest in New York, led by a ‘woman priest,’ where she was able to reject the notion that being Christian was ‘synonymous with accepting a set of beliefs’ such as the Apostles’ Creed.  Pagels is also interested in the blending of Christianity and Buddhism” (p. 169).  Sharing her stance, influential feminists have celebrated the “sacred feminine” and find solace in occult texts, women’s diaries and communal experiences.  They–and Dan Brown–celebrate the ecstatic pagan mysticism featured in various Goddess cults.  Many radical feminists hunger for “‘the Neolithic, pagan, matriarchal perception of the sacred universe itself'” (p. 203).   Ancient goddesses such as Isis, Asherah, and Cybele illustrate the perennial allure of the “Great Mother.” 

“The religious worldview of The Da Vinci Code celebrates the soft, inclusive womb of the Goddess, from which everything emerges and to which it all returns” (p. 224).  The Goddess cults have recently proliferated in America, making incursions into allegedly Christian circles.  One of Hillary Clinton’s advisors in the 1990s, Jean Houston, “believes our society needs to be rebuilt through the myth of the goddess Isis and her consort Osiris” (pp. 204-205).  The Pilgrim Press, the publishing arm of the United Church of Christ, published (in 1999) a book by a “theologian/pagan priestess, Wendy Hunter Roberts, Celebrating Her:  Feminist Ritualizing Comes of Age, which says:  “‘Deep within the womb of the earth lies a memory of sacredness nearly buried under the weight of patriarchy.  …  More and more women–especially those with Christian backgrounds–are being drawn to this empowering, goddess-centered worship'” (p. 208).  Mary Daly, longtime professor of theology at Boston College and one of the founders of “Christian feminism,” has lately abandoned Christianity, but as early as 1973 she revealed her true faith by declaring, in Beyond God the Father:   “‘The antichrist and the Second Coming are synonymous.  This Second Coming is not the return of Christ but a new arrival of female presence. …  The Second Coming, then, means that the prophetic dimension in the symbol of the Great Goddess . . . is the key to salvation from servitude'” (p. 209). 

The choice we must make is simple and profound:  either pagan monism or biblical theism.  “Is God just Nature or is He the Creator of Nature?  Your answer to that question changes everything you think and do” (p. 230).  To monists, everything is ultimately the same thing.  To theists, as C.S. Lewis so wisely declared, “God is a particular Thing.”  There is an otherness to God, the Creator of heaven and earth.  He cannot be reduced to a “cosmic womb” forever spawning small segments of itself. 

Cracking Da Vinci’s Code is the most engaging and analytically successful of these critiques. 

Ben Witherington III, an incredibly prolific professor of New Testament at Asbury Theological Seminary, has published The Gospel Code:  Novel Claims About Jesus, Mary Magdalene and Da Vinci (Downers Grove:  InterVarsity Press, c. 2004).  He acknowledges:  “We are facing a serious revolution regarding some of the long-held truths about Jesus, early Christianity and the Bible” (p. 11).  He also demonstrates the affinity between Brown’s novel and two previously published works–Holy Blood, Holy Grail (1982), and Margaret Starbird’s The Woman with the Alabaster Jar:  Mary Magdalen and the Holy Grail (1993)–reminding readers “we’ve been down this road before–twice!” (p. 17). 

He then selects some errors in Brown’s novel–aspersions on the N.T. canon, claims regarding Constantine’s role in the Early Church, celebrations of Mary Magdalene, denials of Jesus’ deity–and provides scholarly refutations.  For Brown to suggest that Constantine played a role in establishing the New Testament canon ignores the fact that the New Testament’s four Gospels “were recognized as sacred and authoritative tradition by A.D. 130” (p. 23), almost two centuries before Constantine convened the Council of Nicaea!  Brown’s allegation that the Council of Nicaea “proclaimed” the divinity of Jesus “is patently false” (p. 22).  To allege, as does The Da Vinci Code, that the earliest Christian records are contained in the Dead Sea Scrolls and Nag Hammadi documents “is so false it’s what the British would call a howler,” says Witherington (p. 24). 

What Brown “fails to grasp,” Witherington notes, is “that early Christianity, like early Judaism, is not primarily about symbols and metaphors but is deeply rooted in history, including events like the exodus, the reign of King David and the life, death and resurrection of Jesus” (p. 25).  Careful reading of ancient history reveals that the four Gospels are written in the historical and biographical style of that era.  Comparing the Gospel of Thomas (a favorite source for contemporary Gnostics) with the biblical Gospels reveals a world of difference!  It’s the difference between a collage of speculative notions and an integrated, factual position.

Though Witherington’s treatise has value, it appears as if he simply plugged in some previously written essays dealing with the topics, since it seems curiously detached from The Da Vinci Code itself.  If one’s interested in Witherington’s position on issues such as The Jesus Seminar (a highly publicized Gnostic enterprise) this book is quite good.  But it’s not really a meaningful discussion of Brown’s novel! 

152 Anti-Americanism






      Jean-Francois Revel’s Anti-Americanism (San Francisco:  Encounter Books, c. 2003) provides an experienced French journalist’s explanation of a pervasive attitude that has characterized Europe’s intelligentsia since WWII.  Early in his life, in the ’50s and ’60s, Revel viewed the U.S. “through the filter of the European press” and considered it “the land of McCarthyism and the execution of the Rosenbergs (who were innocent, we believed), of racism and the Korean War and a stranglehold on Europe itself–the ‘American occupation of France,’ as Simone de Beauvoir and the Communists used to say.  And then Vietnam became the principal reason to hate America” (p. 3).  In time, having actually spent time in the U.S., researching and writing his enormously successful Without Marx or Jesus, Revel came to see that most everything he’d learned about the country was false, largely the product of the “Great Lie” fomented by Communist propaganda.  Without Marx or Jesus elicited much criticism from Leftists, who rightly discerned that the “book was less about America and anti-Americanism than about the epic twentieth-century struggle between socialism and liberal democracy” (p. 12).  Should the American way prevail, Europe’s socialist agenda would fail, so the “Blame America First” instinct became deeply ingrained in the European mind.  So journalists, who should tell the truth, generally “use their forums narcissistically to trumpet their own preconceived ideas instead of serving facts . . . betraying their public” (p. 53). 

     America, of course, has famously succeeded in virtually every way during the 20th century.  Sadly enough, it was Europeans who “invented the great criminal ideologies of the twentieth century, forcing the United States to intervene on our continent twice with her armies.  America largely owes her unique superpower status today to Europe’s mistakes” (p. 16).   Europe decayed, primarily, as a result of the “closed economies” imposed between WWI and WWII–various versions of socialism which, wherever implemented, manifestly failed “to deliver the economic goods, even minimally.”  Thereby Europe imploded, and “This weakening entailed the corresponding and virtually automatic rise of the United States” (p. 43). 

     Consequently, European intellectuals have vented a jaundiced view of all things American.  Revealingly, their main concern, following 9/11, was for endangered Muslims in America, not for Americans slaughtered by terrorists.  When U.S. troops attacked Afghanistan, Europeans denounced the action as imperialistic.  Pacifists everywhere carried banners that said:  “NO TO TERRORISM.  NO TO WAR.”  “Which is about as intelligent,” Revel notes, “as:  ‘NO TO ILLNESS.  NO TO MEDICINE'” (p. 59).  Islamic jihadists, in accord with an Osama bin Laden training manual, uphold “the ideals of assassination, bombs and destruction, to the diplomacy of the rifle and submachine gun.  The principal mission of our military organization is to overthrow the Godless regimes and replace them all with an Islamic regime” (p. 70).  But to Europe’s craven intellectuals, such terrorists should be appeased rather than resisted. 

     Unlike the intellectuals, Revel insists, Ronald Reagan got it right.  The “Evil Empire” he opposed in 1983 collapsed within a decade.  He wisely invaded and liberated Grenada, a tiny island nation overwhelmed by Cubans intent on imposing Communism.  He boldly said, in 1987:  “Mr. Gorbachev, tear down this wall,” much to the dismay of many in his own State Department, as well as assorted Europeans.  The policy of détente, so celebrated by the likes of Jimmy Carter and eminent European leaders, politely condemned millions of oppressed people to Soviet control.  Reagan’s SDI proposals were roundly ridiculed by his critics.  But “Adam Michnik, Poland’s most influential editorialist and press magnate, recalls that the Strategic Defense Initiative–the ‘Star Wars’ so decried by Western leftists–was the decisive factor in persuading the Soviets that they could never win the Cold War . . . .  The SDI was a key trigger of perestroika and the cascade of events that followed” (p. 126).  President Reagan–and now President George W. Bush–may be “simplistic,” but they rightly know there’s a difference between good and evil and act accordingly. 

     It’s clear to Revel that nothing America does could please her critics.  They’re rooted in socialism’s anti-capitalistic obsession.  “Even during the Cold War,” Revel says, “although it was the U.S.S.R. that annexed Eastern Europe, made satellites out of several African countries and invaded Afghanistan, and although it was the People’s Republic of China that marched into Tibet, attacked South Korea and subjugated three Indochinese countries, it remained dogma among Europeans–from Sweden to Sicily, from Athens to Paris–that the only power that could be fingered as ‘imperialistic’ was America” (p. 159).  Seeing what they want to see rather than reality, anti-Americans will forever detest the U.S.

* * * * * * * * * * * * * * * * * *

Paul Hollander escaped his native Hungary as the Soviets occupied it following WWII.  In the U.S. he became a professor of sociology at the University of Massachusetts and published several thoroughly researched monographs, including Anti-Americanism:  Irrational & Rational (New Brunswick:  Transaction Publishers, c. 1995, first published by Oxford University Press in 1992).  In a new introduction, he endorses the words of Eugene Genovese, a noted historian, who said:  “unable to offer a coherent alternative to capitalism as a social system, and with no socialist countries left to identify with, many left-wingers now wallow in a mindless hostility to Western Civilization and to their own identity as Americans” (p. xlvii).  Perhaps they are wallowing, but they’re hardly listless!  They have established an unusually powerful and militant “adversary culture” that dominates much of the media and academe.  Rather than honestly address the failures of Marxism as an ideology, many university professors, Hollander adds, have “immersed themselves with renewed vigor in other matters such as multiculturalism, postmodernism, critical legal theory, revisions of American history, the many branches of feminism, and so forth” (p. l). 

     These professors, such as Noam Chomsky and Howard Zinn, have become constant complaint specialists, routinely denouncing the failures and injustices of America.  They piously champion the plight of various victims, and (though highly privileged themselves) they claim to represent the underclass, the folks oppressed by the “system.”  They praise terrorists at home (the Weathermen in the ’70s) and abroad (Islamic jihadists today) so long as they attack the U.S.  Frustrated utopians, they find no goodness in the less-than-perfect world at hand.  As journalist Studs Terkel confessed, he had little interest in the “facts” about America, but wrote in accord with “‘a vision of what still could be'” (p. 49).  Accordingly, when Angela Davis (once the Communist Party’s Vice Presidential candidate, who now enjoys a prestigious appointment at the University of California, Santa Cruz) spoke to students at Dartmouth College in 1988 and condemned everything American, the students gave her a standing ovation.  It’s a bit like the heirs of a sizeable estate cheering when their deceased father is pilloried with a collage of harsh allegations. 

     Anti-Americanism, Hollander emphasizes, clearly pervades three public sectors:  1) the churches; 2) the universities; 3) the media.  Mainline churches, he shows, in a fascinating, meticulously documented chapter, have become especially anti-American.  During the past half-century, the clergy “have become the predictable voices of social criticism in American society,” and, importantly, their statements differ little from their “secular counterparts” (p. 81).  Though the “peace and justice” activists who fill the churches’ bureaucracies and pulpits and college classrooms pose as biblical prophets, they’re little more than foot soldiers in the Leftist army.  Having lost their faith in a supernatural religion, they address social and political issues in the name of an absent God.  Thus they fund gangs of thugs and terrorists, such as Yasser Arafat’s PLO, and “resource centers” that are outgrowths of the Weather Underground (p. 117).  They coddle Cuba and North Korea while condemning Chile and South Korea. 

     For me, Hollander’s discussion of the churches proves embarrassing, for he details some sobering truths about some “evangelicals” who influenced me for 20 years.  I long trusted Jim Wallis and Sojourners magazine to help me assess the social and political world.  Thus I read recommended books, written by radicals like Noam Chomsky and Richard Barnett and the brothers Berrigan.  Naively I absorbed much of their anti-American, pacifist critique.  By 1972 I’d embraced their critique of the Vietnam War–and then was utterly uninformed about the massive loss of life in Southeast Asia that followed America’s retreat.  Amazingly, Hollander shows, Jim Wallis condemned the Vietnamese who fled to the U.S. following the war.  They were, he averred, too addicted to American “consumerism” to appreciate the new society the North Vietnamese would establish!  That thousands of boat people perished rather than bow to Communist dictatorship simply didn’t meet the Sojourners criteria for “peace and justice”! 

Following President Jimmy Carter’s lead, I welcomed the fall of Somoza and the victory of the Sandinistas in Nicaragua.  But then, in the mid-’80s, I began to suspect that Jim Wallis and like-minded informants had misled me!  It dawned on me that they always gave favorable treatment to Marxist movements, whereas anti-Communists–like the Contras in Nicaragua or Pinochet in Chile–were routinely assailed.  Nicaraguan developments, as the Sandinistas slipped into the Soviet orbit, were favorably massaged by Wallis and Sojourners magazine.  In an issue devoted to a tour of Nicaragua, Hollander observes, “Every stereotyped misconception of Marxist one-party dictatorships reappeared as the authors retraced, figuratively speaking, the path traversed by their ideological forebears who had visited, in the same spirit, the Soviet Union, China, North Vietnam, and Cuba.  The new ‘sojourners’ were similarly impressed by the various accomplishments (real or claimed) of the new regime and were ready to accept all the official arguments and rationalizations regarding the less appealing aspects of life in revolutionary Nicaragua” (p. 130).  In 1987, Wallis actually compared Iran’s Ayatollahs to American Fundamentalists!  So the data in this chapter remind me of how wrong I was on many issues, largely due to believing anti-American clergy such as Jim Wallis! 

     Higher Education, Hollander shows, is as rife with anti-Americanism as the churches.  Ever anxious to re-make the world in accord with their desires, leftist professors now dominate most of the nation’s universities.  Interestingly enough, academe seems alluring to leftists.  In one study, 60% of students who considered themselves socialists were interested in becoming professors; 30% of liberal Democrats were so inclined; only 15% of the conservative Republicans aspired to academic careers (p. 151).  In part, it seems, this is because leftists see the schools as opportunities for political action, vehicles for social transformation.  This results from the “academic freedom” that now allows professors to use their lecterns as pulpits!  Before the ’60s, most professors understood that academic freedom applied to their discipline, to research and publish without fear of retribution.  But now professors are hired as feminists or Marxists and urged to promote their ideologies in whatever they teach.  Hollander’s own discipline, sociology, has especially “become a vehicle for an impassioned social criticism” (p. 156). 

     No one could consider the 70 pages of meticulously documented sources in Hollander’s chapter on Higher Education and fail to see how anti-American it has become.  He provides the data, the illustrations, the analyses, necessary to demonstrate that, as Irving Kristol wrote, “‘Never in American history have major universities been so dominated by an entire spectrum of radical ideologies as today'” (p. 146).  And what the educated elites acquire in universities is then served to the masses by the media, now controlled by the adversary culture.  Consequently, according to Meg Greenfield, journalists portray America in “surpassingly bleak” ways, suggesting the nation is “‘composed entirely of abused minorities living in squalid and sadistically-run state mental hospitals, except for a small elite of venal businessmen . . . who are profiting from the unfortunates’ misery'” (p. 215).  Inordinately large numbers (approaching 90% at points) of journalists are liberals who vote for Democrats.  They lionized Ralph Nader, Gloria Steinem, and Andrew Young but disdained Ronald Reagan and Margaret Thatcher.  Indeed, among journalists “Fidel Castro was far more popular than Reagan,” according to one comprehensive study (p. 252). 

     “Objective” reporting no longer elicits much support in the media, since it fails to unveil the hidden “structures of power and privilege” the enlightened elites discern.  Promoting one’s cause–celebrating mantras such as peace and justice, condemning the evils of racism, sexism, poverty, apartheid–is no longer restricted to editorial pages and opinion pieces.  So we find that “National Public Television refused to air a documentary made by Cuban émigré film-makers, exposing human rights violations in Cuba, unless it was paired with a reverential program made” by one of Castro’s henchmen (p. 225).  Imagine PBS insisting on a pro-KKK documentary to accompany a film on lynching in the South!  The maker of an award-winning documentary on Vietnam refused to interview refugees from Vietnam but gave ample exposure to Communist Vietnamese officials.  Hollywood films, made by Oliver Stone et al., spoon-feed little but anti-American propaganda into the mouths of a gullible public. 

     Having documented the anti-American phenomenon in three institutional settings, Hollander proceeds to demonstrate its results.  In a detailed chapter entitled “The Pilgrimage to Nicaragua,” he illustrates the power of “political tourism” to shape public opinion.  During the ’80s, the Sandinistas effectively manipulated the American public by orchestrating tours of American clergy like Jim Wallis, professors like Richard Falk, politicians like John Kerry and John Conyers.  Folks wanting to see a successful, egalitarian revolution beheld their dream world when they spent a few days in Nicaragua.  “Here,” reported one journalist, “was a place seemingly run by the kind of people who were Sixties radicals.  Wherever one went, people were young, singing political folk songs and chanting ‘Power to the People.’  One night there was even a Pete Seeger concert in town!” (p. 265).  It was, of course, all tightly orchestrated theater!  Interior Minister Tomas Borge, often a convivial tour guide, actually had two offices.  One, with Bibles, crucifixes and pictures of his family, was reserved for foreign guests.  The other, his real one, showcased pictures of Marx and Lenin.  Nicaragua’s people knew the real Borge and, given the opportunity to vote, removed him and the Ortega brothers from power.  As one voter explained:  “‘It was all lies, what they promised us’” (p. 306). 

     But the lies of the Left, repudiated in Nicaragua, still shape the “worldview of college students” in America.   Few entering freshmen identify themselves as Leftists, but they’re pressured to acclimate to an academic culture strongly tilted toward the radical Left.  They learn to feel alienated from and angry at American culture, especially in its capitalist components.  While wealthy and privileged themselves, they join their professors and pretend to identify with the world’s poor and oppressed.  Consequently, polls reveal that students in the ’80s disliked Ronald Reagan as much as Joseph Stalin.  Indeed, next to Hitler, Reagan was the most “unappealing political leader” (p. 324).  They even equated the atrocities of the Holocaust with the Vietnam War–a striking illustration of moral equivalence. 

     Much the same may be said about the Third World and Western Europe, where the Romanian-French playwright Eugène Ionesco proclaimed:  “I am one of the rare European intellectuals who has never been anti-American” (p. 367).  Heading the list of acidic anti-Americans was the English philosopher Bertrand Russell, who (in the early ’50s) compared the U.S. under Truman to Germany under Hitler, warned that Joe McCarthy would be elected President, and accused the U.S. of waging germ warfare in Korea.  In time he devoted himself to opposing both the U.S. involvement in Vietnam and any reliance upon nuclear weapons.  “Late in life Russell reached the conclusion that ‘the American government was genocidal, the police efforts pretty much on par with the camp guards at Auschwitz and black rioting a justified response to a campaign of extermination’” (p. 373).  Such animosity also characterizes Mexican and Canadian intellectuals.  According to Canadians, the most “reprehensible” political leader in modern times was Ronald Reagan, who was judged worse than “Hitler, Stalin, Idi Amin, Pol Pot, or the Ayatollah Khomeini” (p. 434). 

     Having exhaustively (in 500 pages) examined the subject, Hollander concludes that socialists cannot be other than anti-American.  That’s because, Kenneth Minogue says, socialists uphold positions that cannot “be rationally modified,” finding a moral sense of identity not in any commitment to improving man’s lot, but finding fulfillment in struggling “against the world in which they live” (p. 466).  Thus we find the influential postmodernist literary critic, Duke University Professor Fredric Jameson, denouncing America’s “counterinsurgency warfare” and “neocolonialism,” all the “while cherishing and defending memories of the Chinese Cultural Revolution!” (p. 467). 

* * * * * * * * * * * * * * *

                A syndicated columnist, Mona Charen, deals with related issues in Useful Idiots:  How Liberals Got It Wrong in the Cold War and Still Blame America First (Washington:  Regnery, c. 2003), and her notes indicate a significant reliance upon the scholarly works of Paul Hollander.   The book’s title, Useful Idiots, is allegedly a phrase used by Lenin to describe naïve Westerners who helped Communists propagandize the world, and certainly there have been legions of such folks–”liberals” in Charen’s lexicon.  She begins her treatise by endorsing Winston Churchill’s assertion, following WWII, that “the whole world [was] divided intellectually and to a large extent geographically between the creeds of Communist discipline and individual freedom” (p. 2).  Liberals in the West, she argues, have sided with the Communists for 60 years, finding ways to excuse the evils of the USSR and Red China while castigating any nation–and preeminently the U.S.–committed to free enterprise and personal liberty.  Thus, when President Reagan referred to the USSR as an “evil empire,” a prestigious history professor, Henry Steele Commager, called it “‘the worst presidential speech in American history, and I’ve read them all.  No other presidential speech has ever so flagrantly allied the government with religion.  It was a gross appeal to religious prejudice’” (p. 12). 

                To many liberals, to be an anti-communist was worse than being a communist.  Joe McCarthy was worse than Joseph Stalin!  Alger Hiss and the Rosenbergs were portrayed as victims of American bigotry.  ABC’s Peter Jennings freely labeled Cuba’s Fulgencio Batista an evil dictator, but never in 40 years did he portray Fidel Castro as equally despotic!  During the Vietnam War, the New Left solidified its support for socialism and antipathy toward America.  As Susan Sontag declared, concerning her trip to North Vietnam during the war:  “‘Vietnam offered the key to a systematic criticism of America’” (p. 40).  Charen shows how TV and major newspapers misrepresented the war, leading to America’s defeat, something much desired by the likes of Noam Chomsky, Jane Fonda and Tom Hayden.  The ’68 Tet Offensive, for example, was portrayed as a major military defeat, when in fact it was a stunning victory!  But we who watched Walter Cronkite on CBS never knew that.  Photographs were staged and quotations were manufactured by journalists like Peter Arnett in their effort to facilitate the Communists’ triumph in Vietnam. 

                Subsequently, between 1974 and 1980, ten nations, including Laos, Ethiopia, and Nicaragua, were sucked into the Communist orbit.  Though such events occurred without a single free election, “liberals persisted in the argument that they represented the popular will and took communist regimes at their word when they claimed to be pursuing the ‘people’s’ interests” (p. 82).  Influential writers, such as Edmund Wilson, considered “the USSR the ‘moral light at the top of the world’” and found no fault with its endeavors.  The 1972 Democratic presidential nominee, George McGovern, published his autobiography in 1977, proudly including photographs of himself with Fidel Castro and Vietnam’s premier Pham Van Dong.  As recently as 1985, Paul Samuelson, whose economic textbook was widely used in American universities, praised the central planning strategies of the USSR, agreeing with John Kenneth Galbraith, who lauded the Russian prosperity which manifested itself “in the appearance of solid well-being of the people in the streets” (p. 105). 

                Vis-à-vis Soviet arms and expansion, liberals consistently counseled disarmament, diplomacy, and U.N. resolutions.  So Jimmy Carter cancelled plans to build the B-1 bomber.  The “nuclear freeze” movement elicited much support.  President Reagan’s deployment of intermediate-range missiles in Europe (countering the Soviets’ SS-20s) and his proposed Strategic Defense Initiative were stridently condemned, with Senator John Kerry helping block the SDI.  Liberals opposed the higher military spending Reagan championed and did whatever they could to gut the CIA.  Mainline churches that provided “sanctuary” for refugees from El Salvador offered little comfort for folks fleeing the Sandinistas in Nicaragua.  Little Elian Gonzalez was returned to Castro’s Cuba, a comfort to liberals like Eleanor Clift, who celebrated the fact that he’d attend safe schools and have free health care, much better than in Florida.  “New York Times columnist Thomas Friedman wrote:  ‘Yup, I gotta confess, that now-famous picture of a U.S. marshal in Miami pointing an automatic weapon toward Donato Dalrymple and ordering him in the name of the U.S. government to turn over Elian Gonzalez warmed my heart’” (p. 245).  To Charen, appeasers in the Cold War, like Friedman and Clift, were as misguided as the appeasers in WWII. 

                Space precludes recounting further details set forth in Useful Idiots.  Its thesis, however, is graphically illustrated by the words of an avowed socialist, Columbia University Professor Eric Foner, a former president of the Organization of American Historians, who wrote, following 9/11, that he “wasn’t sure ‘which is more frightening:  the horror that engulfed New York City or the apocalyptic rhetoric emanating daily from the White House’” (pp. 254-55).  Useful Idiots!

151 Powell & Lodahl: PLNU Theologians






Two of Point Loma Nazarene University’s accomplished theology professors, Samuel M. Powell and Michael Lodahl, share nearly identical career paths: both attended Nazarene colleges and graduated from Nazarene Theological Seminary in 1981, received graduate degrees (Powell from Claremont Graduate School, Lodahl from Emory University) in 1988, and have recently published scholarly books on the doctrine of creation. Earlier the two co-edited Embodied Holiness (IVP), and they are clearly two of the brightest and most prolific scholars in the Church of the Nazarene. Both seek to engage contemporary thought and address it from a committed Wesleyan stance. Professor Powell tends to deal with things as a systematic theologian, carefully consulting historic thinkers and logically building his case. Professor Lodahl tends to think speculatively, weaving together theological notions in accord with his commitment to process thought. Powell stresses the Logos of God in creation, whereas Lodahl celebrates His Love.  Though I differ at points with my colleagues–as some of my illustrations reveal–I’ll try to sum up rather than critique their books, encouraging all interested in current Nazarene thought and teaching to read and evaluate them.

Professor Powell’s Participating in God: Creation and Trinity (Minneapolis:  Fortress Press, c. 2003) is “an exercise in systematic theology” that endeavors “to think about the world in a way that is scientifically responsible and also faithfully Christian” (p. xi).  He assumes the Christian faith and the contemporary scientific worldview are both (in their appropriate spheres) truthful, and he seeks to blend them.  Modern science, however, trumps certain traditional beliefs concerning God’s activity in the world, so “we may, for theological reasons, affirm that God operates in the universe through the laws and may wish to hold that God can operate directly and apart from the laws, [but] the Bible’s affirmation that all, or nearly all, events are the direct and immediate result of God’s action is best regarded as part of an ancient and (for us) incredible world-view” (p. 115).

Powell particularly stresses that the inner essence of the Trinity–one nature in three persons–provides a key to understanding the natural world.  To argue his case, he follows “the regulative, the hermeneutical, and the ethical dimensions” of the Christian faith (p. 4).  “Faith designates the regulative dimension of doctrine.  Understanding denotes the interpretative or hermeneutical dimension.  These two, together with the ethical dimension, constitute the substance of Christian doctrine” (p. 26).  The Bible and its interpretation in the Christian Tradition provide the regulative aspects of theology.  Tracing the development of Christian thinking from Irenaeus to Athanasius, Basil, and Aquinas, one finds a growing understanding of God’s involvement in His creation.  As they plumbed the depths of biblical revelation, they grasped this truth:  “We participate in the trinitarian life of God” (p. 42).

This was particularly evident in the Logos focus of St. Athanasius, the great architect of the Nicene Creed, who stressed the importance of what was called “deification.”  The Eternal Word–God of God, Light of Light–certainly became man.  But our salvation, our “deification does not occur simply by virtue of the incarnation. The grace that brings this about is received by participating in the Word ‘through the Spirit.’  It is by our participation in the Spirit that we are deified” (p. 47).  St. Thomas Aquinas embraced this truth and clarified the critical difference between the infinite Being of God and the opportunity He gives finite beings to have communion with Him.  “This distinction allowed Thomas to assert that a trace of the Trinity is found in all creatures, since they are all effects of God’s causality, but only rational creatures bear the image of the Trinity, since only rational creatures have mind, the structure of which is analogous to the Trinity” (p. 49).  Surpassing Aquinas, however, is Paul Tillich, whose views undergird Powell’s.  Tillich equated salvation with attaining a “new being” through participating with the Spirit in the “Ground of Being,” God Himself.  Tillich’s “presentation is more solidly trinitarian than is Thomas’s,” says Powell, “especially with regard to the Spirit.  Indeed, God as Spirit is the fulcrum on which Tillich’s analysis rests” (p. 54).

Having rooted his presentation in trinitarian theology, Powell then seeks to understand the universe “in a trinitarian way.”  In successive steps, he shows how the inorganic, organic, and human worlds illustrate certain trinitarian truths.  There is, for example, “persistence and change in time.”  Some things, like atoms, persist without development, simply appearing and disappearing.  Other things, like molecules, combine atoms and develop.  We, like molecules, constantly change, so “human nature is also something not yet accomplished; it is underway” (p. 70). Especially as we enter into the kingdom of God, where freedom flourishes, significant personal growth occurs and “the distorting effects of sin are being overcome” (p. 82).  

Powell especially emphasizes–in accord with “theologians of hope” like Jurgen Moltmann–man’s openness to the future and the importance of hope as a theological virtue.  In the Christian tradition, hope has primarily been rooted in the prospect of a specific person’s victory over death.  But, Powell insists:  “God’s response to the phenomenon of death” takes shape in the “metaphor” of “eternal life,” which “is not the unending continuation of human existence but must be thought of as a present reality, much as the future is to be thought of as that which presses on the present as the domain of the possible.  Eternal life is a life conducted in the face of the future of God’s kingdom and in the power of that future” (p. 84).  Precisely what this means, Powell cannot say:  “Does it mean the survival of the individual person as an individual?  A re-creation of the universe?  Or something else?  It is impossible to know” (p. 84).  In part this is because “persons” have no specific, given metaphysical “essence.”   Aquinas helpfully portrayed the Trinity as three divine persons’ “subsistent relations” within the unity of the Godhead.  Consequently “personhood,” Powell insists, “is essentially a social phenomenon” (p. 139).  We become persons as we interact with other persons.  We have no essential “self” per se, for selfhood develops through time.  This takes place much the same as “one becomes a scientist,” which involves “learning to think and practice in certain ways and . . . see some things as more important than other things.  This learning is an intensely interpersonal process.  So it is with becoming a person.  Our entry into the social order is an entry into the realm of persons and we do so by means of interpersonal interactions” (p. 142).

As we become persons we embrace the moral standards of our social world.  To be fully human is to be morally responsible.  Our foremost ethical challenge, Powell says, is to interact rightly with the world around us.  Historically, some Christians have withdrawn from the world, trying to transcend it in otherworldly, often ascetic ways.  Others have sought to embrace it, fully participating in all its endeavors, jettisoning Christian distinctives.  Powell suggests we follow a via media, taking some clues from Albert Ritschl, who called Christians to both transcend and participate in the world. To a degree, this follows Aquinas’ emphasis on the Natural Law–discerning and following the divine principles evident in the natural world.  It also fits the Protestant Reformers’ call to discover and live out one’s vocation, making the world a better place through creative work.  Participating in God means sharing His concern for His world.

* * * * * * * * * * * * * *

Professor Powell’s most recent publication is Holiness in the 21st Century: Call, Consecration, Obedience Perfected in Love (San Diego:  Point Loma Press, c. 2004).  He declares that “the doctrine of holiness is not just one important doctrine among others but that it is the center of the Christian faith.  It is the center of our faith because God is love and because there is no higher calling for human beings than to share in the nature of the God who is love” (p. 21).  Consequently, he seeks to “contribute to the discussion about holiness that has been taking place in the Church of the Nazarene since its beginning” (p. 7).

During her first 75 years the Church of the Nazarene cohesively affirmed her “cardinal” doctrine of entire sanctification.  “According to this consensus,” Powell explains, “holiness was achieved by an act of consecration, in which, in a decisive moment, one gave oneself utterly to God in an act of devotion.  The results were threefold:  first, one received the Holy Spirit in full (usually referred to as ‘the baptism in the Holy Spirit’); second, many of the effects of original sin were overcome (or ‘eradicated’); third, one began to experience perfect love for God and neighbor” (p. 8). This consensus began to unravel during the 1970s as a “relational” understanding of holiness (rooted in the thought of Martin Buber and propounded by Mildred Bangs Wynkoop and Rob Staples, former Nazarene Theological Seminary professors) challenged it.  Though the Wynkoop-Staples view has exerted considerable influence in many circles, the denomination has not fully embraced it.  Consequently, unless a better approach is found, Powell thinks the doctrine of holiness may very quickly become an interesting “fossil” without much currency in the church.

To provide a solution (fully understandable in light of his Participating in God), Professor Powell urges us to rethink our theology, to understand holiness as a mystical “participation in the trinitarian life of God and [acknowledge] that the perfection of holiness means the full actualization of this participation” (p. 17).  To join in the loving communion of the Father and Son through the Holy Spirit is to enter into the depths of divine holiness.  This means more than following some rules, more than asking what would Jesus do.  “We do not merely become like God,” says Powell.  “Instead, we abide in God and God abides in us.  Or, to use the language of 2 Peter, we become participants in the divine nature” (p. 18).  Participating in God, we should live differently, so Powell suggests some practical ramifications of this transformation.  We should deal responsibly with our wealth.  We should be pro-life, opposing abortion as a means of birth control. However, he cautions, there are no absolutes in such areas–loving God and our neighbor leads to various particular, prudential, Spirit-led decisions.  In all these areas, we cannot escape struggles, doubts, and imperfections, because “the fact is that the holy life is lived under the condition of sin” (p. 25).

The older (and to Powell no longer plausible) approach to holiness, with “its emphasis on the instantaneous character of becoming holy ignores the dynamic character of human existence” (p. 28), whereas holiness understood as participating in God, led by the Spirit, allows for constant growth with little need for defining moments of crisis experience.  Such spiritual growth requires participating in the life of the Church as well.  To Powell, the nation of Israel was holy, and so is the Church.  Within the social web of relationships, one becomes a person and learns how to live a holy life.  Indeed, “the Church, as an ensemble of relationships and mutual influencing, has a sacramental function” (p. 33).  In the Church we’re taught truthful doctrines, especially concerning Jesus.  In the Church we interact with others who illustrate the ways God works within our lives.  And in the Church we are held accountable for our decisions and development; sharing life with fellow believers makes one holy.

* * * * * * * * * * * * * *

Professor Michael Lodahl sets forth his understanding of creation in God of Nature and of Grace:  Reading the World in a Wesleyan Way (Nashville: Kingswood Books, c. 2003).  The book’s title comes from lyrics in a Charles Wesley hymn:  “Author of every work divine / Who dost through both Creations shine: / The God of nature and of grace.”  He finds John Wesley’s “hermeneutic grounded in love and a method attentive to experience” liberating, and he reads Scripture accordingly.  Though he admits to “reading Wesley as a champion of a decidedly strong version of the doctrine of divine immanence” (p. 124), Lodahl clearly seeks to accurately apply some of Wesley’s insights to very contemporary concerns.  He also acknowledges his debt to the relational theology of Mildred Bangs Wynkoop, incorporates some threads of feminist thought, and explicitly commits himself to process theology as expounded by John Cobb.  The gerunds he uses to identify the major points of his presentation–Making, Molding, Mending–indicate that everything moves, and in the midst of it “God the Weaver reaches deeply into our mothers, indeed into our great mother Earth, to knit us together with great care” (p. 18).

Lodahl first examines Psalm 104 to demonstrate how God is at work–”Making” all that is.  Neither this psalm nor other biblical passages describing creation should be taken literally:  “This ancient cosmology is beautiful, but it is not science” (p. 35).  Rather, it is poetry, allowing the reader to engage in “what Paul Ricoeur called a ‘second naivete’–a playful and imaginative reading of Scripture that frees me to stand on the Sunset Cliffs of San Diego (as I often do) and breathe deep of ‘the breath of God’” (p. 44).  God’s love-making, revealed in Scripture, Lodahl says, is necessarily non-coercive and is best illustrated by Jesus’ suffering on the Cross.  So God must have created through wooing a pre-existing chaos into creation’s evolving realms.  The creative “Word” celebrated in St. John’s Prologue “is not a coercive omnipotence unilaterally forcing the world to conform to its demands; it is, to the contrary, a vulnerable, sacrificial, and ostensibly ‘weak’ Word that invites and allures through the wooings of love” (p. 66).  This Word (Jesus Christ), “This one whom Christians confess to be the Messiah, God’s Anointed One, did not (and I believe does not and shall not) fit the description of the world-conquering, apocalyptic lord” (p. 181).  He imposes no predetermined qualities, not even goodness, on His creatures.  Creatures are not called “good” in Genesis because a good God designed them to be such.  Quite the contrary, “God does not ‘already know from eternity’ that the creatures are good; God sees (experiences?) their goodness, their fittedness to God’s creative purposes.  God, in other words, responds with approval to the world’s own response to the divine invitation to let there be” (p. 65).

Professor Lodahl’s position also entails revising the ex nihilo doctrine of creation as generally understood in the Christian tradition, which has generally pointed to a definite beginning point–much like the “big bang” of modern physics–where time and space and matter dramatically appeared.  To Lodahl, creation ex nihilo is better understood as a declaration that everything is radically dependent upon God, that nothing could be apart from Being itself. Perhaps, he says (with John Cobb), the material world and its Creator are co-eternal.  The Eternal God is thus eternally loving and shaping the world in a self-surrendering way; He influences rather than demands.  His “influence is, then, an empowering of the creature to ‘move itself,’ to exercise the agency appropriate to its capacities.  God does not ‘move’ the creature, but graciously and humbly gifts and graces the creature with the power of its own agency and integrity as a creature” (p. 98).  He is, as described in the book’s second part, gently “Molding” a material world which is, as Sallie McFague says, the “body of God.”  We must, whether thinking about God or man, reject any rigid metaphysical dualism, for a fully immanent God cannot be severed from the physical world, just as a person’s “soul” cannot be separated from his body.  Still more:  God is engaged in saving all creation, not simply human beings within it.  Thus God surely laments animal suffering, eating meat, global poverty, capitalistic consumerism, etc.

And God is also engaged (as we discover in the book’s third section) in “Mending” this not-yet-perfect world.  God “saves” us by transforming (mending) us here-and-now as we enter into a loving relationship with Him.  We should move beyond the “all-too-traditional Christian understanding of redemption, essentially gnostic in nature, that even today tends to envision salvation as the individual soul’s postmortem ascent to heaven” (pp. 222-223).  God alone is eternal, “but apparently the God and Father of our Lord Jesus Christ is not above sharing the divine life or gifting the creature with the Creator’s own Spirit, in such a way that God does not possess but instead passes on God’s own life to the creaturely, the finite, the mortal” (p. 229).  As Lodahl reflects upon broader eschatological themes, he rejects both premillennialism and postmillennialism (and amillennialism as well, I assume), for they envision a terminal “end” of history.  Rather than “simplistically” taking the Apostles’ Creed’s assertion that Jesus shall dramatically “come again to judge the living and the dead” to mean that He will split the heavens in a dramatic “second coming” moment, Lodahl asks, “what if the real ‘end’ of history, God’s most fundamental telos or purpose for our world, is the gracious (re)creation of human beings to become, in this life, creatures made, molded, and mended by divine love?  What if God’s ‘end’ for the world is that love might flourish–that we might become lovers of God and all of our neighbors? Might this provide a more adequate Wesleyan reading of eschatology?” (p. 172).  Rather than an “end” of time, the “end” will be God’s eternal mending of a world forever in process.

To join God in mending things we should address “ecology in a Wesleyan Way.”  All around us it’s evident that “human selfishness, greed, and violence–especially in tandem with industrial and technological developments of the past several centuries–have done perhaps irreparable damage to our planetary home” (p. 209).  Along with John Wesley, in his sermon “General Deliverance,” Lodahl decries the sufferings endured by the good earth and animal world.  All this is, basically, an unfortunate aspect of a world not sufficiently evolved, not yet persuaded to live in perfect harmony.  But we’re called, Lodahl says, to be “the vanguard of God’s ongoing labors to create a world of which it might be said, ‘It is very good’” (p. 198).  Shelving Wesley’s notion that man’s sin destroyed the Garden of Eden’s perfect harmony, he says:  “Whatever might be entailed in the Christian affirmation that God’s creation is good, it cannot mean that there once was a pristine, painless perfection from which the world has fallen due to human disobedience” (p. 208).  Creation always groans, as Paul says in Rom. 8:22, and we may join with him, praying, working and hoping that in time God’s perfect plan will be realized and groaning cease.

* * * * * * * * * * * * * * *

Professor Lodahl’s latest work is titled:  “All Things Necessary to Our Salvation”:  The Hermeneutical and Theological Implications of the Article on the Holy Scriptures in the Manual of the Church of the Nazarene (soon to be published by PLNU’s Point Loma Press, so I cannot provide precise page citations).  Therein he explains why the Church of the Nazarene came to her position on biblical inspiration and suggests how it applies to a proper understanding of the doctrine of creation.  The official position of the church affirms “the plenary inspiration of the Holy Scriptures . . . given by divine inspiration, inerrantly revealing the will of God concerning us in all things necessary to our salvation; so that whatever is not contained therein is not to be enjoined as an article of faith.”

This statement largely appropriated the position of the Anglican and Methodist churches, taking its final form as a mediating position between Fundamentalism (espoused by former Methodists such as J.G. Morrison, who became a general superintendent in the Church of the Nazarene) and Modernism (evident in many mainline denominations).  Professor Lodahl’s explanation of developments in the early decades of the church, moving from a simple movement’s commitments to a denomination’s creed, cogently places the Church of the Nazarene in the nation’s ecclesiastical rainbow.  He deeply appreciates and appropriately cites the “greatest” 20th century Nazarene theologian, H. Orton Wiley, who worked skillfully at the General Assembly in 1928 to steer the church away from affirming an inerrant (i.e. verbally inspired) text while retaining a deep commitment to an authoritative Bible.  The “salient additions” Wiley made in 1928 were “plenary inspiration and inerrantly,” ensuring “that the denomination would espouse the conviction that biblical authority is rooted in soteriology, or the doctrine of salvation.”  Doing so he carved out “a little bit of elbow room” for a scholarly hermeneutic that allows for a fully divine-human composition of the sacred writings.

Wiley, Lodahl argues, especially committed Nazarenes to interpreting the written word in accord with the “Living Word,” the Lord Jesus Christ.   God fully revealed Himself in His Son, and all Scripture bears witness to Him.  Thus we must not allow “usurpers” such as the Church, or the Bible, or human Reason, to displace the Living Word.   “To put it simply, Jesus Christ is God’s full and final Word, a Word uttered incarnationally.”  Consequently, “Scripture’s authority rests essentially in its capacity to testify truthfully, and therefore salvifically, to this Living Word in history.”

Given this understanding of biblical inspiration, Professor Lodahl urges us to read Genesis 1 as a declaration that God has created all that is, not a depository of scientific information concerning creation.   As Wiley said, “The Genesis account of creation is primarily a religious document.  It cannot be considered a scientific document, and yet it must not be regarded as contradictory to science.”  More importantly, approaching the text christologically leads one to root one’s doctrine of creation in John 1 rather than Genesis 1.  Consequently, Lodahl thinks “Wiley would have profited by more aggressively incorporating his appreciation for the Johannine theme of Jesus Christ as God’s Living Word into his interpretation of Genesis 1.”

Doing precisely this, Professor Lodahl applies Wesley’s “hermeneutic of love” to Genesis 1.  The great call of the Bible is to love God and neighbor.  “The church of Jesus Christ, then, reads the creation stories of Genesis in Christ, through Christ, and toward Christ–and so within the dynamic, the dynamo, of the age to come.  It is not back to Adam and Eve in a garden that Jesus’ disciples are called.  Rather, they are beckoned forward into God’s unimaginable future, foretasted now in Jesus Christ and his Spirit-breathed fellowship of the local church congregation, where ‘there is no longer Jew or Greek, there is no longer slave or free, there is no longer male and female’–and may I dare to add, ‘there is no longer creationist and theistic evolutionist’?–‘for all of you are one in Christ Jesus’ (Gal. 3:28).”

# # #

150 Appraising The Crusades






          Crusades and crusaders have lately elicited little more than antipathy and abuse.  After mentioning the need for a crusade in response to 9/11, President Bush quickly cleansed his language of all such references.  In Christian circles, where we used to sing “Onward Christian Soldiers,” any hint of Christian militancy has been suppressed.  Billy Graham once held crusades around the world but now uses less offensive terms.  For a century, Christian colleges–such as Point Loma Nazarene University, where I teach–happily embraced the crusader as a suitable mascot for athletic teams.  After all, Crusaders were brave men who risked their lives as cross-bearing pilgrims, determined to rescue the holy city of Jerusalem.  But PLNU recently discarded the Crusader logo and now portrays its representatives as Sea Lions.  (Ironically, the most notable historical reference to this creature was “Operation Sealion”–Hitler’s proposed invasion of Britain in WWII). 

          Given the Crusaders’ current disfavor, it’s important to learn a bit about their history!  As a rule of thumb, those who most detest the Crusades know the least about them!  Fortunately, there’s  been a revival of serious historical work in this area, for which Thomas F. Madden’s A Concise History of the Crusades (New York:  Rowman & Littlefield Publishers, Inc., c. 1999) provides a convenient entryway.  A chronological narrative, Madden’s account begins with “the call,” first evident in Pope Gregory VII’s proposal to send a Christian army against the Turks in 1074, just 20 years after the momentous schism between eastern and western branches of Christendom.  To Gregory this would be “an errand of mercy and an act of charity” to restore the unity of the Church (p. 7). 

          Gregory’s aspiration found a clear voice in his successor, Pope Urban II, who responded, in 1095, to the Byzantine emperor’s request for assistance with a call for the First Crusade.  Thousands of folks rallied, taking up the Cross as pilgrims, “cross bearers” determined to do penance by going to Jerusalem and restoring her sacred sites to Christian control.  Typical of feudal society, crusaders moved without much cohesion or plan.  Thus, needing supplies, they pillaged the countryside as they moved and especially alienated the Byzantines they supposedly came to rescue.  Multiplied thousands of them–especially the followers of Peter the Hermit–were slaughtered by Muslims in Turkey.  But in time a remnant of the Crusaders conquered Jerusalem in 1099 and established a Christian kingdom that lasted for nearly a century. 

          What they lacked, however, were three essentials for permanent success:  “a strong ruler, ready troops, and abundant supplies” (p. 39).  Lacking organization, the feudal states established along the east coast of the Mediterranean Sea were plagued by continual dissent.  Though many Europeans fought as Crusaders, few of them stayed–indeed virtually all the great nobles returned home as quickly as possible.  Often there were only a few hundred knights, plus a few thousand foot soldiers, in the entire region, holding out against thousands of Moslem warriors.  Losses, by 1140, led to the preaching of the Second Crusade, primarily by St. Bernard of Clairvaux, though little resulted from the campaign, which failed (in its only significant action) to take Damascus in 1148. 

          Resurgent Islam in the 1180s prompted the Third Crusade, which enlisted the largest number of warriors.  In 1187 Saladin defeated a Christian army near Nazareth at the Horns of Hattin, one of the most decisive battles ever fought.  (The real Saladin, parenthetically, bears little resemblance to his benign popular image–he was, in fact, a rather ruthless, dictatorial man.)  The Christians were decimated, and all the gains of the previous century seemed imperiled.  So Richard the Lion Heart and other European kings orchestrated “the largest military enterprise of the Middle Ages” (p. 81).  Some victories were won, but basically the Crusaders negotiated with Saladin and retired from the fray.

          Five more crusades, during the 13th century, targeted the Holy Lands.  One ended up sacking Constantinople after a complicated struggle between Latin and Orthodox forces.  Others bogged down in Egypt, and one imploded with the death of Louis IX in Tunis.  In 1291, the final fortress in the Holy Land fell, and the Crusaders withdrew.  However, conflict between Christians and Muslims continued.  Constantinople fell in 1453 and Vienna was besieged in 1529.  Crusades of various sorts still attracted followers.  But the Reformation divided the Church, and in the 16th century political divisions and loyalties became paramount.  “In that new world,” Madden says, “the crusade had no place” (p. 213). 

* * * * * * * * * * * * * * * * * *

          Madden frequently cites the work of Jonathan Riley-Smith, a professor of history at the University of Cambridge, who is generally considered the premier authority on the crusades.  Of his many monographs, the volume to consult first is What Were the Crusades? (3d ed., San Francisco:  Ignatius Press, c. 2002).  He notes that computer-aided research has especially revised historians’ understandings of the crusades.  “In particular,” he says, “I have become much more aware of the penitential element in crusading and the way it coloured the whole movement.  I now believe that it was its most important defining feature” (p. xii).  Piety–doing penance so as to receive indulgences–not avarice or ambition, prompted the Crusades.  Urban II, calling for the First Crusade, “was, in effect, creating a new type of pilgrimage, like the peregrinatio religiosa in that it was volunteered out of devotion, but also like the penitential one in that its performance constituted a formal penance and was set by him in the context of the confessional” (p. 55). 

          Waging war for pious reasons was permitted, according to Christian teaching, so long as it was a “just war.”  Rooted in the thought of St Augustine and sanctified by the approval of saintly preachers such as St Bernard of Clairvaux, this doctrine led Crusaders to consider themselves fighting to defend innocent Christians who had been violently overwhelmed by Moslems.  Urban II, calling for the First Crusade, urged his hearers to “liberate” fellow Christians suffering the tyranny of Seljuk Turkish rule, to liberate the holy shrines in Jerusalem, to love their brothers enough to lay down their lives for them.  It was a just and righteous reason to take up arms.

          Valid crusades were duly declared by legitimate authorities–another mark of a just war.  For two centuries a series of popes–many of them, beginning with Gregory VII, quite godly–urged crusaders to sally forth, doing the Lord’s work.  And Christian kings–such as the revered Louis IX–invested time and talent seeking to do it.  Crusaders fought with the assurance that the highest authorities, both spiritual and secular, supported their efforts.  Most importantly, crusading was a means of grace, placing “the act of fighting on the same meritorious plane as prayer, works of mercy and fasting” (p. 56).  This is evident in that, despite the Crusaders’ penchant for indiscriminate and ruthless violence, the “first crusaders began each new stage of the march barefooted and fasted before every major engagement.  In June 1099 they processed solemnly around the city of Jerusalem, which was still in Muslim hands,” following robed priests and singing songs (p. 58).  Crusaders went forth only because they were buoyed up by words, such as St Bernard’s:

 Go forward then in security, knights, and drive off without fear the enemies of the Cross of Christ, certain that neither death nor life can separate you from the love of God which is in Jesus Christ. . . .  How glorious are those who return victorious from the battle!  How happy are those who die as martyrs in the battle!  Rejoice, courageous athlete, if you survive and are victor in the Lord; but rejoice and glory the more if you die and are joined to the Lord.  For your life is fruitful and your victory glorious.  But death . . . is more fruitful and more glorious.  For if those who die in the Lord are blessed, how much more are those who die for the Lord! (p. 65) 

* * * * * * * * * * * * * * * * * *

          With the most recent, highly respected scholarly works in one hand, it is fascinating to pick up two books written by two notorious generalists–Hilaire Belloc and G.K. Chesterton–nearly a century ago.  The two friends–known as “Chesterbelloc” by some–shared similar interests but contrasting personalities.  This contrast appears in the books each wrote dealing with the crusades. 

          The bellicose Belloc’s The Crusades was first published in 1937 and reprinted by Tan Books and Publishers of Rockford, IL in 1992.  He wrote with the conviction that the Seljuk Turks’ victory over Byzantine Christians in 1071 at Manzikert, the incident that led to the First Crusade, would have quickly led to the conquest of Constantinople and perhaps of all Europe, had not the Crusaders rebuffed them.  “The Mongols overran, devastated, and destroyed all that land of hither Asia which had been the solid foundation of the Byzantine power; the reservoir of Byzantine landed wealth, the nursery of our religion.  The victorious Turks pillaged and killed wholesale . . . .  They so cut at the roots of all civilization that it withered before them.  Within much less than a lifetime the whole vast district of interior Asia Minor was ruined” (pp. 16-17).   At that moment, both Pope Gregory VII and Hilaire Belloc recognized:  “The issue was the life or death of Christendom” (p. 17).  And that issue still stands unresolved, for the crusades unfortunately failed to crush Islam, which remains intact and still threatens Christianity. 

          Belloc’s account mixes narrative and analysis.  “Human affairs are decided through conflict of ideas, which often resolve themselves by conflict under arms” (p. 1).  At times he provides gripping descriptions of the men and armies, terrain and cities, weapons and strategies, battles and bloodshed, which characterized the crusades.  Of the armies that assembled in the First Crusade, he writes (anticipating the conclusions of today’s historians):  “The host was essentially a host of pilgrims; the armed as well as the unarmed thought of themselves as men engaged on a pilgrimage; a journey undertaken with a religious object for its goal and under a vow” (p. 36).  

Crusade leaders such as Bohemond, Baldwin, and Tancred come to life in Belloc’s story, for they’re portrayed with a novelist’s attention to detail–physique, hair color, temperament, character.   He even provides numerical estimates (too often unmentioned by crusade historians) of the forces involved:  perhaps 300,000 people crossed over the Bosphorus into Asia, of whom some 40,000 made it to Jerusalem.  Of the 40,000, only 1,500 were knights–the virtually irresistible mounted warriors who consistently defeated far greater numbers of Islamic warriors.   Their religious fervor was clearly illustrated by the transformation that took place among the crusaders when the lance head that pierced Christ’s side was unearthed in Antioch, and by the solemn processions around Jerusalem shortly before the city was attacked.  “They went in solemn train, chanting the holy chants, from the Mount of Olives, dominating the town, round by the north and west to Zion hill; and all the walls were crowded with the Negroes and the Saracens, jeering at them and their canting–planting crosses in full sight of the Christians, which they spat upon and otherwise defiled” (pp. 113-114).  The subsequent savagery of the Crusaders’ behavior in Jerusalem was fueled by the Muslims’ sacrilege, and “a violent resistance ended in general massacre” (p. 115).  Though the valiant Tancred tried to restrain them, the Christians slaughtered the holy city’s defenders, sadly blemishing their endeavor. 

Apart from the descriptive passages–devoted singularly to the first three crusades–Belloc’s strength lies in his analysis.  He particularly emphasized how the failure to take Aleppo (mid-way between Antioch and Edessa in northern Syria) in 1097 and Damascus in 1098 ultimately doomed the Christian cause.  Because the Crusaders failed to occupy these strategic sites, Muslims controlled the important north-south road that skirts the desert east of the region’s mountains.  Had the Crusaders seized control of Aleppo and Damascus and “permanently occupied the whole maritime belt of Syria between the Mediterranean and the Desert, they would have cut Islam in two.  That is the central strategic truth of the Crusades–but they never occupied the whole” (p. 181). 

Further hindering the Crusades was the incessant internal strife and disorganization that forever limited the Christians’ efforts.  Though they allegedly came to “help” Byzantium recover her lost lands, the Crusaders all too often battled the Greeks whose objectives differed from theirs.  Crusaders constantly squabbled among themselves, even resorting to violence to resolve disputes.  Decisions were endlessly debated and delayed, simply because of the nature of Western Europe’s feudal society.  The first generation of Christians, a small minority surrounded by Greek Christians as well as Muslims, soon intermarried, and their descendants often opposed the Europeans who came to help them in later crusades. 

Thus weakened, the Crusader states were vulnerable to Saladin, the “fanatically anti-Christian” Muslim who won the pivotal battle of Hattin in 1187.  Subsequently, despite various crusades, the Muslims would control the region.  However, Belloc insists, the first century of crusading arrested the Turkish advance and saved European civilization.

* * * * * * * * * * * * * * * * * * *

Belloc’s friend, G. K. Chesterton, dealt with the Crusades in a travel book he wrote shortly after World War I, titled The New Jerusalem (Fort Collins, CO:  Roman Catholic Books, reprint of 1921 publication).  Typical of Chesterton, this book abounds with imaginative insights, unexpected correlations between modern events and ancient history, and an enthusiastic defense of Christianity accompanied by a warm-hearted critique of her foes.  Beginning his journey, for example, he noticed the cross-roads in his own village.  This reminded him that “The sight of the cross-roads is in a true sense the sign of the cross.  For it is the sign of a truly Christian thing; that sharp combination of liberty and limitation which we call choice.  As I looked for the last time at the pale roads under the load of cloud, I knew that our civilization had indeed come to the cross-roads” (p. 16). 

That civilization, of course, for centuries has battled Islam, which denies free choice!  And as Chesterton traveled to the Middle East he witnessed many manifestations of these two worldviews.  Muslims had flowed out of the desert, everywhere evident to travelers such as Chesterton, and “it is the nature of all this outer nomadic anarchy that it is capable sooner or later of tearing anything and everything in pieces; it has no instinct of preservation or of the permanent needs of men.  Where it has passed the ruins remain ruins and are not renewed; where it has been resisted and rolled back, the links of our long history are never lost” (p. 29).  They were–and are–barbarians, and “there is above all this supreme stamp of the barbarian; the sacrifice of the permanent to the temporary” (p. 67). 

These barbarians invaded Christian lands.  Whatever heroic virtues one may find in the Islamic invaders, “certainly it was Islam that was the invasion and Christendom that was the thing invaded.  An Arabian gentleman found riding on the road to Paris or hammering on the gates of Vienna can hardly complain that we have sought him out in his simple tent in the desert” (p. 34).  As invaders, the Muslims occupied territories long shaped by the Roman Empire.  Crusaders, following the challenge by the Pope in Rome, simply endeavored to restore that empire.  They were “not riding into Asia” but determined to restore lands in Asia to European control.  “In one sentence, it meant that Rome had to recover what Byzantium could not keep” (p. 208). 

As a tourist in Jerusalem, Chesterton noticed small things that reveal much larger realities.  Take for instance the fact that Muslim women wore black dresses whereas Christian women wore white.  “A stranger entirely ignorant of that world would feel something like a chill to the blood when he first saw the black figures of the veiled Moslem women, sinister figures without faces.  It is as if in that world every woman were a widow” (p. 101).  The Christian woman in Bethlehem, however, “is made to look magnificent in public.  She not only shows all the beauty of her face; and she is often very beautiful,” but she wears a jeweled crown that “can only conceivably stand, for what we call the Western view of women, but should rather call the Christian view of women” (p. 108).  The differences could not be more black and white! 

Important differences are equally evident when one compares the Medieval Crusaders with their Muslim foes.  Unfortunately, Chesterton says, the anti-Christian bias of the Enlightenment led to a hostile misrepresentation of the Crusades in 18th and 19th century novels and histories.  Such prejudice underlies the erroneous tendency to compare Crusader “intolerance” with the “toleration shown by the Moslems” (p. 261).   “In those romances the Arab is always credited with oriental dignity and courtesy and never with oriental crookedness and cruelty.  The same injustice is introduced into history, which by means of selection and omission can be made as fictitious as any fiction.  Twenty historians mention the way in which the maddened Christian mob murdered the Moslems after the capture of Jerusalem, for one who mentions that the Moslem commander [Saladin] commanded in cold blood the murder of some two hundred of his most famous and valiant enemies after the victory of Hattin” (p. 260).  This bias is evident, Chesterton notes, when writers such as Voltaire vent their hostility to the Cross by condemning–or ridiculing–those who marched under its banners as Crusaders.  All such prejudice, however, must be understood as “a prejudice not so much against Crusaders as against Christians” (p. 264). 

To confront such prejudices, Chesterton would have us recover the Medieval mind, to get back on the “right road” that led to the Crusades!  Indeed, he believed Europe (represented by English troops occupying Palestine when Chesterton visited) should re-establish Christendom in the Middle East.  Intrinsically barbarian, locked into an oversimplified theology, Islam simply cannot sustain a civilization.  So “It is now more certain than it ever was before that Europe must rescue some lordship, or overlordship of these old Roman provinces” (p. 266). 

149 The Case for A Creator





            In 1959, Chicago hosted a Centennial Celebration marking the publication of Charles Darwin’s On the Origin of Species.  Speakers like Sir Julian Huxley boldly portrayed the Darwinian theory as fully established, and Stanley Miller’s recent origin-of-life experiment seemed to prove that lifeless chemicals, properly jolted by electricity, had fused to make amino acids, the organic building blocks for proteins and thus life.  James Watson and Francis Crick had just unraveled the mystery of DNA, which promised to deliver a fully naturalistic explanation for terrestrial organisms.  Evolution through natural selection reigned as absolutely in the life sciences as did Marxism in the U.S.S.R.    

Few then would have imagined that, 40 years later, a vigorous scholarly movement labeled “Intelligent Design” would challenge Darwinian dogmas and elicit serious attention, including discussions in the New York Times and the Los Angeles Times and essays in prestigious journals such as Natural History, the publication of the American Museum of Natural History.  A fine scholarly overview of this movement is now available in Thomas Woodward’s Doubts about Darwin:  A History of Intelligent Design (Grand Rapids:  Baker Books, c. 2003), a highly readable rendition of his Ph.D. dissertation. 

            “Murmurs of dissent” from Darwinism had occasionally rippled the scientific waters, as was evident when the noted French zoologist Pierre Grasse published L’Evolution du Vivant in 1973 and boldly rejected its core concepts.  The fossil record, he insisted, holds all the evidence we have for life’s ancient history, and it reveals nothing akin to the “gradualism” basic to Darwin’s theory.   Sir Fred Hoyle, the eminent astrophysicist, evaluating the mathematical probability of life evolving through chance and necessity, concluded that it was about as likely as a tornado assembling a Boeing 747 from materials sucked up from a junkyard.  Darwinian disciples, such as Stephen Jay Gould, occasionally conceded the fossil record’s lack of gradualism–all the while devising improbable hypotheses to sustain the theory. 

But such “murmurs” hardly troubled the scientific community’s entrenched commitment to Darwinism.  “Intelligent Design” surfaced in the 1980s, Woodward says, with the revisionist scientific work of Michael Denton, an agnostic who declared:  “‘Neither of the two fundamental axioms of Darwin’s macroevolutionary theory–the concept of the continuity of nature . . . and the belief that all the adaptive design of life has resulted from a blind random process–have been validated by one single empirical discovery of scientific advance since 1859′” (p. 47, italics Woodward’s).  Indeed, as he concluded Evolution:  A Theory in Crisis:  “‘One might have expected that a theory of such cardinal importance, a theory that literally changed the world, would have been something more than metaphysics, something more than a myth.'”  But in fact, “‘the Darwinian theory of evolution is no more nor less than the great cosmogenic myth of the twentieth century'” (p. 24).         

Denton’s work was soon absorbed by Phillip Johnson, a law professor at the University of California, Berkeley, who developed an interest in Darwinism fueled by his conviction that neither the evidence nor the argumentation demonstrate its truth.  It seemed obvious to him that “metaphysical naturalism,” not empirical data, sustained the evolutionary creed.    Johnson maintains, Woodward says, that “‘Darwinism functions as the central cosmological myth of modern culture–as the centerpiece of a quasi-religious system that is known to be true a priori, rather than as a scientific hypothesis that must submit to rigorous testing'” (p. 95).  Following Johnson’s wedge in the 1990s came “the four horsemen” of the Intelligent Design movement who were just finishing their graduate studies:  Stephen Meyer, earning a degree from Cambridge University; William Dembski and Paul Nelson at the University of Chicago; and Jonathan Wells, at the University of California, Berkeley.  Linked up through the internet and scholarly conferences, they published and argued their position in collections of essays such as The Creation Hypothesis and Mere Creation.  

A well-established scholar, Michael Behe, was also drawn to the movement by his own disillusionment with orthodox Darwinism.  As a tenured biochemist at Lehigh University, he defended Phillip Johnson in a 1991 letter to the prestigious journal Science.  Then, five years later, he tossed one of two “rhetorical bombs [that] jarred the world of biological science” (p. 153).  The first, an article by David Berlinski (a Jewish mathematician) in Commentary, launched “a full-scale attack on the credibility of Darwinian evolution”; then Behe published Darwin’s Black Box, vividly and persuasively showing that tiny parts of the cell, like the flagellum, appeared to be “irreducibly complex” and thus inexplicable in Darwinian categories.  The book was reviewed in more than 100 publications and enjoyed unexpected sales. 

Like Johnson, Behe was influenced by Michael Denton’s Evolution:  A Theory in Crisis, which dealt him “the greatest intellectual shock of his life” (p. 157).  But he was also angered by the scientific establishment’s deceit in portraying (especially in school textbooks) macroevolution as demonstrably factual.  Rethinking what he knew best, biochemistry, he suspected that complicated systems, including “blood clotting, the cilium, and intracellular transport” defied Darwinian explanations.  Subsequent searches of the literature confirmed his suspicion, for he found therein a “‘thundering silence.’  Not one biochemist in the past forty years had even attempted a testable explanation for the origin of any of the systems about which he was writing” (p. 158).  Indeed, the intricate design he observed in tiny cells seemed best understood as a product of Intelligent Design rather than chance and necessity. 

Finally there’s William Dembski, with earned doctorates in both mathematics and philosophy of science, who brought intensity and high velocity intelligence to the movement.  Establishing his “explanatory filter” as a means whereby one can differentiate between events that are merely natural and those that are clearly designed, Dembski roots his presentation in advanced mathematics and probability theory.  In the words of Robert Koons, an erudite philosopher at the University of Texas, Dembski is the “Isaac Newton of information theory, and since this is the Age of Information, that makes Dembski one of the most important thinkers of our time” (p. 178).  A torrent of articles and books by Dembski, addressing both highly scholarly and lay readers, has bolstered the ID case.  

In 1966, at the age of 14, sitting in a high school biology class, Lee Strobel embraced atheism, confident that some basic truths he was learning fully justified his decision.  He took as demonstrable four propositions:  1) life had originated accidentally from a primordial soup, as Stanley Miller allegedly proved; 2) Darwin’s “tree of life” demonstrated the evolution of everything from a common ancestor; 3) Ernst Haeckel’s portraits of different embryos showed the similarity of fish, hogs, rabbits, humans, et al. at the beginning of their development; 4) a “missing link,” the archaeopteryx fossil–half reptile, half bird–validated the Darwinian hypothesis.  Alas, all those planks of his childhood atheism, Strobel says–in The Case for a Creator:  A Journalist Investigates Scientific Evidence That Points Toward God (Grand Rapids:  Zondervan, c. 2004)–have been largely refuted by recent scientific developments.  He–and many others whose atheism seemed justified by science–had based his worldview on fantasy rather than fact!  And since science should relentlessly seek truth, he wrote this book to illustrate how some eminent thinkers–loosely aligned in their support for “Intelligent Design”–find it reasonable to believe in a Creator. 

As an experienced journalist, Strobel first interviewed Jonathan Wells, the author of Icons of Evolution, whose undergraduate and graduate degrees in biology were earned at U.C. Berkeley.   Responding to a question concerning the origin of life, Wells noted that “Science magazine said in 1995 that experts now dismiss [Stanley] Miller’s experiment because ‘the early atmosphere looked nothing like the Miller-Urey simulation'” (p. 37).  In fact, doing Miller’s experiment with the chemicals now thought to have constituted early earth’s surface would produce formaldehyde and cyanide, hardly the building blocks of living organisms!  Wells also deconstructed Darwin’s “tree of life.”  Darwin himself admitted that the fossil record looked nothing like the tree he drew in The Origin of Species, but he trusted evidence would turn up in time to demonstrate it.  In fact, Wells says, fossil finds during the past 150 years “have turned his tree upside down by showing” that virtually all major forms of life appeared suddenly in the Cambrian explosion, a five million year window of time in earth’s five billion year history (p. 43).  According to one expert, “the major animal groups ‘appear in the fossil record as Athena did from the head of Zeus–full blown and raring to go'” (p. 44).  Rather than a tree, one sees something like a lawn!  The fossil record, one Chinese paleontologist asserts, “‘actually stands Darwin’s tree on its head, because the major groups of animals–instead of coming last, at the top of the tree–come first, when animals make their first appearance'” (p. 45). 

Haeckel’s embryos were Strobel’s next “facts” to fall!  It turns out, Wells says, that Haeckel forged the drawings that have been endlessly reproduced in biology textbooks!  Though some of his German colleagues asserted, in the 1860s, that the drawings were false, devout Darwinians found them helpful illustrations and continued to use them.  Eight of ten textbooks on evolutionary biology currently used by universities contain them!  On a popular level, “in 1996, Life magazine described how human embryos grow ‘something very much like gills,’ which is ‘some of the most compelling evidence of evolution'” (p. 51).  In fact, Wells says, human embryos have no gills!  What looks like gills are simply wrinkles on the neck of the tiny baby!  No less an authority than Harvard’s Stephen Jay Gould, late in life, condemned the fraudulent drawings, labeling them “‘the academic equivalent of murder'”–though he did little for 20 years to expose them. 

Finally, the fourth of Strobel’s childhood certainties, “the archaeopteryx missing link,” collapsed under the evidence presented by Jonathan Wells.  Allegedly, the archaeopteryx fossil demonstrated the transition from reptiles to birds, a basic Darwinian assumption.  Actually, we now know, it’s not a reptile at all.  “‘It’s a bird with modern feathers, and birds are very different from reptiles in many important ways–their breeding system, their bone structure, their lungs, their distribution of weight and muscles’” (p. 57).  It’s a strange looking extinct bird, to be sure, but it’s purely bird!  Even more striking, this bird, so long cited as proof for the Darwinian theory, appears much earlier in the fossil record than the alleged reptilian ancestors of birds! 

Worse yet are “missing links” such as the archaeoraptor, featuring the tail of a dinosaur and the forelimbs of a bird.  In 1998 National Geographic trumpeted that this fossil illustrated the evolution of feathered dinosaurs into birds.  Unfortunately, the fossil was a fraud–someone glued together reptile and bird fossils and sold the artifact for a tidy profit!  Indeed, fake fossils litter the paleontological marketplace!   Something of the same applies to “Java man,” a primary entry in the World Book Encyclopedia Strobel religiously read as an adolescent atheist.  In truth, the pictures of “Java man” were imaginative drawings based upon a skullcap, a thigh bone, and three teeth.   In fact, the thigh bone doesn’t go with the skullcap, which seems to be the same as that of modern humans.  “In short, Java man was not an ape-man as I had been led to believe, but he was ‘a true member of the human family.’  This was a fact apparently lost on Time magazine, which as recently as 1994 treated Java man as a legitimate evolutionary ancestor” (p. 62).  

The biological evidence set forth by Jonathan Wells finds fascinating parallels in physics and astronomy.  Allan Rex Sandage–as Edwin Hubble’s protégé, probably the world’s foremost cosmologist–declared in 1985 that he’d become a Christian, at the age of 50, because the “Big Bang” defies naturalistic explanations.  “It was my science,” Sandage said, “that drove me to the conclusion that the world is much more complicated than can be explained by science.  It was only through the supernatural that I can understand the mystery of existence” (p. 70).  Similarly, Nobelist Arno Penzias said:  “Astronomy leads us to a unique event, a universe which was created out of nothing, one with the very delicate balance needed to provide exactly the conditions required to permit life, and one which has an underlying (one might say ‘supernatural’) plan” (p. 153).  Indeed, he noted, “‘The best data we have are exactly what I would have predicted had I nothing to go on but the first five books of Moses, the Psalms and the Bible as a whole'” (p. 77). 

The implications of the Big Bang congealed for Strobel when he interviewed William Lane Craig, who unpacked the deceptively profound “Kalam” argument for God’s existence.  This involves “three simple steps:  ‘Whatever begins to exist has a cause.  The universe began to exist.  Therefore, the universe has a cause'” (p. 98).  By contrast, atheists such as Quentin Smith claim:  “‘the most reasonable belief is that we came from nothing, by nothing, and for nothing'” (p. 99).   Craig, however, cited evidence for each step in the Kalam position, responded to its critics, and established, to Strobel’s satisfaction, the validity of taking the Big Bang as a clue to the necessity of positing an eternal God presiding over the whole finite process of creation.  

An interview with Robin Collins revealed the intricate “fine tuning” of the universe, perfectly suited for life on earth, and a similar talk with Jay Wesley Richards and Guillermo Gonzalez, authors of the recently-published The Privileged Planet, demonstrated the amazing coincidence of factors that makes the earth quite special, if not utterly unique.  The acclaimed John A. O’Keefe, considered “the godfather of astrogeology,” summed it all up by declaring that it is mathematically probable that “only one planet in the universe is likely to bear intelligent life.  We know of one–the Earth–but it is not certain that there are many others, and perhaps there are no others” (p. 191).  Still more, O’Keefe said:  “We are, by astronomical standards, a pampered, cosseted, cherished group of creatures; our Darwinian claim to have done it all ourselves is as ridiculous and as charming as a baby’s brave efforts to stand on its own feet and refuse his mother’s hand.  If the universe had not been made with the most exacting precision we could never have come into existence.  It is my view that these circumstances indicate the universe was created for man to live in” (p. 191). 

Turning from the “privileged planet” to the minuscule cell, Strobel sought out biochemist Michael Behe, whose Darwin’s Black Box, David Berlinski says, “‘makes an overwhelming case against Darwin, on the biochemical level,'” arguing with “‘great originality, elegance and intellectual power.’  Added Berlinski:  ‘No one has done this before'” (p. 196).   Behe explained how a tiny bacterial flagellum moves, propelled by what one scientist calls “the most efficient motor in the universe” (p. 205).  He also explained the intricate process whereby blood clots to stop bleeding.   Just as a mousetrap illustrates an “irreducible complexity” indicating intelligent design, so too do the extraordinarily more sophisticated marvels of nature. 

            Equally persuasive of design is the presence of information in DNA.  “Human DNA,” says George Sim Johnston, “contains more organized information than the Encyclopedia Britannica.  If the full text of the encyclopedia were to arrive in computer code from outer space, most people would regard this as proof of the existence of extraterrestrial intelligence” (p. 219).   So where does that information come from?  To Stephen C. Meyer, this is the critical question.  “If you can’t explain where the information comes from, you haven’t explained life, because it’s the information that makes the molecules into something that actually functions” (p. 225).  Given its information content, it’s mathematically improbable that even a simple protein molecule could have come into being through purely naturalistic means during the limited time following the Big Bang.   And since we cannot escape concluding that information comes from a mind, we are justifiably inclined to conclude that the information pervading the cosmos is derived from a Cosmic Mind. 

            The alternative, the purely naturalistic view, strikes Strobel as “simply too far-fetched to be credible” (p. 277).   Such a position requires one “to believe that:

·         Nothing produces everything

·         Non-life produces life

·         Randomness produces fine-tuning

·         Chaos produces information

·         Unconsciousness produces consciousness

·         Non-reason produces reason” (p. 277)

Such propositions, he concluded, require a great deal of “blind faith” in the Darwinian hypothesis, taxing reason far beyond that required by Christian theism.   Indeed, we may very well be entering an era of scientific breakthroughs that restore the powerful union of faith and reason evident in great scientists of the past such as Sir Isaac Newton.        

            Strobel’s strengths lie in his journalistic skills:  he interviews some of the finest thinkers in the Intelligent Design community (including a few, such as J. P. Moreland, I’ve not mentioned), helps them clearly explain their positions in ways ordinary readers can comprehend, and adds personal touches to enhance the discussions.  His own story provides an interesting context to the presentation, but he never makes himself its centerpiece.  One closes the book with a profound appreciation for the brilliance of the men interviewed, supplemented by dozens of quotations from the world’s elite scientists, and a conviction that one is fully warranted when affirming faith in the Creator.                                                                                                                                                        

148 Solzhenitsyn’s Warnings





            During the 1970s I read most of Alexander Solzhenitsyn’s novels (One Day in the Life of Ivan Denisovich; The First Circle; Cancer Ward) as well as The Gulag Archipelago, a massive (three-volume) documentation of Soviet brutality under Lenin and Stalin, and The Oak and the Calf, an account of his struggles with censorship in the USSR.  By the decade’s end, thanks to Solzhenitsyn, I was delivered from some of the academy’s gilded portraits of the USSR and a bit better prepared to discern the Marxist rhetoric so glibly infusing many analyses of American history.  And I was also prompted to re-examine, during the next decade, America’s role in the world vis-à-vis both Communism and similarly aggressive ideologies such as Islam. 

          Recently, assessing Spanish elections, wherein a docile public wilted in the face of terrorism, I find myself thinking about, and re-reading, some of Solzhenitsyn’s addresses.  He was awarded the Nobel Prize in 1970, and his Nobel Lecture (New York:  Farrar, Straus and Giroux, c. 1972) focused on art and the role of the artist.  “One kind of artist,” markedly evident in the avant-garde individualists of the West, “imagines himself the creator of an independent spiritual world and shoulders the act of creating that world and the people in it, assuming total responsibility for it” (p. 4).  Such self-serving rebels against convention enjoy moments of fame but do little good.  The other kind, endorsed by Solzhenitsyn, rightly understands his sacred vocation and “acknowledges a higher power above him and joyfully works as a common apprentice under God’s heaven” (p. 4). 

          To work wisely and well as an artist is a truly noble calling, for as Dostoevsky said, “Beauty will save the world.”  Great art, truthful art, weathers the winds of time and gives wings to our souls.  Indeed, Plato’s “old trinity of Truth, Goodness, and Beauty” (p. 7) retains its ancient grandeur, and nothing rivals the importance of investing one’s life in illuminating and defending such transcendent realities, the “permanent things.”  Speaking personally, Solzhenitsyn noted that he miraculously survived his years in the Gulag, while thousands perished.  So he had a sacred mission:  to record, to explain, to embed their story in the nation’s literature.  “Our twentieth century has turned out to be more cruel than those preceding it, and all that is terrible in it did not come to an end with the first half” (p. 22).  Millions died because too few believed in “fixed universal human concepts called good and justice” while the oppressors disdained them as “fluid, changing,” holding that “therefore one must always do what will benefit one’s party” (p. 22).  Sadly enough, might-makes-right philosophies forever enlist devotees, and hijackers and terrorists ever wreak their carnage.  But despite the fact that (as Dostoevsky lamented) there is much “slavery to half-cocked progressive ideas” (p. 24), one must courageously seek to refute them. 

          This means refuting the “spirit of Munich” that has spread cancerously throughout the West.  That spirit, Solzhenitsyn says, “is dominant in the twentieth century.  The intimidated civilized world has found nothing to oppose the onslaught of a sudden resurgent fang-baring barbarism, except concessions and smiles.  The spirit of Munich is a disease of the will of prosperous people; it is the daily state of those who have given themselves over to a craving for prosperity in every way, to material well-being as the chief goal of life on earth” (p. 24).  He referred, of course, to the agreement Neville Chamberlain made with Adolf Hitler in 1938, declaring:  “How horrible, how fantastic, how incredible it is that we should be digging trenches and trying on gas masks because of a quarrel in a far-away country between people of whom we know nothing!”  Returning to the cheering masses in England, he proclaimed the arrival of “Peace in Our Time.” 

Replying to Chamberlain, Winston Churchill said: “I do not grudge our loyal, brave people . . . the natural, spontaneous outburst of joy and relief when they learned that the hard ordeal would no longer be required of them at the moment; but they should know the truth.  They should know that . . . we have sustained a defeat without a war, the consequences of which will travel far with us along our road.”  The next year, of course, Germany invaded Poland.  Even then, however, many Europeans sought to remain “neutral,” numbly paralyzed in their pacifism.  This, Churchill said, was “lamentable; and it will become much worse.  They bow humbly and in fear of German threats.  Each one hopes that if he feeds the crocodile enough, the crocodile will eat him last.  All of them hope that the storm will pass before their turn comes to be devoured.  But I fear–I fear greatly–the storm will not pass.  It will rage and it will roar, ever more loudly, ever more widely.” 

The ghastly carnage of WWII, of course, might have been avoided had Churchill’s warnings been heeded.  But Chamberlain’s appeasement postponed the conflict until it could only be waged against desperate odds.  Neither the League of Nations nor Europe’s politicians had the courage to resist.  So it’s up to writers such as himself, Solzhenitsyn said, to speak the truth to the world.  While struggling against the autocracy of the USSR, he’d found an international fraternity of writers who rallied to his side when Communist hardliners sought to suppress him.  His weapon, naturally, was the writer’s pen enlisted to proclaim the truth.   Tyranny thrives by lying.  Truth tellers expose and ultimately defeat the tyrants.  Writers “can VANQUISH LIES!  In the struggle against lies, art has always won and always will” (p. 33).  And so, he memorably declared in closing:  “ONE WORD OF TRUTH OUTWEIGHS THE WORLD” (p. 34). 

* * * * * * * * * * * * * *

          Exiled from the USSR soon after receiving the Nobel Prize, Solzhenitsyn found refuge in the mountains of Vermont, where he continued to write and declare the truth.  Initially lionized by the American intelligentsia, he was invited to deliver the 1978 commencement address at Harvard University, published as A World Split Apart (New York:  Harper & Row, c. 1978).  He began his speech abrasively, noting that though Harvard’s motto is Veritas, graduates would find that “truth seldom is sweet; it is almost invariably bitter” (p. 1).  But he would speak truly anyway!  And his words proved “bitter” to many who heard him! 

          After assessing various developments around the world, he questioned the resolve of the West to deal with them.  Alarmingly, he said, “A decline in courage may be the most striking feature that an outside observer notices in the West today.  The Western world has lost its civic courage, both as a whole and separately, in each country, in each government, in each political party, and, of course, in the United Nations.  Such a decline in courage is particularly noticeable among the ruling and intellectual elites, causing an impression of a loss of courage by the entire society” (pp. 9-11).  This decline, “at times attaining what could be termed a lack of manhood,” portended a cataclysmic cultural collapse. 

          Solzhenitsyn also lamented the West’s materialism, litigiousness, licentiousness, and irresponsible individualism.  Personal freedom is, of course, a great good, but irresponsible freedom erupts in evil acts and “evidently stems from a humanistic and benevolent concept according to which man–the master of this world–does not bear any evil within himself, and all the defects of life are caused by misguided social systems, which must therefore be corrected” (p. 23).  If so, it would seem that affluence would eliminate crime!  Strangely enough, however, crime was more rampant in the wealthy West than in the impoverished USSR! 

          Then he upbraided the media.  Granted virtually complete “freedom,” journalists in the West used it as a license for irresponsibility.  Rather than working hard to discover the truth, they slip into the slothful role of circulating rumors and personal opinions.  Though no state censors restrict what’s written, only “fashionable” ideas get aired, and the public is denied free access to the truth.  Fads and fantasies, not the illumination of reality, enlist the mainstream media.  “Hastiness and superficiality–these are the psychic diseases of the twentieth century and more than anywhere else this is manifested in the press” (p. 27).  Consequently, “we may see terrorists heroized, or secret matters pertaining to the nation’s defense publicly revealed, or we may witness shameless intrusion into the privacy of well-known people according to the slogan ‘Everyone is entitled to know everything'” (p. 25). 

          Solzhenitsyn was further disturbed by the widespread pessimism and discontent Westerners displayed regarding economic development.  Amazingly, elite intellectuals celebrated the very socialism that had destroyed his homeland.  (Remember that Harvard’s superstar economist, John Kenneth Galbraith, still trumpeted the virtues of socialism in the 1980s!)  This, Solzhenitsyn warned, “is a false and dangerous current” (p. 33).  In the East, “communism has suffered a complete ideological defeat; it is zero and less than zero.  And yet Western intellectuals still look at it with considerable interest and empathy, and this is precisely what makes it so immensely difficult for the West to withstand the East” (p. 55).  But the capitalist system in the West is no panacea either.  Both East and West, he said, need “spiritual” rather than “economic” development, and the spirit has been “trampled by the party mob in the East, by the commercial one in the West” (p. 57).  

          American politicians who appeased Communism especially elicited Solzhenitsyn’s scorn.  In fact, looking at the nation’s recent withdrawal from Vietnam, he said:  “the most cruel mistake occurred with the failure to understand the Vietnam War.  Some people sincerely wanted all wars to stop just as soon as possible; others believed that the way should be left open for national, or Communist, self-determination in Vietnam (or in Cambodia, as we see today with particular clarity).  But in fact, members of the U.S. antiwar movement became accomplices in the betrayal of Far Eastern nations, in the genocide and the suffering today imposed on thirty million people there.  Do these convinced pacifists now hear the moans coming from there?  Do they understand their responsibility today?  Or do they prefer not to hear?  The American intelligentsia lost its nerve and as a consequence the danger has come much closer to the United States.  But there is no awareness of this.  Your short-sighted politician who signed the hasty Vietnam capitulation seemingly gave America a carefree breathing pause; however a hundredfold Vietnam now looms over you” (p. 41).  The future he envisioned would be shaped by a “fight of cosmic proportions,” a battle between the forces of Goodness and Evil.  Those who are morally neutral, those who exult in their moral relativism, are the true enemies of mankind.   Thus, two years before Ronald Reagan was elected President, Solzhenitsyn insisted that only a moral offensive could turn back the evil empire. 

          Cowardice had led to retreat in Southeast Asia.  Democracies themselves, Solzhenitsyn feared, lack the soul strength for sustained combat.  Wealthy democracies, especially, seem flaccid.  “To defend oneself, one must also be ready to die; there is little such readiness in a society raised in the cult of material well-being.  Nothing is left, in this case, but concessions, attempts to gain time, and betrayal” (p. 45).  More deeply, the “humanism” that has increasingly dominated the West since the Renaissance explains its weakness.  When one believes ultimately only in himself, when human reason becomes the final arbiter, when human sinfulness is denied, the strength that comes only from God will dissipate.  Ironically, the secular humanism of the West is almost identical with the humanism of Karl Marx, who said:  “communism is naturalized humanism” (p. 53). 

          Consequently, he said, “If the world has not approached its end, it has reached a major watershed in history, equal in importance to the turn from the Middle Ages to the Renaissance. It will demand from us a spiritual blaze; we shall have to rise to a new height of vision, to a new level of life, where our physical nature will not be cursed, as in the Middle Ages, but even more importantly, our spiritual being will not be trampled upon, as in the Modern Era” (pp. 60-61).  This speech ended Solzhenitsyn’s speaking career in the United States.  The nation’s elite newspapers–the New York Times and Washington Post–thenceforth ignored him.  Prestigious universities, such as Harvard, closed their doors.  He became something of a persona non grata and spent his remaining 15 years in America living as a recluse, working industriously on manuscripts devoted to Russia’s history. 

* * * * * * * * * * * * * * * *

          In the years immediately prior to Solzhenitsyn’s Harvard speech, he spoke to several American and British audiences, setting forth themes summarized at Harvard.  His speeches were published in Warning to the West (New York:  Farrar, Straus and Giroux, c. 1976).  He particularly assailed the appeasement proposals of Bertrand Russell, summed up in the slogan “Better Red than dead.”  To Russell and his fifth-column ilk, Solzhenitsyn replied:  “Better to be dead than a scoundrel.  In this horrible expression of Bertrand Russell’s there is an absence of all moral criteria” (p. 119). 

          Delivering an address over BBC in 1976, Solzhenitsyn noted that “until I came to the West myself and spent two years looking around, I could never have imagined the extreme degree to which the West actually desired to blind itself to the world situation, the extreme degree to which the West has already become a world without a will, a world gradually petrifying in the face of the danger confronting it, a world oppressed above all by the need to defend its freedom” (p. 126).  “There is a German proverb,” he continued, “which runs Mut verloren–alles verloren:  When courage is lost, all is lost.  There is another Latin one, according to which loss of reason is the true harbinger of destruction.  But what happens to a society in which both these losses–the loss of courage and the loss of reason–intersect?  This is the picture which I found the West presents today” (pp. 126-127).  This predicament, he thought, proceeds from centuries of philosophical and theological development and colonial expansion. 

The First World War, culminating this process, virtually destroyed Europe, and in its wake the evils of socialism inundated Russia, annihilating 100 million or more of its people.   Europeans, eschewing moral criteria to follow narrowly pragmatic policies, stood by silently.  England’s prime minister, Lloyd George, actually said:  “Forget about Russia.  It is our job to ensure the welfare of our own society” (p. 131).  So Russia’s erstwhile “allies,” ignoring her wartime sacrifices, did nothing to stop the Bolsheviks’ triumph, tyranny, and terror.  Even as millions were executed or sent into the Gulag Archipelago, and as six million peasants died in the Ukraine in the 1930s, Westerners ignored it.  Sadly, Solzhenitsyn said:  “Not a single Western newspaper printed photographs or reports of the famine; indeed, your great wit George Bernard Shaw even denied its existence.  ‘Famine in Russia?’ he said.  ‘I’ve never dined so well or so sumptuously as when I crossed the Soviet border'” (p. 133). 

Similarly, during WWII England and the Allies benefited from Russia’s assistance.  But following the war Stalin continued, with little criticism in the West, to oppress his people.  “Twice we helped save the freedom of Western Europe,” he said.  “And twice you repaid us by abandoning us to our slavery” (p. 136).  Frankly, he believed that Westerners preferred peace and security, pleasure and comfort, to demanding justice for Russia’s oppressed.  So they ignored the mass deportations of “whole nations to Siberia” and the occupation of Estonia, Latvia, and Lithuania!  Having stopped Hitler, they seared their consciences and remained untroubled with Stalin.   

          Indeed, rather than seriously evaluating and learning from Russia’s disaster, Western intellectuals seemed (in the 1970s) willing to replicate it!  “And what we see is always the same as it was then:  adults deferring to the opinion of their children; the younger generation carried away by shallow, worthless ideas; professors scared of being unfashionable; journalists refusing to take responsibility for the words they squander so easily; universal sympathy for revolutionary extremists; people with serious objections unable or unwilling to voice them; the majority passively obsessed by a feeling of doom; feeble governments; societies whose defensive reactions have become paralyzed; spiritual confusion leading to political upheaval” (p. 130). 

          Solzhenitsyn was particularly incensed by the “misty phantom of socialism” so prevalent in places like England.  “Socialism has created the illusion of quenching people’s thirst for justice:  Socialism has lulled their conscience into thinking that a steamroller which is about to flatten them is a blessing in disguise, a salvation.  And socialism, more than anything else, has caused public hypocrisy to thrive,” enabling Europeans to ignore Soviet atrocities (p. 141).  There’s actually no logic to socialism, for “it is an emotional impulse, a kind of worldly religion,” embraced and followed with blind faith (p. 142).  As an ideology, it is spread and embraced by immature, sophistic believers. 

          The British, of course, had drifted toward socialism under the post-WWII Labor leaders.   Consequently, “Great Britain, the kernel of the Western world, has experienced this sapping of its strength and will to an even greater degree, perhaps, than any other country.  For some twenty years Britain’s voice has not been heard in our planet; its character has gone, its freshness has faded” (p. 144).  The land of Churchill had vanished!  “Contemporary society in Britain is living on self-deception and illusions, both in the world of politics and in the world of ideas” (p. 144).  What was true about Great Britain, he insisted, was equally true about much of the West. 

          As one would anticipate, Solzhenitsyn’s BBC career ended abruptly!  Neither British nor American politicians, labor leaders, professors or journalists wanted to be rebuked for their failures!  In the 1970s, neither the United Nations nor the Europeans, neither Richard Nixon nor George McGovern, neither Gerald Ford nor Jimmy Carter, neither J. William Fulbright nor John F. Kerry had the courage to oppose Communism in Southeast Asia.  Nor do many of their successors today seem ready to deal with the violence and injustices in the Middle East.   Let us, however, never say that no one warned us about appeasement’s deserts!

147 The Cornucopia of Capitalism





                As an adolescent, growing up in Stockholm, Sweden, Johan Norberg espoused anarchism, tried (with John Lennon) to “imagine there’s no countries,” and decried multinational capitalism. He longed for a world wherein folks would be free. Fifteen years later, having seriously studied economics and become a professor of that discipline, he’s still fervently committed to freedom–particularly the small but critical daily liberty to “pick and choose” what one eats, where one lives, how one works. But he’s changed his mind as to how best to extend it and has written In Defense of Global Capitalism (Washington, D.C.: Cato Institute, c. 2003). Originally published in Sweden, the book was picked up by the Cato Institute (well known for its “libertarian” economic ideals), translated into English, and published. Graphs and charts, footnotes and citations, and references to trustworthy sources indicate the book’s research foundations, but it’s engagingly written and quite understandable for anyone concerned with economics. 

            Contrary to the slogans shouted by today’s anarchists protesting “globalization” (the World Bank and International Monetary Fund, multinational corporations and international trade agreements), despite doctrinaire leftist claims that exploitation and deprivation are spreading everywhere, Norberg demonstrates that during the past three decades a transformation has taken place around the world.  People in countries such as India and China have made startling, unprecedented economic gains.  “Consumerism,” so often labeled evil by Western critics, appears to have stimulated developments that have dramatically raised the standard of living.  This took place not as a result of a socialist revolution, “but rather from a move in the past few decades toward greater individual liberty” (p. 23).   Attuned to an ancient Chinese proverb, “When the wind of change begins to blow, some people build windbreaks while others build windmills,” Norberg wants us to flow with the wind of free enterprise and make the world a better place. 

            Dealing with facts, rather than utopian fantasies, leads one to discover that “between 1965 and 1998, the average world citizen’s income practically doubled, from $2,497 to $4,839, adjusted for purchasing power and inflation” (p. 25).  Amazingly, though one would never expect it if one listened to leftist pundits and social gospel preachers, “world poverty has fallen more during the past 50 years than during the preceding 500” (p. 25).  Population has, indeed, soared during these decades, but “the number of absolute poor has fallen by about 200 million” (p. 26) because of rapid economic development.  Poverty in Asia declined from 60 to 20 percent in 20 years!  Economic growth erases poverty. 

            There’s far less hunger in the world today because “we have never had such a good supply of food” (p. 31).  This results, primarily, from the “green revolution” once strongly opposed by environmentalists who issued dire warnings as to its long-term impact.  Germ- and insect-resistant crops, better sowing and harvesting methods, and more efficient use of available water have resulted in an amazingly productive agricultural system.  Though best illustrated in the United States, the same pattern is evident world-wide.  Famines now occur less often, in large part, Norberg says, because democracies (and their freedoms for individuals) seem never to experience them.  Famines strike places like North Korea, the former Soviet Union, Cambodia, Ethiopia–all ruled by tyrants.  Dictatorships, not agricultural failures, not ecological abuses, cause famines. 

            China especially reveals the positive impact of global capitalism.  In the 1970s Deng Xiaoping “realized that he would have to distribute either poverty or prosperity, and that the latter could only be achieved by giving people more freedom” (p. 47).  Peasants were allowed to lease land, to grow and market crops for themselves.  They did so “to such a huge extent that nearly all farmland passed into private hands in what may have been the biggest privatization in history.  It paid off, with crop yields rising between 1978 and 1984 by an incredible 7.7 percent annually.  The same country that 20 years earlier had been hit by the worst famine in human history now had a food surplus” (p. 47).  Half a billion Chinese–nearly twice the American population!–climbed out of poverty simply because they could participate in a capitalistic economy.  “The World Bank has characterized this phenomenon as ‘the biggest and fastest poverty reduction in history'” (p. 48). 

            Vietnam, surprisingly, shows the same trend.  Though impoverished by its Marxist straitjacket for several decades, “Vietnam since the end of the 1980s has introduced free trade reforms and measures of domestic liberalization” (p. 133).   Exports, especially rice, have boomed.  “This has resulted in rapid growth and a uniquely swift reduction of poverty.  Whereas 75 percent of the population in 1988 were living in absolute poverty, by 1993 this figure had fallen to 58 percent, and 10 years later had been reduced by half, to 37 percent; 98 percent of the poorest Vietnamese households increased their incomes during the 1990s” (p. 133).   Similar currents have streamed through India and South Korea.  In the 1960s South Korea was poorer than Angola, but today it’s the world’s 13th largest economy.  Conversely, North Korea, sentenced to the nightmare endemic to Communism, sank even deeper into the pit of deprivation and desperation. 

What huge investments in “foreign aid,” what highly touted “compassionate Christian ministry” endeavors failed to significantly impact, an unleashed free enterprise capitalism accomplished in two decades!  There are, manifestly, many global inequities.  But “the fantastic thing,” Norberg says, “is that the spread of democracy and capitalism has reduced them so dramatically” (p. 61).  Wherever government steps aside and lets individuals flourish, they freely invest and innovate and forge associations that precipitate prosperity.  In such free systems, folks like Bill Gates will, of course, become fantastically wealthy.  But the system that sustains them also provides a rising tide that lifts everyone’s boat.  If my income doubles in two decades, while Gates’ quadruples, why should I complain?   Unless I’m consumed by envy, I won’t!    In a capitalist system, the “poor benefit from growth to roughly the same extent and at the same speed as the rich.  They benefit immediately from an increase in the value of their labor and from greater purchasing power” (p. 81).  It’s obvious that capitalism accentuates inequalities.  But this occurs not because capitalism makes some folks poor, but because it makes “its practitioners wealthy.  The uneven distribution of wealth in the world is due to the uneven distribution of capitalism” (p. 154). 

            Such a capitalistic order “requires people to be allowed to retain the resources they earn and create” (p. 66).  Private property, so demonized by socialistic thinkers, proves to be the essential linchpin for widespread economic development.  The folks at the bottom benefit the most from private property.  “The Peruvian economist Hernando de Soto has done more than anyone else to show how poor people lose out in the absence of property rights” (p. 91).  Conversely, public spending–even on behalf of the poor–ultimately harms its intended beneficiaries.  Taking from the rich to enrich the poor harms the poor.  Do-gooders, especially the enlightened elites who direct the welfare state and feel highly righteous in distributing the dole, feel good about themselves but actually do little good!  In Asia, where poverty has declined so rapidly, there has been almost no “redistribution” of wealth, no “social justice”–only millions of free people lifting themselves up!  East Asia’s “miracle shows an open, free-enterprise economy to be the sine qua non of development” (p. 103).  Several million ordinary people, pooling their wisdom in the marketplace, working and saving, buying and selling, investing and losing investments, prove far more prescient and trustworthy than a few dozen bureaucrats orchestrating a planned economy. 

The division of labor basic to capitalism means that “one hour’s labor today is worth about 25 times more than it was in the mid-19th century.  Employees, consequently, now receive about 25 times as much as they did then in the form of better pay, better working conditions, and shorter working hours” (p. 68).  The alleged “victims” of multinational corporations, workers in “the poorest developing countries” employed by American-affiliated companies like Nike, earn “eight times the average national wage!” (p. 217).  Unlike the “sweatshops” denounced by Leftists marching in the streets, American factories in the Third World pay their employees handsomely and contribute to the rapid development of once impoverished lands. 

In short:  the world is, in fact, much better than it was a century ago.  And it’s almost exclusively the result of the spread of democracy and free enterprise. 

* * * * * * * * * * * * * * * *

Norberg’s work confirms the portrait painted in It’s Getting Better All the Time:  100 Greatest Trends of the Last 100 Years by Stephen Moore and Julian L. Simon (Washington, D.C.:  Cato Institute, c. 2000).  Moore took the economic data collected by the late Julian Simon, a noted economist, and distilled it (with colorful charts) to illustrate the book’s thesis:  “Freedom works” (p. 12).  Simon became somewhat notorious for publicly challenging various “doomsayers,” most notably the alarmists trumpeting the environmental crisis.  In 1980 he challenged Paul Ehrlich to put his money where his mouth was:  to wager $1000 on his pessimistic predictions.  “A few years before that Ehrlich wrote:  ‘I would take even money that England will not exist in the year 2000.’  He wrote in 1969, on the eve of the green revolution, that ‘the battle to feed humanity is over.  In the 1970s the world will undergo famines.  Hundreds of millions of people will starve to death.’  Although Professor Ehrlich continues to make dimwitted statements like this, he is still taken quite seriously by the American intelligentsia.  He even won a MacArthur Foundation ‘genius’ award after he made these screwball predictions” (pp. 20-21). 

But Simon, unimpressed with Ehrlich’s Stanford credentials and bombastic assertions, dismantled his façade.  Setting forth the terms of his wager, he allowed Ehrlich to choose any five natural resources that he thought would become more expensive in the next 10 years.  “By 1990 not only had the optimist (Simon) won the bet, but every one of the resources had fallen in price, which, of course, is another way of saying that the resources had become less scarce, not more” (p. 20).  Things were, by every measurable index, getting better.  The 20th century also witnessed incredible economic and political advances.  The authors contend, “there has been more improvement in the human condition in the past 100 years than in all of the previous centuries combined since man first appeared on earth” (p. 1).  Blessed with such improvements, many of us fail to realize how significant they are.  “No mountain of gold 100 years ago could have purchased the basics of everyday life that middle-income Americans take for granted in 1999” (p. 6).  Underlying this spectacular development, and largely explaining it, are three things:  electricity, modern medicine, and the microchip. 

Of the 100 positive trends the book highlights, increased longevity is one of the most impressive.  Since the beginning of the industrial revolution, life expectancy has doubled, perhaps “‘the greatest miracle in the history of our species'” (p. 26).  Americans live 30 years longer than they did in 1900.  In China, life expectancy in 1950 was 40 years; today it’s 63, a gain of more than 50 percent in 50 years.  Infant mortality has sharply declined.  Deadly diseases, such as tuberculosis, smallpox, and diphtheria, have been largely eliminated.  Miracle drugs, cures for cancer, and treatments for heart disease all make for longer lives and freedom from killer diseases. 

Contrary to Malthusian predictions, we now have more food and less threat of famine than ever, despite the globe’s population growth.  Today’s farmer produces 100 times as much food as did his counterpart a century ago.  Prices for food have declined steadily.  Wealth, rather than shrinking as more people share the planet, has dramatically increased.  The true “wealth” is human ingenuity.  The world’s resources are not a finite pie, demanding that it be cut and divided in ever-smaller portions.  True wealth is the result of creative persons finding ever better ways to live.  So more and more people have been getting more and more wealthy. 

            For all the good news contained in the book, it’s also obvious that the 20th century was, in some respects, the worst of all centuries.  Multiplied millions of people died in wars–and four times as many were liquidated by totalitarian governments.  Somewhere between 150 and 200 million innocent folks were sacrificed on the altars of (largely socialist, whether fascist or communist) ideology (p. 16).  These ghastly evils were done, almost exclusively, by regimes that deprived people of individual freedom.  Despots determined to dictate economic systems, always for “the good of the people,” launched their programs by confiscating guns and restricting free speech, by stamping out the free press and restricting the opportunity to worship God.  Virtually “every great tragedy of the 20th century has been a result of too much government, not too little” (p. 15). 

The book is succinct, clear, persuasive.  As Lawrence Kudlow, the chief economist for CNBC, writes:  “This book is so chock full of good news that it’s virtually guaranteed to cheer up even the clinically depressed.  Moore and Simon dismantle the doomsday pessimism that’s still so commonplace in academia and the media.  The evidence they present is irrefutable:  Give people freedom and free enterprise and the potential for human progress is seemingly limitless” (back cover). 

* * * * * * * * * * * * * * * * *

The libertarian humorist P.J. O’Rourke provides much the same evidence in Eat the Rich (New York:  Atlantic Monthly Press, c. 1998).  “I had one fundamental question about economics,” he says at the beginning of his book.  “Why do some places prosper and thrive while others just suck?  It’s not a matter of brains.  No part of the earth (with the possible exception of Brentwood) is dumber than Beverly Hills, and the residents are wading in gravy.  In Russia, meanwhile, where chess is a spectator sport, they’re boiling stones for soup” (p. 1).  Why?  It’s a good question!  And it’s a question O’Rourke clearly answers:  free people, under the rule of law, prosper. 

            Reared in a privileged American home, O’Rourke went off to college, where he and his peers imbibed the intellectual currents of the ’60s, posed as hippies, and styled themselves “Marxists” without much of a clue as to what that entailed.  In time, he became a journalist, started thinking like an adult, and began to notice, as he traveled the globe in the 1990s, the importance of economics.  Curious, he pulled out the economics textbook he’d been assigned in college, Samuelson and Nordhaus’ Economics, widely used throughout the country for 40 years.  “Professor Samuelson,” O’Rourke discovered, “turns out to be almost as much of a goof as my friends and I were in the 1960s” (p. 8).  To Samuelson, Karl Marx was “the most influential and perceptive critic of the market economy ever” (p. 8), and Samuelson blessed Marx’s memory by embracing his theories, arguing that socialist improvements to the American economy would make life better for all concerned.   

            Having personally witnessed Marx’s influence in various world areas, O’Rourke resolved to discard Samuelson–and fellow travelers like John Kenneth Galbraith–and find better answers to his questions in countries that have embraced either capitalism or socialism.  He discovered that there can be “good capitalism,” like that found on Wall Street, largely responsible for America’s amazing prosperity.  There can be “bad capitalism,” such as developed in Albania following the collapse of Enver Hoxha’s tyranny.  Albanians in the 1990s were “free,” but not doing well.  “The Albanian concept of freedom approaches my own ideas on the subject, circa late adolescence.  There’s a great deal of hanging out and a notable number of weekday, midafternoon drunk fellows” (p. 47).  But not much productive labor!  Lots of freedom, but little enterprise!

            In Sweden, O’Rourke checked out what’s often called a “good socialism.”  One can do very little and get quite a lot in this workers’ paradise.  A mere 2.7 million of the 7 million Swedes work to pay for folks getting benefits or working for the government.  Unfortunately, bills come due in time.   The nation’s economy is slowly shrinking.  In 1950 the nation was among the richest on earth.  Swedes were taxed at about the same rate as are Americans today, with the government spending 31 percent of GDP.  Then came the socialist takeover, when the welfare state replaced capitalism.  Productivity slipped.  Crime boomed.  The Swedes mortgaged the future and bought momentary comfort.  But the good times will end, O’Rourke predicts, and cracks in the social fabric indicate that the end may be near at hand.

            Checking out Cuba, O’Rourke found a “bad socialism.”  Everything seems shattered by Castro’s revolution.  Simply looking out his hotel window, he saw “holes in everything:  holes in roofs, holes in streets, holes where windows ought to be” (p. 77).   The island looks war-ravaged, and the people seem shell-shocked into silence.  The fading beauty of Havana, where folks were in fact rather free under Batista, gives witness to the losses Cuba has suffered.  The inescapably totalitarian aspects of socialism manifest themselves in Castro’s “paradise.”  The residue–or the debris–of socialism now litters Russia a decade after the “collapse” of communism.  O’Rourke noted that Russians were certainly more active and alive than in the 1980s, but the “system” still hardly works.  Notably absent, he says, is the rule of law.  So “businessmen” behave more like thugs than entrepreneurs.  “What would be litigiousness in New York is a hail of bullets in Moscow.  Instead of a society infested with lawyers, they have a society infested with hit men.  Which is worse, of course, is a matter of opinion” (p. 129).  The rampant corruption, he believes, is directly tied to Marx and Lenin, the men who laminated their amoral, nihilistic worldview onto the nation. 

            The African nation of Tanzania, O’Rourke says, illustrates “how to make nothing from everything.”  It’s one of the world’s truly impoverished nations.  By comparison, “Papua New Guinea is almost ten times more prosperous, never mind that some of its citizens have just discovered the wheel” (p. 166).  There’s plenty of arable land and abundant natural resources.  The people were little affected by European colonialism and have suffered few wars.  What went wrong is attributable to Julius Nyerere, the celebrated “teacher” who led the nation for nearly three decades.  He imposed a stern, rigorously egalitarian collectivism, styled “familyhood,” designed to make Tanzania a peoples’ paradise.  Everything’s regulated, everything’s prescribed by government.  To be blunt:  Tanzania is poor because Nyerere and his socialistic enthusiasts “planned it” (p. 175). 

            By contrast there’s Hong Kong, which demonstrates “how to make everything from nothing.”  One of the best examples of laissez-faire economics, Hong Kong’s British colonial government did little but “keep the peace, ensure legal rights, and protect property” (p. 199).  Individuals took the initiative and fueled an economic “miracle.”  “With barely one-tenth of 1 percent of the world’s population, Hong Kong is the world’s eighth-largest international trader and tenth-largest exporter of services” (p. 205).  What will happen with its absorption by mainland China, of course, remains to be seen.

            Summing up his discoveries in economics, O’Rourke admits it’s pretty much what his parents told him before he went off to the university:  “Hard work, Education, Responsibility; Property rights; Rule of law; Democratic government” (pp. 233-34) ensure economic prosperity.  Especially important is the rule of law, for rampant freedom (as in Albania) or rampant crime (as in Russia) prevents economic development.  People will work hard, save, invest, risk, and innovate only when the law protects their property.  All in all–Professor Samuelson notwithstanding–Adam Smith was right:  the free market provides the best for the most. 

            Discerning, clear-headed, witty and understandable, O’Rourke’s treatise provides a remarkably astute world tour of diverse economies, locating their sources and detailing their consequences.  Fun to read, but memorable in its message! 

146 Child Care? Who Cares?

“Train up a child in the way he should go:  and when he is old, he will not depart from it” (Proverbs 22:6).   Caring for children ever characterizes healthy cultures.  Even “primitive” cultures invested much in rearing the coming generation–as evident in an Iroquois tradition that encouraged folks to consider the next seven generations when charting tribal policies.   If you want to make a “good society” you need to rear “good kids.”  Robert Coles, long-time Harvard professor and highly-regarded authority on children, says:  “Good children are boys and girls who in the first place have learned to take seriously the very notion, the desirability, of goodness–a living up to the Golden Rule, a respect for others, a commitment of mind, heart, soul to one’s family, neighborhood, nation–and have also learned that the issue of goodness is not an abstract one, but rather a concrete, expressive one:  how to turn the rhetoric of goodness into action, moments that affirm the presence of goodness in a particular life lived” (The Moral Intelligence of Children).

Given such ancient wisdom, and given that nearly everyone acknowledges the need to rear “good” children, children’s current conditions–as documented in several recent studies–should concern us all.  Robert M. Shaw, M.D., a child and family psychiatrist who once taught at the Albert Einstein College of Medicine, now directs the Family Institute of Berkeley, California, and maintains his psychiatric practice.  He has recently published, with Stephanie Wood, The Epidemic:  The Rot of American Culture, Absentee and Permissive Parenting, and the Resultant Plague of Joyless, Selfish Children (New York:  ReganBooks, c. 2003).  Though written without any clear religious commitment, the book echoes profoundly religious themes; coming from a writer comfortably settled in the liberal environs of Berkeley, it champions a thoroughly conservative message.

The book’s lengthy subtitle encapsulates its message, and Shaw writes with a deep sense of outrage at the ways parents, for the past 30 years, have failed their kids.  He claims the killings at Columbine High School in Littleton, Colorado, “did not surprise” him.  Hardly “an aberration,” killers Eric Harris and Dylan Klebold simply demonstrated what one would expect to result from “the childrearing attitudes and practices that have spread like a virus from home to home in this country” (p. x).  Spending time with youngsters–or merely walking through a shopping mall–should alert us to their sullen, angry, whining, self-absorbed attitudes, ample “signs that our society has become toxic to children” (p. xi).

The big problem, as James Dobson indicated long ago, is parents’ failure to discipline their children.  In truth, “No!” is a good word!  Kids need boundaries, limits, restrictions.  They actually welcome “limits on when they go to bed, when they do their homework, when they watch TV, what they eat, who they play with.  And they thrive in tightly managed environments” (p. 129).  Permissive parenting is poor parenting!  “When parents let a child run wild, they are in fact abandoning him” (p. 147).  Without careful guidance, Shaw says, children fail to develop into caring, sensitive adults.  But because they spend so much time away from their kids, today’s parents internalize a great deal of guilt and are overly anxious to please rather than direct their offspring.  They even try to be friends with their youngsters, consulting rather than correcting them.  Whenever a mom or dad tells a child “Let’s go” and appends an “OK?”, there’s a problem!  Adults, not children, must make such decisions.

Parents have also allowed themselves to be brainwashed by “the parenting gurus who preach child-centric theories, asserting:  ‘Never let your baby cry,’ ‘He’ll use the potty when he’s ready,’ ‘Discipline is disrespectful,’ ‘The child’s feelings should come first'” (p. 15).  And when ill-disciplined kids get out of control, there’s always Ritalin and Prozac, which doubled in usage within a single decade.   All sorts of verbal evasions proliferate like crab grass!  Kids are called “difficult,” “oppositional,” “high-maintenance,” etc.  In fact, they’re just spoiled!  Rather than dealing with the real issues, the “experts” have simplistically prescribed a singular cure:  self-esteem!  Whatever’s wrong, self-esteem will correct it!  Bumper stickers and awards ceremonies, incessant praise and mandatory applause, all seek to make children “feel good” about themselves.  A sense of “self-esteem,” it’s said, develops when kids enjoy incessant approval.  Nonsense! says Shaw.  The self-esteem peddled by “pop psychologists is nothing less than self-worship, narcissism,” and it sizably contributes to the many problems youngsters struggle with.  Real self-esteem, on the other hand, is a by-product of authentic accomplishments.  Actually scoring a touchdown–not getting praised for trying–gives one self-esteem.   Just do something worthwhile, something good, and forget the smiley faces.

Doing things means viewing less TV.  Watching too much, and thinking about it too little, proves toxic to youngsters.  Most kids are mostly unsupervised as they absorb anywhere from 20 to 50 hours of programming each week, much of it whetting appetites for consumption, sex, and violence.  Consequently, they read less and learn less, have fewer friends, and like their parents less.  They are also much more discontented with things in general.  Shaw urges parents to monitor and control their children’s TV time.  The medium–like computers and music–has much to offer.  But we need to choose what’s right and protect our kids from what’s wrong.

What children most need isn’t more TV or awards or drugs but, rather, more parental care.  As John Locke observed, centuries ago, “Parents wonder why the streams are bitter, when they themselves have poisoned the fountain.”   Especially in the early years, a baby needs a mother’s arms and words.  “She alone has that unique instinctual drive that prepares her to engage in a developmental dance with her newborn” (p. 26).  Without what Shaw calls “motherese” during a baby’s first two years, his cognitive and emotional development suffers.   “This incredible relationship between mother and child is absolutely unique, the single most sacred thing in our culture” (p. 34).  And yet, amazingly enough, this “sacred thing” has been ruthlessly assailed and ridiculed, rejected by powerful elites in this country.

Those who have urged women to pursue full-time careers–feminists of all shades urging women to ignore their own inner promptings–have created a world profoundly hostile to children’s well-being.  Truth to tell, institutionalized childcare is mainly defended by those who place parents’ concerns above children’s.  Considerable dishonesty pervades the social sciences, where studies are hyped or ignored according to their support of working mothers and day schools.  Two parents, both pursuing careers full-time, Shaw insists, can hardly provide “the optimum environment for raising children” (p. 80).  He writes with deep conviction, for his life has been spent dealing with “anguished parents and their children, and I can tell you this much is true:  at least one of the parents has to make raising the children the top priority” (p. 82).  Anything less puts kids in harm’s way.  There’s much harm, for example, in childcare.  The more time a child spends in childcare facilities, the less closely he will bond with his mother–and the more behavioral problems he will have thereafter.

Shaw’s contentions are buttressed by Brian C. Robertson’s Day Care Deception:  What The Child Care Establishment Isn’t Telling Us (San Francisco:  Encounter Books, c. 2003).   This is a modest updating of his earlier publication, Forced Labor:  What’s Wrong with Balancing Work and Family (Dallas:  Spence Publishing Company, c. 2002).  Robertson works as a research fellow at the Family Research Council’s Center for Marriage and Family, and he edits the Family Policy Review.  He argues that young children need constant, loving, motherly attention; a healthy attachment, early established, enables babies to develop well.  No paid substitutes can actually “mother” a baby.  “As G.K. Chesterton remarked over eighty years ago, ‘If people cannot mind their own business, it cannot possibly be more economical to pay them to mind each other’s business, and still less to mind each other’s babies'” (p. 154).  That truth, however, has been systematically denied and rejected by the elites who shape public opinion and establish public policy.  Consequently, more and more children suffer a variety of behavioral problems that ultimately affect American culture.

Basic to Robertson’s case is “attachment theory,” best represented by the noted psychologist John Bowlby and popularized by Benjamin Spock, who urged moms to stay home with their children as much as possible until they were at least four years old.  To separate a child from his mother was widely understood to endanger the child’s well-being.  During the past 30 years, however, vigorous critics have denied attachment’s import.  Though no evidence supported their case, the critics basically silenced (through intimidation) the attachment theorists.  Consequently, Dr. Spock’s 1992 edition of Baby and Child Care says nothing about the need for any infant-mother attachment and even encourages parents to elevate self-fulfillment over concern for children.  Explaining his radical about-face on this issue, Spock said that too many women “pounced” on him and blamed him for making them feel “guilty.”  Convinced they would work whether he approved or not–and unable to withstand feminist wrath–he says:  “I just tossed it.  It’s a cowardly thing that I did; I just tossed it in subsequent editions” (p. 73).

Spock represents the almost universal capitulation of elite academic and media “authorities” on childcare.  They deny the data Robertson presents, which is, indeed, alarming.  It’s evident that professors and journalists care much more for their agenda than for the truth.  In Bernard Goldberg’s lengthy experience as a journalist, he witnessed the success of feminists, who “are the pressure group that the media elites (and their wives and friends) are most aligned with.”  Consequently, “America’s newsrooms are filled with women who drop their kids off someplace before they go to work or leave them at home with the nanny.  These journalists are not just defending working mothers–they’re defending themselves” (Bias, 163, 178).  This explains why “research” justifying day care for kids gets prominent exposure, whereas equally valid “research” condemning it is rarely reported.

The same holds for professors in elite universities.  Despite a great deal of preening about “academic freedom” and fearlessly pursuing the truth, no tolerance is granted to “research” suggesting children suffer when deprived of their parents’ presence.  Like Social Security for politicians, daycare for children is the “third rail” for academics–touch it and you die!  Professors hoping to be published, to get tenure, to enjoy advancement and prestige in their profession, simply cannot challenge feminist orthodoxies.  Indeed, Dr. Louise Silverstein, in the American Psychologist, urged her colleagues to “‘refuse to undertake any more research that looks for the negative consequences of other-than-mother care'” (p. 103).  One of the few who dared to do so is a highly regarded scholar, Jay Belsky, who initially defended (in the ’70s and ’80s) the notion that children fared well in daycare facilities.  In time, however, mounting evidence prodded him to reverse himself.  Suddenly, he found himself attacked as an enemy of working women–indeed, of women in general!  Publishing his research proved difficult.  He was “shunned at scientific meetings” (p. 43).  He’d become an outcast, a nobody!  Consequently, he accepted an appointment in England!

What the professors and journalists refuse to report, however, should be reported.  For children increasingly suffer as a result of parental deprivation.  On a purely physical level it’s clear that children in day care institutions are far more likely to be sick than their counterparts at home.  One epidemiologist actually called day care centers “the open sewers of the twentieth century” (p. 87).  Chronic inner ear infections, diarrhea, dysentery, jaundice, and hepatitis A all thrive when small children are mixed together, and “high quality” centers are as disease-ridden as their less esteemed rivals.  Harder to document, of course, is the soul-suffering endured by young children.  Kids now spend more time alone, more time with TV, less time eating meals at home, less time talking with adults.  They’re more likely to demonstrate anti-social behavior and less likely to internalize solid ethical principles.

These problems are fully understood by America’s parents, though denied by the nation’s elites!  More than three-fourths of ordinary moms and dads would prefer that moms stay home with young children.  When day care is needed, they much prefer it be provided by a relative or friend.  But three-fourths of the alleged “experts” (generally highly educated, and especially academic women) prefer day care centers.  And, though these scholars and journalists are quite wealthy, they want the government to subsidize their “child care.”  Some, like Hillary Clinton, propose aggressive interventions by the state.  Thus Mrs. Clinton urged:  “Every home and family should be taught through parenting education and family visitation by social service intermediaries, how to raise children.  This would begin in the prenatal stages and continue through childhood” (pp. 156-157).

Senators Hillary Clinton, Edward Kennedy, and Christopher Dodd set tax policies and the national agenda to comply with the radical feminist agenda.  Though few parents want what Clinton et al. seek to dictate, they are subjects of the welfare state and struggle to cope with its policies.  It’s a daunting struggle, but Robertson provides data and perspectives with which to resist it.

Though rather unwieldy (640 pp.) and repetitious at times, William D. Gairdner’s The War Against the Family:  A Parent Speaks Out on the Political, Economic, and Social Policies That Threaten Us All (Toronto:  Stoddart Publishing Co., c. 1992) gives us a Canadian parent’s perspective on a variety of issues.  A graduate of Stanford University and an Olympic athlete, Gairdner weaves together history, philosophy, theology, education, psychology, sociology and jurisprudence, touching on everything from abortion to taxation.  Much of the book’s value derives from his quotations, sources, and interesting synthesis of his studies regarding the state of the modern family.

Let me focus on only one of his major themes:  the doleful impact of the Welfare State, the deleterious effect of all utopian schemes that propose to improve upon the natural order of things.  As he writes in his Preface, this book “shows how the political, economic and social/moral troubles that play themselves out in the nation at large inevitably trickle down to alter our most private lives and dreams; how any democracy based on freedom and privacy will strangle itself if it drifts toward, or is manoeuvred into, a belief in collectivism of any kind” (p. ix).  To the extent socialism triumphs, Gairdner argues, the family suffers.

This is graphically evident in Sweden, often touted as a grand example of “democratic socialist” success.  Following the ideological schemes of the economist Gunnar Myrdal and his wife Alva, a “radical feminist sociologist” (p. 138), Sweden engineered a cradle-to-grave welfare state.  (The Myrdals’ work, incidentally, was cited in the U.S. Supreme Court’s 1954 Brown v. Board of Education decision mandating public school desegregation.)  In fact, early plaudits for the Swedes’ egalitarian economic system have paled of late as its debts mount.  In the words of Goran Bruhner, “Sweden used to be a welfare paradise on earth.  Now it is the sick man of Europe” (p. 14).  Swedes pay the world’s highest taxes, and two-thirds of the nation’s GNP is devoted to government spending.  One-third of the people produce goods while the other two-thirds redistribute the money derived from taxing the producers.  Ten percent of the workforce fails to work on any given day–rising to 20 percent on Mondays and Fridays!  Swedes are “sick” 23 days a year.

Social as well as economic decay marks Sweden.  The government has pursued a markedly secular agenda, evident in a 1968 publication titled “The Family is Not Sacred.”  The author of the article declared:  “I should like to abolish the family as a means of earning a livelihood, let adults be economically independent of each other and give society a large share of responsibility for its children . . .  In such a society we could very well do without marriage as a legal entity” (p. 139).  To a great extent that has taken place in Sweden.  Fewer people marry in Sweden than in any Western nation.  Two-thirds of the people in Stockholm live alone!  Swedes who do marry usually cohabit beforehand, becoming motivated to marry only when a child results from their intimacy.  In the midst of it all, the Swedes are having fewer and fewer children.  And those who are born are quickly lodged in daycare facilities.  Following the Myrdals’ socialist agenda, Sweden pursued policies pushing women into the workforce.  Today 60 percent of the women work–45 percent of them for the government.

The Swedish Welfare State, Gairdner insists, has delivered a lethal blow to the family.  But to the enlightened elitists in Canada–the “Court Party”–Sweden serves as a model to follow!  Beginning with Pierre Trudeau’s ascent to power in 1968, Canada’s leaders have systematically orchestrated a radical swerve to the left, quickly imposing state controls in virtually every area of life.  Should Canadians–and Americans–wonder about what happens to the family when socialism triumphs, they need simply look to Sweden.  Doing so, Gairdner says, should prompt us to reverse directions!

One of the great reversals needed involves education, to which Gairdner devotes several chapters.  State-controlled education–one of the goals listed in The Communist Manifesto–illustrates the damage children suffer when subjected to a centrally-planned, bureaucratic system.  Amazingly, Americans in New England and the old Northwest demonstrated a higher rate of literacy in 1840 than they do today!  If you think clearly about it, “there is little difference between a collectivized, command economy and collectivized, command education.  Neither can work well, and the unit cost of the product is very great–about double the cost of the same education rendered privately” (p. 198).  Failing schools demand more money and more teachers in smaller classes, ignoring solid evidence showing that neither makes much of a dent in students’ performance.  Public school problems cannot be solved by the public schools, for they are in fact the problem!

The public school movement, strongly championed by “reformers” like the Fabians in England, dislodged churches and private schools as mentors of the young.  In 1905, the Intercollegiate Socialist Society was formed, with John Dewey as a founding member.  He and his associates envisioned “education for a new social order,” and his highly influential Democracy and Education said nary a word about home and family while stressing grand themes like “social unity” and “State consciousness.”  An admirer of the communist endeavors in Russia in the ’20s and ’30s, Dewey wanted to abolish private property and install a state-controlled economic system.  To secure those ends, he taught successive generations of educators to be “change agents” who would transform the public schools into centers for collectivist ideology.

“History will surely show,” says Gairdner, “that one of the tragic links in the long chain of Western decline was the surrender by families, to the nation State, of control over their children’s education.  As Yale historian John Demos has aptly argued, the school is one of the institutions responsible for the long-term ‘erosion of function’ of the family.  And Stanford’s Kingsley Davis writes that ‘one of the main functions [of the school system] appears to be to alienate offspring from their parents’” (p. 208).  But we need not abandon our young to the state!  To reverse the harm being done to our kids, Gairdner urges us to support private schools, vouchers, anything possible to take back some of the power from the omnivorous state.  And, perhaps, there is headway being made in the U.S. today!  Ultimately, truth prevails, and it’s difficult to evade the truth G. K. Chesterton voiced a century ago:  “This triangle of truisms, of father, mother and child, cannot be destroyed; it can only destroy those civilizations which disregard it” (p. 584).

145 The Homosexual Agenda

In 1986, United States Supreme Court Chief Justice Warren Burger, concurring in Bowers v. Hardwick, which upheld a Georgia law forbidding sodomy, wrote:  “Decisions of individuals relating to homosexual conduct have been subject to state intervention throughout the history of Western civilization.  Condemnation of those practices is firmly rooted in Judeo-Christian moral and ethical standards . . . . [Sir William] Blackstone described ‘the infamous crime against nature’ as an offense of ‘deeper malignity’ than rape, a heinous act ‘the very mention of which is a disgrace to human nature’ and ‘a crime not fit to be named.’  To hold that the act of homosexual sodomy is somehow protected as a fundamental right would be to cast aside millennia of moral teaching.”  His historical perspective was accurate, and his citing Blackstone revealed his reliance upon one of the masterful authorities in jurisprudence.

Soon thereafter, however, the Court discarded Blackstone and millennia of moral teaching!  Seventeen years after the Bowers decision, the Supreme Court, in a Texas case, reversed itself and effectively legalized sodomy.  A few months later the Massachusetts Supreme Judicial Court ordered the state legislature to draft legislation facilitating same-sex marriage.  Thus we’re alerted to what Alan Sears and Craig Osten consider in The Homosexual Agenda:  Exposing the Principal Threat to Religious Freedom Today (Nashville:  Broadman & Holman, Publishers, c. 2003).  R. Albert Mohler Jr., President of The Southern Baptist Theological Seminary, recommends the book, declaring that “The sexual revolution of the last half-century amounts to the most sweeping and significant reordering of human relationships in all of human history.”  The revolution was orchestrated by a cadre of activists who now make “the legitimation and celebration of homosexuality” the next stage of sexual liberation.

Indeed, “As one observer of the homosexual movement [the Orthodox Jewish columnist Don Feder] has warned, ‘Gay activists are sexual Marxists.  Legitimizing same-sex unions is a warm-up act.  Ultimately they want to eliminate any barriers, and signposts, that limit or channel the exercise of human sexuality’” (p. 96).  As is evident in Sweden, they also want to eliminate any criticism of, much less opposition to, their behavior.  The Swedish parliament recently forbade “all speech and materials opposing homosexual behavior and other alternative lifestyles.  Violators could spend up to four years in jail” (p. 183).  Deeply influenced by sociologists Gunnar and Alva Myrdal, the Swedes have sought, as Alva Myrdal urged, to treat all adults “in the same manner by society, whether they lived alone or in some sort of common living arrangement.”  Same-sex, as well as heterosexual, “living arrangements” are fine.  In the U.S., under the guise of “hate crimes” legislation largely written to appease homosexual activists, teachers and pastors may very well become liable to prosecution simply for upholding biblical standards regarding sexual conduct.  Indeed, Senator Ted Kennedy, a constant supporter of hate-crimes bills, has “called religious objections to homosexual behavior ‘an insidious aspect of American life’” (p. 202).

To attain their goals, sexual revolutionaries have known they must destroy (or at least immobilize) the family and the church, the two social institutions most opposed to sexual license.  To expand the definition of “family” to include many sorts of “loving” relationships, to force the church (in the name of “love”) to validate such bonds, has been a basic part of the homosexual agenda.  Courts have increasingly allowed gay and lesbian couples to adopt children.  Revealingly, “Al Gore and his wife Tipper donated $50,000 to the Human Rights Campaign to help its ‘FamilyNet’ campaign promote homosexual adoption.  Their book, Joined at the Heart:  The Transformation of the American Family, prominently featured homosexual ‘families’” (p. 111).

One of the longest levers slowly easing the public’s hostility to homosexual activity is the entertainment industry.  Portraying gay and lesbian activities as healthy–and branding any criticism of them as hateful–has become pervasive in films and television.  Comedies have been especially effective, first disarming viewers and then appealing to them for “tolerance.”  As Michael Medved noted:  “A Martian gathering evidence about American society, simply by monitoring our television, would certainly assume that there were more gay people in America than there are evangelical Christians” (p. 28).

Despite the ancient opposition of Christians to homosexual acts, today’s churches have gradually moved from “loving the sinner” to endorsing sodomy as an appropriate expression of sexuality so long as it occurs within the context of “love.”  Though this is most evident in the consecration of an openly gay bishop in the Episcopal Church, evangelical activists, such as Tony Campolo and his wife Peggy, have aggressively promoted “the radical homosexual agenda” (p. 128).  Peggy, particularly, has argued that Paul’s apparent condemnation of same-sex relations in “Romans 1 does not apply to monogamous, ‘loving,’ homosexual relationships, and that evangelicals who feel differently than her are ‘grossly misinformed’” (p. 129).  Such statements appear in one of the book’s most disquieting chapters, entitled “The Silence (and Silencing) of the Church.”  Rarely these days does one hear words such as Martin Luther’s:  “The heinous conduct of the people of Sodom is extraordinary, inasmuch as they departed from the natural passion and longing of the male for the female, which was implanted by God, and desired what is altogether contrary to nature.  Whence comes this perversity?  Undoubtedly from Satan, who, after people have once turned away from the fear of God, so powerfully suppresses nature that he beats out the natural desire and stirs up a desire that is contrary to nature” (p. 123).

In part, as the authors carefully document, Christians have been silenced through violence and intimidation–as when gay activists invaded St. Patrick’s Cathedral and disrupted a mass being conducted by Cardinal John O’Connor.  Others threw condoms during a service at Village Seven Presbyterian Church in Colorado Springs because a prominent layman, Will Perkins, had supported an amendment to the state constitution which would have banned any preferential treatment of homosexuals.  In part, homosexuals have moved into the church through doors opened by radical feminists “who have tried to reshape the church and the gospel in their own image.  That dodge can be best summarized as ‘the Bible has to be interpreted in the context of the time it was written and therefore that passage is no longer relevant today’” (p. 126).  When churches rewrite Scripture using “gender-inclusive language” and approve praying to “Mother and Father God,” there is no reason to deny homosexual arguments for a new version of the Christian faith, suited to gay and lesbian desires.

In response to the homosexual agenda, Sears and Osten urge Christians to be true to Scripture and Tradition.  They cite the words of Titus Brandsma, a Dachau martyr, who said:  “Those who want to win the world for Christ must have the courage to come into conflict with it” (p. 205).  There’s no question that opposing homosexual activists requires courage.  It’s the courage evident in the words of the Anglican archbishop of Sydney, who said:  “The Christian Gospel is the insertion of truth into the untrustworthy discourse of the world.  Some of us want to be kind, so loving that we will not speak the truth.  The therapeutic model of pastoral care has been perverted into mere affirmations of human behaviour.  Our love is no love, for it refuses this great test:  will it speak boldly, frankly, truthfully?”  Sadly enough, he continued:  “We have contributed towards the gagging of God, perhaps because we are frightened of suffering.  But there is one fundamental task to which we must be committed, come whatever may:  Speak the truth in love” (p. 211).

* * * * * * * * * * * * * * * * * * * *

Christopher Wolfe has edited Same-Sex Matters:  The Challenge of Homosexuality (Dallas:  Spence Publishing Company, 2000).  In his introductory essay, Wolfe argues that “there is no question that our current family instability–and the growing acceptance of homosexuality–reflects, among other factors, the influence of changing social mores on contraception, premarital sex, cohabitation, and no-fault divorce.”  The moral relativism pervading contemporary culture justifies “whatever is pleasant and does not immediately harm others in some relatively tangible way” (p. 17).  Having refused to condemn fornication and adultery, so long as they involve “consenting adults,” one cannot easily express outrage at homosexual acts.

Patrick Fagan, along with several of the other essayists, roots today’s sexual permissiveness in the contracepting culture that emerged in the 1940s.  He cites Sigmund Freud, interestingly enough, who asserted, “in ‘The Sexual Life of Human Beings’ that the separation of procreation and sexual activity is the most basic of perversions, and that all other sexual perversions are rooted in it:  ‘The abandonment of the reproductive function is the common feature of all perversions.  We actually describe a sexual activity as perverse if it has given up the aim of reproduction and pursues the attainment of pleasure as an aim independent of it’” (p. 29).  The Anglican Church abandoned its historic opposition to contraception at the Lambeth Conference in 1930; other Protestant denominations soon followed.

Religious reservations regarding homosexuality were also weakened as divorce and abortion gained acceptance.  Focusing attention on “hard cases” and cultivating compassion for “victims” effectively pulled public opinion toward greater acceptance of what earlier generations condemned.  Just as “love” grown cold justified divorce, so could “love” powerfully felt justify homosexual relationships.  Just as opposition to abortion was effectively branded “hateful” toward women, so opposition to sodomy was labeled “hate” and “homophobia.”  Indeed, as Michael Medved makes clear in his essay, “the main threats to the family in America do not come from the gay community.  They come from infidelity, they come from divorce, they come from all the temptations heterosexuals fear and feel in a hedonistic culture” (p. 167).

For anyone interested in Church history, Robert Louis Wilken’s essay, “John Boswell and Historical Scholarship on Homosexuality,” is most helpful, since Boswell’s “scholarship” is routinely cited by homosexual activists anxious to suggest that the Early Church tolerated their lifestyle.  Though highly praised by The New York Times and similarly leftist media, Boswell’s work is, in fact, deeply flawed, indeed “bogus.”  His writings, such as Christianity, Social Tolerance, and Homosexuality, and Same-Sex Unions in Pre-Modern Europe, illustrate “advocacy scholarship, pseudohistorical learning yoked to a cause, tendentious scholarship at the service of social reform, a tract in the culture wars” (p. 198).  From a first-rate scholar such as Wilken, this is a damning indictment.  And it properly extends to all “scholars” who try to reinterpret either biblical or historical materials to “christianize” homosexuality.

The impossibility of doing so becomes clear in Bishop Fabian Bruskewitz’s “Homosexuality and Catholic Doctrine.”  After citing the official teaching documents of the Church, Bruskewitz reminds Catholics that their opposition to homosexuality derives from a theology of creation that credits God with the goodness and design of all that is.  By nature, homosexual acts go counter to the created order.  They violate the essence of love.  Livio Melina, a professor of moral theology at the Pontifical Lateran University in Rome, makes this clear:  “In the homosexual act, true reciprocity, which makes the gift of self and the acceptance of the other possible, cannot take place.  By lacking complementarity, each one of the partners remains locked in himself and experiences contact with the other’s body merely as an opportunity for selfish enjoyment.  At the same time, homosexual activity also involves the illusion of a false intimacy that is obsessively sought and constantly lacking.  The other is not really the other.  He is like the self; in reality, he is only the mirror of the self which confirms it in its own solitude exactly when the encounter is sought.  This pathological narcissism has been identified in the homosexual personality by the studies of many psychologists.  Hence, great instability and promiscuity prevail in the most widespread model of homosexual life, which is why the view advanced by some, of encouraging stable and institutionalized unions, seems completely unrealistic” (p. 222).

* * * * * * * * * * * * * *

Though it was written nearly a decade ago, I still regard Jeffrey Satinover’s Homosexuality and the Politics of Truth (Grand Rapids:  Baker Books, 1996) as the best book on the subject.  The author, a medical doctor, has been involved in treating AIDS patients since the beginning of the epidemic’s outbreak in the early ’80s.  He knows the truth and is bold to declare it.  He is also deeply compassionate, distressed by the pain endured by those afflicted with the deadly HIV virus.

The truth is:  like alcoholism, homosexual behavior is deadly.  One study “found that the gay male life span, even apart from AIDS and with a long-term partner, is significantly shorter than that of married men in general by more than three decades.  AIDS further shortens the life span of homosexual men by more than 7 percent” (p. 69).  They inordinately suffer chronic diseases, including syphilis, gonorrhea, hepatitis, rectal cancer, and bowel disorders.  They suffer mentally and take their own lives at disproportionate rates.

Amazingly, when AIDS began to do its deadly work, “the first priority” in the gay community “was to protect homosexuality itself as a perfectly acceptable, normal, and safe way of life” (p. 15).  Rather than trying to protect individuals from disease, something that would have required amending one’s lifestyle, the gay community orchestrated a political movement designed to protect homosexuality itself, misleading the public by asserting that homosexuality is genetically programmed, irreversible, and normal.  In fact, there is no evidence for a “gay gene,” and homosexuality is largely a learned behavior.  It can, therefore, be reversed–and thousands of individuals have been restored, through “reparative therapy,” to heterosexuality.  And it is, in fact, utterly abnormal, running counter to the most basic laws of nature.

To promote their deceit, homosexual activists engaged, skillfully, in politics!  In the ’70s they persuaded the American Psychiatric Association’s Board of Trustees to declassify homosexuality as a “disorder,” though a large majority of psychiatrists still judged it such.  Political pressure applied behind the scenes, not scientific evidence, dictated the change.  Homosexual activists, by disrupting meetings and intimidating officials, gained “scholarly” validation for their sexual behavior.   The American Psychological Association soon followed suit.  With “science” supporting their cause, homosexuals then turned to legislatures and courts, slowly overturning the nation’s moral consensus.

What’s needed, Satinover says, is a recovery of the Judeo-Christian ethos that once characterized this nation.  Secularists have opened the gates to a resurgent Gnostic paganism, ever tolerant of “diversity” in many forms.  Himself Jewish, Satinover urges Orthodox Jews and Christians to join together in promoting a biblically based public, as well as private, morality.  Sin must be called sin!  Christians, especially, need to recover veneration for the Old Testament Law!  Ultimately, “it is not really a battle over mere sexuality, but rather over which spirit shall claim our allegiance, [for] the cultural and political battle over homosexuality has become in many respects the defining moment for our society.  It has implications that go far beyond the surface matter of ‘gay rights.’  And so the more important dimension of this battle is not the political one, it is the one for the individual human soul” (p. 250).

* * * * * * * * * * * * * * * *

In Legislating Immorality:  The Homosexual Movement Comes Out of the Closet (Chicago:  Moody Press, c. 1993), George Grant and Mark A. Horne evaluate the issue from a strongly Evangelical perspective.  Thus their main concern is “an uninformed and compromised church” which needs to discern “that whatever is right is actually good, that whatever is good is actually just, and that whatever is just is actually merciful.  The kindest and most compassionate message Christians can convey to homosexuals and their defenders is an unwavering Biblical message” (p. 5).

The authors provide both contemporary and historical illustrations, showing the pervasiveness of homosexuality, especially in non-Christian cultures.  With the resurgence of paganism in the Enlightenment, the West has increasingly tolerated it.  With refreshing candor, Camille Paglia, a highly secularized writer, asserts:  “Happy are those periods when marriage and religion are strong. . . .  Unfortunately, we live in a time when the chaos of sex has broken out into the open. . . .  Historiography’s most glaring error has been its assertion that Judeo-Christianity defeated paganism.  Paganism has survived in the thousand forms of sex, art, and now the modern media. . . .  A critical point has been reached.  With the rebirth of the gods in the massive idolatries of popular culture, with the eruption of sex and violence into every corner of the ubiquitous mass media, Judeo-Christianity is facing its most serious challenge since Europe’s confrontation with Islam in the Middle Ages.  The latent paganism of western culture has burst forth again in all its daemonic vitality” (p. 54).

Homosexuals have successfully infiltrated the media and schools, using their influence to dissolve opposition to their orientation and behavior.  School administrators, such as Joseph Fernandez in New York, seek to impose curricula containing books like Daddy’s Roommate and Heather Has Two Mommies, books published by a company “that specializes in subversive homosexual works” (p. 79) and promoting the acceptance of homosexuality.  Though angry parents ousted Fernandez from his position as School Chancellor, the schools have for two decades increasingly urged tolerance–indeed often admiration–for homosexuals.

Churches, too, have eased or eliminated their opposition to homosexual acts.  Mainline denominations, especially, have divided over the issue.  Grant and Horne see this as a symptom of a more basic issue:  their integrity.  For one’s position on homosexuality cannot be severed from “the issue of biblical authority, the nature of church ministry, the scope of church discipline, and the church’s responsibility and relationship to the civil sphere” (p. 165).  Citing official declarations from United Methodists, Episcopalians, Presbyterians, et al., the authors demonstrate the degree to which the nation’s churches have gradually embraced the homosexual agenda.  Even self-professed evangelicals, such as Tony Campolo, Virginia Ramey Mollenkott, and Letha Scanzoni, open doors of acceptance to gay rights.

What’s needed, the authors argue, is a recovery of the true biblical and historically Christian position.  In the Early Church, believers separated themselves from the sexually perverse Greco-Roman culture.  This included rejecting homosexual practices–something staunchly condemned in every extant pre-Constantinian text.  In time, as Christians became numerically dominant, laws reflected their convictions.  Thus Emperor Theodosius, in 390 A.D., “declared sodomy a capital crime and various Christian realms continued to enforce that standard for almost two millennia” (p. 209).

144 Judicial Tyranny?


            When he argued the case for the ratification of the United States Constitution in Virginia, James Madison, the document’s most influential architect, warned:  “I believe there are more instances of the abridgment of the freedom of the people by gradual and silent encroachments of those in power than by violent and sudden usurpation.”  To protect the people’s freedom, the Constitution balanced powers in the federal government, safeguarding the rule of law from tyrannical usurpers.  Madison’s concern for the loss of freedom to “gradual and silent encroachments” was recently revived in an issue of Commentary (October 2003), wherein distinguished contributors addressed the question:  “Has the Supreme Court Gone Too Far?”  Their essays demonstrate the extent to which recent Supreme Court decisions regarding affirmative action and sodomy–simply the latest of a list of similar judicial edicts–have forced many thoughtful folks to ponder the fate of constitutional law in America.

            One of the contributors, Lino A. Graglia, a professor at the University of Texas Law School, argued that the Court has abandoned its assigned role–interpreting the Constitution–and now pursues “policy choices” designed to empower an elite, enlightened minority of like-minded liberals.  To Graglia, “Virtually every one of the Court’s rulings of unconstitutionality over the past fifty years–on abortion, capital punishment, criminal procedure, busing for school racial balance, prayer in the schools, government aid to religious schools, public display of religious symbols, pornography, libel, legislative reapportionment, term limits, discrimination on the basis of sex, illegitimacy, alien status, street demonstrations, the employment of Communist-party members in schools and defense plants, vagrancy control, flag burning, and so on–have reflected the views of this same elite.  In every case, the Court has invalidated the policy choice made in the ordinary political process, substituting a choice further to the political left.  Appointments to the Supreme Court and even to lower courts are now more contentious than appointments to an administrative agency or even to the Cabinet–matters of political life or death for the cultural elite–because maintaining a liberal activist judiciary is the only means of keeping policymaking out of the control of the American people.”

            Another contributor to the Commentary symposium, Judge Robert H. Bork, had earlier and more amply set forth his views in Coercing Virtue:  The Worldwide Rule of Judges (Washington:  The AEI Press, 2003).  As the book’s subtitle indicates, Bork believes that judicial activism now threatens people’s liberties everywhere, for judges “are enacting the agenda of the cultural left” (p. 2).  As tenured members of the intelligentsia (labeled the “New Class” by Bork), judges increasingly consider themselves entitled to impose their political and cultural worldview.  They illustrate what G.K. Chesterton noted as a universal phenomenon:  “In all extensive and highly civilized societies groups come into existence founded upon what is called sympathy, and shut out the real world more sharply than the gates of a monastery. . . .  The men of a clique live together because they have the same kind of soul, and their narrowness is the narrowness of spiritual coherence and contentment, like that which exists in hell” (Heretics, 5th ed., 1905, pp. 180-181).  C.S. Lewis similarly observed the persistent desire we all possess to enter the “inner ring” and thereby gain power over others.

            When the “inner ring” consists of irreligious intellectuals, utopian ideologies replace theological dogmas and guide their thinking.  As Max Weber noted, in The Sociology of Religion, intellectuals who reject religion easily embrace “the economic, eschatological faith of socialism.” Most 20th century secular utopians have embraced a socialist agenda and seek to attain it through political means.  “The socialist impulse remains the ruling passion of the New Class” (p. 6), though it now focuses on cultural issues such as sex and education rather than economics.  Modern “liberalism,” with its commitment to social change through political coercion, is thoroughly socialistic, Bork says.  And it is equally authoritarian, for the cultural elites, everywhere failing to persuade the masses to democratically embrace their values, now seek to impose them through the courts. 

            Consequently, “What judges have wrought is a coup d’etat–slow-moving and genteel, but a coup d’etat nonetheless” (p. 13).  They also lend support to a collage of special interest groups–environmentalism, feminism, multiculturalism–which share a socialistic commitment to reshaping the world.  Bork’s view was earlier espoused by the esteemed sociologist Robert Nisbet, who  noted that “‘crusading and coercing'” courts have preempted power so as to precipitate “the wholesale reconstruction of American society,” aiming to implement what Jean-Jacques Rousseau and Jeremy Bentham championed:  “sovereign forces of permanent revolution” (p. 10).  This revolution, embodied in activist judges, is both political and cultural and has significantly, if subtly, replaced “traditional moralities with cultural socialism” (p. 137).   

            Of particular interest to Bork is the internationalization of this agenda.  He devotes two chapters to Canada and Israel, whose courts are on the cusp of judicial activism.  International tribunals such as the International Criminal Court, established in 1998, have become aggressive in asserting the prerogatives of “international law”–generally understood as the edicts of elite tribunals.  “Crimes against humanity” were cited to justify legal moves against Chile’s Augusto Pinochet and Yugoslavia’s Slobodan Milosevic, but not against China’s Li Peng or Cuba’s Fidel Castro!  Wars to combat communism are labeled unjust, whereas wars that advance causes favored by elite jurists are justified for advancing “universal human rights.”  American Supreme Court justices have, alarmingly, begun to cite non-American courts in issuing decisions.  Thus Justice Stephen Breyer has cited court decisions in India, Jamaica, and Zimbabwe!  The U.S. Constitution may have little bearing on the Court’s decisions, but Zimbabwe’s jurists apparently do!

* * * * * * * * * * * * * * * * * * * *

            Reading Coercing Virtue prompted me to re-read Bork’s Slouching Towards Gomorrah:  Modern Liberalism and American Decline (New York:  ReganBooks, c. 1996), a work of cultural commentary rather than legal analysis.  Modern liberalism, as Bork defines it, espouses apparent antinomies:  radical egalitarianism and radical individualism.  It triumphed as the New Left of the ’60s, represented by Bill and Hillary Clinton, Tom Hayden and Jane Fonda, gained control of the nation’s institutions in the ’90s.  Teaching at Yale Law School, Bork saw a radical change in the class that entered in 1967.  Radicalized in their undergraduate years, they “were angry, intolerant, highly vocal, and case-hardened against logical argument” (p. 36).  Simultaneously angry and hedonistic, crusading for “social justice” while freely cohabiting, they espoused a nihilism that now pervades the nation.

            In time, the young radicals took their ideals and became “part of the chattering class, talkers interested in policy, politics, and culture.  They went into politics, print and electronic journalism, church bureaucracies, foundation staffs, Hollywood careers, public interest organizations, anywhere attitudes and opinions could be influenced” (p. 51).  They established a variety of special interest groups–environmental, feminist, abortion rights, ethnic, etc.  And they are leading us, Bork believes, down the slope to moral degradation, Gomorrah!

* * * * * * * * * * * * * * * * * *

            Judge Bork also wrote an essay for a symposium that was provocatively titled “The End of Democracy?” and published in the journal First Things (November 1996).  At the heart of the controversy, says Richard John Neuhaus, the journal’s editor-in-chief, is the degree to which we still have a constitutional republic.  Neuhaus once attended a conference wherein a legal scholar concluded his presentation with the assertion that “we are no longer living under the Constitution of the United States of America.”  To which a Supreme Court justice in attendance responded, “Welcome to the second half of the twentieth century” (p. 244).  Though many were amused at the moment by the justice’s quip, the truth seems to be that we no longer live under the Constitution ratified in 1789.

            The essays elicited a flurry of controversy, with dozens of responses printed in various periodicals–and in subsequent issues of First Things.  All the relevant materials, plus a lengthy “Anatomy of a Controversy” by Richard John Neuhaus (including the anecdote regarding the Constitution cited in the prior paragraph), were collected into a single volume, edited by Mitchell S. Muncy, entitled The End of Democracy?  The Celebrated First Things Debate with Arguments Pro and Con (Dallas:  Spence Publishing Company, 1997). 

            The journal’s editors, introducing the essays, wondered “whether we have reached or are reaching the point where conscientious citizens can no longer give moral assent to the existing regime” (p. 3).  The term “regime” ignited a storm of protest, but the editors used it by design to indicate the possibility that “we the people” no longer rule our own country.  Democratic means too often fail to attain the people’s ends, and policies they clearly oppose, such as unrestricted abortion rights, are routinely imposed through judicial fiat, as was especially evident in two Supreme Court decisions:  Romer v. Evans (1996) and Planned Parenthood v. Casey (1992).

            Such decisions prompted a dissenting Justice Antonin Scalia to declare:  “Day by day, case by case, [the Court] is busy designing a Constitution for a country I do not recognize” (p. 10).  In Romer, the Court overturned the clearly expressed will of the people of Colorado, who had adopted, through a statewide referendum, a constitutional amendment specifically denying homosexuals the special protections and rights granted them by some municipalities.  Commenting on the case, Robert Bork notes that “Romer is a prime instance of ‘constitutional law’ made by sentiment having nothing to do with the Constitution.”  Rather, it established “the newly faddish approval of homosexual conduct among the elite classes from which the justices come and to which most of them respond” (p. 12). 

            Russell Hittinger, a professor of law at the University of Tulsa, argues that the Court in Casey made the amazing claim that even if Roe v. Wade was legally questionable, it was legitimate since the American people had accepted it as law.  One of the dissenting justices, Byron White, called his colleagues’ decision in Roe an “exercise of raw judicial power,” and Casey locked that decision in concrete, making “abortion the benchmark of its own legitimacy, and indeed the token of the American political covenant” (p. 18).  The Court behaves as if the American people had established a new “regime” ruled by judicial edicts, not legislative enactments.  After examining crucial decisions, Hittinger asserts that the new regime is “a very bad regime” (p. 27) because it leaves the weakest among us–unborn children and the helplessly infirm–at the mercy of those who want them dead.  It excludes the people from political power, rightly exercised through legislative elections and deliberations.  And, sadly enough, “it has made what used to be its most loyal citizens–religious believers–enemies of the common good whenever their convictions touch upon public things” (p. 28). 

            Hadley Arkes, a professor of law at Amherst College when the essays were published, carefully considers the implications of Romer v. Evans, the Supreme Court decision which nullified a constitutional amendment secured through a referendum whereby the people of Colorado sought to invalidate the preferential treatment homosexual activists had secured in certain localities.  This decision illustrates the propensity of judges to “advance the interests of gay rights and other parts of the liberal agenda” (p. 31).  Ultimately, Arkes insisted, the gay activists want to redefine the family and legalize same-sex marriages.  This is evident in the words of Nan Hunter, a lesbian activist Bill Clinton appointed, in 1993, “deputy general counsel/legal counsel” in the Department of Health and Human Services, who sought “to dismantle the legal structure of gender in every marriage” (p. 35).  Such radical changes, of course, cannot be won through the democratic process, whenever the people are allowed to express and implement their convictions.  Only by enlisting an “enlightened” elite, only by pushing their agenda through the courts, can gay and lesbian activists attain their goals.

            In “Kingdoms in Conflict,” one of the more radical essays in the symposium, Charles Colson argued that we are now witnessing the culmination “of a long process I can only describe as the systematic usurpation of ultimate political power by the American judiciary–a usurpation that compels evangelical Christians and, indeed, all believers to ask sobering questions about the moral legitimacy of the current political order and our allegiance to it” (p. 41).  Supreme Court decisions, especially those securing abortion rights, cannot but prod devout citizens to ponder their allegiance to a regime responsible for the deaths of millions of unborn children.  Citing theologians such as Calvin and Aquinas, who endorsed Augustine’s aphorism that “an unjust law is no law at all,” Colson wonders how much more must take place before Christians begin to challenge and even disobey their masters. 

            Sharing Colson’s discontent, Robert P. George, a professor of politics at Princeton University, suggested that we may very well be subjects of “The Tyrant State.”  Though America is still a democratic society, “even a democratic regime may compromise its legitimacy and forfeit its right to the allegiance of its citizens” (p. 54) when it endorses what John Paul II called “the culture of death.”  This has occurred, in the U.S., primarily through legalized abortion.  Sadly enough, in our democracy “our judges–whose special responsibility it is to preserve the core democratic principle of equality before the law–are the ones whose edicts have betrayed this principle” (p. 56).  Reflecting on what we should do, right now, given the significant freedoms we still enjoy, Professor George urges us to heed Pope John Paul II, “‘to have the courage to look the truth in the eye and to call things by their proper names, without yielding to convenient compromises or to the temptation of self-deception.’  Let us, therefore, speak plainly:  The courts, sometimes abetted by, and almost always acquiesced in, federal and state executives and legislators, have imposed upon the nation immoral policies that pro-life Americans cannot, in conscience, accept” (p. 61). 

            These five essays constitute the heart of The End of Democracy?  The rest of the book contains a variety of responses, ranging from letters to First Things to lengthy essays published in other periodicals.  Most of them are quite critical, and some members of the journal’s editorial board (Peter Berger and Gertrude Himmelfarb) resigned lest they be implicated in the questioning of America’s “democracy.”  Others (James Dobson and Mary Ann Glendon) endorsed the endeavor. 

            What becomes clear, in both the original essays and the responses to them, is the fact that abortion deeply divides this nation.  In an essay for The National Review, a “neoconservative” Jewish writer, William Kristol, explained:  “the truth is that abortion is today the bloody crossroads of American politics.  It is where judicial liberation (from the Constitution), sexual liberation (from traditional mores), and women’s liberation (from natural distinctions) come together.  It is the focal point for liberalism’s simultaneous assault on self-government, morals, and nature.  So, challenging the judicially-imposed regime of abortion-on-demand is key to a conservative reformation in politics, in morals, and in beliefs” (p. 94). 

            Kristol’s analysis is amplified by Hadley Arkes, one of the original essayists, in an explanation of their intent.  Rooted in the Declaration of Independence’s appeal to the natural law–that by nature all men are entitled to certain rights, especially the right to life–he and other contributors “spoke no treason, and they took care not to incite people to a course of lawlessness.  But . . . we come to the very edge when our government tells us that the killing of unborn children must be regarded as a private right; that we may have no proper concern about the terms on which killing is carried forth in our neighborhoods; and that the meaning of ‘homicide’ is no longer part of the business of people living together in a republic” (p. 169).