173 The Souls of Our Young

In Soul Searching: The Religious and Spiritual Lives of American Teenagers (New York: Oxford University Press, c. 2005), Christian Smith and Melinda Lundquist Denton provide an amply documented and academically persuasive portrait of America’s youth. Smith is a professor of sociology at the University of North Carolina and the principal investigator of the National Study of Youth and Religion—a well-funded, methodologically clear endeavor that relies upon both extensive surveys and personal interviews. Denton is the study’s project manager. “To our knowledge,” they say, “this project has been the largest, most comprehensive and detailed study of American teenage religion and spirituality conducted to date” (p. 7).

America’s teenagers are remarkably religious; 40 percent attend “religious services once a week or more, and 19 percent report attending one to three times per month” (p. 37). Only 18 percent have no religious involvement. Amazingly enough, “teens as a group profess to want to attend religious services not less, but actually more than they currently do” (p. 38). They praise their congregations as “warm and welcoming” (p. 61) and find adults therein reliable and trustworthy. Their parents, more than anyone else, influence them, and they reveal little hostility toward them. Such youngsters have little interest in fringe or “alternative” religions and seem to be quite conventional in almost every way. “The vast majority of U.S. teenagers identify themselves as Christians” and “regularly practice religious faith” (p. 68). The mantra of avant-garde folks like Michael Lerner—“spiritual but not religious”—hardly registers with typical teenagers.

One interviewee, incidentally, was attending a Nazarene church and spoke highly of it. He liked Wednesday and Sunday night services, the youth group and Sunday school. What he found attractive in the church was this: “It’s good people, you know. And not only that, I also actually learn,” something important to him because he wanted to know how to “be a God-fearing person and go to heaven or whatever, you know?” (p. 100).

The more devout among them are thereby advantaged in “a host of ways,” making a positive difference in: “risk behaviors, quality of family and adult relationships, moral reasoning and behavior, community participation, media consumption, sexual activity, and emotional well-being” (p. 219). Whether one considers drugs and alcohol or school attendance or getting along with parents, the religious teenagers do much better. They watch less TV, fewer R-rated movies, less pornography, and play fewer video games. In some categories—such as pornographic movies, where the “devoted” teens watched 0.5 per year while the “disengaged” saw 2.5—the statistics reveal dramatic differences. “Nearly all Devoted teens believe in waiting for marriage to have sex, compared to less than one-quarter of the Disengaged who believe the same” (p. 223). Devoted teens are far happier than the Disengaged and feel more closely connected with others. They craft positive plans for the future and seriously ponder “the meaning of life” (p. 226). The statistical tables delineating these differences, found on pp. 220-227, impressively document the authors’ conviction that religion helps teens.

The positive news regarding the role of religion in teenagers’ lives must be balanced, however, by information regarding its doctrinally deficient nature. Our youngsters have little knowledge of the actual content of the Christian faith! They take a thoroughly individualistic approach to questions regarding God, man, and salvation—though they are generally quite inarticulate when asked to explain much of anything about their views. Indeed, the authors conclude: “In our in-depth interviews with U.S. teenagers, we also found the vast majority of them to be incredibly inarticulate about their faith, their religious beliefs and practices, and its meaning or place in their lives” (p. 131).

Their religion is best defined as “Moralistic Therapeutic Deism.” They believe in a rather distant (unless needed to solve one’s problems) God, who “wants people to be good, nice, and fair to each other, as taught in the Bible and by most world religions” (p. 162). “The central goal of life is to be happy and to feel good about oneself,” and “Good people go to heaven when they die” (p. 163). They believe God “designed the universe and establishes moral law and order. But this God is not Trinitarian, he did not speak through the Torah or the prophets of Israel, was never resurrected from the dead, and does not fill and transform people through his Spirit. This God is not demanding. He actually can’t be, because his job is to solve our problems and make people feel good. In short, God is something like a combination Divine Butler and Cosmic Therapist: he is always on call, takes care of any problems that arise, professionally helps his people feel better about themselves, and does not become too personally involved in the process” (p. 165).

Today’s teenagers also entertain a view of human nature quite at odds with the Christian tradition. Teens “tend to assume an instrumental view of religion. Most instinctively suppose that religion exists to help individuals be and do what they want, and not as an external tradition or authority or divinity that makes compelling claims and demands on their lives, especially to change or grow in ways that they may not immediately want to” (p. 148). While they freely acknowledge their sins, they apparently feel no condemnation as sinners! They share the broader culture’s presumption that we are autonomous individuals, free to shape our future in accord with our own desires. Religion is viewed as an enjoyable activity, but it ought not particularly influence one’s decisions. Autonomous individuals can hardly judge the behavior of others, and today’s teens are radically non-judgmental. “The typical bywords, rather, are ‘Who am I to judge?’ ‘If that’s what they choose, whatever,’ ‘Each person decides for himself,’ and ‘If it works for them, fine’” (p. 144).

“What we heard from most teens,” Smith and Denton say, “is essentially that religion makes them feel good, that it helps them make good choices, that it helps resolve problems and troubles, that it serves their felt needs. What we hardly ever heard from teens was that religion is about significantly transforming people into, not what they feel like being, but what they are supposed to be, what God, or their ethical tradition wants them to be” (pp. 148-149). The youngsters interviewed rarely expressed interest in a religion that “summons people to embrace an obedience to truth regardless of the personal consequences or rewards. Hardly any teens spoke directly about more difficult religious subjects like repentance, love of neighbor, social justice, unmerited grace, self-discipline, humility, the costs of discipleship, dying to self, the sovereignty of God, personal holiness, the struggles of sanctification” (p. 149), or any of the classical themes of Christian discipleship.

For those of us working with young people, this book is both encouraging and chastening. Kids are hungry for God and the churches are bringing them into religious fellowships. Unfortunately, they learn little about the great doctrines of the Church and rarely are challenged to live out the sterner stuff of the scriptures.

* * * * * * * * * * * * * * *

Barbara Curtis, in Dirty Dancing at the Prom: And Other Challenges Christian Teens Face (Kansas City: Beacon Hill Press of Kansas City, c. 2005), provides a deeply personal insight into the lives of today’s adolescents. Prodded by a remark from one of her sons regarding the school prom—where “freak dancing” rather resembled sexual foreplay—she launched an investigation, primarily through interviews, into teen culture, hoping to help parents struggling with the issues she faces. What she found is (to her) alarming. Neither today’s dances nor today’s teenagers are quite the same as they were 40 years ago. Indeed, perhaps “it’s time proms carried warning labels” (p. 8). And not only proms but many aspects of teen culture merit them as well!

Curtis has twelve children (three of them, Down Syndrome children, adopted) and became a Christian only after she was well into the parenting process. In fact, her oldest daughter went to her high school prom and spent the night with her boyfriend. Having almost no religious roots, living in northern California, they took a laissez-faire approach to most everything, lacking any “moral compass to guide us, just following the crowd” (p. 10). She and her first husband were “hippies” who named their first two daughters Samantha Sunshine and Jasmine Moonbeam! Her second husband, a “spiritual seeker,” was similarly rooted in the ’60s ethos. “Drugs, promiscuity, and radical politics” were part of the air they breathed in Marin County!

They became Christians, however, as a result of attending a conference where they were presented with Campus Crusade for Christ’s “Four Spiritual Laws.” Everything changed! They suddenly saw the world differently, bathed in the Light of Christ. “Though Tripp [her husband] and I had known about Jesus, we had thought of Him simply as a great spiritual teacher. . . . This was the first time we had heard the truth about who He was. We did receive Jesus, then and there, on March 21, 1987. Tears were streaming down our faces, and we knew something profound had happened” (p. 108). And they wanted to rear their children differently. So, after home-schooling some of their children in California, they moved to Virginia, hoping to find a more solid, family-friendly society. But teen culture respects no state boundaries, and she found herself facing the great challenge of helping her kids deal with its harmful currents.

In the process she discovered the importance of seven items that constitute the chapters of this book: 1) Being Grounded in God’s Love: Self-esteem; 2) Setting Limits: Self-assurance; 3) Avoiding Temptation: Self-control; 4) Developing Compassion: Self-sacrifice; 5) Standing Up for What’s Right: Self-respect; 6) Making the Most of Mistakes: Self-help; 7) Living with Integrity: Self-satisfaction.

Curtis discovered, firstly, how important it is to anchor teens in the reality of God’s love. Kids battling self-esteem issues, so frequently savaged by their peers in the desensitized atmosphere of the schools, need to know they are precious in God’s sight. Those who grow up in homes where they know that both God and their parents love them are far more likely to be self-confident and resolute in resisting temptation. “Self-esteem is tied to knowing God’s love for us,” Curtis says (p. 21). Loving children requires parents to stick together. “So perhaps the most loving thing parents can do for their children is to honor their own wedding vows—for better, for worse, for richer, for poorer, in sickness, and in health, until death” (p. 25). Statistical studies demonstrate the significant suffering kids endure when their parents divorce. Curtis herself grew up “fatherless” and feels “the hole in the souls of fatherless girls” (p. 25). Girls also need godly dads who protect them! “There’s a part of every woman that still longs to be Daddy’s little girl, to feel completely safe and protected” (p. 26).

Protecting kids means setting limits. Curtis confesses she “was once a permissive parent. Having grown up with no spiritual foundation or moral guidelines myself, I didn’t have anything really to pass on. And since my background wasn’t undergirded with love, I had no understanding of what parental love looked like” (p. 31). She had no rules for bedtimes or much of anything else. She thought loving meant letting others do whatever they felt like. Then her oldest daughter, as a high school junior, began coming home at two in the morning. Mom awakened to the fact that youngsters lack wisdom and need guidance—and even clear rules. She also discovered that “kids don’t just need limits—they secretly want them” (p. 32). Love issues reasonable rules. Youngsters will always test them, but parents must uphold them for the good of their kids. This means that a mom or dad can’t be a child’s “best friend”—something 43 percent of the nation’s parents aspire to! Best friend parents, of course, never make rules or require homework or do anything to displease their “friend.” Truth to tell, however, kids both need and want parents! As one of the girls Curtis interviewed said: “‘I want my mom to be my mom’” (p. 46).

Many of the rules, in our world, necessarily focus on protecting kids from illicit sexual activities—evident in the fact “that more than one third of babies born in the United States were born to unwed mothers” (p. 48). Youngsters obviously need to develop the invaluable trait of self-control, though they find little encouragement to do so in the movies, songs, and TV programs that powerfully shape them. “The switch from romance to eroticism in entertainment has put enormous pressure on today’s teens” (p. 50). Thus parents have a great task: to both require obedience and encourage self-discipline. Curtis lists helpful ways to do so: encourage group dates; open your home to your kids’ friends; give them cell phones and keep them accountable; “eliminate latchkey hours;” and supervise entertainment.

Kids also need to learn compassion. By nature, they’re not necessarily so! They learn to recognize, as Rick Warren says, “It’s not about you.” Others matter. And they should matter to teens. Being part of a big family certainly helps cultivate this, as Curtis makes clear. But kids still need to be taught to care for others—often by serving siblings at home. They need to know the difference between loving sinners and hating sins. They need to become aware of a world full of needs and hurts—something easily acquired through an acquaintance with world missions. Parents praying for missionaries and supporting World Vision or Compassion International clearly teach children elementary compassion lessons.

Standing up for what’s right, even when it’s unpopular, elicits a profound sense of self-respect. So parents need to both illustrate and encourage it, because our kids are on the “frontlines” of the culture war. Persecution—albeit often subtle—is a fact of life for Christians in the public schools. The kids she interviewed all testified to the challenges they face at school. Getting involved in athletics or drama frequently forces a teen to make choices regarding his values and convictions. In the Curtis family, the author’s husband has consistently insisted: “It’s not who’s right but what’s right.” Films such as High Noon, To Kill a Mockingbird, and Bonhoeffer afford opportunities to emphasize the need for courage in living righteously. Kids thus nurtured generally find the courage to stand up for what’s right and discover, in the process, a great sense of personal dignity.

Growing up is marked by successes and failures. Learning from one’s mistakes, growing through disappointments, prepares one for adulthood. Curtis, of all people, knows the truth that “All have sinned and fall short of the glory of God.” The doctrine of original sin was validated both by her own transgressions and by every baby she reared! Confessing her own failures to her kids, as well as to God, showed them the value of openness and honesty. Failures aren’t fatal. With God’s help, the slips and sins of youth can be both confessed and transformed into wisdom and strength. And that’s what’s needed for the integrity that makes one satisfied with life.

For parents seeking to understand and rightly rear their teenagers, Dirty Dancing at the Prom provides welcome assistance.

* * * * * * * * * * * * * * *

Two Canadian philosophers, Joseph Heath and Andrew Potter, give us an analysis of the impact of an earlier generation’s youth culture in Nation of Rebels: Why Counterculture Became Consumer Culture (New York: HarperBusiness, c. 2004). The rebels of the ’60s, the baby boomers, talked much about changing the world and making it a non-materialistic utopia of peace and beauty, but as adults they have tacitly repudiated their early idealism. The authors lament this loss, rather like socialists forever insisting on the purity of a system that never quite works as it should, but they insist that we understand what happened through an analysis of the false ideas that have flourished since the ’60s.

Failing to think deeply enough and implement their convictions, counter-cultural radicals simply celebrated the wrong things—hippie attire, mindless music (today’s rap merely the latest manifestation), mind-altering drugs. They generally imagined that reality could be shaped in accord with one’s nostalgia or hopes for anarchical utopias. Radicals imagined they would save the world by “subverting” the dominant culture through “alternative” art, clothing, “appropriate technology,” organic food, “free range chicken,” fair trade coffee, voluntary simplicity, and protest songs. In fact, as the baby boomers moved into positions of power in various institutions, they brought “their hippie value system with them” (p. 197).

“When the Beatles sang ‘All you need is love,’ many people took it quite literally” (p. 71). Rather than deal with the nitty-gritty problems of poverty and illiteracy and injustice, rather than understand the importance of productivity and personal discipline, counter-cultural rebels followed the lead of folks like Theodore Roszak and fixated on what he called “the psychic liberation of the oppressed.” They swallowed aphorisms coined by the likes of Herbert Marcuse, with his curious admixture of Marx and Freud, who lamented “repressive tolerance,” a phrase that, Heath and Potter say, “makes about as much sense now as it did then” (p. 35). Which is to say it’s nonsense.

In short: critiquing mass society has failed to change it. The counterculture has majored in critiques for 40 years, but little has resulted from its efforts. Sanctimoniously denouncing various kinds of “commodification,” radicals have settled into comfortable echelons of privilege (working at “cool jobs,” especially in universities, in “cool cities” such as Seattle and San Francisco) appropriate for themselves as the new “creative” class, earning twice as much as the working class. Indeed, “Cool is one of the major factors driving the modern economy. Cool has become the central ideology of consumer capitalism” (p. 188). Consequently, “the modern no-collar workplace, with its casual dress codes and flexible work hours” looks for all the world “like a hippie commune under professional management” (p. 202).

Nation of Rebels takes seriously the counter-culture of the ’60s, and it merits thoughtful reading. There seems to be much truth in the book’s thesis that the impact of the boomers was secondary rather than primary, and the changes they wrought were harmful rather than helpful.

172 American Autobiographies: Buckley; Hillerman; Medved

In 1951, when William F. Buckley published God and Man at Yale, there were millions of ordinary “conservatives” who lacked an intellectually vigorous forum for their ideas. When Buckley soon thereafter launched National Review, they found both a forum and a spokesman who greatly shaped what is now arguably the dominant political position in the country. To understand the man who, for 50 years, has written and inspired an amazing array of writers and politicians, young and old, Buckley’s Miles Gone By: A Literary Autobiography (Washington, DC: Regnery Publishing, Inc., c. 2004) proves indispensable. Like the man himself, the autobiography is a bit unconventional, for Buckley simply strings together various previously written items to tell his story. “The design of this book,” he says, “is to bring together material I have written over fifty years, with an autobiography in mind” (p. xiii). Thus it is episodic rather than linear, refulgent with remembrances rather than chronological specifics. But the book is strangely effective, for one sees, through the passages presented, the world as Buckley saw it at very specific times. And one learns, while reading, who he is and how his ideas have shaped his life.

Buckley was blessed with virtuous parents. His father, he says, “was the most admirable man I ever knew” (p. 12). He prospered greatly, sired a large family, and presided over both business and family affairs with dignity and discernment. Importantly, his son remembers, he “was wonderful with children (up until they were adolescents; at which point . . . he took to addressing us primarily by mail, until we were safe again at eighteen)” (p. 35). He demonstrated “a constant, inexplicit tenderness to his wife and children, of which the many who witnessed it have not, they say, often seen the like” (p. 49). High praise for an archetypical “patriarchal” father! His mother, a vivacious and attentive woman, “never lost a southern innocence” (p. 51) and was ever determined to do “the will of God” (p. 52). “There were rules she lived by, chief among them those she understood God to have specified. And although Father was the unchallenged source of authority at home, Mother was unchallengeably in charge of arrangements in a house crowded with ten children and as many tutors, servants, and assistants” (p. 52). Amidst all the stresses and strains of caring for such a brood, she remained resolutely cheerful. Indeed, she refused to ever “complain; because, she explained, she could never repay God the favors He had done her, no matter what tribulations she might be made to suffer” (p. 54). His remarkable parents revered education, culture, and the Catholic faith, and they effectively reared their children accordingly.

Buckley’s collegiate education took place at Yale University, an experience recorded in God and Man at Yale, the book that brought him national attention (which I reviewed in issue #158 of my “Reedings.”) Twenty-five years later he was asked to write an introduction for an “anniversary edition” of the book, and now (looking back 50 years) “To young inquisitive friends, I say: Don’t bother to read the book, but do read the introduction” (p. 58), which is reprinted here. Trends evident at Yale, shortly after WWII, soon swept the country. Caving in to the fashionable notion that “all sides” of every issue deserve a hearing, insisting that “tolerance” and “diversity” are crucial components for academic respectability, most universities had lost their “mission.” A commitment to “academic freedom” had replaced their original raison d’être. But to Buckley, only a focused “mission” justifies the existence of any university!

A surprising amount of Miles Gone By is devoted to sailing and skiing. While I cannot share Buckley’s fascination with the former, I fully identify with the latter! The first day he skied (aged 29) he “thought seriously of abandoning journalism, my vendetta with the Soviet Union, my music, and my sailing, and settling down in Vermont, working five years to qualify as a ski instructor, and spending the balance of my life on the slopes” (p. 192). Fortunately he thought better of the idea. But thereafter he routinely took vacations in Switzerland and Utah, finding delight in both the beauty of the scenery and the challenge of the sport, skiing into his eighth decade. “I know of no sport, no hobby, no avocation, as indulgent as skiing in giving you exactly the combination you wish of challenge, relaxation, thrill, exhilaration” (p. 195). Amen!

In a fascinating section, entitled “People,” Buckley celebrates “Ten Friends”–David Niven, the superb actor; Ronald Reagan, the president; Henry Kissinger, the diplomat; Clare Boothe Luce, the congresswoman; Tom Wolfe, the novelist; Vladimir Horowitz, the pianist; Roger Moore, the movie producer; Alistair Cooke, the historian; Princess Grace, the movie star turned Princess of Monaco; and John Kenneth Galbraith, the liberal Harvard economist. What’s amazing about this list is their prominence and diversity. Like Will Rogers, Buckley seems to genuinely “like” people and successfully established lasting friendships with various sorts of them.

Ever readable, ever enlightening, this “literary autobiography” is a fitting testament to its author.

In Nearer, My God: An Autobiography of Faith (New York: Harcourt Brace & Company, c. 1997), William F. Buckley gives readers insight into his soul. Almost blissfully, he reports: “I was baptized a Catholic and reared as one by devoted parents whose emotional and intellectual energies never cloyed” (p. xx). His “mother was a daily communicant. Father’s faith was not extrovert, but if you happened on him just before he left his bedroom in the morning, you would find him on his knees, praying” (p. 4). Consequently, he declares: “My faith has not wavered, though I permit myself to wonder whether, if it had, I’d advertise it. . . . I wish I could here give to my readers a sense of my own personal struggle, but there is no sufficient story there to tell” (p. xx). Righteous examples, particularly parental, surely matter–eternally!

As a Catholic attending Yale, he found little to trouble his faith but much to dissipate his hope for higher education! Colleges such as Yale had, before WWII, abandoned any commitment to Christian doctrine, assuming that a decent percentage of pious professors would maintain a suitable “religious atmosphere” of some nebulous sort! “When I left Yale in 1950,” says Buckley, “I had become convinced that it, and presumably, other colleges like it were engaged in discouraging intellectual and spiritual ties to Christianity” (p. 36). Half a century later, this trend is distressingly evident even in the formerly Christian prep schools of New England, where “there is today another God, and it is multiculturalism” (p. 37). More broadly, and ominously, he thinks: “What has happened, in two generations, is the substantial alienation of the secular culture from the biblical culture” (p. 233). That process now gains speed and threatens the very foundations of our society.

Buckley’s own theological convictions are rooted in thinkers such as John Henry Newman and were invigorated by challenging, far-ranging conversations with the likes of Sir Arnold Lunn (a skiing companion), Whittaker Chambers, Russell Kirk, Richard John Neuhaus, Jeffrey Hart, Malcolm Muggeridge, Chuck Colson, and Eugene Genovese. Ever eclectic in his friendships, he seems able to draw and distill insights from some of the world’s finest thinkers. And it’s clear that “faith” to Buckley is primarily an intellectual conviction regarding the truth of Christian doctrine. Nearer, My God contains none of the “personal experiences” so central to evangelical memoirs, little of the “strangely warmed” heart moments pietists prize. But it does make clear the author’s conviction that “anyone who is looking for God, Pascal said, will find him” (p. 85). That Buckley has found God is most evident in this treatise.

A writer of a different sort, Tony Hillerman, tells his life story in Seldom Disappointed: A Memoir (New York: HarperCollins Publishers, c. 2001). Hillerman’s mysteries–The Blessing Way; Listening Woman; Skinwalkers; Coyote Waits, to name a few–are set in Navajo country and provide an effortless way to understand something of Navajo culture.

Born to an impoverished farm family in Oklahoma, Hillerman profited from the example of hard-working, devout parents. His father, he believes, literally worked himself to death and died young. His mother gave him an enduring example of courage and resolve. In the midst of depression and poverty, she refused to be daunted. To her, children “had nothing to worry about except maintaining our purity, being kind to others, saving our souls, and making good grades. With Papa’s help, she persuaded us that we were something special. We weren’t just white trash. Great things awaited us. Much was expected of us. . . . whining and self-pity were not allowed” (p. 46). Whatever happened, Mama would say: “Offer it up.” Give it to God and keep on keeping on! “We were born, we’d live a little while, and we’d die. Then would come joy, the great reward, the Great Adventure, eternal life” (p. 46).

Hillerman managed to graduate from high school and gain entrance to Oklahoma A&M, just as WWII was erupting. He soon joined the Army, went to Europe, and fought with his buddies through France and into Germany. Seriously wounded, losing an eye and walking with a limp thereafter, he received multiple decorations. All of this he describes with a wry, self-deprecating sense of humor, making light of his “heroism” and military life in general. He tested and confirmed the fact that there’s much “truth behind the axiom: ‘There are two ways of doing things. The right way and the Army way’” (p. 151). After months in various hospitals, he finally returned to Oklahoma, entered the University of Oklahoma, and studied journalism. Happy to maintain a “Gentlemanly C” grade point, he had a notably undistinguished academic career, though he profited from at least one journalism professor’s instruction regarding “tight” writing. Use the right words! Eliminate adverbs and adjectives!

More important than professors, however, was a woman he met at OU in his senior year! Marie Unzner instantly enchanted him, and he persuaded her to become his wife. She proved a great blessing, for she “had more confidence in my writing than I did” (p. 260). Ever cheerful and optimistic, she continually encouraged him to pursue his dreams. Whereas his parents had nurtured him early in life, setting him on the right track, heading toward “that Last Great Adventure, and understanding that the Gospels Jesus used to teach us were the road map to make getting there a happy trip,” the final half-century of his life was “filled with love, joy, and laughter by a wonderful wife, partner, and helpmate named Marie” (p. 320).

Degree in hand and a wife to care for, Hillerman sought work. The position he found was in Borger, Texas, located “sixty miles north of Amarillo on rolling, almost treeless tundra of the high end of the Texas Panhandle” (p. 179). A more inauspicious beginning for a fledgling writer could not be imagined. But he started working, covering local stories and (importantly) observing people in all sorts of situations. Decades later some of the characters in his novels are based upon some remarkably admirable people he knew in Borger. Soon he found a better job in Lawton, Oklahoma, then moved to Oklahoma City to work for the United Press. That led, in 1952, to an assignment as UP Bureau Manager in Santa Fe, New Mexico, where he would work for more than a decade.

While recording the news, Hillerman sensed a deeper longing to write more creatively, to be a novelist. Despite a growing family of six children (all but one adopted), with his wife’s encouragement he decided to change careers and moved to Albuquerque to pursue a degree in English at the University of New Mexico. Once there, the opportunity to teach in the journalism department opened up, and he settled into the academic life for 15 years. Ever discerning, he discovered that the faculty was divided into two groups. Pragmatic “Organization” folks, with whom Hillerman sided, taught hard sciences and history; they mainly wanted to help the university survive and secure their salaries. Their antagonists, the “Crazy Bus” crowd–mainly representing such departments as Education, Sociology, and Anthropology–”was a mix of 1930 Marxism, Nihilism, Hedonism, and disgruntlement” who greatly troubled the state’s taxpayers (p. 243). Infused with the vapors of the ’60s, they were out to change the world.

Hillerman found satisfaction teaching in the ’70s. “Students were interested, grade mania and the resulting grade inflation had barely emerged, the curse of political correctness had not yet paralyzed deans and department chairmen and corrupted the faculty” (pp. 262-263). He actually had “fun.” But the ’80s changed things. “The numbing dogma of PC hung over the campus, tolerating no opinions except the anointed ones. With free speech and free thought ruled out by inquisitors running Women’s Studies and the various minorities studies, the joy of learning had seeped out of students. With it went the joy of teaching. Time to quit” (p. 263). So he did! “One day after delivering a lecture so bad even I knew it was boring, I decided to quit academia and return to the real world” (p. 250). That meant writing and publishing novels!

Fortuitously, he found his métier–the mystery novel set in Navajo country. He also found agents and editors who enabled him to sell books. In time he flourished as his fans spread the news and his peers honored his craftsmanship with awards.

Michael Medved, known to many through his popular talk show, looks back on his life as a series of Right Turns: Unconventional Lessons from a Controversial Life (New York: Crown Forum, c. 2004). He structures the book with a series of 35 “lessons,” generally chronological but essentially thematic, to show how he has developed as a pundit, a very public intellectual, from a thoroughly radical leftist–opposing the Vietnam War and working for the ’72 McGovern campaign at its “Jewish desk”–to a deeply conservative Orthodox Jew, father and media figure. Importantly, he says: “This book isn’t about ‘my truth’; it’s about The Truth, to the extent I can apprehend and explain it” (p. 5). This puts him “counter to all trendy notions of moral relativism, which suggest that someone with different life experiences will inevitably reach different conclusions, and that these conclusions deserve no less respect than mine” (p. 5).

Medved was born in Philadelphia, the grandson of industrious Jewish immigrants. One of his grandmothers “grasped, and passed along, one of the greatest truths of life: it doesn’t matter how much you earn, so long as you spend less than you bring in” (p. 35). His parents soon moved to the Point Loma area of San Diego, where he went through the city’s public schools and imbibed liberal Democratic values from his parents. Such values were, however, early challenged by one of his uncles, Moish, an unusually erudite, self-educated and successful electrician, who was born in Ukraine in 1905. Taking him aside for a “man-to-man” talk, Uncle Moish warned young Michael against Communism, the “Scarlet Plague” that was ruining millions of people around the world, and “‘the people who are most likely to get sick, and who are going to suffer the most, are the brightest minds, the biggest idealists, the natural leaders of this world. They are people just like you'” (p. 47).

But young Michael hardly heeded (though he remembered) his uncle’s admonition for many years. While attending Yale, awash with radical students in the ’60s, he observed youngsters in the Students for a Democratic Society who vividly illustrated the “Scarlet Plague.” He also witnessed, as a sophomore, the impact of the drug-addled counterculture that swept through the university in 1966. Medved seemed temperamentally hostile to the “dopester dementia” and listened to a different melody, finding a healthy alternative by hitchhiking, almost every weekend, through sections of the “flyover world” disdained by the academic elitists. He also began, at Yale, a slow return to the faith of his fathers, Judaism, discovering, as he titles one chapter, “You Can Go Home Again.”

Following his graduation from Yale, he entered Yale Law School (getting acquainted with Hillary Clinton) but quickly decided he was not really interested in being a lawyer. So he returned to California, married, and enrolled in a writing program at the University of California, Berkeley, in 1972. Here he confirmed the truth that “Liberal Heroes Aren’t All Heroes” in the person of Ron Dellums, a “Castroite” congressman representing Berkeley and Oakland. Medved accepted a staff position in Dellums’ campaign and grew quickly disillusioned with his candidate, who “reminded me of another tall, lanky, hugely ambitious, humorless pol I had known (and disliked) years before: John Kerry” (p. 170).

Barely arrived in Berkeley, Medved experienced another wake-up call–his home was burgled. The police caught the thief, who was a career criminal routinely released to practice his craft at public expense. The typical Berkeley intellectual’s sympathy for criminals, evaluated from the perspective of a victim, was simply “mad.” The cops, Medved decided, not the UC professors, see things as they really are. Consequently, Medved abandoned, in the ’70s, the vacuous ideologies and “utopian promises of the youth counterculture, while embracing traditional Judaism, entrepreneurial adventure, cops, and even Christians” (p. 204). He also read “Alexander Solzhenitsyn’s harrowing masterpiece, The Gulag Archipelago” (p. 209), a timely gift from his uncle Moish. Realizing that the “Scarlet Plague” explained both the USSR’s gulag and the counterculture’s fanaticism, he felt “guilty and heartsick for my country and for the so-called peace movement in which I had played such an active part” (p. 213).

Relocating to Los Angeles, where his folks now lived, he wrote, with a friend, a successful book, What Really Happened to the Class of ’65? and gained entrée to the media world. He wrote more books and became a noted film critic, interacting on a regular basis with the Hollywood elite. He also moved steadily toward Orthodox Judaism, getting involved in synagogue activities and taking seriously the precepts of his faith. His childless first marriage had collapsed, and he now shared, with his second wife, Diane, the conviction that “children represented an explicit focus of our relationship, giving us a sense of purpose, of destination” (p. 297). They came to strongly oppose divorce and abortion, enlisting as partisans in the “culture war” that divides America.

Addressing this “war,” he said, in an off-the-cuff 1990 speech: “This is the very nature of the cultural battle before us. It is, at its very core, a war against standards. It is a war against judgment. Its proponents insist that the worst insult you can offer someone today is to suggest that he or she is judgmental” (p. 344). This is dramatically evident in the realm of art, where “ugliness has been enshrined as a new standard,” where “the ability to shock” is as admired as “the old ability to inspire” (p. 345).

Given the opportunity to do talk radio, Medved moved to Seattle determined to “inspire” listeners to embrace the “right” way he has found. This book certainly clarifies the message he wants to impart and enables one to understand the messenger.

171 Capitalism & Christians

Returning recently from a conference featuring some influential contemporary thinkers, I noticed a book in my library with essays by a number of them–The Capitalist Spirit: Toward a Religious Ethic of Wealth Creation (San Francisco: Institute for Contemporary Studies, c. 1990), edited by Peter Berger. One of the 20th century’s most influential sociologists, Berger was driven by the data to shift from an anti-capitalist to a pro-capitalist stance, and in the book’s foreword he explains that the most important thing about wealth is that it must be created. This has proven a difficult concept for many religious thinkers to either understand or embrace, for most of their pre-modern predecessors, living in relatively static societies, could envision justice only through distributing existing wealth. Like the 18th century mercantilists, today’s “zero sum” economists envision a world with finite resources that need to be properly shared.

Many 16th century thinkers, Martin Luther among them, resisted any re-casting of Medieval cosmology. But in time nearly everyone recognized that Copernicus and Kepler had rightly read the skies and set forth an accurate account of the way the world functions. Equally important economic insights came to light in the 18th century as the Industrial Revolution opened up new avenues for productivity and the creation of wealth. But many ethicists, rooted in a pre-modern philosophy, failed to craft their moral convictions to fit the new economy. “It is no wonder, then,” says Berger, “that so many religious thinkers have been anticapitalist and prosocialist in their instinctive inclinations” (p. 2). Having shared such inclinations for years, Berger sympathizes with them. But he finally realized how they misled him. They simply are not true.

Nor have they ever been! Nostalgic visions of the Early Church as a precursor of socialism–sharing “all things in common”–are unfounded. Socialistic assertions regarding the Early Church resemble Rousseau’s portrayal of the “noble savage” in North America. The distinguished ancient historian Robert M. Grant, in “Early Christianity and Capital,” concludes that, contrary to the assertions of Christian socialists: “The church both ancient and medieval respected private property. The 38th Article of Religion of the Church of England . . . simply follows the central tradition when it insists that ‘the Riches and Goods of Christians are not common, as touching the right, title and possession of the same; as certain Anabaptists do falsely boast.’ In an equally traditional manner, the article balanced this statement with the exhortation that ‘notwithstanding, every man ought, of such things as he possesseth, liberally to give alms to the poor, according to his ability’” (p. 28).

Contrary to those who winnow the Old Testament for their socialistic economics–often taking things such as the Jubilee Year out of context–David Novak (an eminent Jewish scholar) insists that “equality” in the Old Testament has meaning only “in the sphere of rectification, that is, the restoration of private property misappropriated in one way or another” (p. 32). Even “charity” was not emphasized, for it too often renders recipients passive and dependent. In fact, economic justice, “in accordance with the principles of the Covenant, is thus best accomplished by loans” (p. 38). And commercial loans, Jewish teachers decided, must be understood differently from agricultural loans. Thus loaning money for investment justified collecting interest, whereas agricultural loans (generally of a brief duration) did not. “In other words, the loan is not given because the borrower has nothing but the shirt on his back, so to speak. Rather, the loan is now more probably for the sake of investment, a risk taken by both lender and borrower in the hope that the future will yield a better income than the present. In this case, the need for the sabbatical year release from indebtedness, which in the agricultural context would make a loan into a charitable gift, would no longer be required” (p. 47).

Michael Novak’s essay, “Wealth and Virtue: The Development of Christian Economic Teaching,” shows how a select circle of 18th century Scottish “moralists” (David Hume and Adam Smith) understood the essence and importance of free enterprise capitalism, defending the proper pursuit of wealth as admirable and socially beneficial. They “sought to construct a new ethos for Western civilization and, indeed, the world” (p. 70). Both sought to alleviate the plight of the poor, and envisioned “the surge of spiritual independence and the extension of humane sympathies that would flow from the sway of a more free and beneficent regime” (p. 74). They particularly sought to replace the elitist, anti-capitalist position generally championed by intellectuals and artists with one favoring the bourgeoisie, which empowered ordinary people. Indeed, though often portrayed as advocating a ruthless “dog eat dog” economy, Adam “Smith’s discussion reminds one of Saint Thomas’s definition of love: to will the good of another” (p. 68).

George Weigel, renowned for his definitive biography of John Paul II, Witness to Hope, recounts the uneasy history of “American Catholicism and the Capitalist Ethic.” Whereas many Protestants have supported free enterprise capitalism, Catholics tended to critique it. Like Southern slave owners, enamored with the novels of Sir Walter Scott, they idealized the agrarian socioeconomic structures of the Middle Ages. Uneasy with the individualism evident in Protestant America, 19th century Catholics like Orestes Brownson condemned capitalism and proposed an ideal “Christian society.” Eminent Catholic clerics sided with the Knights of Labor in the 1880s, embracing its socialist prescriptions, and strongly supported FDR’s New Deal 50 years later. In the 1980s, despite mounting evidence to the contrary, Catholic bishops and academics generally denounced “Reaganomics” and free enterprise capitalism. However, in the aftermath of Vatican II, with its fresh Catholic openness to the modern world, came the “creation-centered social thought of John Paul II” (p. 96). From the highest authority came the endorsement of creative entrepreneurship. “Wealth creation,” to John Paul II, “is a specifically economic form of human participation in God’s abiding creativity, God’s sustaining care for his creation” (p. 96). It’s time, Weigel argues, for Catholics to embrace the free enterprise economy that has so uplifted the world and join John Paul II in making it Christian.

* * * * * * * * * * * * * * * * * * * *

In The Church and the Market: A Catholic Defense of the Free Economy (New York: Lexington Books, c. 2005), a Catholic historian, Thomas E. Woods, endeavors to counteract the anti-capitalist views of Christians who fail to see capitalism’s worth. The Industrial Revolution, often deplored by socialists because of its reliance on child labor and exploitative practices, was in fact a great boon for the working classes. Bad as it was, it was an improvement on what went before! “To say that the free market led to the destruction of some previously existing, harmonious community life is simply to defy historical testimony” (p. 165). Child labor, for example, was no new thing in 1800! Farm kids had worked long and hard from time immemorial. To work hard in factories was not a major change. What changed, as economic conditions improved during the 19th century, was the ultimate abolition of “child labor” and the radical improvement of children’s living conditions–life expectancy, nutrition, education, etc.

Woods especially urges readers to seriously study economics and to discover truths discerned by 15th and 16th century Spanish Scholastics such as Juan de Mariana, as well as 20th century Austrians such as Ludwig von Mises. Whereas modern socialists, enthralled with Karl Marx, have embraced an illusion, the truth-seeking economists have carefully studied man’s nature and prescribed the best ways for his flourishing. The Scholastics and Austrians, Woods says, both “sought to ground economic principles on the basis of absolute truth, apprehensible by means of reflection on the nature of reality” (p. 216). Prices, for example, rightly reflect market demand. Consumers–not the labor expended in production–should determine the value of various goods. Whenever the state intervenes, artificially setting “just prices,” dire if unintended consequences follow. Just wages are also best set by the marketplace. To Domingo de Soto, writing in the 16th century, workers who agree to a given salary are fairly paid when their employer pays as promised. Wages rise when wealth is created, and the perennial socialist impulse to dictate “fair wages” generally militates against the very creative process that justifies higher salaries.

Money and banking, of course, are major economic concerns. We Americans live under the rule of the Federal Reserve, which, by issuing “fiat currency” basically “creates money out of thin air” (p. 93). Since it was founded in 1913, “the dollar has lost about 95 percent of its value” (p. 93). While claiming to control inflation, the “Fed” has, in fact, caused it! There are major moral problems with fiat currency, Woods argues, for it is, in fundamental ways, “not conceptually distinct from simple counterfeiting” (p. 97). The Spanish Scholastics knew this centuries ago. They also knew that some of the traditional teaching regarding usury could not address the dynamic, commercial economy of the world emergent in the 16th century. Indeed, “Catholic theologians had overturned virtually all of the older arguments against usury–at the very time that Martin Luther was busily attempting to rehabilitate them” (p. 114).

Regarding the welfare state, Woods invokes a recent warning by Pope John Paul II: “By intervening directly and depriving society of its responsibility, the Social Assistance State leads to a loss of human energies and an inordinate increase of public agencies, which are dominated more by bureaucratic ways of thinking than by concern for serving their clients, and which are accompanied by an enormous increase in spending. In fact, it would appear that needs are best understood and satisfied by people who are closest to them and who act as neighbors to those in need” (p. 147). But the welfare state directly harms neighborhoods and families. And it undermines private property rights–rights that Pope Leo XIII branded “sacred and inviolable” (p. 195).

* * * * * * * * * * * * * * * * * * * * *

Woods anchors his position regarding “the church and the market” in the scholarly work of Alejandro A. Chafuen, Faith and Liberty: The Economic Thought of the Late Scholastics (New York: Lexington Books, c. 2003). The popularity of Max Weber’s thesis, yoking capitalism and Calvinism, has obscured the number of Catholic philosophers who carefully situated free enterprise capitalism within the natural law teachings of the Church. “Our analysis of the Schoolmen’s writings,” says Chafuen in his conclusion, indicates “that modern free-market authors owe the Scholastics more than they realize. The same can be said for Western civilization” (p. 159).

This meant they stressed the sanctity of private property. Noting that many of Jesus’s associates “were quite wealthy for their times” (p. 32), they “declared it was heresy to say that those who have property cannot enter the kingdom of heaven” (p. 33). “According to [Juan de] Molina, private property may have existed even before original sin, since in that state, men could agree by common consent to divide the goods of the earth. The commandment ‘Thou shalt not steal’ implies that the division of goods does not pervert natural law” (p. 36). One scholar says that for these Scholastics “‘the right to property was an absolute right that no circumstances could ever invalidate'” (p. 42). This, Chafuen says, is because: “Private property is rooted in human freedom, which is founded in human nature, which, like any other nature, is created by God. Private property is the essential prerequisite for economic freedom” (p. 160).

When they considered “public finance,” the Scholastics cautioned against government involvement in economics. “To believe in private property means to believe in limited government” (p. 132). Taxes should be minimal. The budget should be balanced. The currency should never be debased as a means of redistributing wealth. Administrative officials should not be allowed to grow rich at public expense. More than anything else, high taxes produce poverty. “‘Taxes are commonly a calamity for the people and a nightmare for the government'” said Juan de Mariana. “‘For the former, they are always excessive; for the latter, they are never enough, never too much'” (p. 57).

Contrary to socialists, for whom the “labor theory of value” of commodities is an article of faith, Scholastics trusted the marketplace to establish fair prices. Commerce and trade are necessary for a healthy society. Surviving through subsistence farming and barter economics condemns folks to perpetual indigence. To Molina, one should not scoff at different prices for the same goods in different areas, for “‘the just price of goods depends principally on the common estimation of the men of each region. When a good is sold in a certain region or place at a certain price (without fraud or monopoly or any foul play), that price should be held as a rule and measure to judge the just price of said good'” (p. 75). We value goods insofar as they are useful to us. It’s their usefulness–not the effort invested into making them–that determines what we’re willing to pay. Efforts to fix prices, through monopolistic controls established by either entrepreneurs or workers, are harmful and wrong. Wages, the Late Scholastics taught, should be set by the marketplace, where a “just” wage is whatever a worker freely accepts. A doctor’s wages, it follows, will be higher than a garbage collector’s, for we are willing to pay more for medical care than manual labor.

In all their works, the Scholastics sought to clarify the nature of justice for all–and especially for ordinary folks. “The protection of private property, the promotion of trade, the encouragement of commerce, the reduction of superfluous government spending and taxes, and a policy of sound money were all destined to improve the condition of the workers. They recommended private charity as a way to alleviate the sufferings of those who could not work. According to the Late Scholastics, and in agreement with the Holy Scriptures, the rich are under obligation to help the poor. Money could be better used if the rich would reduce their superfluous spending and increase their alms” (p. 110).

Still worth reading, to understand Evangelical economic thought, is Craig M. Gay’s With Liberty and Justice for Whom? The Recent Evangelical Debate over Capitalism (Grand Rapids: William B. Eerdmans Publishing Company, c. 1991), originally written as a Ph.D. dissertation under Peter Berger’s guidance. As one would expect, this work is detailed, carefully documented, and quite helpful for anyone wanting to hear different voices from within Evangelicalism. Gay first notes the growing influence of the “New Class” intellectuals within Evangelicalism who profit from and thus endorse the leftist planks of the welfare state. This “New Class,” says Peter Berger, “‘rhetorically identifies its own class interests with the general welfare of society and especially with the downtrodden. . . . This is especially so because the knowledge class has such an interest in the welfare state, which is ostensibly set up on behalf of the poor and of other disadvantaged groups'” (p. 189). Led by “radicals” such as John Howard Yoder, Jim Wallis and Ron Sider, leftist Evangelicals denounce capitalism and America’s “oppressive” society. “Jim Wallis has stated, for example, that ‘overconsumption is theft,’ and Ronald Sider has insisted that ‘an affluent minority devours most of the earth’s non-renewable resources'” (p. 31). Anabaptist thought undergirds much of their protest, and they clearly long to establish their vision of the “kingdom of God” in this land. This should come through redistribution–taxes on the rich funding programs for the poor, and legislation securing entitlements and establishing various kinds of economic, racial, and sexual “equality” everywhere.

Clark Pinnock, closely associated with Wallis in the ’70s, later renounced his radical views, stating: “I remember being asked if I realized the Marxist content of what we were saying . . . and being puzzled by the question. . . . It seemed reasonable to think of the rich as oppressors, and the poor as their victims. The Bible often seemed to do the same thing. It was obvious to me that the welfare state needed to be extended, that wealth ought to be forcibly redistributed through taxation, and that the third world deserved reparations from us, that our defense spending was in order to protect our privilege, and the like” (p. 36). What’s now clear to Pinnock and other scholars is the Marxist influence on the Evangelical Left.

Rejecting Marxism and defending capitalism is the Evangelical Right, represented by Harold Lindsell, for years the editor of Christianity Today, and Ronald Nash, an influential Reformed philosopher, who taught at Westminster Theological Seminary. “In a sense,” Gay says, “those on the right have become traitors to the New Class” (p. 193). Thus they rarely if ever get invited (as does Jim Wallis) to high profile meetings of the inner circle of opinion shapers in New York and Washington, D.C. They may be intelligent, but they’re not accredited members of the reigning intelligentsia that controls the media and universities.

Those on the Right are, Stephen Brint says, “‘blue-collar workers, small-business people, and farmers'” (p. 191). They tend to be older, less educated, and live in what’s now called “red” states. Free market capitalism, they insist, provides the best system yet developed to produce and distribute goods. The world is far better off as a result of modern capitalism. Wealth is created and spread abroad through free trade, and “in such an economy, no man becomes rich by oppressing another but rather by helping others” (p. 70). Thinkers on the Right anchor their defense of capitalism in the natural law. Given our nature, it’s the best economic system. To Lindsell, the free enterprise approach is approved by God’s Word and “is binding on all men everywhere.” Divinely ordained, “it is normative, it will work, and it will prove itself to be superior to socialism, which can only be validated by denying what God has revealed and can only function by destroying the foundations on which Western culture has been built” (p. 100).

Neither Left nor Right is the “Evangelical Center” that views “capitalism as a cause for concern.” Folks like Carl F.H. Henry (in The Uneasy Conscience of Modern Fundamentalism) and Bob Goudzwaard (in Capitalism and Progress) represent, for Gay, the evangelical mainstream. Henry clearly rejected Marxism, but his concern for social justice led him to criticize aspects of modern American capitalism, and “Goudzwaard has argued that the crisis of Western civilization . . . has been precipitated by the idolization of progress in the modern period, a problem linked to the institutionalization of modern capitalism” (p. 136). Such “mainstream” thinkers want to preserve valuable aspects of free enterprise capitalism while encouraging governmental intervention to mitigate its excesses and provide basic welfare needs for all peoples.

Gay concludes his book with two chapters evaluating what he has described, providing the reader a helpful perspective. “This is the best survey of evangelical thought about capitalism that I know of,” says Goudzwaard, and I concur. It still merits attention, nearly 20 years after it was written.

170 Disconsolate Brits

Recent laments from some eminent British writers provide a somber appraisal of their nation’s current conditions.  As ever, one must put such complaints in perspective, but their concerns certainly merit reflection.  Peter Hitchens (not to be confused with his brother Christopher) is a provocative journalist who compiled a collection of essays entitled The Abolition of Britain:  From Winston Churchill to Princess Diana (San Francisco:  Encounter Books, 2000).  As the subtitle indicates, Hitchens repeatedly contrasts Winston Churchill and Princess Diana (and their markedly different funerals) to compare the Britain of 1997 with that of 1965.  “The dead warrior was almost ninety, full of years and ready to die.   He represented the virtues of courage, fortitude and endurance; he was picturesque rather than glamorous,” whereas Diana, dying young in an accident, “was snatched from life in the midst of youth, beauty and glamour.  Her disputed virtues were founded on suffering (real or imagined) and appealed more to the outcasts and the wounded than to the dutiful plain heart of England” (p. 17).   More broadly, in society, the independence and tenacity of Churchill gave way, during the last third of the 20th century, to a celebrity culture curdling in an ethos of sentimentalism and victimization evident in England’s response to Princess Diana.

This cultural change has been aided by the erosion of historical knowledge.  In an essay entitled “Born Yesterday,” Hitchens laments the demise of  historical perspective in a land where “all kinds of rubbish are blown by the wayward winds of modern education and popular ‘culture'” (p. 46).  In the schools, the study of history has shifted from knowledge to “skills.”  What is studied or learned, say the educationists, is less important than asking questions and (especially) empathizing with those mistreated, for various reasons, in either the past or present.  Consequently, many traditional “heroes,” particularly of the military sort, are portrayed as villains, because they fought rather than appeased their enemies.  There is, in fact, a “belittling of the Second World War” in the current curriculum (p. 60).  For example, a 1995 videotape distributed to the schools to commemorate VE-day “mentioned Churchill only for a few seconds, and then to say he lost the 1945 election” (p. 60).

What’s encouraged in the schools is emotivism, especially self-righteous wrath regarding racists, sexists, or capitalists–all pilloried as oppressors of the weak and marginalized.  “The sort of topics recommended” by the educationists “have a weary familiarity for anyone acquainted with the Marxist interpretation of the twentieth century:  ‘the working classes’, ‘women in society’, ‘imperialism’ and so on” (p. 56).  Re-phrased, Marxist thought fuels Britain’s “class war.”  But Marxism is only an updated version of the radicalism unleashed in 1789 by the French Revolution.  For nearly two centuries the British resisted the radical, totalizing, Jacobin ideology embraced by many Europeans in the 19th and 20th centuries that now “seeks to extinguish Britain, not by revolution, but by stealth” (p. 300).  Today’s leftists, intent on cultural rather than economic revolution, believe “education should be used to eradicate privilege and elitism, to spread the gospel of the new society in which everyone (and everything) is equal, a sort of concrete embodiment of that hideous song ‘Imagine’, which has become the hymn of sixties boomers” (p. 64).

The triumph of this trend was encapsulated by Prime Minister Tony Blair who, in 1997, the year of Princess Diana’s funeral, said:  “I am a modern man.  I am part of the rock and roll generation–the Beatles, colour TV, that’s the generation I come from” (p. 1).  Indeed, much about Blair and the leftists who currently control the nation elicits Hitchens’ scorn.  He notes, for example, how prophetically Aldous Huxley, in Brave New World, envisioned “the cynical, puerile, bubblegum election campaigns fought by Bill Clinton in 1992 and 1996, and by Tony Blair in 1997” (p. 139).  That the Beatles and TV–not Shakespeare and Handel–have shaped a “modern man” like Blair cannot but dismay cultural conservatives.

In “Hell Freezes Over” Hitchens lampoons recent developments in the Church of England.  “Hell was abolished,” he writes, “around the same time that abortion was legalized and the death penalty was done away with” (p. 105).  Eminent ecclesiastics, such as Bishop John Robinson (of Honest to God fame) led the way on every front in the war against traditional, orthodox Christianity.  “The Ten Commandments, once blazoned behind every altar in the kingdom, were frequently left out of the Church of England’s Communion service . . . .  The King James version of the Bible, with its majestic but sometimes frightening language, was rejected by modernizers who sought to make it more ‘accessible’, replaced by new versions which nonetheless somehow lacked the old scriptures’ force” (p. 106).  The ancient majestic liturgy of the Book of Common Prayer was subtly subverted by “alternative” services.  Hymns disappeared.  “At the funerals of the young, entirely secular pop songs are often played as substitutes for hymns.  In the last few years, mourners have taken to telling jokes during funeral eulogies, as if they were at a wedding” (p. 126).  And the ancient Gospel of personal redemption from sin through the work of Jesus Christ was replaced by a Social Gospel urging folks to support political activism of a leveling, leftist sort.  Consequently, the Church of England is hardly more than an empty shell–emptied of theology, worship, beauty, and (much to the dismay of the “reformers”) people.

* * * * * * * * * * * * * * * * * * * *

Theodore Dalrymple shares Hitchens’ evaluation of his native land:  “In the past few decades, a peculiar and distinctive psychology has emerged in England.  Gone are the civility, sturdy independence, and admirable stoicism that carried the English through the war years.  It has been replaced by a constant whine of excuses, complaints, and special pleading.  The collapse of the British character has been as swift and complete as the collapse of British power” (p. 5).  Dalrymple is a medical doctor who has worked for the past two decades in an inner-city hospital and prison in London.  Though without religious faith, he seems to sense a humanitarian “call” to work among what he calls the “underclass.”  Along with his medical work, he has also flourished as an essayist, and he is, Peggy Noonan says, “the best doctor-writer since William Carlos Williams.”  Some of his essays appeared in Life at the Bottom:  The Worldview That Makes the Underclass (Chicago:  Ivan R. Dee, c. 2001).  “A specter is haunting the Western World,” he says, “the underclass” (p. vii).  His first sentence, of course, replicates Marx’s opening line in The Communist Manifesto, substituting “underclass” for “communism.”

This “underclass” Dalrymple deals with on a daily basis demonstrates the power of pernicious ideas, for “the social pathology exhibited by the underclass” has been promulgated by an intelligentsia intent on denying free will and personal responsibility, promoting a fashionable moral relativism.  Educators discount correct grammar or spelling.  Artists claim there is no higher or lower culture.  Highly educated folks dress and talk like the less educated “workers” they feign to understand and emulate.  “Differences” between cultures and behaviors there may be, but nothing is qualitatively better than anything else, nothing is absolutely right or wrong.  Consequently, “the aim of untold millions is to be free to do exactly as they choose and for someone else to pay when things go wrong” (p. 5).  Sadly enough, those responsible for such behavior, the elite “intellectuals were about as sincere as Marie Antoinette when she played the shepherdess” (p. xi).  Their play-acting “is a crude and simple one, a hangover from Marxism:  that the upper and middle classes are bad; that what has traditionally been regarded as high culture is but a fig leaf for middle- and upper-class oppression of the working class; and that the working class is the only class whose diction, culture, manners, and tastes are genuine and authentic” (p. 81).

Thus he cites Shakespeare’s King Lear to clarify the book’s theme:  “This is the excellent foppery of the world, that when we are sick in fortune, often the surfeits of our own behavior, we make guilty of our disasters the sun, the moon, and stars; as if we were villains on necessity, fools by heavenly compulsion, knaves, thieves, and treachers by spherical predominance, drunkards, liars, and adulterers by an enforced obedience of planetary influence; and all that we are evil in, by a divine thrusting on.  An admirable evasion of whoremaster man, to lay his goatish disposition on the charge of a star!” (I, ii).  Dalrymple beholds Shakespeare’s truth on a daily basis.  Prisoners he treats routinely use the passive voice when describing their crimes.  Thus three men who stabbed others “used precisely the same expression when describing to me what happened.  ‘The knife went in,’ they said” (p. 6).  They weren’t responsible!  The knife simply went in, killing a person–acting on its own, one assumes!  Another prisoner, a car thief, claimed he could not stop stealing and blamed the doctor for not stopping him!  These ill-educated criminals are, without knowing their source, voicing ideas spawned by some of the 20th century’s most powerful ideologies–”Freudianism, Marxism, and more recently sociobiology–in denying consciousness any importance in human conduct” (pp. 22-23).

And the criminals know how to use criminologists’ rhetoric to legitimate their crimes!  “The great majority of the theories criminologists propound lead to the exculpation of criminals,” Dalrymple says, “and criminals eagerly take up these theories in their desire to present themselves as victims rather than victimizers” (p. 218).  So too they latch on to the ideas of social reformers, leftist philosophers and politicians who call for economic egalitarianism and denounce the wealthy.  The thieves he deals with generally “believe that anyone who possesses something can, ipso facto, afford to lose it, while someone who does not possess it is, ipso facto, justified in taking it.  Crime is but a form of redistributive taxation from below” (p. 219).  He astutely connects the fashionable theories of the professors and journalists with the lawlessness on the streets, noting that “those who propagate the idea that we live in a fundamentally unjust society also propagate crime” (p. 220).

In an essay entitled “What Is Poverty?” Dalrymple insists the real poverty in England is moral rather than economic.  Sadly enough, the Welfare State, designed to eliminate material “poverty,” has incubated a more devastating spiritual poverty.  He notes that young medical doctors (many of them the children of immigrants) who join his hospital staff initially think that their patients are oppressed by society and in need of various kinds of assistance.  “By the end of three months,” he says, “my doctors have, without exception, reversed their original opinion that the welfare state, as exemplified by England, represents the acme of civilization” (p. 142).  After working with London’s tax-subsidized underclass, a Filipino doctor said:  “life is preferable in the slums of Manila” (p. 142).  Dalrymple himself, having worked for a time in Tanzania and Nigeria, declares that  “nothing I saw–neither the poverty nor the overt oppression–ever had the same devastating effect on the human personality as the undiscriminating welfare state.  I never saw the loss of dignity, the self-centeredness, the spiritual and emotional vacuity, or the sheer ignorance of how to live that I see daily in England.  In a kind of pincer movement, therefore, I and the doctors from India and the Philippines have come to the same terrible conclusion:  that the worst poverty is in England–and it is not material poverty but poverty of soul” (p. 143).

* * * * * * * * * * * * * * * * * * * *

Theodore Dalrymple revisits many of the same issues in a more recent collection of essays:  Our Culture, What’s Left of It:  The Mandarins and the Masses (Chicago:  Ivan R. Dee, c. 2005), and his jaded pessimism grows apace.  Civilization, he notes, is a terribly fragile thing, as the horrors of the 20th century demonstrate.  And in London, as the 21st century begins, he’s witnessing its collapse–a collapse caused by nihilistic intellectuals who, to this point, have not yet suffered the dire consequences evident in the inner-city.  “Having spent a considerable portion of my professional career in Third World countries in which the implementation of abstract ideas and ideals has made bad situations incomparably worse, and the rest of my career among the very extensive British underclass, whose disastrous notions about how to live derive ultimately from the unrealistic, self-indulgent, and often fatuous ideas of social critics, I have come to regard the intellectual and artistic life as being of incalculable practical importance and effect” (p. xi).  Sadly enough, economists, novelists, film directors, journalists, and rock stars are waging a relentless war on the very innards of civilization, for barbarism begins, as Ortega y Gasset said, with the collapse of standards.

In a variety of ways, modern intellectuals have dismantled the barriers that restrain evil behaviors.  “In the psychotherapeutic worldview to which all good liberals subscribe, there is no evil, only victimhood” (p. 260).  Justify, as does George Soros, the legalization of drugs, and drug abuse soon shatters the delicate social bonds of family and neighborhood.  Encourage folks to “do your own thing,” and financially subsidize them with the welfare state, and all kinds of destructive things transpire!  Consequently, the nation that in 1921 recorded only one crime for every 370 inhabitants suffered one for every 10 in 2001.  England has become, especially since WWII, a distressingly crime-ridden land.  Fathers, who once accepted the responsibilities of caring for children, now knowingly abandon them “to lives of brutality, poverty, abuse and hopelessness” (p. 13).  Social workers have replaced fathers, freeing men to live as perpetual adolescents, forever seeking adventures and entertainments, “petulant, demanding, querulous, self-centered, and violent” when frustrated (p. 14).  They’ve simply embodied the fashionable theories of the intelligentsia, whose notions have mounted “a long march not only through the institutions but through the minds of the young.  When young people want to praise themselves, they describe themselves as ‘nonjudgmental.’  For them, the highest form of morality is amorality” (p. 14).

Interestingly, Dalrymple recurrently stresses the importance of dress!  How one looks seems mysteriously linked to how one acts and who one is.  With tongue (slightly) in cheek, he even suggests that tattoos cause crime!  He says this because virtually all the prisoners he treats sport a bewildering variety of tattoos.  In his younger days, he resisted any notion that appearances matter.  He “had assumed, along with most of my generation unacquainted with real hardship, that scruffy appearance was a sign of spiritual election, representing a rejection of the superficiality and materialism of bourgeois life.”  Wealthy artists and slovenly professors once seemed avant-garde and stylish.  Older and wiser now, he says, “I have not been able to witness the voluntary adoption of torn, worn-out, and tattered clothes–at least in public–by those in a position to dress otherwise without a feeling of deep disgust.  Far from being a sign of solidarity with the poor, it is a perverse mockery of them; it is spitting on the graves of our ancestors, who struggled so hard, so long, and so bitterly that we might be warm, clean, well fed, and leisured enough to enjoy the better things in life” (p. 26).

Those feigning to reject bourgeois values think themselves (in accord with Marx) champions of the proletariat.  Virtually all modern intellectuals claim to identify with and support the poor, the marginalized, the disadvantaged.  Focusing on this, in an essay entitled “How–and How Not–to Love Mankind,” Dalrymple compares Karl Marx with Ivan Turgenev.  Both men were born in 1818 and died in 1883.  “Turgenev saw human beings as individuals always endowed with consciousness, character, feelings, and moral strengths and weaknesses; Marx saw them always as snowflakes in an avalanche, as instances of general forces, as not yet fully human because utterly conditioned by their circumstances.  Where Turgenev saw men, Marx saw classes of men; where Turgenev saw people, Marx saw the People.  These two ways of looking at the world persist into our own time and profoundly affect, for better or worse, the solutions we propose to our social problems” (p. 77).  Consequently, in Marx’s writings “we enter a world of infinite bile–of rancor, hatred, and contempt–rather than of sorrow and compassion” (p. 83).  Latent in his Communist Manifesto is the carnage wrought by his followers in the last century.  Millions died in the gulags.  And millions more today languish, dying spiritually, in the darkening eddies of the Marxist-inspired modern welfare state.

Dalrymple’s essays touch on many themes I’ve not mentioned.  His essays on Shakespeare and Virginia Woolf, for example, indicate his concern for literary culture.  His observations on the differences between Hindu and Muslim immigrants are well worth pondering.  He is, Roger Kimball says, “the Edmund Burke of our age, eloquently anatomizing the moral depredations of that pseudo-enlightenment which has left large tracts of Western Society the province of thugs, social workers, liberal bureaucrats, and other enemies of civilization and the ordered liberty upon which it depends.”

* * * * * * * * * * * * * * * * * * * * * * *

Alice Thomas Ellis is one of England’s finest contemporary novelists.  She is also a devout, thoroughly traditional Roman Catholic.  Consequently she wrote, for the periodical The Oldie, some short, trenchant columns packed with her distaste for things happening in her Church that were recently published as:  God Has Not Changed:  The Assembled Thoughts of Alice Thomas Ellis (London:  Burns & Oates, 2004).  The churches that have emptied, during the past 40 years, did so for a reason–the ancient Faith has been jettisoned by a clergy more intent on being well-liked and respected than on teaching the truth.  Many of them “are too nervous to mention their beliefs–if they’ve even got them any more–and subject us instead to anodyne twaddle about their own experience” (p. 65).  Allegedly trying to reach the young, they have failed and in the process alienated loyal, older folks like herself.   No longer real believers, they’re much like butchers “inclined to vegetarianism” (p. 17) who lack the decency to change vocations!

She’s particularly distressed with the allegedly “Christian” feminists agitating for power and preeminence in the Church.  “A group recently carted round a church crucifix with a female on it–happily not a real one–referring to the curious thing as Jesa Crista.” Says Ellis:  “Sheer, pure nuttiness can go no further.  Never mind it is blasphemous, it is silly to suggest that historical figures can change sex” (p. 2).  She’s equally critical of those who reduce the Gospel to the most fashionable “social justice” movement.  What she calls “the Red Guard of the Church, in the wake of Vatican II” has effectively “completed the work of destruction begun in the Reformation” (p. 33).  Indeed, she warns, the “humanist protestantism to which the liberals incline is a first dip in the sea of atheism” (p. 50), and the Church is sinking rapidly into its depths.  Ellis is most probably too pessimistic, but her verbal darts deftly call into question certain postures and pronouncements of contemporary churchmen.  And one cannot but smile as she skewers some of the more outrageous fads and heresies afflicting the Church.

169 Original Intentions – Founding Fathers

As a “new nation” America was uniquely shaped during the first half-century of her existence. To Daniel Webster, the forging of the Constitution was absolutely central to that process. “Hold on to the Constitution of the United States of America and the Republic for which it stands–what has happened once in six thousand years may never happen again,” said Webster. “Hold on to your Constitution, for if the American Constitution shall fail there will be anarchy throughout the world.” The genius of this Constitution, M.E. Bradford argued, lies in the Founding Fathers’ Original Intentions: On the Making and Ratification of the United States Constitution (Athens: The University of Georgia Press, c. 1993).  Bradford, who died the year this book was published, was a professor of English at the University of Dallas, a very traditional Catholic university. Like Richard Weaver, with whom he has much in common, he belonged to the “Southern agrarian” school and considered himself primarily a “rhetorician.” He devoted his scholarly life to understanding the American way. As is evident in his important earlier work, A Worthy Company: Brief Lives of the Framers of the United States Constitution, Bradford especially sought to show the deeply Christian (rather than secular Enlightenment) commitments of the men who birthed this nation.

Original Intentions is a collection of lectures Bradford delivered at various law schools (e.g. The University of South Carolina) and universities (e.g. Dartmouth College) during 1976, the “bicentennial” year. They rest upon a thorough investigation of the primary sources–especially the records of influential persons largely unknown today but influential in that era. Several of the lectures deal with the debates that took place in states (Massachusetts; North Carolina; South Carolina) considering ratification following the convention. In a most helpful foreword, the distinguished historian Forrest McDonald identifies two themes that weave their way throughout the lectures. First, Bradford argued that the Constitution established a clearly, indeed severely limited government. Second, he repeatedly employs the English philosopher Michael Oakeshott’s distinction between “nomocratic” and “teleocratic” readings of the document.  According to the nomocratic reading, McDonald says, the Constitution “is primarily a structural and procedural document, specifying who is to exercise what powers and how. It is a body of law, designed to govern, not the people, but government itself; and it is written in language intelligible to all, that all might know whether it is being obeyed” (p. xii). For fully 150 years this nomocratic understanding generally prevailed. Since WWII, however, a teleocratic view has captivated the nation’s law schools and courts “and has all but destroyed the original Constitution” (p. xii).

In 1787, there were men like Alexander Hamilton and James Madison who wanted to establish a strong, centralized government. Madison, Bradford shows, was hardly the “father of the Constitution,” for his “Virginia Plan” was quickly rejected as the majority of delegates insisted on preserving significant roles for the 13 states. In fact, they almost disbanded the convention until a series of compromises brought into being a much more modest compact than Madison envisioned. The committee that finally drafted the document was chaired not by Madison but by John Rutledge of South Carolina and closely followed the proposals of Connecticut’s Roger Sherman, who believed the “objects of union . . . were few: (1) defense against foreign danger; (2) control of internal disputes and disorders; (3) treaties; (4) foreign commerce, and a revenue to be derived from it” (p. 11).

Philadelphia in 1787 was quite unlike Paris following the 1789 Revolution. The Americans drafted a document designed to establish “a more perfect union,” but not an absolutely perfect nation.  While abstractions like “Liberty, Equality, Fraternity” may arouse emotions, catchy slogans no more establish a sound republic than New Year’s resolutions establish a good character. The American Constitution “is more concerned with what government will not do for each of us than with the positive description of acceptable conduct, which is left to local and idiosyncratic definition–to society, local customs, and tested ways. Most important, it is not about enforcing the abstract ‘rights of man’ or some theory of perfect justice and aboriginal equality, not even with the Bill of Rights added to it” (p. 13).

The French approach, on the other hand, stressing the “rights of man,” Sir Herbert Butterfield wisely noted, illustrates the modern endeavor to “make gods now, not out of wood and stone, which though a waste of time is a fairly innocent proceeding, but out of their abstract nouns, which are the most treacherous and explosive things in the world” (The Englishman and His History [Archon Books, 1970], p. 128). America’s Constitution, conversely, contained few “abstract nouns,” concentrating instead (following the British example) on the “old liberties” familiar to English-speaking peoples. The Common Law jurists, such as Coke and Blackstone, not the radical philosophes, such as Diderot and Rousseau, were their authorities. “John Adams, especially, admired the fundamental law of Great Britain, describing it as ‘the most stupendous fabric of human invention’ and a greater source of ‘honor to human understanding’ than any other artifact in the ‘history of civilization'” (p. 28). And Virginia’s Patrick Henry agreed, touting the British system as ‘the fairest fabric that ever human nature reared’ (p. 31).

“It is,” Bradford concludes, “impossible to understand what the Framers attempted with the Constitution of the United States without first recognizing why most of them dreaded pure democracy, judicial tyranny, or absolute legislative supremacy and sought instead to secure for themselves and their posterity the sort of specific, negative, and procedural guarantees that have grown up within the context of that (until recently) most stable and continuous version of the rule of law known to the civilized world: the premise that every free citizen should be protected by the law of the land” (p. 32).

Bradford’s lecture on “Religion and the Framers: The Biographical Evidence,” reveals how profoundly wrong-headed is the modern judiciary’s “separation of church and state.” Anyone deeply rooted in the primary sources, he insists, cannot but recognize and revere the deeply Christian beliefs of some 95 percent of “the 150 to 200 principal Founders of the Republic” (p. 88). In their private papers, wills and ars moriendi, they routinely referred “to Jesus Christ as Redeemer and Son of God” (p. 89). Many of them, including Patrick Henry and George Washington, opposed Jefferson’s moves to disestablish the church in Virginia. Central figures in the making of the new nation–including Elias Boudinot, Roger Sherman, Charles Cotesworth Pinckney, Luther Martin, and John Dickinson–were deeply devout and zealous Christians. To portray the Framers as deists, a la Jefferson and Franklin, is, Bradford declares, egregiously wrong. It is, however, the typical textbook story foisted upon the public these days.

Turning to the post-Civil War Reconstruction Amendments, Bradford argues they did not significantly change the “nomocratic” essence of the 1787 Constitution. But since 1945 these amendments (and especially the 14th), through the doctrine of “incorporation,” have been increasingly used to make the Constitution “a teleocratic instrument: a law with endlessly unfolding implications in the area of personal rights” (p. 104). This has been done through “the shoddy scholarship of the Warren Court,” amply evident in an opinion of Justice Potter Stewart, who selectively cited “bits of speeches that appear to support his views and especially radical language contained in clauses rejected by Congress as a whole” (p. 118). Consequently, “in the end we get Chief Justice Warren saying that ‘the provisions of the Constitution are not time worn adages or hollow shibboleths . . . [but] vital living principles.’ And we also get Warren’s apologists coming after him, arguing that the court had always from the Founders the ‘implied power’ to revise and rewrite the Constitution according to its recognition of a ‘higher’ or ‘natural law.’ Taken together, their words describe according to its essence just what a teleocratic constitution might be, or describe no constitution at all” (p. 125).

Bradford’s burden in these lectures is obviously to limit the powers of the federal government, making it truly a “federal” government of limited authority. And the evidence he cites certainly validates his conviction that the “original intentions” of the Founding Fathers were largely forgotten during the 20th century.

* * * * * * * * * * * * * * * * * *

Bradford’s recent concerns were previsioned, at the beginning of the Republic, by Anti-Federalists like Patrick Henry and John Taylor of Caroline (the intellectual leader of the strict constructionist Jeffersonian Republicans). Born in 1753, Taylor was admitted to Virginia’s Caroline County bar in 1774, just as the American Revolution began. He joined a Virginia regiment and ultimately became a major in the Continental army. Thereafter he served in the Virginia General Assembly and was thrice appointed to serve out senatorial terms in the United States Senate. But his great vocation, he believed, was to farm well and write wisely. His plantation, “Hazelwood,” became a model of “scientific” farming–reclaiming exhausted soil and illustrating the goodness of the agrarian life. In 1813 he published Arator: Being a Series of Agricultural Essays, Practical and Political. Writing an introduction to it, M.E. Bradford said: “Taylor is like Cato … in treating advice on farming as a species of moral instruction . . . [for] Arator is about the social order of an agricultural republic, and not just about farming” (in the Liberty Fund edition, 1977, p. 37). Like Jefferson, Taylor believed that agriculture should be the basis of any healthy society.

As a political thinker, Taylor is best known for helping craft the Virginia Resolutions in 1798 and for three lengthy works published during the last decade (1814-1824) of his life: (1) An Inquiry into the Principles and Policy of the Government of the United States, (2) Tyranny Unmasked, and (3) New Views of the Constitution of the United States. He wrote to decry the manifest concentration of power in the federal government that was utterly unwarranted by the Constitution. The financial policies of Secretary of the Treasury Alexander Hamilton (such as funding state debts, internal improvements, and the National Bank) in the 1790s contravened the Constitution. Subsequent protective tariffs were designed to help northern industries (and wealthy industrialists) rather than the people. And the nationalistic decisions of the Supreme Court under the guidance of John Marshall were not envisioned by the architects of the United States. All such centralizing developments elicited Taylor’s strong condemnations.

In Tyranny Unmasked (Indianapolis: Liberty Fund, 1992), Taylor primarily attacked the protective tariff that so harmed the agrarian South. There is no difference, he insisted, between taking property through violence and taking it through taxes and fiscal policies designed to award a privileged minority. “A tax may be imposed for two objects; one to sustain a government, the other to enrich individuals” (p. 116). There is no difference between a tyranny with one man on top and a tyranny with a thousand men on top. Elected tyrants are still tyrants. Fifty years after the Revolution, Taylor warned, Americans “must once more decide whether we will be a free nation. Freedom is not constituted solely by having a government of our own. Under this idea most nations would be free. We fought a revolutionary war against exclusive privileges and oppressive monopolies” (p. 84). To grant similar privileges and monopolies under the auspices of the “national” government would betray the fundamental nature of the United States.

A free people, Taylor insisted, require a limited government. “All reflecting individuals, except those bribed by self-interest, believe that liberty can only be preserved by a frugal government, and by excluding frauds for transferring property from one man to another. In no definition of it has even its enemies asserted, that liberty consisted of monopolies, extensive privileges, legal transfers of private property, and heavy taxation. In defining a tyrant, it is not necessary to prove that he is a cannibal. How then is tyranny to be ascertained? In no other perfect way that I can discern, except as something which takes away our money, transfers our property and comforts to those who did not earn them, and eats the food belonging to others” (p. 226).

Ambition and avarice ever haunt the corridors of power. Thus freedom flourishes only when power is restrained by the checks and balances set forth in the Constitution, and most especially in the 10th Amendment that specified: “The powers, not delegated to the United States by the constitution, nor prohibited by it to the States, are reserved to the states respectively, or to the people.”

* * * * * * * * * * * * * * * * * * * *

In 1823, a year after publishing Tyranny Unmasked, John Taylor of Caroline published New Views of the Constitution of the United States (Washington, D.C.: Regnery Publishing, Inc., c. 2000). Whereas the protective tariff served as the focus for the earlier work, the original intentions of the Framers of the Constitution served as the subject for the latter, and it is, James McClellan says in his Introduction, “the locus classicus of states’ rights jurisprudence” (p. xiii). In 1818 Congress had permitted the publication of Robert Yates’ notes of the Constitutional Convention. (Before this, by Congressional order, nothing was known of the behind-the-scenes debates of the delegates, and James Madison’s journal was not published until the 1840s. Thus the strong states’ rights concerns of the Convention’s Framers were largely unknown for 25 years). Comparing their actual intent, as recorded in Yates’ Journal, with the widely-known interpretations set forth by Madison and Hamilton in The Federalist Papers, Taylor discovered pervasive “distortions of the original meaning and a nationalistic bias” (p. liii) of the latter.

“Had the journal of the convention which framed the constitution of the United States, though obscure and incomplete,” Taylor said, “been published immediately after its ratification, it would have furnished lights towards a true construction, sufficiently clear to have prevented several trespasses upon its principles, and tendencies towards its subversion” (p. 13). The Framers clearly envisioned a limited federal government, not the national regime evident by 1820. Indeed, as the several states appointed delegates to the Convention they insisted on using the right word, unanimously rejecting “the recommendation of a national government, and by excluding the word national from all their credentials, demonstrated that they well understood the wide difference between a federal and a national union” (p. 18).

Taylor devoted many pages to carefully examining the materials in Yates’ journal, dismayed that its contents had been buried for 30 years. “Thus the vindicators of a federal construction of the constitution are deprived of a great mass of light, and the consolidating school have gotten rid of a great mass of detection. Secrecy is intended for delusion, and delusion is fraud. If it was dictated by an apprehension, that a knowledge of the propositions and debates, would have alarmed the settled preference of the states and of the publick, for a federal form of government, it amounts to an acknowledgement that these propositions and debates were hostile to that form and to the publick opinion” (p. 47). Deprived of the truth, many naively believed the positions espoused in The Federalist Papers. So Taylor devoted much of the book to an examination and refutation of the interpretations set forth therein by Hamilton and Madison, as well as clarifying his own understanding of the Constitution. Their differences are demonstrable:  “These gentlemen believed that a supreme national government was best for the United States, and I believe that a genuine federal system is more likely to secure their liberty, prosperity, and happiness” (p. 75). The question is: which interpretation best represents the “original intentions” of the Framers?

Given the evidence from the original sources, Taylor defended the “federal” rather than the “national” system. “The delegations, reservations, and prohibitions of the constitution, combined with the rejection of powers proposed in the convention, constitute a mass of evidence, more coherent and irrefragable for ascertaining the principles of our political system, than can be exhibited by any other country; and if it cannot resist the arts of construction, constitutions are feeble obstacles to ambition, and ineffectual barriers against tyranny. . . .  This mass of evidence stands opposed to those constructions which are labouring to invest the federal government with powers to abridge the state right of taxation; . . . to expend the money belonging to the United States without control; to enrich a local capitalist interest at the expense of the people; to create corporations for abridging state rights; to make roads and canals; and finally to empower the supreme court to exercise a complete negative power over state laws and judgments, and an affirmative power as to federal laws” (p. 189).

Looking at Taylor’s 1823 list in 2006, it is evident that his fears have materialized. Uncontrolled spending, even by Republicans elected to restrain it, continues unabated as we enter the 21st century.  “Local capitalists” routinely gain advantages through the hordes of lobbyists (many of them former senators and congressmen) who wine and dine “public servants” such as Congressman Randy “Duke” Cunningham. Federal bureaucracies, such as the Environmental Protection Agency or Department of Education, have slowly increased their coercive roles in realms formerly reserved to state and local governments. Internal improvements–”roads and canals” in Taylor’s day–have been widely nationalized, as is most evident in “disaster relief” in Louisiana and federal influence in minor matters like speed limits. And the Supreme Court, greatly feared by Taylor, has become a major player in making laws and shaping society. Court decisions, whether mandating abortion rights or racial preferences in university admissions, reveal the enormous political power now resident in the hands of nine unelected jurists.

Hamilton and Madison certainly exerted influence in the 1787 Constitutional Convention, but ultimately their position, calling for a strongly centralized government, was soundly rejected by that body.  This was because the Framers prized an ordered liberty. “Society, well constructed, must be compounded of restraint and freedom, and this was carefully attended to in framing our union. The states are restrained from doing some things, and left free to do others; and the federal government was made free to do some things, but restrained from doing others. This arrangement cannot be violated, without making one department a slave or an usurper. A division of political rights between the people and a government, can only preserve individual liberty” (p. 301). In sum: “Freedom without restraint, or restraint without freedom, is either anarchy or despotism” (p. 301).

Taylor’s position, of course, was embraced by John C. Calhoun and in time by the architects of the Confederate States of America. Thus his states’ rights argument cannot escape the stigma of slavery and segregation in the South. But the essence of what Taylor (and Bradford) argue–that the best government is a limited government–still has currency. One need do no more than note the latest “pork barrel” legislation, or the Supreme Court’s meddling in local decisions regarding placement of the Ten Commandments, or recent presidents’ decisions to help hurricane victims or pay for drug prescriptions or dispatch troops around the world, to realize how centralized and powerful the government created by the Constitution has become. The current regime may be necessary–it may even be better. But it is clearly not what the 1787 convention envisioned.

168 Europe at Risk

In The Cube and the Cathedral:  Europe, America, and Politics Without God (New York:  Basic Books, c. 2005), George Weigel (the noted biographer of John Paul II and Benedict XVI) ponders the plight of a Europe losing the religious faith that birthed it.  He compares “the cube”–La Grande Arche de la Défense, the massive (40 story), modernist glass cube built in Paris by the late socialist French president, François Mitterrand–with the nearby cathedral, Notre Dame de Paris, one of the grandest monuments of the Middle Ages.  The two structures symbolize the worldviews contending for the heart of the continent–a struggle that is “fundamentally a problem of cultural and civilizational morale” (p. 6).

Certain trends in post-WWII Europe deeply distress Weigel:  a failure to condemn either communism or Islamofascism; a spineless pacifism vis-à-vis terrorists and criminals; a mindless support for international organizations, such as the EU and UN; an irrationality evident in the high percentage of French and Germans who think the U.S. actually orchestrated the 9/11 attacks; a marked economic decline; a startling demographic devolution–the dramatic evidence of a people without concern for future generations; a growing contempt for the elderly and deceased; an unwillingness to face the bankruptcy of social welfare and pension systems; and a militantly anti-Christian agenda embraced and imposed by Europe’s elite.

All of these problems, as Aleksandr Solzhenitsyn noted in his 1983 Templeton Prize Lecture, began at the dawn of the 20th century with World War I and Europe’s subsequent “lost awareness of a Supreme Power” and “rage of self-mutilation” that explain why it stood by and allowed “the protracted agony of Russia as she was being torn apart by a band of cannibals. . . .  The West did not perceive that this was in fact the beginning of a lengthy process that spells disaster for the whole world” (pp. 33-34).  Similarly, said Henri de Lubac (in his magisterial The Drama of Atheist Humanism), the 20th century’s great disasters stemmed from “an attempt to promote a vision of man apart from God and apart from Christ. . . .  Forgetfulness of God [has] led to the abandonment of man” (p. 119).

Europe’s loss of faith stands exposed in the proposed constitution for a new Europe that was recently rejected (for economic self-interest, not religious concern) by the people of France and Holland.  This “constitution” had a lengthy historical section that said nothing about the role of Christianity in the making of Europe!  To Weigel, this “self-inflicted amnesia” provides “a key to the ‘Europe problem’ and its American parallels” (p. 55).  What’s largely forgotten is the freedom for excellence embedded in the thought of St. Thomas Aquinas and evident in Western Christian Culture.  “Freedom is the capacity to choose wisely and act well as a matter of habit–or, to use an old-fashioned term, as a matter of virtue.  Freedom, on this understanding, is the means by which we act, through our intelligence and our will, on the natural longing for truth, goodness, and happiness that is built into us as human beings” (pp. 79-80).   To be fully human is to be truly free.  A concert organist freely plays Bach’s music after mastering highly technical disciplines through practice.  One is, likewise, a free person by virtue of mastering the cardinal virtues.  A free society is sustained by similar disciplines and is “characterized by tolerance, civility, and respect for others, societies in which the rights of all are protected by both law and the moral commitments of ‘we the people’ who make the law” (pp. 81-82).

Rejecting and rivaling the realism of Thomas Aquinas was the voluntaristic nominalism of William of Ockham, which has “had a great influence on Christian moral theology” (p. 83).  There is no question that the slow and steady growth of nominalism, during the past 500 years, has weakened the foundations of the West.  In Ockham’s nominalism there are no universals, no absolutes apart from the arbitrary edicts of God–which may or may not be sustained tomorrow.  In modern nominalism, the only edicts are our own–or those manufactured by elites such as the United States Supreme Court.  To Servais Pinckaers (a Belgian Dominican who crafted the phrase “freedom for excellence” regarding Aquinas), “Ockham’s work was ‘the first atomic explosion of the modern era,'” and it brought into being “a new, atomized vision of the human person and ultimately of society.  In Ockham we meet what Pinckaers calls the freedom of indifference” (p. 83).  Freedom is doing whatever one wants to do and “has nothing to do with goodness, happiness, or truth” (p. 85).  Nietzsche’s will-to-power is the modern manifestation of Medieval nominalism, and his pernicious nihilism is everywhere evident in everything from Nazism to the “postmodernism” of Derrida and Foucault, from Supreme Court decisions such as Lawrence v. Texas to the terrorism of Islamofascists  blowing up themselves and their innocent victims in London subways or Amman hotels.

Europe–and America–Weigel insists, must choose either the freedom of excellence or the freedom of indifference, the cathedral or the cube.  This short book is more of a journalistic sketch than an in-depth study, but Weigel’s concerns are both prescient and compelling.

* * * * * * * * * * * * * * * * * * * *

Whereas Weigel is a serious scholar who writes for a popular audience, Tony Blankley is a journalist who’s written a thoughtful treatise entitled The West’s Last Chance:  Will We Win the Clash of Civilizations? (Washington, D.C.:  Regnery Publishing, Inc., c. 2005).  Like Weigel he thinks Europe is at risk and Americans should be concerned.  “The threat of the radical Islamists taking over Europe is every bit as great to the United States as was the threat of the Nazis taking over Europe in the 1940s.  We cannot afford to lose Europe” (p. 21).  Both America and Europe have the resources needed to resist this, but the question we now face is whether they have the will to do so.

The Nazis in the 1920s and 1930s were a small, militant group determined to take control of both Germany and Europe.  “They particularly targeted German youth” (p. 47).  The same may be said of Islamofascists today.  Large numbers of Muslims are angry at their plight in the world, humiliated by their economic and military weakness vis-à-vis Israel and the West, and willing to heed the radical voices calling for jihad and terror.  In response, Europe’s leaders (following the pacifist path of those Oxford University students who declared, in the ’30s, they “would not fight for King or country”) engage in denial and accommodation and appeasement.  Like Stanley Baldwin 70 years ago, they court popularity by funding public housing rather than arms and actions, talking about peace and security rather than conflict and struggle.

This was clear to George Orwell in 1940, when he wrote:  “‘I thought of a rather cruel trick I once played on a wasp.  He was sucking jam on my plate, and I cut him in half.  He paid no attention, merely went on with his meal, while a tiny stream of jam trickled out of his severed esophagus.  Only when he tried to fly away did he grasp the dreadful thing that had happened to him.  It is the same with modern man.  The thing that has been cut away is his soul, and there was a period–twenty years, perhaps–during which he did not notice.’  In Orwell’s view, Western man has lost his soul in the aftermath of World War I” (p. 133).

But Baldwin and the Oxford students were refuted by events in 1939, when men of sterner stuff (namely Winston Churchill) were needed.  Perhaps in WWII the Allies recovered some of the West’s legacy and breathed life back into their culture.  But it was, Blankley thinks, ephemeral.  Whether the soul Orwell declared dying is forever dead, only time will tell.  But “Europe’s future is in danger because Europe has forgotten its past.  In the Middle Ages, Europeans held a healthy respect, even fear and awe, of the power and vigor of Islamic culture” (p. 96).  Not so today!  Muslim immigrants have flooded Europe but, unlike other ethnic groups, have refused to assimilate in host countries.  They seek domination, not integration!  Yet Europeans fail to identify and fight them as mortal foes, and Islamist jihadists, if unopposed, will triumph.

* * * * * * * * * * * * * * * * * * * *

Roger Scruton, a fine English philosopher, focuses his finely honed analytic mind on The West and the Rest:  Globalization and the Terrorist Threat (Wilmington:  Intercollegiate Studies Institute, c. 2002).  He especially stresses the critical philosophical differences between Islam and Christianity.  The very word, Islam, denotes submission, surrender, whereas Christendom celebrates the personal dignity and freedom that result from a careful separation of church and state.  “The Muslim faith, like the Christian, is defined through a prayer.  But this prayer takes the form of a declaration:  There is one God, and Muhammad is his Prophet.  To which might be added:  and you had better believe it.  The Christian prayer is also a declaration of faith; but it includes the crucial words:  ‘forgive us our trespasses, as we forgive them that trespass against us'” (p. 36).  The two faiths are, quite simply, radically different, and this has led to different “social contracts.”

The West has also been shaped by the Enlightenment, with its commitment to reason and the civic virtues of “law-abidingness, sacrifice in war, and public spirit in peacetime” (p. 55).  To the extent irrationalism, lawlessness, pacifism, and the demise of patriotism pervade the West, they subvert the civilization that has sustained them.  Contempt for objective truth, championed by Nietzsche and his postmodern epigones, easily leads to the disdain for any form of authority so evident on many university campuses.  Nietzschean skepticism inculcates moral relativism, and fewer and fewer Europeans seem willing to risk anything, much less their lives, for anything or anyone.  They’re increasingly uninterested in even marrying and rearing children, much less fighting for what’s right.  “Religious societies generate families automatically as the by-product of faith” (p. 69).  Secular societies, however, have little concern for anything sacred such as the family, repudiating it as “patriarchal” and oppressive.

Weakened by the loss of religious faith and Enlightenment values, the West now faces the onslaught of a revived worldwide Islam, committed to the “holy law” of Mohammed.  Scruton guides the reader in a careful survey of Islamic history and thought, emphasizing that:  “Conquest, victory, and triumph over enemies are a continual refrain of the Koran, offered as proof that God is on the side of the believers” (p. 120).  Still more:  “For the first time in centuries Islam appears, both in the eyes of its followers and in the eyes of the infidel, to be a single religious movement united around a single goal” (p. 123).  And wherever they have come to power, we have witnessed “murder and persecution on a scale matched in our time only by the Nazis and the Communists.  The Islamist, like the Russian nihilist, is an exile in this world; and when he succeeds in obtaining power over his fellow human beings, it is in order to punish them for being human” (p. 127).

Nothing but a revival of a commitment to the West and its virtues can withstand the Muslim onslaught that is now taking place around the world.  To understand why this is necessary, Scruton proves enlightening.  How to do it, however, is less clear!

* * * * * * * * * * * * * * * * * * * *

For 25 years Bat Ye’or (an Egyptian scholar living in Switzerland, writing under a pseudonym to help shelter her from the violence routinely directed toward scholars who dare criticize Islam) has explored the somber reality of dhimmitude–the status of non-Muslims in Islamic societies.  Her scholarly works include:  The Dhimmi:  Jews and Christians Under Islam (1985), The Decline of Eastern Christianity under Islam:  From Jihad to Dhimmitude (1996), which I reviewed in my “Reedings” #131, and Islam and Dhimmitude:  Where Civilizations Collide (2002).  These are historical works, but her latest treatise, Eurabia:  The Euro-Arab Axis (Madison, N.J.:  Fairleigh Dickinson University Press, c. 2005), describes what’s taking place right now, particularly among the bureaucratic elites who increasingly control the continent.  Above all, she challenges readers to look clearly at the documents, many included as appendices to the text, that challenge the stories told by the largely pro-Islamic culture czars (the journalists and professors and politicians) in both Europe and America.  She’s clearly partisan in her presentation, finding little of value in Islam.  But she’s an enormously well-informed partisan, and her facts and perspective simply cannot be ignored by anyone seriously concerned with world affairs.

“This book,” she says, delineating her thesis, “describes Europe’s evolution from a Judeo-Christian civilization, with important post-Enlightenment secular elements, into a post-Judeo-Christian civilization that is subservient to the ideology of jihad and the Islamic powers that propagate it.  The new European civilization in the making can be called a ‘civilization of dhimmitude.’  The term dhimmitude comes from the Arabic word ‘dhimmi.’  It refers to subjugated, non-Muslim individuals or people that accept the restrictive and humiliating subordination to an ascendant Islamic power to avoid enslavement or death.  The entire Muslim world as we know it today is a product of this 1,300-year-old jihad dynamic, whereby once thriving non-Muslim majority civilizations have been reduced to a state of dysfunctional dhimmitude.  Many have been completely Islamized and have disappeared.  Others remain as fossilized relics of the past, unable to evolve” (p. 9).

“For well over a millennium,” Bat Ye’or continues, “following the seventh-century Muslim military offensives against Byzantium, European powers instinctively resisted jihad–militarily when necessary–to protect their independence.  The response of the post-Judeo-Christian Europe of the late twentieth century has been radically different.  Europe, as reflected by the institutions of the EU, has abandoned resistance for dhimmitude, and independence for integration with the Islamic world of North Africa and the Middle East.  The three most apparent symptoms of this fundamental change in European policy are officially sponsored anti-Americanism, anti-Semitism/anti-Zionism and ‘Palestinianism'” (p. 10).  The targets may change, but Muslim objectives and strategies remain constant:  “Hostage taking, ritual throat slitting, the killing of infidels and Muslim apostates are lawful, carefully described, and highly praised jihad tactics recorded, over the centuries, in countless legal treatises on jihad” (p. 159).

The French have pioneered the current process of accommodation.  Charles de Gaulle, whose pride was injured by his exclusion from the Yalta Conference as WWII ended, and who witnessed the demise of France’s colonial empire, determined to create a French-led alliance of Mediterranean states that would effectively counteract the growth of American power.  His successors hoped France would become the “protector of Islam and Palestinians against America and Israel.  They hoped that a pro-French Islam would facilitate the quiet control of former colonies within the French orbit and spread French culture,” ultimately establishing “an enormous market” that would restore France’s former glory (p. 148).  The unsuccessful Egyptian-Syrian assault on Israel and the Arab oil embargo in 1973 accelerated such developments.  Europeans abruptly began discussing and defending the “Palestinian people,” a new name for Arabs living in the disputed region between Israel and Jordan.  The nation of Israel, which had enjoyed Europe’s support for 25 years, suddenly became the bête noire of the Middle East.  Hungry for oil, Europe began to support the Muslim dictators who supplied it.  Needing workers during that era of economic expansion, Europe encouraged immigrants from North Africa and Turkey to relocate.

In the midst of all this, innumerable conferences were held and papers published as part of the EAD (“Europe-Arab Dialogue”) regarding the new coalition.  Bat Ye’or seems to have read every report of every such conference, and she takes seriously their wording, for they seem to have slowly shaped EU policies.  At such gatherings, attended by Arab and European elites, impressive words were uttered regarding human rights, religious rights, women’s rights, workers’ rights, etc.  And the Muslim workers flooding Europe have enjoyed such “rights.”  Still more:  they have secured special privileges and protections under the auspices of the increasingly powerful European Union.  This was, of course, a one-way process.  Arabs, for instance, demand absolute respect for Islam in European lands but continue to persecute Jews and Christians wherever Islam (with its Shari’a) reigns.

Muslims even insist that Islam rather than Christianity be recognized as the primary “civilization”–indeed “the spiritual and scientific fountainhead of Europe” (p. 98).  Robin Cook, the British Foreign Secretary, actually said that “Islamic art, science and philosophy,” along with Greek and Roman culture, had helped shape England (p. 172)!  A fantasyland of Muslim rule in Medieval Spain–”the Andalusian utopia”–is routinely cited as evidence of past Islamic tolerance and educational sophistication.  And, as the recent draft of an EU constitution indicates, Muslim influence grows.  Arab delegates to various EAD sessions ever demand that Europe’s schools and textbooks favorably portray Muslims and Islamic history, whereas Jews and Christians are routinely demonized by Muslim educators.  “Through the ‘Dialogue,'” Bat Ye’or insists, “Arab League politicians and economists have gained a firm ascendancy over Europe’s policy and economy” (p. 123).

Multiplied billions of dollars have been extorted from Europe, sent as “aid” and “development funds” to Muslim countries–the dar al-islam world ruled by Shari’a.  “The huge sums that the EU pays to Arab Mediterranean countries and the Palestinians amount to another tribute exacted for its security within the dar al-sulh [i.e. the subservient state of dhimmitude].  Europe thereby put off the threat of a jihad aimed at the dar al-harb [the world Muslims are obligated to attack and control] by opting for appeasement and collusion with international terrorism–while blaming the increased world tensions on Israel and America so as to preserve its dar al-sulh position of subordinate collaboration, if not surrender, to the Islamists” (p. 77).  By supporting the economic and cultural policies Arabs demand, Europeans have, Bat Ye’or thinks, surrendered their lands and traditions.   By surrendering, they have become dhimmis, for “dhimmis do not fight.  Dhimmitude is based on peaceful surrender, subjection, tribute, and praise” (p. 204).

When President Bush fought back, following the 9/11 attacks, subduing the Taliban in Afghanistan and invading Iraq, “the Anti-Americanism that had been simmering for years among European Arabophiles, neo-Nazis, Communists, and leftists in general” (p. 227) boiled over.  In a profound sense, Bat Ye’or says, such anti-Americanism thrives in “cowardly or impotent societies, which have chosen surrender through fear of conflict” (p. 242).  It’s the resentment of the weak regarding the strong, “an intellectual totalitarianism disguised as a virtue for states which have entrusted their security to those who threaten them” (p. 242).

Still more:  “By implicitly enlisting in the Arab-Islamic jihad against Israel–under labels such as ‘peace and justice for Palestinians’–Europe has effectively jettisoned its values and undermined the roots of its own civilization.  It even struck a blow against worldwide Christianity, abandoning the Christians in Lebanon to massacres by Palestinians (1975-83), those of Sudan to jihad and slavery, and the Christians of the Islamic world to the persecutions mandated by the dhimma system” (p. 115).  Despite this, however, “the EU is implicitly abetting a worldwide subversion of Western values and freedoms, while attempting to protect itself from Islamic terrorism by denying that it even exists, or blaming it on scapegoats” (p. 227).

In short:  with precious oil and prolific immigrants the Muslims have moved to impose a state of dhimmitude upon Europe.  Jihad is succeeding as Islam extends its sway–though oil has replaced the sword as the weapon of choice and immigrants rather than warriors serve as agents of occupation!  There’s a war going on in Europe, and Europeans have closed their eyes to preserve an “illusion of peace” (p. 252), much as they did while National Socialism and Communism devoured the continent in the 20th century.

# # #

167 “Built to Last” and “Good to Great”

I rarely read books on “leadership” or business management, finding them generally focused on “bottom line” issues and generally irrelevant (or even contrary) to the academic and religious world that’s always been my primary concern.  Some recent treatises, however, merit consideration, for they explore both the realm of economics and the deeper recesses of human nature.  Economics, in its most basic sense, means household management and obligates one to act wisely for the good of one’s family and community–certainly a central concern for any ethic.  And the ways people do business, in any society, obviously offers many clues to the nature of human nature.

A decade ago James C. Collins and Jerry I. Porras published Built to Last:  Successful Habits of Visionary Companies, setting forth the data and insights gained from a six-year research project at the Stanford University Graduate School of Business, and it became the number one business book for 1995.  Two years later the authors added a new introduction and concluding chapter (New York:  HarperCollins, c. 1997), making the paperback edition both more extensive and conclusive, showing how the book’s business principles apply to individuals and small groups within corporations, non-profits as well as for-profit organizations.

The authors focused on some “truly exceptional companies that have stood the test of time” (p. xxiii), wondering how they lasted while competitors came and went.  What they discovered, first of all, was the difference between “clock building and time telling.”  Patiently constructing a well-honed organization, with less attention to quarterly statistics, matters much in the long run.  Lasting success comes through institutional soundness, not dramatic leadership or ephemeral enthusiasm.  “Luck favors the persistent.  This simple truth is a fundamental cornerstone of successful company builders.  The builders of visionary companies were highly persistent, living to the motto:  Never, never, never give up” (p. 29).

The best companies were almost uniformly devoted to “more than profits.”  Their main concern has been to preserve their “core values.”  The premier “architects” of great companies generally established a “core ideology” that has persisted, in some instances, for more than a century.  Though Henry Ford himself certainly had some unattractive traits, he seemed to care more for making lots of cars than maximizing his fortune.  “I don’t believe we should make such an awful profit on our cars,” said Ford.  “A reasonable profit is right, but not too much.  I hold that it is better to sell a large number of cars at a reasonably small profit . . .  I hold this because it enables a larger number of people to buy and enjoy the use of a car and because it gives a larger number of men employment at good wages.  Those are the two aims I have in life” (p. 53).  Though one should always place such rhetoric in perspective, Ford’s claim rings true for his and a number of other built-to-last companies.  Collins and Porras conclude:  “Contrary to business school doctrine, we did not find ‘maximizing shareholder wealth’ or ‘profit maximization’ as the dominant driving force or primary objective through the history of most visionary companies.  They have tended to pursue a cluster of objectives, of which making money is only one–and not necessarily the primary one” (p. 55).

Paul Galvin, the founder of Motorola, consistently defined profits as the means to the company’s goal–making a good product–not its raison d’être.  Galvin’s son and successor, Robert, wrote a series of essays in 1991, stressing such things as “creativity, renewal, total customer satisfaction, quality, ethics, innovation, and similar topics; not once did he write about maximizing profits, nor did he imply this was the underlying purpose–the ‘why’ of it all” (p. 82).  Motorola’s competitor, Zenith, by contrast, lacked such a commitment and, following the death of its founder, focused almost singularly on profits and market share, losing its way in the process.

Importantly:  “You do not ‘create’ or ‘set’ core ideology.  You discover core ideology.  It is not derived by looking to the external environment; you get at it by looking inside.  It has to be authentic” (p. 228).  Furthermore:  “You cannot ‘install’ new core values or purpose into people.  Core values and purpose are not something people ‘buy in’ to.  People must already have a predisposition to holding them.  Executives often ask, ‘How do we get people to share our core ideology?’  You don’t.  You can’t!  Instead, the task is to find people who already have a predisposition to share your core values and purpose, attract and train these people, and let those who aren’t disposed to share your core values go elsewhere” (pp. 229-230).

In addition to preserving core values, great companies continually find innovative ways to “stimulate progress.”  Their identity remains constant but their strategies ever evolve.  This, in fact, is “the central concept of this book:  the underlying dynamic of ‘preserve the core and stimulate progress’ that’s the essence of a visionary company” (p. 82).  Successfully doing so involves five things, each given a separate chapter by Collins and Porras:  1) Big Hairy Audacious Goals; 2) Cult-like Cultures; 3) Try a Lot of Stuff and Keep What Works; 4) Home-grown Management; 5) Good Enough Never Is.

Henry Ford’s Big Hairy Audacious Goal was “to democratize the automobile.”  General Electric, under the legendary Jack Welch, sought to “become #1 or #2 in every market we serve and revolutionize this company to have the speed and agility of a small enterprise” (p. 95).  Boeing, in 1965, determined to build the 747 jumbo jet at all costs–and it nearly cost everything, stretching the company to its absolute maximum.  McDonnell Douglas, by contrast, consistently refused to risk losses and thus failed to successfully compete with Boeing.

Cult-like Cultures characterize companies like Nordstrom.  All employees start at the bottom, working on the floor as salesmen.  There they’re on trial, seeing whether they truly satisfy customers.  Employees receive a card–WELCOME TO NORDSTROM–stating the company’s character:  “We’re glad to have you with our Company.  Our number one goal is to provide outstanding customer service.  Set both your personal and professional goals high.  We have great confidence in your ability to achieve them.  Nordstrom Rules:  Rule #1:  Use your good judgment in all situations.  There will be no additional rules” (p. 117).  The company is fanatically committed to this simple rule, and customer satisfaction has validated its effectiveness.   “Nordstrom reminds us,” say the authors, “of the United States Marine Corps–tight, controlled, and disciplined, with little room for those who will not or cannot conform to the ideology” (p. 138).

Still more:  “This finding has massive practical implications.  It means that companies seeking an ’empowered’ decentralized work environment should first and foremost impose a tight ideology, screen and indoctrinate people into that ideology, eject the viruses, and give those who remain the tremendous sense of responsibility that comes with membership in an elite organization.  It means getting the right actors on the stage, putting them in the right frame of mind, and then giving them the freedom to ad lib as they see fit.  It means, in short, understanding that cult-like tightness around an ideology actually enables a company to turn people loose to experiment, change, adapt, and–above all–to act”  (pp. 138-139).

Built-to-last companies continually adapt to the evolving marketplace by trying “a lot of stuff” and keeping “what works.”  In the methodological sense they are totally pragmatic, remarkably Darwinian.  They tenaciously retain their core values but freely change their modus operandi.  They have a “vision” but rarely craft detailed “long range” plans.  Thus “Bill Hewlett told us that HP ‘never planned more than two or three years out’ during the pivotal 1960s” (p. 144), and they learned from their mistakes.  As R.W. Johnson Jr. said, regarding Johnson & Johnson:  “Failure is our most important product” (p. 147).   A Wal-Mart store in Louisiana placed friendly “people greeters” at the store’s entrance primarily to deter shoplifters, only to discover that it was a marvelous public relations strategy that soon spread throughout the giant retail chain.  GE’s Jack Welch, reading Helmuth von Moltke’s writings on military strategy, coined the phrase “planful opportunism” to describe the fact that in business as well as in war “detailed plans usually fail, because circumstances inevitably change” (p. 149).

Welch personifies the “Home-Grown Management” the authors find in most successful companies.  Bringing in an outsider–whether because of his charismatic gifts or her politically correct sex or some alleged need for “fresh blood”–rarely helps a company.  Welch succeeded at GE because he followed a century of highly successful CEOs.  Amazingly, “across seventeen hundred years of combined history in the visionary companies, we found only four individual cases of an outsider coming directly into the role of chief executive” (p. 173).  Companies that nourish employees’ development, recognize their talent, and reward their commitment find able leaders to assume control of the corporation.  Companies that don’t–such as Disney in the ’70s–flounder while hiring outsiders like Michael Eisner.

“Good Enough Never Is” means that successful companies never rest on their laurels.  Their CEOs demand continual improvement.  Thus J. Willard Marriott, Sr., said:  “Discipline is the greatest thing in the world.  Where there is no discipline, there is no character.  And without character, there is no progress. . . .  Adversity gives us opportunities to grow.  And we usually get what we work for” (p. 188).  His son sustained his “Mormon work ethic,” putting in 70-hour weeks and diligently traveling to make sure his facilities were first-rate.  Howard Johnson’s son, however, left the details of the organization to others while he enjoyed the “good life” in New York.  Before long Howard Johnson was failing while Marriott continued to prosper.  Momentary successes, however impressive, are but stepping-stones to ever-higher goals.  “Like great artists or inventors, visionary companies thrive on discontent” (p. 187).  Like great coaches, thriving companies continually recruit the best available talent, work incessantly on training and motivating employees, and challenge everyone to excel in everything they do.

* * * * * * * * * * * * * * * * * * * *

Having analyzed companies that were “built to last,” Jim Collins (assisted by 10 researchers) sought to explain why a few of them truly excel in Good to Great:  Why Some Companies Make the Leap and Others Don’t (New York:  HarperBusiness, c. 2001).  “Good is the enemy of great,” he says in his first sentence (p. 1).  There are lots of “good” schools, teams, churches, and businesses.  Because they’re “good enough,” however, they never generate the commitment necessary to become truly great, an attribute Collins grants to only eleven of the Fortune 500 companies–Abbott; Circuit City; Fannie Mae; Gillette; Kimberly-Clark; Kroger; Nucor; Philip Morris; Pitney Bowes; Walgreens; Wells Fargo.  So he sought to discover the “timeless principles–the enduring physics of great organizations–that will remain true and relevant no matter how the world changes around us” (p. 15).

Some of the things they didn’t find are striking.  “Larger-than-life, celebrity leaders who ride in from the outside are negatively correlated with taking a company from good to great” (p. 10).  Financial packages for top executives matter little.  Long-range planning strategies aren’t a factor.  Nor do cutting-edge technologies, mergers and acquisitions, motivational novelties, or various other voguish “keys” make for success.  Conversely–and far less flamboyantly–what truly mattered, as companies climbed to greatness, was “a process of buildup followed by breakthrough, broken into three broad stages:  disciplined people, disciplined thought, and disciplined action” (p. 12).  Great “companies have a culture of discipline.  When you have disciplined people you don’t need hierarchy.  When you have disciplined thought, you don’t need bureaucracy.  When you have disciplined action, you don’t need excessive controls.  When you combine a culture of discipline with an ethic of entrepreneurship, you get the magical alchemy of great performance” (p. 13).

It all begins with what Collins calls a “Level 5 Executive,” such as Darwin E. Smith at Kimberly-Clark, who blends “personal humility and professional will” (p. 20) and orchestrates the transformation.  A shy, self-effacing, hard-working farm boy who slowly moved up the ranks of the company, Smith brought a “ferocious resolve” to renew an aging paper producer and did so.  Like Smith, Level 5 leaders are “incredibly ambitious–but their ambition is first and foremost for the institution, not themselves” (p. 21).  The eleven CEOs whose companies “met the exacting standards for entry into this study” (p. 28) were remarkable men, but they’re largely unknown!  They rarely graced the cover of People Magazine or appeared on 60 Minutes or dined with Barbra Streisand!  They rarely talked about themselves, and admiring outsiders tended to focus on the companies, not the executives who ran them!  Level 5 Executives take responsibility for failures and generously praise others for successes.  Lee Iacocca, by contrast, often seemed to lead Chrysler as a means of self-promotion, so “that insiders at Chrysler began to joke that Iacocca stood for ‘I Am Chairman of Chrysler Corporation Always'” (p. 30).  And Iacocca’s flamboyant success in the 1980s resembled a soaring, then quickly deflated, balloon.

The second phase in moving from good to great is finding the right (and eliminating the wrong) people.  People matter more than plans.  “First who, then what,” guides the great companies.  Nucor succeeded because the company found “that you can teach farmers how to make steel, but you can’t teach a farmer’s work ethic to people who don’t have it in the first place” (p. 50).  So the company established steel plants in rural areas and focused on hiring men who knew how to work.  Great companies consider an employee’s character more important than job training or school degrees.

Thirdly, great companies “confront the brutal facts (yet never lose faith)” (p. 65).  In the 1960s, while the grocery giant A&P faltered by clinging to antiquated practices, “Kroger began to lay the foundations for a transition” (p. 65) that made it the number one grocery chain by 1999.  Facing facts, not dreaming dreams, distinguishes solid leaders.  “Charismatic” leaders often fail, in the long run, because of their penchant for casting unrealistic visions.  Winston Churchill had great oratorical ability, and his words inspired the world in the 1940s.  But he was adamantly realistic, demanding to know the “brutal facts” during the war, and his decisions were rooted in reality, not rhetoric.  Leaders aren’t cheerleaders.  “If you have the right people on the bus, they will be self-motivated.  The real question then becomes:  how do you manage in such a way as not to de-motivate people?  And one of the single most de-motivating actions you can take is to hold out false hopes, soon to be swept away by events” (p. 74).

Next Collins explains “the hedgehog concept,” an important aspect of the breakthrough phase.  Unlike foxes, who dash in a dozen different directions, pursuing the freshest trail, hedgehogs “simplify a complex world into a single organizing idea, a basic principle or concept that unifies and guides everything” (p. 91).  Walgreens, for example, decided to establish “the best, most convenient drugstores, with a high profit per customer visit” (p. 92).  Committed to that task, Walgreens prospered while Eckerd (hungry for growth in any area, such as video games) withered.  The hedgehog concept brings together three essentials:  1) determining “what you can be the best in the world at;” 2) knowing that you can be well paid for your efforts; and 3) discovering that you deeply care for and love what you do (p. 96).

The fifth step in becoming great is “a culture of discipline” that is nourished rather than imposed.  Holding employees responsible, but granting them freedom to make their own distinctive contributions, distinguishes great organizations.  This requires rigorous recruitment and hiring–getting “self-disciplined” employees who are committed to the company.  “In a sense, much of this book,” says Collins, “is about creating a culture of discipline.  It all starts with disciplined people.  The transition begins not by trying to discipline the wrong people into the right behavior, but by getting self-disciplined people on the bus in the first place” (p. 126).  “Throughout our research, we were struck by the continual use of words like disciplined, rigorous, dogged, determined, diligent, precise, fastidious, systematic, methodical, workmanlike, demanding, consistent, focused, accountable, and responsible” (p. 127).

Finally, there are “technological accelerators.”  Good-to-great companies freely utilize the latest technologies, but they think differently about them.  Their core values, their hedgehog tenacity, determine the use of technologies.  Rather than insisting that everything be the latest and finest, they pick and choose precisely what new things will actually contribute to the organization’s goals.  Technologies may help, but they never create the momentum needed for success.  If the latest technology fits the goal, then every possible effort must be made to master and utilize it.  Be the very best in making it work for you.  If not, let it alone.

* * * * * * * * * * * * * * * * * *

Quite different in its approach to organizational success is The Way of the Shepherd:  7 Ancient Secrets to Managing Productive People (Grand Rapids:  Zondervan, c. 2004), by Kevin Leman and William Pentak.  The short book tells a story about an ambitious MBA student at the University of Texas who wanted to learn everything his professor could impart–and ended up making weekly visits to the professor’s nearby ranch, learning the art of shepherding.

First, you must “know the condition of your flock.”  This means keeping constantly in touch with employees, getting to know them personally, attending to their daily needs.  Isolated executives inevitably fail to understand the true condition of their organizations.  Nothing substitutes for walking about the workplace, asking questions, answering questions, being available.  Second, you must “discover the shape of your sheep.”  Before you hire an employee, discern whether he or she will contribute to the health of the organization.  Make sure you get healthy sheep and monitor their condition on a regular basis.  It’s the sheep, not the shepherd, who produce the wool and mutton!  Thirdly, you must “help your sheep identify with you.”  To do this you must model “authenticity, integrity, and compassion” (p. 51).  Living out the high standards you expect of your employees, carefully and continually communicating your own values and vision, elicits commitment from your “sheep.”  Being “professional” isn’t enough, because good leaders are primarily “personal” and they treat their workers as subjects rather than objects.

To “make your pasture a safe place” means making sure your employees don’t fight over scarce resources.  There must be enough good grass to eat.  Folks who are content where they are rarely look for “greener pastures.”  Just as sick sheep must be culled from the flock before they spread contagious diseases, so too must disgruntled employees be dismissed.  This helps explain the need for a shepherd’s “rod and staff.”  At times one must tap a straying ewe with a staff to rightly direct her.  When a lamb gets stuck in a crevice, the crook on the staff enables one to rescue him.  Ever out front, leading, the shepherd uses the tools necessary to direct and encourage, to nudge or correct, his sheep.  Persuasion, not coercion, is most often the key–but there must be fence lines and limits to the freedom granted one’s flock.  The shepherd’s rod provides the means to protect the sheep from predators–both outsiders and insiders.  A shepherd uses a rod, when necessary, to discipline a wayward lamb or a deviant rebel.

Finally–the seventh point–a good shepherd has a good heart.  Hirelings often do the shepherd’s work, but they do it poorly precisely because they’re hirelings.  Getting paid is not a sufficient motivation to do the demanding work of a real shepherd.  Having a heart for people, actually caring for them and their situation, makes one a really good leader.  A good business is a good place to work, and a good workplace makes good things.

The Way of the Shepherd is a quick read, obviously rooted in biblical principles running from Psalm 23 to Jesus’ words concerning His shepherd’s role.

166 Benedict XVI

Following the election of Joseph Cardinal Ratzinger as Pope Benedict XVI, I have read or re-read half-a-dozen of his works in an effort to better understand the new pontiff.  Doing so illuminates both the man and (through him) the Roman Catholic Church and the modern world.  Ratzinger provides us a brief overview of his first 50 years in Milestones:  Memoirs 1927-1977 (San Francisco:  Ignatius Press, 1998).  Born and reared in Bavaria by devout parents, he enjoyed a blessed childhood.  His father, a rural policeman, moved frequently about the region between the Inn and Salzach rivers, and he retired at the age of 60 (in 1937) to a house outside Traunstein.  The area is richly rooted in history, reaching back several millennia to the Celts and Romans.  It was early Christianized by Irish missionaries.  Ratzinger conveys the sense that he knows his land and people and finds stability therein.

Ratzinger attended the gymnasium in Traunstein, where he thoroughly mastered Latin and Greek, a linguistic foundation for his later mastery of theology.  He began such studies just in time, for Hitler’s National Socialist regime soon required students to study science and modern languages rather than the classics.  Students such as himself, however, were “grandfathered” in and allowed to complete their classical curriculum.  Entering adolescence, he decided to enter the priesthood.  In 1943, as Hitler’s war effort began crumbling, all boarding school students (Ratzinger included) were required to serve in a civil defense force.  When he became eligible for military service, he was spared active duty, but he was forced to work in a labor camp (which he fled as the war was ending) and thus support the regime.

When the war ended, Ratzinger resumed his seminary education at Freising.  Despite the lack of virtually everything material, the students joined together and zestfully studied for the priesthood, delving into a broad spectrum of philosophy and literature as well as theology.  From Freising, Ratzinger went to Munich to study at the university.  Here he encountered outstanding scholars and relished the challenge of new ideas and diverse perspectives.  He also dug deeply into biblical studies and the thought of St. Augustine.  “When I look back on the exciting years of my theological studies,” he recalls, “I can only be amazed at everything that is affirmed nowadays concerning the ‘preconciliar’ Church” (p. 57).  Rather than being a tradition-bound static era, it was a time of ferment and radical questioning.

His intellectual brilliance fully evident, Ratzinger was encouraged to pursue the doctorate and did so while serving as an assistant pastor in Munich.  He worked hard in youth ministry, received his degree, and then began teaching in the seminary in Freising.  Subsequently he moved to Bonn, where he was awarded the chair in fundamental theology.  Soon thereafter (moving quickly up the academic ladder) he was invited to Munster, then Tubingen and Regensburg.  In the midst of his moves, he was fully involved in the theological discussions of the ’50s and ’60s–including the efforts of some to reduce Revelation to the historical-critical method of biblical exegesis.  While at Tubingen, he saw existentialism literally collapse, to be replaced by the pervasive Marxism that continues to shape European universities.  His encounters with Karl Rahner ultimately led him to note that “despite our agreement in many desires and conclusions, Rahner and I lived on two different theological planets” (p. 128).  Scripture and the Fathers, not Kant and modern thought, were his beacons of truth.

Fully expecting to remain in academia for a lifetime, Ratzinger was, quite unexpectedly, appointed archbishop of Munich and Freising in 1977.  He chose, as his episcopal motto, a “phrase from the Third Letter of John, ‘Co-worker of the Truth'” (p. 153).  To fulfill that calling, he sought to anchor his diocese to the eternal Rock of Christ.  Committing one’s all to “the side of God,” of course, never guarantees worldly success, even in the Church.  But it does give stability to one’s decisions.  And it explains why Pope John Paul II soon called on Ratzinger to take control of the Congregation for the Doctrine of the Faith.

* * * * * * * * * * * * * * * * * * * *

Not long after assuming his new position in Rome, Cardinal Ratzinger was interviewed by Vittorio Messori, an Italian journalist.  The written record of that meeting, The Ratzinger Report (San Francisco:  Ignatius Press, c. 1985), provides considerable insight into both Ratzinger himself and his concerns for the Church.  He appears as a deeply devout man, clearly troubled by certain developments in the Catholic world following Vatican II that changed the Church more in 20 years than in the previous 200.  Especially troubling were theological currents, recklessly justified as in “the spirit of Vatican II,” which undermined the very foundations of faith.

To Ratzinger, orthodoxy–right belief–must ever remain preeminent in the life of believers, for “faith is the highest and most precious good–simply because truth is the fundamental life-element for man.  Therefore the concern to see that the faith among us is not impaired must be viewed–at least by believers–as higher than the concern for bodily health” (p. 22).  Timeless truths (e.g. sin and grace) ever offend secularists, but the Church must proclaim them.  Providing an example, the cardinal noted that he hoped some day to have the time to probe “the theme of ‘original sin’” and “the necessity of a rediscovery of its authentic reality” (p. 79).  Failing to take this doctrine seriously “and to make it understandable is really one of the most difficult problems of present-day theology and pastoral ministry” (p. 79).  But unless we’re sinners, we need no salvation!  Yet all around us “Christian” preachers refuse to tell folks the truth about sin!  Poorly informed, many folks just assume that everyone somehow goes to heaven because we’re all good enough to deserve it.  The Church has been entrusted with one great task:  to tell Truth to the world.  You don’t discern Truth by counting ballots.  The Church is sacramental and hierarchical, not social and democratic.  By teaching the Credo, the Our Father, the Decalogue, and the sacraments, Christ’s ministers can effectively lift up the Truth that’s sufficient for man’s redemption.

Thus the Church is, Ratzinger insisted, a divinely directed rather than a purely human institution and as such must live differently from the world.  Sadly enough, “We have lost the sense that Christians cannot live just like ‘everybody else'” (p. 115).  There is, along with orthodoxy, an orthopraxy that should characterize the Church.  So he insists on the ancient phrase:  Ecclesia semper reformanda.  But true reform comes neither from bureaucrats nor critics.  Reform ever results from the leaven of saints!  “Saints, in fact, reformed the Church in depth, not by working up plans for new structures, but by reforming themselves.  What the Church needs in order to respond to the needs of man in every age is holiness, not management” (p. 53).

The manifest lack of–indeed, contempt for–personal holiness characterizes the sexual revolution birthed in the ’60s.  In time we will lament, Ratzinger said, “the consequences of a sexuality which is no longer linked to motherhood and procreation.  It logically follows from this that every form of sexuality is equivalent and therefore of equal worth” (p. 85).  Unfettered from any real end, “the libido of the individual becomes the only possible point of reference of sex” and everyone does pretty much as he desires.  Consequently, “it naturally follows that all forms of sexual gratification are transformed into the ‘rights’ of the individual.  Thus, to cite an especially current example, homosexuality becomes an inalienable right” (p. 85).  Yet another current of the sexual revolution, radical feminism, has greatly troubled the Church, seeking to alter her very structure and message.  The cardinal was “in fact, convinced that what feminism promotes in its radical form is no longer the Christianity that we know; it is another religion” (p. 97).

In the face of much moral chaos, however, Ratzinger insists the Church must retain her focus and restate her message, come what may!

* * * * * * * * * * * * * * * * * * * *

A decade later another journalist, Peter Seewald, interviewed Ratzinger and published their conversations in Salt of the Earth:  Christianity and the Catholic Church at the End of the Millennium (San Francisco:  Ignatius Press, c. 1996).  Seewald provides a personal introduction, indicating that he had, as a youngster, rejected the Faith and thus interviewed Ratzinger with some genuine personal concerns regarding himself as well as his subject.  So his first section focused on “The Catholic Faith:  Words and Signs.”

Ratzinger is interested mainly in philosophy, theology, doctrine, ethics.  He grants that knowing theology doesn’t make one a better person, but when rightly studied and appropriated it matters eternally–both for an individual and the Church.  Though more celebrated “problems” may capture newspaper headlines, the real crisis in the Church today is theological, for she’s entrusted with declaring what one ought to believe.  To Ratzinger, “To the substance of the faith belongs the fact that we look upon Christ as the living, incarnate Son of God made man; that because of him we believe in God, the triune God, the Creator of heaven and earth; that we believe that this god bends so far down, can become so small, that he is concerned about man and has created history with man, a history whose vessel, whose privileged place of expression, is the Church” (p. 19).  In our day, especially in Europe, where the Church now represents a minority of the population, it takes courage to uphold the Faith in the face of mounting hostility.

Shifting from the discussion of Faith, Seewald asked Ratzinger a number of biographical questions.  (If one has read the cardinal’s Milestones, much in this section is repetitious, though one certainly gets fresh perspectives as he answers questions.)  He acknowledges that he is something of a Platonist and is openly devoted to St. Augustine.  He also cites a turning point, for him personally, when Marxists suddenly gained power, especially in the universities, in the late ’60s.  He instantly knew that “Christians” trying to mix Marx with Jesus–flying the flag of “progressivism”–would lose their integrity as Christians.  Since that time, “progressives” within the Catholic Church have sought to change her sexual standards, to install female priests, to make the Church something akin to themselves rather than Christ.

Obviously, Ratzinger noted, “not all who call themselves Christians really are Christians” (p. 220).  Real Christians seek to live out the Christ-like life divinely imparted to them.  They’re not intent on changing the world!   Indeed, as the 20th century demonstrates, “everything depends on man’s not doing everything of which he is capable–for he is capable of destroying himself and the world–but on knowing that what ‘should’ be done and what ‘may’ be done are the standard against which to measure what ‘can’ be done” (p. 230).  To give us direction we need spiritual renewal, not political revolution.  We need saints, not power-hungry protesters.  “What we really need,” says Ratzinger, echoing his words in The Ratzinger Report, “are people who are inwardly seized by Christianity, who experience it as joy and hope, who have thus become lovers.  And these we call saints” (p. 26).

* * * * * * * * * * * * * * * * * * * *

The third set of published interviews, God and the World:  A Conversation with Peter Seewald (San Francisco:  Ignatius Press, c. 2000), further enriches our understanding of Pope Benedict XVI.  By now the journalist Seewald had returned to the Faith and his questions are both more informed and sympathetic.  The conversations took place during three days in the abbey of Monte Cassino.  That a book of 460 pages, dealing expertly with the whole spectrum of Christianity, can be compiled in three days indicates something of the genius of Ratzinger!

Setting the stage in his preface, hinting at his own journey back to faith, Seewald wondered what to make of the fact that “Within a short period of time, something like a spiritual nuclear attack had befallen large sections of society, a sort of Big Bang of Christian culture that was our foundation” (p. 13).  Seewald recalls that Ratzinger, “one of the Church’s great wise men . . . patiently recounted the gospel to me, the belief of Christendom from the beginning of the world to its end, then, day by day, something of the mystery that holds the world together from within became more tangible.  And fundamentally it is perhaps quite simple.  ‘Creation,’ said the scholar, ‘bears within itself an order.  We can work out from this the ideas of God–and even the right way for us to live'” (pp. 14-15).  Faith and love, rightly amalgamated, provide us that way.

Consequently, the Faith, rooted in the Truth of Revelation, cannot be compromised.  “I always recall the saying of Tertullian,” Ratzinger says, “that Christ never said ‘I am the custom’, but ‘I am the truth'” (p. 35).  Thus the task of the Church, in the words of Romano Guardini, is to “‘steadily hold out to man the final verities, the ultimate image of perfection, the most fundamental principles of value, and must not permit herself to be confused by any passion, by any alteration of sentiment, by any trick of self-seeking'” (p. 65).  To the cardinal:  “Christianity makes its appearance with the claim to tell us something about God and the world and ourselves–something that is true and that, in the crisis of an age in which we have a great mass of communications about truth in natural science, but with respect to the questions essential for man we are sidelined into subjectivism, what we need above all is to seek anew for truth, with a new courage to recognize truth.  In that way, this saying handed down from our origins, which I have chosen as my motto, defines something of the function of a priest and theologian, to wit, that he should, in all humility, and knowing his own fallibility, seek to be a co-worker of the truth” (p. 263).

Seeing the truth, discerning the Logos in creation, enables one to share Sir Isaac Newton’s conviction that “The wonderful arrangement and harmony of the universe can only have come into being in accordance with the plans of an omniscient and all-powerful Being.  That is, and remains, my most important finding” (p. 47).  The clear mathematical structure of the cosmos reveals its Logos.  Equally rational, one discerns moral truths that are as objective and inflexible as mathematical formulae.  The Ten Commandments, explained by Ratzinger as “commandments of love” (p. 180), are always and everywhere valid because they tell us the truth about God and ourselves.  Thus it follows, he says, that:  “Setting moral standards is in fact the most prominent work of mercy” (p. 317).

Since Seewald guides Ratzinger through the major themes of the Catechism, God and the World is a rather handy, informal primer for the Catholic faith.  Combined with The Ratzinger Report and Salt of the Earth, it provides valuable insight into the personality and theology of the new pontiff.

* * * * * * * * * * * * * * * * * * * *

Ratzinger played a significant role, as a young theologian, in the deliberations of  the Second Vatican Council.  Soon thereafter, in 1967, he gave a series of lectures at Tubingen that were published as Introduction to Christianity (New York:  The Seabury Press, c. 1969).  Here we find the scholar, citing current theologians, documenting positions, doing intellectual work of the highest order.  The foundation for much of what he said in the intervening 40 years was made clear in this book.

He first addressed “belief in the world of today,” noting the widespread disbelief that challenges the Church.  Multitudes cannot believe in anything intangible.  Even within the Church, many find themselves troubled with doubts of all kinds.  Ratzinger sheds light on the problem by tracing its history, especially evident in the shift from the Ancient and Medieval position that “Verum est ens” to the view of Giambattista Vico, that “Verum quia factum.”  Following Vico, thinkers like Hegel and Marx reduce all questions to historical issues and consider what man makes, not what God has made.  In time (initially among the intelligentsia but now everywhere) belief in God slowly eroded away.

To introduce modern man to Christianity, then, Ratzinger proposed that Christians primarily declare “I believe in You,” meaning Jesus Christ, and encourage him to find “God in the countenance of the man Jesus of Nazareth” (p. 48).  When one recites the Apostles’ Creed, one certainly assents to its propositions, but more importantly he gives witness to his conversion–his “about-turn”–to the way of Christ.  Having heard the Word, one takes it in from an outside source.  Faith comes to us from God.  We do not construct a worldview of some sort and then decide to live accordingly.  Rather, we take, as a gift, what is revealed to us.  “Christian belief is not an idea but life; it is not mind existing for itself, but incarnation, mind in the body of history and its ‘We’.  It is not the mysticism of the self-identification of the mind with God but obedience and service” (p. 64).

To believe in One God involves not merely monotheism but–unlike the tendency to construct localized or tribal deities–the acknowledgement that the “God of our fathers” is “not the god of a place, but the god of men:  the God of Abraham, Isaac and Jacob.  He is therefore not bound to one spot, but present and powerful wherever man is” (p. 83).  This God simply Is.  He’s not in the process of becoming something, as is the world around us.  This “God who ‘is’ is at the same time he who is with us; he is not just God in himself, but our God, the ‘God of our fathers'” (p. 88).  The God of our fathers is also “our Father.”  Ratzinger says:  “By calling God simultaneously ‘Father’ and ‘Almighty’ the Creed has joined together a family concept and the concept of cosmic power in the description of the one God.  It thereby expresses accurately the whole point of the Christian image of God:  the tension between absolute power and absolute love, absolute distance and absolute proximity, between absolute Being and a direct affinity with the most human side of humanity” (p. 104).

Creation bears witness to its Maker.  “Einstein said once that in the laws of nature ‘an intelligence so superior is revealed that in comparison all the significance of human thinking and human arrangements is a completely worthless reflection'” (p. 106).  To Ratzinger, this means that we merely re-think “what in reality has already been thought out beforehand” (p. 106).  There is a Logos giving rational structure to all that is.  Rejecting the notion that the world is a purely random collection of material things, Christians marvel at it as the artistry of a divine Mind.  This God has revealed himself, preeminently in the Christ who referred to both His Father and the Spirit.  “God is as he shows himself; God does not show himself in a way in which he is not.  On this assertion rests the Christian relation with God; in it is grounded the doctrine of the Trinity; indeed, it is this doctrine” (p. 117).  History reveals how easily we err in trying to rationally explain this doctrine–sliding into monarchianism or subordinationism.  The Church, wisely, has insisted we be “content with a mystery which cannot be plumbed by man” (p. 118).

So too there is a mystery to Jesus Christ, the Word made flesh, reconciling the world to himself.  Fully aware of various theories concerning “the Jesus of History” and the “Christ of Faith,” Ratzinger finds “it preferable and easier to believe that God became man than that such a conglomeration of hypotheses represents the truth” (p. 159).  Thus the Virgin Birth reveals “how salvation comes to us; in the simplicity of acceptance, as the voluntary gift of the love that redeems the world” (pp. 210-211).

We’re saved by grace, given to us by a loving Father in the person and work of His Son.  To respond with faith and love makes us Christian.

165 Pragmatism’s Founders

American historians generally note that “pragmatism” is this nation’s only uniquely homespun philosophy.  Now most all philosophical labels are to a degree misleading generalizations, and the men who crafted pragmatism were hardly of one mind.  Nevertheless, as Louis Menand shows in his marvelously informative The Metaphysical Club:  A Story of Ideas in America (New York:  Farrar, Straus, Giroux, c. 2001), a handful of New England Pragmatists (Oliver Wendell Holmes, Jr.; William James; Charles S. Peirce; John Dewey) shared certain perspectives and deeply shaped this nation.  The men’s biographies–including family roots, personal experiences, social connections, New England backgrounds, and the challenge of Darwinism–provide valuable contexts for understanding their thought.

Indeed:  “together they were more responsible than any other group for moving American thought into the modern world. . . .  Their ideas changed the way Americans thought–and continue to think–about education, democracy, liberty, justice, and tolerance.  And as a consequence, they changed the way Americans live–the way they learn, the way they express their views, the way they understand themselves, and the way they treat people who are different from themselves.  We are still living, to a great extent, in a country these thinkers helped to make” (pp. x-xi).  To distill their positions to a single sentence:  “they all believed that ideas were not ‘out there’ waiting to be discovered, but are tools–like forks and knives and microchips–that people devise to cope with the world in which they find themselves” (p. xi).

Menand first treats Oliver Wendell Holmes, Jr., who grew up in Boston, the son of an eminent physician and writer.  Unlike his famous father, he fully subscribed to the abolitionist agenda and enthusiastically marched off to battle when the Civil War began.  He proved to be a courageous, repeatedly wounded soldier.  But in the course of the war he lost his faith in both abolitionism and God.  Moral and metaphysical certainties of any stripe, he decided, lead to ghastly violence.  He simultaneously discovered and fully embraced the philosophical naturalism espoused by Charles Darwin in On The Origin of Species.  (Importantly, one of the constants in the Pragmatists’ story is the influence of Darwin’s theory of evolution through natural selection.  However one responds to the biological hypothesis, one cannot deny its pervasive philosophical and sociological consequences.)

Losing faith in God and social reform on the battlefield, Holmes substituted an admiration for his fellow soldiers and the ultimate prerogatives of power.   For the rest of his life he routinely recounted his involvement in battles and reminded folks of his wounds.  Though distressed by the war’s violence, he still seemed fixated on it.  And he clearly concluded that “might makes right” because there really isn’t any ultimate “right.”  Before Nietzsche uttered his oracles Holmes had settled into a Nietzschean nihilism.  “‘You respect the rights of man–,’ he wrote to Laski.  ‘I don’t, except those things a given crowd will fight for–which vary from religion to the price of a glass of beer.  I also would fight for some things–but instead of saying that they ought to be I merely say they are part of the kind of world that I like–or should like'” (p. 63).

Like Holmes, William James was sired by an illustrious father, Henry, who embodied both the enthusiasm and anarchical sectarianism of America’s Second Great Awakening.  Henry passed through a variety of intense religious experiences and even studied briefly at Princeton Theological Seminary.  In time he embraced Swedenborgianism, wherein he enjoyed the freedom to shape his own mystical religious convictions in accord with his own experiences.  No church ever suited him, so he became his own church.  Like many who inherit great wealth and never work to earn a living, he was fully fascinated with socialism, drinking draughts of Charles Fourier and fantasizing about “‘the realization of a perfect society, fellowship, or brotherhood among men'” (p. 85).

Young William, after traipsing about Europe and picking up a smattering of education from various tutors and schools, ultimately studied biology with the acclaimed Louis Agassiz at Harvard.  In time James embraced the very Darwinism that Agassiz rejected, though he was, of course, fully aware of its implicit, inescapable determinism.  For, Menand emphasizes:  “The purpose of On the Origin of Species was not to introduce the concept of evolution; it was to debunk the concept of supernatural intelligence–the idea that the universe is the result of an idea” (p. 121).  One may quite easily believe in a form of evolution under divine guidance.  “What was radical about On the Origin of Species was not its evolutionism, but its materialism” (p. 121).

But a materialist William James was not and could not be, so he struggled to carve out realms of personal freedom within the broader scope of biological necessity.  He could not abide Thomas Huxley’s conclusion that “We are conscious automata.”  Somehow the processes of natural selection had mysteriously spun out human beings who freely choose what to think and how to live.  “There is intelligence in the universe:  it is ours.  It was our good luck that, somewhere along the way, we acquired minds.  They released us from the prison of biology” (p. 146).  Thus the pragmatism James espoused was, he said, “the equivalent of the Protestant Reformation” (p. 88), a new faith for the new scientific world.  If believing in God and freedom enabled one to live better, such beliefs are “true.”

Charles S. Peirce, like James and Holmes, was the son of a prominent Bostonian, Professor Benjamin Peirce.  His father taught mathematics at Harvard and was, in his own right, a highly significant intellectual.  He considered himself an “idealist,” for “he believed that the universe is knowable because our minds are designed to know it.  ‘In every form of material manifestation,’ he explained, ‘there is a corresponding form of human thought, so that the human mind is as wide in its range of thought as the physical universe in which it thinks.  The two are wonderfully matched.’  Thought and matter obey the same laws because both have a common origin in the mind of a Creator.  This is why the truths of mathematical reasoning (as [Benjamin] Peirce often reminded his students) are God’s truths” (p. 156).

Young Charles Peirce was precociously brilliant–and almost equally eccentric.  Thus he never settled into an established career.  He earned a living, primarily, as an employee of a federal bureaucracy, thanks to his father’s influence.  And he wrote reams of material never published in his lifetime.  Ironically, perhaps the most brilliant of the “pragmatists” was not, in many significant ways, a Pragmatist!  He did, however, deal with significant issues, such as statistics and probability theory, and in these areas insisted on a form of pragmatic epistemology.  Furthermore, like Holmes and James, he addressed the philosophical implications of Darwinism, wondering how we can know anything if the world is merely the product of chance and necessity.  He decided that “chance variation could explain evolution adequately–[but] he thought God’s love must play a more important role, a theory he called ‘agapism,’ derived in part from the Swedenborgian writings of Henry James, Sr.–and he could not imagine a universe devoid of ultimate meaning” (p. 365).  He also argued that great scientists, like Kepler, came to their conclusions through a “kind of guessing Peirce called ‘abduction’; he thought that it was a method integral to scientific progress, and that it pointed to an underlying affinity between the mind and the universe” (p. 367).

The fourth thinker Menand studies, John Dewey, grew up in Vermont and studied at the state’s university in Burlington.  He earned a Ph.D. at The Johns Hopkins University, and then successively taught philosophy at the University of Michigan, the University of Chicago, and Columbia University.  He moved intellectually as well as geographically, shifting (in the 1890s) from Hegelian idealism to a form of Pragmatism (often called Instrumentalism) that he espoused thenceforth.  We learn, he decided, almost exclusively by doing.  So schools should be places where we learn to cook, sew, and construct things; they should be small shops where we work together and solve very practical problems.  Math and science, geography and psychology–whatever’s worth knowing–should be discovered by students engaged in activities of some sort.  In John Dewey progressive educators had their American guru!  Progressive politicians had their guide!  Progressive churchmen had a new Moses!

Menand’s genius is to weave together the four men’s biographies and make a tapestry of the times.  His research, evident in both the notes and his obvious familiarity with the subject, bears witness to his patient plowing through archives as well as publications.  His synthesis, making the book much more than a series of biographical vignettes, reveals the fundamental issues and lasting legacy of the men studied.  The style, crisp and alluring, draws the reader into an exciting intellectual adventure.  The Metaphysical Club is certainly one of the finest works of intellectual history of recent decades.

* * * * * * * * * * * * * * * * * * * *

Menand considers Holmes a pragmatist, and inasmuch as he was an ethical consequentialist, he fits into that tradition.  But Albert W. Alschuler, a Professor of Law at the University of Chicago, portrays him as more properly a nihilistic existentialist, much akin to Nietzsche.  He was likewise a Social Darwinist, fully imbued with that bleakly naturalistic philosophy.  Because of Holmes and his followers, “the central lyric of twentieth-century American jurisprudence” is summed up by Perry Farrell, the lead singer of Porno for Pyros, who decreed:  “‘Ain’t no wrong, ain’t no right, only pleasure and pain'” (pp. 189-190).  Deeply displeased with such developments, Alschuler incisively critiques Holmes in Law Without Values:  The Life, Work, and Legacy of Justice Holmes (Chicago:  The University of Chicago Press, c. 2000).  Holmes’s philosophy, rightly examined, is as irrationally adolescent and wrong as Perry Farrell’s song.

Without doubt Justice Holmes, Alschuler argues, “more than any other individual, shaped the law of the twentieth century” (p. 1).  And he cast it in utterly amoral terms, contending “that moral preferences are ‘more or less arbitrary . . . .  Do you like sugar in your coffee or don’t you? . . .  So as to truth'” (p. 1).  In the deepest sense, Holmes rejected “objective concepts of right and wrong” and set a “downward” trajectory that explains the moral vacuity of many recent court decisions.   When we wonder about Supreme Court decisions–involving cases concerning the Ten Commandments, homosexual rights, property rights, partial birth abortion, etc.–we do well to trace their philosophical roots to Oliver Wendell Holmes, Jr.

Holmes was, in many ways, a thoroughgoing skeptic, much like the ancient Sophists such as Thrasymachus (portrayed in Plato’s Republic as Socrates’ amoral antagonist).  And Holmes’s followers, in legal circles today, are legion and sophistic.  Relativism reigns.  Moral truth is whatever the largest or most vociferous or politically correct crowd desires.  Vices and virtues are merely words indicating personal preferences.  “All these American scholars,” Alschuler says, “have tilted from Socrates on the issue that marks the largest and most persistent divide in all jurisprudence.  In ancient Athens, the philosopher Thrasymachus anticipated Holmes by 2,300 years when he said, ‘Justice is nothing else than the interest of the stronger.’

Rejecting this position, Socrates replied that justice was not the enacted will of the powerful but ‘the excellence of the soul.’  He argued that justice was unlike medical treatment (a means to an end) or an amusing game (which had no end beyond itself).  Justice was a good of the highest order–an end and a means, a good to be valued for itself and for its consequences.  In Rome four hundred years later, Cicero described justice as ‘right reason in agreement with nature'” (p. 8).  Cicero and Socrates helped shape the “natural law” or “moral realist” tradition so evident in the founding documents of the United States.  “We hold these truths to be self-evident,” said Jefferson, in a succinct declaration of the natural law, “that all men are created equal and are endowed by their Creator with certain unalienable rights.”  This nation’s Constitution and the laws implementing it were shaped by Locke and Blackstone, then stamped with an American imprint by Madison, Marshall, Story, and Lincoln.  The years from 1776 to 1860 were the “golden age” of American law.

Oliver Wendell Holmes and his followers, however, rejected any “natural” or “divine” law and imposed what Pope Benedict XVI recently described as a “dictatorship of relativism.”  Contrary to Jefferson’s “Declaration of Independence,” Holmes saw “‘no reason for attributing to a man a significance different in kind from that which belongs to a baboon or to a grain of sand'” (p. 23).  “‘All my life,’ said the architect of 20th century American jurisprudence, ‘I have sneered at the natural rights of man'” (p. 26).  Instead, he propounded “a power-focused philosophy,” says Alschuler.  As Thrasymachus asserted, “might makes right.”  Thus, while sitting on the United States Supreme Court, Holmes wrote:  “‘I have said to my brethren many times that I hate justice, which means that I know that if a man begins to talk about that, for one reason or another he is shirking thinking in legal terms'” (p. 89).

On a personal level, biographers agree, Holmes cared little for anyone other than himself.  Revealingly, when he died he left his entire estate to the federal government!  After spending 15 years preparing an authorized biography he never published, Grant Gilmore said:  “‘The real Holmes was savage, harsh, and cruel, a bitter and lifelong pessimist who saw in the course of human life nothing but a continuing struggle in which the rich and powerful impose their will on the poor and weak'” (pp. 31-32).  Which is as it should be because it simply is what is!  In his support of eugenics, for example, he revealed a thinly disguised contempt for the weak and unfit, delighting to uphold laws sterilizing imbeciles and “writing approvingly of killing ‘everyone below standard’ and ‘putting to death infants that didn’t pass the examination'” (p. 29).

In his enthusiasm for military valor and virtues, in his celebrated will-to-power nominalism, he clearly resembled Nietzsche.  The two “were born three years apart and had much in common.  Both viewed life as a struggle for power; both were antireligious . . .; both saw ethics as lacking any external foundation; both could fairly be regarded as existentialists; both saw the suffering and exploitation of some as necessary to the creative work of others; both were personally ambitious and had a strong work ethic; both had a strong sense of personal destiny; . . . both often seemed indifferent to the feelings of those around them; both found in their wartime experiences a metaphor for the universe at large; and both had military-style moustaches” (p. 19).

As a legal scholar, Alschuler gives meticulous attention to Holmes’s writings.  Though they enjoy something of a hallowed place in legal circles, Alschuler finds them sorely deficient in many ways.  He regards The Common Law, the treatise that established Holmes’s reputation in the 1880s, a “clear failure” (p. 125).  Only the first paragraph–the lines recited by most scholars–proves memorable.  Likewise, Holmes’s 1897 article, “The Path of the Law,” considered by Richard Posner “‘the best article-length work on law ever written'” (p. 132), cannot withstand careful scrutiny.  Written, Holmes said, to “‘dispel a confusion between morality and law'” (p. 150), and committed to the proposition that “‘All law means that I will kill you if necessary to make you conform to my requirements'” (p. 144), the article reveals his amoral positivism.  “In The Path of the Law,” Alschuler says, “Holmes listed five words to illustrate the sort of moral terminology he proposed to banish from law–rights, duties, malice, intent, and negligence” (p. 172).  He particularly despised “duty.”  Laws are issued by whoever is in power, and one obeys them because he must do so.  But there is no inner “ought,” no moral imperative, no reason to do what’s honorable.

“The Path of the Law,” writes Alschuler, “has molded American legal consciousness for more than a century, and lawyers now carry gallons of cynical acid to pour over words like duty, obligation, rights, and justice” (p. 176).  It has quite recently been described by legal scholars “as an ‘acknowledged masterpiece in jurisprudence,’ ‘the single most important essay ever written by an American on the law,’ and perhaps ‘the best article-length work on law ever written'” (p. 180).  But Alschuler insists that Holmes’s “theory of contracts,” set forth to replace the natural law position, is “a hopeless jumble of ill-considered prescriptive and descriptive ideas” (p. 176).  Despite its influence and renown, the essay is in many ways quite “incoherent” (p. 135), and its very incoherence reflects Holmes’s Nietzschean, Darwinian worldview where nothing much makes sense.

Alschuler ends his critique of Holmes with a chapter entitled “Ending the Slide from Socrates and Climbing Back.”  The chapter’s first two paragraphs deserve quoting, for they sum up the case against one of the most powerful 20th century intellectual currents:  “The current ethical skepticism of American law schools (in both its utilitarian and law-as-power varieties) mirrors the skepticism of the academy as a whole.  Some twentieth-century pragmatists, extending their incredulity further than Holmes, have abandoned the idea that human beings can perceive external reality–not only right, wrong, and God (issues on which Holmes took a skeptical stance) but also gravity, suffering, and even chairs (issues on which Holmes was a realist).  These pragmatists maintain that the only test of truth is what works, and a century of pragmatic experimentation has given that question a clear answer:  Pragmatism and moral skepticism don’t work; they are much more conducive to despair than to flourishing.  They fail their own test of truth.  We have walked Holmes’s path and have lost our way” (p. 187).

Given their demonstrable failure, we need to find a better way.

# # #

164 Rodney Stark’s Church History

For several decades Rodney Stark, currently a sociology professor at Baylor University, devoted himself to the sociology of religion.  But he was always “a history buff,” and 20 years ago he read Wayne Meeks’s The First Urban Christians.  Thus began, somewhat as an avocation, his wide reading in Church history.  With an academic outsider’s perspective, he began asking different questions and taking different approaches to the subject, leading to the publication of highly readable and scintillating works such as The Rise of Christianity:  How the Obscure, Marginal Jesus Movement Became the Dominant Religious Force in the Western World in a Few Centuries (Princeton: Princeton University Press, c. 1996; San Francisco:  Harper San Francisco reprint, 1997).  In general, the book seeks “to reconstruct the rise of Christianity in order to explain why it happened” (p. 3).

But the book is not a sustained chronological narrative.  Rather, each of its 10 chapters stands alone–a collection of essays providing an analysis of something that strikes Stark as significant.  In chapter one he considers “Conversion and Christian Growth,” seeking to understand how the 120 Christians at Pentecost launched a movement that literally won the world for Christ.  The data indicate the Early Church grew at the rate of “40 percent per decade” for several centuries (p. 6).  This is virtually the same growth rate enjoyed by the Mormons for the past century, and at that rate there would have been only 7,530 Christians by the year 100 A.D., and some 40,000 by 150.  Thereafter, as anyone who understands compound interest knows, the numbers dramatically increased and the Roman Empire was “Christian” mid-way through the fourth century.
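Stark’s figures are simple compound interest and easy to check.  The sketch below assumes (as his projection apparently does; the base figure is an inference, not stated in this review) roughly 1,000 Christians in A.D. 40, compounding at 40 percent per decade:

```python
# Compound-growth sketch of Stark's projection.
# Assumed base: 1,000 Christians in A.D. 40, growing 40 percent per decade.
BASE, BASE_YEAR, RATE = 1000, 40, 0.40

def christians(year):
    """Projected number of Christians in a given year A.D."""
    decades = (year - BASE_YEAR) / 10
    return BASE * (1 + RATE) ** decades

for year in (100, 150, 250, 300, 350):
    print(year, round(christians(year)))
```

The point of the exercise is the shape of the curve: the totals stay tiny for two centuries (about 7,530 by A.D. 100 and some 40,500 by 150), then surge past thirty million by 350, which is why a steady, unspectacular growth rate suffices to explain a “Christian” empire by the mid-fourth century.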

In chapter two Stark discounts the popular notion that “Christianity was a movement of the dispossessed.”  Friedrich Engels championed this view, arguing that it was a “‘religion of slaves and emancipated slaves, of poor people deprived of all rights'” (p. 29).  Many historians–and the multitudes of scholars influenced by Ernst Troeltsch–widely embraced such Marxist thinking.  “By the 1930s this view of Christian origins was largely unchallenged” (p. 29), and legions of professors still repeat the litany.  But, Stark insists, it must be discarded because it’s utterly untrue.  Today “a consensus has developed among New Testament historians that Christianity was based in the middle and upper classes” (p. 31).  Aristocrats and wealthy believers, scholars and highly educated folks, were quite prominent in the Early Church.  This squares with current sociological evidence regarding religious sects and cults, which almost never thrive among the poor and dispossessed.  Movements such as the Mormons and Moonies appeal to the well educated and prosperous.  So, Stark argues, it makes sense to envision the Early Christians as appealing to the same social strata.

In the next chapter Stark argues that converted Jews were an enduring Christian constituency.  He thinks that  “not only was it the Jews of the diaspora who provided the initial basis for church growth during the first and early second centuries, but that Jews continued as a significant source of Christian converts until at least as late as the fourth century and that Jewish Christianity was still significant in the fifth century” (p. 49).   There were, of course, far more Jews of the diaspora than Jews living in Palestine.  Many of these Jews had so lost their Hebrew roots that a Greek translation of the Scriptures (the Septuagint) had become necessary.  Among these Hellenized Jews the Christians found a fertile field for the Gospel.  “If we examine the marginality of the Hellenized Jews, torn between two cultures, we may note how Christianity offered to retain much of the religious content of both cultures and to resolve the contradictions between them” (p. 59).

Further contributing to Church growth were epidemics, cited by Church Fathers such as Cyprian and Eusebius as factors in drawing converts to a community that cared for the sick and dying as well as offered the promise of resurrection and eternal life.  From 165-180 A.D., an epidemic (probably smallpox) decimated the Roman Empire, reducing the population by at least one-fourth.  In 251 another empire-wide epidemic (probably measles) raged.  Such horrendous crises precipitate religious questioning–as the response of American Indians to similar catastrophes documents.  Importantly, for the Church in the ancient world, while pagans fled the scene of suffering Christians came alongside those who were ill, choosing to risk death rather than desert those in need.  They also cared for the poor, the widows and orphans, demonstrating a qualitatively different kind of religious faith.  Consequently, multitudes of disillusioned pagans turned to the Christian way.

Women too were drawn to the Early Church, where they were more highly revered than in the Greco-Roman world.  This has long been recognized, but Stark seeks “to link the increased power and privilege of Christian women to a very major shift in sex ratios.  I demonstrate that an initial shift in sex ratios resulted from Christian doctrines prohibiting infanticide and abortion; I then show how the initial shift would have been amplified by a subsequent tendency to over recruit women” (p. 95).  Due to the exposure of female babies–in many families only one baby girl was allowed to live–there were significantly more men than women in the first and second centuries.  This meant, of course, a dramatic depopulation trend!  Christians, conversely, with their high view of marriage and fidelity, considered children a blessing and encouraged large families as well as opposed abortion (thus saving the lives of many women who would have died as a result of this dangerous procedure).  Contrary to some feminist readings of the documents, Stark insists that women were drawn to the Church not because it offered them places of political status and power but because Christians insisted that marriage is sacred, life is sacred, and children are to be treasured.

In his final chapter Stark proposes a thesis that explains why women and others were drawn to the Church:  “Central doctrines of Christianity prompted and sustained attractive, liberating, and effective social relations and organizations” (p. 211).  Love and mercy, rooted in the Christian understanding of God as revealed in Jesus Christ, were not celebrated by pagans, but they formed the foundations for Christianity.  Nor did pagans endorse the sanctity of life.  But “above all else, Christianity brought a new conception of humanity to a world saturated with capricious cruelty and the vicarious love of death” (p. 214).

* * * * * * * * * * * * * * * * * *

In One True God: Historical Consequences of Monotheism (Princeton: Princeton University Press, c. 2001), Stark pursues the thesis that monotheism is the most important “innovation” in history.  He makes a clear distinction between “godless religions,” such as Buddhism and Taoism, and “godly religions” such as Judaism and Christianity.  “Godless religions” may assume a distant, unknowable deity of some sort, but they appeal to an intellectual elite of monks and philosophers.  “I am comfortable,” he says, “with the claim that Taoism, for example, is a religion, but it seems unwise to identify the Tao as a God.  Indeed, for centuries sophisticated devotees of Buddhism, Taoism, and Confucianism have claimed that theirs are Godless religions.  I agree” (p. 10).  Remarkably different, however, are the “godly religions” that proclaim the reality of the “one true God” who has revealed Himself and has a clear plan for mankind, and they have proved historically momentous.

Primitive cultures–as ably documented by Andrew Lang, Paul Radin and Wilhelm Schmidt–often believed in “High Gods” remarkably akin to monotheism, but only Judaism, Christianity, and Islam embraced a coherent vision of God’s nature and of His will for the “chosen” people.  Monotheists call for conversion, and “to convert is to newly form an exclusive commitment to a God” (p. 50).  Monotheists, uniquely, were missionaries.  Though no longer so, Judaism, in the Ancient World, was known as a “missionizing faith” (p. 52), a fact noted by Max Weber, who credited the success of Jewish proselytism to “‘the purity of the ethic and the power of the conception of God'” (p. 59).  In turn, Christians so successfully spread their faith that within 300 years “more than half of the population of the empire (perhaps as many as thirty-three million people) had become Christians.”  More recently, of course, missionaries have taken the Gospel almost literally to the uttermost parts of the earth.

By definition missionaries are true believers!  Clergy in established churches easily lose their evangelistic zeal, and broad-minded “liberals” during the past century (with their focus on tolerance and pluralism) quickly abandoned evangelism of any sort.  Indeed, skeptical churchmen who no longer believed “in anything more Godly than an essence, began to express doubts as to whether there was any theological or moral basis for attempting to convert non-Christians” (p. 99).  Following WWI, American liberals rejected the notion of God “as an aware, conscious, concerned, active being” and anticipated Paul Tillich’s hypothetical “God as a psychological construct, the ‘ground of our being'” (p. 100).  Rather than seeking “converts,” liberal Christians engaged in various forms of humanitarian “service,” endeavors which enlist few life-long vocations and attract few converts.

True believers seeking converts cannot but engage in religious conflicts, because “particularism, the belief that a given religion is the only true religion, is inherent in monotheism” (p. 116), and Stark details some of the darker moments of monotheism–persecution of the Jews by both Muslims and Christians at various times, the Crusades, the Thirty Years’ War.  This same particularism also explains the powerful persistence of the monotheistic religions.  Amazingly, however, as the last chapter–”God’s Grace: Pluralism and Civility”–shows, believers have lately learned to peacefully co-exist.  “Adam Smith’s great insight about social life is that cooperative and socially beneficial outcomes can result from each individual human’s acting to maximize his or her selfish interests” (p. 221).  Such seems true in today’s religious world.  Catholics and Protestants, Christians and Jews, have found “that people can both make common cause within the conventions of religious civility and retain full commitment to a particularistic umbrella” (p. 248).

Ironically, persecution and intolerance now distinguish secularists rather than religionists!  Deistic clergy fulminate against despised “fundamentalists,” and “a new study has demonstrated that the only significant form of religious prejudice in America is ‘Anti-Fundamentalism,’ and it is concentrated among highly educated people without an active religious affiliation” (p. 256).

* * * * * * * * * * * * * * * * * *

In a companion volume to One True God, Rodney Stark has written For the Glory of God: How Monotheism Led to Reformations, Science, Witch-Hunts, and the End of Slavery (Princeton: Princeton University Press, c. 2003).  Four lengthy chapters (each nearly 100 pp. long) focus on the four topics listed in the book’s subtitle.  And in each one Stark tries to rectify the historical record, duly crediting Christians for their contributions to Western Civilization.  Though he acknowledges his debt to historians’ research, he admits to being disillusioned by their biases.  He was startled by many of their anti-Christian and (especially) anti-Catholic comments.  “Far more pernicious, however,” he says, “are the many silences and omissions that distort scholarly comprehension of important matters” (p. 13).  The desire to shed light on what really happened motivates this study.

For 2000 years the Christian Church has been renewed by continuous reformations, though Stark focuses almost exclusively upon the 15th and 16th centuries.  Many of these movements were “sectarian” in nature and, like the early Christians, led by “privileged” rebels such as Peter Waldo, John Wyclif, Jan Hus, and Martin Luther.  Almost never were they the “revolts of the poor” so lionized by Marxist propagandists.  Reformers sincerely sought “God’s Truth.”  Theology, not economics, motivated them, though the success of their movements was powerfully shaped by various cultural factors.  Those that truly mattered, Stark says, were three:  1) Catholic weaknesses in lands that turned Protestant; 2) government response–autocratic regimes sustained Catholicism in countries like Spain; and 3) monarchs’ “self-interest,” obviously determinative in Luther’s Saxony and Henry VIII’s England, but crucial wherever Protestantism prevailed.

Stark begins his second chapter, “God’s Handiwork: The Religious Origins of Science,” with a long quotation from Andrew Dickson White’s two-volume A History of the Warfare of Science with Theology in Christendom, the popular source of much misinformation such as the “fact” that Columbus “discovered” that the earth is spherical.  White, as well as fellow atheists such as Carl Sagan and Richard Dawkins, simply falsifies the historical record so as to advance a philosophical agenda.  In fact, Stark argues “not only that there is no inherent conflict between religion and science, but that Christian theology was essential for the rise of science” (p. 123).  This is not “news” to those acquainted with the work of Stanley Jaki and Alfred North Whitehead (who in 1925 stated that science developed in tandem with Medieval theology), but it certainly challenges conventional textbook presentations.  It cannot be too strongly stated that Christianity uniquely nourished science, whereas neither the Greeks nor the Chinese, neither the Maya nor the Muslims encouraged scientific development.  And many scientists today are strong Christians!  Indeed, “professional scientists have remained about as religious as most everyone else, and far more religious than their academic colleagues in the arts and social sciences” (p. 124).  Stark illustrates his case with an impressive list of great Christian scientists, past and present.

Still more:  neither the “Dark Ages” nor the “Scientific Revolution” is a historically accurate label, for the latter was very much a continuation of the former.  Significant scholarly, and scientific, work took place during the Medieval era, a time of “‘precise definition and meticulous reasoning, that is to say, clarity'” as Alfred Crosby insisted (p. 135).  “Christianity depicted God as a rational, responsive, dependable, and omnipotent being and the universe as his personal creation, thus having a rational, lawful, stable structure, awaiting human comprehension” (p. 147).  Thus St. Albert the Great was a great scientist in the 13th century–and probably a much better thinker than was Nicholas Copernicus in the 16th!  The greatest scientist of the 18th century, Isaac Newton, devoted inordinate time (and a million written words) to biblical study and speculation.  His private letters “ridiculed the idea that the world could be explained in impersonal, mechanical terms” (p. 168).  According to John Maynard Keynes, who purchased a collection of his manuscripts, Newton “‘regarded the universe as a cryptogram set by the Almighty'” (p. 172).

Newton’s approach to the universe, however, was scuttled by Charles Darwin and his epigones.  In Stark’s judgment, “the battle over evolution is not an example of how ‘heroic’ scientists have withstood the relentless persecuting of religious ‘fanatics.’  Rather, from the very start it has primarily been an attack on religion by militant atheists who wrap themselves in the mantle of science in an effort to refute all religious claims concerning a Creator–an effort that has also often attempted to suppress all scientific criticism of Darwin’s work” (p. 176).  The theory of evolution through natural selection has not really explained the origin of species, though a great deal of rhetorical disingenuousness disguises that fact.  For example:  millions of fossils have been unearthed during the past century, “but the facts are unchanged.  The links are still missing; species appear suddenly and then remain relatively unchanged” (p. 180).  Thus great thinkers, such as Karl Popper, have “suggested that the standard version of evolution even falls short of being a scientific theory” (p. 191).

Yet the Darwinian faithful retained their fervor and Popper was assailed for his obtuseness!  What Stark labels “the Darwinian Crusade” has been propelled by “activists on behalf of socialism and atheism” (p. 185).  Alfred Russel Wallace shared Darwin’s evolutionary hypothesis and declared that it unveiled “the coming of that biological paragon of selflessness, ‘socialist man'” (p. 186).  In Darwin’s library one finds “a first edition of Das Kapital, inscribed to ‘Mr. Charles Darwin.  On the part of his sincere admirer, Karl Marx, London 16 June 1873.’  More than a decade before, when he read The Origin, Marx wrote to Engels that Darwin had provided the necessary biological basis for socialism” (p. 186).  Thomas Henry Huxley’s passionate commitment to Darwinian evolution was deeply rooted in his anti-Christian hostility.  Ideology and emotion, not objectivity, dominate Darwinism!

Stark’s third chapter is entitled “God’s Enemies: Explaining the European Witch-Hunts.”  Careful calculations, he insists, indicate that during the witch-hunting era (1450-1750), “in the whole of Europe it is very unlikely that more than 100,000 people died as ‘witches'” (p. 203).  Though radical feminists and anti-Christian historians often toss around numbers in the millions, they are simply venting their feelings and prejudices rather than dealing with the evidence.  Stark suggests that the confluence of satanism, magic, and political developments in the Protestant Reformation best explains the outbreak of witch-hunts.  They rarely occurred in Catholic lands, and they abruptly ended in the 18th century.  Stark’s meticulous research, and his country-by-country tabulations, persuasively discount many of the irresponsible textbook generalizations without defending the irrational frenzy underlying the killing.

“God’s Justice: The Sin of Slavery,” the book’s last chapter, argues that Christians, virtually alone among earth’s peoples, have condemned and eliminated a universal practice (evident in American Indian and African tribal societies as well as Greece and Rome, endorsed by Mohammed as well as Aristotle).  Apart from the Christian world, slavery has been taken for granted, much like the stars’ placement in the heavens.  But, Stark says, “Just as science arose only once, so, too, did effective moral opposition to slavery.  Christian theology was essential to both” (p. 291).  Certainly early Christians, such as St. Paul, “condoned slavery,” but “only in Christianity did the idea develop that slavery was sinful and must be abolished” (p. 291).  Rather than being abruptly abolished, however, slavery simply faded away; by the Medieval era it had largely disappeared, and Thomas Aquinas branded it sinful in the 13th century.

The conquest and colonization of the New World, of course, revived the institution of chattel slavery.  But in Catholic lands it was mitigated somewhat by theological constraints, as is evident in the career of Bartholomew de Las Casas.  And in Protestant lands, by the 18th century, abolitionists began to question its legitimacy.  Revivalists such as John Wesley in the 18th and Charles G. Finney in the 19th century (not “Enlightenment” philosophers such as John Locke and Voltaire) opposed it.  And ultimately, as “Robert William Fogel put it so well, the death of slavery was ‘a political execution of an immoral system at its peak of economic success, incited by [people] ablaze with moral fervor'” (p. 365).

“Precisely!” Stark says, in his final sentence.  “Moral fervor is the fundamental topic of this entire book: the potent capacity of monotheism, and especially Christianity, to activate extraordinary episodes of faith that have shaped Western civilization” (p. 365).

# # #