Mr. Richard Babbit
Editor & Publisher
Jan. 17, 2073
As you know, this year marks the centenary of my great-grandfather's religious conversion. While I never actually knew Charles "Chuck" Colson, my grandmother told me much about him and, more importantly, left me some of his personal papers and materials. There's more than enough here to write the book that we spoke about. The question is, what kind of book should I write for you? There have already been several biographies of my great-grandfather, and while most leave something to be desired, the public will probably not be receptive to yet another deconstruction of a public figure by a descendant with something to prove. While going through his papers, however, I noticed that although my great-grandfather avoided making predictions about the future, he was nevertheless very fond of trying to discern trends, and he didn't hesitate to say that certain issues would come to dominate human affairs, especially if their trajectory went unchecked. I think it might be enlightening to see how accurate he actually was. To give you a feel for how this project might shape up, here are two cases in point. First, in 1978, the old fellow wrote that in the 21st century respect for life and religious liberty would become the paramount issues. Was he right about that?

Religious liberty
Confidentially, only a fanatic would deny that churches have rarely had it as good as they do now. My own church, the National Capital Assembly and Happiness Center, recently passed the 250,000-member mark. Our Board of Mature Persons decided against expanding our assembly center beyond its present 55,000-seat capacity. Instead, members will be assigned one Sunday each month in which they will be able to attend services in person. On the other Sundays, the combination of the UltraNet and Extremely High Definition Video will allow members to be present in every sense but the physical. What's more, programs minister to every domestic affiliation unit imaginable. The concordat American churches struck with the broader culture made this prosperity possible: Leave us alone, the churches said, and we will stay out of your affairs. To be honest, it's not as though the churches had much choice in the matter. We're talking about making a virtue out of necessity. The church realized back then that if it hoped to thrive, it would have to accept a diminished public role. For instance, it has been settled constitutional law since 2029 that the First Amendment's guarantee of free exercise of religion only applies to purely private actions or thoughts. As the Court made clear in Favre vs. New California, the moment a religious or moral utterance leaves your lips, it's no longer a purely private act, and thus not protected. In practice, churches are allowed to teach whatever they want, as long as they stick to religious topics, as measured by the Supreme Court's "no earthly good" test. This is rarely a problem, since public advocacy results in an automatic revocation of their tax-exempt status, and churches don't wish to risk this.

Historically speaking, the concordat was inevitable. The story begins more than a century ago. Believe it or not, back then religious beliefs and practices received some deference from government.
That's why two indigenous Americans, who worked as drug counselors, expected that their ceremonial drug use wouldn't cause them to be fired. When the case of Employment Division of Oregon vs. Smith came before the Supreme Court in 1990, the Court held that the free exercise of religion did not exempt them from laws that applied equally to everyone else. Just as with the Favre case, many people were upset by the Court's actions. (Remember, they didn't understand in those days that religion was a purely private matter.) So they lobbied Congress to pass legislation to overturn Smith. Congress complied, over and over again. Whether it was RFRA (the Religious Freedom Restoration Act), RLUIPA (the Religious Land Use and Institutionalized Persons Act) or RTEA (the Religious Tolerance Enhancement Act), the results were always the same. The Supreme Court would give Congress the back of its hand. Soon it found newer and blunter ways to say that the only constitutional rights Americans possess are those the Court says they possess. Finally, in 2012, at the urging of the NewBosWash Capital Times-Post, the Congress decided to stop sparring with the Court over constitutional interpretations. Not only was the sparring fruitless, but it was distracting Congress from what it saw as its primary mission: voting benefits and entitlements for middle-class Americans. So it codified Cooper vs. Aaron by enacting the "Judicial Supremacy Act." The act ceded authority over all matters involving values and ethics to the Court, leaving Congress to do what it did best, which was serving constituents' needs. But even if the Congress had been inclined to take on the Court again, that inclination would have gone against powerful political and cultural tides that were reshaping American public life. The presidential election of 2000 exposed the divisions that separated Americans from each other. As one commentator put it at the time, "it was almost as if two nations had voted." 
Even after the new president was sworn in, the hard feelings persisted, the sense that the other guys had broken the rules with impunity and gotten away with it. This rancor was so pronounced that political observers added two new terms to the political lexicon: Blues and Reds. These colors quickly became shorthand ways of describing certain irreconcilable political and cultural outlooks. (Incidentally, this is how blue and red came to replace Thomas Nast's elephant and donkey as symbols of the two major parties.) Americans, fearful that the divisions threatened entitlements, turned on those they had been taught were the "extremists," people motivated by strong convictions, particularly those of a moral or religious nature. A now infamous CNN-North America Hoy poll, taken in 2011, asked the public to name those groups that most threatened the American way of life. The top two finishers, ahead of the Zapatista commandos (before they elected members to Congress) and the Scientology Liberation Front, were the Christian Coalition and National Right to Life (happily, both have long since folded). The public's antipathy toward people of strong convictions was accompanied by Supreme Court decisions that hastened the privatization of all moral opinion. Surprisingly, it was Lee vs. Weisman, in 1992, that led the way in this important constitutional development. Weisman ostensibly involved a prayer at a middle school graduation. The Court noted that religion was "potentially divisive," "coercive," and that religious people were "potentially dangerous." Given these findings, of course, it was too much to ask a young person to sit respectfully in silence. A few years later, lawyers re-reading Weisman discovered the real issue at stake, one that most observers had missed. Hidden within the penumbras and emanations of the Establishment Clause was the right not to be exposed to any idea that might cause you distress. It was a right that fit perfectly with the times.
On the surface, it might seem that this new right would clash with another basic right, the right to free speech. And the courts wrestled with balancing these two. The inspiration for the resolution, like the right itself, came from the Court's dealings with religious folks. Just as the courts upheld exclusion zones to keep abortion protesters away from abortion providers and women seeking an abortion, the courts eventually came around to upholding "First Amendment zones." In these designated areas, people are free to espouse any idea they want, even religious ones. As the court held in Young vs. Owens, the provision of a reasonably accessible place for the expression of ideas satisfies the First Amendment's guarantee of free speech: "First Amendment zones strike the necessary balance between free speech and the need to protect citizens from distressing ideas." Keeping strong convictions out of the public square became increasingly important as increased immigration and spiritual experimentation made America completely heterogeneous in every respect. It's hard to imagine what would have happened if the Court hadn't stepped in and kept people of diverse convictions from coming into conflict with one another.

Respect for life
My great-grandfather would undoubtedly be gratified to learn that, by the most conservative estimate, it has been at least 15 years since the last nontherapeutic abortion was performed in the United States. A combination of technology and culture helped settle the most contentious issue in American public life since slavery, and without a civil war. Yes, it's true that Roe vs. Wade, whose 100th anniversary we observe next week, is still nominally the law of the land. But thanks to our God-given human ingenuity, we no longer have to choose between doing what others believe to be right and doing what we think is best for us. The seeds of this deliverance were sown at the turn of the century at Eastern Virginia Medical School in Norfolk. Researchers there took the first step toward giving women complete control over their reproductive cycle. Women had enjoyed a limited amount of control over their cycles since the introduction of the birth-control pill in 1960. But the pill wasn't 100 percent effective. There was still a chance of getting pregnant, especially if a woman ever, even occasionally, forgot to take the pill. Seasonale and its descendants changed all that. The drug eliminated up to nine of a woman's 13 menstrual cycles each year. This gave women the most complete control over their fertility ever. Seasonale, and what it promised, captured the public's imagination like no drug since the polio vaccine. At that time, American women were forced to rely upon the availability of abortion (as the Supreme Court put it in Casey), but they still had mixed feelings about the procedure. They did not like having abortions, but they also didn't think pregnancy should keep them from doing what they wanted, sexually or career-wise, when they wanted to do it. The new class of fertility inhibitors, by allowing a woman to choose the fertile periods of her life, empowered her to feel good about her choices.
Women were now free to live their lives just like men, without any concern for the restraints imposed by biology. The impact of the new technology was initially small, but it grew quickly as the drugs were refined. Within 10 years of Seasonale's approval by the FDA, the number of abortions performed annually dropped to pre-Roe levels. After that, the number dropped even faster as the fertility inhibitors gained near-universal acceptance. Within another 15 years, for the first time in human history, infertility (albeit an artificially imposed infertility) became the norm rather than the exception. What's more, a broad cultural consensus emerged around these drugs. It's true that a small minority, mostly Catholics, expressed misgivings about Seasonale and the other fertility inhibitors. But unlike in the abortion debate, they were alone in their opposition. Protestant evangelicals didn't know what to make of Seasonale at first, but then they came around. As I've heard older couples at church say, "We prayed about whether these drugs were right for us. We came to see them as part of God's provision and felt at peace with the decision to use them." Who can argue with that? Another important player in this consensus was the health-care industry. When the Prometheus Group, the largest HMO in the country, announced in 2047 that it would no longer cover expenses arising from an unplanned pregnancy, the rest of the industry quickly followed suit. Americans, who had accepted mandatory genetic screening a decade earlier without a fuss, didn't complain. After all, they were used to getting permission from their health-care caseworkers before any major life change. So what was one more? Especially when people got what they wanted: a world where abortion is very rare, and where every child is definitely wanted. Although, truth be told, there are not as many of those children as there used to be.
The situation in the United States isn't as bad as it is in Europe or, especially, in Japan. In these countries, officials have known since the turn of the century that native-born women were having children at far below replacement level. In Japan and parts of Asia, government and employers tried bribing women, and even offered government-run dating services, to reverse the trend. But the promise of control over their fertility, and their lives, made women resistant to their leaders' blandishments. Eventually, the Europeans had no choice but to encourage massive immigration. It was either that or lose the ability to fund their social welfare programs, a prospect ruled out by their rapidly aging populations. Italy, France, Germany, and Britain have all seen the descendants of these immigrants become the majority of their populations. Of course, not everyone was enthusiastic about the new order. Some people wondered aloud about what it meant to be German, Italian, or British in a country where most people's ancestry lay elsewhere. Others wondered what would become of the idea of Western civilization when, demographically speaking, the West was ceasing to exist, and, crucially, when perspectives arising from Islam and Buddhism became dominant. The culture that produced the Magna Carta, the Sistine Chapel, the Mona Lisa, and the music of Bach appeared to be without heirs. Fortunately, the United States kept us from having to answer that depressing question. America didn't face the "birth dearth" that confronted Europe and Japan, thanks to high levels of immigration from Latin America and Asia. Still, her changing demographics required a newer, more dynamic way of defining what it meant to be an American. The answer was the same as it had been for most of the 20th century: participation in American mass culture. This definition worked especially well since, thanks to globalization, many immigrants arrived already familiar with American mass culture.
Thus, we didn't have to ask too much of the people who were helping us to preserve our way of life, i.e., Social Security and Medicare. Some people insist that our demographic hiccups are connected to a diminished respect for human life, in particular what they call "anti-fatalism." But it's hard to see how that could be true. In fact, if anything, the opposite is the case. Even if there are fewer kids being born than before, they're more loved than ever. Each of them is treated like the prince he is. Besides, thanks to creative types at Viacom-Sony and Warner-Polygram-Fox, American mass culture continues to be a model for the rest of the world.

Summary

As I've demonstrated, my great-grandfather, although well-intentioned, did not take into account the power of human ingenuity to overcome our problems. He lacked sufficient faith in progress to see that the developments that concerned him were moving us toward a world in which you are free to believe and free to do as you please. As my wife Candide and I await the birth of our son, we realize that this is the best of all possible worlds. I await your response to my proposal.
Daewoo-Enron Professor of Law and Government
George Washington University