In the new Fox series "Boston Public," students have been forced to take random drug tests, teachers have been fingerprinted because the school board was worried about child molestation and the parents of the star football player have sued the principal because their son wasn't allowed to play after he failed two courses. It is a lot of litigiousness in one hour, particularly in a show that is not about law. Oddly enough, though, the idea that teachers and students live in fear of lawsuits doesn't feel like sensationalism. Instead, it seems that prime time is finally catching up with the reality of American life. As trust in traditional authorities declines, we are increasingly turning to law to regulate the kinds of behavior that used to be governed by manners and mores. In schools, in workplaces, in churches and in politics, our interactions are increasingly conducted in the shadow of legalese. We are becoming a nation of separate, resentful, legalized selves.
The litigation explosion in the last decades of the 20th century has been well chronicled. (In the 1990's, workplace bias suits tripled thanks to a series of new federal discrimination laws.) But the phenomenon I am describing is something different: not the explosion of litigation -- most of us, God willing, will never be parties in an actual case -- but the explosion of legalisms, which have become a substitute for moral and political debate.
Ever since I started teaching four years ago, for example, I have lived my professional life against a backdrop of rules that force teachers and students to think more legalistically about even the most fleeting interactions. Not long after I started, a student made a halting pass at me in my office. This wasn't a big deal, and as gently and firmly as I could, I rebuffed it. That was the end of the matter, as far as I was concerned, but I happened to mention the incident to a colleague, who told me I had to protect myself by informing the dean immediately. So I trooped downstairs to the dean's office and, to my embarrassment and his, duly reported the innocuous incident.
In an effort to protect the student's privacy, I withheld her name, although this may have defeated the purpose of the whole creepy exercise. Neither she nor I had done anything wrong, but the whole thing had to be filed away in the event of future litigation, which of course never transpired because both the student and I pretended it hadn't happened.
Jeffrey Rosen, an associate professor at the George Washington University Law School and the legal affairs editor of The New Republic, is the author of "The Unwanted Gaze: The Destruction of Privacy in America."
It is exactly this sort of pretense -- the hallmark of civilized interactions -- that is increasingly difficult in a world where every glance and gesture can lead to a lawsuit. And one result of this fear of litigation is to drive teachers and students further apart.
This trend, by the way, is not limited to romantic matters. I first noticed it when I was a law student in the early 1990's and the graduate students tried for the first time to form a union. As professors grew more remote and atomized, the grad students responded by trying to legalize their relationship with their professors. The informal mentorship roles that defined the relationship only a decade earlier had all but collapsed.
More recently, students' relationships with their fellow students have been legalized as well. A friend recently sent me the remarkable sexual harassment code of The Harvard Crimson, the undergraduate newspaper at Harvard. The Crimson has always taken itself very seriously, but its decision a few years ago to apply the requirements of federal antidiscrimination law to unpaid student volunteers was something of a watershed. The code forbids a daunting list of behavior, including "lewd pictures or notes," suggestive e-mail messages and "requests for sexual favors; touching, patting, hugging or brushing against a person's body in sexually suggestive manners." The policy applies not only to all activities in the Crimson's headquarters but also to "social outings which are largely dominated by Crimson staff members and employees." The policy concludes with a stern warning about "consensual relationships," which notes that "the power exercised by a higher-ranking staff member over a lower-ranking person's advancement . . . may diminish his/her freedom of choice."
What makes codes like this so jarring isn't that they are routinely enforced. Instead, the codes are a reflection of a social transformation: the vocabulary of law and legalisms is the only shared language we have left for regulating behavior in an era in which there is no longer a social consensus about how men and women, and even boys and girls, should behave. But rather than leading to more understanding and empathy, the legalization of our personal and professional lives is leading to more social polarization and more mistrust of authority in all its forms.
This phenomenon has vindicated the prediction of Alexis de Tocqueville, who argued that as traditional sources of authority were undermined by democracy, legislators would pass an increasingly mind-numbing web of laws and regulations, designed to eradicate special privileges and to prevent those in power from favoring some citizens over others. Tocqueville warned that these laws would run the risk of creating despotism of a different sort, administered by lawyers and politicians who acted not like "tyrants but rather schoolmasters." Looking far into the future, Tocqueville feared that as individuals increasingly turned to the nanny state to regulate the most minute aspects of social life, personal interactions might be governed by "a network of small, complicated, painstaking, uniform rules." These rules might be so arcane, he feared, that citizens would eventually stop trying to understand or resist them, and increasingly large aspects of social and political life would be overseen by the American lawyer, "the lone interpreter of an occult science," who would resemble an Egyptian priest.
When Tocqueville came to America in 1831, American society was still vertical enough to have clearly identifiable social hierarchies. And in an age when citizens had no doubt where they stood in the ruthless pecking order, interactions among different classes of people were regulated by a sense of honor. The idea that gentlemen should behave honorably, for example, was an idea that high-status people traditionally used to differentiate themselves from low-status people. In a traditional honor-based society, like the Old South, if you were insulted by a social equal, you challenged him to a duel, and if you were insulted by a social inferior, you bludgeoned him with a cane. But under no circumstances would a gentleman sue another gentleman, because the honor code held that an offense against honor could only be answered by a physical attack.
For example, when Sam Houston, the former governor of Tennessee, was accused of corruption by Representative William Stanbery of Ohio in 1832, Houston wrote a letter demanding satisfaction in a duel. Stanbery did not reply -- compounding the insult by refusing to recognize Houston as an equal -- and so Houston took a cane that had been cut from a hickory tree on the estate of his patron, Andrew Jackson, and used it to bludgeon Stanbery on the street. Jackson's opponents tried Houston on the floor of the House of Representatives for contempt. Despite an able defense by Francis Scott Key, the author of "The Star-Spangled Banner," Houston was censured by the speaker and was later fined $500 by a federal court. But Jackson, who had been raised in the Southern tradition that courts should never interfere in disputes about honor, officially remitted the fine. Jackson's own mother, the president later emphasized, had raised him to "indict no man for assault and battery or sue him for slander."
In 20th-century America, thankfully, identity became far more open and fluid: like Jay Gatsby, you could choose who you wanted to be rather than being defined by your social status at birth. And as American society became less hierarchical, the code of honor came to be seen, understandably, as oppressive and patriarchal -- a way of keeping women and minorities in their places.
But these changes had unintended consequences. The social critic Christopher Lasch has noted that as traditional hierarchies in families, schools and workplaces collapsed in the 1960's, the authority of parents and bosses came to be replaced by a panoply of experts -- guidance counselors, psychiatrists, therapists and judges -- who imposed social control in more therapeutic but no less confining ways. And at the same time, law began to fill the social space previously occupied by manners and mores. The rights revolution of the 1960's had many noble achievements, but in rebelling against hierarchical authority in all of its forms, it arguably threw out the baby with the bath water. In the late 1960's, as the authority of teachers and parents came under siege, school discipline began to be legalized. The United States Supreme Court set the tone in 1969, when it upheld the right of high-school students to protest the war by wearing black armbands. Students, the court noted, do not "shed their constitutional rights to free speech or expression at the schoolhouse gate."
But soon, more trivial conflicts found their way into court. As standards of grooming became contested, a series of federal cases arose out of rules that regulated the way high-school boys wore their hair. The rights revolution gave way to what Lawrence M. Friedman, the Stanford legal historian, has called a general expectation of "total justice" -- the idea that courts could compensate individuals for every misfortune, social slight or general brush with unfairness or bad luck. This trend accelerated in the 80's and 90's, as the democratizing effects of the Internet made vast amounts of information available online, and ordinary citizens found it easier to challenge the authority of traditional intermediaries, like lawyers, doctors and teachers. A result was an explosion of legalisms, as vast areas of life that used to be regulated by a complicated array of formal and informal social conventions -- from school discipline to abortion, gay rights and sexual harassment -- became regulated instead by rules and laws.
Consider, on this score, the recent international Web drama concerning Brad the Cad, the 27-year-old British lawyer who received an e-mail message at work from his girlfriend, Claire, a 26-year-old P.R. executive for MagicButton.net. In the message, Claire expressed her appreciation for an intimate encounter they had shared the other evening. Brad proceeded to forward the message to six male friends, boasting, "Now THAT'S a nice compliment from a lass, isn't it?" Within a week, the message had circled the globe, and the Web site at Brad's firm crashed after receiving 70,000 hits in a single day. Claire and Brad fled their homes to escape from the tabloid press, and a Web site was set up to debate what should become of Brad.
But very quickly a debate that began by focusing on manners and morals devolved into one about legalisms and law. The Brad and Claire Web site took a poll about whether or not Brad should be fired from his law firm, Norton Rose. The largest percentage of respondents -- 39 percent -- said he should be fired because he had abused company resources. A smaller percentage -- 26 percent -- said he should be fired for abusing Claire's trust. No one took the position that would have seemed obvious 100 years ago: Brad should be shunned socially for being a braggart, but his boasting wasn't his employer's business.
Norton Rose, for its part, cranked up a formal disciplinary procedure, grilling Brad and his colleagues in an exhaustive hearing. Finally, the law firm posted a statement about "e-mail abuse" on its own Web site, saying that it was "concerned about a clear breach" of company rules and that Brad and the others had been "disciplined but not dismissed." The moral debate about whether Brad should be punished for his caddish behavior was transformed into a legalistic debate about workplace rules. Norton Rose, at least, had the wit to resist public pressure to fire Brad the Cad.
In America, by contrast, employers and schools are increasingly adopting "zero tolerance" policies when it comes to violations of codes of conduct, because the general collapse of authority means that society no longer trusts employers and teachers to exercise discretion about whether or not to punish wrongdoers for technical violations.
In an effort to avoid any hint of favoritism, some schools have also abandoned common sense. This past September, an 11-year-old girl was suspended from a sixth-grade class in an Atlanta suburb on the grounds that the chain on her Tweety Bird wallet violated her middle school's antiweapons policy. Last March, four boys in a New Jersey kindergarten class were suspended for three days after a game of cops and robbers in which they pretended their tiny fingers were guns and played at shooting each other. "This is a no-tolerance policy," said the district superintendent, with echoes of Kenneth Starr. "Given the climate of our society, we cannot take any of these statements in a light manner."
The climate that leads to these legalistic absurdities is not, in fact, a fear of violence. It is a fear of undemocratic forms of authority, a refusal to defer to teachers on principle. This is the social transformation that David E. Kelley, the lawyer and "L.A. Law" writer who created "Boston Public," "Ally McBeal" and "Picket Fences," has managed to capture, regardless of whether his shows are successful in other terms. Kelley is usefully exploring the corrosive ways that legalisms infect the most informal interactions at school and in the workplace when traditions of trust and manners break down. "We should be fingerprinted because we work on a high-school faculty?" asks a young teacher in "Boston Public" who refuses to take part in the school board's mass fingerprinting scheme. Instead of promoting confidence in the impartiality of administrators, the atmosphere of litigiousness makes the school, like the country, impossible to govern.
This atmosphere has become so pervasive that it is transforming even the last bastions of traditional authority, where consensus about what sort of behavior is shameful still exists. Robert Tuttle, a law-school colleague of mine, is a consultant for the Evangelical Lutheran Church in America. "A generation ago, if a clergyman was caught in shameful circumstances, such as stealing money or having sex with a parishioner, he would slink into the wilderness," he says. "Today, he might come back as a litigant." Now, churches that fire ministers for adultery or other misconduct risk a countersuit for wrongful termination, defamation or emotional distress.
Even more recently, federal courts have begun to dismantle longstanding First Amendment protections for a church's ability to enforce traditional standards of behavior. Some religions have a practice called "shunning," which requires members of the church to avoid all social contacts with members who have been expelled for breaking the church's moral code. Shunning and shaming are the traditional ways that hierarchical societies enforced standards of behavior before personal interactions became legalized. Although these rituals may seem archaic to outsiders, they are central to the church's ability to practice its religion without interference from the state. In the past few years, however, some parishioners who have been shunned for immoral behavior have responded with lawsuits.
In a curious case from the 1980's, a woman who had moved to a small town in Oklahoma sued the Church of Christ after it expelled her for having sexual relations outside of marriage with a local resident who wasn't a member of the congregation. The church had followed the disciplinary procedure set forth in Matthew 18: the elders confronted the woman three times; after she refused to repent of her fornication, they formally announced her transgressions to the entire congregation, which then refused to acknowledge her presence. The woman sued the church, claiming that the shunning ritual violated her privacy and caused her emotional distress. A jury awarded her $435,000, and in an outlandish opinion, the Oklahoma Supreme Court upheld the verdict. Because the woman had resigned from the church during the expulsion procedure, the court held, she was no longer a member of the church, and therefore the church had no right to discipline her.
In the wake of this opinion, lawyers are advising churches to dismantle their traditional shaming rituals and to offer wayward parishioners arbitration agreements instead.
Decisions like this point to the paradox of our increasingly democratic age. As traditional authorities find themselves under siege, citizens increasingly turn to laws and legalisms to resolve their social and political disputes. But when courts actually take sides in those disputes, they find their own legitimacy challenged by the losers, who disagree too violently with the rulings to accept them with good grace. As a result, the one branch of government that society trusts to exercise its authority -- the courts -- loses its authority the more that it tries to assert itself.
This was one of the most significant lessons of the presidential election of 2000. The fact that Bush and Gore started filing lawsuits days after the election was a historical sea change. In 1876, the last time a presidential election turned on disputed electoral votes from the state of Florida, the courts largely stayed out of the fight. Five Supreme Court justices were drafted to serve along with members of Congress on a bipartisan commission to resolve the dispute, but the commission divided along party lines. The stalemate was eventually resolved by a negotiated deal between the two political parties, in which the Democrats agreed to support the Republican who had lost the popular vote in exchange for a promise to remove federal troops from the South. But in 2000, the parties were too weak to negotiate on behalf of their candidates. The prospect of Congress choosing the next president was so unsettling that Bush and Gore scrambled to sue each other instead.
As the lawsuits were tossed like medicine balls between state and federal courts, some observers naïvely imagined that the U.S. Supreme Court had the authority to reach a Solomonic decision that could be respected by both camps. "Let the High Court count," Robert L. Bartley wrote in The Wall Street Journal in November, in a forelock-tugging plea to the justices. "The Supreme Court is the one body with the prestige to lend legitimacy to any decision." But this misunderstood the fact that the Supreme Court has enhanced its power by hoarding it. By keeping cameras out of its courtroom, the Supreme Court has long understood the precarious nature of authority in a televised society. Television makes public officials seem more familiar and less mysterious, decreasing their authority while increasing their celebrity. To exercise power in a televised democracy, elected officials have to act like Oprah, confessing their intimate secrets -- like Bush's past drinking problem and Gore's fraught relationship with his father -- to convey the impression of being authentic and accessible. But this confessional accessibility creates the risk of overexposure. The justices had shrewdly positioned themselves as the most respected branch of national government precisely because they had refused to display themselves and had spoken unanimously during times of national crisis, like Brown v. Board of Education and the Nixon tapes case.
By contrast, when the court issued ideologically divided decisions in the most polarizing social and cultural disputes, the losers questioned the legitimacy of the Supreme Court itself. Roe v. Wade is the defining example: it galvanized a generation of conservatives to attack the Supreme Court as a group of lawless partisans, imposing alien social values by judicial fiat. Comparing Roe v. Wade with Dred Scott v. Sanford, Robert Bork, the rejected Supreme Court nominee, predicted that elected officials might someday refuse to obey the Supreme Court's dictates. "That suggestion will be regarded as shocking, but it should not be," Bork wrote. "To the objection that a rejection of a court's authority would be civil disobedience, the answer is that a court that issues orders without authority engages in an equally dangerous form of civil disobedience." In their attacks on the authority of the Supreme Court, the conservatives had turned themselves into the mirror images of the 60's radicals they deplored.
This sensibility reached its apotheosis in Bush v. Gore, which many conservatives seemed to view as a kind of culminating rebuke for decades of judicial and cultural defeats. Invoking the rhetoric of voting rights cases from the 1960's, five conservative Supreme Court justices stopped Florida's manual recount by inventing a novel constitutional right -- the right of each ballot to be examined in precisely the same manner as every other ballot. By preventing states from correcting the counting errors that result from different voting technologies, the conservatives arguably precipitated a violation of equal treatment far larger than the one they claimed to avoid.
What made the decision so galling for liberals and moderates, however, was that its reasoning was so sketchy and unconvincing that it seemed like a transparent mask for a predetermined political result. This suspicion was only enhanced by the fact that few conservative commentators even attempted to defend the legal reasoning of the decision; instead, they focused on what they considered the rough justice of the outcome. William Safire, for example, suggested in his column in The New York Times that the Florida Supreme Court was a group of rogue liberals trying to steal the election, and they had to be stopped by the U.S. Supreme Court -- a view that Justices Antonin Scalia, William Rehnquist and Clarence Thomas seemed to share in their separate opinion, in which they called the Florida opinion "peculiar" and "absurd." In their joint dissent, the liberal justices -- John Paul Stevens, Ruth Bader Ginsburg and Stephen Breyer -- objected that their colleagues' contemptuous view of the Florida court could "only lend credence to the most cynical appraisal of the work of judges throughout the land." But this cynicism was the culmination of decades of judge-bashing, and in deciding a presidential election on such cynical grounds, the conservative justices invited people of good faith who disagreed with the outcome to be similarly skeptical about their own handiwork.
The per curiam decision in Bush v. Gore was unsigned. The principal author of the decision, however, was Justice Anthony Kennedy, a moderate conservative who, along with Sandra Day O'Connor, voted in 1992 to uphold Roe v. Wade. In upholding Roe, Kennedy and O'Connor expressed great concern for the legitimacy of the court. What, then, can explain their decision to side with Rehnquist, Scalia and Thomas, greatly damaging the court's reputation in the process? Kennedy had been harshly criticized by conservatives for his liberal votes in the abortion and gay rights cases, and he may have felt betrayed by the court's overly expansive decision last June to strike down the so-called partial-birth abortion laws, which he plausibly viewed as an illegitimate extension of Roe.
But there is another explanation for Kennedy and O'Connor's behavior, and it has to do with the inherent polarization that results when social and political disputes are legalized. Psychological studies of group polarization suggest that after deliberation, members of a group of like-minded people are likely to move toward an even more extreme version of the positions they initially held. As the response to Roe v. Wade demonstrated, once a social dispute becomes legalized, people in the middle often move toward the extremes in order to signal their allegiance to the polarized camps. "People in partisan groups don't want to be unpopular with their friends," says Cass Sunstein, whose new book, "Republic.com," discusses the phenomenon of group polarization.
Moreover, once you take a side in a polarized situation, you are likely to feel increasingly committed to it, even if the legal arguments on behalf of your position turn out not to be very convincing. This is a phenomenon that psychologists call "confirmation bias" -- the tendency to interpret all subsequent facts and events so they support the position that you were initially less sure about. As this dynamic continued to spiral, liberal and conservative Supreme Court justices, like Republicans and Democrats throughout the country, behaved like polarized molecules, aligning themselves even more dramatically in the direction in which they were only tentatively leaning on Election Day.
Early last month, for example, in a remarkable speech to a Catholic service organization, Chief Justice Rehnquist defended the participation of Supreme Court justices in the notorious Hayes-Tilden Commission of 1877 and seemed to be pleading for similar appreciation from an ungrateful nation. "There is a national crisis, and only you can avert it," he said, describing the pressure on the justices to save the country during moments of crisis. "It may be very hard to say no." For the past 40 years, however, conservatives had argued that judges should resist the temptation to save the country from social turmoil and that political disputes should be resolved in legislatures rather than courts.
In the impeachment trial of Bill Clinton, Rehnquist preserved the illusion of his own neutrality by refusing to say what he thought about the merits of the case. But by presuming to intervene in Bush v. Gore, the justices squandered their carefully hoarded mystique, exposing themselves more nakedly than any television camera could. A few weeks after the decision, I met President Clinton at a farewell reception, and he summed up the disappointment that many Democrats now feel toward the court they had long tried to give the benefit of the doubt. "That was one of the worst Supreme Court decisions in my lifetime, and one of the five worst decisions of all time," he said.
So what can we look forward to at the bright dawn of the George W. Bush administration? The one branch of national government that still commanded respect in a fractious society is now as weakened and discredited as the others. Nothing commands respect except public opinion, and now public opinion, too, is bitterly divided. It should be a lively four years. There will be plenty of new laws, lawsuits and scandals conducted entirely in legalese. As the nation becomes even more legalized, we will find ourselves less able to discuss with nuance and complexity the moral gray areas that exist in all of our lives. David E. Kelley will thrive, even if the Republic does not.
Prepared: Feb 4, 2001 - 12:02:29 PM