The Black Tax: Reparations and the Scamming of the Black Community

November 9, 2003 Sunday

HEADLINE: The Black Tax;
Of charlatans, crooks and victims and the reparations scam.

BODY:

Last month a federal court in Richmond finally took action against one of the country’s most virulent tax scams: the “black tax credits.” Crystal Foster, 25, and her father and tax preparer, Robert Lee Foster, 51, were sentenced for claiming — and receiving — tax refunds as reparations for slavery. Crystal Foster claimed a taxable income of only $3,429 but demanded $500,000 from the government in reparations — and got it. The IRS actually paid her $507,490.91 to cover the interest due to the delay in sending her a check.

Cases such as the Fosters’ have fueled a cottage industry of charlatans and crooks pushing the promise of a “black tax” in what might be the greatest tax fraud in American history. The Internal Revenue Service has campaigned against the myth of a black tax for years, warning citizens that such claims amount to fraud, yet tens of thousands of claims are filed annually.

The court ordered Crystal Foster to repay the money that the IRS mistakenly had paid her and sentenced her to 37 months in prison. She had spent most of the $500,000 on a Mercedes, loans and gifts within eight days of receiving the payment. Her father was given a 13-year sentence on four counts of conspiracy to defraud the government. Similarly, Gregory Bridges, a tax accountant in Woodbridge, was convicted in June of preparing more than 100 such fraudulent returns for D.C., Maryland and Virginia residents.

The origin of the black tax and the story of its many victims combines a misunderstanding of history, raw political opportunism and old-fashioned greed.

The myth began with the April 1993 issue of Essence magazine and a piece by “journalist and economics consultant” L.G. Sherrod. Sherrod informed readers that the United States owed them for the value of the 1866 promise of “40 acres and a mule.” Citing as an authority “The People’s Institute for Economics,” she said that the adjusted value of this broken promise was $43,209. Readers could claim this amount, she advised, by writing on line 59 of tax form 1040 — which asks the filer to list “other payments” — the $43,209 in black taxes.

At first blush, one might assume that this was a joke. After all, by the same logic, one could calculate the current value of “a chicken in every pot and a car in every garage” promised in 1928 by Herbert Hoover. However, the IRS was deluged by refund requests, often with a copy of the Essence article attached. A legend had been born. Incredibly, the IRS paid out tens of millions in such refunds before realizing its mistake.

The black-tax theory rests on a mix of bad historical and legal knowledge. The promise of 40 acres and a mule was never an enforceable obligation of the government. In 1865 Gen. William Tecumseh Sherman signed Special Field Order No. 15, which made the promise. The basis for the promise was dubious because the land had largely been confiscated. Government officials at the time argued that the ex-slaves could live on and work the land for three years and then buy it. The ex-slaves and others viewed the land as payment for their bondage.

Within months 40,000 ex-slaves occupied 300,000 acres from South Carolina to Florida. President Andrew Johnson then rescinded the order and allowed the original owners to reclaim their land — leaving a wound that remains open today. While there were attempts to pass a law returning the land to the ex-slaves, the legislation was blocked by Johnson, and no bill was ever signed into law.

This would have remained an arcane academic debate if Essence hadn’t published what amounted to a “how to” on tax fraud, playing into the hands of unscrupulous tax preparers who promised windfall refunds. In just one church, a preparer persuaded more than 1,500 people to give him $200 each to secure the refund. A few weeks ago, two people were charged in Mississippi for allegedly promising rebates of $43,209. People allegedly paid them between $25 and $6,500 for such tax advice. Ultimately, 6,300 African Americans were defrauded of $1.1 million.

Yet despite articles exposing the fraud and citing penalties, the legend just won’t die. In 2001 the IRS received more than 80,000 returns demanding $2.7 billion in refunds — most asking for $43,000. Amazingly, the IRS mistakenly paid out at least 130 such refunds in 2000 and 2001 — totaling $30 million.

While people often receive a warning from the IRS and drop the issue, others have received $500 fines and some have been prosecuted. However, most promoters have faced only fines and orders barring them from doing future work on tax returns.

Highly suspect lawsuits have been filed for reparations with the public support of black leaders, who insist that African Americans are entitled to such payments as a legal matter. Moreover, few black leaders have publicly denounced the concept of a black tax or warned citizens against participating in such filings. Instead, some leaders at a D.C. reparations conference a few years ago encouraged African Americans not to file a tax return at all — under a claim of immunity as descendants of former slaves. This has created the perfect environment for those eager to profit from the lingering sense of injury among black Americans, particularly among those who mistake political rhetoric for legal entitlement.

As for Essence, it has never fully apologized for its role in the creation of this fraud. A few years after the scam took flight, the magazine ran a brief reference to the article, noting that “although many historians” supported the claims, the IRS did not. Economics consultant L.G. Sherrod reappeared as Lena Sherrod, who now advises people on their “economics” as finance and careers editor at Essence.

As for the Fosters, the scam is over, but the myth of the black tax continues. Of course, the myth did not appear spontaneously in the District or any other place. It required a mix of reckless political activism, bad journalism and outright fraud. Tragically, the victims of this fraud are black Americans who have been and continue to be ripped off by those who seek popularity or plunder at any price.

From Adultery to Polygamy: The Dangers of Morality Legislation

The Washington Post
September 5, 2004 Sunday

HEADLINE: Of Lust and the Law

BYLINE: Jonathan Turley

BODY:

Last month, John R. Bushey Jr. was finally brought to justice in a small courthouse in Luray, Va. Bushey, the former town attorney, stood before the court as an accused criminal with reporters from all over the state in attendance. The charge was adultery. Like 23 other states, Virginia may still prosecute a husband or wife who has consensual sex outside the marriage. Ten states, including Virginia, have anti-fornication statutes as well, prohibiting sex before marriage. Like many fundamentalist Islamic states, the United States uses criminal penalties to police the morality of its citizens.

These morality laws go back to the church-based “bawdy courts” of 13th-century England. Yet, the Bushey case illustrates that there are prosecutors today who remain eager to perform this quasi-ecclesiastical role — to publicly defend the institution of the monogamous marriage, and the unwed, from the ravages of lust and desire. Because these are often unrecorded misdemeanor cases, the specific number of prosecutions is impossible to determine. However, the Bushey case is far from unique. Since 1980, adultery cases have been recorded from Alabama to Massachusetts to Pennsylvania. And in 2003, Georgia prosecuted an anti-fornication case.

This latest adultery prosecution, in a county circuit court in Virginia, should motivate us to finally ban our American version of bawdy courts and force ambitious prosecutors to focus on our courtrooms rather than our bedrooms.

For 32 years, John Bushey, 66, served as the attorney for Luray — a small Shenandoah Valley town of 4,500 people. He had been married for about 18 years to Cindy Bushey, the town’s clerk. John Bushey, however, had an affair with Nellie Mae Hensley, 53, and after the affair ended, Hensley seemed to prove the adage “scratch a lover, find a foe.” Instead of going to the betrayed spouse or to her minister, she went to the police. While Hensley was divorced, Bushey was married and therefore subject to a criminal adultery charge, a misdemeanor.

The Bushey case seemed like the perfect vehicle to get the U.S. Supreme Court to finish work that it began in the 2003 case of Lawrence v. Texas, when the Court struck down anti-sodomy statutes. At one point, Bushey agreed to pursue such a course, and the American Civil Liberties Union took up his case. He kept changing his mind, however, first pleading guilty, then withdrawing the plea and pledging to fight as a matter of principle. Finally, in August, he surprised many observers by accepting 20 hours of community service as punishment for his offense. (His former lover publicly expressed outrage that Bushey would not receive a criminal record for his adulterous affair with her.)

Unfortunately, with his last-minute acceptance of punishment, Bushey implicitly accepted that the state of Virginia has a right to punish him for his moral failings. The far more important question is not Bushey’s faithfulness to principle (or to marriage), but the continuation of this archaic criminal provision, which also exists in Maryland and the U.S. military. (Such a law was recently repealed in the District.) The prosecutor in Bushey’s case, Assistant Commonwealth’s Attorney Glenn Williamson, staunchly defends the state’s interest in prosecuting adulterers. When a former lover comes to the police, he insists, the state must prosecute. His rationalization is baffling since, according to studies, he could throw a stick on any corner and probably hit a couple of adulterers.

A famous 1953 study by Alfred Kinsey found that 50 percent of married men and 26 percent of married women had engaged in adultery by age 40. A recent study by Ball State University reported that women under 40 have caught up to men in adulterous affairs. Other studies have shown that between 5 and 15 percent of married couples have “open marriages.” If Virginia were going to prosecute evenly, untold married couples in the state could be subject to prosecution when a former lover opted for the satisfaction of a public charge to heal private injury.

Imagine the work for the courts if prosecutors vigorously enforced the laws against fornication, which is generally defined as premarital sex — a crime that a 1988 study found was practiced by more than 75 percent of women and more than 80 percent of men by the age of 19.

Williamson stressed that he had prosecuted other adulterers and was grateful that “as far as general deterrence, it should now be widely known that adultery is a crime in Virginia.” It is certainly widely known after Williamson has hoisted some wretch for all unfaithful spouses to contemplate. But is it right?

With the medieval bawdy courts, the Anglican Church tried the unfaithful and imposed criminal punishments for “whoredom.” At least six adulterers were executed in England. Since women were viewed as the property of their husbands, these cases were often tried as matters of theft or trespass to chattel. Bawdy courts were embraced by such organizations as the Society for the Reformation of Manners, which supplied the dirt on the unfaithful during the 17th and 18th centuries. These cases were called “criminal conversation” and were uniformly brought by cuckolded husbands. Notably, criminal conversation laws were repealed around the time that women were given the ability to bring such lawsuits in England in the 18th century.

When the Puritans landed in the New World, they wanted their own bawdy trials. There were no church courts, but most states eventually passed laws criminalizing an assortment of private consensual conduct ranging from sodomy to fornication to adultery.

In the Colonies, adultery was once a capital crime and there are at least three recorded cases of people who were executed for adultery, and others were branded with an “A” on their foreheads. (At least one other adulterer, Thomas Newton, escaped in Connecticut shortly before his hanging). Women were routinely stripped to the waist and publicly whipped. In many cases, the convicted were given heavy fines and shaming punishments. A common shaming punishment (taken from England) was recorded in a 1640s Virginia case: the unfaithful were ordered “to stand in the middle of the . . . church upon a stool in a white sheet, and a white wand in their hands, all the time of divine service and shall say after the ministers such words as he shall deliver unto them before the congregation there present and also pay the charges of the court.”

While bawdy courts once mandated harsh punishments for adultery, today’s laws generally have lesser sentences of up to a year in jail and small fines.

It should be obvious that such laws governing private, consensual acts are no longer valid after the Supreme Court’s decision in Lawrence. The Court did not actually set these laws aside when it ruled anti-sodomy statutes unconstitutional, even though it stressed that such laws further “no legitimate state interest which can justify its intrusion into the personal and private life of the individual.” While the Court did not address anti-fornication or adultery statutes, the dissenting justices specifically noted that the decision placed such laws in jeopardy. Self-described “morality advocates,” however, have resisted. Thus, Virginia cannot criminalize sodomy between consenting adults but can still prosecute the same adults for having sex in any form under the adultery law.

Del. David B. Albo (R-Fairfax), who is in charge of streamlining Virginia’s criminal code, doesn’t approve of outsiders tampering with morality laws. The Lawrence decision, he complained, is “a perfect example of how the Supreme Court is inserting its own views into Virginia law.” Of course, Albo appears to have less of a problem when inserting his own moral views into the bedrooms of Virginia adults. Virginia, which is seeking to repeal its anti-fornication and anti-sodomy statutes, decided to keep adultery a crime.

Del. Brian J. Moran (D-Alexandria) insists that adultery must remain a crime because “adultery is wrong, and we were not going to eliminate a criminal action even though it has been infrequently prosecuted.” While many would agree adultery is wrong, there are plenty of things that are “wrong” but not crimes, such as betraying boyfriends or girlfriends in unmarried but monogamous relationships. Finally, the law is currently applied in a ridiculous fashion with only Bushey and a few others pulled out for prosecution from a virtual sea of adultery.

The real reason these laws go unchallenged appears to be self-serving politics. Joseph F. Murphy Jr., chief judge of the Maryland Court of Special Appeals and chairman of a committee to overhaul the Maryland code, put it bluntly. “You can imagine what would happen if you tried to take adultery off the books at this point. You would have a large group of people who would complain bitterly about it as another example of that state losing its moral compass.”

It takes courage to take such an action — something apparently in short supply in Virginia, Maryland and some other states.

Citizens should be able to police their marriages without the help of the Commonwealth of Virginia or the other 23 states. These laws have not deterred many adulterous spouses. They invite arbitrary prosecutions in courtrooms replete — it is statistically certain — with adulterous prosecutors, cops, jurors, clerks or judges.

And these same courts are inundated with divorce cases of proven and admitted adultery by individuals who are never prosecuted — making such prosecutions as random as a societal drive-by shooting.

Since the days of the bawdy courts, women are no longer deemed chattel and towns no longer maintain a “whore’s chair” for public humiliation of adulterers and fornicators.

Bawdy courts have no place in a nation that cherishes individual choice and privacy. Let’s put an end to them — and leave morality prosecutions as a matter of historical interest for 13th-century scholars.

Right Goal, Wrong Means: A Vote for the District of Columbia

The Washington Post
December 5, 2004 Sunday

HEADLINE: Right Goal, Wrong Means

BODY:

Many D.C. citizens have been elated to hear about a plan to make the city into a congressional district — without the need for a constitutional amendment. That’s understandable. Residents of the District are in the unenviable position of paying taxes and yet having no true voting representative in Congress. However, the plan, known as the District of Columbia Fairness in Representation Act, would achieve a noble goal by ignoble means.

The bill, the brainchild of Rep. Thomas M. Davis III (R-Va.), chairman of the Government Reform Committee, would expand the number of House members from 435 to 437 to allow for a congressional district in Washington. To satisfy Republicans, Davis’s bill also would give Utah an additional district.

Utah, which fell just a handful of citizens short of another congressional seat in the last census, is expected to get an additional district as a result of the 2010 census. Under Davis’s plan, it wouldn’t have to wait.

Davis appears to genuinely favor a voting member for the District, and he saw an opportunity for a trade. “I don’t feel it’s a sign of weakness in our system to have to consider politics as part of the process,” he said. “Political considerations are neither good nor bad, they simply are.”

However, in matters of constitutional interpretation, politics is a poor substitute for principle. One of the greatest burdens of being a nation committed to the rule of law is that how we do something is as important as what we do. The Davis proposal would subvert the intentions of the Founders by ignoring textual references to “states” in the Constitution as the sole voting members of the houses of Congress. It also would create a city of half-formed citizens who could vote in the House but not in the Senate.

The controversy began almost 222 years ago with a riot. On Jan. 1, 1783, a large group of Revolutionary War veterans marched on Philadelphia, demanding their long-overdue back pay. Congress demanded that Pennsylvania turn out the militia to quell the rioters, but the state refused. Congress then fled, first to Princeton, N.J., then to Annapolis and ultimately to New York City.

By the time congressional leaders gathered again in Philadelphia in 1787 to draft a new Constitution, one issue was prominent on many legislators’ minds: the creation of a federal district for the seat of government that would not be a part of any state. The members of Congress wanted to avoid, in the words of James Madison, the unwarranted “imputation of awe or influence” afforded to the host state of a permanent capital.

The result was Article I, Section 8, of the Constitution, which created a federal enclave exclusively under the authority of the federal government. Virginia and Maryland agreed to pony up land for the enclave, which was gratefully accepted by Congress in 1790. Later, Congress gave some of Virginia’s land back. (The caged stones marking the original perimeter of the federal district still can be seen in Northern Virginia.)

Not only does the Constitution not give the federal district a voting member in either house, it created the District precisely to be a non-state under the authority of the states represented in Congress. Article I, Section 2, specifies that members of the House are to be chosen “by the People of the several states.” Likewise, Article I, Section 3, refers to a Senate composed of two senators “from each state.” The makeup of these houses was a delicate balance, and it was a primary focus of the Constitutional Convention.

While the courts have recognized that Congress could give the District many of the same institutions and procedural rights as states, they have never suggested or ruled that Congress can create a new, fully voting member of Congress without a constitutional amendment. Indeed, when Congress wanted to give D.C. residents a voice in the election of the president, it passed the 23rd Amendment, ratified in 1961. That change notably gave the District electoral votes to which it “would be entitled if it were a State.”

Now, after failing in 1978 to ratify a similar amendment on voting rights for the District in both houses, voting-rights advocates want to avoid the constitutional process through a simple vote in Congress. Thus the Davis proposal becomes a celebration of contemporary politics over constitutional principle.

The way to achieve full representation for citizens of the District is to return the city to Maryland and reduce the federal enclave to the core of Capitol Hill and a few of its closest blocks. That is precisely what occurred when the Virginia land taken for the District was “retroceded” to the commonwealth in 1846.

Of course, strong political forces in the District and Maryland would not support retrocession. For one thing, Robert L. Ehrlich Jr. likely would be the last Republican to hold the governorship of that state for some time. Yet before we create hybrid constitutional entities, we should use the most obvious vehicle for giving voting rights to D.C. citizens without a constitutional amendment.

The amendment and retrocession processes are hardly easy, but, to paraphrase Davis, such constitutional considerations “are neither good nor bad, they simply are.”

The Return to Separate But Equal

The Washington Post
February 13, 2005 Sunday

HEADLINE: Good Intentions Aside, Separate Still Isn’t Equal

BYLINE: Jonathan Turley

BODY:

Few legal doctrines are more dangerous or despised than that of separate but equal rights — the philosophy that legitimized racial apartheid in the United States. It took the sacrifices of the civil rights struggle to put an end to both this doctrine and the officially sanctioned segregation that it justified.

Yet only months after the nation celebrated the 50th anniversary of Brown v. Board of Education — the landmark Supreme Court decision that struck down the doctrine as unconstitutional — some public and private institutions are again dabbling in separate but equal policies.

Two examples highlight this insidious trend. The first comes in the very area in which the battle for civil rights was waged most fiercely decades ago — the schools. It involves a New York City high school created specifically for gay and lesbian students two years ago. The second concerns the California prison system, whose 25-year policy of strict racial segregation of incoming prisoners has been challenged in a case now pending before the U.S. Supreme Court.

Both plans are being vigorously defended on pragmatic grounds — arguments long used by segregationists. From the court’s first articulation of the doctrine in 1896, separate but equal was always an exercise of pragmatism over principle. Rather than confront racial animus, society chose to yield to it — to achieve the appearance of racial coexistence through racial separation. While there are clearly differences between the old segregationists and the new (particularly in terms of their motives), there remain striking similarities in their methods.

New York’s Harvey Milk High School was created with the best possible intentions. Named for the assassinated San Francisco gay rights leader, it was meant to provide a sanctuary for gay and lesbian students who face tremendous pressures and even violence in many schools.

Gay rights activists have long modeled their work on the civil rights movement. But such civil rights leaders as Martin Luther King Jr. and Thurgood Marshall steadfastly refused to accept segregation in public schools — even though thousands of black students faced violence in desegregated systems. They understood that, to be truly equal, blacks had to be assimilated into every aspect of American life, even if the objective could only be reached after a period of painful confrontation.

Much like the integration of black students into white schools, the rise of a new generation of openly proud gay and lesbian students has led to greater tensions in New York schools. The city’s response was to essentially remove the victims and call it an act of reform. Mayor Michael Bloomberg defended the policy on the grounds that a separate school “lets them get an education without having to worry.” Yet, in classic civil rights terms, it is hard to see how removing gay students is any more a solution to homophobic violence in New York schools than removing James Meredith would have been a solution to racial violence at the University of Mississippi.

Harvey Milk — or Gay High, as it is often called — has become a lesson in the unintended consequences of segregation. Its creation reinforces the stereotype of gay students as fundamentally different and in need of special treatment. Some have suggested that the $3.2 million spent to establish the school could be better used to create a systemwide program of counseling and education for all students on the issues of sexual orientation and discrimination. In a city with roughly 300,000 public high school students, Harvey Milk’s 100-student capacity can handle only a small fraction of the city’s gay, lesbian, bisexual and transgender teenagers. The remainder must deal with the stigma of a segregated group and predictable taunts that they should “go to Harvey Milk,” where they belong.

On America’s other coast, California provides a second example of a separate but equal policy. The state prison system has sought to control violence and reduce gang activity by temporarily segregating incoming prisoners on the basis of race. Hispanic prisoners from Southern California are separated from those from the north; Japanese and Chinese inmates are kept apart; and smaller groups — Laotians, Vietnamese, Cambodians and Filipinos — are segregated as well.

Other large states such as Illinois and New York face similar gang demographics, but none has adopted this sort of automatic segregation. California’s policy of yielding to racism rather than fighting it began almost three decades ago with small concessions, and escalated into a systemwide policy of apartheid for convicts entering any prison. In 1999, when tensions between northern and southern Hispanics erupted into riots at Pelican Bay State Prison, the standard response of corrections professionals elsewhere would have been to crack down on the inmates with a policy of zero tolerance of violence. Instead, California solved the problem by sending each group to its own prison, where it could reign as the dominant Hispanic gang.

Despite the fact that this racial segregation policy has been in place for 25 years, California prisons continue to convulse with racial violence. In 2002, there were about 7,000 incidents of assault and battery and seven deaths — the vast majority linked to racial gangs.

Officials insist that the violence would be worse without segregation for new prisoners. The federal appellate court in San Francisco agreed last year, rejecting a challenge from Garrison Johnson, a black prisoner who refused to join a gang and felt more threatened in a segregated environment. Using a test heavily weighted in favor of the prison, the court demanded that Johnson prove the impossible — that violence would not occur in cells if the policy were lifted. Officials insist that they are just dealing with the realities of racial gangs and their mutual hostility. One prison official observed that “if we have a Northern Hispanic with a Southern Hispanic, they already have a conflict before they come to prison” and the best thing is to simply give them their own space. It is the very logic that the Supreme Court used when it created the separate but equal doctrine in Plessy v. Ferguson, saying the Constitution did not require “a commingling of the two races upon terms unsatisfactory to either.” Integration, the court said then, would have to be “the result of natural affinities, a mutual appreciation of each other’s merits and the voluntary consent of individuals.” It seems unlikely that the white skinheads, black Crips, and Hispanic Fresno Bulldogs will achieve “mutual appreciation” any time soon.

The decisions to embrace separate but equal policies in a high school and a prison system are telling and tragic. Both schools and prisons represent controlled environments that strive in part to shape future conduct through compelled behavior and observation. High schools are the last such environment before most individuals join the larger society — they are the critical forum to teach not just basic curricular skills but basic citizenship skills. Removing gay and lesbian students allows prejudices and intolerance to continue unnoticed and unaddressed, permitting hateful students to become hateful adults.

Prisons are populated by certifiably asocial individuals, who failed to learn basic social principles and values. As a controlled and supervised environment, the prison is supposed to reinforce social rules of conduct through compulsory measures. The segregation policies of the California prisons not only leave racist and violent impulses unaddressed, they actually reinforce those impulses by yielding to them. A segregated prison is fertile ground for gang recruitment.

Equally disturbing is the growing level of “self-segregation” in institutions where there is no claim of racial violence or intolerance. Some colleges and universities now hold official and separate graduation ceremonies for certain minority groups; a growing number have created separate housing aimed specifically at minorities. The University of Pennsylvania houses almost one-quarter of its African American students at the W.E.B. Du Bois College House, and other schools including the University of Michigan and Dartmouth College have similar options. In a rhetorical echo of the Plessy decision, the segregated dormitories at Dartmouth are called “affinity houses.”

While many of these are voluntary choices by the students, such self-segregation still frames the academic experience in at least partially racial terms. This lesson was not lost on one Latino student at Amherst College, who was quoted in a report by the New York Civil Rights Coalition as saying: “Before I came to Amherst, I wasn’t thinking about race or class or gender or sexual orientation, I was just thinking about people wanting to learn.”

The resurrection of separate but equal is not some reflection of its inherent truth or merit. Rather, it is a reflection of a society that has increasingly favored the most expedient over the most ethical means of addressing contemporary problems. The separate but equal doctrine was the very scourge of the civil rights movement, but it continues to have pragmatic appeal — certainly over the more abstract principle of integration. After all, principle is often quite costly while pragmatism offers at least the outward appearance of tranquility at a bargain price. However, as new citizens walk out of places like the New York schools and California prisons, society may rediscover not just the convenience but the costs of separate but equal programs.

Celebrity Justice and the Case of Michael Jackson

June 19, 2005 Sunday
HEADLINE: Michael, Meet Fatty. And Errol and Martha and . . .

BYLINE: Jonathan Turley

BODY:

“Not guilty by reason of celebrity” was one common reaction to last week’s acquittal of Michael Jackson on all 10 counts against him. The notion of “celebrity justice” — as distinct from conventional justice — has taken hold across America. Indeed, it has its own show, the syndicated “Celebrity Justice,” and there are self-described “celebrity justice correspondents” at Fox News and CNN. (CNN features two to handle the occasional sensational overload.) The cottage industry built around celebrity trials is based on the flawed assumption that the rich and famous are given free rides by jurors or simply prevail because of their ability to assemble dream teams of high-priced attorneys. Yet, these trials are different in other respects. For attorneys, the rules are often reversed from conventional criminal trials on such questions as when to put defendants on the stand or whether to attack victims.

One of the first to receive the label “Trial of the Century” was the 1921 trial of film star Roscoe “Fatty” Arbuckle for allegedly raping and killing showgirl Virginia Rappe. The trial was one of the first glimpses into the celebrity Babylon of Hollywood, ranging from Arbuckle’s bacchanalian parties to his custom-made Rolls-Royce with a toilet installed in the back seat.

After three trials, Arbuckle was acquitted and the jury even apologized for the “great injustice . . . done him.” But it was too late for the public. Fatty got us hooked, and we are still trying to get that voyeuristic monkey off our backs. A stream of celebrity cases followed, revealing the often sordid lives of the celebrity class. Among them was the 1958 murder inquest of Cheryl Crane (movie idol Lana Turner’s daughter). The public was fixated on the life of Turner, who was abused by her over-sized mobster boyfriend, Johnny Stompanato — until, that is, the 14-year-old Crane plunged a 10-inch carving knife into his chest. Today we have Jackson. With an audience of 30 million television viewers, Jackson’s verdict was the most popular thing the singer has produced in years.

Most celebrity trials have a notable common element: They result in acquittals. But to suggest that acquittal is the inevitable outcome ignores the fact that many celebrities plead guilty to avoid damaging trials. Despite his acquittal, Arbuckle was ruined by the trial exposure, while celebrities such as Robert Mitchum and Hugh Grant pleaded guilty and went on to leading roles. Indeed, some of the best-known accused celebrities never faced a jury: Mitchum (marijuana possession); Pee-wee Herman (indecent exposure); Robert Downey Jr. (drug possession); Christian Slater (assault and drug possession); Paula Poundstone (child endangerment); Marv Albert (battery); Hugh Grant (solicitation).

The high acquittal rate also reflects the fact that celebrity trials present unique elements that are ignored at the peril of either the prosecution or the defense. For criminal defense attorneys, celebrity trials can seem like a parallel universe where conventional rules and tactics are reversed.

Take, for example, the general disinclination to put a defendant on the stand. In most cases, the risks are simply too high for a defendant. For celebrities, however, the failure to take the stand can come at a much higher cost, as shown by Martha Stewart, who served time for obstructing an investigation into her sale of ImClone stock.

Prosecutors often portray celebrities as detached, abusive personalities who use people for their conspicuous consumption or enjoyment. By the end of the government’s case, Stewart looked as if she did everything short of beating her underlings with a riding crop — an image that could only be changed by Stewart herself.

Stewart might have been saved if she had taken the stand and shown the one thing that she had resisted her whole life: vulnerability. If she had simply said that she was afraid and confused, it might have saved her. Yet, her complex personality seemed incapable of such a simple defense.

To make things worse, her legal team gave the jury a parade of celebrity friends who sat behind Stewart in public showings of support. She made it abundantly clear that the jurors were not her peers in that courtroom — Rosie O’Donnell, Bill Cosby, Brian Dennehy and the rest of her famous friends were her true peers. Her conviction was sealed because her defense played by the conventional trial handbook and refused to put her on the stand.

Ironically, celebrities are often ideally suited for testimony. They are not only natural actors but, like Stewart, people who actively market themselves to the public. For example, when Errol Flynn testified at his 1942 statutory rape trial, he turned the tide after the introduction of truly damning evidence. The swashbuckling actor was well known for preferring underage girls, whom he called his “San Quentin Quails” or JBs (for jail bait).

Likewise, Arbuckle, Crane and Charlie Chaplin all testified and were exonerated. Conversely, when actress Winona Ryder faced shoplifting charges in 2002, she did not take the stand and was convicted.

There are exceptions to this rule. Celebrity defendants O.J. Simpson and Robert Blake were both accused of killing their current or former spouses (as well as a friend in Simpson’s case). Yet, neither could testify without risking that suppressed or excluded evidence could be introduced into the trial. Their acquittals stemmed from other flaws in the cases.

In the case of Jackson, no rational lawyer would have put the notoriously unstable singer on the stand. Wearing pajama bottoms and surrounded by his battalion-size entourage, Jackson could barely hold it together sitting behind the defense table.

For celebrities, the best defense is often offense: putting the accuser on trial. While used in conventional trials with mixed success, this defense has far greater resonance and success in celebrity trials. Jurors tend to be skeptical of people who flutter around celebrities.

In perhaps the most vicious example, Flynn’s lawyers played on his reputation as a rake and attacked the two accusers — Betty Hansen, 17, and Peggy LaRue Satterlee, 15 — as under-aged sirens. The revelation at trial that Satterlee had had a previous affair and later an abortion was all Flynn needed to secure acquittal.

Likewise, Arbuckle’s attorneys attacked Rappe as a woman of questionable morals. Chaplin’s lawyers had it easier in portraying his alleged victim as not only unstable but the real “little tramp”: Actress Joan Barry had allegedly broken into his home with a gun to force a reunion. He testified and was acquitted of the charge of transporting Barry across state lines for sex in violation of the 1910 Mann Act.

In some cases, a victim walks right out of central casting for a celebrity defense. Robert Blake’s wife was a notorious grifter and pornographer. After 20 years of defrauding people, particularly lonely men, you could throw a stick on any corner in L.A. and hit five people who wanted to kill Bonny Lee Bakley.

Jackson’s lawyers showed that, if you do not have a grifter victim, an alleged victim’s grifter mother will do. On the stand, the mother admitted that she had made false allegations in a prior lawsuit and neatly fit the profile of a conniving, predatory personality. Even after she pleaded with the jury “don’t judge me,” they seemed not only to judge but to convict her. Indeed, jurors had more to say about her than Jackson after the verdict, including her nasty habit of snapping her fingers at them.

Celebrities can present themselves as open targets for people who want to extort money through false allegations. In Jackson’s case, it was the perfect model of the clueless meeting the unscrupulous. And it provided a narrative that any jury would have appreciated.

The prosecution often offers something more complex and fluid — and ultimately less salient for a jury. In the Simpson trial, the prosecutors fumbled their narrative out of pure incompetence. By putting police officer Mark Fuhrman on the stand, they handed the late Johnnie Cochran the ready-made story of racist cops bent on making a case against an African American celebrity.

Despite the largely circumstantial evidence, the prosecution had a true shot at conviction with Jackson. There are certain celebrities who fit the model of the Marquis de Sade defendant: a personality who has allowed his unrestrained lifestyle and tastes to mutate into perverse passions. Jackson’s bizarre, kabuki-like appearance gave testimony to his alleged perversities. The same can be said for famous “Wall of Sound” rock music producer Phil Spector, whose violent tastes and creepy hairdo seem to scream suspect for his September murder trial.

None of this means the system is incapable of handling celebrity cases. Justice was done with Jackson, who faced a weak circumstantial case and a weak prosecution. As for Simpson, the case was lost by a breathtakingly incompetent prosecution team.

These were not cases of “celebrity justice,” just celebrity trials and conventional justice. In both cases, the prosecution failed to take advantage of the peculiar patterns that shape celebrity trials and played by conventional rules while the defense played by celebrity rules. It was no contest.

Humiliating Punishments and the Abuse of Judicial Power

September 18, 2005 Sunday

HEADLINE: Shame On You;
Enough With the Humiliating Punishments, Judges

BYLINE: Jonathan Turley

BODY:

Shawn Gementera must have known that he would face some kind of punishment after a police officer nabbed him and a friend in the act of stealing letters from mailboxes along San Francisco’s Fulton Street four years ago. While jail or probation might have crossed Gementera’s mind, U.S. District Judge Vaughn R. Walker had a more creative idea. Walker sentenced Gementera to stand outside a post office while wearing a sign that read: “I stole mail. This is my punishment.” Where the judge saw a novel way of conveying society’s displeasure with mail theft, Gementera’s lawyers saw a violation of the Constitution’s ban on “cruel and unusual punishment.” The U.S. Court of Appeals for the 9th Circuit decided, however, that while the humiliating sentence might be unusual, it wasn’t cruel.

Lately it hasn’t been all that unusual either. The Gementera sentence — taken last month to the Supreme Court — is one of a growing number of “creative punishments” being handed down across the country by judges who want to use shame or humiliation to deter people from committing further offenses. As clever as these punishments might seem, judges are not chosen to serve as parents trying to set consequences for wayward children. Law demands not just consequences for wrongdoing, but consistent consequences. Otherwise citizens are left wondering whether they will receive a standard punishment or one improvised to suit a judge’s whim.

Shaming punishments were common in the United States before the advent of model criminal codes and the development of constitutional limitations in sentencing. While the scarlet letter made famous by Nathaniel Hawthorne’s classic novel about adultery is the best known, it was not the most common. Early sentences often required offenders to endure public displays of guilt by wearing signs or being pilloried in common areas. Adulterers were often required to carry heavy stones around a church or town.

Most shaming punishments were abandoned as either ineffective or unconstitutional. Modern law values the consistent imposition of punishment and frowns upon judges who personally tailor new forms of punishment for particular defendants. What is most dangerous about this recent trend is that, in the name of reforming citizens, judges will impose their own quirky brand of justice by ordering citizens to parade, worship or even marry. Consider a few examples, all from state or local courts:

* In Kentucky, Judge Michael Caperton recently allowed drug and alcohol offenders to skip drug counseling if they agreed to go to 10 church services. A pastor, like a divinely ordained probation officer, signs off on the completion of this obligation.

* In Texas in 2003, Judge Buddie Hahn gave an abusive father a choice between spending 30 nights in jail or 30 nights sleeping in the doghouse where prosecutors alleged the man had forced his 11-year-old stepson to sleep.

* In Georgia last year, Judge Sidney Nation suspended almost all of Brenton Jay Raffensperger’s seven-year sentence for cocaine possession and driving under the influence in exchange for his promise to buy a casket and keep it in his home to remind him of the costs of drug addiction.

* In Ohio, a municipal judge, Michael Cicconetti, cut a 120-day jail sentence down to 45 days for two teens who, on Christmas Eve 2002, had defaced a statue of Jesus they stole from a church’s nativity scene. In exchange, the pair had to deliver a new statue to the church and march through town with a donkey and a sign reading “Sorry for the Jackass Offense.”

* In North Carolina in 2002, Judge James Honeycutt ordered four young offenders who broke into a school and did $60,000 in damage to wear signs around their necks in public that read “I AM A JUVENILE CRIMINAL.” One, a 14-year-old girl, appealed and Honeycutt was reversed.

In a newspaper interview last year, Georgia Judge Rusty Carlisle said he often imposes shaming punishments when defendants seem insufficiently chastened. He cited an early case: a person accused of littering whom Carlisle felt was “kind of cocky.” So the judge gave him a cup and a butter knife and told him to scrape the gum off the bottoms of the court benches as the judge and others watched.

There’s no evidence that creative sentences work better at deterring crime than other punishments. Yet public punishments can be harshest on the most commonly targeted and vulnerable group — young people.

The recent penchant for customized punishments also undermines efforts to make criminal sentencing more uniform. Creative punishments often reflect the cultural character of a state. While an abusive father was given the choice of sleeping in a doghouse in Texas, domestic abusers were forced to attend meditation classes with herbal teas and scented candles in Santa Fe, N.M.

As elected officials, state judges know that few things please the public as much as hoisting a wretch in public. One Texas state judge, Ted Poe, was known as “The King of Shame” for his signature use of punishments like shoveling manure. Poe said that he liked to humiliate people because “[t]he people I see have too good a self-esteem.” Poe was so popular for what he called “Poe-tic Justice” that he literally shamed himself right into Congress and is now serving as a member of the House of Representatives.

In Memphis, Judge Joe Brown became famous for allowing victims of burglaries to go to the homes of the thieves and take something of equal value. When asked about his authority to order judicially supervised burglaries, Brown explained with a hint of amazement that “under Tennessee law it appears to be legal.” Brown eventually took his brand of justice to television as the host of his own syndicated court show.

What distinguishes the Gementera case is that it was a federal judge who imposed the shaming punishment. Federal judges have long been viewed as insulated from this trend — until now. And Judge Walker was upheld by the 9th Circuit Court of Appeals, which noted that “in comparison with the reality of the modern prison, we simply have no reason to conclude that the sanction . . . exceeds the bounds of ‘civilized standards’ or other ‘evolving standards of decency that mark the progress of a maturing society.’ ”

But the 9th Circuit Court’s ruling is more a devolution of standards. These novel sentences threaten the very foundation of a legal system by allowing arbitrary and impulsive decisions by judges. A judge is allowed to weigh guilt and impose sentences. Yet it is the legislature that should define the forms and range of permissible punishment for a crime. That’s why it was popular but wrong when North Carolina Judge Marcia Morey recently allowed speeders to send their fines to a charity for hurricane victims rather than to the state. Similarly, Wisconsin Judge Scott Woldt recently ordered Sharon Rosenthal, who stole money from the labor union where she was treasurer, to donate her family’s Green Bay Packers seats to his preferred charity, the Make-A-Wish Foundation. Such measures turn court cases into private charity pledge drives.

As judges vie for notoriety through sentencing, citizens will be increasingly uncertain about the consequences of their actions. Will it be probation or humiliation? Once you allow judges to indulge their own punitive fantasies, defendants become their personal playthings — freaks on a leash to be paraded at the judges’ pleasure.

These cases betray a disturbing convergence of entertainment and justice in the United States. There has been an explosion of faux-court programs like “Judge Judy,” “Judge Hatchett” and “Judge Joe Brown.” For anyone who knows and values the legal system, these shows are vulgar caricatures that have no more relation to real law than TV’s Wrestlemania has to real wrestling. Yet it appears that some judges long for those Judge Judy moments when they can hand out their own idiosyncratic forms of justice.

If states and Congress do not act, we may find ourselves with hundreds of Judge Browns imposing sitcom justice with real citizens as their walk-on characters. In the meantime, as shaming devices become commonplace and therefore less shameful, and as there are more people walking around wearing special signs, jurists will need to dream up new, more demeaning punishments to make an impression on defendants — leaving both citizens and justice at risk.

The Supreme Court could help reverse this shameful trend with the Gementera case. Of course, even if it does, Judge Walker is unlikely to be seen standing outside the San Francisco courthouse wearing a sandwich board proclaiming “I Was Reversed by the Supreme Court” or “I Imposed Cruel and Unusual Punishment.” In some ways, that’s a real shame.

John Frederick Ames and the Law’s Misuse in a Fatal Feud

October 9, 2005 Sunday
Washington Post
HEADLINE: A Faulty Law, a Feud, a Fatality

BODY:

Last month John Frederick Ames, a bankruptcy lawyer from Richmond, was acquitted of the murder of his neighbor, Oliver “Perry” Brooks [Metro, Sept. 17]. It was the latest chapter in a story worthy of William Faulkner that concerns an arcane 1887 law and a state legislature that refused to repeal it.

The dispute that led to Brooks’s death began in 1989, when Ames, who had purchased a 675-acre Caroline County farm from a widow facing bankruptcy, sent his neighbors a registered letter informing them that he was going to build a fence around his property. The letter also said that he was going to charge his neighbors for half the cost of the fence, which amounted to thousands of dollars. Ames said the 1887 law allowed him to bill them for the fence even without their consent.

Ames’s neighbors, who included retirees on fixed incomes, received bills of $6,000 to $45,000. All of them, including Brooks, who was living on $400 a month from Social Security, refused to pay. Ames had billed Brooks $45,000 for his share of the fence. Ames reportedly offered to forget about the $45,000 if Brooks would deed over some of his land, but Brooks refused. The case went through the courts, and in 1991, Ames finally prevailed in the Virginia Supreme Court.

His neighbors then scraped together the money for the fence — all but Brooks, that is, who continued to refuse to pay. Ames subsequently sued his neighbor for $450,000 for fence damage caused by a bull that Brooks owned. The bull repeatedly broke the fence and strayed onto Ames’s cattle farm. Ames called these bull incursions an “intentional disregard” of his rights. Brooks responded with obstinacy and anger.

The bad blood finally boiled over in April 2004, when Brooks’s bull once again strayed onto Ames’s land. Despite court orders barring him from entering Ames’s property, Brooks went to retrieve his livestock. An armed Ames told him to leave the animal. When Brooks brandished a stick he used to herd the bull, Ames shot him in the face and then shot him four more times.

Ames said the shooting was in self-defense. But his acquittal by a jury last month, which accepted that claim of self-defense on the murder charge, isn’t the end of the story. Ames still may get the land that he was seeking from Brooks. He previously sued the Brooks family for $11.3 million in an action that originally cited everything from infliction of emotional distress to terrorism. He recently withdrew that action, but he still has a lien on the Brooks property and an outstanding fence payment that could exceed $150,000 with interest. The Brooks family is suing Ames for wrongful death.

Ames may fit the stereotype of a lawyer who will use any law to his advantage, regardless of the cost to others, but the Virginia General Assembly deserves equal blame for the mess that culminated in the death of a man. It repeatedly failed to repeal the archaic law that allowed the feud to get going in the first place.

When the state Supreme Court ruled in favor of Ames in 1991, it noted that Virginia was out of step with the common-law rule that a landowner’s boundary line is a lawful fence and that a cattle owner is liable for trespass by his animals. Virginia, however, does not impose such liability on livestock owners and allows them to force neighbors to pay toward “fencing out” livestock. Despite the feud and requests for the law to be changed, the legislature did not act. Only after Brooks was dead and Ames was facing a murder charge did it change the law — and then only to exempt landowners without livestock, which would not have protected Brooks.

The common law and most states impose costs on livestock owners for any damage that their animals cause to a neighbor. This sensible “fence-in” approach recognizes that a livestock owner should not be able to impose the cost of his or her enterprise on neighbors.

A fundamental purpose of the law is to reduce conflicts among neighbors by maintaining clear, consistent and fair rules. The Virginia legislature clearly failed in that duty. It may be true that good fences make good neighbors, but the Brooks killing shows that bad laws, like bad fences, make for bad neighbors.

UPDATE: It appears Donald Trump uses the same questionable means to harass homeowners.

Buck Fever: Dick Cheney’s Bad Aim and Judgment is Not Unique

February 19, 2006 Sunday

HEADLINE: The Buck (Fever) Stops Here;
Bankety Bankety Bankety Bankety Bankety Bankety

When Vice President Cheney bagged a Republican donor during a quail hunt, he became the first U.S. vice president to shoot someone while in office since 1804, when Aaron Burr shot Alexander Hamilton. But from a legal point of view, the precedent that matters here may not be our third vice president, but rather a hapless Maine hunter named Donald Rogerson.

Whereas Burr pulled the trigger in a duel over honor and politics, Rogerson, like Cheney, shot someone while in search of game. Mistaking a 37-year-old housewife for a white-tailed deer, Rogerson shot and killed her. Locals insisted that the victim (who had recently moved from Iowa) was to blame because she was wearing white mittens during deer season. And a Bangor, Maine, jury cleared him of manslaughter.

No one in authority is talking about charging Cheney with a crime. But Cheney and Rogerson share the ignoble distinction of succumbing to what hunters (and lawyers) call “buck fever.” It is a phenomenon as old as hunting, defined by the Random House dictionary as the “nervous excitement of an inexperienced hunter upon the approach of game.” Yet experienced hunters have also been known to cut down neighbors they have mistaken for bucks, ducks and other quarry.

Buck fever is a recognized defense for negligent hunters, particularly youths. When a teenager shot a local businessman dressed in orange during deer season, he was excused from civil liability because of buck fever, despite the absence of any known species of orange-colored deer.

The law governing hunting accidents has long been controversial. This is the one area where citizens routinely shoot and kill other citizens without civil or criminal penalty — or even the loss of a hunting license. Indeed, most cases of accidental shootings are viewed as reasonable mistakes by hunters and often it is the victim who is blamed for failing to give a hunter a wide berth. Even in the few cases where criminal and civil charges are brought against hunters such as Cheney, they are often tried by a jury of their peers: jurors from communities where hunting and hunting accidents are a way of life.

That’s what saved Rogerson from prison. Karen Wood had only been out in her backyard for a minute, leaving her year-old twin girls in her house, when Rogerson shot her in the chest with a .30-06 rifle. Despite a 4X power scope and a distance of only 188 feet, Rogerson insisted that he mistook Wood for a deer he had seen — though a game warden found no tracks or other evidence.

Putting aside the question of how many biped deer Rogerson had previously encountered, the jury seemed to ignore the fact that Rogerson violated state law, which required him to identify a buck with antlers and to avoid shooting within 300 feet of a house. The case exemplifies the unique deference shown to hunters who maim or kill neighbors. When a hunter in Pennsylvania shot and killed a relative, he was cleared because the victim was making “animal-like movements.”

Hunting accidents stand in sharp contrast to other types of lethal negligence. In areas ranging from vehicular accidents to corporate misconduct, individuals routinely face criminal charges for reckless conduct. In hunting, however, gross negligence is often refashioned as mere “excitement.” Indeed, criminal charges can be downgraded when the killing was done in sport. In Illinois, a judge, while standing in front of his garage, was shot through the throat and shoulder and police treated the matter as “an attempted assassination.” A man finally came forward to admit that he shot the judge while firing at a hawk. His most serious charge? Shooting a bird of prey. He was put under supervision and given a $200 fine.

Hundreds of people are shot each year in “mistaken for game” cases. According to the International Hunting Education Association, in 2004, 41 people were killed and 250 wounded in hunting accidents in the United States, down from 91 killed and 835 wounded in 2000. Unlike Harry Whittington, whom Cheney sprayed with as many as 200 shotgun pellets, the victims are often not even fellow hunters. Judy Moilanen was merely walking her dog in Ontonagon, Mich., when she was killed. Debra Kelly of Osseo, Wis., had her eye shot out by her 13-year-old nephew while she stood in front of her house.

Based on the public accounts of last weekend’s shooting, there is a good case to be made that Cheney was negligent. A person is negligent per se when he violates a statutory standard of care, such as the requirement to establish a clear line of fire and to positively identify his target as game. (This puts aside the fact that Cheney was hunting without a proper state stamp.)

Cheney’s is a classic case of buck fever. There was nothing particularly confusing or unexpected about an individual rejoining a hunting line, as Whittington reportedly did. Rather, it was likely the euphoria of seeking and shooting game that blinded Cheney to the fact that he was aiming at a 78-year-old attorney rather than a six-ounce bird. Medical studies show that hunters often experience a type of physiological frenzy in the presence of game — or its illusion. When shooting a deer, a male hunter’s heart rate can reach 118 percent of his maximum. Given Cheney’s heart condition, hunting would seem a poor recreational choice for the vice president.

Cheney’s case reflects a troubling de facto immunity given to negligent hunters. Because of our tradition of hunting, we view people who make lethal use of a firearm as less culpable than those who make lethal use of objects like cars. Texas probably won’t require that Cheney take safety classes or suspend his license. The local county sheriff’s office has already declared the case closed. For his part, Cheney feels no compulsion to promise that the “buck (fever) stops here” and give up hunting.

At least Whittington knows who shot him. Frequently, the culprits in hunting manslaughter cases are never identified. With the expansion of suburbia, it is increasingly common for people to unwittingly enter a line of fire. In 1992, in Leeds, Ala., 22-month-old Ashley Ramage was shot and killed while simply riding between her parents in their truck.

Even in the Washington area, hunters are permitted to hunt game and fowl. Joan Manley, a federal lawyer, was shocked during a morning walk with her two golden retrievers around Jones Point in Alexandria. Alongside the heavily traveled path that runs next to the Potomac, two hunters sat with loaded shotguns in a boat resting on the shoreline; they were after ducks. Two Alexandria police officers confirmed that the men had a proper license and were expected to avoid joggers and bird watchers.

If they had failed, they could have expected no worse punishment than Cheney has received. As long as we continue to treat buck fever as a defense rather than an offense under civil and criminal laws, it’s best to leave the white mittens at home.

Defaming the Dead

September 17, 2006 Sunday

Elvis Presley was a pedophile. Queen Victoria, a lesbian. Abraham Lincoln, a gay adulterer. Winston Churchill, a murderous conspirator.

These are all “facts” published in recent years about famous people, and in each case such claims would normally bring charges of libel per se — a legal term signifying defamation so serious that damages are presumed. However, these statements also share one other important element: They were all published after the subjects had died. As a result, the publishers are protected by the longstanding rule that you cannot defame the dead (which, in practical terms, means you can). Once Elvis has left the living, you can say anything you want about him. No matter how malicious, untrue or vile.

Indeed, while most people are raised not to speak ill of the dead, the law fully supports those who do. Under the common-law rules governing defamation, a reputation is as perishable as the person who earned it. It is a rule first expressed in the Latin doctrine actio personalis moritur cum persona (“a personal right of action dies with the person”). The English jurist Sir James Stephen put it more simply in 1887, “The dead have no rights and can suffer no wrongs.” In other words, you’re fair game as soon as you die — even if writers say viciously untrue things about you and your life.

The question of whether the dead can be defamed came up recently in a most unlikely way: The family of John Dillinger sued over a depiction of the famous bank robber at the John Dillinger Museum in Hammond, Ind. The museum describes Dillinger as a cop killer, but his relatives note (correctly) that Dillinger was only charged with killing a police officer during his robbery of the First National Bank and Trust in East Chicago, Ind., on Jan. 15, 1934. He died before standing trial.

Disputes such as that over Dillinger — his family, unable to sue for defamation, had to rely instead on a state law that protects public figures from the commercial use of their images — serve mostly to remind us of the grossly unfair and unnecessary rule that allows people to savage the reputations of the dead.

Dillinger’s is only the latest, and far from the greatest, of such post-mortem injuries. Filmmakers and writers in past years have reinvented figures as varied as turn-of-the-century populist William Jennings Bryan, mid-century heartthrob Gary Cooper and President Richard M. Nixon to better fit a storyline — putting out false images that often become “fact” in the popular imagination. Without legal protection, such figures are subject to all matter of creative revisionism, and their families must live with whatever name and reputation they have left once the scriptwriters and biographers are done.

Through the years, many states have considered changing this rule, but have not acted. In New York, the issue came to a head in 1987, when Tawana Brawley, a black teenager, falsely accused a prosecutor, a New York police officer and a state trooper of a racist attack and rape. With people such as Al Sharpton calling the accused men racists and rapists, their reputations were utterly destroyed. The innocent police officer, Harry Crist Jr., was implicated only after he had committed suicide. When a grand jury rejected Brawley’s claims, it took the highly unusual step of recommending that the state pass a law protecting the dead from such knowingly false statements. New York never did.

Allowing some protection for the deceased would not end historical critiques and articles. Many countries protect the reputations of the dead but have not seen a flood of defamation cases in court.

Without such protections, the dead are readily converted into madmen or murderers. Consider the character assassination of First Officer William McMaster Murdoch in the 1997 movie “Titanic.” The movie portrays Murdoch as a nut who shoots a passenger and then himself. However, not only was no one known to have been shot that night, but survivors identified Murdoch as one of the great heroes of the tragedy — giving his lifejacket to a passenger and then remaining on board to drown. (After historians and relatives objected, the studio sent a $5,000 check to Murdoch’s town of Dalbeattie, Scotland, for a scholarship fund.)

The family of the former heavyweight boxing champion Max Baer has a similarly legitimate complaint against director Ron Howard and the makers of the 2005 blockbuster movie “Cinderella Man.” It demonized Baer as the killer of two men in the boxing ring (he killed one man) and claimed he committed such notorious acts as bragging to opponent Jim Braddock’s wife, Mae, that he would kill her husband and then sleep with her.

There was no such outrageous encounter with Mae Braddock, and many have insisted that rather than boasting about killing Frankie Campbell as portrayed in the movie, Baer was haunted for the rest of his life by the death. Baer’s son, Max Baer Jr. (who played Jethro on “The Beverly Hillbillies”) told me that his father had nightmares about it and that he raised considerable money for Campbell’s family. Jeremy Schaap, who wrote the book “Cinderella Man,” told me that Baer went into an emotional “tailspin” after killing Campbell and lost a couple of fights because he refused to finish off opponents out of fear of another fatality. As for the scene with Mae Braddock, Schaap says adamantly, “It is totally made up.” (Baer, who was one-quarter Jewish, was probably best known for fighting with a Star of David on his shorts to protest rising anti-Semitism — a particular slap at Adolf Hitler when Baer defeated Germany’s Max Schmeling in 1933.)

If there were any threat of a defamation lawsuit, the studio lawyers would never have allowed such false portrayals. Indeed, ABC recently edited out material from its docudrama “The Path to 9/11” after attorneys for Clinton administration officials objected to inaccurate portrayals, including fabricated scenes. The problem was not that ABC falsely portrayed former national security adviser Samuel R. “Sandy” Berger as hanging up on CIA agents who were poised to kill Osama bin Laden. The problem was that Berger is still alive. (The scene was dropped.)

But Murdoch and Baer were long dead, so their reputations rested entirely on the self-imposed decency of the writers and directors — and in Hollywood, that means they were cinematic chum.

Publishers are often no better. Books purporting to tell all are often held until after the subject dies — leaving the family without legal recourse. Thus, the widow and children of Cary Grant could only complain about the book “Cary Grant: The Lonely Heart,” in which authors Charles Higham and Roy Moseley claimed that Grant was a Nazi sympathizer who “in 1938 would go to Berlin and be entertained by Hitler” — suggesting that Grant partied with a genocidal killer. There is no evidence of any such meeting, and Grant’s family insists that he neither met Hitler nor harbored any Nazi sympathies. Errol Flynn’s relatives sued Higham over his claim that Flynn was a Nazi spy. They lost under the common-law rule.

It would be relatively simple to draft such a law with safeguards for writers and publishers. States could extend the high standard for defamation of public figures to any deceased person — limiting actions to the most egregious violations in which the writer knowingly engaged in a falsehood or showed reckless disregard for the truth. The law could also limit any recovery to a declaratory judgment that corrects the public record and injunctive relief with no monetary damages.

There is an obvious precedent in the protections that most states offer for newspapers that print retractions — laws that could be extended to cases involving the deceased. For example, the New York Times reported in a 2003 obituary that the famous Harlem photographer Marvin Smith had his testicles removed after his twin brother, Morgan, died of testicular cancer in 1993. It was untrue and the Times voluntarily printed a correction.

None of this means that Hollywood should suddenly become the History Channel. The Hollywood view of history has always been more Cecil B. DeMille than Barbara Tuchman. Even a much-acclaimed movie such as “Inherit the Wind” invented scenes and so mutated the character based on William Jennings Bryan that many Americans wrongly believe that he was a bumbling, prejudiced clown. Bryan never testified that he knew the precise day and time that Earth was created — nor did he collapse in a delusional fit in court after the famous evolution verdict.

Yet in most cases, such revisionism involves distorting historical events rather than destroying historical figures. There was no reputation lost when Mel Gibson inaccurately portrayed the Scottish warrior William Wallace fighting to avenge the death of his wife at the hands of the English in “Braveheart.” (The only known account states that Wallace was pushed over the edge after a dispute with English soldiers over fish.) The wildly inaccurate movie, however, crossed the line of decency by suggesting that Princess Isabelle, based on Isabella of France, was an adulterer and that her son, Edward III, may have been fathered by Wallace. The real princess was 9 when Wallace died, she never met him and she bore Edward III seven years after Wallace died.

Just the mention of Oliver Stone pushes most historians into an open rant over films such as “JFK” and “Nixon.” Stone has insisted that he wasn’t doing anything that Shakespeare didn’t do. Yet it seems unlikely that the Bard would have falsely portrayed Pat Nixon demanding a divorce or misrepresented President Nixon as a stumbling drunk who led a CIA operation to try to kill Cuban dictator Fidel Castro.

After all, it is Shakespeare’s Iago in “Othello” who observes that:

“Who steals my purse steals trash; ’tis something, nothing; . . .

But he that filches from me my good name robs me of that which not enriches him

And makes me poor indeed.”

We are all made poorer when good people are trashed after they can no longer defend themselves. With the end of the debate over the permanent repeal of the death tax, perhaps it is time to protect more than just the assets of the deceased. Perhaps it is time to give the dead their due.

Rep. William Jefferson and the Presumption of Innocence

June 7, 2007 Thursday
SECTION: GUEST OBSERVER

LENGTH: 1188 words

HEADLINE: Jefferson Deserves Presumption Of Innocence

BYLINE: Jonathan Turley

BODY:

The 16-count indictment of Rep. William Jefferson (D-La.) this week has produced a spasm of legislative proposals and speeches calling for his expulsion from the House. Frankly, as a longtime critic of Congressional ethics rules, I never thought I would be arguing against an effort to purge or punish an unethical member. However, expelling Jefferson would violate core constitutional principles and likely trigger an intense legal fight. Even Members of Congress deserve a presumption of innocence and their “jury of peers” must remain fellow citizens, not fellow politicians.

Many of us have remarked on the strength of the evidence against Jefferson, including his famous frozen asset problem of the $90,000 found in his freezer. The 95-page indictment details 11 alleged bribery and fraud conspiracies that stretch across Africa.

Yet the merits of the Jefferson indictment are irrelevant to this debate. Expelling a Member before a conviction puts politicians in the role of a jury – meting out punishment in a politically charged environment. With the exception of the carefully structured impeachment proceedings, the framers did not foresee such a role for Members. It would be a dangerous precedent if a majority could declare a colleague presumptively guilty and toss him or her from Congress.

Expulsion before trial also is grossly unfair for a Member who is forced to defend his name in simultaneous proceedings before the courts and Congress. Putting aside the heavy financial burden, Congressional hearings could compromise privileged information or force a Member to waive constitutional rights to make a compelling case against expulsion.

Congress has long recognized those dangers and waited for the judicial system to reach its own conclusions. For that reason, the House waited until then-Rep. James Traficant (Ohio) was convicted on corruption charges before expelling him in 2002. Before that, the House had expelled only Rep. Michael Myers (Pa.) in 1980, after his Abscam bribery conviction, and three Members as traitors at the beginning of the Civil War in 1861.

Ironically, some of the loudest calls are now coming from Republicans who fought expulsion or punishment efforts in the previous Congress after the indictments of former Majority Leader Tom DeLay (R-Texas) and former Rep. Bob Ney (R-Ohio). Yet they are not alone. Many embarrassed Democrats supported stripping Jefferson of his powerful committee positions before his indictment and now support an expedited process that could lead to expulsion.

History has shown that public assumptions of guilt often fall short in an actual court of law. Various Members have been unsuccessfully investigated or even prosecuted. Rep. Alcee Hastings (D-Fla.) was acquitted of charges of corruption as a federal judge. While some denounced that verdict, Hastings has now served in the House for 15 years and is viewed by many as someone who has served with distinction.

The Hastings case is a useful point of comparison. Congress waited until after his trial to impeach Hastings, and it did so despite his acquittal. However, there is a great difference between removing a judge and removing a Representative. Under Article III, federal judges “hold their Offices during good Behaviour.” While the Hastings impeachment was controversial because of his acquittal, it generally was accepted that Congress could impeach him under this authority.

Conversely, it would have been shocking for Congress to try to bar Hastings from service in the legislative branch based on the same evidence.

To expel a Member is to negate the votes of citizens who have a right to select their own representative, regardless of the views of other politicians. In Jefferson’s case, his constituents returned him to office after the details of the investigation were made public. Despite national calls for his ouster, he won 57 percent of the vote in his re-election in November.

The framers were adamant in restricting the authority of Congress to engage in selective pruning of its ranks. During the Constitutional Convention, the framers made reference to a contemporary controversy over the expulsion of John Wilkes from Parliament. Wilkes had publicly attacked the peace treaty with France and, in doing so, earned the ire of Crown and Parliament. After he was convicted and jailed for sedition, Parliament moved to declare him ineligible for service in the legislature. He served anyway, and eventually Parliament rescinded the legislative effort to disqualify him.

The framers feared that, unless Congress was prevented from manipulating its membership, history would repeat itself. Citing the Wilkes case as “worthy of our attention,” James Madison warned that if Congress could engage in such manipulation it would “subvert the Constitution.”

Likewise, Alexander Hamilton noted that “[t]he qualifications of the persons who may choose or be chosen … are defined and fixed in the Constitution, and are unalterable by the legislature.”

This history has helped courts understand the meaning of the Qualifications Clause of Article I, Section 2, which references state laws as setting qualifications for Members. Despite this language and the authority of Congress to punish its own Members, the Supreme Court has stressed that neither the states nor Congress can manipulate qualifications to exclude politicians. As the court noted in U.S. Term Limits v. Thornton, the framers feared that, if the membership of Congress could be manipulated, Congress could become “a self perpetuating body to the detriment of the new Republic.”

At least Wilkes had the benefit of a trial and had served time for his alleged crime. Expelling a Member before conviction would allow such manipulation by majority vote based on popular sentiment or political convenience – an obvious danger when our Congress is divided so closely between the parties. These dangers are magnified in a Congress that is now claiming the unprecedented right to create new voting Members. The House recently passed legislation that would, for the first time in history, create a new type of voting Member in the House – giving the District of Columbia a voting representative despite the fact that it is not a state. With the expulsion effort, Congress would not only be asserting the right to create new voting Members for federal enclaves but also the right to expel other Members suspected of crimes.

Politics ultimately may trump principle on this question. In a Congress under intense public criticism for its failure to pass meaningful ethics reforms, Jefferson has become a useful object lesson for Members to demonstrate their commitment to good government. Suddenly, the House looks like a Claude Rains convention with 435 Members practicing their “shocked, shocked” sound bites.

Jefferson recently resigned his only remaining committee position on the Small Business panel. He has been marginalized and vilified – for good reason. However, the House would do far greater damage to its institution if it yields to the temptation to pronounce guilt before a colleague has had his day in court.

Jonathan Turley is the Shapiro professor of public interest law at George Washington University.

Boys and Toy Guns

February 25, 2007 Sunday

HEADLINE: My Boys Like Shootouts. What’s Wrong With That?

As the father of four kids younger than 9, I confess to being an overly obsessive and doting parent. I secretly follow my 8-year-old son, Benjamin, when he goes out on his bike, to make sure that he doesn’t ride in the middle of the street. I hover inches over my 18-month-old daughter, Madie, at the playground to make sure that she doesn’t eat sand. I am the very model of the risk-averse parent. Yet for some parents in my neighborhood, my kids and I are the risk to be avoided, even if it means removing their children when we show up at the park. The reason: toy guns.

I first noticed the “shunning” at the most unlikely of events. Each year on Labor Day, my Alexandria community has a “Wheel Day” parade in which hundreds of kids convert their bikes, scooters and wagons into different fantasy vehicles. Last year, we turned our red wagon into a replica Conestoga wagon with real sewn canvas over wooden ribs, wooden water barrels, quarter horse — and, yes, plastic rifles. It was a big hit and the kids won first prize for their age group. The celebration, however, was short-lived. As soon as one mother spotted the toy rifles inside the wagon, she pulled her screaming children out of the event, announcing that she would not “expose them” to guns.

I must confess to feeling a mix of deep guilt and even deeper rage at that moment. It was not as though my kids were reenacting the massacre of a Cherokee village; they were simply living out innocent fantasies of the Old West. After some grumbling, my friends and I eventually dismissed the matter as some earth mother gone berserk.

But then it happened again.

My 4-year-old son, Aidan, brought his orange Buzz Lightyear plastic ray gun to “the pit,” as our neighborhood playground is known. As he began pursuing an evildoer — his 6-year-old brother, Jack — around the playground, a mother froze with an expression of utter revulsion. Glaring alternately from Aidan to me, she waited for a few minutes before grabbing her son and proclaiming loudly that he could not play there “if that boy is going to be allowed to play with guns.”

While such “zero-tolerance” parents still seem to be a minority, this is a scene that seems to be repeating itself with increasing regularity. To these parents, my wife and I are “gun-tolerant” and therefore corruptors of children who should be avoided. Not only are such toys viewed as encouraging aggressive behavior and violent attitudes, they are also seen as reinforcing gender stereotypes, with boys playing with guns or swords and girls playing with dolls or cooking sets.

My wife and I are hardly poster parents for the National Rifle Association. We are social liberals who fret over every detail and danger of child rearing. We do not let our kids watch violent TV shows and do not tolerate rough play. Like most of our friends, we tried early on to avoid any gender stereotypes in our selection of games and toys. However, our effort to avoid guns and swords and other similar toys became a Sisyphean battle. Once, in a fit of exasperation, my wife gathered up all of the swords that the boys had acquired as gifts and threw them into the trash. When she returned to the house, she found that the boys had commandeered the celery from the refrigerator to finish their epic battle. Forced to choose between balanced diets and balanced play, my wife returned the swords with strict guidelines about where and when pirate fights, ninja attacks and Jedi rescues could occur.

When I began to research this issue, I found a library of academic studies with such engaging titles as “Longitudinal Stability of Personality Traits: A Multitrait-Multimethod-Multioccasion Analysis.” The thrust was that gender differences do exist in the toys and games that boys and girls tend to choose. The anecdotal evidence in my neighborhood (with more than 60 young kids in a four-block radius) was even clearer: Parents of boys reported endless variations on the celery swords. There seems to be something “hard-wired” with the XY chromosome that leads boys to glance at a small moss-covered branch and immediately see an air-cooled, camouflaged, fully automatic 50-caliber Browning rifle with attachable bayonet.

Many parents can relate to Holley and Warren Lutz, who thought that after their daughter Seeley, they could raise her little brother, Carver, in a weapon-free house. Holley realized her error when she gave 10-month-old Carver a Barbie doll and truck one day. The little boy examined both and then proceeded to run Barbie over repeatedly with the truck. By 2, he was bending his sister’s Barbies into L-shapes and using them as guns.

One of my neighbors, Tracy Miller, a child psychologist and mother of three girls and a boy, found that her son instinctively gravitated toward improvised weaponry from an early age, while her girls, who are temperamentally more assertive, never showed the slightest interest. Miller resolved that it was better to allow this type of channeling of aggression, while keeping tabs on how it manifested itself in her son’s games.

Her view is supported by a recent flurry of studies looking at boys and their development. Michael Thompson, a psychologist and coauthor of “Raising Cain: Protecting the Emotional Life of Boys,” writes that parents often overreact when confronted with toy guns and other games: “Play is play. Violence is violence.” The key is making sure that kids distinguish between the two in their play.

Nancy Carlsson-Paige, co-author of the book “Who’s Calling the Shots?: How to Respond Effectively to Children’s Fascination with War Play and War Toys,” sees it differently. These toys are not the product of natural childhood fantasies, she says, but “really manifest the ideas of adults — of marketing people” who push toys that reflect an adult imagination more than a child’s. Yet Carlsson-Paige, who has long studied the effect of violence in the media on the social development of children, says it is true that guns and war games are a way of helping some children process the plethora of violent images on television, in videos, in the news. When I asked her about my neighborhood toy gun issues, she told me: “If parents ‘ban’ gun play, they run the risk of cutting off a valuable vehicle children need for processing the violence [because] kids use their play to make meaning of what they have experienced in life, and in this case, of the violence they have seen.”

For his part, the late child psychologist Bruno Bettelheim, author of “The Good Enough Parent,” said that there is clearly a gender difference in the toys parents give boys and girls to play with, but he thought that rather than taking guns away from boys, parents should pass them out to girls, who would be served “equally well to be able to discharge their anger through symbolic play, as with toy guns.”

While the zero-tolerance debate about guns and other such toys predated the 1990s, it was greatly accelerated after the 1999 Columbine High School shootings as educators rushed to develop formal policies against weapons (fake or real) in schools. This made obvious sense to most parents — these toys do lend themselves to disruptive games and it can be difficult from a distance to distinguish between real and toy weapons. However, nervous school officials soon began to apply these policies as strict liability offenses where even the most minor violation is treated as a cause for arrest, expulsion or special schooling.

Consider:

• In New Jersey, an 8-year-old boy used an L-shaped piece of paper in a game of cops and robbers during recess. School officials called the police, saying the child had threatened “to kill other students” by saying “pow pow” on the playground. He was held for five hours and forced to make two court appearances before charges were dropped. In a separate case, two 8-year-old boys were charged with making “terrorist threats” after they were found pointing paper guns at classmates; those charges, too, were later dropped.

• In Texas, a 13-year-old girl was suspended and transferred to a school for problem kids after she brought a butter knife to school with her lunch. Her parents had packed the dull knife so that she could cut her apple to make it easier to eat because she wore braces.

• In Arkansas, an 8-year-old boy was punished for pointing a cooked chicken strip at another student and saying “pow, pow, pow.”

• In Georgia, a 5-year-old student was suspended after he brought a plastic gun the size of a quarter to his kindergarten class.

Even drawing a picture is too close for comfort under these zero-tolerance policies. In Florida, two 10-year-olds were arrested after drawing stick figures considered to be threatening, and in Nevada, teachers tried unsuccessfully to expel a boy for drawing a cartoon of the death of his teacher.

While many people are complaining about such harsh actions and lawmakers are beginning to call for more moderate policies, some parents want zero-tolerance policies extended to playgrounds, parties and other venues. That has put many of us who have a more expansive view of what is acceptable childhood play in the unenviable position of either conforming to a policy that we believe to be excessive or continually triggering confrontations with zero-tolerance parents.

Of course, it is a bit troubling to be seen as a local gun merchant supplying the weaponry of gratuitous violence to our playgrounds. However, we do not believe that play guns and swords are ruining our children. Frankly, after three boys, my wife and I have resolved the nature/nurture debate in our house in favor of nature.

Yet on the playground there seems to be a palpable fear among zero-tolerance parents that boys harbor some deep and dark violent gene that, if awakened, is likely to end years later with some sort of Hannibal Lecter situation. Of course, there are at least 100 million men in this country who probably played with toy guns or swords as children and did not grow up to become serial killers.

As one of five kids (with two older brothers), I grew up in a liberal, no-guns household in Chicago in the 1960s. My mother considered it her duty to smash any squirt gun we brought into the house. In looking back, though, I’m sure that her gun-free policy made us all the more obsessed with the toys. My kids, on the other hand, show no such fixation. They rarely play gun games (sword fights are more common) and are more inclined to hunt for valuable rocks on the playground or convert our best linens into makeshift yurts in the living room.

Still, when their best friend recently invited them to his Army-themed birthday party, it didn’t bother us a bit (though some parents did refuse to let their children attend). In fact, I was struck by how, more than combat fighting, the boys tended to act out scenes involving rescuing comrades or defending the wounded. What I saw was not boys experimenting with carnage and slaughter, but modeling notions of courage and sacrifice. They were trying to experience the emotions at the extremes of human conduct: facing and overcoming fear to remain faithful to their fellow soldiers.

Or, as child psychologist Penny Holland put it in her book, “We Don’t Play with Guns Here,” their make-believe games were “part of . . . making sense of the world [imitating] timeless themes of the struggle between good and evil.” This explanation is probably all the more important in a world filled with violent images of war on television and in the news.

Being a weapons-tolerant parent doesn’t mean I’m thrilled by these games. I would prefer that my sons played nation-builder or rocket scientist. However, before they get to such fantasies, they seem to have to work out more basic emotions in more basic ways. So for a few more years at least, the celery will remain in the fridge and the swords on the playground.

The Duke Rape Case and Prosecutorial Abuse

Lots of Prosecutors Go Too Far. Most Get Away With It.

By Jonathan Turley
Sunday, June 24, 2007; Page B03

It was an extraordinary scene when Michael B. Nifong, the district attorney in Durham, N.C., took the stand to defend his law license after his failed crusade to convict innocent Duke University lacrosse players of gang rape. He had no more success with his own defense. After being disbarred for “dishonesty, fraud, deceit and misrepresentation,” he was suspended from his job last week and now faces a possible lawsuit in civil court.
What’s most remarkable about the whole scene, though, is how rare it is. Nifong’s misconduct was hardly unusual: Some of the most high-profile cases in history have involved strikingly similar acts of prosecutorial abuse. But instead of being punished, the worst violators are often lionized for their aggressive styles — maybe even rewarded with a cable television show.
Nifong is a classic example of the corrosive effect of high-profile cases on a prosecutor’s judgment and sense of decency. Even before the players were indicted, the district attorney had played to the passions surrounding a black stripper’s allegations that she had been raped by affluent white college boys. Nifong called the Duke players “a bunch of hooligans” and promised that he would not allow “Durham in the mind of the world to be a bunch of lacrosse players from Duke raping a black girl in Durham.”
But he had a problem. The accuser kept changing her story, and there was no evidence of a gang rape. In addition to his prejudicial comments, Nifong was accused of withholding test results showing that DNA found on the woman’s body and underwear came from at least four unknown males — but none of the 46 lacrosse team members.
Nifong isn’t the first prosecutor who, in his words, “got carried away” in the glare of television lights. In 1921, the silent-film star Roscoe “Fatty” Arbuckle was tried for the alleged rape and murder of a 30-year-old showgirl named Virginia Rappe during a party in a hotel suite. The San Francisco district attorney, Matthew Brady, faced a situation almost identical to Nifong’s: His chief witness was less than credible.
Rappe’s friend Maude Delmont dramatically described how Arbuckle had dragged Rappe into the bedroom, gleefully proclaiming, “I’ve waited five years to get you.” She insisted that she spoke with Rappe three days later, just before the young woman died (of peritonitis caused by a ruptured bladder), and related the too perfect account of how Rappe yelled, “I’m hurt, I’m dying. He did it, Maude.” In reality, rather than staying by her dying friend’s bedside, Delmont had run to send a telegram to friends that read: “We have Roscoe Arbuckle in a hole here. Chance to make some money out of him.”
It didn’t matter. Brady was hooked. Like Nifong’s conflicting DNA report, the coroner’s report in the Arbuckle case found “no marks of violence . . . and absolutely no evidence of a criminal assault, no signs that the girl had been attacked in any way.” Just as Nifong insisted that he had clear evidence against the lacrosse players, Brady released a statement (soon after receiving the coroner’s report) saying that the evidence “shows conclusively that either a rape or an attempt to rape was perpetrated.” Notably, when Arbuckle was finally acquitted in a third trial, the jury issued a written apology for the “great injustice . . . done him.”
The Duke case also has some striking resemblances to the trial of the so-called Scottsboro Boys. This case of prosecutorial abuse stemmed from a fight on the evening of March 25, 1931, in which a group of black youths threw a group of white boys off a freight train in northern Alabama. When police pulled the black boys off the train, they found two white girls dressed in men’s clothing also riding the train. The girls claimed that they had been held against their will, beaten and raped by the black youths.
Like Nifong, the Scottsboro prosecutors ignored the conspicuous absence of forensic and medical evidence supporting the rape charges — particularly the lack of bruises or torn clothing. (One girl later admitted that they had made up the story to avoid getting in trouble with the law themselves.) All nine Scottsboro defendants were convicted in one-day trials and sentenced to death, with the exception of a 13-year-old boy who was spared death by one holdout juror. (After the Supreme Court intervened and after multiple trials and pardons, the accused were released years later.)
This abuse occurred because the critical safeguard of prosecutorial discretion — the decision whether to pursue a case — didn’t protect the suspects. Despite what you see on television, the chances of being convicted in a criminal case are extremely high. Grand juries are said to be willing to “indict a ham sandwich,” and it’s not uncommon for prosecution offices to have conviction rates of 90 percent or higher. Some prosecutors grow callous and cavalier about their role. When told that he had secured the death penalty against an innocent man, a Texas prosecutor once reportedly boasted that “any prosecutor can convict a guilty man; it takes a great prosecutor to convict an innocent man.”
History is rife with such “great prosecutors” convicting the innocent to satisfy the public. In the 1913 Leo Frank trial, Atlanta chief prosecutor Hugh Dorsey pursued a Jewish factory superintendent for the rape and murder of 13-year-old factory worker Mary Phagan. It was a period of intense anti-Semitism, with crowds chanting “Kill the Jew” outside the courtroom. Prosecutors ignored the fact that all the evidence pointed to a janitor, Jim Conley, as the killer. Instead, they repeatedly rewrote Conley’s conflicting statements to help him manufacture a coherent account for trial. Conley was identified years later as the killer by a witness, but it was too late for Frank. He was kidnapped from prison by vigilantes (including many leading lawyers) and hanged near Mary’s grave.
Prosecutors are sworn to protect the rights of the accused as well as the accuser, to refuse to pursue cases that would not serve the interests of justice. Yet in today’s environment, it appears that prosecutors can never be too tough, the way models can never be too skinny.
Consider the career of Nancy Grace. Before becoming a CNN and Court TV anchor, she was a notorious prosecutor in Atlanta. In a blistering 2005 federal appeals opinion, Judge William H. Pryor Jr., a conservative former Alabama attorney general, found that Grace had “played fast and loose” with core ethical rules in a 1990 triple-murder case. Like Nifong, Grace was accused of not disclosing critical evidence (the existence of other suspects) as well as knowingly permitting a detective to testify falsely under oath. The Georgia Supreme Court also reprimanded her for withholding evidence and for making improper statements in a 1997 arson and murder case. The court overturned the conviction in that case and found that Grace’s behavior “demonstrated her disregard of the notions of due process and fairness and was inexcusable.” She faced similar claims in other cases.
You might have expected Grace to suffer the same fate as Nifong. Instead, she has her own show on CNN, and the network celebrates her as “one of television’s most respected legal analysts.” On TV, she displays the same style she had in the courtroom. (In the Duke case, her presumed-guilty approach was evident early on, when she declared: “I’m so glad they didn’t miss a lacrosse game over a little thing like gang rape.”) The Grace effect is not lost on aspiring young prosecutors who struggle to outdo one another as camera-ready, take-no-prisoners avengers of justice. Grace’s controversial career also shows how prosecutors can routinely push the envelope without fear of any professional consequences. Often this does not mean violating an ethics rule, but using legally valid charges toward unjust ends.
Take the case of Genarlow Wilson. An honors student and gifted athlete, he was preparing for college in 2005 when he was charged in Georgia with aggravated child molestation for having consensual oral sex with a 15-year-old girl.
Though Wilson was only 17, Douglas County District Attorney David McDade and Assistant D.A. Eddie Barker secured a 10-year sentence for an act committed by thousands of teenagers every year. It’s not a crime in most states, and Georgia recently reduced it to a misdemeanor. But the prosecutors are now fighting a judge’s efforts to release Wilson. They can’t be charged on ethical grounds, but they’ve used the criminal justice system to brutalize a young man who should have received a stern parental lecture, not a 10-year prison term.
Nifong’s disbarment may deter some prosecutorial abuse, but until less visible cases are subjected to more scrutiny, it may prove to be an isolated event — driven by the same publicity that led to the abuse in the first place. If the case hadn’t been so high-profile, it’s doubtful that Nifong would have been charged, let alone disbarred, for his misconduct. The Duke case should teach us that a truly fair criminal justice system must strive to protect the rights of the accused as vigorously as it does those of the accuser.

The Criminalization of America

Published March 2007

Texas Rep. Wayne Smith is tired of hearing about parents missing meetings with their children’s teachers. His proposed solution is simple: Prosecute such parents as criminals. In Louisiana, state Sen. Derrick Shepherd is tired of seeing teenagers wearing popular low-rider pants that show their undergarments — so he would like to criminally charge future teenagers who are caught “riding low.”

Across the USA, legislators are criminalizing everything from spitting on a school bus to speaking on a cellphone while driving. Criminalizing bad behavior has become the rage among politicians, who view such action as a type of legislative exclamation point demonstrating the seriousness of their cause. As a result, new crimes are proliferating at an alarming rate, and we risk becoming a nation of criminals where carelessness or even rudeness is enough to secure a criminal record.

There was a time when having a criminal record meant something. Indeed, it was the social stigma or shame of such charges that deterred many people from “a life of crime.” In both England and the USA, there was once a sharp distinction between criminal and negligent conduct; the difference between the truly wicked and the merely stupid.

Legislators, however, discovered that criminalization was a wonderful way to outdo one’s opponents on popular issues. Thus, when deadbeat dads became an issue, legislators rushed to make missed child-support payments a crime rather than rely on civil judgments. When cellphone drivers became a public nuisance, a new crime was born. Unnecessary horn honking, speaking loudly on a cellphone and driving without a seat belt are only a few of the new crimes. If you care enough about child support, littering or abandoned pets, the logic goes, you are expected to care enough to make the offense a crime.

High crimes

Consider the budding criminal career of Kay Leibrand. The 61-year-old grandmother lived a deceptively quiet life in Palo Alto, Calif., until the prosecutors outed her as a habitual horticultural offender. It appears that she allowed her hedge bushes to grow more than 2 feet high — a crime in the city. Battling cancer, Leibrand had allowed her shrubbery to grow into a criminal enterprise. (After her arraignment and shortly before her jury trial, she was allowed to cut down her bushes and settle the case.)

Of course, it is better to be a criminal horticulturalist than a serial snacker. In 2000, on her way home from her junior high school in Washington, D.C., 12-year-old Ansche Hedgepeth grabbed some french fries and ate them as she went into the train station. In Washington, it is a crime to “consume food or drink” in a Metrorail facility. An undercover officer arrested her, searched her and confiscated her shoelaces.

Running out of adult targets, many state laws pursue the toddler and preteen criminal element. In Texas, children have been charged for chewing gum or, in one case, simply removing the lid from a fire alarm. Dozens of kids have been charged with everything from terrorism to criminal threats for playing with toy guns or drawing violent doodles in school.

In the federal system, Congress has been in a virtual criminalization frenzy. There are more than 4,000 crimes and roughly 10,000 regulations with criminal penalties in the federal system alone. Just last year, Congress made it a crime to sell horse meat for human consumption — a common practice in Europe where it is considered a delicacy. Congress has also criminalized such things as disruptive conduct by animal activists and using the image of Smokey Bear or Woodsy Owl or the 4-H club insignia without authorization.

The ability to deter negligence with criminal charges has always been questioned by academics. Negligent people are, by definition, acting in a thoughtless, unpremeditated, or careless way. Nevertheless, prosecutors will often stretch laws to make a popular point — even when the perpetrators have suffered greatly and shown complete remorse.

In 2002, Kevin Kelly was charged criminally in Manassas, Va., when his daughter, less than 2 years old, was left in the family van and died of hyperthermia. With his wife in Ireland with another daughter, Kelly watched over their 12 other children. He relied on his teenage daughters to help unload the van and did not realize the mistake until it was too late.

The suggestion that people like Kelly need a criminal conviction to think about the safety of their children is absurd. Kelly was widely viewed as a loving father who was devastated by the loss. The conviction only magnified the tragedy for this family. (Though the prosecutors sought jail time, Kelly was sentenced to seven years’ probation, with one day in jail a year to think about his daughter’s death.)

The cost to all of us

The criminalization of America might come as a boon for politicians, but it comes at considerable cost for citizens and society. For citizens, a criminal record can affect everything from employment to voting to child custody — not to mention ruinous legal costs.

Yet it now takes only a fleeting mistake to cross the line into criminal conduct. In Virginia, when a child accused Dawn McCann of swearing at a bus stop, she was charged criminally, as have other people accused of the crime of public profanity.

Our insatiable desire to turn everything into a crime is creating a Gulag America with 714 incarcerated persons per 100,000 — the highest rate in the world. Millions of people are charged each year with new criminal acts that can stretch from first-degree murder to failing to shovel their sidewalks.

We can find better ways to deal with runaway bushes, castaway pets, or even potty-mouth problems. Congress and the states should create independent commissions to review their laws in order to decriminalize negligent conduct, limiting criminal charges to true crimes and true criminals. In the end, a crime means nothing if anyone can be a criminal.

The Feres Doctrine: What Soldiers Really Need Are Lawyers

The president and Congress have been falling over themselves to pledge better care for our wounded veterans in the wake of the scandal over “squalid” conditions at the Walter Reed Army Medical Center that included mold, rats, cockroaches, rotting walls and callous treatment of patients. The president has impaneled the perfunctory “blue-ribbon commission.” The hospital walls have literally been whitewashed, so politicians can use them again as backdrops for speeches about “nothing being too good” for our troops. Yet no one is talking about the one thing that soldiers and sailors are most desperately lacking: They don’t need another spit and polish; they need lawyers.
For decades, our military members have been barred from suing the government for medical malpractice and other forms of negligence. Whether it is a military doctor cutting off the wrong leg or a military gasoline station cutting a brake line, military personnel are not allowed to seek legal relief as other citizens can. The result is that they are victims of grotesque forms of negligence that have not been widely seen in the civilian world for more than a hundred years. In the civilian system, the threat of a lawsuit serves as a critical deterrent to negligence by the government, companies and others. A rational actor will avoid liability costs by taking measures to minimize accidents.

Most Americans do not know that we deny our servicemembers the basic right to sue when they are injured by negligence. They live in a type of tort-free zone where their injuries are subject to relatively minor levels of compensation. With the silent approval of Congress, we have created a system of discount citizens who become easy fodder for incompetent or even criminal actors. Indeed, killing a soldier on an operating table or in a military recreation area is a virtual bargain at a fraction of the cost of a full-value citizen.

The military’s loss of legal protections is the result of a 1950 Supreme Court ruling on a series of cases that became known collectively as the Feres Doctrine. It was named after Army Lt. Rudolph Feres, who died in a fire allegedly caused by an unsafe heating system in his New York barracks. In this and later opinions, the Supreme Court interpreted the Federal Tort Claims Act to effectively bar any tort actions by servicemembers, even though Congress exempted only “combat-related” injuries. The court unilaterally decided that even injuries in peacetime that are far removed from any combat-related function are still “incident to service.” Thus, in one of the Feres cases, a soldier was barred from suing after an Army doctor left a 30-by-18-inch towel inside him marked as property of the “Medical Department U.S. Army.”

Little deterrence

As a result of the Feres Doctrine, there is little deterrence for military negligence beyond self-regulation, bad publicity or a political scandal. Because most accidents are isolated and military personnel tend to stay within the chain of command, these are relatively low risks for military tort-feasors. Moreover, since such accidents are not litigated, there is no reliable system to determine the rate of accidents in the massive military complex. Thus, we cannot reliably compare the accident rates in military recreational or medical facilities with those of their civilian counterparts.

The military medical system is a prime example of what happens when patients are stripped of their legal protections. The military has long had many talented and dedicated doctors and nurses. Nevertheless, it also has long been plagued by scandals involving everything from doctors without medical licenses to medical treatment that borders on the medieval. Consider a few examples from the military malpractice-free zone:

•Lt. Cmdr. Walter Hardin spent 11 months with red lesions from his legs to his torso that a doctor classified as eczema. It was correctly diagnosed as cancer shortly before he died.

•Sailor Dawn Lambert had to have a fallopian tube removed, but military surgeons left five sponges and a plastic marking device in her abdomen. They remained there for months until resulting complications forced a second surgery to remove her other fallopian tube, leaving her infertile. She was given $66 monthly in disability pay.

•Linda Branch lost her husband, who was serving in the Air Force, after he was turned away twice by a military hospital that told him his intense stomach pains were nothing more than stomach flu. He died of a bowel obstruction.

•Navy Petty Officer Joe Cragnotti went to a military hospital with pneumonia, which is treatable with antibiotics. The doctor left it untreated, and Cragnotti suffered brain damage.

•Air Force Staff Sgt. Dean Patrick Witt had appendicitis but was repeatedly misdiagnosed and sent home with some antibiotics. When he finally collapsed at home, he was rushed into surgery. He came out brain-dead. It’s alleged that a series of malpractice errors led to his death, including the use of a pediatric rather than an adult device to open an airway when he had trouble breathing.

When civilian doctors leave a patient paralyzed or crippled for a lifetime of care, the family members often receive millions in compensation. In the military, the families receive a couple of thousand dollars a month and, you guessed it, more military medical care. Dorothy Meagher found herself caring for her son after he went in to have a cyst removed at a Navy hospital. Her family alleged that, due to an overdose of anesthetics and the failure of a Navy doctor to immediately call for assistance, her son was left a quadriplegic.

Unanswered questions

Many families in the military never know that they were the victims of malpractice because, without discovery, there is no routine way of forcing such disclosure. For example, Army Staff Sgt. Michael McClaran had a simple surgery for acid reflux. He said he was not told that the surgeon had severed two critical nerves — the cause of chronic respiratory and digestive problems.

Feres extends beyond medical malpractice. It bars lawsuits in a vast array of activities in such areas as travel, recreation, housing, restaurants, bars and service stations — military enterprises often run in competition with civilian businesses. Thus, when a rented water ski proves defective or a soldier is raped at a concert, the military invokes Feres and walks away immune from its own negligence.

Liberals and conservatives on the court — such as Justices John Paul Stevens and Antonin Scalia — have denounced the court’s continued use of this doctrine, as have dozens of lower court judges. This doctrine has done more harm to military personnel and families than any court-made doctrine in the history of this country.

Congress must amend the Federal Tort Claims Act to put an end to this disastrous doctrine. We can no longer afford to leave our servicemembers in the hands of politicians who express shock every 10 years as new scandals regularly emerge. Some lawmakers knew of the appalling conditions at Walter Reed but took no legislative action.

The fact is that military hospitals are often treated as little more than a reservoir of human props for political photo ops. The only other part of Reed that members of Congress routinely visit is the VIP ward on the top floor. Known as the Eisenhower Executive Nursing Suite, it’s where high-ranking politicians, jurists, generals, admirals and diplomats are treated. Of course, the politicians, judges and foreign dignitaries are allowed to sue for any negligence.

Former senator Bob Dole, who co-chairs the new blue-ribbon commission, was treated there and recently noted that he never saw anything to complain about. That is not surprising since, unlike the vermin-infested and mold-covered rooms of wounded soldiers, politicians are given suites that include fine carpets, antique furniture, separate dining rooms and fine china.

If members of Congress truly want the best for our troops, they should start by giving them the same legal protections that the members themselves enjoy. No one is asking for Congress to treat our soldiers as high-value VIPs, but simply as full-value citizens with the same protections as the people they are defending around the world.

Jonathan Turley is the Shapiro Professor of Public Interest Law at George Washington University and a member of USA TODAY’s board of contributors. He is the author of a three-part study of the military, including its legal and medical systems.