Cover Story

Erasing the News: Should some stories be forgotten?


Photo Illustration by Brenan Sharp

In September 2016, a newly formed committee of four editors at the Tampa Bay Times held the first of what are planned as quarterly meetings to develop policies for requests to remove or alter stories in online archives. This is yet another disruptive twist for journalism in the digital age: the possibility of erasing the historical record.


The committee at Florida’s largest newspaper, based in St. Petersburg, acted on one such plea at its first gathering: A woman asked it to delete a years-old story in which, at age 19, she had spoken with a reporter while interviewing for a job with a “naked maids” cleaning service. The woman now works in the more traditional business world, and the paper’s managing editor, Jennifer Orsi, thought it wasn’t fair for that episode in her life to define her now.

“Sometimes people don’t realize something may come back at them in ways they don’t expect,” says Orsi, who spearheaded the creation of and serves on what she describes as, “for lack of a better term: the Web Content Review Committee.”

Consideration of such a request was unheard of before the internet, when news stories were clipped and folded into pouches filed in newsroom libraries for use by staff. For the public, newspaper archives went onto microfilm or microfiche in libraries, searchable only by approximate publication date (if known) and viewable only as entire news pages at a time. That is nothing like the precise, instantaneous results now expected online. The historical record was there, but access was never as easy or ubiquitous as the click of a mouse.

Now, an obscure story from decades ago might appear on the first screen when a name is run through a search engine. And increasing numbers of people, sometimes represented by lawyers, want such stories “unpublished”—the word was coined for this phenomenon.

The Times editors decided to remove the story, but mitigating circumstances and complications might make it a one-off. The feature had been published by the newspaper’s longtime competitor, the Tampa Tribune, which the Times acquired in May 2016 and whose archives it inherited. So the Times didn’t do the reporting, and no one on staff knew the original circumstances, Orsi explains.

“Traditionally, our policies don’t change concerning what’s published online,” she says. “The feeling is don’t unpublish the newspaper; don’t change online archives. But over the years, we’ve had some requests where it is more difficult to say no or involve reasons we hadn’t contemplated before the internet had been around for a while.”

Jennifer Orsi. Photograph courtesy of Gravitas Magazine.

Anecdotal evidence indicates that requests to unpublish have picked up in the United States since the Court of Justice of the European Union created the so-called right to be forgotten in 2014. It is binding law for citizens of the European Union’s 28 member states.

Significantly, the court in Google Spain didn’t require a Spanish newspaper to delete from its archives foreclosure notices published years earlier. It instead said the plaintiff could ask a search engine to delist items that are “inadequate, irrelevant or no longer relevant, or excessive in relation to the purposes of the processing” and in the “light of the time that has elapsed.”

Google sometimes casts itself as the card catalog in a huge library, and the EU court’s decision left the book on the shelf while requiring removal of the card from the index. (Critics argue that Google’s role is more involved and complex: the search engine’s secret algorithmic rankings, which select some results over others, amount to curation.) A search of the man’s name on the newspaper’s website would still turn up the foreclosure notice.

WHOSE LAW APPLIES?

Google soon developed procedures to handle requests to delist stories in Europe, although it applied them only in the country where the person resides: Information about someone in France, for example, would be removed from results on that country’s domain, google.fr. But in March 2016, France’s privacy regulator, the Commission Nationale de l’Informatique et des Libertés, fined Google for not heeding its order to scrub the links worldwide, including on google.com.

Last May, Google appealed what it considers an attempt to apply French law extraterritorially, forcing a kind of censorship on people in other nations. A decision in the case is expected this year.

“France has no territorial jurisdiction over the U.S., but it’s purporting to tell Google to delete content from the U.S. market, the Canadian and Mexican markets, and others,” says Jonathan Peters, a lawyer who teaches journalism at the University of Kansas. He chairs the First Amendment subcommittee of the civil rights litigation committee in the ABA Section of Litigation.

“You have to worry about what message that sends to more autocratic regimes around the world,” Peters says. “Frankly, if you tried to create the right to be forgotten in the U.S., it would instantly violate the First Amendment.”

The Reporters Committee for Freedom of the Press and 29 other U.S. news media organizations had sent a letter to the French agency, the CNIL, before it slapped Google with the nominal fine and an ultimatum, calling the demand on the search engine behemoth an “unacceptable interference with what people in other nations can post and read on the internet.”

Sidebar: For Sale: Your Personal Data

Although the First Amendment’s talismanic sway in the United States isn’t likely to diminish anytime soon, the push and pull between privacy and free speech is increasingly playing out here as the right to be forgotten becomes a bigger part of the debate.

A comparison of two cases, one here and one in Norway in 2009, illustrates an underlying difference in mindsets between the United States and Europe.

In Norway, a woman convicted of murder won a privacy lawsuit against a newspaper that published a photograph of her, in tears, outside the courthouse moments after the verdict in the high-profile case. The European Court of Human Rights decided in Egeland v. Norway that it was “a particularly intrusive portrayal” of her, to which “she had not consented.”

But here, the 2nd U.S. Circuit Court of Appeals at New York City ruled in 2015 in Martin v. Hearst Corp. that a woman, whose arrest records were expunged under Connecticut’s erasure law after prosecutors dropped charges against her, could not force a newspaper to remove stories about her arrest from online archives. The court noted that, although the woman can now legally swear under oath that she has never been arrested, the statute “cannot undo historical facts or convert once-true facts into falsehoods.”

The appeals court said that although the story about the arrest did not include an update about the case being dropped, it “implies nothing false about her.” But that points to another aspect of digital archives: the relative ease of updating.

Canada’s largest newspaper, the Toronto Star, will put subsequent information about acquittals or dropped charges at the top of an online story in boldface type when someone requests it and provides proof. Other publications with similar policies tend to put the updates at the bottom of stories, says Kathy English, the Star‘s public editor.

But as a general rule, the Star won’t unpublish except in cases in which the story is inaccurate or for legal reasons. Most requests concern crime stories.

“Now, people are really angry when they come to me, and they sometimes bring lawyers in,” English says. “There are cases that keep me up at night thinking about these people because they are so desperate.”

Google headquarters in Mountain View, California. Shutterstock.

In 2009, English conducted a study of how news publications were handling requests to change online archives as part of the Associated Press Managing Editors online credibility project (now named the Associated Press Media Editors). Her report, The Longtail of News: To Publish or Not to Publish, included the results of survey questions answered by 110 news editors in Canada and the United States, providing statistics on whether, when and how they might update archived stories, change them or even unpublish them.

While most news organizations indicated they were generally against unpublishing stories, 78.2 percent said they should sometimes do so; 67 percent said they would when information in them is inaccurate or unfair; 20.9 percent said they might delete a story when, although accurate, it has outdated information that could damage the person’s reputation in the community.

“It was such an intriguing issue,” English says of her project of about eight years ago. “But I’m beginning to wonder whether we, as journalists, have really thought through the human implications of this, where a story never disappears. I just don’t know what the alternative is.”

Kathy English. Photograph courtesy of the Toronto Star.

As news organizations and Google began to wrestle with demands to make information disappear, new forces came into play, including reputation management companies and the use of search engine optimization to change results. Businesses, such as ReputationDefender, charge thousands of dollars to push negative results downward in Google searches by seeding the web with positive links that rise to the top.

But that strategy can backfire. The University of California at Davis was hammered in 2016 news reports for having paid a total of $175,000 to two reputation management companies to scrub the web of negative search results concerning a 2011 incident, captured on video, in which campus police pepper-sprayed student protesters sitting peacefully on the ground. Several state legislators called for the already-embattled university president to resign.

For its part, Google produces a regularly updated transparency report, available online, with statistics on the requests it receives from governments and others to hand over or remove information, and on how law and policy affect its data. The report recently showed that since 2014, when Google implemented its European process for requests to remove search results, it had received at least 647,000 such requests, concerning about 1.8 million URLs. It had removed 43.2 percent of those URLs, with the largest share from any single site, about 15,000, coming from Facebook.

Google has been chary of providing details about the circumstances of those removals. But in 2015, researchers found the search engine giant had accidentally left some code in its transparency report that provided just that. The hidden information showed that less than 5 percent of requests were from criminals and high-level public figures (lower than expected), with 95 percent of them made by ordinary citizens, according to Julia Powles, a researcher at the University of Cambridge Faculty of Law who specializes in the law and policy of data sharing and privacy.

In a chart that accompanied a piece she wrote for Slate in 2015, examples of 15 successful requests ran beside a list of 16 that were rejected. From the descriptions of individual cases, it appears Google generally finds in favor of people who were swept up in a news story through no fault of their own, but not of those simply irked by negative stories or accurate information about themselves.

Kathy English, the Toronto Star’s public editor, with a colleague. Photograph courtesy of the Toronto Star.

Two examples: A woman whose husband had been murdered decades earlier succeeded in having an article that included her name delisted. But a media professional lost his bid to delist four links to news stories reporting embarrassing content he had posted on the internet.

NEW LAW STEPS IN

In the United States, the First Amendment typically holds sway in the face of challenges to web content widely recognized as reprehensible, such as revenge porn and mug shot shakedowns. Legislative remedies for those specific categories are increasing.

With significant increases in the past couple of years, 34 states and the District of Columbia have criminalized various aspects of revenge porn, and federal legislation was proposed last summer, according to the Cyber Civil Rights Initiative. The CCRI was started in 2013 by Holly Jacobs, herself a victim. It has grown into a support network and information clearinghouse, and has advised many of the states and Congress on legislative proposals.

The CCRI also was influential in getting Google to announce in June 2015 that it would act on requests to delist nonconsensual nude or sexually explicit images.

“It was a watershed, and created a mini-self-governing right to be forgotten when there’s NCP,” or nonconsensual porn, says Carrie Goldberg, a New York City lawyer who built a practice in revenge-porn law after being victimized herself and finding no lawyer who really knew the area.

Marc Rotenberg. Photograph courtesy of Electronic Privacy Information Center.

“We know how to locate content and get it removed swiftly,” says Goldberg, who is on the CCRI board of directors.

An arguably seedy internet business—publishing mug shots obtained from law enforcement agencies and charging fees for their removal—has survived in the courts. But since 2013, 14 states have passed legislation to rein it in, according to the National Conference of State Legislatures. Because of concerns about legitimate First Amendment rights to public records, the statutes typically prohibit the sites from charging removal fees, or bar law enforcement agencies from providing photos to sites that charge such fees.

CARVING OUT ANOTHER PATH

U.S.-based proponents of increasing the scope of the right to be forgotten point to areas of law and policy that already do so in certain categories. These include the Fair Credit Reporting Act’s 10-year limit on reporting bankruptcies and seven-year limit on civil judgments in credit reports, as well as expungement laws and, more routinely, the practice of not reporting criminal convictions of juveniles.

In a more recent example, California’s so-called online eraser law, which went into effect in 2015, requires various kinds of online services to remove, upon request, content that someone posted while under 18. And the “ban the box” movement, which seeks to strike questions about criminal history from job applications and move any background check to later in the hiring process, after qualifications have been assessed, is growing fast. So far, 24 states have such policies in place, and President Barack Obama instructed federal agencies to adopt the practice in 2015.

“The U.S. has a robust tradition of expungement and concealing bankruptcy and juvenile crime and more,” says Marc Rotenberg, a professor at Georgetown University Law Center and president and executive director of the Electronic Privacy Information Center in Washington, D.C. He thinks that if everyone’s personal history is readily available on the internet, including their worst or most embarrassing moments in life, it has a chilling effect on free expression.

Carrie Goldberg. Photograph courtesy of ©Vassar College—Samuel Stuart Photography.


“This country, from the beginning, has been about a second chance,” Rotenberg says. “Culturally, I think the differences [from Europe] have been overstated.”

But this should not be for Google or other search engines to decide, says Jonathan Zittrain, a Harvard Law School professor and faculty director of the Berkman Klein Center for Internet & Society at Harvard University.

If a nation enacts substantive policy for a right to be forgotten, he explains, “it should offer itself the apparatus to judge people’s claims under the policy. To delegate to Google that decision-making authority—indeed to insist upon it—privatizes what should be public law,” says Zittrain, noting that otherwise the search engines will have incentive to simply delist information rather than risk appeals and fines. “That’s an awful way to design an adjudicatory system.”

Eugene Volokh, a UCLA School of Law professor, argues that carving out parts of the historical record can be a slippery slope. He thinks, for example, that the removal rules for credit reports might be unconstitutional.

“Let others be the judge of whether you are a different person now,” Volokh says. “If I’m going to hire a baby sitter or other employee, I might want to know what someone has done.”

(Volokh was commissioned by Google in 2012 to write a paper responding to the possibility of antitrust regulation of its search results, which critics said might be used in conjunction with advertising that promotes Google’s own products.)

More recently, Volokh investigated the disturbing possibility that some reputation management companies are using fake plaintiffs to sue fake defendants for libel, so the two parties can quickly agree to a court injunction under which the defendant consents to remove comments or other postings from the internet.

In the United States, Google will delist a webpage when a court has weighed in. Working with Paul Alan Levy, a lawyer with the litigation group at Public Citizen, Volokh found more than 25 such suits around the country, including some in which reader comments were faked as a pretext for targeting webpages as defamatory.

Jonathan Zittrain. Photograph by Arnold Adler.

“The court doesn’t expect a fake lawsuit because you can’t get real money from a fake defendant,” says Volokh, who has written extensively about the problem on his blog, The Volokh Conspiracy. “The court sees a stipulation by the supposed defendant and often rubber-stamps it with an order that Google then implements. That’s a serious problem not just for the right to be forgotten but for defamation law itself.”

In a case of more customary abuse of process, a federal court in August 2016 threw out a lawsuit by a man who claimed that potential employers didn’t hire him because searches of his name on Google, Microsoft and several other online search providers turned up information about his pattern of suing former employers.

The court in the Western District of Pennsylvania adopted a magistrate judge’s report and recommendation in Despot v. Baltimore Life Insurance Co., which noted that the man had a “pattern of filing conclusory complaints against former and prospective employers,” and that because of his history in the federal courts, “his pro se status does not save his complaint.”

Sham lawsuits and unfounded ones are to be expected in the debate now underway in the United States regarding the right to be forgotten. The issues are complex, and the solutions are never easy.

“The main idea I try to get across is that there are going to be a lot of close calls, but there should not be in American law a blanket denial or a refusal of a right to be forgotten,” says Frank Pasquale, a professor at the University of Maryland Francis King Carey School of Law who specializes in the social implications of information technology.

Given our First Amendment and the underlying reverence for free speech, the path forward will be winding and largely ad hoc, albeit with attempts to formalize policy, as with the special committee at the Tampa Bay Times. But new norms surely will develop as speech and privacy find a balance in response to disruptive technology.


This article originally appeared in the January 2017 issue of the ABA Journal with this headline: "Erasing the News: The media and lawyers wrestle with the question: Should some stories be forgotten?"
