National Pulse

Websites and apps for sharing crime and safety data have become outlets for racial profiling


Image: The social media icon for the Nextdoor app. (fyv6561 / Shutterstock.com)

When Shikira Porter learned that the social networking site Nextdoor was coming to her community in Oakland, California, she couldn’t wait to join. The site offers a private platform for users to meet and share information about their neighborhoods.

Not long after joining, however, Porter began to witness a disturbing trend among her neighbors.

“I started noticing under the crime and safety portion of the platform that folks were profiling,” says Porter, a director at a Marin County homeless shelter. She saw that users were disproportionately accusing black men of suspicious activity on the site. Porter brought this to Nextdoor’s attention and soon after co-founded Neighbors for Racial Justice to target this kind of bias.

According to Porter, Nextdoor users have singled out minorities for engaging in “suspicious” activities such as walking down the sidewalk, driving down a street, making a U-turn and driving too slowly.

Racial profiling, while not new, has found new outlets through web-based programs and apps that gather unfiltered information about crime and suspicious activity. The result is that users are reinforcing existing racial bias in the criminal justice system, sometimes inadvertently, experts say.

“Implicit bias refers to attitudes or stereotypes that influence how we move through the world, but in an unconscious manner,” explains Cheryl Staats, a senior researcher at the Kirwan Institute for the Study of Race and Ethnicity at Ohio State University. 

BIASED DATA

Anyone can exhibit implicit biases, as they are created by our experiences and culture. These in turn can unconsciously influence our snap judgments about a stranger’s perceived identity, according to Staats. This does not mean people intend to act in a biased way; rather, their brains unconsciously equate darker skin with criminality and danger, prompting them to report accordingly.

Implicit or explicit bias that translates to reporting suspicious activity can affect the accuracy and use of crime-related data. “The more these biases are allowed to seep in, the less representative the data are of what actually happened, and the less useful they are for policy and research,” says Jennifer Doleac, an assistant professor of public policy and economics at the University of Virginia.

In the San Francisco Bay Area, commuters can report suspicious activities through the BART Watch App. It’s a smartphone tool that sends information about incidents or suspicious activity directly to Bay Area Rapid Transit police in California, though it is not meant as an alternative to calling 911 in an emergency. Users can anonymously send a text, as well as a photo, to assist police. According to Benson Fairow, a deputy chief with the BART police department, the majority of reports through the app are complaints about panhandling and disruptive behavior. Versions of this app are also being used by commuters in Santa Clara, as well as Atlanta, Boston and Buffalo.

Race has become a factor in these apps as well. Darwin BondGraham, an investigative reporter for the East Bay Express, an alternative newspaper covering Oakland, learned through a public records request that of the 763 reports submitted through the BART app between April 7 and May 12, 2015, 198 mentioned the race of the suspicious person. Of those reports, 68 percent stated the suspect was black, according to BondGraham. A 2008 BART ridership survey found that only 10 percent of riders were black.

IT’S THE USERS

Other apps have served as outlets for racial bias. SketchFactor and Good Part of Town, originally called Ghetto Tracker, were independently created apps that claimed to help people avoid unsavory parts of cities. Both came under intense scrutiny for promoting profiling and being generally tone-deaf. Both projects have since been discontinued.

“The notion that technology is neutral needs to be abandoned,” says An Xiao Mina, who researches language bias in social media as a Knight Visiting Nieman fellow at Harvard. “We need to design our technologies from the ground up thinking about diversity of users.”

Mina admits that fixing technology to offset human nature “sounds like a daunting challenge.” However, the BART police and Nextdoor have both proactively battled profiling in their technology through altering their reporting policies.

While it is unclear whether BART Watch users’ bias translated into biased officer conduct, the department took concerns seriously. One change to the app removed the “other” category used in incident reporting. The category, according to Fairow, was “a dumping ground” for users to “pick on people that they didn’t like.” Reports categorized as “other” primarily referred to people who appeared or smelled homeless.

While police cannot control the information people upload through the app, Fairow says it’s important “that our officers know good, sound policing practices.” Since 2010, that has meant ongoing bias training for BART officers; the department has also volunteered its data for a forthcoming study on bias in its pedestrian and traffic stops, arrests and uses of force.

A Nextdoor representative responded to the ABA Journal’s questions about racial profiling by saying that it is “extremely rare” on the platform. Nonetheless, in January and April, Nextdoor posted on its corporate blog that it was implementing a series of changes intended to diminish user bias. The changes include a prohibition on racial profiling in the site’s guidelines, making racial profiling a reason to flag a post in violation of those guidelines, and testing new reporting forms that emphasize the suspicious act rather than the person.

PREDICTING CRIME

Beyond these crowdsourcing tools, some experts are concerned that bias in official crime reports will affect new predictive policing models: data-driven systems that proactively dispatch officers to locations where an algorithm anticipates crime will occur.

When relying on historical crime data, the concern is that an algorithm will reinforce overpolicing in areas already experiencing a high police presence. As Andrew Ferguson, a law professor at the University of the District of Columbia, points out, “If you’re using the data that is built within a system that has a racial imbalance, the resulting data will also be racially imbalanced.”

Eleazer Hunt, the manager of information services at the Greensboro, North Carolina, police department, oversaw a predictive policing pilot in 2015. The predictive policing missions were based on the department’s five-year crime reports. “I don’t know how it’s biased when we’ve been called by a citizen to take a report of a crime that occurred,” Hunt says. “It’s not like we’re going out and looking for crime in certain areas on our own and taking the reports we want to take.”

Jeremy Heffner, a senior data scientist at HunchLab, a product of Philadelphia-based Azavea and the predictive policing model used in Greensboro, says biased data can be overcome. “If there is bias in policing data, we can water it down a bit by providing other data sets that represent risk, but the output isn’t controlled by the police department itself.”

HunchLab uses not only historical crime data but also weather, the location of bars and the day of the week to determine when and where certain crimes are likely to occur. The model is tweaked as the company receives more data sets and feedback. The company also opens its algorithm to third parties for analysis.

Back in Oakland, Porter and Neighbors for Racial Justice are now working with Nextdoor to eliminate profiling on the platform. “Simply putting new guidelines in and new policies is not going to change people’s behavior,” Porter says. However, she thinks that with sustained effort, Nextdoor users can be trailblazers when it comes to eradicating racial profiling on social media.


This article originally appeared in the August 2016 issue of the ABA Journal with this headline: “Looking Suspicious: Websites and apps for sharing crime and safety data have become outlets for racial profiling.”
