Last year, BART released BART Watch, an app for reporting crimes on the Bay Area's main transit system. After filing a California Public Records Act request to see what people were reporting, the East Bay Express found a disturbing pattern:
Of the 763 alerts sent to BART, 198 included a description of the race of the person who was the subject of the complaint. Out of these 198 alerts indicating a person's race, 134 of them, or 68 percent, described Black people as offenders and suspects warranting a police response.
What's most telling is that the vast majority of these requests for police presence weren't prompted by any actual crime.
Passengers who sent these alerts to the police often characterized playing music, singing, dancing, talking loud or yelling, and taking up more than one seat as "disruptive behavior" by a Black person that warranted a police response. […] There were only seven "crime in progress" alerts sent through the app that identified Blacks as breaking a specific law, and among these, only one appeared to be for a potentially violent offense. The rest were apparently for drinking alcohol, smoking, and yelling on the train.
Don’t get me wrong. Those activities on public transit are definitely annoying. I hate when people play music without headphones. I hate when people are so loud they drown out the music I'm playing in MY headphones. And “manspreading” needs to stop. But I wouldn't call the police for any of these things; these little daily nuisances are part of living in the world with other people.
But to their fellow riders on BART, these minor inconveniences "warrant a police response." The article disturbed me on multiple levels. For one, it's a reminder that we're surrounded by snitches. Who needs a surveillance state when you have people requesting police backup for sleeping homeless people?
But it’s also disturbing given that black people represent such a small portion of the population in San Francisco proper—just 6%—but make up at least 20% of the offenders being reported on by BART riders.
Whenever I visit San Francisco, I feel hyper-visible - as if random people on the street are trying to figure out where I come from while assessing my threat level. Wearing a Stanford hoodie doesn't help. Being in San Francisco is the epitome of the white gaze - I might as well have "OTHER" stamped on my forehead even though I'm standing in line at Sightglass like everyone else.
This is Silicon Valley, the beating heart of the technology industry, where so many world-changing applications are being designed. The people riding BART, and presumably making these reports, are the technorati. I fear that the racism evident in these reports reflects how the designers of the digital revolution view black people - we are not participants but rather problems to be solved.
But I'm also worried about the design of the app itself. "See a black person, call the police" is exactly the opposite of the kind of app we need. Police are an escalating force, not on-call babysitters for those moments when life gets a little too real. I keep thinking of the ACLU's recent lawsuit against police in Barstow, California, who slammed a black, pregnant woman to the ground. The woman had gotten into a parking-lot dispute with a white woman, during which heated words were exchanged. Last time I checked, yelling at people from your car wasn't a crime, but police came to the scene even though there was no damage to report and no physical altercation. When the black woman accused of meanness refused to provide her ID to the officer, she was slammed to the ground and arrested.
"Notice that the officer does not ask the first witness, who is white, for her ID, but he does ask the second witness [at 4:31], who is African-American, for hers," the ACLU explains. "What role does race play in this difference in treatment?"
This guilty-until-proven-innocent view of African-Americans is present in other technological settings.
In March, Pendarvis Harshaw wrote about the "meet your neighbors" app Nextdoor and the racism that tends to flourish there. Even on the liberal side of the Left Coast, users of Nextdoor felt totally free to racially profile, assuming nefarious intent from African-Americans who appeared lost or who committed the egregious offense of knocking on the wrong door while looking for a friend's house. (With fearful fingers, I tapped my way through signing up while writing this article, and was relieved to find that my Washington, D.C. neighborhood is still very diverse and mostly concerned with 100-year-old row house problems like mysterious leaks.)
Nextdoor isn't the only application where racism seems to be part of the source code. In 2012, Microsoft made headlines with a patent for GPS directions that would route pedestrians away from certain areas - critics dubbed the proposal "the avoid-the-ghetto app." The uproar was apparently lost on the creators of Avoid Sketch, who launched a similar app in 2014 to help users steer clear of "sketchy" areas.
What these apps have in common is how technology is used to cloak racism. It is not politically correct to air those kinds of opinions in public, but thanks to the ease of an app on a phone, people can act on their own racism-influenced reasoning with the click of a button. The user never has to question whether their assumptions are incorrect or even harmful - by design, the system affirms that the issue reported is valid, leaving the humans on the other end to sort through nuisance calls to figure out what actually needs police presence. And, unlike talking to a skeptical dispatcher, the app never asks the user to justify the report.
Considering how the Bay Area was thrust into the spotlight with the 2009 murder of Oscar Grant by a BART police officer, one would think that BART riders would be a little more judicious when requesting police backup. Maybe the app should start playing the trailer for Fruitvale Station before allowing people to submit a request.
The problem, as always, isn't the existence of the app itself. The app is a platform. It's a great idea to be able to contact the BART police directly rather than having to make your way to an intercom that may or may not be working. It could really improve safety on the system.
But that safety will be compromised if there's too much noise drowning out the signal. And filling the app with racist suspicions only makes everyone less safe. Perhaps reducing bias should be an explicit part of the user-experience planning process - clarifying with an on-screen explanation what counts as suspicious activity or disruptive behavior, or even just adding a pop-up that asks "Does this incident need an immediate police response?" before a report is submitted.
BART could also add a tag specifically for "quality of life" issues that would alert the train operator instead; some of these problems could be solved more easily and quickly with a targeted intercom announcement.
Providing a better way to filter requests may also help solve longer-term problems. If numerous reports in a particular area concern people who are homeless on BART trains, perhaps the BART police can provide that data to initiatives like Lava Mae (which converts old MUNI buses into mobile showers). And later, as more data becomes available, it may make sense to have users tag each incident they report as violent or non-violent, which would help train people to report incidents more accurately.
Unfortunately, there isn't a quick hack to engender empathy or compassion. But we could start turning the tide in the fight against racism by shifting the way people think about human-centered design. Most tech workers are aware of human-centered design as a practice: pioneered by the design firm IDEO, its mandate is to understand why you are building a product in the first place. I am sure the BART Watch app isn't trying to solve for the needs of white and Asian users at the expense of black users, but that's what is happening.
In IDEO's own words, "human-centered design is all about building a deep empathy with the people you’re designing for." While IDEO's human-centered design toolkit was created for the social sector, it is easy to apply the principles to dismantling biased systems. Since the ways we incentivize or disincentivize behavior can have major real-world consequences, the designers of the next version of BART Watch could put extra effort into helping users assess the threat level of a situation, offering options for resolution that don't require police presence, and building compassion into the reporting of disturbances that come from people in need of help.
If we cared enough, we could engineer racism out of the equation - it just takes a better design.