Nextdoor's experiment to try to stop users from being racist

Almost everyone who uses the social network Nextdoor to connect and communicate with their neighbors knows the platform has a very serious problem: racial profiling. The network’s role in crime-spotting has been celebrated as its “killer feature,” but many users are too eager to associate “suspicious behavior” with the color of people’s skin. Nextdoor has been trying to figure out how to discourage racial bias from manifesting on the platform and is now rolling out changes it hopes will do that.

In October, Nextdoor CEO Nirav Tolia promised action in response to investigations of racial discrimination by its users. In May, The Verge reported that those changes were in a testing phase.

Nextdoor has become a lens for the racial bias of its users. As Pendarvis Harshaw reported for Fusion last year, black friends visiting a white friend in Oakland were reported by neighbors on Nextdoor as “suspicious” and “sketchy” because they were waiting around while trying to find the house. And when a community meeting was set up to deal with the issue of racial profiling on the site, the meeting was only open to white people.

Nextdoor is in the difficult position of trying to engineer its users to be less biased. Now, thanks to an NPR interview with Tolia, we have a little more information on how it’s doing that. In a “pilot project running in select neighborhoods across the U.S.,” a user who reports a suspicious person and includes the person’s race must also fill out a couple of other descriptive fields, such as what clothing the person is wearing or what shoes they have on.

Then, it goes to an algorithm:

An algorithm under development spot-checks the summary of the suspicious activity for racially charged terms, as well as for length. If the description is too short, it is presumed to lack meaningful detail and is unacceptable.
If a draft post violates the algorithm’s rules or skips the form’s mandatory fields, the user has to revise it; until it passes, it’s not possible to post.
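To make that flow concrete, here’s a minimal Python sketch of how such a posting gate might work. The field names (race, clothing, shoes), the flagged-term list, and the length threshold are all assumptions for illustration; the NPR report describes only the three kinds of checks, not how Nextdoor actually implements them.

```python
# Hypothetical sketch of a Nextdoor-style posting gate. The field names,
# the flagged-term list, and the length threshold are illustrative
# assumptions, not Nextdoor's actual implementation.

MIN_DESCRIPTION_WORDS = 8                      # assumed; NPR only says "too short"
RACIALLY_CHARGED_TERMS = {"thug", "sketchy"}   # placeholder word list

def validate_report(report: dict) -> list:
    """Return a list of problems; an empty list means the draft may be posted."""
    problems = []
    words = report.get("description", "").lower().split()

    # Mentioning race makes the other descriptive fields mandatory.
    if report.get("race"):
        for field in ("clothing", "shoes"):    # illustrative mandatory fields
            if not report.get(field):
                problems.append(f"'{field}' is required when race is included")

    # Spot-check the free-text summary for racially charged terms.
    for term in RACIALLY_CHARGED_TERMS.intersection(words):
        problems.append(f"description contains a flagged term: '{term}'")

    # A too-short description is presumed to lack meaningful detail.
    if len(words) < MIN_DESCRIPTION_WORDS:
        problems.append("description is too short to be meaningful")

    return problems

# A draft that names a race but skips the other fields, and whose
# description is only four words long, fails on both counts and
# would have to be revised before it could be posted.
draft = {"race": "…", "description": "two men waiting outside"}
for problem in validate_report(draft):
    print(problem)
```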

The use of the algorithm is interesting, since it’s presumably based on Nextdoor’s fairly broad definition of racial profiling. Tolia, for his part, seems cognizant that this is likely to be only a small fix.

“This is a very, very, very difficult problem in society,” he told NPR. “Do I believe that a series of forms can stop people from being racist? Of course I don’t. That would be a ridiculous statement.”

We’ll just have to wait and see whether the changes curb profiling more broadly. The pilot phase ends soon; NPR reports that Nextdoor “plans to roll out changes to its entire U.S. network in the coming weeks.”

Ethan Chiel is a reporter for Fusion, writing mostly about the internet and technology. You can (and should) email him at [email protected]
