For Nextdoor, Eliminating Racism Is No Quick Fix

Social media has a racism problem because humans have a racism problem. Can one company solve it?

Nirav Tolia was sitting down to lunch at a Los Angeles sushi bar when the article dropped. A trim Indian American with a photogenic face and perennially crisp shirts, Tolia, 45, is the cofounder and CEO of the social media site Nextdoor. He’d just finished taping an episode of Dr. Phil—classic Nextdoor press. The segment was about a family whose teenage son was a kleptomaniac. He’d been taking things from the neighbors. At the end of the show, Dr. Phil recommended Nextdoor as a tool that could help the neighbors communicate if, say, a garden hose went missing. These were the kinds of stories Tolia was used to doing: feel-good pieces about the feel-good things that happen on Nextdoor, a site that more than 10 million people use in three quarters of US cities and towns to connect with their neighborhood. A woman has too many tomatoes in her garden, so she’s left them on the porch for people to collect. A man is moving and he has a taupe couch to sell. The parents down the block have a great recommendation for a babysitter.

But the story that dropped that evening in March 2015, published by Fusion, was not Nextdoor’s typical press. It was a feel-bad story.

Tolia was dining with his communications director, Kelsey Grady, when her iPhone alerted them to the story’s publication. They read the lengthy feature together. It alleged that white Oakland residents were using the “crime and safety” category of Nextdoor to report suspicious activity about their black neighbors. “Rather than bridging gaps between neighbors, Nextdoor can become a forum for paranoid racialism — the equivalent of the nosy Neighborhood Watch appointee in a gated community,” wrote Pendarvis Harshaw.

Caught off guard, Tolia asked his neighborhood operations team, which handles customer service, to review Nextdoor postings. They discovered several dozen messages posted over the course of the previous year that identified suspicious characters only by race. By some measures, it was a tiny issue. The site distributes four million messages daily. But there’s no such thing as a “small” racism problem when the promise of your service is that it allows people to trust their neighbors. “If there’s something that drives communities apart, it just overturns the whole premise,” Tolia tells me. “People will feel like, ‘Oh, [Nextdoor,] that’s that place where people are racist.’”

In the last couple years, as the largest social networking sites have come of age, many of them have also become platforms for racial bias. It’s not surprising. Most social web services — like Airbnb or Facebook or Twitter — were launched quickly. Their founding teams—consisting mostly of well-off men (and the occasional woman) from prestigious universities—were not diverse. Those teams hired designers and engineers who looked like them to launch and grow the sites. These companies weren’t thinking about the way bias would influence how people use their services; they were moving fast and breaking things, content to fill in the details later. What’s more, they mostly built advertising businesses that became more successful as people provided them more social data by posting more on the sites. There was little business incentive for them to slow their users down and ask them to think about why and how they were posting—and, in some cases, to post less.

"People will feel like, ‘Oh, Nextdoor, that’s that place where people are racist.’”

In many ways, that approach was successful beyond measure. In less than a decade these services have exploded into the mainstream, replacing brick-and-mortar stores with more efficient and delightful web versions. It was only after the fact that a major shortcoming began to reveal itself: With many of these services, minorities of all types were targeted or excluded. The problem with technology that allows us to target exactly who we want is that it allows us to target exactly who we want. In other words, the problem with technology is us.

For too long, these companies took little or no responsibility, claiming they were just platforms on which behavior — both bad and good — unfolded. Twitter failed to get ahead of its abuse problem. Airbnb has yet to figure out how to definitively stop hosts from refusing African-American guests. Last week, Facebook began blocking some advertisers from excluding certain races from seeing their ads — but outside of ads for housing, employment, or credit, that practice is still fair game.

This is, point blank, a major failing of the Web 2.0 era. Nevertheless, here we are in 2017, confronted with a host of services that serve some people better than others. These companies are learning the hard way that there is no silver bullet for eliminating racial bias, and no quick web fix for calling out bad behavior.

Nextdoor has tried to eliminate racial bias on the site the same way it built its product: through smart product design. It is having some success — Tolia says the site has cut racial profiling in its crime and safety category by 75 percent. But this story is not yet finished.

Tolia and his cofounder, early Microsoft veteran Sarah Leary, never intended for Nextdoor to grow into the new neighborhood watch. Both serial entrepreneurs, they’d founded the company in 2010 after a previous startup failed. Facebook had just emerged into the mainstream, acquainting us with the idea of using our real identities to connect to people we knew. They figured Nextdoor could be a next-gen Craigslist, helping us to use those same identities to connect to our neighbors—often people we didn’t really know. The site took off from the start, and has grown into a $1.1 billion company connecting people in 130,000 neighborhoods around the world.

As with many social services, the founders launched with the intention of doing one thing and quickly discovered that users wanted to do something else. From the start, people turned to Nextdoor to discuss crime and safety concerns in the neighborhood, and that part of the site took off. By the fall of 2015, roughly one in five Oakland households used Nextdoor, and three Oakland city agencies were using the site to send out public service announcements. About a fifth of the conversations that happened in Oakland’s Nextdoor networks involved crime.

To address racial profiling in these conversations, Nextdoor first needed to understand how and where it was happening. So Tolia and Leary assembled a small team of senior people, which included Grady as well as a product manager, a designer, a data scientist, and later an engineer. Its lead was Maryam Mohit, who is director of product at Nextdoor. At 49, with a mop of curly hair and a considered approach to her speech, Mohit was a varsity techie. She’d gotten her start at Amazon, where she was on the team of people who had created the one-click patent. Mohit believed the issue could be addressed through smart product design. Every day, she’d bring home reams of postings. She read them late at night. She sifted through batches of them on Saturday mornings in her bathrobe and slippers, before she heard the patter of her children’s feet in the hallway. “I must’ve read thousands of anonymized posts to see what was actually happening,” she says. “It felt urgent.”

By the late fall of 2015, Nextdoor had landed on its initial three-part fix. First off, it put the entire neighborhood operations team through a diversity training workshop so that its members could better recognize the problem. Second, it updated the company’s community guidelines, and reintroduced them to its users through a blog post. Both moves were well received by people in the Nextdoor community.

The third part of the solution flopped. Nextdoor already allowed people to flag inappropriate posts; members often used the system to flag commercial posts, for example, when they appeared in noncommercial areas of the site. Now the company added a “racial profiling” category, inviting users to report the behavior if they saw it. The problem was that many people didn’t understand what racial profiling was, and Nextdoor members began reporting all kinds of unrelated slights as racial profiling. “Somebody reported her neighbor for writing mean things about pit bulls,” Mohit recalls.

The team realized it needed to help users understand when it was appropriate to mention race in posts about suspicious or criminal activity. And to do that, it needed to define — very specifically — what constituted racial profiling in the first place. “We could not find a definition of racial profiling that everyone agreed on,” says Tolia. “If you go to the NAACP, if you go to the ACLU, if you go to the White House Task Force on Diversity, if you go to the Neighbors for Racial Justice, none of these people agree on what racial profiling is.”

Meanwhile, the press attention was escalating. Nextdoor members began reporting actual racial profiling more frequently, especially in several communities in California and Texas. Oakland city officials were concerned the fix wasn’t enough, and they threatened to remove the three Oakland agencies that used it to distribute information. “You rely quite heavily on your relationships with cities. It is a part of your marketing strategy to do this,” said Council Member Desley Brooks, speaking to Nextdoor’s team in front of a public audience at an Oakland city council meeting in early 2016. “If we come off, it is a bold statement that a city is not going to condone racial profiling at all. What is Nextdoor going to do to make sure that if the City of Oakland continues to use this as a tool, all of our residents are protected?”

Nextdoor counted on public agencies to use its platform; it was one of the things Nextdoor members had come to expect from the site. Now it was more than just a moral issue. The health of the business was at stake.

In late October, Tolia, Mohit, Grady, and the head of neighborhood operations, Gordon Strause, rode BART across the Bay to the 19th Street stop in downtown Oakland. Nearby, at an airy brick-walled coworking spot called Impact Hub Oakland, they met with five members of the activist group Neighbors for Racial Justice. “We were a bit nervous because we knew they were frustrated,” Grady remembers.

Nextdoor had learned about the group through an article in the local alternative weekly, the East Bay Express, and reached out to meet. The activists were wary. They were anxious to share the ideas they’d developed to help Nextdoor improve its product, but they believed Nextdoor was caving to pressure from the press and city officials, rather than authentically trying to tackle the issue. “They were completely placating us,” says Neighbors for Racial Justice leader Shikira Porter.

The conversations that happened over the weeks that followed were tough for everyone involved. Some of the activists felt that Nextdoor was paying them lip service. Some Oakland council members believed city agencies should not be distributing public information on a platform that condoned any racial profiling at all. Everyone — even Tolia — agreed that Nextdoor wasn’t moving fast enough. “This is urgent,” Council Member Brooks told the Nextdoor team at one heated January meeting.

The Nextdoor group became very familiar with BART. They traveled to Oakland to meet with council members and law enforcement officials. They spoke with representatives of the Oakland chapter of One Hundred Black Men. They began holding regular working groups in which they included these people in the product development process.

Of course, this issue was not limited to Oakland, and the team also began seeking information from national organizations. They reached out to experts at all types of institutions that had figured out how to work with people reporting crimes or emergencies. They talked to 911 operators to learn the order in which they ask callers questions so as to elicit the information most useful in an emergency. They spoke to police officers to learn how they take reports when investigating crimes.

They also studied the work of Jennifer Eberhardt, a Stanford academic who won a MacArthur Genius Award for her research on unconscious racial bias. Eberhardt consults with police departments on bias and racial justice, helping them incorporate her theoretical work into their day-to-day interactions. “The basis of her research is around something she calls decision points,” says Tolia. “If you make people stop and think before they act, they probably won’t do the racist things that they do.”

In early 2016, the Nextdoor group set out to add decision points to the posting process in the site’s crime and safety section. The point was to add friction — to make users stop and think just enough to be purposeful in their actions, without making Nextdoor so laborious to use that it drove them away.

This is tricky territory for a social media company. Most of them, including Nextdoor, make their money off the data users input, either by advertising or selling that data to other companies. Therefore, most companies are incentivized to make it ever easier for people to post, and to encourage them to post more. Open Snapchat, for example, and you get a camera, ready to snap and post. Pull up Facebook, and the box at the top of the screen asks, “What’s on your mind?” If a company makes it harder for people to post, they’ll post less, and that will have a negative impact on the bottom line.

But there are times when a bit of friction can be useful. Take LinkedIn, which asks people at signup, “Where else did you work?” The company understood, starting from its earliest days, that nudging users to have a more complete profile at the start would make the service more valuable to them over time.

Mohit’s team gathered daily in whatever conference room they could find — space was tight — to riff around the whiteboard on ideas for how to take users through a process before they posted. As they worked, they checked in with community groups through in-person working sessions and on Google Hangouts.

They developed six different variants. For example, if you show people an interstitial defining racial bias before they post, does it change the way they post? If you force people to describe a person, how many attributes do you need for it to feel like a full and fair description—not one based solely on race? Then they A/B tested the variants for three months, assigning five people to read through thousands of resulting posts. By last August, they’d arrived at a new posting protocol for the crime and safety section. They rolled it out across the site.

Not long ago, Tolia invited me by to show me the result of Nextdoor’s work. It was a busy day, and there was a buzz in the air that felt like the tangible expression of the Silicon Valley term “blitzscale.” Having recently outgrown its offices, Nextdoor had just moved into the building behind Twitter’s headquarters on Market Street. Earlier in the week, it had acquired a United Kingdom competitor, and all employees—including Mohit—had dropped everything to help transition new members to the Nextdoor platform.

Tolia slid his laptop around the table and flipped it open to walk me through the new posting process. Today, when you post to the crime and safety section of the site, you must choose whether your post describes a crime, suspicious activity, or the rather ambiguous “other” category. If it’s a crime, you are prompted to describe the incident and to check a box indicating whether you’ve already reported it to the police. If you refer to race in your description of the incident, Nextdoor’s algorithms detect it and prompt you to save this part of the description for the next screen.

On the second screen, you’re encouraged to describe vehicles as well as people. “Cars aren’t stereotyped. You’re like, ‘Oh that’s a BMW. That’s a Mercedes,’” says Tolia. If you choose to describe a person, you’re given a series of fields to fill in. If you choose to fill in race, you are required to fill in two of four additional categories — hair, top clothing, bottom clothing, and shoes.

If you don’t fill in the requested fields, you can’t post.
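Translated into code, the rule is a simple validation gate. Below is a minimal sketch, in Python, of the two checks the article describes: flagging race in a free-text incident description, and requiring two of the four extra attributes before a race-based description can be posted. The field names, keyword list, and function names here are illustrative assumptions, not Nextdoor’s actual implementation.

    # A minimal sketch of the posting rules described above, written in Python
    # for illustration; this is not Nextdoor's actual code. The two-of-four
    # requirement and the four attribute fields come from the article; the
    # keyword list and all names are assumptions.

    from dataclasses import dataclass
    from typing import Optional

    # Stand-in keyword list; the article does not say how Nextdoor's detection works.
    RACE_TERMS = {"black", "white", "asian", "hispanic", "latino"}

    @dataclass
    class PersonDescription:
        race: Optional[str] = None
        hair: Optional[str] = None
        top_clothing: Optional[str] = None
        bottom_clothing: Optional[str] = None
        shoes: Optional[str] = None

    def mentions_race(free_text: str) -> bool:
        """First screen: flag a free-text incident description that mentions race,
        so the poster is prompted to move that detail into the structured form."""
        words = {w.strip(".,;!?").lower() for w in free_text.split()}
        return bool(words & RACE_TERMS)

    def can_post(person: PersonDescription) -> bool:
        """Second screen: if race is given, require at least two of the four other
        attributes (hair, top clothing, bottom clothing, shoes) before posting."""
        if not person.race:
            return True
        others = (person.hair, person.top_clothing, person.bottom_clothing, person.shoes)
        return sum(attr is not None for attr in others) >= 2

    # A race-only description is blocked; adding two more attributes unblocks it.
    assert not can_post(PersonDescription(race="white"))
    assert can_post(PersonDescription(race="white", hair="short brown", shoes="sneakers"))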

If, instead of reporting a crime, you are reporting suspicious activity, you are shown instructions for what constitutes suspicious activity before you are directed through a similar posting process.

These extra steps have caused people to post less to this section of the site. Tolia says there are about 25 percent fewer posts. But, he points out, many of those posts shouldn’t have been on the site in the first place. “This is where you make the long-term bet where you have to feel that the content you’re eliminating is not high value,” says Tolia. “The content you’re getting at the end is higher value.”

Last fall, the City of Oakland honored Nextdoor for the work that it did to address the racial profiling on the site. Chuck Baker, a board member with the Bay Area chapter of One Hundred Black Men, had participated in Nextdoor’s working groups, and he felt the award was well deserved. “We were pretty happy,” he says. “We were excited they were really talking about this.”

Not everyone feels that way. Neighbors for Racial Justice believes that Nextdoor isn’t doing enough. From a technical perspective, they’re right—both Mohit and Tolia tell me this. For one, although a significant number of members use the service on their phones, Nextdoor hasn’t yet fully rolled out similar measures to prevent people from describing potential miscreants only by race in its iOS and Android apps. (The company says it has completed a design.) Also, much of the discussion that occurs in the crime and safety category doesn’t happen in the company’s carefully calibrated posts — it takes place in the comments that respond to those posts. Nextdoor hasn’t yet found a way to monitor those conversation threads for racial profiling. “We need to figure out a way to create friction in those modalities as well,” says Tolia.

But even if Nextdoor figures out how to carry out its current plan, activists Shikira Porter and Monica Bien will not be satisfied. They say the fix isn’t good enough, and that people should be required to list five attributes in addition to race instead of just two. And they question Nextdoor’s measure of improvement. “I don’t believe their data,” says Porter.

In addition to monitoring her Oakland neighborhood on the site, she keeps in touch with activists around the country doing the same thing. “What I see as a user is that folks are still profiling,” she says. In fact, she has noticed anecdotally that profiling has worsened in the new social climate condoned by the Trump administration.

The activists complain that now that Nextdoor has rolled out its initial design fix, it is less responsive to their concerns. “We had a two-hour phone call in October, and we got a response back three weeks ago,” says Porter.

I ask them if they believe the world would be better off without Nextdoor altogether. No, they say, not at all—Nextdoor is a valuable resource for all local communities. Porter sighs. She’s said what she’s about to tell me many times. The way our society figures out how to help people who are marginalized, she explains, is by deciding what’s best for them, rather than letting them decide for themselves. She is resigned to the fact that she will say this many more times. She is patient.

She is right. Though Nextdoor deserves the credit it has received for tackling bias at the highest level in the company and reducing racial profiling on its site by a considerable amount, it hasn’t eliminated it. And the work left to be done may come at the expense of its business interests. That raises questions every founder in Silicon Valley struggles to answer: What are the moral responsibilities of a small, fast-growing, private, venture-funded startup still eking out a business model? When doing what’s ethical clashes with the business imperatives of a company that has not yet succeeded, is there a middle ground?

“The sad reality is that unconscious bias is part of our society,” Mohit said at a heated Oakland City Council meeting last year, during the height of the controversy. “Platforms like Nextdoor sometimes act as a mirror that reflects the thoughts and fears that exist in our neighborhoods today.”

To find its way forward, the company has had to face off against one of the biggest challenges facing not only social platforms, but also all of society: how to nudge users to be their better selves.