Facebook's ad profile of you is both eerily accurate & laughably inaccurate. Here's why that's scary


You have two profiles on Facebook. And you’ve probably only seen the first one.

To find the shadow profile that Facebook has built to serve you ads, go to your Ad Preferences page. There, you’ll find dozens of interests that Facebook has assigned to you. Particularly compelling is the “Lifestyle and Culture” section, where Facebook seems to have gone furthest out on a limb in profiling its users.

That section got the social networking giant in trouble in late October, when ProPublica published a report about Facebook assigning users an “ethnic affinity”—“African American,” “Hispanic American,” or “Asian American”—and then letting advertisers use it in ad campaigns to target or exclude users based on their race. While Facebook has never asked users for their race, “ethnic affinity” relies on metadata collection—information compiled from the pages you like, the posts you engage with, and so on—to sort users into discrete categories.

(Last year, Facebook also started profiling its users by tracking the webpages they visit, which it can see thanks to the ubiquitous Like buttons embedded around the web. Whether that browsing data also feeds into the ad profile is unclear.)

After ProPublica’s report, Fusion asked readers to log into their ad preferences and report their assigned ethnic affinities. Several issues were immediately apparent. First, many white users said their ad preferences didn’t assign them an ethnic affinity. The reason? Facebook doesn’t have a “white” affinity option, reinforcing existing concerns about prejudice embedded in technology. On Facebook, whiteness is treated as the “default,” despite an extremely diverse user base of more than a billion people.

Facebook has underscored that it doesn’t ask users for sensitive information like their race, religion, or sexual orientation. Yet when readers screencapped their ad preferences, they found that Facebook had used metadata to make inferences about exactly those categories.

Technology companies don’t need their users to disclose private information. Just with our clicks, we’re already revealing everything there is to know.

The ethical issues surrounding Facebook’s metadata analysis run much deeper than the social network. When Edward Snowden leaked classified documents to the press in 2013, unveiling the NSA’s pervasive dragnet surveillance programs, President Obama was careful to emphasize that the government doesn’t retain the content of communications, just the metadata. “The only thing taken, as has been correctly expressed, is not content of a conversation, but the information that is generally on your telephone bill,” Sen. Dianne Feinstein said.

But the New York Times editorial board outlined just how powerful metadata collection can be:

“Using such data, the government can discover intimate details about a person’s lifestyle and beliefs — political leanings and associations, medical issues, sexual orientation, habits of religious worship, and even marital infidelities.”

Speaking to The Atlantic about social media companies’ use of metadata, Gabe Rottman of the American Civil Liberties Union said, “What both types of information collection [government and social media] show is that metadata—data about data—can in many cases be more revelatory than content. You see that given the granularity with which private data collection can discern very intimate details about your life.”

But the problem with metadata is that it can also be misleading, producing incorrect assumptions about people. We saw that, too, in the responses to our call for readers’ Facebook ad preferences profiles. In many cases, Facebook had misidentified users as African American:

[Tweet from Katlin Seagraves (@iamliterate) to @sidneyfussell, October 29, 2016: pic.twitter.com/wVqRQhS2Yy]

When contacted, several users who were misidentified said they’d liked or shared posts on subjects like Black Lives Matter, President Obama, or opposition to police brutality. Crucially, Facebook’s machine learning doesn’t seem to distinguish between interest in a group and membership in it. White users who are simply interested in topics demarcated as “African American” can’t be reliably differentiated from users who actually are African American.
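To see why that distinction collapses, consider a toy sketch of likes-based scoring. This is purely an illustration, not Facebook’s actual model: the page names, weights, threshold, and label below are invented for the example.

```python
# Toy illustration of likes-based affinity scoring.
# NOT Facebook's real system: pages, threshold, and label are hypothetical.

AFFINITY_PAGES = {
    "Black Lives Matter",
    "President Obama",
    "Campaign Zero",
}

def affinity_score(liked_pages):
    """Count how many of a user's liked pages fall in the topic set."""
    return sum(1 for page in liked_pages if page in AFFINITY_PAGES)

def assign_affinity(liked_pages, threshold=2):
    """Attach a label once the score crosses the (arbitrary) threshold."""
    if affinity_score(liked_pages) >= threshold:
        return "African American (US)"
    return None

# Two users with identical likes get identical labels, even though one
# belongs to the group and the other is merely interested in the same
# causes. The signal alone cannot tell interest from identity.
black_user = ["Black Lives Matter", "President Obama", "NPR"]
white_ally = ["Black Lives Matter", "President Obama", "NPR"]
print(assign_affinity(black_user))  # African American (US)
print(assign_affinity(white_ally))  # African American (US)
```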

Once this information is compiled, whether accurate or not, there’s the question of who will get access to it. Facebook receives an ever-increasing number of government requests to hand over user data—nearly 20,000 in the second half of 2015 alone. Could a Facebook user have their ad preferences, which detail their sexual orientation, religion, and more, subpoenaed and then examined? What are the consequences of Facebook misinterpreting user behavior in that context?

Rafi Letzter, a journalist at Business Insider, found that Facebook had made a number of startling inferences about his religion, including apparent “interests” in Hezbollah, a militant Islamic group designated a terrorist organization by the State Department, and The Jewish Home, a Zionist political party in Israel.

Erroneous as the miscategorization may be, imagine if Facebook handed this ad preference data to the government or to local authorities. Since 9/11, the government has dramatically stepped up its reliance on consumer data in law enforcement, at times querying databases that people may not even realize they’re in.

If there’s one bright side to this, it’s that Facebook allows users to see this profile and remove assigned preferences simply by clicking on them. You may want to clean out yours.
