Can Facebook prevent racial discrimination in targeted advertising?
Using a category called “ethnic affinities,” Facebook allows advertisers to exclude specific racial groups from seeing housing ads on its platform, a ProPublica investigation revealed on Thursday.
The nonprofit investigative news organization showed that Facebook approved, within 15 minutes, an ad it submitted for a "renter event" that explicitly excluded African Americans. According to a lawyer ProPublica contacted, the ad was illegal under the Fair Housing Act of 1968, which prohibits any advertisement with “respect to the sale or rental of a dwelling” that indicates “preference, limitation, or discrimination” based on race and other protected categories.
"There's a part of the Fair Housing Act that makes it illegal to have discriminatory advertising. That part applies to both the person taking out the ad and also the publisher of that ad," Rigel Oliveri, professor of law at the University of Missouri told USA Today. "It's not just hosting the ad. It's encouraging and providing the advertisers with the ability to exclude people based on their race and ethnicity,”
While critics call the targeting tool racist, the use of ethnicity in targeted advertising is typical for internet companies, particularly social media behemoths such as Facebook, which hold vast stores of data about users’ demographics, locations, and preferences. Companies walk a fine line: connecting sellers with interested customers while keeping those same tools from being misused to promote exclusion.
Facebook’s head of US multicultural sales, Christian Martinez, outlined that balance in a response to ProPublica’s investigation on Friday. He explained why some advertisers may want to reach only a certain group, citing the example of "a merchant selling hair care products that are designed for black women."
But not all examples are that innocuous, he acknowledged. "There’s also negative exclusion – for example, an apartment building that won’t rent to black people or an employer that only hires men," Mr. Martinez wrote. "Our ad policies strictly prohibit this kind of advertising, and it’s against the law. If we learn of advertising on our platform that involves this kind of discrimination, we will take aggressive enforcement action."
The Fair Housing Act was originally enacted because many African-American and Hispanic families could not purchase or rent homes in certain neighborhoods because of their race.
The dangers of targeted advertising veering into racism surfaced as early as the 1970s, when many agencies built advertisements around racial stereotypes. More recently, studies have found that tobacco and junk food advertisements are targeted at African-Americans and Latinos more heavily than at white people.
With advertising generating nearly all of Facebook’s revenue, and targeting being one of its main selling points, the company clearly struggles to vet every submission and filter out those that violate its policies.
Mr. Martinez acknowledged that Facebook's reviewers "often aren’t in a position to know the details of an apartment rental or job application," but said Facebook will take down an ad if "the government agency responsible for enforcing discrimination laws tells us that the ad reflects illegal discrimination."
Vetting ads itself, however, is not infeasible. As ProPublica points out, The New York Times put a system in place to prevent discriminatory housing ads back in 1993, after the newspaper was successfully sued under the Fair Housing Act. An automated program detects discriminatory code words such as "whites only," "no kids," or "near churches," and human reviewers also go through the ads before publication.
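A rudimentary version of that kind of automated screen is simple to build. The Python sketch below is purely illustrative: the phrase list and function names are assumptions chosen for this example, not details of The Times's actual system, and it shows only how flagged ads might be routed to a human reviewer rather than published automatically.

```python
# Illustrative sketch of a keyword screen for housing ads.
# The phrase list and names are hypothetical examples, not
# The New York Times's actual system.

import re

# Code words historically associated with discriminatory housing ads.
FLAGGED_PHRASES = [
    "whites only",
    "no kids",
    "near churches",
]

def screen_ad(ad_text: str) -> list[str]:
    """Return the flagged phrases found in the ad copy.

    An empty list means the automated screen found nothing;
    a non-empty list means the ad should be held for review.
    """
    text = ad_text.lower()
    # Word boundaries prevent matches inside longer, unrelated words.
    return [
        phrase
        for phrase in FLAGGED_PHRASES
        if re.search(r"\b" + re.escape(phrase) + r"\b", text)
    ]

if __name__ == "__main__":
    ad = "Sunny 2BR apartment, whites only, near churches."
    hits = screen_ad(ad)
    if hits:
        print("Hold for human review; matched:", hits)
    else:
        print("Passed automated screen.")
```

A real system would need a far larger phrase list, tolerance for misspellings and evasive wording, and, as at The Times, human reviewers as the final check.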
Another solution might just be to avoid exclusion altogether. As Jerome Williams, professor and provost at Rutgers University-Newark, tells The Christian Science Monitor in a phone interview, the original purpose of targeted advertising was not to "exclude other groups" but to "concentrate your message to a particular group."
"Because of technology, you can prohibit certain people from viewing it. That becomes risky," Prof. Williams says. "I think you want to avoid those type of strategies."