Facebook Takes Steps to Keep Targeted Advertising From Violating Its Users’ Civil Rights
Last spring, when we learned that Facebook was providing advertisers with the option to show different ads to different users based on their “ethnic affinity,” we got nervous.
Facebook places people into these clusters — African-American, Asian-American, and Latino — based on their having liked things on Facebook associated with membership in those groups, such as an African-American chamber of commerce. (You can figure out how Facebook labels you here.) We worried that advertisers who wanted to hide their ads from members of these groups could do so, and that advertisers who wanted to target them in predatory ways could do so as well.
Discriminatory targeting is illegal when the ads are for housing, credit, or employment. Because those areas are so central to opportunity, civil rights law creates special protections to ensure fairness. Title VII makes it illegal to recruit employees in a way that excludes Black or Latino candidates (or women or Muslims) from the applicant pool. The Fair Housing Act makes it illegal to advertise an apartment for rent in a way that keeps members of these groups from knowing that it’s available. And the Equal Credit Opportunity Act makes it similarly illegal to keep people from applying for credit by denying them information about it. (We’ve laid out more detail on the law in comment letters to the FTC and the EEOC.)
Until now, though, Facebook didn’t treat ads in these areas any differently from other ads. Advertisers could choose to target an ad to “house hunters” and to exclude “African Americans” from seeing that ad, as ProPublica recently pointed out. Today, however, Facebook announced that it will no longer allow discriminatory targeting (or exclusion) based on ethnic affinity in ads for housing, credit, and employment.
Perhaps equally important, part of that process will involve developing systems to identify ads in those areas. If those systems succeed, Facebook can then work on keeping advertisers from discriminating based on gender, sexual orientation, religion, or any other protected status. Facebook also announced that it will update its advertising policies to make the prohibition on discriminatory targeting more explicit and will work to educate advertisers on their obligations when placing ads for housing, credit, or employment.
These are big steps in the right direction, and we’re glad to see that our conversations, along with lots of other advocacy, have helped Facebook recognize the important role it plays in protecting equal opportunity in the era of personalized online advertising. We hope that other advertising platforms will take notice and think about how to structure their own interfaces and policies so that targeting tools can’t become tools for discrimination in these key areas of economic opportunity. We look forward to continuing our dialogue with Facebook and to working toward protecting users from discriminatory targeting based not only on race, but also on gender, sexual orientation, and religion. We also want to work out how advertisers can use targeted advertising to increase diversity in workplaces or communities while still barring discrimination.
There is a long and sorry history in this country of housing ads that proclaimed they were for “whites only” and of classified sections in newspapers that separated “Jobs for Men” and “Jobs for Women” into different columns. Personalized advertising online can accomplish the same discrimination, without users ever knowing. How could I learn that I didn’t see a particular job ad because I’m a woman or that I saw different housing ads because I’m white?
This invisibility gives advertising platforms great power. They must use it responsibly.