The US Department of Housing and Urban Development (HUD) recently announced charges against Facebook for serving ads that violate fair housing laws. The implications of the case could extend well beyond housing.
Facebook allows its advertisers to target precisely who sees – and who doesn’t see – their ads. Until recently, advertisers could exclude certain races from seeing their ads, which almost certainly allowed real estate advertisers to violate the Fair Housing Act. Enacted in 1968, the Act made it illegal for property owners to refuse to sell or rent to someone based on race or religion. HUD argues that by allowing its real estate advertisers to discriminate in their advertising, Facebook facilitated violations of the Act.
Technology has made discrimination in advertising both clearer and more confusing. A real estate company would never be accused of racism for choosing not to run ads on a hip-hop radio station (with a 71% black and Hispanic audience), but making the same effective decision explicitly is far more conspicuous.
Advertisers have always sought ways to get their message in front of as many potential customers as possible while spending as little as possible pitching their products to people unlikely to buy. Approximately 90% of Ebony’s 3.7 million readers are African-American, making the magazine a very efficient way to reach that demographic. An advertiser can reach more than twice as many black readers, however, in Better Homes and Gardens. But someone who wants to sell Mizani hair products (a line targeted to African-American clientele) to 8 million African-American Better Homes and Gardens readers is also paying to show that ad to 30 million white readers who probably have no idea what Mizani is. Is Mizani racist for not advertising in Better Homes and Gardens?
Facebook not only knows your race, it knows your income, employer, hobbies, politics and vacation preferences – and those of your friends, as well.
But that is just the tip of the iceberg. By reviewing only 10 of your Facebook “Likes,” Facebook likely knows you about as well as a work colleague does. After 70 “Likes,” the company knows you like a close friend. And after 150 “Likes,” Facebook knows you as well as your spouse does.
Facebook “Likes” and comments could theoretically be used to identify imminent criminal activity. The company already uses them to identify people at risk of self-harm; in 2018, Facebook contacted local emergency responders 3,500 times when its algorithms determined someone was at imminent risk of attempting suicide.
Housing and hair products are obviously not comparable. No one’s school district is determined by consumer purchases. But we are just beginning to see the ways in which the internet is upending targeted advertising.
David Moon is president of Moon Capital Management. A version of this piece originally appeared in the USA TODAY NETWORK.