Facebook axes 5,000 ad categories that excluded based on ethnicity, religion

Facebook has removed over 5,000 ad targeting categories to limit advertisers’ ability to be discriminatory on its platform.
Facebook is looking to make its advertising platform less discriminatory.
On Tuesday, the company said it had removed more than 5,000 ad-targeting categories in an effort to limit the ability of advertisers to exclude users based on ethnicity or religion.
In the coming weeks, advertisers will have to complete a certification process that educates them on the “difference between acceptable ad targeting and ad discrimination,” the company wrote in a blog post.
Some of the removed interest categories include “Buddhism” and “Islamic culture.” The company won’t release the full list of the deleted categories to prevent bad actors from adjusting their strategies accordingly, said a person familiar with Facebook’s thinking.
This comes as the social network’s ad targeting categories and methods have been under scrutiny. Last week, the US Department of Housing and Urban Development filed a complaint against the tech giant for letting advertisers pick and choose who could see their housing ads based on ethnicity.
Facebook said the removal of ad tags wasn’t a response to the HUD complaint. “We’ve been building these tools for a long time and collecting input from different outside groups,” a spokesperson said in an emailed statement.
Last month, Facebook removed the word “treason” from the list of user interest categories advertisers could target, over concerns that the Russian government could use it to identify citizens tagged with that interest.