A recent study by university researchers finds that Facebook’s ad-delivery algorithm shows ads for for-profit colleges to Black users at a higher rate than to white users.
Meta, the parent company of Facebook and Instagram, does not disclose how its algorithm determines which ads users see. To investigate potential racial discrimination in the delivery of education ads, researchers from Princeton and the University of Southern California conducted tests by purchasing ads for public and for-profit colleges.
The study focused on for-profit colleges because of their history of deceptive marketing practices aimed at students of color. By comparing how Facebook delivered paired ad campaigns to Black and white users, the researchers found a skew in the algorithm: ads for for-profit colleges with questionable practices were disproportionately delivered to Black users, while ads for public institutions reached a larger share of white users.
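The paper’s own analysis pipeline is not reproduced here; as a rough illustration of what “finding a skew” means quantitatively, the sketch below assumes one could obtain per-ad impression counts broken down by race (something Facebook’s ad interface does not report directly, so the researchers had to estimate it) and runs a simple two-proportion test. The delivery_skew helper and all numbers are hypothetical, not the study’s actual data or method.

```python
# Minimal sketch (not the researchers' pipeline): compare the share of a
# for-profit college ad's impressions delivered to Black users against the
# share for a paired public-college ad, using a two-proportion z-test.
from statistics import NormalDist

def delivery_skew(black_impr_a, total_impr_a, black_impr_b, total_impr_b):
    """Two-proportion z-test on the share of impressions delivered to Black users."""
    p_a = black_impr_a / total_impr_a          # share for ad A (e.g., for-profit college)
    p_b = black_impr_b / total_impr_b          # share for ad B (e.g., public college)
    p_pool = (black_impr_a + black_impr_b) / (total_impr_a + total_impr_b)
    se = (p_pool * (1 - p_pool) * (1 / total_impr_a + 1 / total_impr_b)) ** 0.5
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))   # two-sided p-value
    return p_a, p_b, z, p_value

# Hypothetical impression counts purely for illustration:
share_fp, share_pub, z, p = delivery_skew(6200, 10000, 4100, 10000)
print(f"for-profit ad: {share_fp:.1%} Black users; public ad: {share_pub:.1%}; z={z:.1f}, p={p:.3g}")
```

A gap like the hypothetical one above (62% versus 41%) would be the kind of delivery difference such an audit is designed to detect; the study’s actual figures are in the paper itself.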
Despite Facebook’s claims of fairness, the study suggests that the algorithm may perpetuate racial biases in ad delivery even without explicit intent. Previous research has likewise found biases along race and gender lines in Facebook’s ad-delivery system.
While Facebook states that its ads are tailored to users’ interests, researchers argue that the algorithm may be reinforcing historical biases rather than reflecting genuine preferences. The responsibility to address algorithmic biases lies with Facebook, according to the researchers.
Even if the skewed delivery mirrors actual enrollment trends, the ethical problem of disproportionately steering Black users toward for-profit colleges remains. The study also raises the possibility that discriminatory ad delivery could expose Facebook to legal liability.
The paper notes that educational opportunities are protected by laws prohibiting racial discrimination, and that those laws may also apply to advertising platforms.
According to Aleksandra Korolova, one of the researchers, Meta has taken steps to reduce bias in its advertising systems for housing, employment, and credit ads, including a 2022 settlement with the Department of Justice over housing discrimination. The changes to employment and credit advertising, however, were voluntary, possibly made to head off legal action after research showed discrimination in employment ad delivery.
Despite ongoing research into algorithmic bias in Meta’s products, the company has not engaged directly with the researchers, nor has it extended its bias-mitigation efforts to other ad categories, such as education, that shape access to important life opportunities.