(Bloomberg) -- Meta Platforms Inc. “routinely discriminates” in steering job ads to specific age and gender groups on its Facebook platform, a women’s truckers organization alleged in a civil rights complaint.

“Facebook’s algorithm regularly acts like recruiters in the 1960s (and even later), who identified jobs as ‘Male’ or ‘Female’ based on gender stereotypes or indicated their preferences to hire younger workers,” the nonprofit Real Women in Trucking said Thursday in a filing with the US Equal Employment Opportunity Commission.

Data culled from Facebook’s own online ad library shows that the audiences the algorithm selected to receive certain job listings were more than 99% male and more than 99% younger than age 55, even though the employers had asked that the ads be shown to people of all ages and genders, according to the complaint.

On the platform, “older job seekers are usually far less likely than younger job seekers to receive job ads, and men receive the lion’s share of ads for blue-collar jobs, especially jobs in industries that have historically excluded women,” Real Women in Trucking claims. “Meanwhile, women receive a disproportionate share of ads for lower-paid jobs in social services, food services, education, and health care, especially administrative positions that are historically considered women’s jobs.”

A Meta spokesperson said the company is reviewing the complaint and has been working to prevent discrimination and make its ads more transparent. “Addressing fairness in ads is an industrywide challenge and we’ve been collaborating with civil rights groups, academics and regulators to advance fairness in our ads system,” the company said in an emailed statement. “We’re actively building technology designed to make additional progress in this area.”

Decades-old US civil rights laws prohibit job or housing ads that indicate a preference based on sex or age, and federal agencies have determined that using those criteria to target online ads violates such prohibitions. 

“While Facebook has been warned for years by civil rights advocates and regulators that algorithmic bias is likely to be a serious problem on its platform and would violate a range of civil rights laws, Facebook has failed to stop the algorithmic discrimination that occurs in most cases when Facebook publishes job ads throughout the nation,” the truckers group said in its complaint.

In June, Meta agreed to settle a lawsuit brought by the US Department of Justice that alleged Facebook’s housing ad system discriminated based on characteristics such as race, disability and sex. In 2019, the company settled lawsuits and complaints brought by groups that included the American Civil Liberties Union and the Communications Workers of America. As part of that agreement, Facebook said it would no longer let advertisers target job or housing postings based on age or gender.

Thursday’s complaint credits the company with taking “meaningful steps” required by that settlement. “But Facebook’s own algorithm has replicated the same problem,” the truckers’ group claims.

“This is a problem not just on Facebook, but likely on a lot of other platforms,” Real Women in Trucking attorney Peter Romer-Friedman, who also represented plaintiffs in the 2019 settlement, said in an interview. “What these extreme disparities tell us is that algorithmic bias is going to be a problem unless you prevent it and eliminate it consciously.”


©2022 Bloomberg L.P.