TikTok Algorithm Pushes Violent Videos to Minorities, Lawsuit Says

Jul 20, 2022

(Bloomberg) -- TikTok faces a claim that its algorithm steers more violent videos to minority users than to White users, in a lawsuit blaming the platform for the death of a 14-year-old African-American girl.

The complaint, which also names Meta Platforms Inc., Snap Inc., and TikTok parent company ByteDance Ltd. as defendants, is among a stream of lawsuits seeking to hold social media companies accountable for teenagers becoming addicted to their platforms.

The parents of Englyn Roberts, who died in September 2020, about two weeks after she tried to take her own life, allege that TikTok is aware of racial and socioeconomic biases in its algorithm. Roberts wouldn't have seen, and become addicted to, the harmful content that contributed to her death if not for TikTok's programming, according to the complaint filed Wednesday in San Francisco federal court.

What Are Algorithms and Are They Biased?: QuickTake 

“TikTok’s social media product did direct and promote harmful and violent content in greater numbers to Englyn Roberts than what they promoted and amplified to other, Caucasian users of similar age, gender, and state of residence,” the parents alleged.

The complaint was filed by Social Media Victims Law Center, a Seattle-based advocacy group.

Read More: Meta Faces 8 Suits Claiming Algorithms Aim to Addict Youths

Representatives of TikTok, Meta and Snap didn’t immediately respond to requests for comment.

The case is Roberts v. Meta Platforms, Inc., 22-cv-04210, US District Court, Northern District of California.

©2022 Bloomberg L.P.