Frontier Technology
09 Aug 2021

Algorithms promote eating disorders and racist content to teenagers

A new investigation has shown how Big Tech platforms are showing harmful and extreme content to Australian children while harvesting their data.

Photo Credit: Franck (@franckinjapan) via Unsplash.

Australian children and teenagers are falling victim to Big Tech algorithms that promote eating disorders and racist content, a joint investigation by the ABC’s Four Corners and triple j Hack has shown.

The program, which focused on popular social media platform TikTok, also showed the way that Big Tech is harvesting children’s data for commercial gain – a practice the former Children’s Commissioner for England, Anne Longfield, said was “illegal”.

TikTok attracts a significantly younger audience than other social media networks, with research estimating a staggering 25% of Australians under 15 are using the platform. Minderoo Foundation’s Frontier Technology initiative said the program showed that the need for a children’s data code has never been clearer.

Four Corners and triple j Hack spoke to Australian users of TikTok who were actively pushed towards content promoting eating disorders. One, a 19-year-old girl, was shown a popular fitness influencer shortly after downloading TikTok. After she followed this influencer, TikTok’s algorithm began promoting the viral trend of meticulously tracking how many calories you eat in a day, something researchers have warned promotes eating disorders. Four months after downloading the app, she was diagnosed with an eating disorder.

Another TikTok user, recovering from a history of eating disorders that had seen her in and out of hospital, said she began “actively relapsing” after using the app. When she tried to report videos that promoted eating disorders, she was told the videos didn’t breach TikTok’s guidelines – contrary to the company’s own policies.

But TikTok’s failure to police harmful content isn’t limited to videos promoting eating disorders. In a recent experiment, Reset Australia – a partner organisation of Frontier Technology, supported by the Minderoo Foundation – found that it took just four hours for TikTok’s algorithm to learn that a 13-year-old boy was interested in racism, and seven hours for sexist videos to dominate a user’s feed.

The program also exposed TikTok’s dangerous data collection practices. At the same time TikTok is showing harmful and extreme content, it is building sophisticated user profiles to enable advertisers to microtarget individuals based on location, gender and age. The company has also recently updated its privacy policy to enable it to capture users’ unique facial and voice data.

This has advocates concerned. Previous experiments on Big Tech have shown how advertisers are able to use these user profiles to serve children targeted ads for alcohol, gambling and vaping products. And a recent report by Reset Australia found that TikTok’s terms and conditions, including its privacy policy, failed basic tests for meaningful consent: university-level reading skills were required to understand them, they took over an hour to read, and manipulative design techniques were used to nudge young people into handing over more data. Out of a possible score of five, TikTok scored zero.

The Head of Tech Impact at Minderoo Foundation’s Frontier Technology initiative, Rachel Howard, said it was well past time to check Big Tech’s unfettered power over children’s data and wellbeing.

“This investigation confirms what we already knew: that Big Tech’s commercial interests are coming at the expense of the public interest,” she said. “Every parent would agree that children’s data should only ever be used in a child’s best interest, and every parent would agree that Big Tech is failing this test.”

“Without changes to the system, millions of children will grow up exposed to dangerous behavioural targeting and manipulative nudging. They will have hundreds of millions of data points collected about their preferences, interests and vulnerabilities which can be used in perpetuity. The only people who should know that much information about children are their parents.”

“That’s why Frontier Technology supports strong accountability for platforms and online service providers for the harms that come from inappropriate data collection and use.”

“Just like every parent who worries about what their kids are exposed to online and how their data is being used, I don’t want us to realise we were asleep at the wheel. We have the solutions to these challenges – solutions that mean children are never put at risk. One of them is a children’s data code governing the use of a child’s personal information, and making sure that their data is used for the only interest that matters – their own.”

Reset Australia and Minderoo Foundation are calling for a children’s data code. Join the campaign here: www.childrensdatacode.org.au.

If you or anyone you know needs help with an eating disorder:

by Minderoo Foundation

Established by Andrew and Nicola Forrest in 2001, we are a modern philanthropic organisation seeking to break down barriers, innovate and drive positive, lasting change. Minderoo Foundation is proudly Australian, with eight key initiatives spanning from ocean research and ending slavery, to collaboration in cancer and community projects.
