Study finds TikTok may push potentially harmful content to teens within minutes | CNN Business
A new study suggests that TikTok may display potentially harmful content related to suicide and eating disorders to teens within minutes of creating an account, likely adding to growing scrutiny of the app’s impact on younger users.
In a report published Wednesday, the nonprofit Center for Countering Digital Hate (CCDH) found that it can take less than three minutes after signing up for a TikTok account to see content related to suicide, and about five more minutes to find a community promoting eating disorder content.
The researchers said they created eight new accounts in the US, UK, Canada and Australia at TikTok's minimum user age of 13. These accounts briefly paused on and liked content about body image and mental health. The CCDH said the app recommended videos about body image and mental health roughly every 39 seconds over a 30-minute period.
The report comes as federal and state lawmakers search for ways to crack down on TikTok over privacy and security concerns, as well as to determine whether the app is appropriate for teens. It also comes more than a year after executives from social media platforms, including TikTok, faced tough questions from lawmakers during a series of congressional hearings about how their platforms could direct younger users — especially teenage girls — to harmful content, damaging their mental health and body image.
After those hearings, which followed Facebook whistleblower Frances Haugen's revelations about Instagram's impact on teens, the companies vowed to change. But the CCDH's latest findings suggest more work may still need to be done.
“The results are every parent’s nightmare: young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them, and their physical and mental health,” said Imran Ahmed, CEO of the CCDH, in the report.
A TikTok spokesperson responded to the study, saying it was an inaccurate portrayal of the viewing experience on the platform for various reasons, including the small sample size, the limited 30-minute testing window, and the way the accounts scrolled past a series of unrelated topics to search for other content.
“This activity and the resulting experience does not reflect genuine behavior or viewing experiences of real people,” a TikTok spokesperson told CNN. “We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need. We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics.”
The spokesperson said the CCDH does not distinguish between positive and negative videos on certain topics, adding that people often share empowering stories about recovering from eating disorders.
TikTok said it continues to roll out new protections for its users, including ways to filter out mature or “potentially problematic” videos. In July, it added a “maturity score” to videos detected as likely to contain mature or complex themes, as well as a feature to help people decide how much time they want to spend on TikTok videos, set regular screen-time breaks, and view a dashboard detailing how many times they opened the app. TikTok also offers a range of parental controls.
This is not the first time social media algorithms have been tested this way. In October 2021, US Sen. Richard Blumenthal’s staff registered an Instagram account as a 13-year-old girl and proceeded to follow some dieting and pro-eating-disorder accounts (the latter of which are supposed to be banned by Instagram). Soon, Instagram’s algorithm began almost exclusively recommending that the young teen’s account follow more and more extreme dieting accounts, the senator told CNN at the time.
(After CNN sent a sample from this list of five accounts to Instagram for comment, the company removed them, saying they violated Instagram’s policies against promoting eating disorders.)
TikTok said it does not allow content that depicts, promotes, normalizes, or glorifies activities that could lead to suicide or self-harm. Of the videos removed for violating its policies on suicide and self-harm content from April to June of this year, 93.4% were removed at zero views, 91.5% were removed within 24 hours of being posted, and 97.1% were removed before any user reports, the company says.
The spokesperson told CNN that when someone searches for banned words or phrases like #selfharm, they won’t see any results and will instead be redirected to local support resources.
However, the CCDH says more needs to be done to restrict certain content on TikTok and strengthen protections for young users.
“This report underscores the urgent need for reform of online spaces,” Ahmed said. “Without oversight, TikTok’s opaque platform will continue to profit by serving its users — children as young as 13, remember — increasingly intense and distressing content without checks, resources, or support.”