Study Finds Potential Influence of TikTok Content on Teenagers' Mental Well-being
In the digital age, concerns about the safety of young users on social media platforms have come to the forefront. TikTok, a popular video-sharing app, has come under scrutiny over how easily teenagers can access potentially harmful content related to suicide and eating disorders.
According to recent reports, teens can encounter such content just minutes after creating a TikTok account. However, TikTok has introduced filters intended to screen out potentially inappropriate content and provide a more age-appropriate viewing experience.
The responsibility of monitoring social media platforms, including TikTok, falls heavily on parents and guardians. With the rapid pace at which content is generated and shared, it's crucial for adults to stay vigilant and proactive in safeguarding their children's online experiences.
TikTok claims to have a zero-tolerance policy towards pro-eating disorder content and has mechanisms in place to remove videos that violate its policies. According to the company's statistics for April through June of this year, 93.4% of videos violating its policies on suicide and self-harm were removed within 24 hours and before receiving any views, and 97.1% were removed before any user reports were filed.
However, internal documents and lawsuits suggest that TikTok's algorithm may still be promoting harmful content such as “thinspiration,” extreme diet challenges, and self-harm videos. These concerns have fueled a growing number of lawsuits alleging negligence on TikTok's part in protecting young users.
Governments, particularly in the UK, are enforcing strict age checks and monitoring compliance to prevent minors from accessing harmful content. TikTok itself offers tools that allow users to set limits on their viewing time and take regular screen breaks.
The Center for Countering Digital Hate (CCDH) believes further legislation is required to ensure children are safeguarded from inappropriate material on TikTok. They argue that without oversight, TikTok's platform will continue to profit by serving its users increasingly intense and distressing content without checks, resources, or support.
State and federal agencies have launched investigations into whether TikTok is an appropriate platform for young people, and lawmakers are seeking ways to protect teens from the app's potential privacy and security risks.
Recent research has found that TikTok recommended videos about body image and mental health issues at an alarming rate of one every 39 seconds. This finding, combined with concerns about the promotion of harmful content, has prompted calls for TikTok and other social media platforms to improve content moderation, restrict harmful content more effectively, offer greater transparency about risks, and regulate their recommendation algorithms.
In the face of criticism, TikTok has defended itself, faulting a recent study for its limited sample size, its short analysis window, and a methodology that involved scrolling through topics unrelated to the research. The company also points to a dashboard that tracks app usage, giving users insight into their viewing habits.
Despite these efforts, the CCDH's recent findings suggest there is still more work to be done to make digital platforms safer for young users. Over the past year, executives of social media platforms, including TikTok, have faced questions about their impact on young users and mental health outcomes.
In an effort to address these concerns, TikTok has taken steps such as ensuring that searches for harmful words or hashtags, including #selfharm, return no results and instead direct users towards local support resources.
The challenge lies in striking a balance between the freedom to express oneself and the need to protect young users from harmful content. As the debate continues, it's clear that more needs to be done to ensure the safety and well-being of young users on social media platforms like TikTok.