TikTok Conducts New Research into Harmful Trends and Challenges with a View to Improving Safety Measures

TikTok has conducted a new study into the proliferation of dangerous challenges and trends in the app, with a view to better understanding young people’s engagement with these trends, and establishing better policies to protect against related dangers.

Over time, this has become a key area of concern for the app. Last year, in Italy, a 10-year-old girl died after taking part in a ‘blackout challenge’ in the app, which led to Italian authorities forcing TikTok to block the accounts of any users whose age it could not verify. The popular ‘Milk Crate Challenge’, which trended earlier this year, also saw many people suffer serious injury after trying to climb stacks of plastic crates, while other concerning trends include the ‘Benadryl challenge’, full face wax, the ‘back cracking challenge’ and more.

Clearly, there are some major concerns here – and when you also consider the younger skew of the app, and its ever-increasing reach, it’s important that TikTok looks to address these elements wherever it can, in order to limit potential harm.


TikTok’s harmful challenge study incorporated responses from more than 10,000 teens, parents, and teachers from several nations, giving the platform a wide breadth of insight into the various elements at play. Based on this, TikTok commissioned Praesidio Safeguarding, an independent safeguarding agency, to put together key findings and recommendations to help improve its processes.

As you can see in this overview, the vast majority of younger users don’t feel that they’re participating in unsafe challenges. But this is self-reported, and as the above-noted examples show, there are clearly some significant safety concerns within these trends.

But enforcement can be difficult, especially as these trends rise before they’re on TikTok’s radar as a key engagement focus. It can also be difficult to identify harmful trends based on hashtag alone. ‘Benadryl challenge’ would likely get the TikTok moderation team’s attention, but ‘Milk Crate Challenge’ might not.

In examining the data, TikTok has homed in on self-harm hoaxes and related trends specifically as a key element of concern.

Self-harm hoaxes generally involve directing people to carry out a series of harmful activities, which escalate gradually, and can eventually lead to self-harm, and even suicide. Trends like the ‘Blue Whale Challenge’ and ‘Momo’ are among the more concerning examples, where fictional characters play out a horror-like scenario that can drag users into dangerous behavioural pathways.

Such trends are so dangerous that even warnings about them can cause angst – as explained by TikTok:

“The research showed how warnings about self-harm hoaxes – even if shared with the best of intentions – can impact the well-being of teens. While we already remove and take action to limit the spread of hoaxes of this nature, to further protect our community we will start to remove alarmist warnings about them as they could cause harm by treating the self-harm hoax as real. We will continue to allow conversations to take place that seek to dispel panic and promote accurate information.”

In other words, TikTok will take stronger action against such hoaxes as a first step, removing related content rather than lending the hoaxes credibility by issuing alerts in the app.

In addition to this, TikTok says that it’s also working with adolescent health experts to develop new resources for its Safety Center dedicated to challenges and hoaxes.


“This includes advice for caregivers that we hope can address the uncertainty they expressed about discussing this topic with their teens.”

TikTok is also updating the language used in the warning labels shown to users who attempt to search for content related to harmful challenges or hoaxes.

As noted, this is a key area of concern, but it also poses a major challenge: there’s no definitive way to detect the potential harm in a rising trend, often until it’s too late, while restricting such content can have its own negative impacts, as warnings and alerts may inadvertently help to promote the very trends they’re meant to counter.

Essentially, the approach that TikTok is taking is to focus on the most harmful types of challenges, which it’s working to better detect and action, while additional warnings and informational resources will help to provide more guidance, for parents and teens, on how to avoid risk.

But a level of danger will always be present, and policing it will be hard. As such, it’s good to see TikTok undertaking more research to better understand the key elements at play, with a view to improved detection and enforcement in future.