Federal investigation into child sexual abuse targets TikTok

The U.S. Department of Homeland Security has reportedly launched an investigation into TikTok over how the platform handles content depicting child sexual abuse and the effectiveness of its moderation controls. The agency is examining the alleged exploitation of a feature called “Only Me,” which bad actors are said to have abused to share illegal content, something the Financial Times claims to have verified in partnership with child safety groups and law enforcement officials.

The Only Me feature lets users save their TikTok videos without posting them publicly. Once a video has been designated as Only Me, it can be viewed only by the account’s owner. According to the report, credentials for accounts hosting child sexual abuse material (CSAM) were passed around among bad actors. Because the abusive videos never appeared in the public feed, they evaded detection by TikTok’s moderation systems.

TikTok is no stranger to the problem

This is not the first serious probe into TikTok. The number of Department of Homeland Security investigations into the spread of child exploitation content on TikTok reportedly increased sevenfold between 2019 and 2021. And despite the company’s bold promises of strict policy enforcement and punitive action against abusive content depicting children, bad actors appear to still be thriving on the platform.

“TikTok talks constantly about the success of their artificial intelligence, but a clearly naked child is slipping through it,” child safety activist Seara Adair was quoted as saying. Notably, the federal agency banned TikTok from all department-owned devices, including phones and computers managed by its information technology systems, in March this year over data security concerns.

This also isn’t the first time TikTok has drawn attention for the wrong reasons. Last month, two former TikTok content moderators filed a lawsuit against the company, accusing it of failing to provide adequate support while they handled extreme content depicting “child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder.”

A BBC investigation from 2019 revealed predators targeting children as young as nine with sleazy comments and proposals. Elizabeth Denham, the U.K.’s information commissioner, launched a probe into TikTok the same year over the platform’s handling of personal data belonging to underage users. And given the app’s immense popularity among young users, simply deleting it is not as straightforward an option as walking away from Facebook.

The stakes are high: media regulator Ofcom estimates that 16% of three- to four-year-olds consume TikTok content. According to the U.K.’s National Society for the Prevention of Cruelty to Children (NSPCC), online grooming crimes reached a record high in 2021, with children at particularly high risk. Even though Instagram and Snapchat are the platforms predators favor most, reports of horrific child grooming on TikTok have surfaced on multiple occasions in the past few years.

TikTok has lately introduced measures to keep its young user base safe. Last year, the company announced that strangers would no longer be able to contact TikTok accounts belonging to children under 16, and that those accounts would default to private. The short-video-sharing platform also tightened restrictions on downloading videos posted by users under 18. In addition, TikTok added resources to its platform last year to help sexual assault survivors, bringing in experts from the Rape, Abuse & Incest National Network (RAINN) and providing quick access to the National Sexual Assault Hotline.
