The UK communications watchdog, Ofcom, is investigating TikTok over allegations that it provided inaccurate information. The investigation centres on whether TikTok failed to comply with a legal information request from Ofcom, made as part of a report the regulator published on Thursday examining how video-sharing platforms prevent children from accessing harmful content.

TikTok has attributed the issue to a technical problem, which it says it identified and reported to Ofcom itself, prompting the investigation. It is the accuracy of the information TikTok supplied, rather than its parental controls, that is under scrutiny.

Ofcom had asked TikTok, Snap and Twitch to explain how they comply with legal requirements to protect children from harmful videos. While all three platforms have measures in place to prevent children from encountering harmful content, there are still instances where harm can occur. Ofcom specifically questioned TikTok about its parental control system, “Family Pairing”, and expressed doubts about the accuracy of the information the platform provided.

TikTok introduced Family Pairing in April 2020. It allows parents to link their account with their child’s and manage screen time, direct messages, content filtering and privacy settings. Users under 18 can deactivate Family Pairing, but parents are notified if they do. Ofcom says it may update its report if it receives more accurate information from TikTok.

Ofcom’s research indicates that more than a fifth of children aged 8 to 17 have an adult online profile. The report also highlights that users can easily gain access to platforms such as TikTok, Twitch and Snap by entering a false age, and suggests these platforms should improve how they identify children and prevent them from encountering harm.
In contrast, OnlyFans, a subscription-based platform known for explicit content, uses facial age estimation, ID verification and other systems to ensure its users are adults. The platforms named in the report rely on a mix of AI and human moderators to identify underage users, but the available data makes it difficult to determine how many underage users there actually are.

Ofcom also raised concerns about Twitch, noting that its content is open access: anyone of any age can watch videos, even those rated mature, and the platform’s content warnings can be easily dismissed. While TikTok and Snap offer parental controls, Twitch’s terms and conditions instead require parents to supervise their children in real time while they use the service.

Ofcom found that parents can request the removal of a child’s account on Twitch by providing specific details and a signed form verifying their relationship to the child. However, Twitch said that no parent had contacted it to request the removal of a child’s account in the 12 months to August 2022. Twitch recently relaxed its rules to allow art featuring nudity on the platform.

The report examined the compliance of UK-based video-sharing platforms with existing regulations, but noted that these firms will soon also have to comply with the newly passed Online Safety Act, which aims to protect children from harmful social media content. Ofcom plans to consult on guidance for the act’s child safety measures in spring 2024.