
Roblox restricts messaging for young users to enhance online safety.

November 18, 2024

Roblox, the most popular gaming platform among eight- to 12-year-olds in the UK, is introducing significant safety measures to protect young users. The company will block children under 13 from sending direct messages (DMs) on the platform, with the changes due to be fully in place by the end of March 2025.

Under the new guidelines, child accounts will be unable to send private messages by default; parents can enable messaging for their child through their own verified account. Children will still be able to take part in public, in-game conversations visible to everyone, so they can continue to interact with friends.

Matt Kaufman, Roblox’s chief safety officer, emphasized the platform’s commitment to safety, noting that over 10% of its employees work on safety features. The platform, which has 88 million daily users, recognizes the need to evolve its safety approach as it grows.

Parents will gain more control through enhanced parental management tools. They can:
– View and manage their child’s account
– See their child’s list of online friends
– Set daily play time limits

To access these permissions, parents must verify their identity using a government-issued ID or credit card. Kaufman acknowledged the challenges of identity verification and urged parents to make sure children enter their correct age when creating accounts.

Roblox is also introducing a new content labeling system to replace age recommendations. The labels will range from “minimal” to “restricted” and help parents make informed decisions based on their child’s maturity. By default:
– Users under nine can only access “minimal” or “mild” experiences
– Parents can grant consent for their children to access “moderate” experiences
– “Restricted” experiences are only accessible to users aged 17 or over who have verified their age

These changes follow previous safety initiatives, including barring under-13s from “social hangouts” and requiring game developers to specify age suitability.

The changes come in response to the UK’s Online Safety Act, which requires platforms to protect children from illegal and harmful content. Ofcom, the UK’s communications regulator, has warned of potential penalties for platforms that fail to keep children safe.

Richard Collard from the NSPCC called the changes “a positive step” but stressed that effective age verification will be essential if the measures are to translate into safer experiences for children.

The changes reflect a proactive stance on online safety, balancing open interaction with protective defaults. By introducing these tools, Roblox aims to create a safer environment for its young users while keeping the platform engaging and interactive.