Discord, a popular chat app among gamers, has introduced new safety features to help parents monitor their teenagers’ activity. The company’s Family Center will now show parents information about their teen’s top contacts, including the five users they most frequently message and call, as well as the servers they interact with. Parents will also be able to view their teen’s total call minutes and purchases made within the app. However, only the most recent week of data is visible in the app; to review older activity, parents will need to check their email summaries.
In addition to these features, Discord has introduced tools that allow teens to alert their parents when they report objectionable content, and parents can enable settings such as sensitive-content filters and direct-message controls. Parents can also restrict who is allowed to send direct messages to their teen, such as limiting messages to friends or members of shared servers. These features were developed in response to feedback from parents and organizations, including the National Parent Teacher Association and the Digital Wellness Lab at Boston Children’s Hospital.
Discord’s introduction of these safety features comes as the company faces increased scrutiny over its handling of user safety, particularly with regard to protecting young users from predators. The app has been criticized for allowing bad actors to target and communicate with minors, and it has been named as a defendant in several lawsuits. Despite these efforts, some advocacy groups have argued that the company’s new safety features do not go far enough, and that the burden of ensuring user safety should not fall solely on parents.
The National Center on Sexual Exploitation has criticized Discord’s new safety features, stating that they do not address the company’s history of failing to protect children from exploitation. The organization has previously named Discord as one of the “Dirty Dozen” companies that contribute to sexual exploitation. A lawsuit filed against the company earlier this year alleged that Discord and another gaming platform, Roblox, created a “breeding ground for predators” that allowed a perpetrator to groom and exploit an 11-year-old girl.
While Discord’s new safety features are a step in the right direction, they may not be enough to address the concerns of parents and advocacy groups. The company has stated that it takes a “holistic view” of teen safety and that it proactively identifies and flags content and accounts that could put users at risk. Still, Discord faces a significant challenge in balancing the need to protect young users with the need to respect user privacy. As it continues to roll out new safety features, it will be important for the company to prioritize transparency and communication with parents and users.