A September 2023 warning from the FBI alerted Americans to the existence of violent online groups that operate on messaging platforms and deliberately extort children into producing child sexual abuse material (CSAM). The groups also share acts of self-harm online. According to the warning, issued by the FBI's Internet Crime Complaint Center, the groups target minors between the ages of 8 and 17, focusing on racial and ethnic minorities, LGBTQ+ youth and individuals who struggle with a variety of mental health issues.
U.S. Sen. Mark R. Warner of Virginia pressed Discord, an instant messaging social platform, in a letter Monday about the company’s failure to safeguard minors and stop the proliferation of violent predatory groups who target children with the goal of forcing them to end their own lives and livestream the act online.
“I am extremely concerned about this abuse, and I am profoundly saddened that it has affected Virginia families, including the daughter of a military family who was coerced into self-harm and to attempt suicide. I recognize that Discord’s Trust & Safety team is aware of this type of activity and has taken some actions to detect and remove some of these violent groups from their platforms. However, despite increased moderation, predators continue to target minors on your platform,” Warner said.
Abrielle is a teenager living in Virginia who was coerced by "King" into attempting suicide before first responders found her and saved her life. She said she fell victim to the manipulation of violent predatory groups on Discord as a teenager.
“During a period in my life where I struggled with anxiety, depression and eating disorders, they took advantage of my feelings of isolation and encouraged me to self-harm and even end my life. While I’m deeply grateful to have escaped their abuse, I’m heartbroken to know that this violent, dangerous behavior persists on Discord. Enough is enough — tech companies need to do more to crack down on the predatory groups that nearly took my life. Discord owes it to a generation of kids and teens to eliminate the extremely harmful content that abounds on their platforms,” Abrielle said.
Warner encouraged Discord to devote more resources to the problem, including dedicating a greater number of content moderators, investigators, engineers and legal professionals to the situation.
“It is my understanding that Discord currently enforces its policies through actions like suspending policy-violating users’ accounts and servers, as well as banning their Internet Protocol (IP) addresses and email addresses. I also understand that there are far more sophisticated measures, such as device-based or cookie-based bans, that could be taken to prevent identified malign users from returning to your platform. Further, I am aware of measures that could be used to proactively detect harmful activity and initiate an early intervention to prevent harm and loss of life,” Warner said.
Warner demanded answers in his letter to a series of questions about the company's efforts to address the predatory groups. He asked that Discord outline its policies and procedures around content that violates Discord's Terms of Service, and that it share more information on its detection mechanisms, enforcement actions and measures to prevent the re-entry of malicious actors. He also requested figures on the number of accounts removed over the last four years and the volume of content depicting or encouraging suicide.
Monday's letter also follows recommendations issued in July 2024 by the Biden-Harris Administration's Kids Online Health and Safety Task Force to address the online health and safety of children and youth, with specific recommendations made to industry. Warner's letter also comes on the heels of the Senate passage of the Kids Online Safety Act ("KOSA") and the Children and Teens' Online Privacy Protection Act ("COPPA 2.0"), which would require online platforms to take specific measures to protect the safety and privacy of children using their platforms.