Why the US needs child safety legislation now
30 January 2024
By: Ellen Jacobs
New research from ISD adds to the long list of documented harms that teens and kids face on online platforms. As the EU, UK and Australia begin to enforce their landmark regulatory schemes (the Digital Services Act, the UK Online Safety Act and the Australian Online Safety Act) to mitigate harms to minors, it is time for the US to pass legislation protecting teens and kids online.
Given the litany of harms faced by young users online (graphic and violent content, sexual harassment, content promoting eating disorders, cyberbullying, and more), ISD is encouraged to see child safety elevated in digital policy discussions through proposed legislation and as the focus of various US Congressional hearings. At the Senate Judiciary Committee's hearing this Wednesday, some platform CEOs will face tough questions about their poor track records in protecting kids and teens on their social media sites and apps. We hope the hearing will serve as a public forum and add momentum to passing much-needed regulation of platform features that put child safety at risk.
Former Meta employee and concerned father Arturo Béjar's recent testimony before the Senate Judiciary's Privacy, Technology, and the Law Subcommittee summarized the harms that children face on Meta's platforms and the company's failure to acknowledge them or implement policies to address them. Béjar's testimony is just one instance in a long line of revelations about the multitude of harms young users face on social media platforms, from Frances Haugen's bombshell Facebook Papers to a lawsuit filed by the relatives of 65 victims who died from drugs purchased on Snapchat. Notably, these dangers are rarely disclosed by the platforms themselves, which often push back against allegations that they hide or misrepresent harms. However, it should not take a brave whistleblower, a state lawsuit, or the death of a child to bring to light the unsafe experiences minors are having on social media platforms, let alone to compel the platforms to address and fix them.
As an organization that conducts research on hate, extremism and disinformation online, we are acutely aware of how harmful content manages to remain online and make its way into minors' feeds, despite being banned by platforms' own terms of service. We have been documenting the types of harmful content recommended or shown to minors' accounts on various platforms, adding to the public record evidence of how unsafe the online environment can be for kids.
Recently, we found content glorifying mass shooters readily available to minors on platforms including TikTok, Telegram, Discord and X. In October 2023, we found more than 300 posts or videos across Instagram, TikTok, and Snapchat showing extremely graphic, distressing or violent imagery of the conflict between Hamas and Israel to accounts registered as 13-year-olds on those platforms. In August 2021, our analysis found that extreme-right Discord servers can operate as safe spaces for young people curious about extremist ideologies, giving them access to explicit materials.
Time and time again, the platforms have responded to allegations of harm by downplaying the issues or announcing vague, incomplete protective measures. Even when it is revealed that a platform knew of an issue, there is often accompanying evidence that, instead of addressing the harm, the platform chose to ignore it. These companies have repeatedly proven that they are either choosing to underinvest in identifying and mitigating risks to children and teens, or that they do not care about the harm inflicted in exchange for engagement and increased profits. Either way, the platforms have shown themselves to be irresponsible stewards of safe and age-appropriate experiences for minors online.
This is why we need Congress to act now to protect kids and teens online. The multitude of harms facing young people on social media (graphic and violent content, sexual harassment, content promoting eating disorders, cyberbullying) will only continue without regulation that sets the minimum standards platforms must meet to protect young users. Other jurisdictions have recognized and acted to fill this urgent need: the European Union passed the Digital Services Act (DSA) in 2022, the United Kingdom passed the Online Safety Act (OSA) last year, and Australia passed its own Online Safety Act in 2021. These laws provide vital protections for young users online by requiring large digital platforms to take safeguarding actions, such as restricting ads targeted at kids, limiting the amount of personal information that can be collected from children, increasing transparency so researchers can identify harms and inform policy, publishing risk assessments, and making it easier for young users to report harmful or illegal content on the platform.
We need those protections for young users in the US. We are encouraged by Congress' serious consideration of proposed child safety legislation, such as the Kids Online Safety Act and an amendment to the Children's Online Privacy Protection Act, and by its ongoing engagement with civil society groups to ensure that any regulation passed protects all users online. Wednesday's Senate Judiciary hearing also provides an important opportunity to create a public record: platform CEOs will be confronted with the harms facing young users on their platforms, and senators can demand accountability from company leadership.