Senator Grassley Exposes Tech Giants’ Inadequate Reporting on Child Sexual Abuse Materials
Senate Judiciary Committee Chair Chuck Grassley (R-IA) has opened a congressional inquiry into eight major technology companies, alleging that they failed to provide sufficient information to the CyberTipline, the national reporting system for suspected child sexual abuse material (CSAM) found on their platforms. The inquiry responds to troubling reports from the National Center for Missing and Exploited Children (NCMEC), which operates the tipline and says these companies have not adequately reported CSAM or related data concerning generative AI.
The Scope of the Inquiry
Grassley’s inquiry targets prominent companies including Meta, Amazon AI Services, TikTok, Snapchat, Discord, X.AI, Grindr, and Roblox. According to Grassley, these firms submitted over 17 million reports of suspected online child exploitation in 2025. However, they allegedly failed to provide NCMEC with critical location data and other essential information about users and suspects involved in these cases.
NCMEC has indicated that the tech companies also neglected to share data related to generative AI CSAM and did not report instances of "sadistic online exploitation targeting children." The organization emphasized that these companies' reporting of suspected CSAM is crucial to combating child exploitation: 81% of the reports received through NCMEC's CyberTipline in 2025 originated from these eight companies.
“For almost thirty years, NCMEC has worked tirelessly to combat online child sexual exploitation by attempting to persuade platforms to detect, report, and remove child sexual exploitation on their platforms and improve the quality and substance of their CyberTipline reports,” NCMEC stated in a communication to Grassley.
Implications for Child Safety
The implications of the inquiry are significant. NCMEC pointed out that while many tech firms frequently highlight the volume of reports they submit to the CyberTipline, they often fail to disclose that millions of these reports lack fundamental information. This gap leaves children vulnerable online, subjects survivors to revictimization, allows sexual offenders to operate freely, and squanders law enforcement resources.
Grassley is demanding that the eight tech firms respond to NCMEC’s allegations and provide detailed plans for improving their handling of cyber tips in the current year. The senator expressed alarm at the information shared by NCMEC, underscoring the urgency of the situation.
Detailed Findings from the Inquiry
Grassley provided detailed statistics illustrating how each tech giant interacted with NCMEC in 2025. Key findings include:
- Meta submitted nearly 11 million reports of suspected online child exploitation to NCMEC’s CyberTipline, but many of these reports allegedly contained “consistency and quality” issues that rendered them ineffective for law enforcement investigations.
- Amazon AI Services filed over 1.1 million tips in 2025, yet none could be acted upon due to a lack of location or suspect information.
- TikTok reported 3.6 million incidents, but many did not pertain to child exploitation. The platform informed NCMEC that it was prioritizing other high-priority items and could not commit to a timeline for rectifying these reporting issues.
Responses from Tech Companies
Roblox’s Chief Safety Officer stated that the company is reviewing Grassley’s letter and is committed to a productive dialogue with both the senator’s office and NCMEC to ensure the safety of children online.
Representatives from several of the other companies have also issued statements. A spokesperson for Meta emphasized that “child exploitation is a horrific crime,” asserting the company’s commitment to protecting children and assisting in bringing offenders to justice, and noted ongoing improvements based on feedback from NCMEC.
Discord highlighted its “longstanding, collaborative relationship” with NCMEC, emphasizing regular communication to fulfill reporting obligations and support the organization’s vital work.
Snap acknowledged Grassley’s concerns and stated that it has taken steps to enhance its reporting processes, improve data quality, and ensure law enforcement receives actionable information. The company reiterated its shared goal of protecting teens online and bringing perpetrators to justice.
Grindr expressed appreciation for Grassley’s concerns and welcomed the opportunity to detail its protections and policies for monitoring, identifying, and reporting CSAM to NCMEC. The company emphasized its commitment to preventing CSAM and maintaining a substantial moderation team to address accounts that may discuss topics related to minors.
The remaining companies named in the inquiry did not immediately respond to requests for comment.
Conclusion
The inquiry led by Senator Grassley sheds light on the critical role that technology companies play in combating child sexual exploitation online. The findings raise serious questions about the adequacy of reporting mechanisms and the responsibilities of these platforms in safeguarding children. As the investigation unfolds, the implications for policy, technology, and child safety remain paramount.
For further information on this developing story, refer to the original reporting source: therecord.media.