How to Fix the Online Child Exploitation Reporting System

A new Stanford Internet Observatory report examines how to improve the CyberTipline pipeline, drawing on dozens of interviews with tech companies, law enforcement, and the nonprofit that runs the U.S. online child abuse reporting system.
  • The CyberTipline is enormously valuable, and leads to the rescue of children and prosecution of offenders.
  • Law enforcement officers are currently constrained in their ability to accurately prioritize CyberTipline reports for investigation. This is because:
    • Many online platforms submit low-quality reports. 
    • NCMEC has faced challenges in rapidly implementing technological improvements that would aid law enforcement in triage.
    • Legal constraints on NCMEC and U.S. law enforcement have implications for efficiency.
  • These issues would be best addressed by a concerted effort to massively uplift NCMEC's technical and analytical capabilities, which will require the cooperation of platforms, NCMEC, law enforcement and, importantly, the U.S. Congress.

The CyberTipline is the main line of defense for children who are exploited on the internet. It leads to the rescue of children and the arrest of abusers. Yet after 26 years many believe the entire system is not living up to its potential. A new Stanford Internet Observatory report examines issues in the reporting system and what the technology industry, the nonprofit that runs the tipline, and the U.S. Congress must do to fix it.

If online platforms in the U.S. become aware of child sexual abuse material (CSAM), federal law requires that they report it to the CyberTipline. This centralized system for reporting online child exploitation is operated by the National Center for Missing and Exploited Children (NCMEC), a nonprofit organization. NCMEC attempts to identify the location of the users who sent and received the abuse content, and may attempt to locate the victim. These reports are then sent to local or national law enforcement agencies in the U.S. and abroad.
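To make the flow above concrete, the sketch below models a report and NCMEC's routing step in Python. Everything here is hypothetical: the class, the field names, and the geolocation stub are our own illustration, not NCMEC's actual systems or API.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch of the reporting flow described above. All names
# and fields are hypothetical; they do not reflect NCMEC's real schema.

@dataclass
class CyberTiplineReport:
    reporting_platform: str
    file_hash: str
    file_attached: bool            # the file itself, not just a hash
    file_viewed_by_company: bool   # drives the warrant question discussed below
    upload_ip: Optional[str]       # used to locate the uploader
    incident_time: Optional[str]

def geolocate(ip: str) -> str:
    # Stub: a real system would query IP-geolocation data here.
    return "US" if ip.startswith("192.0.2.") else "non-US"

def route_report(report: CyberTiplineReport) -> str:
    """Approximate the triage step: locate the uploader, then forward
    the report to law enforcement in the matching jurisdiction."""
    if not report.upload_ip:
        return "unrouted: no location information in the report"
    return f"forwarded to law enforcement ({geolocate(report.upload_ip)})"
```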

An excerpt from the manual CyberTipline form for online platforms, as seen on February 12, 2024. This shows important file-level checkboxes, including "File Viewed by Company," "Potential Meme," and "Generative AI."

The report is based on interviews with 66 respondents across industry, law enforcement, and civil society. Researchers also visited NCMEC’s headquarters for three days of extensive interviews.

While our research focuses on CyberTipline challenges, we want to note that many respondents highlighted that the entire CyberTipline process is enormously valuable and the fact that U.S. platforms are required to report CSAM is a strength of the system. “The system is worth nurturing, preserving, and securing,” one respondent said.

Findings


Law enforcement officers are overwhelmed by the high volume of CyberTipline reports they receive. However, we find that the core issue extends beyond volume: officers struggle to triage and prioritize these reports to identify offenders and reach children who are in harm's way. An officer might examine two CyberTipline reports, each documenting an individual uploading a single piece of CSAM; upon investigation, one report might lead nowhere, while the other could uncover ongoing child abuse by the uploader. Nothing in the reports would have indicated which should be prioritized.

We identify three key challenges for law enforcement to prioritize reports for investigation.

First, while some tech companies are known for providing careful and detailed CyberTipline reports, many reports are low quality. Executives may be unwilling to dedicate engineering resources to ensuring that fields in the reporting API are completed accurately. Turnover on trust and safety teams, combined with a lack of documented reporting best practices, leads to gaps in knowledge and to inconsistent, less effective reporting. This is especially true for platforms that submit fewer reports. That said, submitting a high volume of reports does not necessarily correlate with submitting high-quality reports.

Second, NCMEC has faced challenges in rapidly implementing technological improvements that would aid law enforcement in triage. NCMEC faces resource constraints, and its salaries trail industry's, making it difficult to retain personnel, who are often poached by industry trust and safety teams. While there has been progress in report deconfliction (identifying connections between reports, such as identical offenders), the pace of improvement has been viewed as slow. Additionally, the varied case management interfaces that law enforcement agencies use to process CyberTipline reports make it difficult to ensure that linked reports are displayed together. Integration difficulties with external data sources, which could enrich reports and facilitate triage, are partly attributable to the sensitive nature of CyberTipline data and possibly to staffing constraints for technical infrastructure upgrades. Legal restrictions on NCMEC's use of cloud services hamper its ability to leverage advanced machine learning tools, although opinions vary on the appropriateness of cloud storage for such sensitive data.
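To illustrate what report deconfliction involves at its simplest, the sketch below groups reports that share an offender identifier, such as the same upload IP address or account email. The field names and data are made up for illustration; real deconfliction operates at far larger scale and over many more signals.

```python
from collections import defaultdict

def deconflict(reports: list[dict]) -> dict[str, list[str]]:
    """Map each offender identifier to the report IDs that share it.
    Field names ("upload_ip", "offender_email") are hypothetical."""
    by_identifier = defaultdict(list)
    for r in reports:
        for key in ("upload_ip", "offender_email"):
            if r.get(key):
                by_identifier[r[key]].append(r["report_id"])
    # Keep only identifiers seen in more than one report: these are the
    # linked reports that should surface together during triage.
    return {ident: ids for ident, ids in by_identifier.items() if len(ids) > 1}

reports = [
    {"report_id": "A-1", "upload_ip": "198.51.100.7"},
    {"report_id": "B-2", "upload_ip": "198.51.100.7", "offender_email": "x@example.com"},
    {"report_id": "C-3", "offender_email": "x@example.com"},
]
print(deconflict(reports))
# {'198.51.100.7': ['A-1', 'B-2'], 'x@example.com': ['B-2', 'C-3']}
```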

Third, there are legal constraints on NCMEC's and law enforcement's roles. A federal appeals court held in 2016 that NCMEC is a governmental entity or agent, meaning its actions are subject to Fourth Amendment rules. As a result, NCMEC may not tell platforms what to look for or report, as doing so risks turning the platforms into government agents too, converting what were once voluntary private searches into warrantless government searches (whose evidence must generally be suppressed in court). Consequently, NCMEC is hesitant to put best practices in writing. Instead, many trust and safety staff who are new to the CyberTipline process must learn from more established platforms or industry coalitions.

Another federal appeals court held in 2021 that the government must get a warrant before opening a reported file unless the platform viewed that file before submitting the report. Platforms often do not indicate whether content has been viewed; if they have not, then NCMEC, like law enforcement, cannot open those files. Platforms may automate reports to the CyberTipline on the basis of a hash match against known CSAM rather than having staff view each file, whether due to limited review capacity or a desire not to expose staff to harmful content. Where the platform did not view a reported file, law enforcement may need a warrant to investigate the report, and NCMEC currently cannot help with an initial review.
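A minimal sketch of the automated hash-match path described above: compute a file's hash, check it against a vetted list of known CSAM hashes, and, on a match, generate a report that truthfully records that no employee viewed the file. We use SHA-256 for simplicity; production systems typically rely on perceptual hashes such as PhotoDNA, and all names here are illustrative.

```python
import hashlib
from typing import Optional

# Illustrative only: in practice the known-hash list comes from vetted
# industry/NCMEC sources and uses perceptual hashing, not SHA-256.
KNOWN_CSAM_HASHES: set[str] = set()

def maybe_report(file_bytes: bytes) -> Optional[dict]:
    """Return a report stub if the file matches a known hash, else None."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    if digest not in KNOWN_CSAM_HASHES:
        return None
    # Automated match: no employee viewed the file, and the report must
    # say so; this is what triggers the warrant requirement above.
    return {"file_hash": digest, "file_viewed_by_company": False}
```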

This viewing requirement makes it difficult to process the high volume of reported viral and meme content. Such content is commonly shared widely, for example out of outrage or in a misguided attempt at humor; nevertheless, if it meets the definition of CSAM, it is still illegal and must be reported. Platform staff don't always review meme content (to avoid repeated, unnecessary exposure to known material), but when these reports with unviewed files are submitted without checking the CyberTipline form's "Potential Meme" box, law enforcement must do an enormous amount of work to close out these unactionable reports. Meanwhile, because platforms are required to preserve reported material for only 90 days, preserved content has often been deleted by the time law enforcement follows up with the platform in actionable cases.


Recommendations


  • Online platforms should prioritize child safety staffing, including expertise for in-depth investigations that proactively identify and address child sexual abuse and exploitation, in order to stay ahead of the measures bad actors take to avoid detection.
  • Platforms should invest dedicated engineering resources in implementing the NCMEC reporting API, and ensure there is an accurate and (where possible) automated process for completing all relevant fields. Our interviews suggest reports are more actionable when they provide offender information (including location information, particularly an upload IP address), victim information (including location information), the associated file (a hash alone is insufficient) or chat, and the time of the incident (including a description of how the platform defines the incident time); a hypothetical example of such a payload appears after this list.
  • To avoid state actor concerns, an NGO that is not NCMEC should publish the key CyberTipline form fields that platforms should complete to increase the likelihood that law enforcement will be able to investigate their reports.
  • Congress should increase NCMEC’s budget to enable it to hire more competitively in the technical division, and to dedicate more resources to CyberTipline technical infrastructure development. This funding should not be taken out of the budget for Internet Crimes Against Children Task Forces.  
  • NCMEC should prioritize investment in technical staff and the technical infrastructure of the CyberTipline to speed up implementation of their technical roadmap.
  • NCMEC and Internet Crimes Against Children Task Forces should partner with researchers to bring insights into the CyberTipline reporting flow along with the relationship between CyberTipline reports, arrests, and victim identification.
  • Congress should pass legislation that extends the required preservation period to at least 180 days, but preferably one year.
  • The U.S. Supreme Court should resolve the split in authority over whether the private search doctrine requires human review by platform personnel in order for law enforcement to open a file without a warrant, or whether the doctrine is satisfied where a reported file is a hash match for a previously viewed file.
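To illustrate the second recommendation above, here is a minimal sketch of a well-populated report payload built around the fields our respondents said make reports actionable. The field names and structure are hypothetical, not NCMEC's actual API schema, and the IP address comes from a documentation range.

```python
# Hypothetical example of an actionable report payload. Field names are
# illustrative and do not reflect NCMEC's real reporting API.
actionable_report = {
    "offender": {
        "username": "example_user",
        "upload_ip": "192.0.2.44",           # upload IP is especially useful
        "ip_capture_time": "2024-02-12T08:31:02Z",
    },
    "victim": {
        "location": None,                    # include whatever is known
    },
    "files": [
        {
            "sha256": "<file hash>",         # a hash alone is insufficient,
            "content_attached": True,        # so attach the file itself
            "viewed_by_company": True,       # governs the warrant question
            "potential_meme": False,
        }
    ],
    "incident_time": "2024-02-12T08:30:55Z",
    # How the platform defines "incident time":
    "incident_time_definition": "time the upload completed on our servers",
}
```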

Read More

  • Investigation Finds AI Image Generation Models Trained on Child Abuse: A new report identifies hundreds of instances of exploitative images of children in a public dataset used for AI text-to-image generation models.
  • Addressing Child Exploitation on Federated Social Media: A new report finds an increasingly decentralized social media landscape offers users more choice, but poses technical challenges for addressing child exploitation and other online abuse.
  • An update on the SG-CSAM ecosystem