
AI-generated child pornography threatens to overwhelm reporting system: Research

Child pornography generated by artificial intelligence (AI) could overwhelm an already inundated reporting system for online child sexual abuse material (CSAM), a new report from the Stanford Internet Observatory found. 

The CyberTipline, which is run by the National Center for Missing and Exploited Children (NCMEC), processes and shares reports of CSAM with relevant law enforcement for further investigation. 

Open-source generative AI models that can be retrained to produce CSAM “threaten to flood the CyberTipline and downstream law enforcement with millions of new images,” according to the report. 

“One million unique images reported due to the AI generation of CSAM would be unmanageable with NCMEC’s current technology and procedures,” the report said. 

“With the capability for individuals to use AI models to create CSAM, there is concern that reports of such content—potentially indistinguishable from real photos of children—may divert law enforcement’s attention away from actual children in need of rescue,” it added. 

Several constraints already exist on the reporting system. Only about 5 to 8 percent of reports to the CyberTipline result in arrests in the U.S., according to Monday’s report. 

Online platforms, which are required by law to report CSAM to the CyberTipline, often fail to complete key sections in their reports to the tipline. 

NCMEC also struggles to implement technological improvements and to retain staff, who are often poached by industry trust and safety teams. 

The nonprofit, which was established by Congress in the 1980s, has also run into legal constraints since courts in recent years have deemed it a governmental entity, the report noted. 

Fourth Amendment restrictions on warrantless searches now limit NCMEC’s ability to view files that platforms have not themselves reviewed, preventing it from vetting those reports and leaving law enforcement to waste time investigating non-actionable ones. 

The report recommended that tech companies invest in child safety staffing and implement the NCMEC reporting API to help ensure more effective tips. It also suggested that Congress increase NCMEC’s budget so it can offer competitive salaries and invest in technical infrastructure.

