Helping Child Survivors: The Fight to Remove Sex Abuse Images

04-15-2024

Millions of images and videos of children being sexually abused, some as young as infants and toddlers, are circulating on the internet. More than 27,400 of these children have been identified and rescued by law enforcement, but many of their images continue to be traded for years by people who derive pleasure from viewing them.

So how does the National Center for Missing & Exploited Children (NCMEC) help stop re-victimization of these children, which occurs every time someone views their images online? The same way police solve crimes in the real world: fingerprints.

Every image and video file has a unique digital fingerprint, known as a “hash.” When reports of suspected child sexual exploitation are sent to the CyberTipline here at NCMEC – a staggering 36 million last year alone, or nearly 100,000 a day – we use technology to compare the hashes of the reported images with known child sexual abuse material (CSAM). That helps us support survivors by working with tech companies to have their images removed and by identifying potential new victims suffering ongoing abuse.
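As a rough illustration only (the article does not specify which hashing technology NCMEC uses), the Python sketch below computes an exact cryptographic hash (SHA-256) of a file and checks it against a set of known hashes. In practice, systems of this kind often also rely on perceptual hashes, such as Microsoft’s PhotoDNA, which can match copies that have been resized or re-encoded; the hash value and file handling shown here are placeholders.

```python
import hashlib
from pathlib import Path

# Hypothetical known-hash list. In practice this would be a curated list of
# verified CSAM hashes like NCMEC's; the value below is a placeholder.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_hash(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known(path: Path) -> bool:
    """Return True if the file's hash appears on the known-hash list."""
    return file_hash(path) in KNOWN_HASHES
```

Note that an exact hash like SHA-256 changes completely if a file is altered by even one byte, which is why perceptual hashing matters for matching near-duplicate copies.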

When NCMEC confirms an image contains CSAM after a triple-review process, we add it to a “hash list” that we share with electronic service providers around the world. On a voluntary basis, tech companies can use our hash list to scan their systems for CSAM so the abusive content can be removed and reported to our CyberTipline. The reports are then made available to the appropriate law enforcement agencies in this country and around the world for potential prosecution.
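The provider-side scanning step can be pictured the same way. Continuing the sketch above, and assuming a hypothetical upload directory (/srv/uploads) with a print-only response standing in for real removal and CyberTipline reporting:

```python
from pathlib import Path

def scan_tree(root: Path) -> list[Path]:
    """Walk a directory tree and collect files whose hashes match the list.

    Reuses file_hash() and KNOWN_HASHES from the sketch above. The scanning
    granularity is an assumption; the article says only that providers can
    voluntarily "scan their systems" against the hash list.
    """
    return [
        path
        for path in root.rglob("*")
        if path.is_file() and file_hash(path) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    for hit in scan_tree(Path("/srv/uploads")):  # hypothetical upload store
        # A real provider would remove the content and file a CyberTipline
        # report; this sketch only prints the match.
        print(f"match: {hit}")
```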

To promote transparency and verify that the hashes on our hash list correspond to CSAM meeting the federal legal definition of child pornography, NCMEC issued a request for proposal (RFP) for an independent third-party review. Through the RFP, Concentrix, a customer experience solutions and technology company, was selected to conduct the independent audit, which it performed for us at no cost. The goal of the audit, the first of its kind for any CSAM hash list, is to provide transparency into the contents of NCMEC’s hash list and to encourage more electronic service providers to use it to ensure CSAM is not hosted on their systems.

Concentrix, along with Webhelp, which has since merged with the company, performed separate, double reviews of the images and videos on NCMEC’s hash list. During the three-month audit, they used NCMEC’s established review system, technology and protocols, said David Slavinsky, the Concentrix site director for the audit, which was conducted at NCMEC’s regional office in Rochester, New York.

“The moderators’ review was based on their independent assessments for the images and videos with hashes included in the NCMEC CSAM hash list without intervention or direction by NCMEC,” Slavinsky said in his report.

The audit’s results were impressive: 99.99% of the 538,922 unique images and videos reviewed through this audit by Concentrix were assessed as containing verified CSAM. 

Some of the abuse in CSAM can be unfathomable. Repeatedly viewing these often-horrific images and videos of children being sexually abused, even raped, can take an emotional toll. When asked how she can do it over and over, one NCMEC analyst responded: “How can I not? I may be that child’s only chance to get help.”

Shelley Allwang, a director in our Exploited Children Division, said one of the major factors in selecting this company for the audit was its commitment to the wellness of its moderators.

Just as NCMEC does for its employees whose job is reviewing CSAM, the company took numerous precautions to protect the moderators’ emotional well-being, even flying in a clinical psychologist from India to sit with them each day, said Megan Dinan, the company’s vice president for service delivery. They were regularly monitored, offered counseling and required to take a break every two hours – either going outdoors or creating artwork – which helped keep intrusive thoughts at bay and enabled them to sleep at night, Dinan said. The team was proud to do the difficult but necessary job, “knowing it was for the greater good,” she said.

While federal law calls these images child pornography, NCMEC and a growing number of child-safety advocates do not. Child pornography implies that children have a choice. We call them what they really are: images of children being sexually abused. These aren’t innocent photos of children running naked through a sprinkler or babies in a bathtub. They’re crime scene photos, often showing children being sexually assaulted, forced into acts of bestiality, even raped.

The company wanted to do something to help sexually exploited children and was “honored” to partner with NCMEC. Concentrix performed the $300,000 audit as a donation to our non-profit organization, which enabled NCMEC to keep those funds in core programs that help serve victims and survivors of child sexual abuse.

"Our partnership with NCMEC aimed to prevent images and videos of child sexual abuse from ever being seen by the public,” said Chris Caldwell, president and CEO of Concentrix. “Our specialized team successfully conducted hundreds of thousands of audits and improved the data used to stop these images. While our moderators have been trained to face the difficulties of sensitive content, their dedication and resilience throughout this partnership serves as a testament to their passion and commitment to protecting children and supporting NCMEC's vital mission.”

Allwang is hopeful that the results of the audit will reinforce trust and lead to more electronic service providers using our hash list to proactively identify and remove CSAM from their platforms. It would be an enormous relief to survivors still suffering, often years after their abuse was captured in photographs or on video, she said.

Even after the abuse has been stopped, survivors of CSAM are stuck in a unique cycle of trauma. In communities around the globe, survivors live with the debilitating fear that the images and videos memorializing their sexual abuse as a child and shared on the internet will forever remain online for anyone to see. 

Many of these survivors are re-victimized as their images are shared again and again, often well into adulthood, and they constantly worry someone who has seen their images will recognize them anywhere they go. Hear their terrified voices: 

“I try to live as invisibly as possible, try to impress upon myself that the chance of recognition is really very small since I’m much older now. But the feeling persists.” 

“I do not want to socialize; I’m scared to step out of the door.” 

“I worry about this every day. I’m afraid for my children’s safety, try to avoid going out…[I’m] really paranoid when I take my kids to places like the zoo.” 

“I try to cover my face with my hair.”

“Imagine the worst or most traumatic thing that ever happened to you in your life being videotaped, then having millions of people seeing it and taking pleasure watching that,” said Allwang, who has literally watched some child victims grow up in CSAM over her many years working at NCMEC. “That’s what many survivors are living with every day.”

For more information about our CyberTipline, go to www.ncmec.org