Hundreds of child sexual abuse images were found in an image dataset used to train AI models, according to an analysis by a Canadian children’s organization.
The Canadian Centre for Child Protection (C3P) analyzed the NudeNet dataset, which contains tens of thousands of images used by researchers to build AI tools for detecting sexually explicit content. The images are sourced from platforms such as social media and adult pornography websites, an Oct. 22 C3P press release noted.