News

More than 1,000 images of known child sexual abuse material (CSAM) were found in LAION-5B, a massive public dataset used to train popular text-to-image models such as Stable Diffusion, the Stanford Internet Observatory reported.