News

A massive open-source AI dataset, LAION-5B, which has been used to train popular AI text-to-image generators like Stable Diffusion 1.5 and Google’s Imagen, contains at least 1,008 instances of ...
More than a thousand images of child sexual abuse material were found in a massive public dataset that has been used to train popular AI image-generating models, Stanford Internet Observatory ...
More than 1,000 known images of child sexual abuse material (CSAM) were found in a large open dataset—known as LAION-5B—that was used to train popular text-to-image generators such as Stable Diffusion ...
LAION-5B, a dataset used by Stable Diffusion, contained thousands of links to child sexual abuse material that could influence AI image generation. LAION-5B had thousands of problematic images ...
A massive public dataset used to build popular artificial intelligence image generators contains at least 1,008 instances of child sexual abuse material, a new report from the Stanford Internet Observatory ...
Getty Images is going all in to establish itself as a trusted data partner. The creative company, known for enabling the sharing, discovery and purchase of visual content from global photographers ...
Photos of Brazilian kids—sometimes spanning their entire childhood—have been used without their consent to power AI tools, including popular image generators like Stable Diffusion, Human ...