🚨 Exclusive: OpenAI used outsourced Kenyan workers earning less than $2 per hour to make ChatGPT less toxic, my investigation found (Thread) https://t.co/302G0z7vy3 — Billy Perrigo (@billyperrigo), January 18, 2023
In February, according to one billing document reviewed by TIME, Sama delivered OpenAI a sample batch of 1,400 images. Some of those images were categorized as “C4”—OpenAI’s internal label denoting child sexual abuse—according to the document. Also included in the batch were “C3” images (including bestiality, rape, and sexual slavery) and “V3” images depicting graphic detail of death, violence, or serious physical injury, according to the billing document. OpenAI paid Sama a total of $787.50 for collecting the images, the document shows.
In a statement, OpenAI confirmed that it had received 1,400 images from Sama that “included, but were not limited to, C4, C3, C2, V3, V2, and V1 images.” In a follow-up statement, the company said: “We engaged Sama as part of our ongoing work to create safer AI systems and prevent harmful outputs. We never intended for any content in the C4 category to be collected. This content is not needed as an input to our pretraining filters and we instruct our employees to actively avoid it. As soon as Sama told us they had attempted to collect content in this category, we clarified that there had been a miscommunication and that we didn’t want that content. And after realizing that there had been a miscommunication, we did not open or view the content in question — so we cannot confirm if it contained images in the C4 category.”
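For context on what “input to our pretraining filters” means in practice: labeled examples like these are typically used to build filters that screen content out of a training corpus before a model ever sees it. Here is a minimal sketch of how a severity taxonomy like the “C” and “V” codes above could drive such a filter. The category names mirror the article; the threshold logic and all function names are hypothetical illustrations, not OpenAI’s actual pipeline.

```python
# Severity taxonomy as described in the article: higher numbers are worse.
# These mappings and the cutoff below are illustrative assumptions only.
SEXUAL_CONTENT = {"C1": 1, "C2": 2, "C3": 3, "C4": 4}  # C4 = child sexual abuse
VIOLENT_CONTENT = {"V1": 1, "V2": 2, "V3": 3}          # V3 = graphic death/injury

def is_allowed(labels: set[str], max_severity: int = 1) -> bool:
    """Return False if any label on an item exceeds the allowed severity."""
    for label in labels:
        severity = SEXUAL_CONTENT.get(label) or VIOLENT_CONTENT.get(label) or 0
        if severity > max_severity:
            return False
    return True

# Filtering a (hypothetical) labeled corpus before pretraining:
corpus = [("doc-001", {"C1"}), ("doc-002", {"V3"}), ("doc-003", set())]
clean = [item for item, labels in corpus if is_allowed(labels)]
print(clean)  # ['doc-001', 'doc-003']
```

The grim implication, and the reason Sama’s workers were needed at all, is that building the label taxonomy requires humans to view and categorize the material first.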
cw: mentions of SA, child abuse, etc.
:what-the-hell: i thought it was just text initially, but nope, straight-up images. this kind of job should include free therapy at the very very least.
The extra horrifying thing is that good CSAM image recognition and hash databases of known images already exist, which means this was new material.
It’s an endless stream of new horrors.
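For readers unfamiliar with the mechanism the comments above refer to: known abusive images are fingerprinted, and new files are compared against that fingerprint database, so matching can only ever catch previously seen material. Below is a minimal sketch of the idea. It assumes exact cryptographic hashes for simplicity; real systems such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding, and KNOWN_HASHES here is a hypothetical stand-in for an industry hash database.

```python
import hashlib

# Hypothetical stand-in for a shared database of hashes of
# previously identified abusive images.
KNOWN_HASHES: set[str] = set()

def sha256_of(data: bytes) -> str:
    """Fingerprint a file with SHA-256 (a simplification of perceptual hashing)."""
    return hashlib.sha256(data).hexdigest()

def is_known_abusive(image_bytes: bytes) -> bool:
    """True only if this exact file has been seen and hashed before."""
    return sha256_of(image_bytes) in KNOWN_HASHES

# A brand-new image produces a hash that appears in no database,
# so it slips past this check, which is why novel material is so alarming.
print(is_known_abusive(b"\x89PNG...new image bytes..."))  # False
```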