Fuck, this planet is doomed I stg

  • FuckyWucky [none/use name]
    ·
    edit-2
    10 months ago

    Are they talking about AI-generated CSAM? I've read about AI-generated CSAM being indistinguishable from the real thing. I guess an argument could be made that it's 'better' that actual children aren't being abused, but the reality is these generators are trained on real images of CSAM, and the pedos clearly aren't looking at it as 'art'. I do not think any normal person would make this shit.

    Good report on this.

    • BodyBySisyphus [he/him]
      ·
      10 months ago

      How tf are they hoovering up CSAM in the training data? It's not like that stuff is commonly indexed by search engines, and you'd think they'd know to check the hashes. Lazy-ass dystopic tech overlords.

      • zifnab25 [he/him, any]
        ·
        10 months ago

        CSAM exists on the internet, and the web crawlers that hoover up data aren't particular about their source of material. It's just one big vacuum absorbing every image or text file that isn't behind security.

        Consequently, there's a ton of training data on how to reproduce and replicate these images.

        • drhead [he/him]
          ·
          10 months ago

          That data goes through a lot of filtering and classification before it actually gets set up as a training dataset, though. LAION, for instance, has an NSFW classifier score for each item, and people training foundation models will usually drop items that score high on that (rough sketch of that filtering step below). The makers of the dataset almost certainly ran it against the NCMEC database at least, and anything in it that was discovered and taken down since then won't be accessible to anyone pulling images, since the dataset is just a set of links. People who make foundation models do NOT want to have it get out that they may have had that in their training sets and tend to aggressively scrub NSFW content of all kinds.

          It's a lot more likely that people are just relying on the AI to combine concepts to make that stuff (contrary to what everyone is saying, you don't actually need something to be in the dataset if it exists as a combination of two concepts), or they are training it in themselves with a smaller dataset, which sounds like what might have happened here. If there was anything like that in any of the common datasets used for training a large model from scratch, it would almost certainly have been found and reported by now; the only way that could conceivably not happen is if its captions have nothing to do with the content, which would not only get the image dropped on its own, but would also make it entirely useless for training.
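
          To be concrete about that filtering step: here's a minimal sketch of what dropping link-metadata rows by NSFW score and known-bad hash might look like. The column names, the threshold, and the file name are made up for illustration; this isn't LAION's or anyone else's actual pipeline.

          ```python
          # Hypothetical sketch: pre-filter a LAION-style table of image links
          # before anything is downloaded for training. Column names ("url",
          # "nsfw_score", "sha256") and the 0.98 threshold are placeholders.
          import pandas as pd

          NSFW_THRESHOLD = 0.98               # drop anything the classifier flags above this
          KNOWN_BAD_HASHES: set[str] = set()  # would be filled from an industry hash-matching program

          def filter_links(meta: pd.DataFrame) -> pd.DataFrame:
              """Keep only rows below the NSFW threshold and not on the hash blocklist."""
              keep = (meta["nsfw_score"] < NSFW_THRESHOLD) & ~meta["sha256"].isin(KNOWN_BAD_HASHES)
              return meta[keep]

          if __name__ == "__main__":
              meta = pd.read_parquet("links_shard_000.parquet")  # hypothetical metadata shard
              print(f"kept {len(filter_links(meta))} of {len(meta)} rows")
          ```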

          • zifnab25 [he/him, any]
            ·
            10 months ago

            people training foundation models will usually drop items that score high on that

            Not if they're freaks. And it isn't as though there's a shortage of that in the tech world.

            People who make foundation models do NOT want to have it get out that they may have had that in their training sets and tend to aggressively scrub NSFW content of all kinds.

            I doubt the bulk of them give it serious thought. The marketing and analytics guys, sure. But when that department is overrun with 4chan-pilled Musklovites, even that's a gamble. Maybe someone in Congress will blow a gasket and drop the hammer on this sooner or later, like Mark Foley did back in the '00s under Bush. But... idk, it feels like all the tech giants are untouchable now anyway.

            they are training it in themselves with a smaller dataset which sounds like what might have happened here

            Also definitely possible, which would get us much closer to "this is obviously criminal misconduct because of how you obtained the dataset" than to "this is obviously morally repugnant while living in a legal gray area".

            • drhead [he/him]
              ·
              10 months ago

              I doubt the bulk of them give it serious thought.

              I mean, I've seen what OpenAI did for their most recent models: they spent about half of the development time cleaning the dataset, and yes, a large part of their reason for that is that they want to present themselves as "the safe AI company" so they can hopefully get their competitors regulated to death. Stable Diffusion openly discloses what they use for their dataset and what threshold they use for NSFW filtering, and for SD 2.0 they ended up using an extremely aggressive filter (filtering everything with a score above 0.1, where most would use something like 0.98 -- for example, here is an example of an absolutely scandalous photo that would be filtered under that threshold, taken from a dataset I assembled), so aggressive that it actually made the model significantly worse at rendering humans in general and they had to train it for longer to fix it. Like... all evidence available points to them being, if anything, excessively paranoid.
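
              For a sense of scale, here's a toy comparison of how much a 0.1 cutoff throws away versus a 0.98 cutoff; the scores are simulated and only stand in for real classifier output.

              ```python
              # Toy illustration only: fraction of a corpus surviving a 0.98 vs a 0.1 NSFW cutoff.
              # Scores are simulated; a real classifier's output will be distributed differently.
              import numpy as np

              rng = np.random.default_rng(0)
              scores = rng.beta(0.5, 8.0, size=1_000_000)  # skewed toward "safe", like most web images

              for cutoff in (0.98, 0.1):
                  print(f"cutoff {cutoff}: keeps {np.mean(scores < cutoff):.1%} of images")
              ```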

        • BodyBySisyphus [he/him]
          ·
          10 months ago

          Cool, you learn a new horrible truth about the corrupt underbelly of our civilization every day

      • SerLava [he/him]
        ·
        edit-2
        10 months ago

        They probably don't even need CSAM; the AI can probably just be running through cycles like "make the image look more like a porn photo" and "make the image look more like a child" simultaneously, or something. I have heard they do use actual CSAM, though. But it would probably have to be filtered at the user input stage in the generator itself.

    • LaughingLion [any, any]
      ·
      edit-2
      10 months ago

      I hate these fucks so goddamn much I can't even contain it. Look, I like to look at lewd cosplay and shit on DeviantArt. I know what I'm about. But if you do that now, it's filled with tons of AI models in half-nude and full-nude poses. So fucking many of these are so goddamn young-looking and it's sickening. Like pre-pubescent. Could I sift through them and report them as I see them on my recommends page? I could, but I don't want to do that. That means I need to click them and SEE them. I don't want to see that shit. I don't want the psychic harm of wading through questionably legal and absolutely unethical AI CSAM just to report it to the fucking DeviantArt automated system that may or may not even take any action.

      Fuck the people who defend this; you are ruining good, consensual adult content, you sick fucks.

      I'm okay with AI art in general. I can be swayed. Obviously every technology has a drawback. An evil use. You'd think we could all agree on this one. Like, hey, don't make AI CP, you creeps. But nope, the libertarian debate-lords can't let us have this easy W. FUCK YOU, YOU'VE DESTROYED MY EROTIC ART SPACES, YOU SICK FUCKS

  • aaaaaaadjsf [he/him, comrade/them]
    ·
    10 months ago

    libertarian-alert

    Seriously, what is it about the internet that allows people to seriously defend pedo ideology? If you said this in the real world, there's a high chance you'd get punched in the face.

  • Flyberius [comrade/them]
    ·
    10 months ago

    These fucking idiots, man. They act as if their actions exist in a vacuum. I have seen the same argument used to justify the possession of CSAM, and they seem to believe that them seeking it out doesn't create a demand for more of it to be made.

    And yeah, as others have pointed out, AI generators do not produce their images via magic. They are trained on CSAM, and this creates demand for larger and larger datasets of training material.

    • zifnab25 [he/him, any]
      ·
      10 months ago

      they seem to believe that them seeking it out doesn't create a demand for more of it to be made.

      Idk. At a certain point, absent explicit monetization, I think there's a disconnect between the folks who create these trophy cases for their crimes and people who get off on it voyeuristically. The original sickos aren't going away simply because the voyeurs are cut off.

      This online behavior persists for the same reason a host of other toxic and unwanted online behavior exists. It's far more time-consuming and exhausting to hunt it down and stamp it out than it is to produce and circulate it. And in a country that hates children and loves porn, we're simply not going to invest the kind of resources we use to harass POC or suppress unions or pad out the local blue-shirt gestapo with no-show jobs into curtailing this shit.

      What really ends up propagating this shit isn't the pervs gleefully browsing another layer of smut. It's the massive industrial media centers that blindly endorse participation to gin up their performance figures without wanting to know what's actually circulating under the hood.

    • Zuberi 👀@lemmy.dbzer0.com
      hexagon
      ·
      10 months ago

      Per the article in question, I believe the man used photos he took from inside his psychological practice. AI deepfakes trained on real children, with the clothes removed.

      For "art"

      • FuckyWucky [none/use name]
        ·
        10 months ago
        mention of child abuse

        pedophiles are known to use the 'art' excuse with real CSAM too. Ukraine was infamous in the 2000s for being a producer of 'naturist' films depicting children naked without any explicit sexual acts. Because it sat in a legal gray area, they were able to sell it as 'art' to Westerners who made payments with credit cards (not cash/crypto).

        article

    • Saeculum [he/him, comrade/them]
      ·
      10 months ago

      AI image generators can combine concepts to create new images that are not similar to any contained in the dataset.

  • BelieveRevolt [he/him]
    ·
    10 months ago

    Too bad we're not federated with that instance, so the red fascists can't go call them pedos sicko-wistful

    • BeamBrain [he/him]
      ·
      10 months ago

      I'd rather not be federated with these people, thanks

  • CrimsonSage [any]
    ·
    10 months ago

    Like this fucking AI-humping pedo doesn't get that AI doesn't make anything. What does this moron think the AI scraped to generate his "art"?

  • RyanGosling [none/use name]
    ·
    edit-2
    10 months ago

    If a computer or artist generated your gross pedo art completely independent of real life references or any kind of references, I would say whatever, just stay away from my kid.

    But when you’re licking your lips over how AI has consumed so many photos of children without their consent that it can produce realistic results, then you’re quite literally exploiting children and deserve to get your ass kicked.