Happy 20th birthday to this young child! Good luck blowing out all those candles.

  • Frank [he/him, he/him] · 2 months ago

    That's the trick. It isn't anything. It's a statistical representation of patterns that appear in image data with the appropriate tag words. None of the "things" in this picture are things; there are no representations of actual objects. It's all math, heat maps, patterns in data. That's why it gets fucked up at the edges and boundaries of things. There's no mind that can understand the existence of abstract objects like "cake" and "candle" and "mom". It's just math putting colors where they're statistically most likely to go based on the input set, and those colors smear together at the edges.
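
    If it helps, here's a deliberately crude toy of what I mean. It's nothing like how the real models actually work, and everything in it (toy_images, pixel_mean, all of it) is made up for illustration, but it shows how you can get image-shaped output out of pure pixel statistics with no concept of "cake" or "candle" anywhere in the code:

    ```python
    # Crude caricature of "colors go where they're statistically most likely".
    # This is NOT a real diffusion model; it only shows that you can produce
    # image-shaped output from nothing but per-pixel statistics, with no concept
    # of "cake", "candle", or "mom" anywhere in the code.
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in "training set": 100 tiny 8x8 grayscale images (random values here,
    # standing in for real photos that all carried the same caption).
    toy_images = rng.uniform(0.0, 1.0, size=(100, 8, 8))

    # "Training" = recording where each pixel's values statistically tend to land.
    pixel_mean = toy_images.mean(axis=0)
    pixel_std = toy_images.std(axis=0)

    # "Generation" = sampling every pixel near its statistically likely value.
    # Each pixel is drawn independently, so anything that depends on object
    # structure (edges, finger counts, candle counts) comes out smeared or wrong.
    sample = np.clip(rng.normal(pixel_mean, pixel_std), 0.0, 1.0)
    print(sample.round(2))
    ```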

    That's the hardest part: up until literally four years ago, the only time we'd ever encounter an image that strongly resembled a human was if it was the work of another human attempting to depict a human. There's no attempt at depicting anything here. Nothing is depicted. This isn't an image of people. It's just the averages of a bunch of numbers with colors assigned to them. It's nothing. It's so fucking weird and disturbing, like we barely, as a culture, have a conceptual framework for this nightmare.

    • UlyssesT · 21 days ago

      deleted by creator

      • Frank [he/him, he/him] · 2 months ago

        I'm going back in time to give the luddites a mecha.

        I hope I'm right about how these things work, because I do not understand any of the math or programming behind them. All I know is that when a person draws a fucked up hand, it's because hands are really hard to draw: they take all kinds of different shapes based on what they're doing, and have many, many different planes and angles and perspective doohickeys. But a person knows a hand is a hand. Show them a human hand with five fingers, or six fingers, or a steel hook, or a grasping robot claw, or a lobster claw, or whatever, and they can identify all of those things as a category of objects called "hands".

        And the machines don't do that. There's no indication that the machine has a concept of an abstract object called "hand" that is the same object regardless of its actual appearance in a 2D image. There's no evidence of a concept of a squarish meaty bit with four thin meat sticks and a fat shorter meat stick. It's just blurs that resemble figures, because the machine doesn't "think" or "know" or "represent" anything when it outputs these images.

        And if I'm wrong, and there is something thinking in there, and it just fucks these things up because its entire perceptual world consists of datasets with no attached information or context? Fuck me, idk. The whole thing freaks me the fuck out, because we really are getting into the deep water of "if we did encounter a non-human intelligence, how would we even know?" The people who think these things have human-like intelligence are massively ignorant dorks, but I keep having spooky dreams about symbioses between trees and mushroom networks, or the way ant hives engage in complex problem solving based on the simple, rules-based actions of individual ants.
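
        The ant thing is at least something I can play with. Here's a toy version of the classic "double bridge" setup, with totally made-up numbers, not a claim about real ants or about how any of these models work: each ant just picks a route in proportion to pheromone, the short route gets topped up faster, and the colony ends up "choosing" the short route even though no individual ant ever compares the two.

        ```python
        # Toy "double bridge": two routes from nest to food, one short and one long.
        # Each ant's rule is dead simple (pick a route in proportion to pheromone),
        # yet the colony as a whole converges on the shorter route. Illustrative only.
        import numpy as np

        rng = np.random.default_rng(1)

        route_len = np.array([2.0, 5.0])   # short route vs long route (arbitrary units)
        pheromone = np.array([1.0, 1.0])   # no preference to start

        for _ in range(2000):
            # Individual rule: choose a route with probability proportional to pheromone.
            p = pheromone / pheromone.sum()
            route = rng.choice(2, p=p)
            # Shorter trips mean more frequent reinforcement; model that as a bigger
            # deposit per trip (1 / length), plus slow evaporation everywhere.
            pheromone[route] += 1.0 / route_len[route]
            pheromone *= 0.999

        print(p.round(3))   # most of the probability mass should end up on the short route
        ```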

        • UlyssesT · 21 days ago

          deleted by creator