Yeah...

  • Frank [he/him, he/him]
    hexagon
    ·
    1 year ago

    The human brain only generates the next set of neural activations given the previous set of neural activations

    :doubt:

    As near as anyone can tell, humans are not deterministic, there's a lot more to cognition than neuronal activity, and these "humans are analogous to computers" arguments are enormously reductive at best and usually just completely wrong.

    Building some novel plague is a possibility, and a good reason to burn all of these things and shoot the idiots who made them.

    • 0karin728 [any]
      ·
      1 year ago

      Yes, obviously it's an oversimplification, but fundamentally every computational system is either Turing complete or it isn't; that's the idea I was getting at. The human brain is not magic, and it's not doing anything that a sophisticated enough algorithm running on a computer couldn't do given sufficient memory and power.
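
      If it helps to see what that dividing line even means, here's a tiny toy illustration (my own example, nothing from this thread): a fixed finite-state machine has bounded memory, so no matter how it's wired it can't recognize arbitrarily deep balanced brackets, but the moment you allow one unbounded counter the problem is trivial. The point is that these dividing lines are about which computations a system can express, not what it's physically made of.

          def balanced(s: str) -> bool:
              # Needs unbounded counting, which no fixed finite-state machine has.
              depth = 0
              for ch in s:
                  if ch == "(":
                      depth += 1
                  elif ch == ")":
                      depth -= 1
                      if depth < 0:
                          return False
              return depth == 0

          print(balanced("(" * 1000 + ")" * 1000))   # True, at any nesting depth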

      • UlyssesT
        ·
        edit-2
        21 days ago

        deleted by creator

        • Frank [he/him, he/him]
          hexagon
          ·
          1 year ago

          Word. I don't see any bits getting flipped in the brain; this "brain as a computer" thing seems pretty sketchy.

          • 0karin728 [any]
            ·
            1 year ago

            Computational universality has nothing to do with digital computers flipping bits. It just means that all systems which manipulate information (perform computation) and can do so at a certain level of complexity (there are lots of equivalent ways of formulating it, but the simplest is that they can do integer arithmetic) are exactly equivalent, in that they can all do the same set of computations.

            It's pretty obvious that the human brain is at least Turing complete, since we can do integer arithmetic. It's also impossible for any computational system to be "more" than Turing complete (whatever that would even mean) since every single algorithm that can be computed in finite time can be expressed in terms of integer arithmetic, which means that a Turing machine could perform it.

            Obviously the human brain is many, many, many layers of abstraction and is FAR more complicated than modern computers. Plus neurons aren't literally performing a bunch of addition and subtraction operations on data; the point is that whatever they are doing must logically be equivalent to some incomprehensibly vast set of simple arithmetic operations that could be performed by a Turing machine, because if the human brain could do a single thing that a universal Turing machine can't, then doing it would either take infinite time or require infinite resources.
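
            If you want the "integer arithmetic is enough" idea made concrete, here's a minimal sketch (Python, with the instruction names and example program made up by me purely for illustration) of a counter machine whose only primitives are "add 1", "subtract 1", and "jump if a register is zero". Machines this bare-bones (Minsky counter machines) are already Turing complete, which is the sense in which being able to do integer arithmetic puts a system in the universal class.

                # Minimal counter-machine interpreter. Hypothetical mini-ISA for this sketch:
                #   ("INC", r)       -> registers[r] += 1
                #   ("DEC", r)       -> registers[r] -= 1   (programs only use it on nonzero registers)
                #   ("JZ", r, addr)  -> jump to addr if registers[r] == 0
                #   ("JMP", addr)    -> unconditional jump
                #   ("HALT",)        -> stop and return the registers
                def run(program, registers):
                    pc = 0
                    while True:
                        op = program[pc]
                        if op[0] == "INC":
                            registers[op[1]] += 1
                            pc += 1
                        elif op[0] == "DEC":
                            registers[op[1]] -= 1
                            pc += 1
                        elif op[0] == "JZ":
                            pc = op[2] if registers[op[1]] == 0 else pc + 1
                        elif op[0] == "JMP":
                            pc = op[1]
                        elif op[0] == "HALT":
                            return registers

                # Example: compute r0 + r1 using nothing but +1 / -1 / test-for-zero.
                add = [
                    ("JZ", 1, 4),   # 0: if r1 == 0, we're done
                    ("DEC", 1),     # 1: r1 -= 1
                    ("INC", 0),     # 2: r0 += 1
                    ("JMP", 0),     # 3: loop
                    ("HALT",),      # 4:
                ]
                print(run(add, {0: 3, 1: 4}))   # {0: 7, 1: 0}

            None of that is a claim about how neurons actually work, just a picture of how little machinery the "universal" threshold requires.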

          • UlyssesT
            ·
            edit-2
            21 days ago

            deleted by creator

            • 0karin728 [any]
              ·
              1 year ago

              This is why I fucking hate singularity cultist techbros. They convince the entire rest of society that AI is fake or that true AI is impossible or whatever by basically starting a religious cult around it.

              This is harmful because AI is incredibly dangerous, and we need people to acknowledge that and start taking action to ensure it's developed safely, so we don't suddenly have capabilities spike by 300% in one month and end up with 30% unemployment, or have a super-plague get released because ChatGPT 5 in 2026 told some idiot how to make flu viruses 10x more transmissible and 10x as deadly or whatever.

              • UlyssesT
                ·
                edit-2
                21 days ago

                deleted by creator

                • 0karin728 [any]
                  ·
                  1 year ago

                  My worry isn't sapient AI, I genuinely do not care whether it's sapient. My worry is that in the short term it will enable people to commit bioterrorism and mass-produce high-quality propaganda, and in the longer term that its capabilities might increase to the point of being difficult to control.

                  This is exactly the shit I'm talking about: you seem to dismiss the entire idea that AI might outstrip human intelligence (and that this would likely be very bad) out of hand. I think this is a mistake born from not being familiar enough with the field.

        • 0karin728 [any]
          ·
          1 year ago

          Do you even know what the Church-Turing thesis is?

    • UlyssesT
      ·
      edit-2
      21 days ago

      deleted by creator

      • Frank [he/him, he/him]
        hexagon
        ·
        1 year ago

        I want to be a hard determinist but those quantum mechanics assholes say there are truly random events at the quantum level so *shrug*

      • 0karin728 [any]
        ·
        1 year ago

        Ohhhhh, this is why your comment was so rude lmfao. Honestly fair. Sam Harris is a fucking idiot and I'm not a determinist, since quantum events are probably just random (though who the fuck knows tbh.)

        I am a strict materialist, but in more of a "everything can be explained by natural forces and interactions, by definition, because we are made of matter and something that wasn't composed of natural forces and interactions would be completely unobservable and therefore irrelevant" sort of way.