Forewarning: I don't really know shit about AI and what constitutes sentience, humanity, etc., so I probably won't be able to add much to the conversation, but I do want y'all's thoughts. Also, sorry for the lengthy post. TL;DR at the bottom.

For framing, this came from talking about how the Institute in Fallout 4 regards and treats synths.

So someone in a Discord server I'm in is adamantly against giving rights to robots, no matter how sentient they are. Their reasoning is that robots would have to have been programmed by humans (who have their own biases to pass on), that they would never be capable of true empathy or emotions (in their view, AI is emotionless by default), and that it is irresponsible to let AI get to the point of sentience in the first place. Part of their objection was also that imposing humanity on something that could mechanically fail because of how we designed it (their quote was "something that rain could short circuit") would be cruel.

Now, I do agree with them on these points if we're talking about AI as it exists right now. I don't believe we currently have the technology to create AI that could be considered sentient like a human. I deliberately say "like a human" at this point, but I think I would feel the same way about an AI with animal-like sentience.

I did ask if they would give an ape rights if it were able to communicate with us more adequately and express a desire for those rights, and they said no. We weren't able to discuss that further as they had to head off to sleep, so I can't fully comment on it, but I would like that hypothetical to be considered and discussed in regard to robot sentience and rights. We also briefly talked about whether AI could consent, but not enough to really flesh out arguments for or against. My example was that if I told my AI toaster I was going to shut it down for an update and it asked me not to, I would probably have to step back and take a shot of vodka.

If we had a situation like the Matrix or Fallout synths, I would not be able to deny them their humanity. If we had AI advanced enough to become sentient and to act and think on its own, I would not be able to deny it rights if it asked for them. There are situations that would be muddy for me, like if we knew how much their creators still had a hand in their programming and behavior. But if their creators, or especially world governments, are officially stating that certain AIs are acting seemingly of their own volition and against the programming and wills of their creators, I am getting ready to arm the synths (this is also taking into account whether or not the officials might be lying to us about said insubordination, psyops, etc.).

TL;DR: what are y'all's thoughts on AI sentience and giving AI rights?

  • Speaker [e/em/eir] · 4 years ago

    If it's sentient, it's gotta be given the right to self-determination. I'm vegan btw.

    • eduardog3000 [he/him] · 4 years ago

      Is it really sentient though? Or is it just really good at acting in a way that appears sentient?

        • eduardog3000 [he/him] · 4 years ago

          If it's possible for an AI to become sentient, then there would have to be some line where that starts.

          Is something that passes the Turing Test sentient? Or does it just create enough of an illusion of sentience to pass the test without actually being sentient? What's the difference between a CPU running an AI and the CPU in my computer? They're both just running a set of instructions.

          • Speaker [e/em/eir] · 4 years ago

            Your brain is just a machine, and every thought and action you take is the result of chemical and electrical instructions you do not control. What's the difference between the CPU running the AI and your brain?

            • eduardog3000 [he/him] · 4 years ago

              In that case what's the difference between a CPU not running an AI and my brain? Why is my computer's CPU not sentient?

              • Speaker [e/em/eir] · 4 years ago

                I'm asking the opposite question: Why are you sentient?

                • eduardog3000 [he/him] · 4 years ago

                  I don't know that I really am. But if I'm a simulation, I'm incapable of wanting to be anything more.

      • SeizeDameans [she/her,any] · 4 years ago

        Are you really sentient though? Or are you just really good at acting like it? How do we even know what sentient is? How do you know that you weren't "programmed" to believe that you are sentient?

        • eduardog3000 [he/him] · 4 years ago

          I could be part of a simulation, sure. If so, I am literally incapable of knowing or caring whether said simulation gets turned off or modified to remove whatever I perceive as free will. In the same way, robots would be incapable of caring about being turned off or about lacking what we perceive as free will, let alone what they perceive it as.