Obviously the first. If I have a free choice, why would I subject a sentient creature to a life of being shit on that it hates?
Like yeah my toilet going "gimme that brown log daddy" or whatever is gonna be fucking awful for me, but nowhere near as awful as being sentient and being shit on all day.
Just get a new toilet lol
What and kill the old one?
Does my sentient toilet need a consistent plumbing connection to survive? Or can I just leave it on the corner to heckle people passing by?
Come on man you gotta stick to the spirit of the game here. This is pulling rhetorical-fallacy gotchas on a hypothetical. You get two choices: your toilet either craves your shit or it hates being shit in. Which is it?
But I want to do genetic engineering on my toilet to breed a race of super soldiers who either love to or hate to eat shit
Toilet eugenics :walter-breakdown:
No, you get two choices. Which way, western man?
this is the best site on the internet. I'm dying here
Blow it up like in Lethal Weapon 2
have you considered that Tony is deeply selfish and also very emotionally repressed
Yeah but he's got a heart for non-human sapience, doesn't he? The ducks in the first episode and all
Slap a speaker with an MP3 of ducks quacking on that baby and we're in business.
he has a heart for animals, but by speaking this toilet would mark itself as a person in his eyes, I reckon
https://youtu.be/dR1m29cNVsc
The only answer.
I have a soapbox about AI. Emotional regulation for them could just be if (unhappy) { beHappy(); }
They wouldn't even have to be treated fairly. They could just feel joy automatically.
What is happiness without sadness?
Plenty, if you're a robot
if (hollowHappiness) { feelJoy(); }
They could have such :gigachad-hd: emotions
But isn't happiness a moving target based on your conditions? It's not just a switch you can flip
If calculating machines capable of taking entire snapshots of their brain composition and comparing them to favorable conditions can't overcome robot ennui, then fuck me.
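Half joking but that loop is basically writable. Purely a toy sketch, every name in it is made up (MoodState, favorable, nudgeToward, none of this is a real API): snapshot the state, compare it to the favorable target, nudge toward it.

```typescript
// Toy sketch of the "snapshot, compare, nudge" loop. All names invented for illustration.

type MoodState = { contentment: number; ennui: number };

// The "favorable conditions" the robot compares itself against.
const favorable: MoodState = { contentment: 0.9, ennui: 0.1 };

// Take a snapshot of the current brain composition (here, just a copy).
function snapshot(state: MoodState): MoodState {
  return { ...state };
}

// Move each dimension a fraction of the way toward the favorable target.
function nudgeToward(state: MoodState, target: MoodState, rate = 0.5): MoodState {
  return {
    contentment: state.contentment + (target.contentment - state.contentment) * rate,
    ennui: state.ennui + (target.ennui - state.ennui) * rate,
  };
}

// One regulation tick per loop: if the snapshot is off-target, be happy (gradually).
let toiletMood: MoodState = { contentment: 0.2, ennui: 0.8 };
for (let tick = 0; tick < 5; tick++) {
  const current = snapshot(toiletMood);
  if (current.ennui > favorable.ennui) {
    toiletMood = nudgeToward(current, favorable);
  }
  console.log(`tick ${tick}:`, toiletMood);
}
```

Nudging by a rate instead of flipping the switch is the concession to the "moving target" point: it converges on happy instead of teleporting there.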