From a... idk... personal dignity perspective? this is a violation of another person's dignity. You're turning them into an object to be consumed without their knowledge or consent. It's dehumanizing. I'd argue it constitutes a form of social violence.
From a harm perspective, we've been talking about the potential for deepfake videos to be used for nefarious purposes for years. Someone could produce one of these fake pictures and use it to blackmail a victim. They could release it in a public venue and use it to destroy someone's reputation. You might be able to call this making blackmail material and apply blackmail laws.
Regardless of what we call it or how we do it, this is clearly a new category of criminal behavior that existing laws, and arguably morals, have never seriously accounted for. Producing not just images, but images indistinguishable from real photographs, has never been practical before. It required experts, specialist tools, and a great deal of time.
it's not about treats you ridiculous clown
It is, and the hostility in every single one of your comments in this post confirms that it is. You may not like artists for whatever reason but they're workers just like we are and their labor deserves to be properly compensated.
they deserve fair comp, but that doesn't mean you turn around and defend liberal ideas about intellectual property
it's not about the treats. i don't care about the damn treats. i'm too busy making my own stuff for free to care about entertainment media products i'm not gonna watch. if you're gonna accuse me of something, at least make it credible.
And no one is "defending IP". Learn the difference between owning the concept of a character/story and owning the actual art you make. A painting is not IP, it is a material creation that should be owned by its creator. AI isn't "stealing" ideas from people, it's taking the data from actual art itself.
It's like if I traced someone else's drawing and sold it as my own.
That's the treats part. The AI-generated images are the cheap treats made at the expense of artists.
The way Ligma and Catgirl are maligning others as "not communist" or "not materialist" enough for just being wary of and critiquing this technology... it feels very personal lol
E: Add Redbolshevik in there too
Even ignoring the personal attachment... it's just a weirdly wrong analysis of the situation.
The whole thing of calling artists petty bourgeois and equating protecting their livelihoods with worshiping "IP law" really reminds me of the patsoc line about baristas not being real proles.
Bringing it into the real world is probably worse, but the culture of doing that to begin with, even just in people's imaginations, probably wasn't great for society either.
this still sounds like the problem is some of the things people do with the doctored image but not all of the things they do with it.
like, we all agree writing fanfic about real celebrities is fucken weird and gross, but i don't think there's any harm in the adam/jamie weirdos or whatever... until you start sending it to the actual people or posting it in public somewhere they'd see it and people bother them about it... and the problem there is everything after the art, not the art itself.
If I draw a picture, or paint it, or do something similar, it's kinda obvious that the picture isn't "real." "Real" as in, a person who exists was actually doing the thing that I made a picture of.
Now, if we use AI to deepfake a nude, and it turns out well enough that it's really, REALLY hard to tell that the nude is totally fabricated, it can be used to get people fired, accused of pedophilia, adultery, etc. By the time anybody with the know-how can verify that the image is a fabricated deepfake, that person's reputation, career, or even life could be ended. Teachers in grade school in the USA have been fired over softcore/hardcore porn photos/movies they were in well before they started a teaching career.
Because an AI thing could easily be mistaken for a real nude, and drawings cannot.
The way I see it, the problem is more with distributing fictional (or real, ofc) sexually-explicit media of a real person without their consent.
It's creepy as hell to make it for yourself - don't. But that's not exactly a moral problem so much as a deeply unhealthy approach to relationships. Meanwhile, I'd consider it wrong to, say, read your erotic friendfiction out loud in the school cafeteria.
Would that be altered if you only did this because Tammy threatened to do it and you got bad advice from your mom? Yes. Yes it would.
I disagree. It is a moral problem. It's immoral to literally objectify someone like this, to turn them into an object. It might not cause direct harm to the individual, but the widespread acceptance of this ties directly into how normative sexual violence is. A society that normalizes this is a society that is making itself comfortable with completely, utterly dehumanizing someone and turning them into a sex object with no concern at all for that person as a human being.
It might not rise to the point of being called a crime, but this is horrifyingly socially corrosive. This directly increases our alienation and isolation from each other. You can get sexual gratification from someone without ever learning their name, without them ever knowing, by putting them in a machine that strips their clothes off and exposes them to you.
To put it another way: is being a peeping tom immoral? Is spying on someone while they're naked and have a reasonable expectation of privacy immoral? Is looking at someone's nudes when they did not share them with you immoral?
And if it is, then how does this meaningfully differ?
Is spying on a naked person only a crime if you are caught and it causes them distress? Or is it a crime even if they never know? Is it wrong to put a camera in someone's bathroom, even if they never suspect?
Well for the very last point, you're directly violating their privacy - observing them rather than creating a fictional depiction of them. It's worse, but that doesn't make it okay to create the fictional media. To repeat myself, that's not okay. Don't do it.
It's just not in any way an interaction with another person. The distinction between moral and practical rules is meaningless to me outside of interactions, so I'll leave it at "It's bad; don't do it."
that is weird, but why should the AI nude feel more wrong than the actual human creep doing it?
I forget who wrote this, but they were writing about dehumanization in its final form, and how you won't even GET to be a victim. Like it's a minor privilege to have victimhood, because a victim is a human. The "breakthrough" in AI deepfakes will be the ability to violate without creating victims. That's probably why it feels so unseemly: we're sensing something dehumanizing.