🚨 Exclusive: OpenAI used outsourced Kenyan workers earning less than $2 per hour to make ChatGPT less toxic, my investigation found (Thread) https://t.co/302G0z7vy3 — Billy Perrigo (@billyperrigo) January 18, 2023
Content moderators only seem to be necessary in spaces without real consequences. If you post CSAM in the friend group chat you are ostracized, in the work email list you are fired. You can't just make another account and keep doing it. Preventing repeat offenders means the volume of offensive material is so low that everybody can share the load and it's no big deal. Of course you don't need slaughterhouse workers if you don't eat flesh.
The internet will forever be a space without 'real consequences'; that's how it works.
Much as I hate and long for the destruction of Facebook, Reddit, Twitter etc. and a return to community forum days, your approach only works if there are no large spaces of any kind in which to post, because anywhere with like, >1000 users is gonna need dedicated moderators of some kind.
You are correct that the internet will forever be an anon hellscape. A lot of people have tried to improve it, e.g. Stack Exchange's reputation system, with mixed results. I don't think it should be this way though.
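For what it's worth, the Stack Exchange idea is basically privilege-gating: throwaway accounts can't do much until they've earned trust. A toy sketch of that (every threshold and privilege name here is invented, not SE's actual numbers):

```python
# Hypothetical reputation gate, Stack Exchange-style: new accounts
# unlock abilities gradually, so a fresh throwaway can't immediately
# flood a space. Thresholds are made up for illustration.

PRIVILEGES = {
    1: "post questions",
    15: "upvote",
    50: "comment everywhere",
    2000: "edit others' posts",
}

def allowed_actions(reputation: int) -> list[str]:
    """Return every privilege unlocked at or below this reputation."""
    return [name for threshold, name in sorted(PRIVILEGES.items())
            if reputation >= threshold]
```

The point of the design is that abuse costs time: banning an account destroys accumulated reputation, which is the closest thing to a "real consequence" an anonymous site can impose.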
> anywhere with like, >1000 users is gonna need dedicated moderators of some kind.
Yes, but not in the "removing bestiality posts" way. SRA national forums are this scale and mods just clean up flame wars. Probably because you need to pay $25 to get in.
A work slack where your posts can get you fired or arrested doesn't need this sort of dedicated abuse viewer position no matter how big it gets. HR may occasionally fire somebody who accidentally pastes a Pornhub link into #general but they're not getting PTSD from it.
But if it came down to it, I'd be comfortable giving up my Facebook car videos so that people didn't have to sit and watch CSAM all day. Not worth it.
Sure, so it's either very small or very tightly gatekept. Unless you clamp down on every single publicly accessible website, which isn't doable, this will be a thing.
I do want to note that such a clampdown is doable. Just not in the US under capitalism, probably. Big websites only enforce CSAM stuff now to avoid advertiser flight, and possibly legal consequences if it gets really bad. Enforcing ID-above-X-users, etc. would be about the same amount of coercion.
Anyway we've had slaughterhouses for a hundred years so mod PTSD is probably gonna be around for a long time.
It really isn't. Not if the internet is to exist in any meaningful and public form; the obstacles are virtually, if not literally, insurmountable. So long as anyone can get an IP address and access the internet, they can serve content and protocols/software can be made to browse that content as easily as we can on the internet today.
Much like governments of today wanting to break encryption, the only way to make this doable is to effectively defeat the whole point of computers.
If "ID-above-X-users" were made law, the biggest social sites would immediately require IDs, like Pornhub did in Louisiana recently. They might lobby against the law, but they're going to stay above ground because they are running a profitable business and they have shareholders and stuff. I think the majority of social media use is through companies like Facebook, Twitter, TikTok.
It's currently technically possible for anyone to make a CP website. They're rare because most countries will raid you if you do, going to great lengths to get you even if you're serving it over Tor. Same goes for drug markets. Everyone knows that if you build something like that, it's just a matter of time before you make one tiny opsec slip-up and go to jail. That's the level of coercion that has to be applied to get 99% compliance. And it can be done; it's being done right now with drug and CP websites. I like drug markets but they're super super niche. Most people don't even know they're real, Tor and crypto are technically intimidating, and they're constantly being shut down as LE plays whack-a-mole and operators exit scam. You can see how that does not translate well to making underground social media large enough to give mods PTSD. Posts on Dread get like 20 upvotes max.
And of course underground anonymous social media offers a degraded experience. Lots of normal people will be fine with aboveground sites that simply take their ID at signup, like Gmail asks for a phone number or NextDoor asks for your address. Anonymous sites will have a higher proportion of sickos posting PTSD content because that's the only place they can go, which drives away normal people, which makes the proportion of sickos higher, which drives away normal people, etc. It's what happened to "free speech" Voat, and why 4chan has 27 million monthly users compared to Instagram's 2 billion.
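That evaporative-cooling spiral can be sketched as a toy simulation (every parameter here is invented for illustration, not fitted to any real site):

```python
# Toy model of the feedback loop: each round, normal users churn out
# in proportion to how much of the site is "sickos", while the sicko
# population stays put, so their share ratchets upward. Parameters
# are invented, not empirical.

def simulate(normal: float, sickos: float, sensitivity: float = 0.5,
             rounds: int = 20) -> float:
    """Return the final sicko fraction after the churn loop."""
    for _ in range(rounds):
        sicko_share = sickos / (normal + sickos)
        normal *= (1 - sensitivity * sicko_share)  # normals leave
    return sickos / (normal + sickos)
```

Because the loop only ever shrinks the normal population, the sicko share grows monotonically with each round, which is the Voat dynamic in miniature.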
Sort of, CP sites are just as commonly black-holed by ISPs rather than actually shut down, so it's not that simple. And this is a level-of-demand thing. Tor isn't widely used, but if it were the only way to get certain media, it would quickly be made easier to use and much more popular.
And if the only workable solution is raiding and jailing anybody who runs a website without state-sanctioned ID verification, that seems a very heavy-handed approach with a million downsides, all for the sake of avoiding an excess of :freeze-peach: to moderate.
ISPs black-hole because they don't want to get in trouble. Same thing. There's a bunch of layers where actors decide to comply with the law because it's easiest. Like AWS will probably kick you off if they find out you're hosting CP on their servers. Underlying this is the actual threat of state force: besides making advertisers happy, AWS doesn't want to get raided.
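Mechanically, ISP black-holing is just the resolver lying: queries for a listed domain get a sinkhole address instead of the real one, so the site never "shuts down" but becomes unreachable for that ISP's customers. A minimal sketch (the domains and blocklist here are invented placeholders):

```python
# Minimal sketch of DNS black-holing: the resolver checks each query
# against a blocklist and answers with a sinkhole address (0.0.0.0)
# for anything listed, passing everything else through untouched.
# Domains below are placeholders, not real blocked sites.

BLOCKLIST = {"blocked.example"}

def resolve(domain: str, real_records: dict[str, str]) -> str:
    """Return the sinkhole for blocked domains, else the real A record."""
    if domain in BLOCKLIST:
        return "0.0.0.0"  # sinkhole: connections go nowhere
    return real_records.get(domain, "NXDOMAIN")
```

This is why it's the same compliance story as AWS: the ISP doesn't destroy anything, it just declines to route you there, because cooperating is cheaper than getting raided.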
It's a heavy-handed solution, but that's kind of the nature of any policy change beyond market rate adjustments. You can tweak bond rates or whatever, but when you want to outlaw something you gotta have state force backing it up. To have, e.g. OSHA standards, you need to be willing to fine and even shut down dangerous workplaces. Most businesses will mostly comply because being punished is unprofitable. This is about worker safety. Jobs where you have to look at cartel executions 8 hours a day shouldn't exist. Rather than e.g. legislate it to two hours but leave an obvious profit incentive for companies to skirt the law, it would be better to remove the profit incentive. Make that an economically unproductive activity, because advertisers don't want to advertise next to "cleaned" but still illegal anon social media.
I agree that the jobs shouldn't exist. But this proposed solution throws the baby, the mother and the whole dang household out with the bathwater. 'State ID the whole internet' is the kind of massively bureaucratic and overly authoritarian approach that I suspect would invalidate a government in any fair society.
I'm starting to realise that fair communistic work organisation solves the problem anyway - nobody will actually be coerced into that kind of shitty work 8 hours a day. If the work needs doing, it can be organised much better, and if it's not worth doing, then workers won't do it and the workplaces will find suitable tailored solutions.
What if in order to sign up you had to use an ID, regardless of site or something. Obviously that would be hell to implement right now under capitalism. But could it work under socialism? Doesn't China have a similar thing?
anonymity provides more good than harm imo - even if there's a socialist government in charge you might be queer & not want people knowing about that, or have gotten your arse kicked by the cops for no reason since cops are still fuckin cops, or (since you specifically mentioned China) have issues with your capitalist boss that you wanna talk about with other workers without endangering your livelihood. imo this slaughterhouse model is really a product of how current social media is structured. it's funny that this is my example but furaffinity has ~40k concurrent users at any given time, is constantly besieged by fascist spam, and relies on volunteer mods who don't seem too traumatised by the work. when you're actually part of the community then the horrid part is a much more manageable % of your life than when you show up to do a 9-to-5 in the Omelas box
Good points. I would say, with the Furaffinity example, you are kind of just hoping the volunteer mods aren't also trash people. Because yeah, FA is fairly decent but the other furry websites I've never heard of and don't go to frequently have really dipshit mods that are too interested in maintaining freeze peach to do anything about the Nazis or dillweeds posting pictures of their sona having a transphobic screed. Which I wouldn't know anything about because I'm not on those sites, but still.
Jokes aside I do agree, the way a site is setup tends to bring in a certain type of user or incentivize different behaviors so that's probably the best place to start.
It is impossible to clamp down on all possible sites on the internet at this stage. No matter what you do, search engines will exist, and to be honest I think they serve a very good purpose separate from making money.
Do you propose linking real identity to everyone’s browsing and removing all ability to post anonymously? Seems like a major privacy and opsec concern and a major win for global capitalist surveillance
Yup. I think in a good society, you would go to jail (or be reeducated) upon posting CSAM and so there wouldn't be an endless torrent of it into moderated spaces.
Content moderation is a sick job. Like slaughterhouse worker. A good society shouldn't require anybody to do that.
Sure seems to suck. But what's the alternative?
It's technically impossible to enforce for all websites; this is just the reality of how the internet works.
deleted by creator
I mean ideally re-educated so they don't do it anymore, this is all fantasy anyway
deleted by creator
Reddit's system of deputizing moderators. Fuck, /r/anarchism's system of user written rules and elected moderators is probably the best model.
That's fine with me. But at some stage, you're gonna have a space big enough that this needs dedicated work, whatever you do.
I remember the stories about facebook moderators and their mental health struggles seeing such shit daily.
I met a guy whose job was to review all suspicious content that was seized at pearson international. Dude seemed totally dead inside.