AI-generated CP should be made illegal even if its creation did not technically harm anyone. The reasoning: it presumably looks so close to real CP that it 1) normalizes consumption of CP, 2) grows a market for CP, and 3) lets makers of real CP get off the hook by claiming it's AI.
While there are similar reasons to be against clearly-not-real CP (e.g. hentai), that type at least doesn't have problem #3: there's no need to investigate whether a picture is real or not.
Fun fact: it's already illegal. If it's indistinguishable from the real thing, it's a crime.
I was under the impression that even clearly drawn material is already illegal, though it's a grey area since they can say "lol it's a 1000-year-old demon that just looks like a child." Is that not the case?
Clearly drawn material is hard to prosecute (and one might argue shouldn't be prosecuted, since obscenity laws are just… weird). However, photorealistic material can be treated, legally, like the real thing.
What the fuck is AI being trained on to produce the stuff?
If you have a soup of all liquids and a sieve that only lets coffee and ice cream through, it produces coffee ice cream (metaphor, don't think too hard about it).
That's how gen AI works: each step sieves out raw data to get closer to the prompt.
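The sieve metaphor can be sketched in a few lines of toy Python. This is a deliberately simplified, hypothetical stand-in for diffusion sampling: `target` here is a fixed list, whereas a real model (e.g. Stable Diffusion) would predict the denoised signal with a neural network at every step. No names below come from any real library.

```python
import random

def denoise_step(image, target, strength=0.1):
    """One pass of the 'sieve': nudge each value closer to the
    prompt's target, removing a fraction of the remaining noise."""
    return [p + strength * (t - p) for p, t in zip(image, target)]

# Hypothetical "prompt signal" -- a real model learns this, it isn't a fixed list.
target = [float(b) for b in b"coffee ice cream"]

# Start from pure noise, as diffusion models do.
image = [random.uniform(0, 255) for _ in range(len(target))]

# Each iteration sieves out more noise; after many steps the result
# is close to the target regardless of the starting noise.
for _ in range(50):
    image = denoise_step(image, target)
```

The point of the toy: nothing in the loop "looks up" a stored coffee-ice-cream image; it just repeatedly filters noise toward whatever the prompt demands, which is why combinations never seen in training can still come out.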
Pictures of clothed children and naked adults.
Nobody trained them on what things made out of spaghetti look like, but they can generate them because smushing multiple things together is precisely what they do.
Given the “we spared no expense” attitude to the rest of the data these things are trained on, I fear that may be wishful thinking…
Well, that’s somewhat reassuring.
Still reprehensible that it’s being used that way, of course.
AI CP seems like a promising way to destroy demand for the real thing. How many people would risk a prison sentence making or viewing the real thing when they could push a button and have a convincing likeness for free with no children harmed? Flood the market with cheap fakes and makers of the real thing may not find it profitable enough to take the risk.
I think it would boost the market for the real thing more.
It's possible that some people would get into AI-generated CP if it were simply allowed to be advertised on NSFW websites.
And that could lead some of them to seek out the real thing. I think it's best to condemn it entirely.
The biggest issue with this line of thinking is: how do you prove it's CP without a victim? I suppose at a certain threshold it becomes obvious, but that can be a very blurry line (there was a famous case where a porn star had to be flown to a court case to prove the video wasn't CP, but I can't find the link right now).
So you're left with a crime that was committed with no victim and no proof, which can be really easy to abuse.
This sort of reminds me of the discussion on "what is a woman." Is Siri a woman? Many might say so, but at the same time Siri is not even human.
The question of how old the person in a specific generated image might be, or whether it even depicts a person at all, can only be answered by society. There is no scientific or purely logical answer.
So there will always be grey areas, differing opinions, and different rulings in different cultures.
In the end, these are discussions about ethics, not logic.
Definitely, and that’s why hard/strict laws or rules can be dangerous. Much like the famous “I know it when I see it” judgment on obscenity.