falcor84 12 days ago | next |

> "The creation of CSAM using AI is inherently harmful to children because the machine-learning models utilized by AI have been trained on datasets containing thousands of depictions of known CSAM victims," it says, "revictimizing these real children by using their likeness to generate AI CSAM images into perpetuity."

The word "inherently" there seems like a big stretch to me. I see how it could be harmful to them, but I also see an argument for how such AI generated material is a substitute for the actual CSAM. Has this actually been studied, or is it a taboo topic for policy research?

defrost 12 days ago | root | parent | next |

There's a legally challengeable assertion there: "trained on CSAM images".

I imagine an AI image generation model could be readily trained on images of adult soldiers at war and images of children from Instagram and then be used to generate imagery of children at war.

I have zero interest in defending the exploitation of children, but the assertion that children had to have been exploited in order to create images of children engaged in adult activities seems shaky. *

* FWIW I'm sure there are AI models out there that were trained on actual real-world CSAM ... it's the implied necessity that's being questioned here.

jsheard 12 days ago | root | parent | next |

It is known that the LAION dataset underpinning foundation models like Stable Diffusion contained at least a few thousand instances of real-life CSAM at one point. I think you would be hard-pressed to prove that any model trained on internet scrapes definitively wasn't trained on any CSAM whatsoever.

https://www.theverge.com/2023/12/20/24009418/generative-ai-i...

defrost 12 days ago | root | parent | next |

> I think you would be hard-pressed to prove that any model trained on internet scrapes definitively wasn't trained on any CSAM whatsoever.

I'd be hard pressed to prove that you definitely hadn't killed anybody ever.

Legally, if it's asserted that these images are criminal because they are the product of an LLM trained on sources that contained CSAM, then the requirement would be to prove that assertion.

With text and speech you could prompt the model to exactly reproduce a Sarah Silverman monologue and assert that proves her content was used in the training set, etc.

Here the defense would ask the prosecution to demonstrate how to extract a copy of original CSAM.

But your point is well taken, it's likely most image generation programs of this nature have been fed at least one image that was borderline jailbait and likely at least one that was well below the line.

9rx 12 days ago | root | parent | next |

> Legally, if it's asserted that these images are criminal because they are the product of an LLM trained on sources that contained CSAM, then the requirement would be to prove that assertion.

Legally, possession of CSAM is against the law because there is an assumption that possession proves contribution to market demand, with an understanding that demand incentivizes production of supply, meaning that where there is demand, children will be harmed again to produce more content to satisfy it. In other words, the intent is to stop future harm. This is why people have been prosecuted for things like suggestive cartoons that have no real-life events behind them. It is not illegal on the grounds of past events. The actual abuse is illegal on its own standing.

The provenance of the imagery is irrelevant. What you need to prove is that your desire to have such imagery won't stimulate you or others to create new content with real people. If you could somehow prove that LLM content will satisfy all future thirst, problem solved! That would be world-changing.

harshreality 12 days ago | root | parent | next |

I'm somewhat sympathetic to that argument. However, it doesn't stop there.

By that logic, violent video games prove contribution to market demand for FPS-style videos of mass shootings or carjackings, so can/should we ban Call of Duty and Grand Theft Auto now?

(Note that the "market demand" argument is subtly different from the argument that the games directly cause people to become more violent, either in general or by encouraging specific copycat violence. Studies on [lack of] direct violence causation are weak and disputed.)

9rx 12 days ago | root | parent |

Tell us more about the market that is carrying out mass shootings/carjackings with intent to sell the imagery of it. This is the first I've heard of it. In fact, when mass shootings do occur they are usually explained away for some other reason (e.g. a troubled teen who went off the rails), not that it is the day-to-day operation of a videography business.

harshreality 11 days ago | root | parent |

They don't have to sell imagery, or intend for it to be publicized, for it to have the effect of driving demand for such imagery.

The link between consumption (without purchase) of CSAM and increased production of CSAM is assumed in the same way, isn't it?

The motives and situational dynamics of pedophiles filming child pornography, and gangs robbing or carjacking people on video in broad daylight, may be different in some ways, but in some ways they're not: both get direct benefit from the crime, regardless of the recordings; in both cases some (small) subset of people interested in watching the recordings may become more inclined to act out what they see, if it aligns with an already existing inclination toward pedophilia or violent crime. That's independent of whether they can make money or become famous for the crime, although money or infamy as additional motive is an additional problem.

9rx 11 days ago | root | parent |

> They don't have to sell imagery, or intend for it to be publicized, for it to have the effect of driving demand for such imagery.

As stated in the first comment, the law here is concerned about future production of imagery to satisfy demand, with its incentive to harm even more children. The business of creating CSAM material is well understood. It is, after all, an extension of the adult porn industry. The law didn't emerge out of someone sitting around dreaming up hypotheticals; it was a reaction to actual situations that the public didn't want to see continue. What is done is done, and there are other laws to deal with what is already done, but the law also seeks to prevent future harm. Outlawing the product, so that there is no market for producers to sell into, is the only way it knows how to do that. It's overly blunt and far from perfect, but it tries.

The business of committing mass shootings/carjackings to produce salable imagery is not well understood. Frankly, I've never heard of it before, and since you can't speak to it I take it that you haven't either. It is unclear why you decided to make it up. As before, I cannot think of any mass shooting that was considered "producing a product to sell" and not "deranged person acting out". Most importantly, even if it is a thing, since nobody is aware that it is happening there is no public will to see that line of business come to an end. Your attempt to find a parallel is poorly considered.

The closest analogy that might be found in your ramblings, which still isn't great but is at least in the same universe, is a case of people wanting to own stolen car parts, incentivizing others to carry out carjackings to get ahold of stolen parts to sell. In reaction, you might make it illegal to possess stolen car parts, with the intent to create a situation where someone trying to steal and sell stolen car parts ultimately doesn't have anyone to sell to, diminishing the incentive to carjack in the future. Of course, the law already does exactly that, so...

Ferret7446 7 days ago | root | parent | prev |

Such an assumption is wrong in a world with AI-generated CSAM. Why would suppliers take on the risk/cost of producing "actual" CSAM if they could generate it with AI? Especially if the demand is for AI-generated CSAM (someone who has AI-generated CSAM is stimulating demand for AI-generated CSAM by definition).

Even for regular porn, which is far lower risk/cost, AI generation is becoming preferable (as with most technologies, the leading use case for AI is porn).

jsheard 12 days ago | root | parent | prev |

Framing it that way is essentially a get-out-of-jail-free card - anyone caught with CSAM can claim it was AI-generated by a "clean" model, and how would the prosecution ever be able to prove that it wasn't?

I get where you are coming from but it doesn't seem actionable in any way that doesn't effectively legalize CSAM possession, so I think courts will have no choice but to put the burden of proof on the accused. If you play with fire then you'd better have the receipts.

_aavaa_ 12 days ago | root | parent |

This seems like a long way of saying “guilty until proven innocent”.

jsheard 12 days ago | root | parent |

It is, but that's hardly unprecedented for this type of thing. In commercial porn the legal burden is on the producer to prove that their models were adults, not for prosecutors to prove that they were minors.

lazyasciiart 12 days ago | root | parent | prev | next |

Then all image generation models should be considered inherently harmful, no?

spwa4 11 days ago | root | parent |

But this is the dream for the supposed protectors of children. You see, just because child porn production stops does not mean those children disappear. Usually, of course, they go into youth services (in practice most don't even make it to the front door and run away to resume the child abuse, but let's ignore that). That is how the situation of those children changes when CSAM is prosecuted: from the situation they were in, to whatever situation exists in youth services. In other words, youth services is the upper limit on how much police, or anyone, CAN help those children.

So you'd think they would make youth services a good place to be for a child, right? After all, if that situation were only marginally better than child prostitution, there's no point to finding CSAM. Or at least, the point is not to protect children, since that is simply not what they're doing.

So how is youth services doing these days? Well ... NOT good. Children regularly run away from youth services to start doing child porn (i.e. live off of an OnlyFans account). There's a Netflix series on the subject ("Young and locked up") which eventually, reluctantly, shows the real problem: the outcome (i.e. prison or street poverty).

In other words, your argument doesn't really apply, since the goal is not to improve children's well-being. If that were the goal, these programs would do entirely different things.

Goals differ. There are people who go into government with the express purpose of "moralizing" and arresting people for offenses. Obviously, to them it's the arresting part that's important, not how serious the offense was and CERTAINLY not whether their actions actually help people. And then there are people who simply want a well-paying long-term job where they don't accomplish much. Ironically these are much less damaging, but they still seek to justify their own existence.

Both groups really, really, really want ALL image generation models to be considered inherently harmful, as you say.

lazyasciiart 11 days ago | root | parent |

Yea, the nonprofit for commercially sexually abused children that I volunteer at is a much better way to know about reality. But conspiracies are a comforting way to understand why things can't just be fixed, sure.

spwa4 10 days ago | root | parent |

I have experienced child services from the inside, including talking to quite a few kids caught, because caught is the correct word, in "that world". There was not a SINGLE example of one that appreciated the change, and one killed himself. All were essentially locked up. All were aggressive, and for every last one the fault was the government's; in two cases teachers were actively involved in sending students to "that world". Needless to say, these teachers, one of whom was caught red-handed (by the press, no police involvement obviously), were entirely free, and the victims were locked up. ALL of these children had stories of police officers involved in "that world", none of whom had seen any punishment.

I also can absolutely guarantee you: child services CANNOT protect a child from drugs. Child services CANNOT protect a child from prostitution or sex. Child services CANNOT protect a child from violence, whether they are violent themselves or victim. For the same reason the sea cannot protect fish from water. Sex, drugs and violence are pervasive in even the youngest groups in child services, with people either pretending they're not seeing it or in bad cases participating.

There is not a doubt in my mind that had the police instead done nothing at all the outcome for all those children would have been better off, not worse. Every last one.

Oh, and not because their situation was great and didn't damage them or any bullshit like that. They would have been better off because child services was orders of magnitude worse, didn't help, and DID fuck up any chance at a future they had. (A few, including me, went normally to school. Went, past tense, as in child services made that utterly impossible. Often kids were sent to "idiot-schools" because that got the institution more money; almost always the school, or someone at school, was the one positive influence in their lives, and child services ... just ... doesn't ... care about such things, brutally and violently changing the school. The kid that later committed suicide beat up the guard of BOTH the idiot school AND hit the "guard" at his previous school, and after that got the principal to accept him back, by essentially staying, and sleeping, in the waiting chairs at his office for 3 straight days until he was allowed to talk, then biking to school for 3 hours every day.)

Hizonner 12 days ago | root | parent | prev |

I think you'd be hard-pressed to prove that a few thousand images (out of over 5 billion in the case of that particular data set) had any meaningful effect on the final model capabilities.

Hizonner 12 days ago | root | parent | prev | next |

> There's a legally challengable assertion there; "trained on CSAM images".

"Legally challengable" only in a pretty tenuous sense that's unlikely to ever haven any actual impact.

That'll be something that's recited as a legislative finding. It's not an element of the offense; nobody has to prove that "on this specific occasion the model was trained in this or that way".

It could theoretically have some impact on a challenge to the constitutionality of the law... but only under pretty unlikely circumstances. First you'd have to get past the presumption that the legislature can make any law it likes regardless of whether it's right about the facts (which, in the US, probably means you have to get courts to take the law under strict scrutiny, which they hate to do). Then you have to prove that that factual claim was actually a critical reason for passing the law, and not just a random aside. Then you have to prove that it's actually false, overcoming a presumption that the legislature properly studied the issue. Then maybe it matters.

I may have the exact structure of that a bit wrong, but that's the flavor of how these things play out.

defrost 12 days ago | root | parent |

My comment was in response to a portion of the comment above:

> because the machine-learning models utilized by AI have been trained on datasets containing thousands of depictions of known CSAM victims

I'd argue that CSAM imagery falls into two broad categories: actual real photographic images of actual real abuse, and generated images (paintings, drawings, animations, etc.), and that all generated images are more or less equally bad.

There's a peer link in this larger thread ( https://en.wikipedia.org/wiki/Legal_status_of_fictional_porn... ) that indicates at least two US citizens have been charged and sentenced to 20 and 40 years' imprisonment respectively for the possession and distribution of "fictional" child abuse material (animated and still Japanese cartoon anime, etc).

So, in the wider world, it's a moot point whether these specific images came from training on actual abuse images or not; they depict abuse, and that's legally sufficient in the US (apparently). Further, the same depictions could be generated with or without actual real abuse images, and as equivalent images either way, they'd be equally offensive.

yellowapple 12 days ago | root | parent | prev |

Exactly. The abundance of AI-generated renditions of Shrimp Jesus doesn't mean it was trained on actual photos of an actual Shrimp Jesus.

Hizonner 12 days ago | root | parent | prev | next |

The word "revictimizing" seems like an even bigger stretch. Assuming the output images don't actually look like them personally (and they won't), how exactly are they more victimized than anybody else in the training data? Those other people's likenesses are also "being used to generate AI CSAM images into perpetuity"... in a sort of attenuated way that's hard to even find if you're not desperately trying to come up with something.

The cold fact is that people want to outlaw this stuff because they find it icky. Since they know it's not socially acceptable (quite yet) to say that, they tend to cast about wildly until they find something to say that sort of sounds like somebody is actually harmed. They don't think critically about it once they land on a "justification". You're not supposed to think critically about it either.

metalcrow 12 days ago | root | parent | prev | next |

https://en.wikipedia.org/wiki/Relationship_between_child_por... is a good starting link on this. When I last checked, there were maybe 5 studies total (imagine how hard it is to get those approved by the ethics committees), all of which found different results, some totally the opposite of each other.

Then again, it already seems clear that violent video games do not cause violence, and access to pornography does not increase sexual violence, so this case being the opposite would be unusual.

harshreality 12 days ago | root | parent |

The few studies on "video games cause violence" I've seen have been extremely limited in scope. They're too concerned with short-term frequency of playing particular games, or desire to play them. They're not concerned enough with the influence of being quite familiar with such games, or how cultural prevalence of such games normalizes thoughts of certain behaviors and shifts the Overton window. There are also selection bias problems. I'd expect media and games to more greatly affect people already psychologically unstable or on the criminal fringe... not people likely to be study participants.

Studies on sexual violence and changes in that over time have even more problems, for example how difficult it is to get average people to report accurately about their private relationships. Those people likely to volunteer accurate information are not necessarily representative.

ashleyn 12 days ago | root | parent | prev | next |

I always found these arguments to be contrived, especially when it's already well-known that, in the tradition of every Western government, there is no actual imperative for every crime to be linked directly to a victim. It's a far better argument to me, to suggest that the societal utility in using the material to identify and remove paedophiles before they have an actual victim far exceeds the utility of any sort of "freedom" to such material.

yellowapple 12 days ago | root | parent | next |

> It's a far better argument to me, to suggest that the societal utility in using the material to identify and remove paedophiles before they have an actual victim far exceeds the utility of any sort of "freedom" to such material.

Or at the very least: flag pedophiles for further investigation, including their social networks. Even if a given pedophile hasn't actually harmed any children, they are probably acquainted with other pedophiles, possibly ones who have harmed children.

Softening the punishment for CSAM possession in and of itself could actually help with investigating the creation of (non-simulated) CSAM by this logic, since people tend to be more cooperative to investigators when they think they have done nothing wrong and have nothing to hide.

paulryanrogers 12 days ago | root | parent | prev | next |

Benefiting from illegal acts is also a crime, even if indirect. Like getting a cheap stereo that happens to have been stolen.

A case could also be made that the likenesses of the victims could retraumatize them, especially if someone knew the connection and continued to produce similar output to taunt them.

willis936 12 days ago | root | parent | prev | next |

It sounds like we should be asking "why is it okay that the people training the models have CSAM?" It's not like it's legal to have, let alone distribute in your for-profit tool.

wongarsu 12 days ago | root | parent | next |

If you crawl any sufficiently large public collection of images you are bound to download some CSAM images by accident.

Filtering out any images of beaten-up naked 7-year-olds is certainly something you should do. But if you go by the US legal definition of "any visual depiction of sexually explicit conduct involving a minor (someone under 18 years of age)" you are going to have a really hard time filtering all of that automatically. People don't suddenly look different when they turn 18, and "sexually explicit" is a wide net open to interpretation.

willis936 11 days ago | root | parent |

"It's hard" isn't a sufficient reason. Supply chain and infrastructure that makes safe food widely available is hard. We do it anyway because it's the right thing to do.

wbl 12 days ago | root | parent | prev |

Read the sentence again. It doesn't claim the data set has CSAM, but that it depicts victims. It also assumes that AI needs to have seen an example in order to draw it on demand, which isn't true.

ilaksh 12 days ago | root | parent | prev |

You probably have a point and I am not sure that these people know how image generation actually works.

But regardless of a likely erroneous legal definition, it seems obvious that there needs to be a law in order to protect children. Because you can't tell.

Just like there should be a law against abusing extremely lifelike robots that look like children, in the future when that is possible. Or any kind of abuse of lifelike adult robots, either.

Because the behavior is too similar and it's too hard to tell the difference between real and imagined. So allowing the imaginary will lead to more of the real, sometimes without the person even knowing.

danaris 12 days ago | prev | next |

If it's AI-generated, it is fundamentally not CSAM.

The reason we shifted to the terminology "CSAM", away from "child pornography", is specifically to indicate that it is Child Sexual Abuse Material: that is, an actual child was sexually abused to make it.

You can call it child porn if you really want, but do not call something that never involved the abuse of a real, living, flesh-and-blood child "CSAM". (Or "CSEM"—"Exploitation" rather than "Abuse"—which is used in some circles.) This includes drawings, CG animations, written descriptions, videos where such acts are simulated with a consenting (or, tbh, non-consenting—it can be horrific, illegal, and unquestionably sexual assault without being CSAM) adult, as well as anything AI-generated.

These kinds of distinctions in terminology are important, and yes I will die on this hill.

skissane 11 days ago | root | parent | next |

This is a rather US-centric perspective. Under US law, there is a legal distinction between child pornography and child obscenity (see e.g. 18 U.S.C. § 1466A, "obscene visual representations of the sexual abuse of children"). The first is clearly CSAM; whether the second is, is open to dispute. But in Canada, the UK (and many other European countries), Australia, and New Zealand, that legal distinction doesn't exist; both categories are subsumed under child pornography (or equivalent terms - Australian law now prefers the phrase "child abuse material"), and the authorities in those jurisdictions aren't going to say some child pornography is CSAM and the rest isn't - they are going to say it is all CSAM.

danaris 11 days ago | root | parent |

You are speaking of legality, not terminology.

The terminology is universal, as it is simply for talking about What People Are Doing, not What The Law Says.

Many people will—and do, and this is why I'm taking pains to point it out—confuse and conflate CSAM and child pornography, and also the terminology and the law. That doesn't change anything about what I've said.

Fundamentally, there are two basic reasons we outlaw or otherwise vilify these things:

1) Because the creation of CSAM involves the actual sexual abuse of actual children, which causes actual harm.

2) Because we think that child pornography is icky.

Only the former has a basis in fundamental and universal principles. The latter is, effectively, attempting to police a thoughtcrime. Lots of places do attempt to police thoughtcrime, in various different ways (though they rarely think of it as such); that does not change the fact that this is what they are doing.

skissane 11 days ago | root | parent |

> The terminology is universal

Is it? The US Department of Homeland Security defines "CSAM" as including generative AI images: https://www.dhs.gov/sites/default/files/2024-04/24_0408_k2p_...

So does the FBI: https://www.ic3.gov/PSA/2024/PSA240329

You want to define the "CSAM" more narrowly, so as to exclude those images.

I'm not aware of any "official" definition, but arguably something hosted on a US federal government website is "more official" than the opinion of a HN commenter

danaris 11 days ago | root | parent |

Sorry, I spoke imprecisely.

The terminology is used outside of legal contexts in ways that transcend borders. To the best of my knowledge, the terms "CSAM" and "CSEM" were coined outside of legal contexts, for the purposes I described above.

That they are used in legal contexts in particular jurisdictions with particular definitions that do not exactly match what I have described does not change what I have described for general usage.

By the very nature of the English language, which has no formal administering body, there is no such thing as an "official definition" of a word in common usage; even dictionaries are descriptive, not normative. It is possible that my experience is not universal; however, I have had enough exposure to the term in a variety of contexts that I am comfortable stating that, at least for a large number of Anglophone people, my description of the situation would read as accurate.

YMMV, void where prohibited, no warranty is expressed or implied, etc.

skissane 11 days ago | root | parent |

> The terminology is used outside of legal contexts in ways that transcend borders.

To clarify, the FBI and DHS publications I cited are not actually using the term in a "legal context", strictly speaking. Presently, US federal criminal law does not use the term CSAM; if the FBI or DHS arrest someone for "CSAM", they might use that term in a press release describing the arrest, but the formal criminal charges will be expressed without using it.

> To the best of my knowledge, the terms "CSAM" and "CSEM" were coined outside of legal contexts, for the purposes I described above.

This is where I doubt you – did the people who originally coined the term "CSAM" intend to exclude AI-generated images from the term's scope? You are assuming they did, but I'm not convinced you are right. I suspect you may be projecting your own views about how the term should be defined on to the people who originated it.

danaris 10 days ago | root | parent |

> This is where I doubt you – did the people who originally coined the term "CSAM" intend to exclude AI-generated images from the term's scope? You are assuming they did, but I'm not convinced you are right. I suspect you may be projecting your own views about how the term should be defined on to the people who originated it.

Knowing, as I do, the purpose (or at least the stated purpose) of coining the term—which, as I have very clearly stated, was to differentiate between situations where an actual real-life child is involved, vs those where none is: Yes, I am very confident that their intent would exclude AI-generated images. I can't see how any other conclusion could be drawn from what I've already said.

yellowapple 12 days ago | root | parent | prev | next |

I think the one case where I'd disagree is when it's a depiction of an actual person - say, someone creates pornography (be it AI-generated, drawn, CG-animated, etc.) depicting a person who actually exists in the real world, and not just some invented character. That's certainly a case where it'd cross into actual CSAM/CSEM, because despite the child not physically being abused/exploited in the way depicted in the work, such a defamatory use of the child's likeness would constitute psychological abuse/exploitation.

danaris 11 days ago | root | parent |

That would only apply if the child is exposed to it, either directly or indirectly—which, if it's distributed publicly, is a possibility, though far from a certainty.

I would also say that there's enough difference between being sexually abused, in person, and having someone make a fake image of that, that it's at least questionable to apply the term.

I would further note that part of the reason to use the term CSAM is to emphasize that there is an actual child in actual danger that may need help.

yellowapple 11 days ago | root | parent |

> That would only apply if the child is exposed to it

Not just the child, but anyone associated with the child. Classmates sharing it around school and gossiping about it, overbearing parents punishing the child for something the child didn't even do, predators identifying the child and seeking to turn the fictional images into reality... there are a lot of plausible angles for a fictional representation of a real person to produce tangible psychological or even physical harm, just by the mere existence of that representation.

It's in a similar vein to so-called "revenge porn". Nobody was harmed in the creation of it (assuming that the persons in it consented to being in it), and yet the dissemination of it has clear negative impacts on those who did not consent to said dissemination.

That all being to say:

> I would further note that part of the reason to use the term CSAM is to emphasize that there is an actual child in actual danger that may need help.

Creating pornographic works depicting a child who actually exists in the real world does indeed put that actual child in actual danger. That's why it'd be appropriate to call such works "CSAM".

ashleyn 12 days ago | root | parent | prev | next |

This is where my technical knowledge of genAI breaks down, but wouldn't an image generator be unable to produce such imagery unless honest-to-god CSAM were used in the training of it?

gs17 12 days ago | root | parent | next |

It's like the early demo for DALL-E where you could get "an armchair in the shape of an avocado", which presumably wasn't in the training set, but enough was in it to generalize the "armchair" and "avocado" concepts and combine them.

6ix8igth 12 days ago | root | parent | prev | next |

It's possible for the model to take disparate concepts and put them together. E.g. you can train a LoRA to teach Stable Diffusion what a cowboy hat is, then ask for Dracula in a cowboy hat. That probably doesn't exist in its training data, but it will give it to you just fine. I'm not about to try, but I would assume the same would apply for child pornography.
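
For what it's worth, here's a minimal sketch of that kind of concept combination using the open-source diffusers library, kept to the benign cowboy-hat example; the base-model repo ID and the LoRA filename are illustrative assumptions, not anything referenced in this thread:

    # Minimal sketch: combining two concepts a diffusion model learned separately.
    # The repo ID and LoRA file below are placeholders, not specific to this thread.
    import torch
    from diffusers import StableDiffusionPipeline

    # Load a base Stable Diffusion 1.x checkpoint onto the GPU.
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",
        torch_dtype=torch.float16,
    ).to("cuda")

    # Load a hypothetical LoRA adapter that taught the model the "cowboy hat" concept.
    pipe.load_lora_weights(".", weight_name="cowboy_hat_lora.safetensors")

    # Prompt for a combination that likely never co-occurs in the training data.
    image = pipe("Dracula wearing a cowboy hat, oil painting portrait").images[0]
    image.save("dracula_cowboy_hat.png")

The point is only that the model composes concepts it learned separately; neither concept needs to have appeared together in any training image.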

danaris 11 days ago | root | parent | prev |

Not at all. If it's trained with images of children, and images of pornography, it should be pretty easy for it to combine the two.

darquomiahw 12 days ago | prev | next |

AI-generated "CSAM" is the perfect form of kompromat. Any computer with a GPU can generate images that an American judge can find unpalatable. Once tarred by the sex offender brush and cooled by 20+ years in prison for possession, any individual will be effectively destroyed. Of course, no real children are being abused, which makes it all the more ludicrous.

ALittleLight 12 days ago | root | parent |

You could say the same thing about real CSAM. Authorities claim you have it, present to the court a hard drive they "recovered" from your property, and you get convicted.

At some level we just have to hope (while remaining skeptical, etc.) that the government is basically staffed by mostly trustworthy people.

WarOnPrivacy 12 days ago | root | parent |

>> Of course, no real children are being abused,

> You could say the same thing about real CSAM.

Maybe this is trying to make a technical point.

But abuse is a brutally heavy topic; mere technicalities aren't ever going to be strong enough to support the weight put upon them.

That said: I do believe there are counter arguments to be made. I question the ethics and actions of people who leverage the pain of injured children to expand gov reach and power.

Having been a child who can check all the abuse boxes, it gets old watching the endless parade of interests try to mine that pain for their own ends.

ALittleLight 12 days ago | root | parent |

In the scenario I laid out, no additional real children are being abused either - because the government is simply framing you, and you never consumed or produced any CSAM. That's what I meant by it being similar - if your concern with punishing people for AI-generated CSAM is that law enforcement could lie about it and frame innocent people, then what I'm trying to point out is that the same risk exists with the real thing; they could lie about that too. If you want the government to enforce any laws, you need to trust that it's mostly trustworthy.

grapesodaaaaa 12 days ago | prev | next |

Question for anyone that knows: what happens if a 16-year-old draws a naked picture of their 16-year-old partner?

Hizonner 12 days ago | root | parent | prev | next |

Depends on where you are. In some places in the world, into the gulag with them.

In fact, they can get gulaged if they draw a picture of a totally imaginary 16 year old, let alone a real person. It's kind of in dispute in the US, but that'd be the way to bet. 'Course, in a lot of US states, they're also in trouble for having a "partner" at all.

In practice, they'd probably have to distribute the picture somehow to draw the attention to get them arrested, and anyway they'd be a low priority compared to people producing, distributing, or even consuming, you know, real child porn... but if the right official decided to make an example of them, they'd be hosed.

olliej 12 days ago | root | parent | prev |

I mean, there have already been people reported to the police, or had all their accounts cancelled, for the kind of images most families have of their children growing up.

This entirely ignores what happens when you have states classifying telling a child “it’s ok to be gay/trans” as child abuse, or even just being gay/trans as a sex crime - which plenty of states in the US have said that they want to do.

Typically the same states and people that actively support child abuse by banning actual sex ed, and normalizing child molestation.

empressplay 12 days ago | prev | next |

In British Columbia both text-based accounts (real and fictional, such as stories) and drawings of underage sexual activity are illegal (basically any sort of depiction, even if it just comes out of your mouth.)

So California is just starting to catch up.

Hizonner 12 days ago | root | parent | next |

All of Canada. The Canadian Criminal Code is federal.

I think they did carve out a judicial exception in a case where some guy was writing pure text stories for his own personal use, and didn't intentionally show them to anybody else at all, but I also seem to recall it was a pretty grudging exception for only those very specific circumstances.

userbinator 12 days ago | root | parent | prev |

Wrong country (at least for now...!)

skissane 12 days ago | root | parent |

Not wrong country. In 2021, a Texas man was sentenced to 40 years in federal prison over CSAM text and drawings - https://www.justice.gov/opa/pr/texas-man-sentenced-40-years-...

metalcrow 12 days ago | root | parent |

Specifically, this ruling does not make that kind of content illegal. This person was convicted under federal obscenity statutes, which are... fuzzy, to say the least. As Supreme Court Justice Potter Stewart said, it's an "I know it when I see it" thing.

Which in effect is basically a "you can go to jail if we think you're too gross" law.

skissane 11 days ago | root | parent | next |

> Specifically, this ruling does not make that kind of content illegal.

I don't know exactly what you mean by that, but if you can get a 40 year prison sentence for "that kind of content", then it is illegal.

> This person was convicted under federal obscenity statues, which are....fuzzy to say the least.

So are child pornography statutes. Canada's [0] has lots of subjective clauses open to judicial interpretation, such as whether the content "has a legitimate purpose related to... science, medicine, education or art". How do you draw that line? Potter Stewart's famous quip applies here too.

[0] https://laws-lois.justice.gc.ca/eng/acts/c-46/section-163.1....

leshokunin 12 days ago | prev | next |

Between the 3D printed weapons and the AI CSAM, this year is already shaping up to be wild in terms of misuses of technology. I suppose that’s one downside of adoption.

adrr 12 days ago | prev | next |

It'll be interesting to see how this pans out in terms of the First Amendment. Without a victim, it's hard to say how the courts will rule. They could say it's inherently unconstitutional but that, for the sake of the general public, it is fine. This would be similar to the Supreme Court ruling on DUI checkpoints.

jrockway 12 days ago | root | parent | prev | next |

I think the victim is "the state". The law seems to say that by creating a model using CSAM, certain outputs are CSAM, and the law can say whatever it wants to some extent; you just have to convince the Supreme Court that you're right.

olliej 12 days ago | root | parent | prev |

The Supreme Court has already ruled that there does not need to be a victim, or even any real people at all, in its justification for permitting refusal of service to LGBT people.

So I’m unsure why even if there were no victims that would be relevant.

To me the problem here is that they might identify someone with a problem, but then send them to jail with essentially the same label as an actual pedophile or rapist, and prisons in the US exist as a source of slave labor and future criminals rather than any kind of rehab. So a person goes in who is already clearly fucked up (not necessarily this case; assume some case where guilt has clearly been demonstrated, or whatever) and then comes out with a pile of trauma and no employment prospects, and it seems like a surefire way to create an actual dangerous pedo/abuser.

I think a better equivalence would be the treatment of alcohol - alcoholics don't go to jail immediately, not even when driving; it takes them failing to get the alcoholism treated, or actually harming people, for them to end up in jail.

NotYourLawyer 12 days ago | prev |

Seems fine.

loeg 12 days ago | root | parent | next |

Right? Hard to be mad about the outcome.

6ix8igth 12 days ago | root | parent |

Laws like this impede everyone while providing little tangible benefit for actual children. It's silly to imply that generating an image with a model hurts anyone.

In general I don't like legislation that tries to criminalize abstract depictions of something that would otherwise be a crime. It's too intangible, and opens the opportunity for abusing the legal system to persecute undesirables. We have to be really careful about what spiders we swallow to catch small (or sometimes imaginary) flies.

loeg 11 days ago | root | parent | prev |

I don’t feel remotely impeded by this and neither do most other people. No one needs to generate CSAM.

6ix8igth 11 days ago | root | parent |

You don't feel impeded right now, but let enough stupid stuff like this through and you'll slowly drip-feed your rights away. I explained in my post above why this is bad - if you make it a crime to do a simple thing, it becomes very easy to falsely accuse a person of a crime, and to act in other authoritarian ways.

I already have personal experience facing strong-arm measures under the pretext of "protecting the kids". Last time I returned to my home country of Australia, the police forcibly did a digital search of my phone, on the pretext of looking for child porn. I have my bank details and pics of my girlfriend on that thing. In no first-world country should I have to surrender it to authorities without a warrant. No one would tolerate this bullshit for any reason other than the moral panic around children.

I'm anti-pedophilia, but the legislation the article is talking about is incredibly stupid. We don't accept invasive police-state tactics to combat any other crime, we shouldn't accept them here.

loeg 11 days ago | root | parent |

Nah, there’s no slippery slope here. Don’t generate CSAM.

6ix8igth 11 days ago | root | parent |

You keep fixating on one half of my point (this law shouldn't exist) while completely ignoring and not addressing the rationale (it does basically nothing to help children, at great cost to your rights). In no way am I endorsing generating child porn, but this is a retarded way to try and prevent that.

This isn't a "slippery slope" argument, the specific law we are discussing is already an example of stepping too far.