Better to have neither, though. Even virtual CSAM can encourage behavior that eventually leads to real children being abused.
My brother worked at an inpatient facility where a lot of people with pedophilia were being treated, and the providers would even instruct them not to watch certain movies and shows depicting children, much less something like virtual CSAM. It's a common internet myth that access to it reduces their chance of offending.
Do you have studies on this? I'm not convinced any given treatment facility is going to have good takes; conversion therapy is a thing they do at inpatient facilities. To be clear, I'm not saying your brother was doing conversion therapy, just that I prefer to base my opinions on research rather than on the practices of a treatment provider I know nothing about.
Not necessarily? We have AI that can de-age people, created without CSAM, so it stands to reason that it's possible. I still don't think it should be accessible, though. It still requires images of children, and that's uncomfortable enough for me, I think.
Something like Stable Diffusion can absolutely make things it's never been trained on. For example, if puppies are in its training set and poodles are in its training set, but not poodle puppies, it can still generate a poodle puppy.
Yes, but from what I know, every image AI is technically able to make "cp"; you don't need CSAM to get an AI to make it. The training data definitely includes actual children, but not necessarily from CSAM.