• Allero@lemmy.today · 3 days ago

    Why though? If it does reduce consumption of real CSAM and/or real-life child abuse (which is an “if”, as the stigma around the topic greatly hinders research), it’s a net win.

    Or is it simply a matter of spite?

    Pedophiles don’t choose to be attracted to children, and many struggle to keep their urges at bay. Traditionally, those looking for the least harmful release turned to real CSAM, which is obviously extremely harmful in its own right, just a bit less so than going out and raping someone. Now that AI-generated material exists, it may offer the safest of the highly graphic outlets we know of, with the least harm done to children. Without it, many pedophiles will revert to traditional CSAM, increasing the number of victims to meet the demand.

    As with many other things, the best we can hope for here is harm reduction. Hardline policies do not seem to be effective enough: people continuously find ways to distribute CSAM, and pedophiles continuously find ways to access it without leaving a trace. So we need to think of ways to give them something that will make them choose AI over real material. This means making AI better, more realistic, and at the same time more diverse. Not for their enjoyment, but to make them switch to something better and safer than what they currently use.

    I know this is a very uncomfortable discussion, but we don’t have a magic pill to eliminate it all, and so we must act reasonably to prevent what we can.

    • VoteNixon2016@lemmy.blahaj.zone · 2 days ago

      Because with harm reduction as the goal, the solution is never “give them more of the harmful thing.”

      I’ll compare it to the problem of drug abuse. You don’t help someone with an addiction by giving them more drugs; you don’t help them by throwing them in jail just for having an addiction; you help them by making it safe and easy to get treatment.

      Look at what Portugal did in the early 2000s to help mitigate the problems associated with drug use, treating it as a health crisis rather than a criminal one.

      You don’t arrest someone for being addicted to meth; you arrest them for stabbing someone and stealing their wallet to buy more meth. You don’t arrest someone just for being a pedophile; you arrest them for abusing children.

      > This means making AI better, more realistic, and at the same time more diverse.

      No, it most certainly does not. AI is already being used to generate explicit images of actual children. Making it better at that task is the opposite of harm reduction; it makes creating new victims easier than ever.

      Acting reasonably to prevent what we can prevent means shutting down the CSAM-generating bot, not optimizing and improving it.

      • Allero@lemmy.today · 2 days ago

        To me, it’s more like the Netherlands handing out free syringes and needles so that drug users at least wouldn’t contract diseases from used ones.

        To be clear: granting any and all pedophiles access to therapy would be of tremendous help. I think it must be done. But two issues remain:

        1. Barely any government will scrape together enough money to fund such programs, now that therapy is astronomically expensive.
        2. Even then, plenty of pedophiles will keep consuming CSAM, legally or not. There must be some incentive for them to choose the AI-generated option, which is at least less harmful than the alternative.