The police investigation remains open. The photo of one of the minors included a fly: the logo of Clothoff, the app presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”

  • JoBo@feddit.uk · 9 months ago

    I was replying to someone who was claiming they aren’t harmful as long as everyone knows they’re fake. Maybe nitpick them, not me?

    Real kids are harmed by AI CSAM normalising a problem they should be seeking help for, not getting off on.

      • JoBo@feddit.uk · 9 months ago

        Not getting beyond your first sentence here. I am not interested in what fucked up laws have been passed. Nor in engaging with someone who wants to argue that any form of child porn is somehow OK.

      • JoBo@feddit.uk · 9 months ago

        No I didn’t. Go nitpick someone else.

        Or better still, explain why you think AI-generated CSAM isn’t harmful. FFS

        • SharkEatingBreakfast@sopuli.xyz · 9 months ago

          Let’s be real here:

          Sure, it’s not illegal. But if I find “those kinds” of AI-generated images on someone’s phone or computer, the fact that it’s AI-generated will not improve my view of that person in any possible way.

          Even if it’s technically “legal”.

          They tellin’ on themselves.

      • Ataraxia@sh.itjust.works · 9 months ago

        People who consume any kind of CP are dangerous, and encouraging that behavior is just as criminal. I’m glad that shit is illegal in most civilized countries.