• mindbleach@sh.itjust.works · 1 year ago

      There is no such thing as generated CSAM.

      That is the entire god-danged point of calling it “CSAM.”

      You can’t abuse children who do not exist.

  • explodicle@local106.com · 1 year ago (edited)
        Do you suggest we keep calling the AI-generated stuff “child porn,” or should we use something else? I just want to use whichever term results in clarity and fewer people correcting me.

        Edit: or is there an umbrella term that covers both?