Not a good look for Mastodon - what can be done to automate the removal of CSAM?

  • duncesplayed@lemmy.one · 1 year ago

    If I can try to summarize the main findings:

    1. Computer-generated (e.g., Stable Diffusion) child porn is not criminalized in Japan, so many Japanese Mastodon servers don’t remove it
    2. Porn involving real children does get removed, but not immediately: detection depends on instance admins catching it, and they have other things to do. Also, when an account is banned, the Mastodon server software does not send out a “delete” for all of that account’s posted material, which would signal other instances to remove their copies (a rough sketch of what that fan-out could look like follows this list)
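
    To make the missing “delete” fan-out concrete, here is a minimal Python sketch of emitting ActivityPub `Delete` activities for a banned account’s posts to remote inboxes. Everything in it (the actor URL, status IDs, inbox list) is a placeholder, and a real implementation would also need HTTP Signatures on the requests; Mastodon itself is written in Ruby, so this is purely illustrative of the protocol, not of Mastodon’s code.

    ```python
    import requests

    # Placeholder values -- in practice these come from the instance database:
    # the banned actor, the IDs of their statuses, and the inboxes of the
    # remote servers that received the original posts.
    BANNED_ACTOR = "https://example.social/users/banned"
    STATUS_IDS = ["1", "2", "3"]
    REMOTE_INBOXES = ["https://other.example/inbox"]

    def delete_activity(status_id: str) -> dict:
        """Build an ActivityStreams 'Delete' activity for one status."""
        return {
            "@context": "https://www.w3.org/ns/activitystreams",
            "id": f"{BANNED_ACTOR}/statuses/{status_id}#delete",
            "type": "Delete",
            "actor": BANNED_ACTOR,
            "object": f"{BANNED_ACTOR}/statuses/{status_id}",
            "to": ["https://www.w3.org/ns/activitystreams#Public"],
        }

    for status_id in STATUS_IDS:
        activity = delete_activity(status_id)
        for inbox in REMOTE_INBOXES:
            # POST each Delete to each remote inbox; receiving servers that
            # honor it drop their cached copy of the post and its media.
            # (A real sender must sign this request with HTTP Signatures.)
            requests.post(
                inbox,
                json=activity,
                headers={"Content-Type": "application/activity+json"},
            )
    ```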

    Problem #2 can hopefully be improved with better tooling. I don’t know what you do about Problem #1, though.
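
    One commonly cited form of “better tooling” is matching uploads against hash lists of known abuse material, so instances don’t depend on an admin stumbling across it. A rough sketch, with placeholder hash values standing in for a list that would come from an organization like NCMEC:

    ```python
    import hashlib

    # Placeholder set of hex digests of known abuse material. A real
    # deployment would load these from a vetted hash-list provider.
    KNOWN_BAD_HASHES = {
        "0000000000000000000000000000000000000000000000000000000000000000",
    }

    def is_known_csam(media_bytes: bytes) -> bool:
        """Return True if the upload's SHA-256 matches the known-bad list."""
        digest = hashlib.sha256(media_bytes).hexdigest()
        return digest in KNOWN_BAD_HASHES

    def handle_upload(media_bytes: bytes) -> None:
        if is_known_csam(media_bytes):
            # Reject at upload time and flag for the admin, rather than
            # waiting for a human to find the post later.
            raise ValueError("upload matches known abuse-material hash list")
    ```

    Note that exact hashes like SHA-256 miss re-encoded or resized copies; the systems actually used for this (Microsoft’s PhotoDNA, Meta’s PDQ) use perceptual hashes for that reason, so the exact-match check above is only the simplest version of the idea.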