• Neshura@bookwormstory.social
    6 months ago

    Also, CSAM detection algorithms are known to misfire on occasion (it’s hard, if not impossible, to tell apart a picture of a naked child sent for porn purposes from one not sent for that purpose), and people want to avoid any false allegations of that kind if at all possible.