@[email protected] to Lemmy [email protected] • 2 years ago
Ah, Yes! AI Will Surely Save Us All! (lemmy.world)
@[email protected] • 2 years ago
Text, even completely fictional, can be CSAM depending on jurisdiction.

@[email protected] • 2 years ago
I've seen no evidence of that. There are cases tried under obscenity laws, but CSAM has a pretty clear definition of being visual.

@[email protected] • 2 years ago
Internationally? I know that in Germany there are such cases.