Cross-Platform Dynamics of Self-Generated CSAM
A Stanford Internet Observatory investigation identified large networks of accounts, purportedly operated by minors, selling self-generated illicit sexual content. Platforms have updated safety measures based on the findings, but more work is needed.
A new Stanford Internet Observatory report investigates networks on Instagram and Twitter involved in advertising and trading self-generated child sexual abuse material (SG-CSAM). Our investigation finds that large networks of accounts, purportedly operated by minors, are openly advertising SG-CSAM for sale on social media. Instagram has emerged as the primary platform for these networks, offering features that facilitate connections between buyers and sellers.