Free Trust and Safety Teaching Resources

New teaching materials from the Trust and Safety Teaching Consortium

The Trust and Safety Teaching Consortium is a coalition of academic, industry, and non-profit experts in online trust and safety. Our goal is to create content that can be used to teach a variety of audiences about trust and safety issues, in a wide range of formats.

We launched in May 2023 and have been consistently expanding our content. This post highlights our collection of free resources, all of which are available here.

While the Stanford Internet Observatory coordinates this consortium, virtually all of the contributions come from volunteers across industry, academia, civil society, and government. All contributors are recognized on the consortium website.

We encourage you to incorporate this content into your classes. Please credit the Trust and Safety Teaching Consortium, and email us at trustandsafetyjournal@stanford.edu to let us know you used it, so that we can track our impact.

Trust and Safety Teaching Consortium content:

  • NEW: Short videos and accompanying discussion questions from industry practitioners outlining real trust and safety cases.
    • Video and discussion questions from Jace Pomales (technology safety consultant, Ghxst Agency) on a child safety case he worked on at Flickr.
    • Video and discussion questions from Natalie Campbell (TikTok) on a case involving content moderator well-being.
    • Video and transcript from Sabrina Puls (TrustLab) on challenges in developing anti-discrimination policies for digital marketplaces that facilitate offline interactions.
  • Reading list for 13 core trust and safety modules.
  • Slide decks for the 13 core modules. Most decks are available in both Google Slides and LaTeX.
  • Exercises for the 13 core modules.
    • For example, a thoughtful exercise on human labeler agreement, precision, and recall by Hill Stark (ActiveFence), Matthew Soeth (Thriving in Games Group), and Justin Francese (University of Oregon School of Journalism and Communication); an illustrative sketch of these metrics appears after this list.
    • Another example: a 20-page description of a quarter-length exercise used in Stanford’s interdisciplinary Trust and Safety course. The exercise description includes detailed instructions for students on creating a content moderation bot on Discord, along with milestones appropriate for non-technical students (a minimal bot sketch also appears after this list).
  • Recorded lectures for several of the core modules.
    • For example, a lecture from Katherine Keyes (Columbia University) on suicide and self-harm.
  • Slides and syllabi from real classes taught by Consortium members.
  • A list of trust and safety courses taught by Consortium members in the 2024-2025 academic year.
  • Example trust and safety course descriptions.
  • Links to external teaching resources, including the Trust and Safety Curriculum from the Trust and Safety Professional Association.
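For readers curious about the metrics in the labeler-agreement exercise mentioned above, here is a minimal, self-contained Python sketch of percent agreement, Cohen's kappa, precision, and recall. This is illustrative only and is not taken from the Consortium's materials; the labels and labeler data below are hypothetical.

```python
# Illustrative sketch (not from the Consortium's exercise): agreement and
# classification metrics for binary content moderation labels.

def precision_recall(predicted, gold):
    """Precision and recall, treating label 1 as 'violating content'."""
    tp = sum(p == 1 and g == 1 for p, g in zip(predicted, gold))
    fp = sum(p == 1 and g == 0 for p, g in zip(predicted, gold))
    fn = sum(p == 0 and g == 1 for p, g in zip(predicted, gold))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

def cohens_kappa(labels_a, labels_b):
    """Agreement between two labelers, corrected for chance agreement."""
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    p_a, p_b = sum(labels_a) / n, sum(labels_b) / n  # each labeler's positive rate
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)     # agreement expected by chance
    return (observed - expected) / (1 - expected)

# Hypothetical data: two labelers review the same ten posts;
# 'gold' is the adjudicated ground truth.
labeler_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
labeler_b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]
gold      = [1, 0, 1, 1, 0, 0, 1, 0, 0, 0]

print("Cohen's kappa (A vs. B):", round(cohens_kappa(labeler_a, labeler_b), 3))  # 0.6
p, r = precision_recall(labeler_a, gold)
print("Labeler A precision:", p, "recall:", r)  # 0.8, 1.0
```

Note that raw agreement here is 80 percent while kappa is only 0.6, since some agreement is expected by chance alone; that gap is the kind of distinction an exercise on labeler agreement can draw out.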
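And for a sense of what the Discord exercise involves, here is a minimal keyword-based moderation bot using the discord.py library. This is a sketch under assumptions, not the exercise's actual starter code: the blocklist, the token placeholder, and the bot's behavior are all hypothetical.

```python
# Illustrative sketch (not the exercise's starter code): a keyword-based
# content moderation bot built with discord.py.
import discord

BLOCKLIST = {"badword1", "badword2"}  # hypothetical toy keyword filter

intents = discord.Intents.default()
intents.message_content = True  # privileged intent; enable it in the developer portal
client = discord.Client(intents=intents)

@client.event
async def on_ready():
    print(f"Logged in as {client.user}")

@client.event
async def on_message(message):
    if message.author == client.user:
        return  # never act on the bot's own messages
    if any(word in message.content.lower() for word in BLOCKLIST):
        # Requires the bot to have the "Manage Messages" permission.
        await message.delete()
        await message.channel.send(
            f"{message.author.mention}, that message was removed for violating channel rules."
        )

client.run("YOUR_BOT_TOKEN")  # placeholder; real tokens come from the Discord developer portal
```

The actual exercise goes well beyond simple string matching, so treat this only as a picture of the moving parts involved: intents, events, and message actions.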


If you are interested in helping to create teaching content, email trustandsafetyjournal@stanford.edu to join the Consortium. Please include a sentence or two about your areas of expertise, along with your current professional affiliation, if applicable. We coordinate contributions through periodic Zoom meetings. You will be added to an email list and alerted to the next meeting. 
