Discord is HIRING A

QA Specialist - Minor Safety and Exploitative Content

📍 San Francisco, California, United States · 🌐 Fully Remote · Full Time
POSTED May 22, 2025

Please mention you found this job on TestDev Jobs. It helps us get more people to hire on our site. Thanks and good luck!

We are looking for a detail-oriented professional with a strong passion for safeguarding vulnerable groups and combating exploitative content online. Your approach to quality assurance is rooted in empathy, precision, and a commitment to continuous improvement.

What You'll Be Doing

  • Review and audit moderation decisions related to minor safety and exploitative content to ensure adherence to Discord’s Trust & Safety policies.
  • Collaborate with moderators, analysts, and policy teams to identify trends, gaps, and inconsistencies in content review processes.
  • Provide constructive feedback and actionable insights to moderators to improve decision-making accuracy and maintain policy alignment.
  • Develop and lead calibration sessions for the moderation team based on audit findings and evolving content standards.
  • Partner with MSEC and other cross-functional teams to influence policy updates and improve internal tools and workflows for greater efficiency and scalability.
  • Regularly report on quality trends and metrics, highlighting risks, successes, and opportunities for process improvements.

What You Should Have

  • 2+ years of experience in quality assurance, trust & safety, or content moderation, preferably in a tech or online platform environment.
  • Deep understanding of issues related to minor safety, exploitative content, and global online safety trends.
  • Excellent analytical skills with the ability to synthesize large datasets and translate them into actionable insights.
  • Strong communication skills, both written and verbal, to effectively convey findings and train teams.
  • Familiarity with moderation tools, audit processes, and metrics-driven performance tracking.
  • A calm, resilient demeanor when handling sensitive or potentially distressing content.
  • Ability to flex your expertise to support other QA initiatives, including automation and machine learning, violent and hateful content, cybercrime, and other exploitative content.

Bonus Points

  • Experience working on global teams or in environments that require cultural sensitivity and awareness.
  • Experience with data analytics tools and languages like SQL.
  • Proficiency in multiple languages to support international moderation efforts.
  • Demonstrated success in driving cross-functional initiatives or policy changes in a Trust & Safety context.
  • Experience working with machine learning systems, automation tools, and LLM/AI technologies.

Requirements

  • This role requires regular interfacing with potentially traumatic material, including CSAM and other forms of exploitative, hateful, violent, or shocking content.
  • This role's hours are Monday through Friday, 9:00 AM to 5:00 PM Pacific Time, with occasional flexibility required to accommodate our global partners.

Why Discord?

Join us in our mission! Your future is just a click away!
