Software Engineer, Youth Well-Being

About the Team

The Youth Well-Being product team is part of the Integrity pillar at OpenAI, responsible for ensuring that our state-of-the-art AI technologies are deployed in safe, age-appropriate, and beneficial ways, especially for youth and families. We work across OpenAI's entire product surface area, from ChatGPT to future-facing tools, to architect safety and trust into the foundation of our systems. Our mission: empower families while meeting the highest standards of regulatory compliance and ethical responsibility. This work is core to OpenAI's mission to ensure AGI benefits all of humanity. Safety, especially for the most vulnerable users, is more important to us than unfettered growth.
About the Role

We're looking for a Senior Software Engineer to help architect and build the foundational systems that power family- and youth-facing experiences at scale. You'll help define how teens and guardians engage with OpenAI products, ensuring those experiences are safe, compliant, and empowering. You'll operate across a broad technical surface: building identity primitives, age assurance pipelines, and guardian tools that support both proactive and reactive interventions. You'll collaborate closely with a cross-functional team of engineers, data scientists, designers, user researchers, and policy experts. This is a high-impact, 0→1 opportunity to set the standard for how families interact with generative AI.
In this role, you will:

Architect and implement teen and guardian experiences across OpenAI products, including ChatGPT.
Build global age assurance systems that are privacy-preserving and tailored to regional compliance needs.
Design and evolve our identity infrastructure to support scalable, secure, and resilient user journeys at consumer internet scale.
Help define safety and well-being metrics, and continuously improve user trust through technical interventions.
You might thrive in this role if you:

Have built products or infrastructure for users under 18, or have a strong interest in digital safety, youth well-being, or educational technology.
Have hands-on experience with identity platforms, auth/authz systems, or user verification at scale.
Have worked on systems with regulatory complexity (e.g., COPPA, GDPR-K, UK OSA, KOSA) or experience navigating legal constraints in product development.
Are comfortable owning complex, ambiguous problems end-to-end, learning quickly, and adapting to fast-changing priorities.
Can thrive in a cross-functional team, with strong collaboration across policy, research, legal, and product functions.
Care deeply about building systems that are not just performant, but also inclusive, thoughtful, and ethical.
Location:
San Francisco
