San Francisco, CA

Protection Scientist Engineer, Intelligence And Investigations

About the Team

OpenAI’s mission is to ensure that general-purpose artificial intelligence benefits all of humanity. We believe that achieving our goal requires real-world deployment and iteratively updating based on what we learn.

Company
OpenAI
Compensation
$198K - $425K
Schedule
Full-Time
Role overview

What this role actually needs.

OpenAI is hiring a Protection Scientist Engineer on its Intelligence and Investigations team to scope, build, and operate abuse monitoring for new and existing products. The responsibilities, requirements, and benefits below are broken out from the source listing.

Company context: OpenAI builds frontier AI systems, research infrastructure, and applied products for developers, enterprises, and global users.

Responsibilities

Day-to-day expectations

OpenAI lists these responsibilities for the Protection Scientist Engineer, Intelligence And Investigations role.

  • Scope and implement abuse monitoring requirements for new product launches.
  • Improve processes to sustain monitoring operations for existing products, including developing approaches to automate monitoring subtasks.
  • Prototype and mature into production systems of detection, review, and enforcement of abuse for major harms.
  • Work with Product, Policy, Ops, and Investigative teams to understand key risks and how to address them, and with Engineering teams to ensure we have sufficient data and scaled tooling.
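The "prototype and mature into production" work described above typically starts with simple rule-based triage. The sketch below is purely illustrative, assuming invented signals (`requests_last_hour`, `flagged_content_ratio`), weights, and thresholds; it is not OpenAI's method, just the shape of a first-pass abuse-scoring check such a role might prototype.

```python
# Hypothetical sketch of a rule-based abuse-monitoring pass.
# All fields, weights, and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class Event:
    user_id: str
    requests_last_hour: int
    flagged_content_ratio: float  # fraction of outputs tripping a content filter

def abuse_score(e: Event) -> float:
    """Combine two simple signals into a 0-1 risk score (illustrative weights)."""
    volume = min(e.requests_last_hour / 1000, 1.0)   # heavy-usage signal
    content = min(e.flagged_content_ratio * 2, 1.0)  # flagged-output signal
    return 0.4 * volume + 0.6 * content

def triage(events: list[Event], threshold: float = 0.5) -> list[str]:
    """Return user IDs whose score crosses the review threshold."""
    return [e.user_id for e in events if abuse_score(e) >= threshold]

events = [
    Event("u1", requests_last_hour=50, flagged_content_ratio=0.01),
    Event("u2", requests_last_hour=900, flagged_content_ratio=0.40),
]
print(triage(events))  # only u2's combined signals cross the threshold
```

In practice a detection like this would feed a human review queue rather than enforce directly, which is the "detection, review, and enforcement" split the listing describes.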
Requirements

What a strong candidate brings

These requirements are extracted from the source listing and normalized for UpJobz readers.

  • Improve processes to sustain monitoring operations for existing products, including developing approaches to automate monitoring subtasks.
  • Prototype and mature into production systems of detection, review, and enforcement of abuse for major harms.
  • Work with Product, Policy, Ops, and Investigative teams to understand key risks and how to address them, and with Engineering teams to ensure we have sufficient data and scaled tooling.
  • Have at least 4 years of experience doing technical analysis and detection, especially using SQL and Python.
  • Have experience in trust and safety and/or have worked closely with policy, enforcement, and engineering teams. An investigative mindset is key.
  • Have experience with basic data engineering, such as building core tables or writing data pipelines in production, and with machine learning principles and execution. Basic software development skills are a plus as this role writes productionised code.
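The "building core tables" requirement above can be illustrated with SQL driven from Python, the two tools the listing names. This is a minimal sketch using an in-memory SQLite database; the table names, columns, and data are invented for illustration and do not reflect any real OpenAI schema.

```python
# Hypothetical sketch: derive a per-user daily "core table" from raw request
# logs using SQL from Python. Schema and data are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_requests (user_id TEXT, day TEXT, flagged INTEGER)")
conn.executemany(
    "INSERT INTO raw_requests VALUES (?, ?, ?)",
    [("u1", "2026-05-01", 0), ("u1", "2026-05-01", 1), ("u2", "2026-05-01", 0)],
)

# Build the derived summary table: request and flagged counts per user-day.
conn.execute("""
    CREATE TABLE user_day_summary AS
    SELECT user_id, day,
           COUNT(*)     AS n_requests,
           SUM(flagged) AS n_flagged
    FROM raw_requests
    GROUP BY user_id, day
""")

rows = conn.execute(
    "SELECT user_id, n_requests, n_flagged FROM user_day_summary ORDER BY user_id"
).fetchall()
print(rows)  # one aggregate row per user per day
```

A production pipeline would run this kind of aggregation on a warehouse with scheduling and data-quality checks, but the core-table idea is the same: precomputed aggregates that downstream detections query instead of raw logs.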
Benefits

Why people would want this job

OpenAI published these compensation, benefits, or working-context details with the role.

  • Base salary range of $198K - $425K, posted with the listing.
  • Full-time schedule, on-site in San Francisco.
UpJobz market context

Why this listing is more than a copied job post.

Protection Scientist Engineer, Intelligence And Investigations is framed against UpJobz source checks, country scope, compensation visibility, and work-authorization signals so candidates can make a faster go/no-go decision.

United States tech market

United States roles on UpJobz are filtered for high-tech relevance, source freshness, and actionable employer detail before they are allowed into SEO surfaces.

Compensation read

$198K - $425K is visible before the click, so candidates can compare the role against local market expectations before applying.

Work authorization read

Current extracted signal: United States residents. UpJobz treats this as a search signal, not legal advice, and links visa-sensitive roles back to the relevant visa hub where possible.

Location read

On-site roles in San Francisco should be compared against commute, local salary bands, and nearby employer demand.

Browse similar jobs

Subscriber playbook

Turn this listing into an application plan.

This is the first pass at the premium UpJobz layer: a fast brief that helps serious applicants move with more clarity.

Next moves

  • Tailor your resume around AI and LLM keywords instead of sending a generic application.
  • Use the first two bullets of your application to connect your background directly to the Protection Scientist Engineer, Intelligence and Investigations role. It is a high-signal on-site role in San Francisco, and it is most realistic for United States residents.
  • Open the role quickly if it fits and bookmark three similar jobs before you leave the page.

Interview themes

Artificial Intelligence · On-site · ai · llm · machine-learning · research

Watchouts

  • $198K - $425K is visible, so calibrate your application around the posted range.
  • Make your United States work-authorization status explicit in your application so the recruiter does not have to infer it.
  • Show concrete examples of succeeding in on-site environments.
Role signals

Keywords to match against your background

Use these terms to decide whether your resume, portfolio, and recent projects line up with the role.

ai · llm · machine-learning · research · python · aws · security · data · api · infrastructure
Next step

Apply through the employer source

Open the source listing from jobs.ashbyhq.com, confirm the role is still active, then apply on the employer or ATS page.

Open employer application

Source: jobs.ashbyhq.com · Source ID: 529feb11-4b9b-4baf-9b3f-f2b100156c5a · Confidence: 97/100 · Last checked: May 7, 2026
