87 remote roles added today · 376 active tech employers · 🇺🇸 🇨🇦 🇲🇽 Tri-border network · 749 metros covered · 12 database updates this hour · TN visa filter live
San Francisco, CA

Data Scientist, Integrity Measurement

About the Team: The Applied Foundations team at OpenAI is dedicated to ensuring that our cutting-edge technology is not only revolutionary but also secure against a myriad of adversarial threats. We strive to maintain the integrity of our platforms as they scale.

Company
OpenAI
Compensation
$293K - $385K
Schedule
Full-Time
Role overview

What this role actually needs.

Company context: OpenAI builds frontier AI systems, research infrastructure, and applied products for developers, enterprises, and global users.

Responsibilities

Day-to-day expectations

OpenAI lists these responsibilities for the Data Scientist, Integrity Measurement role.

  • Own measurement and quantitative analysis for a group of severe, actor- and network-based usage harm verticals.
  • Develop and implement AI-first methods for prevalence measurement and other productionised safety metrics, which may necessarily include off-platform indicators or other non-standard datasets.
  • Build metrics that can be used for goaling or A/B tests when prevalence or other top line metrics are not suitable.
  • Own dashboards and metrics reporting for harm verticals.
  • Conduct analyses and generate insights that inform improvements to review, detection, or enforcement, and that influence roadmaps.
  • Optimise LLM prompts for the purpose of measurement.
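The "prevalence measurement" responsibility above usually means estimating what fraction of platform activity violates a policy, based on a labeled random sample. As a hedged illustration only (this is a standard sampling approach, not OpenAI's actual method, and `wilson_interval` is a hypothetical helper), a minimal prevalence estimate with a confidence interval might look like:

```python
import math

def wilson_interval(violations: int, sample_size: int, z: float = 1.96):
    """Wilson score interval for a prevalence estimate from a labeled sample.

    Returns (low, high) bounds for the true violation rate at roughly
    95% confidence when z = 1.96.
    """
    if sample_size <= 0:
        raise ValueError("sample_size must be positive")
    p = violations / sample_size          # observed prevalence in the sample
    denom = 1 + z**2 / sample_size
    center = (p + z**2 / (2 * sample_size)) / denom
    margin = (z * math.sqrt(p * (1 - p) / sample_size
                            + z**2 / (4 * sample_size**2))) / denom
    return center - margin, center + margin

# Hypothetical example: 12 violating items in a random sample of 2,000 reviewed items
low, high = wilson_interval(12, 2000)
print(f"Observed prevalence: {12 / 2000:.2%} (95% CI {low:.2%} - {high:.2%})")
```

The Wilson interval is preferred over the naive normal approximation at the low rates typical of severe-harm verticals, where a symmetric interval can dip below zero.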
UpJobz market context

Why this listing is more than a copied job post.

Data Scientist, Integrity Measurement is framed against UpJobz source checks, country scope, compensation visibility, and work-authorization signals so candidates can make a faster go/no-go decision.

United States tech market

United States roles on UpJobz are filtered for high-tech relevance, source freshness, and actionable employer detail before they are allowed into SEO surfaces.

Compensation read

$293K - $385K is visible before the click, so candidates can compare the role against local market expectations before applying.

Work authorization read

Current extracted signal: United States residents. UpJobz treats this as a search signal, not legal advice, and links visa-sensitive roles back to the relevant visa hub where possible.

Location read

Hybrid roles in San Francisco should be compared against commute, local salary bands, and nearby employer demand.


Subscriber playbook

Turn this listing into an application plan.

This is the first pass at the premium UpJobz layer: a fast brief that helps serious applicants move with more clarity.

Next moves

  • Tailor your resume around AI and LLM work instead of sending a generic application.
  • Use the first two bullets of your application to connect your background directly to the Data Scientist, Integrity Measurement role: it is a high-signal hybrid role in San Francisco, and it is most realistic for United States residents.
  • Open the role quickly if it fits and bookmark three similar jobs before you leave the page.

Interview themes

Data and Analytics · Hybrid · AI · LLM · research · Python

Watchouts

  • $293K - $385K is visible, so calibrate your application around the posted range.
  • Make your United States residency explicit in your positioning so the recruiter does not have to infer it.
  • Show concrete examples of succeeding in hybrid environments.
Role signals

Keywords to match against your background

Use these terms to decide whether your resume, portfolio, and recent projects line up with the role.

AI · LLM · research · Python · AWS · security · data · UX · platform · API · infrastructure
Next step

Apply through the employer source

Open the source listing from jobs.ashbyhq.com, confirm the role is still active, then apply on the employer or ATS page.

Open employer application

Source: jobs.ashbyhq.com Β· Source ID: be4e1098-f7ac-46f4-babe-44ef08f47fcb Β· Confidence: 97/100 Β· Last checked: May 7, 2026
