Data methodology

How UpJobz turns raw job feeds into candidate decisions.

Our raw inventory is larger than the pages we expose to search engines. That is intentional. The public index should show durable, useful pages, not every row that a harvest process can collect.

Step 1

Source collection

UpJobz harvests jobs from employer and public sources such as ATS boards, curated feeds, and approved public-data sources.
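A minimal sketch of this collection step, with provenance tagging so later review steps know where each row came from. The source names and stub feeds here are illustrative, not the production harvester:

```python
def collect(sources):
    """Pull raw postings from every registered source into one inventory."""
    inventory = []
    for name, fetch in sources.items():
        for raw in fetch():
            raw["_source"] = name  # keep provenance for later review steps
            inventory.append(raw)
    return inventory

# Stub feeds standing in for real ATS boards and curated feeds.
sources = {
    "ats_board": lambda: [{"title": "Backend Engineer"}],
    "curated_feed": lambda: [{"title": "Data Scientist"}, {"title": "SRE"}],
}

inventory = collect(sources)
print(len(inventory))  # 3 raw rows before any filtering
```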

Step 2

Normalization

We normalize title, company, city, country, salary, work style, technical keywords, and work-authorization language into a consistent candidate-facing format.
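The normalization step can be sketched as a mapping of each raw row into one canonical schema. The field names and cleanup rules below are assumptions based on the list above, not the production mapper:

```python
# Canonical candidate-facing schema; every raw row is coerced into it.
CANONICAL_FIELDS = (
    "title", "company", "city", "country", "salary",
    "work_style", "keywords", "work_authorization",
)

def normalize(raw):
    """Map one raw posting into the canonical schema with safe defaults."""
    job = {field: raw.get(field) for field in CANONICAL_FIELDS}
    # Light cleanup: trim whitespace, standardize country codes to uppercase.
    if job["title"]:
        job["title"] = job["title"].strip()
    if job["country"]:
        job["country"] = job["country"].strip().upper()
    # Keywords become a sorted, de-duplicated list for stable rendering.
    job["keywords"] = sorted(set(raw.get("keywords") or []))
    return job

normalized = normalize({"title": "  Senior Rust Engineer ",
                        "country": "ca",
                        "keywords": ["rust", "grpc", "rust"]})
print(normalized["country"])  # CA
```

Missing fields come through as explicit empty values rather than being dropped, so the downstream quality filters can check for them by name.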

Step 3

Quality filters

Jobs must pass freshness, high-tech intent, country, trust-score, detail, source URL, and public-slug checks before they can be eligible for search traffic.
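Those gates can be modeled as a set of named predicates that must all pass. The thresholds below are placeholders, not the production values:

```python
from datetime import datetime, timedelta, timezone

# Placeholder thresholds; the real cutoffs are tuned internally.
MAX_AGE = timedelta(days=30)
MIN_TRUST = 0.7
MIN_DETAIL_CHARS = 200

def failed_checks(job, now=None):
    """Return the names of every quality gate the job fails."""
    now = now or datetime.now(timezone.utc)
    checks = {
        "freshness": now - job["posted_at"] <= MAX_AGE,
        "tech_intent": bool(job.get("keywords")),
        "country": bool(job.get("country")),
        "trust_score": job.get("trust_score", 0) >= MIN_TRUST,
        "detail": len(job.get("description") or "") >= MIN_DETAIL_CHARS,
        "source_url": bool(job.get("source_url")),
        "public_slug": bool(job.get("public_slug")),
    }
    return [name for name, ok in checks.items() if not ok]

def search_eligible(job, now=None):
    """A job is eligible for search traffic only when no check fails."""
    return not failed_checks(job, now)
```

Returning the list of failed gate names, rather than a bare boolean, makes it easy to log why a given job was held back.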

Step 4

Human review

When a source produces abnormal slugs, missing descriptions, or unclear employer context, we quarantine its jobs from SEO until the source logic is fixed.
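In code, the review trigger is a triage pass that collects reasons to quarantine a row before a human looks at it. The slug pattern below is an illustrative assumption about what a "normal" public slug looks like:

```python
import re

# Assumed shape of a normal public slug: lowercase words joined by
# hyphens, e.g. "senior-data-engineer-toronto".
NORMAL_SLUG = re.compile(r"^[a-z0-9]+(-[a-z0-9]+)*$")

def triage(job):
    """Return the reasons a job should be quarantined from SEO, if any."""
    reasons = []
    if not NORMAL_SLUG.match(job.get("public_slug") or ""):
        reasons.append("abnormal_slug")
    if not (job.get("description") or "").strip():
        reasons.append("missing_description")
    if not job.get("company"):
        reasons.append("unclear_employer")
    return reasons

# Quarantined jobs stay out of the sitemap until the source logic is fixed.
job = {"public_slug": "gh_4402!!", "description": "", "company": None}
print(triage(job))  # ['abnormal_slug', 'missing_description', 'unclear_employer']
```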

2026 SEO quarantine rule

Jobs whose public slugs carry provider fingerprints (for example Ashby, Greenhouse, Lever, or Job Bank) or AI-generation tokens are removed from sitemap eligibility until they are re-imported with clean public slugs. Existing old URLs redirect where possible; new sitemap exposure stays conservative.
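The rule can be expressed as a single fingerprint check over the public slug. The patterns below are illustrative examples of provider traces, not the full production list:

```python
import re

# Illustrative provider/AI fingerprints that disqualify a slug from the
# sitemap; the production list is broader and maintained per source.
FINGERPRINTS = re.compile(
    r"(ashby|greenhouse|lever|job[-_]?bank|gpt|llm[-_]?gen)", re.IGNORECASE
)

def sitemap_eligible(slug):
    """A slug enters the sitemap only when it carries no known fingerprint."""
    return not FINGERPRINTS.search(slug or "")

print(sitemap_eligible("senior-devops-berlin"))                # True
print(sitemap_eligible("greenhouse-12345-software-engineer"))  # False
```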