Report · 5 March 2026 · 14 min read

UK AI Employer Visibility Report 2026: 500+ Companies Audited

68% of UK employers have incomplete or inaccurate data in AI search results. That's the headline finding from OpenRole's first comprehensive audit of AI employer visibility across the United Kingdom.

Between January and February 2026, we audited 517 UK-based employers across six major AI platforms to measure how accurately artificial intelligence represents them to job candidates. The results reveal a significant gap between how companies see themselves and how AI presents them to the talent market.

This report presents the full findings, industry breakdowns, and actionable recommendations for UK employers who want to control their AI narrative.

Source: OpenRole audit data, March 2026


Table of Contents

  1. Methodology
  2. Overall Findings
  3. Industry Breakdown
  4. Company Size Analysis
  5. The Salary Data Crisis
  6. What Do Top Scorers Do Differently?
  7. Recommendations for UK Employers

How Did We Conduct This Audit?

We queried six major AI platforms — ChatGPT (GPT-4o), Claude (3.5 Sonnet), Perplexity, Google AI Overviews, Microsoft Copilot, and Meta AI — with standardised questions about each of the 517 companies in our sample.

What We Measured

Each employer was scored across eight categories, each weighted by importance to candidate decision-making:

| Category | Weight | What We Checked |
| --- | --- | --- |
| Company Overview | 15% | Accuracy of description, industry, founding date, size |
| Salary Data | 20% | Presence and accuracy of compensation information |
| Benefits & Perks | 12% | Completeness of benefits description |
| Culture & Values | 10% | Accuracy of culture representation |
| Interview Process | 8% | Availability of interview process information |
| Remote/Hybrid Policy | 12% | Accuracy of working arrangements |
| Career Growth | 8% | Information about progression and development |
| Technical Accuracy | 15% | Factual correctness of all claims (locations, products, leadership) |

Scoring Method

Each response was evaluated on a 0–100 scale:

  • 0–20: AI has no meaningful data or actively hallucinated information
  • 21–40: AI provides partial data with significant inaccuracies
  • 41–60: AI provides mostly correct data with some gaps
  • 61–80: AI provides accurate, comprehensive data with minor gaps
  • 81–100: AI provides highly accurate, detailed, and current information

Final scores were averaged across all six AI platforms to control for model-specific biases.
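
The weighting and averaging described above can be sketched in a few lines. This is an illustrative reconstruction, not OpenRole's actual code: the category keys, function names, and example inputs are assumptions; only the weights and the 0–100 scale come from the report.

```python
# Published category weights (sum to 1.0), per the methodology table.
WEIGHTS = {
    "company_overview": 0.15,
    "salary_data": 0.20,
    "benefits_perks": 0.12,
    "culture_values": 0.10,
    "interview_process": 0.08,
    "remote_hybrid": 0.12,
    "career_growth": 0.08,
    "technical_accuracy": 0.15,
}

def weighted_score(category_scores: dict[str, float]) -> float:
    """Combine per-category 0-100 scores into one weighted 0-100 score."""
    return sum(WEIGHTS[cat] * score for cat, score in category_scores.items())

def final_score(per_platform_scores: list[float]) -> float:
    """Average weighted scores across platforms to damp model-specific bias."""
    return sum(per_platform_scores) / len(per_platform_scores)
```

A company scoring 50 in every category would land at 50 overall regardless of weighting; the weights only matter when category scores diverge, which is exactly the pattern the audit found (strong overviews, weak salary data).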

Sample Composition

Our 517-company sample was drawn from:

  • UK Companies House registrations with 10+ employees
  • Companies actively hiring (at least 1 open role on LinkedIn in Q1 2026)
  • Stratified by industry, company size, and geography to ensure representativeness



What Did We Find Overall?

The average AI Visibility Score across all 517 UK employers was 34 out of 100 — firmly in the "partial data with significant inaccuracies" band.

Score Distribution

  • 0–20 (Invisible): 28% of companies — AI essentially knows nothing useful about them
  • 21–40 (Poor): 37% of companies — AI provides fragmented, often inaccurate information
  • 41–60 (Moderate): 22% of companies — AI gets the basics right but misses critical details
  • 61–80 (Good): 10% of companies — AI provides a reasonably accurate picture
  • 81–100 (Excellent): 3% of companies — AI accurately represents the employer brand

Only 13% of UK employers scored above 60 — meaning for the vast majority, AI is telling candidates an incomplete or incorrect story.

Worst-Performing Categories

The categories where employers scored lowest on average:

  1. Salary Data: Average score 18/100 — the single biggest gap
  2. Interview Process: Average score 24/100
  3. Career Growth: Average score 27/100
  4. Benefits & Perks: Average score 31/100

Best-Performing Categories

  1. Company Overview: Average score 52/100 — AI generally knows what companies do
  2. Technical Accuracy: Average score 44/100 — basic facts are mostly correct
  3. Culture & Values: Average score 38/100 — often vague but directionally right



How Does Each Industry Perform?

AI visibility varies dramatically by industry. Here's how each sector performed:

Industry Average Scores

| Industry | Average Score | Score Range | Companies Audited |
| --- | --- | --- | --- |
| Cybersecurity | 52 | 31–78 | 38 |
| Fintech | 47 | 22–81 | 64 |
| SaaS | 41 | 18–73 | 72 |
| Consulting | 38 | 15–69 | 56 |
| Energy | 35 | 12–65 | 41 |
| Media | 33 | 14–61 | 35 |
| Professional Services | 31 | 10–58 | 48 |
| Healthcare | 28 | 8–52 | 67 |
| AI/ML | 27 | 11–64 | 29 |
| Retail | 22 | 5–47 | 67 |

Why Does Cybersecurity Lead?

Cybersecurity companies score highest for two reasons: they tend to have technically sophisticated web teams who implement structured data, and the industry has high media visibility, which feeds AI training data. Notably, 74% of cybersecurity companies in our sample had at least partial schema.org markup on their careers pages.

Why Does Retail Score So Poorly?

Retail employers scored just 22 on average, dragged down by near-zero salary transparency (only 8% published any salary data) and heavy reliance on job boards rather than owned careers content. AI models had the least accurate data about retail employers, with hallucination rates exceeding 40% for salary information.

Why Do AI/ML Companies Score Below Average?

Counterintuitively, AI/ML companies scored just 27 on average — below the overall mean. Many are early-stage startups with minimal web presence. Despite building AI themselves, 72% had no structured data on their careers pages.



How Does Company Size Affect AI Visibility?

Larger companies have better AI visibility — but the relationship isn't linear.

Average Scores by Company Size

| Company Size | Employee Range | Average Score | Score Range |
| --- | --- | --- | --- |
| Startup | 10–49 | 22 | 15–29 |
| SMB | 50–149 | 32 | 25–38 |
| Mid-Market | 150–499 | 39 | 32–45 |
| Enterprise | 500–5,000 | 51 | 40–62 |
| Large Enterprise | 5,000+ | 58 | 45–74 |

The Startup Visibility Gap

Startups face the steepest challenge. With an average score of just 22, most startups are effectively invisible to AI. The primary drivers:

  • 89% have no schema.org markup on any page
  • 94% publish no salary data in any structured format
  • 67% block at least one AI crawler in their robots.txt
  • Limited media coverage means less training data for AI models

The consequence is stark: when candidates ask AI about a startup, the response is typically "I don't have specific information about [Company]" or, worse, a hallucinated response based on similar-sounding companies.

The Enterprise Advantage — and Its Limits

Enterprises benefit from greater media coverage, Wikipedia articles, and more comprehensive Glassdoor profiles. However, even large enterprises averaged just 51/100, suggesting that scale alone doesn't guarantee AI visibility. The enterprises scoring 60+ had actively invested in structured data and AI-accessible content.



The Salary Data Crisis: What's Really Happening?

Salary information is where AI employer data fails most dramatically.

The Numbers

  • 61% of UK employers have no machine-readable salary data whatsoever
  • 78% don't include salary ranges in their job postings' structured data
  • Only 14% use baseSalary in JobPosting schema markup
  • When salary data is absent, AI guesses — and is wrong by ±£15,000 on average
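
The `baseSalary` property mentioned above is part of schema.org's `JobPosting` type. A minimal sketch of what that markup looks like follows; the company name, role, dates, and figures are illustrative (the salary band echoes the London senior-engineer range cited later in this report), not data from any audited employer.

```json
{
  "@context": "https://schema.org",
  "@type": "JobPosting",
  "title": "Senior Software Engineer",
  "datePosted": "2026-03-01",
  "hiringOrganization": {
    "@type": "Organization",
    "name": "Example Ltd",
    "sameAs": "https://www.example.co.uk"
  },
  "jobLocation": {
    "@type": "Place",
    "address": {
      "@type": "PostalAddress",
      "addressLocality": "London",
      "addressCountry": "GB"
    }
  },
  "baseSalary": {
    "@type": "MonetaryAmount",
    "currency": "GBP",
    "value": {
      "@type": "QuantitativeValue",
      "minValue": 85000,
      "maxValue": 105000,
      "unitText": "YEAR"
    }
  }
}
```

Embedded in a `<script type="application/ld+json">` tag on the job page, this gives crawlers an explicit salary range instead of leaving them to infer one.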

How Far Off Is AI?

We compared AI-stated salaries against verified compensation data (from companies that do publish transparently) to measure the error rate:

| Role Level | Average AI Error | Direction of Error |
| --- | --- | --- |
| Junior (0–2 years) | ±£8,200 | Underestimates 62% of the time |
| Mid-Level (3–5 years) | ±£12,800 | Underestimates 58% of the time |
| Senior (5–8 years) | ±£17,400 | Underestimates 71% of the time |
| Lead/Principal (8+ years) | ±£23,100 | Underestimates 76% of the time |

AI systematically underestimates senior salaries because its training data skews towards older Glassdoor entries and public sector pay scales. For a Senior Software Engineer in London, AI might say £65,000–£80,000 when the actual market rate is £85,000–£105,000.

The Candidate Impact

This underestimation directly harms hiring. When candidates believe a company pays below market (because AI told them so), they self-select out before ever applying. For companies competing for senior talent, this invisible leak in the talent pipeline could be costing them their best candidates.

You can check what AI says about your salaries with a free OpenRole audit.



What Do Top-Scoring Employers Do Differently?

The 67 companies scoring above 60/100 shared five consistent practices:

1. Structured Data on Every Careers Page

92% of top scorers had Organisation schema on their homepage and JobPosting schema on individual role pages, compared to just 18% of companies scoring below 30.

The structured data gave AI models authoritative, machine-readable facts to cite — rather than forcing them to guess from unstructured prose.

2. Published Salary Ranges

84% of top scorers published salary bands either on their job postings or their careers page, compared to 11% of companies scoring below 30.

Transparency correlated strongly with accuracy: when companies published salaries, AI cited them within £2,000 of the published figure 91% of the time.

3. FAQ-Formatted Careers Content

76% of top scorers used question-and-answer formatting on their careers pages (e.g., "What benefits does [Company] offer?"), compared to 23% of low scorers.

AI models extract information far more reliably from Q&A formats than from prose paragraphs. A question-answer pair maps directly to how candidates query AI.
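
One common way to make that question-and-answer structure explicit to machines is schema.org's `FAQPage` markup. The audit measured Q&A formatting, not this markup specifically, so treat the following as a related sketch with an invented company and answer:

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What benefits does Example Ltd offer?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Private healthcare, 28 days' holiday, and a 5% pension match."
      }
    }
  ]
}
```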

4. AI Crawler Access

97% of top scorers allowed GPTBot, ClaudeBot, and Google-Extended in their robots.txt. They actively wanted AI to crawl their content.

By contrast, 34% of companies scoring below 30 blocked at least one major AI crawler — inadvertently making themselves invisible.

5. Multi-Platform Consistency

Top scorers maintained consistent, up-to-date profiles on LinkedIn, Glassdoor, Indeed, and their own website. AI models cross-reference sources; when multiple platforms agree, the AI response is more confident and accurate.

Use OpenRole's free schema generator to implement structured data on your careers pages in under an hour.



What Should UK Employers Do Now?

Based on our audit of 517 companies, here are five actionable steps every UK employer should take:

Step 1: Run an AI Visibility Audit

Before fixing anything, understand your baseline. Run a free audit on OpenRole to see exactly what ChatGPT, Claude, and Perplexity say about your company. You can't fix what you haven't measured.

Step 2: Implement Employer Schema Markup

Add Organisation schema to your homepage and JobPosting schema (with baseSalary) to every job listing. This single action had the highest correlation with improved AI accuracy in our dataset — a 32% average improvement in score within 90 days of implementation.
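
For the homepage side of this step, a minimal `Organization` block (schema.org spells the type the American way) might look like the sketch below. Every value here is a placeholder: substitute your own name, URL, founding date, headcount, and profile links.

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Ltd",
  "url": "https://www.example.co.uk",
  "logo": "https://www.example.co.uk/logo.png",
  "foundingDate": "2015",
  "numberOfEmployees": {
    "@type": "QuantitativeValue",
    "value": 240
  },
  "sameAs": [
    "https://www.linkedin.com/company/example-ltd",
    "https://www.glassdoor.co.uk/Overview/example-ltd"
  ]
}
```

The `sameAs` links matter for the multi-platform consistency practice described earlier: they tell crawlers which external profiles are authoritative for your company.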

Read our complete guide to employer schema markup for step-by-step implementation instructions.

Step 3: Publish Salary Ranges

The salary data crisis is the single biggest issue in AI employer visibility. Publishing salary bands — even broad ones — gives AI factual data instead of forcing it to guess. Companies that published salary ranges saw an average 41% improvement in salary data accuracy across AI platforms.

Step 4: Restructure Your Careers Content

Rewrite your careers page using question-and-answer formatting. Address the questions candidates actually ask AI:

  • What does [Company] pay for [Role]?
  • What benefits does [Company] offer?
  • What's the interview process at [Company]?
  • Does [Company] allow remote work?
  • What's the culture like at [Company]?

Step 5: Allow AI Crawlers

Check your robots.txt file. If you're blocking GPTBot, ClaudeBot, or Google-Extended, you're blocking the primary channels through which candidates now discover employers. Unless you have a regulatory reason to block AI crawlers, open access today.
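
A robots.txt that explicitly allows the three crawlers named above looks like this (these user-agent tokens are the ones published by OpenAI, Anthropic, and Google; the catch-all rule at the end is just an example policy):

```
# Explicitly allow the major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: Google-Extended
Allow: /

# All other crawlers: normal access
User-agent: *
Allow: /
```

If any of these appear in your file with `Disallow: /` instead, that is the block to remove.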


How Can You Benchmark Against Your Industry?

We publish industry-specific benchmarks updated monthly. Browse the UK AI Employer Visibility Index to see how your industry and competitors score, or view a sample audit report to understand what a full assessment includes.

The employers who act now, while 68% of the market is still incomplete or inaccurate in AI results, will own the AI narrative before their competitors catch up. That window will not stay open for long.


Methodology Notes

This report is based on audits conducted between 4 January and 28 February 2026 across 517 UK-registered employers. All scores represent averages across six AI platforms queried on the same dates. Individual company scores are available through OpenRole's audit tool. Industry and size categories follow ONS Standard Industrial Classification codes and Companies House employee band definitions respectively.

For questions about methodology or to request the raw dataset, contact research@openrole.co.uk.

Full source: OpenRole audit data, March 2026. Updated annually.