humans.txt Not Found

humans.txt is an optional file that lists who built the site, what technologies were used, and contact information. It's not critical for SEO or functionality, but it's a small transparency signal that helps security researchers reach you, gives credit to your team, and shows you care about web standards.


Before You Fix It: What This Check Means

humans.txt is optional, but it can document ownership and contact context for operations and trust. In plain terms, this check verifies whether an optional humans.txt file is present, should you choose to publish one.

Why this matters in practice: a published humans.txt gives security researchers and collaborators a stable, well-known place to find contact and credit information, and it preserves operational memory when teams change.

How to use this result: treat it as directional evidence, not final truth. It reflects what was observable at scan time and should be verified in your own production context. First, confirm the issue directly in live production output with browser or network tools. Then ship one controlled change: serve a plain-text `humans.txt` from origin or edge. Finally, re-scan the same URL to confirm the result improves.

TL;DR: Your site doesn't have a humans.txt file — a minor transparency signal that credits your team and lists your technology stack.


What Scavo checks (plain English)

This check is intentionally simple:

  • Scavo resolves the scanned host and probes https://host/humans.txt with a HEAD request.
  • Pass: endpoint returns HTTP 200.
  • Info: endpoint is missing/unreachable or returns a non-200 status.

Scavo does not fail this check because humans.txt is optional.
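The probe above can be sketched in a few lines. This is a minimal illustration of the documented behavior, not Scavo's actual implementation, which is not public:

```python
# Minimal sketch of the availability probe: HEAD https://host/humans.txt,
# then map the outcome to the documented result states.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError


def probe_humans_txt(host: str, timeout: float = 5.0):
    """HEAD-request /humans.txt and return the HTTP status, or None if unreachable."""
    req = Request(f"https://{host}/humans.txt", method="HEAD")
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        return err.code   # non-2xx responses still carry a status code
    except URLError:
        return None       # DNS failure, timeout, connection refused, etc.


def score(status):
    """200 -> pass; any other status (or no response at all) -> info."""
    return "pass" if status == 200 else "info"
```

Note that `urlopen` follows redirects by default, which is why a redirect chain that exceeds the handler's limit surfaces as an error rather than a 200.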

How Scavo scores this check

Scavo assigns one result state for this check on the tested page:

  • Pass: baseline signals for this check were found.
  • Warning: partial coverage or risk signals were found and should be reviewed.
  • Fail: required signals were missing or risky behavior was confirmed.
  • Info: Scavo could not gather enough reliable evidence on this run to score pass/fail confidently.

In your scan report, this appears under What failed / What needs attention / What is working for humans_txt, followed by Recommended next steps and Technical evidence (for developers) when needed.

  • Scan key: humans_txt
  • Category: TECHNICAL

Why fixing this matters

For many teams this is a small trust signal. It documents who built the site, which stack is used, and where to contact maintainers. It can also support onboarding and operational memory when teams change.

If you do not care about this signal, leaving it as info is acceptable. The value here is clarity, not compliance pressure.

Common reasons it shows as info

  • File was never created.
  • File exists but wrong path (/human.txt, /about/humans.txt, etc.).
  • Edge rules/auth/caching block HEAD responses.
  • Redirect chain exceeds probe limit.

If you are not technical

  1. Decide if your brand/team wants this public credit file.
  2. If yes, ask dev to publish it at exactly /humans.txt.
  3. Keep content simple: team, stack, contact/maintainer reference.
  4. Re-run Scavo and confirm it switches from info to pass.

Technical handoff message

Copy and share this with your developer.

Scavo returned an info result for humans.txt (scan key: humans_txt). If we want this signal, publish /humans.txt as a public text file returning HTTP 200, and ensure edge rules allow probe access.

If you are technical

  1. Serve a plain-text humans.txt from origin or edge.
  2. Keep response publicly readable and cacheable.
  3. Include lightweight sections: TEAM, THANKS, TECHNOLOGY, contact reference.
  4. Ensure HEAD and GET both succeed (some platforms mishandle HEAD).
  5. Avoid exposing sensitive emails or internal-only details.
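A file covering the sections above can look like this. Every name and address below is a placeholder, and the `/* SECTION */` layout follows the common humanstxt.org convention rather than any formal requirement:

```
/* TEAM */
  Developer: Jane Placeholder
  Contact: webmaster [at] example.com
  Location: City, Country

/* THANKS */
  Name: Open-source contributors

/* TECHNOLOGY */
  Stack: HTML, CSS, JavaScript

/* SITE */
  Last update: 2024/01/01
  Standards: HTML5, CSS3
```

Writing the email with `[at]` is a light scraping deterrent; use a role address rather than a personal one.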

How to verify

  • curl -I https://your-domain/humans.txt returns 200.
  • curl https://your-domain/humans.txt returns readable text.
  • Re-run Scavo and confirm humans_txt reports pass.
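The curl commands above can also be scripted. This sketch checks that HEAD and GET agree, since some platforms answer GET but reject HEAD (for example with 405), which breaks availability probes; `example.com` is a placeholder for your own domain:

```python
# Confirm HEAD and GET both return 200 for /humans.txt.
from urllib.request import Request, urlopen


def status_for(url: str, method: str, timeout: float = 5.0) -> int:
    """Return the HTTP status code for the given method on url."""
    with urlopen(Request(url, method=method), timeout=timeout) as resp:
        return resp.status


def head_get_agree(head_status: int, get_status: int) -> bool:
    """Both methods should return 200 for a publicly served humans.txt."""
    return head_status == 200 and get_status == 200


# Example usage (substitute a real domain before running):
# head_get_agree(status_for("https://example.com/humans.txt", "HEAD"),
#                status_for("https://example.com/humans.txt", "GET"))
```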

What this scan cannot confirm

  • It does not evaluate content quality of the file.
  • It does not measure SEO impact from this endpoint.
  • It does not validate legal/privacy appropriateness of listed names/contacts.

Owner checklist

  • [ ] Decide if humans.txt is part of your public brand policy.
  • [ ] If enabled, assign owner for updates when team/stack changes.
  • [ ] Ensure CDN/auth changes do not block the endpoint.
  • [ ] Review file quarterly for outdated or sensitive information.

FAQ

Will missing humans.txt hurt rankings?

This check treats it as informational. It is not a core crawl/index control.

Can we include only company-level credits instead of individual names?

Yes. Keep it aligned with your internal privacy and disclosure policy.

Should the file be long?

No. Keep it concise and maintainable.

Why does Scavo use HEAD for this check?

It is a fast availability check. If HEAD is blocked on your platform, allow it or ensure equivalent accessibility.

Want a minimal humans.txt starter format that matches your brand style? Send support your preferred team/contact wording.

More checks in this area

redirect_chain_hygiene

Redirect Chain Too Long — Multiple Hops Before the Real Page Loads

Learn how Scavo measures redirect hops, why chains slow users and crawlers down, and how to flatten protocol, host, and legacy URL redirects into cleaner routes.

not_found_status

404 Page Returns Wrong HTTP Status Code

When a deleted or broken URL returns HTTP 200, search engines index it as a real page — polluting your index with dead content and wasting crawl budget. This is called a "soft 404" and Google specifically warns against it. Your 404 page should return a proper 404 status code while still showing a helpful message to users.

analytics_instrumentation

Analytics Not Installed or Not Firing

Without analytics, every business decision about your website becomes a guess. You can't see which pages convert, where users drop off, which channels drive traffic, or whether changes improve performance. This is the foundation of data-driven optimization — if it's missing, you're flying blind.
