Before You Fix It: What This Check Means
"AI Visibility Signals Conflict With Each Other" shows whether this part of your site is behaving the way users and search systems expect. In plain terms, it tells you whether AI crawlers and answer systems can understand and reuse your content correctly.
Why this matters in practice: unclear machine-facing signals can reduce retrieval quality and citation consistency.
How to use this result: treat this as directional evidence, not final truth. Answer-engine retrieval behavior can shift over time even when your technical setup is stable. First, confirm the issue in live output: verify bot-facing output and policy files on the final URL. Then ship one controlled change, based on the final HTML and response headers for the exact route Scavo tested, not just the template source in code. Finally, re-scan the same URL to confirm the result improves.
TL;DR: You do not need to publish every AI-facing standard. You do need the signals you do publish to tell a consistent story.
A failing "AI Visibility Signals Conflict With Each Other" check is usually solved with a small, targeted change. It improves how answer engines understand and attribute your content. Keep the scope tight, verify in production, and lock in the regression guard.
This is a conflict check, not a hype check. Scavo is not warning just because `/llms.txt` is missing or because a page has an intentional opt-out. It warns when the visible signals disagree with each other in a way that can confuse compliant crawlers and your own team.
A common example is publishing a public `/llms.txt` reading list while also sending page-level `noai` or `noimageai` directives, or blocking the whole site via wildcard `robots.txt` while still exposing machine-readable discovery files. Those setups can be intentional, but they are often accidents.
What Scavo checks (plain English)
Scavo reads the page-level robots tokens from both HTML meta tags and `X-Robots-Tag` headers, checks for `noai` and `noimageai`, probes whether `/llms.txt` exists, and inspects wildcard root policy in `robots.txt`.
A pass means the signals Scavo found look aligned. A warning means the signals are mixed. An info result usually means there was no explicit AI-facing signal to score, or the page declares an intentional opt-out without any contradiction.
- Scan key: `ai_crawler_visibility_signals`
- Category: `AI_VISIBILITY`
How Scavo scores this check
- Warning: `/llms.txt` is public while page-level opt-out directives are present, or wildcard `robots.txt` blocks root crawl access while `/llms.txt` is public.
- Pass: the AI-facing signals Scavo found look internally consistent.
- Info: the page uses an intentional opt-out without conflict, or no explicit AI visibility signal was found.
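The scoring rules above can be sketched as a small decision function. This is an assumed reconstruction of the decision table, not Scavo's actual code, and the three flag names are hypothetical:

```python
def score_ai_visibility(llms_txt_public: bool,
                        page_optout: bool,
                        wildcard_root_blocked: bool) -> str:
    """Rough sketch of the warning/pass/info table above (assumed, not Scavo's real logic)."""
    if llms_txt_public and (page_optout or wildcard_root_blocked):
        return "warning"  # public discovery file contradicts restrictive signals
    if page_optout or wildcard_root_blocked:
        return "info"     # intentional opt-out with no contradiction
    if llms_txt_public:
        return "pass"     # explicit AI-facing signal, internally consistent
    return "info"         # no explicit AI-facing signal found
```

The key property is that "warning" only fires on a contradiction between layers; a purely restrictive setup, or no signal at all, is informational rather than failing.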
Why fixing this matters
Consistency matters because discovery files, robots directives, and page-level metadata often end up owned by different people. When those layers drift apart, you get a misleading technical posture even though every individual change looked reasonable on its own.
This is also an operational trust problem. If your scan says one thing, your file says another, and your policy page says a third, nobody can reliably answer whether the site is meant to be visible to AI systems.
Common reasons this check flags
- A CMS plugin injects `noai` or `noimageai` on templates while content or SEO teams publish `/llms.txt`.
- Wildcard `robots.txt` was tightened during a migration, but `/llms.txt` stayed public and unchanged.
- Edge headers and template meta tags disagree about effective robots directives.
- Some teams assume `/llms.txt` is only informational, while others assume it is a live allow-list signal.
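The header-versus-meta drift in the list above can be detected mechanically. A minimal sketch, assuming straightforward markup (the simplified regex will miss meta tags whose `content` attribute precedes `name`):

```python
import re

RESTRICTIVE = {"noai", "noimageai", "noindex", "none"}

def robots_tokens(value: str) -> set[str]:
    """Split a robots directive string into lowercase tokens: 'noai, noimageai' -> {'noai', 'noimageai'}."""
    return {t.strip().lower() for t in value.split(",") if t.strip()}

def meta_robots_tokens(html: str) -> set[str]:
    """Pull tokens from <meta name="robots" content="..."> (simplified; assumes name before content)."""
    m = re.search(r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']', html, re.I)
    return robots_tokens(m.group(1)) if m else set()

def layers_disagree(x_robots_header: str, html: str) -> bool:
    """True when the header layer and the meta layer carry different restrictive tokens."""
    header = robots_tokens(x_robots_header)
    meta = meta_robots_tokens(html)
    return (header & RESTRICTIVE) != (meta & RESTRICTIVE)
```

For example, `layers_disagree("noai", '<meta name="robots" content="index, follow">')` returns `True`: the edge says one thing and the template says another, which is exactly the drift this check flags.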
If you are not technical
- Pick one clear stance for the affected route or site section: visible, restricted, or mixed by design.
- Ask for a plain-English summary of every AI-facing signal published on the site: `robots.txt`, page-level robots tags, and `/llms.txt`.
- If the mixed setup is intentional, document why. If nobody can explain it, treat it as drift and simplify it.
- Re-scan the exact URL after the change so the new state is visible in one report.
Technical handoff message
Copy and share this with your developer.
Scavo flagged AI Visibility Signals (ai_crawler_visibility_signals). Please review page-level robots directives, wildcard robots.txt behavior, and /llms.txt, remove any unintentional contradictions, and confirm the final production state matches our intended AI visibility policy.

If you are technical
- Inspect the final HTML and response headers for the exact route Scavo tested, not just the template source in code.
- Fetch `/robots.txt` and `/llms.txt` from production and confirm they reflect the same policy stance as the page-level directives.
- If the page should be visible, remove accidental `noai`/`noimageai` tokens and ensure wildcard root crawl access is not blocked.
- If the page or site should be restricted, keep the restrictive signals and consider whether `/llms.txt` should be removed, narrowed, or left intentionally informational.
- Avoid mixing "helpful reading list" behavior with broad crawl blocks unless you have a specific documented reason.
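To support the checklist above, the wildcard `robots.txt` condition can be sketched with a simplified parser. It ignores multi-agent groups (several consecutive `User-agent` lines sharing one rule block), so treat it as a first pass rather than a compliant parser; `example.com` is a placeholder:

```python
import urllib.request

def wildcard_root_blocked(robots_txt: str) -> bool:
    """True if a 'User-agent: *' group contains 'Disallow: /' (simplified parsing)."""
    in_wildcard = False
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()   # drop comments and whitespace
        if not line:
            continue
        key, _, value = line.partition(":")
        key, value = key.strip().lower(), value.strip()
        if key == "user-agent":
            in_wildcard = (value == "*")      # track whether we are in the wildcard group
        elif key == "disallow" and in_wildcard and value == "/":
            return True
    return False

def fetch(url: str) -> str:
    """Fetch a production policy file as text."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", "replace")

# Example usage (placeholder domain):
# robots = fetch("https://example.com/robots.txt")
# print(wildcard_root_blocked(robots))
```

Running the same fetch against `/llms.txt` and comparing the two results against the page-level directives gives you the "same policy stance" confirmation in one script.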
How to verify
- Check the final response headers and page source for `noai`, `noimageai`, or other restrictive robots tokens.
- Fetch the live `/robots.txt` and `/llms.txt` and compare them against the intended policy.
- Re-run Scavo and confirm the warning is gone or that the result is now a deliberate info state.
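For the first verification step, a rough grep-style helper is enough to confirm whether restrictive tokens survive in production output. It uses plain substring matching, so it can over-report, and the function name is made up for this sketch:

```python
RESTRICTIVE_TOKENS = ("noai", "noimageai")

def restrictive_tokens_present(x_robots_header: str, page_source: str) -> list[str]:
    """Return the restrictive tokens found in either the X-Robots-Tag header or the page source."""
    haystack = f"{x_robots_header}\n{page_source}".lower()
    return [t for t in RESTRICTIVE_TOKENS if t in haystack]
```

An empty list after your fix, on the exact route Scavo tested, is the signal you want before re-running the scan.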
What this scan cannot confirm
- The directives `noai` and `noimageai` are not universally standardized or honored by every vendor.
- Scavo cannot infer your legal intent. It only highlights technical contradiction or alignment.
- This check does not prove future citation or crawl frequency by any specific answer engine.
Owner checklist
- [ ] Name one owner for this check and note where it is controlled (app, CDN, server, or CMS).
- [ ] Add a release gate for this signal so regressions are caught before production.
- [ ] After deploys that touch this area, run a follow-up scan and confirm the result is still healthy.
- [ ] Re-check AI crawler and citation signals after robots, schema, or author metadata changes.
FAQ
What does Scavo actually validate for AI Visibility Signals Conflict With Each Other?
Scavo checks live production responses using the same logic shown in your dashboard and weekly report.
Will AI visibility changes show immediately after we ship fixes?
Usually not instantly. Crawlers and answer engines refresh on different schedules, so confirm technical signals first, then monitor citations and mentions over time.
What is the fastest way to confirm the fix worked?
Run one on-demand scan after deployment, open this check in the report, and confirm it moved to pass or expected info. Then verify at source (headers, HTML, or network traces) so the fix is reproducible.
How do we keep this from regressing?
Keep one owner, keep config in version control, and watch at least one weekly report cycle. If this regresses, compare the release diff and edge configuration first.
Sources
- Google Search Central: robots meta tag and X-Robots-Tag
- llms.txt project
- IPTC Generative AI Opt-Out Best Practices
Need stack-specific help? Send support your stack + check key and we will map the fix.