Ask ChatGPT to recommend a tree service in Cleveland.
If you run one of the 40-something tree companies in Cleveland, there's a good chance your name doesn't come up — even if you have a hundred five-star reviews and a website that ranks fine on Google. The AI gives three confident recommendations and none of them are you.
That's not a freak accident. It's a signal architecture problem. And most local business owners have no idea it exists.
The way people search is changing
Google is still the dominant channel for local traffic. That's not going away tomorrow. But something is shifting at the edges, and the businesses that catch it early are going to hold those positions for years.
More and more, the "who do I call for X in my city" query starts in an AI assistant. The person doesn't open a browser. They open ChatGPT or ask their phone assistant or prompt Claude. They describe what they need. The AI gives them three names. They pick one and call.
If you're not in those three names, you don't get a shot. The person never makes it to a search results page.
Why AI can't find most local businesses
AI systems don't rank businesses the way Google's local pack does. Google weights reviews, proximity, and citation consistency. AI systems weight something different: structured, machine-readable signals on your website.
Specifically:
Schema markup. This is JSON-LD code in your site's `<head>` that tells machines exactly what your business does, where it operates, what services you offer, and how to describe you. Most small business sites have zero schema. The AI's crawler visits, reads generic paragraph text, and can't confidently categorize you. It moves on to a site that gave it cleaner data.
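To make that concrete, here's a minimal sketch of what a schema block can look like. The business name, phone, and URL are placeholders for illustration, not real values:

```html
<!-- Minimal LocalBusiness schema, placed in the page <head> -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Tree Service",
  "description": "Tree removal, trimming, and emergency storm cleanup in Cleveland, Ohio.",
  "url": "https://www.example.com",
  "telephone": "+1-216-555-0100",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Cleveland",
    "addressRegion": "OH",
    "addressCountry": "US"
  }
}
</script>
```

You can paste a block like this into validator.schema.org to check it before shipping.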
FAQ content. AI search systems love FAQ sections because they're already in question-and-answer format — exactly how someone would phrase a query. A contractor with a well-structured FAQ that includes "Do you serve [city]?" and "What's included in an emergency call?" is giving the AI exactly the kind of content it can pull from.
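If you want the FAQ to be machine-readable as well as human-readable, schema.org has a matching FAQPage type. A sketch, with placeholder answers:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "Do you serve Lakewood?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Yes. We cover Lakewood and the surrounding west-side suburbs."
      }
    },
    {
      "@type": "Question",
      "name": "What's included in an emergency call?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "A same-day assessment, hazard removal, and a written estimate for follow-up work."
      }
    }
  ]
}
</script>
```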
An llms.txt file. This is a simple text file at the root of your site — like robots.txt, but aimed at AI systems. Where robots.txt controls which crawlers can access your pages, llms.txt gives a concise, plain-language summary of who you are and what your site covers, in a format AI tools can ingest directly. Most businesses don't have one. Adding it is a ten-minute task that flags your site as AI-crawler-aware.
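There's no single enforced format, but the emerging convention is a short markdown file served at /llms.txt. Something like this, with every detail swapped for your own:

```text
# Acme Tree Service

> Tree removal, trimming, stump grinding, and 24/7 emergency storm
> response for Cleveland, Ohio and the surrounding suburbs.

We are a licensed, insured tree company serving Cleveland, Lakewood,
Strongsville, Westlake, and Bay Village. Free estimates: +1-216-555-0100.
```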
Explicit service area coverage. Your site might say "serving Northeast Ohio" somewhere in the footer. That's not enough. AI systems want to see a list: Cleveland, Lakewood, Strongsville, Westlake, Bay Village. The more explicit and structured the city list, the more confident the AI is that you're a relevant result for queries from those places.
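That city list can also live inside your schema block, via schema.org's areaServed property. A sketch extending the LocalBusiness example above, with the same placeholder name:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Tree Service",
  "areaServed": [
    { "@type": "City", "name": "Cleveland" },
    { "@type": "City", "name": "Lakewood" },
    { "@type": "City", "name": "Strongsville" },
    { "@type": "City", "name": "Westlake" },
    { "@type": "City", "name": "Bay Village" }
  ]
}
</script>
```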
Your business showing up in the AI's training data. This one's harder to manufacture directly, but it's influenced by citations — press mentions, directory listings, local news, industry publications that reference you. A model's built-in knowledge of your business is frozen at its last training cutoff. If you've never been cited anywhere external to your own website, you're invisible to the model itself.
What the gap looks like in practice
We run a five-part AI visibility check on every business that goes through our audit. Here's what we see consistently for local service businesses:
- llms.txt: absent 90% of the time
- Schema markup: absent 80% of the time; present but minimal another 15%
- FAQ content: absent 70% of the time
- Service area structured data: absent 75% of the time
- AI crawler access in robots.txt: blocked accidentally 40% of the time — businesses that copied a robots.txt template years ago and didn't realize it was blocking GPTBot and ClaudeBot
That last one is a common trap. You didn't intend to block AI crawlers. Someone set up your robots.txt in 2019 with a broad `Disallow: /` rule that nobody has looked at since. Your site is correctly indexed on Google because Googlebot is explicitly allowed, but every AI crawler is shut out. ChatGPT's crawler can't read your site at all.
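The broken pattern usually looks something like this (a reconstruction of the kind of legacy template we see, not any one client's file):

```text
# Legacy robots.txt: Google gets in, everything else is blocked

User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /
```

GPTBot, ClaudeBot, and PerplexityBot all fall under that final wildcard, so they never see a single page.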
What closing the gap looks like
None of this is technically complicated. It's just work that most site owners don't know they need to do.
For a typical local service business, getting AI-visible involves:
Add a LocalBusiness schema block with your business type, address or service area, services list, and phone. This takes under an hour for a developer and is a one-time add.
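For the services list specifically, one common way to express it is schema.org's hasOfferCatalog property. A sketch, with placeholder service names:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Tree Service",
  "hasOfferCatalog": {
    "@type": "OfferCatalog",
    "name": "Tree services",
    "itemListElement": [
      { "@type": "Offer", "itemOffered": { "@type": "Service", "name": "Tree removal" } },
      { "@type": "Offer", "itemOffered": { "@type": "Service", "name": "Stump grinding" } },
      { "@type": "Offer", "itemOffered": { "@type": "Service", "name": "Emergency storm cleanup" } }
    ]
  }
}
</script>
```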
Audit your robots.txt to confirm GPTBot, ClaudeBot, and PerplexityBot are allowed. If you're not sure, look for lines that say `User-agent: *` followed by any `Disallow` rules — those may be blocking everything (a corrected file is sketched just below).

Create an llms.txt file at your site root with a two-paragraph plain-English description of who you are, what you do, and what cities you serve.
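On the robots.txt item: if you'd rather be explicit than rely on the wildcard, a minimal file that names the major AI crawlers might look like this:

```text
# Explicitly allow the major AI crawlers

User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: *
Allow: /
```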
Build a real FAQ section on your homepage and service pages. Focus on the questions people actually ask before they call: service area, what's included, response time, what to do in an emergency.
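The visible version doesn't need anything fancy. Plain headings and short answers work, and they pair naturally with the FAQPage schema shown earlier. Placeholder copy:

```html
<section id="faq">
  <h2>Frequently asked questions</h2>

  <h3>Do you serve Lakewood?</h3>
  <p>Yes. We cover Lakewood and the surrounding west-side suburbs.</p>

  <h3>What's included in an emergency call?</h3>
  <p>A same-day assessment, hazard removal, and a written estimate
  for any follow-up work.</p>
</section>
```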
Add service area city pages. One page per city, with real content about what you do in that location. Not the same paragraph copied twelve times — actual local context. This is the highest-effort item but it's also the most durable: these pages rank on Google AND get picked up by AI crawlers.
This is exactly what our audit catches and what the content engine builds.
If you run the free brand audit at vibetokens.io/start, the AI Visibility module runs live against your site and flags exactly which signals are missing. It takes two minutes to start, and the report comes back within a few minutes.
The timing window
The businesses that rank well in AI search two years from now are the ones that set up the right signals today — not because they were technically sophisticated, but because they did it while most of their competitors were still asleep on it.
Google took years to go from "tech people use this" to "everyone uses this." AI search is moving faster. The floor is rising.
If you're a contractor or local service business and you're reading this: the gap between AI-visible and AI-invisible is not a knowledge problem or a budget problem. It's a "nobody told me this existed" problem.
Now you know.
Run a free AI visibility check on your business at vibetokens.io/start. The report is free, automated, and delivered to your inbox.
The five modules we check — GBP listing, site performance, keyword gaps, missing service pages, and AI visibility — are the same ones we fix as part of the $199/mo content engine. If your audit comes back with problems, we can start fixing them the same day.
