Google processes about 8.5 billion searches per day. That number has been pretty stable for years. Meanwhile, ChatGPT alone crossed 1 billion queries per day in early 2025 — and it's still climbing. Perplexity, Claude, Gemini, and a dozen other AI tools are growing right behind it.
Here's the thing most local business owners haven't caught yet: when someone asks ChatGPT "best roofer in Austin" or tells Perplexity "find me a reliable HVAC company near me," those AI models are pulling information from somewhere. And if your business hasn't told them what you do, they're guessing. Or worse — recommending your competitor instead.
There's a simple fix. One file. It's called llms.txt.
What Is llms.txt?
You've probably heard of robots.txt — the file that tells Google's crawlers what they can and can't access on your website. It's been around since the mid-90s and every serious website has one.
llms.txt is the AI equivalent. It's a plain text file that sits at the root of your website (like yourbusiness.com/llms.txt) and tells large language models exactly what your business does. Not in HTML. Not buried in page content that might get misinterpreted. Direct, structured, plain English that AI models can read and understand immediately.
Think of it this way: robots.txt is a set of rules for crawlers. llms.txt is a business brief for AI. It answers the questions that matter when someone is searching through an AI assistant instead of a traditional search engine.
Why This Matters Right Now
The shift from "search engine optimization" to "AI search optimization" is happening faster than most businesses realize. A few numbers worth sitting with:
- ChatGPT hit 1 billion daily queries — that's roughly 12% of Google's daily volume, achieved in about two years
- Perplexity processes over 100 million queries per day and is growing month-over-month
- 40% of Gen Z prefers AI assistants over Google for local recommendations (and they're the ones moving into homes and hiring contractors)
- Zero-click answers are the default — AI doesn't send users to 10 blue links, it gives one answer
When a homeowner asks an AI assistant to find a plumber, that AI is synthesizing everything it knows about plumbers in that area. If your business has an llms.txt file giving it clear, structured information — your name, services, service areas, what makes you different — you have a massive advantage over competitors who don't.
Most local businesses don't have one. That gap won't last forever.
What Goes in Your llms.txt File
Keep it simple. AI models don't need fancy formatting — they need clear facts. Here's what to include:
Business identity. Your business name, what you do in one sentence, and your primary website URL. No ambiguity.
Services. List every service you offer. Be specific. Don't just say "plumbing" — say "emergency pipe repair, water heater installation, drain cleaning, sewer line replacement, bathroom remodeling." AI models match queries to services, so the more specific you are, the more queries you'll show up for.
Service areas. Every city, town, neighborhood, and zip code you serve. AI assistants are increasingly location-aware, and they're looking for this exact data when someone asks for a local recommendation.
Differentiators. What makes you different from the other five businesses that do the same thing in your area? Licensed and insured? 24/7 emergency service? Family-owned for 30 years? Same-day appointments? This is what tips the AI's recommendation in your favor.
FAQs. The five to ten questions your customers ask most often, with clear answers. AI models love structured Q&A — it maps directly to how people search. "How much does a roof replacement cost in Phoenix?" "Do you offer financing?" "How fast can you get here for an emergency?"
Contact information. Phone number, email, hours, physical address if you have one. Make it effortless for the AI to connect the searcher to you.
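Putting those six sections together, a minimal llms.txt for a hypothetical plumbing company might look like the sketch below. Every name, license number, phone number, and service area here is a placeholder, so swap in your own details (the official llms.txt proposal uses markdown-style headings, which is what this example follows):

```markdown
# Smith & Sons Plumbing
> Family-owned plumbing company serving the Austin metro area since 1992.
> https://smithandsonsplumbing.example.com

## Services
- Emergency pipe repair (24/7)
- Water heater installation and replacement
- Drain cleaning
- Sewer line replacement
- Bathroom remodeling

## Service Areas
Austin, Round Rock, Cedar Park, Pflugerville (ZIP codes 78701-78759)

## Why Choose Us
- Licensed and insured (TX License #M-00000)
- Same-day appointments
- Family-owned and operated for over 30 years

## FAQs
Q: Do you offer financing?
A: Yes, 12-month interest-free financing on jobs over $1,000.

Q: How fast can you get here for an emergency?
A: Under 60 minutes anywhere in our service area, day or night.

## Contact
Phone: (512) 555-0100
Email: office@smithandsonsplumbing.example.com
Hours: Mon-Sat 7am-7pm, 24/7 emergency line
```

Plain, skimmable, and unambiguous, which is exactly what a language model wants.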
How to Create One
This is the part that trips people up because it sounds too simple. Here's the actual process:
- Open a plain text editor (Notepad, TextEdit, VS Code — doesn't matter)
- Write out the sections above in a clean, readable format
- Save the file as llms.txt
- Upload it to the root directory of your website (the same place your robots.txt lives)
- Verify it's accessible by going to yourbusiness.com/llms.txt in your browser
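If you'd rather script that last verification step than eyeball it, here's a short Python sketch. The section checklist is an assumption based on the outline above, not any formal spec, and the domain is a placeholder:

```python
import urllib.request

# Assumed checklist based on the sections recommended above -- not a formal spec.
REQUIRED_SECTIONS = ["Services", "Service Areas", "FAQs", "Contact"]


def missing_sections(text: str) -> list[str]:
    """Return the checklist sections that never appear in the file text."""
    lowered = text.lower()
    return [s for s in REQUIRED_SECTIONS if s.lower() not in lowered]


def check_live_file(domain: str) -> list[str]:
    """Fetch https://<domain>/llms.txt and report any missing sections."""
    url = f"https://{domain}/llms.txt"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return missing_sections(resp.read().decode("utf-8", errors="replace"))


# Usage (replace the placeholder domain before running):
#   check_live_file("yourbusiness.com")
# An empty list means every checklist section was found.
```

If the fetch raises an error, the file isn't where AI crawlers will look for it, which is the first thing to fix.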
That's it. No plugins. No monthly subscription. No developer needed. Five minutes and you're ahead of 95% of local businesses in your market.
If you want to go further, add structured schema markup to your website pages (FAQ schema, LocalBusiness schema, Service schema) and make sure your robots.txt isn't blocking AI crawlers like ChatGPT-User or ClaudeBot. The llms.txt file is the foundation — schema and crawler access are the next layers.
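For the crawler-access piece, the relevant robots.txt rules look like this. GPTBot is OpenAI's crawler and ChatGPT-User its browsing agent; ClaudeBot is Anthropic's. The blanket Allow lines are a sketch, so adapt them to whatever you already block:

```text
# Let AI crawlers read the site (add alongside your existing rules)
User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: ClaudeBot
Allow: /
```

Check your existing robots.txt first: a broad `Disallow: /` under `User-agent: *` will silently lock these crawlers out unless you carve out exceptions like the above.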
Check Your AI Visibility Right Now
Not sure where you stand? We built a free tool that checks how visible your business is to AI search engines — including whether you have an llms.txt file, whether AI crawlers can access your site, and whether your content is structured in a way AI models can parse.
Run your free AI visibility check here →
It takes 30 seconds and gives you a clear picture of what's working and what's missing.
The Window Is Open
AI search optimization is where SEO was in 2005. The businesses that figure it out early — while their competitors are still focused exclusively on Google — will own their local market in the AI era. An llms.txt file is the simplest first step you can take.
One file. Five minutes. A direct line to the AI models your future customers are already using.
