Generative Engine Optimization (GEO) files are structured assets that tell AI crawlers, agents, and language models how to read and represent your brand. Without them, AI platforms have to guess your brand's context from unstructured web content, and they often get it wrong or miss you entirely. CogNerd's GEO pipeline crawls your site and generates seven ready-to-deploy files in one automated job.

Documentation Index
Fetch the complete documentation index at: https://docs.cognerd.in/llms.txt
Use this file to discover all available pages before exploring further.
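As a quick way to explore that index programmatically, here is a minimal sketch that downloads the file and pulls out its link lines. Only the URL comes from this guide; the parsing is a simple line filter that assumes the common llms.txt convention of `- [Title](URL)` bullet links.

```python
from urllib.request import urlopen

def extract_links(text: str) -> list[str]:
    """Return the markdown bullet-link lines from an llms.txt body."""
    return [line.strip() for line in text.splitlines() if line.strip().startswith("- [")]

def fetch_llms_index(url: str = "https://docs.cognerd.in/llms.txt") -> list[str]:
    """Download the documentation index and list the pages it links to."""
    with urlopen(url, timeout=10) as resp:
        return extract_links(resp.read().decode("utf-8"))
```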
Enter your website URL
Type in your primary domain (e.g., https://yourdomain.com) and click Generate. Make sure this is the publicly accessible root of your site; CogNerd needs to crawl it.

Wait for the pipeline to complete
CogNerd runs a four-stage pipeline in the background:
- Page Discovery — maps all crawlable pages on your site
- Content Crawl — reads and extracts content from each discovered page
- Enhanced Asset Generation — builds structured representations of your brand, entities, and content
- Final Report — compiles and packages all seven output files
You receive a job ID when the run starts. If you navigate away and come back, the dashboard will show the current status of your job automatically.
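The dashboard handles status tracking for you, but the wait-and-check loop above can be sketched generically. The status payload shape (`stage`, `state`) below is an assumption for illustration, not a documented CogNerd API; the function just polls whatever status source you hand it until the job reports completion.

```python
import time
from typing import Callable

# The four pipeline stages named in this guide.
STAGES = ["Page Discovery", "Content Crawl", "Enhanced Asset Generation", "Final Report"]

def wait_for_job(get_status: Callable[[], dict],
                 interval: float = 5.0, timeout: float = 600.0) -> dict:
    """Poll get_status() until the job reports 'complete' or the timeout elapses."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status.get("state") == "complete":
            return status
        time.sleep(interval)
    raise TimeoutError("pipeline did not finish in time")
```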
Download your GEO files
When the pipeline finishes, click Download ZIP to get all seven files in a single archive:
| File | Purpose |
|---|---|
| llms.txt | Machine-readable brand summary for AI agents and language models |
| robots.txt | AI-aware crawl rules that guide what AI bots can and can't access |
| sitemap.xml | Page index for efficient crawling and indexing |
| site.jsonld | Site-level entity schema (schema.org) |
| organization.jsonld | Brand identity schema including name, logo, and social profiles |
| website.jsonld | Website and search action schema |
| entities.json | Entity relationship graph describing your brand's key concepts |
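For orientation, the llms.txt convention is a short markdown file: an H1 title, a blockquote summary, then H2 sections of annotated links. The sample below is illustrative only; your generated file will contain your own brand's content.

```markdown
# Example Brand

> One-paragraph summary of what the brand does, written for AI agents.

## Docs
- [Getting started](https://yourdomain.com/docs/start): setup guide

## Products
- [Product overview](https://yourdomain.com/products): what we offer
```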
Deploy the files to your site
Each file has a specific deployment target:
- llms.txt: the root of your domain (yourdomain.com/llms.txt)
- robots.txt: the root of your domain, the standard location crawlers check
- sitemap.xml: the root of your domain; reference it from robots.txt so crawlers find it
- JSON-LD schemas: embedded in your pages' head inside script type="application/ld+json" tags
- entities.json: typically served from the root of your domain alongside llms.txt
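Embedding the JSON-LD schemas follows the standard schema.org pattern: paste each file's contents into a script tag in your page head. The organization fields below are illustrative placeholders, not your generated output.

```html
<head>
  <!-- Paste the contents of organization.jsonld (and the other schemas) here -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Brand",
    "url": "https://yourdomain.com",
    "logo": "https://yourdomain.com/logo.png"
  }
  </script>
</head>
```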
Place llms.txt at the root of your domain so it's accessible at yourdomain.com/llms.txt. This is the most important file: AI agents check for it the same way browsers check for robots.txt.

GEO files reflect your site at the time of the crawl. If you make significant changes to your site, products, or brand, re-run the pipeline to keep your files current. You can regenerate as often as needed.