LLMs.txt and AI crawler controls
View Dashboard > Settings > SEO
Superblog includes two newer AI-facing SEO controls:
Generate LLMs.txt
Do not allow ChatGPT (OpenAI) to crawl
These settings help you decide how AI systems discover and consume your content.
Generate LLMs.txt
When enabled, Superblog generates an llms.txt-style file for your site.
The goal is to provide content in a format that is easier for AI tools such as ChatGPT, Claude, and Gemini to read and understand.
What to expect
the file is generated as part of deployment
deploys can take a bit longer when this is enabled
it is meant to improve AI readability, not replace your existing SEO setup
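Superblog controls the exact contents of the generated file, but llms.txt files generally follow a simple Markdown convention: a top-level heading with the site name, a short blockquote summary, and link lists pointing to key pages. A hypothetical sketch (the site name, summary, and URLs below are placeholders, not Superblog's actual output):

```text
# Example Blog

> A blog about widget manufacturing and supply chains.

## Posts

- [How widgets are made](https://example.com/how-widgets-are-made): step-by-step overview
- [Choosing a widget supplier](https://example.com/choosing-a-supplier): a buying guide
```

AI tools that support the convention can fetch this file instead of parsing full HTML pages.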
When to enable it
Enable this if you want your content to be easier for AI tools to consume and reference.
Do not allow ChatGPT (OpenAI) to crawl
This setting adds a robots.txt directive that asks OpenAI's crawler not to crawl your site, which keeps your content out of its indexing.
When to use it
Use this if you do not want ChatGPT or OpenAI-controlled crawling systems to access your site for indexing purposes.
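For reference, OpenAI's crawler identifies itself with the GPTBot user agent, so the directive added to your robots.txt will look roughly like this (the exact output Superblog generates may differ):

```text
User-agent: GPTBot
Disallow: /
```

Note that robots.txt is a voluntary convention: compliant crawlers honor it, but it does not technically prevent access to your pages.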
How these settings fit together
These options are independent.
That means you can choose to:
make your content more AI-readable
block a specific crawler
do both or neither depending on your publishing policy
Important note
These controls are about AI discovery and crawl preferences. They do not replace your normal SEO work like:
strong titles and descriptions
clean URLs
internal linking
sitemaps
structured data
Suggested approach
Choose a policy based on how you want your content to be used:
If you want maximum AI discoverability, consider enabling Generate LLMs.txt.
If you want to limit OpenAI crawling, enable the ChatGPT crawler block.
Revisit the settings as your content distribution strategy changes.
Related settings
You may also want to review:
custom robots.txt
noindex settings for archives
canonical URLs
schema and FAQs on important posts
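The "schema and FAQs" item above typically means JSON-LD structured data embedded in the page. A minimal hypothetical FAQPage snippet, using the standard schema.org vocabulary (the question and answer text are placeholders):

```text
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is llms.txt?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "A plain-text file that summarizes a site for AI tools."
    }
  }]
}
</script>
```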
Together, these give you more control over how both search engines and AI systems interact with your content.