Crawl Postponed Because Robots.txt was Inaccessible
Introduction
When you see the error message "Crawl postponed because robots.txt was inaccessible" in your Google Webmaster Tools dashboard, Google is telling you it can't crawl your site because something is wrong with your robots.txt. That effectively kills your site's SEO: unless someone already knows your site and goes to it directly, Google will not show it in search results.
The Problem
Sometimes the problem is that your site's hosting provider is blocking Google's crawler from reaching your site. I ran into a different issue.
I had a startup website with around 1,000 visitors per day. One day I changed something, and traffic dropped to zero. The robots.txt section of Webmaster Tools showed the warning "Crawl postponed because robots.txt was inaccessible." Yet my robots.txt seemed fine: I could open /robots.txt in a browser and get a 200 response.
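You can script the same check instead of eyeballing it in a browser. The sketch below uses only the Python standard library; since I can't hit the real site here, a tiny local server stands in for it (the handler and its content are illustrative, not my actual setup):

```python
import http.server
import threading
import urllib.request

# Stand-in for your site: a local server whose /robots.txt answers 200.
class RobotsHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/robots.txt":
            payload = b"User-agent: *\nAllow: /\n"
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(payload)))
            self.end_headers()
            self.wfile.write(payload)
        else:
            self.send_error(404)

    def log_message(self, *args):  # keep the demo output quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), RobotsHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The actual check: fetch /robots.txt and inspect the status code.
url = f"http://127.0.0.1:{server.server_port}/robots.txt"
with urllib.request.urlopen(url) as resp:
    status = resp.status
    body = resp.read().decode()
print(status)  # 200 means the file is reachable
server.shutdown()
```

Point the `url` at your own domain to run the check for real; a 200 here is exactly what I was seeing, which is why the warning was so confusing.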
Attempted Fixes
I thought I might need a rule like the following, to explicitly allow every URL on my site to be crawled. But I kept seeing the same warning. So I removed the file entirely, and the warning still persisted.
User-agent: *
Allow: /
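If you want to rule out the directives themselves, Python's standard library ships a robots.txt parser you can test against offline (the agent name and URL below are just examples):

```python
from urllib.robotparser import RobotFileParser

# Parse the same two directives and ask whether Googlebot may fetch a page.
parser = RobotFileParser()
parser.parse(["User-agent: *", "Allow: /"])
print(parser.can_fetch("Googlebot", "https://example.com/some/page"))  # True
```

In my case the directives checked out, which pointed to the problem being elsewhere.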
Root Cause and Solution
Then I learned how Google actually treats robots.txt: it crawls normally when accessing /robots.txt returns a 200 with the real file's content, or a 404 when the file doesn't exist (no file means no restrictions); anything else makes Google postpone the crawl. In my case, the application was returning 200 for /robots.txt even when the file was not actually there.
After I changed the logic, I had to wait for Google's next crawl. If you want Google to crawl your site more often, you can adjust the crawl rate in your Google Webmaster settings. According to the following article, Google will typically first revisit your pages within 2–4 days.
googles-faster-crawl-rate
This blog post explains how to change the crawl rate:
change crawl rate
Conclusion
If you encounter the "Crawl postponed because robots.txt was inaccessible" error, make sure your /robots.txt endpoint is returning the correct HTTP status code. Ensure it returns a proper 200 response with the actual robots.txt content, or a 404 if the file doesn't exist. Also consider adjusting your Google Webmaster crawl rate settings to speed up re-indexing.