Crawl Postponed Because Robots.txt was Inaccessible

2014/7/5 · 2 min read

When you see the error message "Crawl postponed because robots.txt was inaccessible" on your Webmaster Tools page, it is telling you that Google cannot crawl your site because something is wrong with your robots.txt. That effectively kills your site's SEO: unless somebody already knows your site and goes to it directly, Google will not show it in search results.

Sometimes the problem is that your hosting provider blocks Google's crawler from accessing your site. I ran into a different issue.

I had a website with around 1,000 visitors per day. It was a startup site, and one day, after I changed something, traffic dropped to zero. When I looked at the robots.txt section in Webmaster Tools, it showed the warning "Crawl postponed because robots.txt was inaccessible". But my robots.txt was there, and I could hit it by typing the URL /robots.txt and get a 200 response.

I thought I might need to change the file to the following, allowing every URL on my site to be crawled. But I kept seeing the same warning message. So I removed the file entirely, and the warning still did not go away.

User-agent: *
Allow: /

Then I noticed that Google will crawl everything as long as a request to /robots.txt returns a 200 or a 4xx error such as 404; it only postpones crawling when it gets some other response, such as a 5xx error or a timeout. In my case, the application returned 200 even when robots.txt was not actually there.
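
Before touching the application, it is worth checking what a crawler actually receives from /robots.txt, since what you see in a browser can differ. The following is only a minimal sketch in Python (https://example.com is a placeholder for your own domain); it sends a Googlebot-style User-Agent, which can also help reveal the hosting-provider blocking mentioned earlier.

import urllib.error
import urllib.request

GOOGLEBOT_UA = "Googlebot/2.1 (+http://www.google.com/bot.html)"

def check_robots(base_url):
    # Request /robots.txt roughly the way a crawler would and print what comes back.
    url = base_url.rstrip("/") + "/robots.txt"
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            body = resp.read(200).decode("utf-8", errors="replace")
            print(url, "-> HTTP", resp.status)
            print("first bytes of the body:", repr(body))
    except urllib.error.HTTPError as e:
        # A 404 here is fine: Google treats it as "no robots.txt" and crawls anyway.
        # A 5xx is what produces "Crawl postponed because robots.txt was inaccessible".
        print(url, "-> HTTP", e.code)
    except urllib.error.URLError as e:
        print(url, "-> unreachable:", e.reason)

check_robots("https://example.com")   # replace with your own domain

If a robots.txt file you have deleted still comes back as 200, it usually means some catch-all handler in the application is answering the request instead of the file itself.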

After I changed the application logic, all I could do was wait for the next time Google crawled the site; a sketch of that kind of fix follows below.
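
The post does not say which framework the site was built with, so the following is just a sketch of the kind of change involved, using Flask as an assumed stand-in: route /robots.txt explicitly, return the real file with a 200 when it exists, and return a plain 404 when it does not, so Googlebot always gets one of the two responses it handles without postponing the crawl.

import os
from flask import Flask, abort, send_file

app = Flask(__name__)
ROBOTS_PATH = os.path.join(os.path.dirname(__file__), "robots.txt")

@app.route("/robots.txt")
def robots_txt():
    # If the file exists, serve it as plain text with a 200.
    if os.path.isfile(ROBOTS_PATH):
        return send_file(ROBOTS_PATH, mimetype="text/plain")
    # If it does not, answer with an explicit 404 instead of letting a
    # catch-all route return 200 with unrelated content.
    abort(404)

if __name__ == "__main__":
    app.run()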

If you want to change how often Google crawls your site, you can adjust the crawl rate in Google Webmaster Tools. According to the following article, Google typically makes its first visit to your pages within 2-4 days:

googles-faster-crawl-rate

This blog post explains how to change the crawl rate:

change crawl rate