Stopping Googlebot traffic during high server load

by Islam Wazery   Last Updated August 17, 2018 15:04

On my website, we regularly see spikes in request volume caused by search engine crawlers (such as Googlebot) crawling the site for indexing.

My idea is to base the response on server health (CPU utilization): when the server is under heavy load, respond to bots with a 503 so they stop crawling and come back after a delay that I specify in the "Retry-After" header.
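For illustration, here is a minimal sketch of the idea, assuming a Flask app with psutil for CPU readings; the threshold, retry delay, and user-agent markers are placeholder values of mine, not a definitive implementation:

```python
# Sketch: return 503 + Retry-After to crawlers while CPU load is high.
# Assumptions (mine, not from the post): Flask app, psutil for CPU stats,
# a 90% CPU threshold, a one-hour retry delay, and a naive user-agent check.
import psutil
from flask import Flask, Response, request

app = Flask(__name__)

CPU_THRESHOLD = 90.0                     # percent; placeholder value
RETRY_AFTER_SECONDS = 3600               # "come back in an hour"; placeholder
BOT_MARKERS = ("Googlebot", "bingbot")   # illustrative, not exhaustive

psutil.cpu_percent(interval=None)  # prime the counter; the first call returns 0.0

@app.before_request
def throttle_bots_under_load():
    user_agent = request.headers.get("User-Agent", "")
    is_bot = any(marker in user_agent for marker in BOT_MARKERS)
    # interval=None reports utilization since the previous call, so this is
    # cheap enough to run on every request.
    if is_bot and psutil.cpu_percent(interval=None) > CPU_THRESHOLD:
        return Response(
            "Service temporarily unavailable",
            status=503,
            headers={"Retry-After": str(RETRY_AFTER_SECONDS)},
        )
    # Returning None lets Flask continue normal request handling.
```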

My questions are:

  1. Will responding with a 503 for the robots.txt request stop Googlebot (specifically) from accessing the rest of my site's resources?
  2. Or do I need to respond with a 503 for every resource on the site? And if I do, will that stop Googlebot from retrying, and will it honor the "Retry-After" header?

