Sometimes website owners want to take their website offline for server maintenance. But how does Google see such a site while it is down, and how do you keep the downtime from hurting your search presence? Here is a short guide.
Blackout: the 503 HTTP status code
During a blackout, webmasters should return a 503 HTTP status code for every URL participating in the blackout. The 503 tells Google the outage is temporary: the crawl rate will drop and the temporary "down for maintenance" content will not be indexed. A blackout handled this way does not cause long-term problems, and the site recovers fairly quickly once it is back.
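The advice above can be sketched as a minimal WSGI app that answers every request with 503. This is an illustrative assumption, not code from the original post; the handler name and the `Retry-After` value of one hour are made up for the example.

```python
# Minimal full-blackout sketch: every URL returns 503 so crawlers
# pause instead of indexing the maintenance page or de-indexing the site.
def blackout_app(environ, start_response):
    """Answer any request with 503 Service Unavailable."""
    start_response(
        "503 Service Unavailable",
        [
            ("Content-Type", "text/plain"),
            # Optional hint telling crawlers when to try again (seconds).
            ("Retry-After", "3600"),
        ],
    )
    return [b"Down for maintenance, back soon."]
```

You could serve this with `wsgiref.simple_server` in place of the normal application for the duration of the maintenance window.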
If you want to take only some pages, or one portion of the site, offline, make sure the status code of the robots.txt file is NOT changed to 503; it should keep returning 200 with its normal contents, otherwise Google may treat the whole site as unavailable. Some webmasters think that instead of returning 503 they can simply change robots.txt to Disallow the affected pages. That is a mistake: don't do it, because it can cause long-term problems, and the crawl rate may stay depressed long after the maintenance ends.
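A partial blackout can be sketched as a small routing function. The path prefix and function name below are hypothetical examples, not from the post; the point is that robots.txt keeps serving 200 while only the maintenance paths return 503.

```python
# Paths under maintenance during a partial blackout (illustrative example).
MAINTENANCE_PREFIXES = ("/shop/",)

def maintenance_response(path):
    """Return (status, headers) for a request during a partial blackout."""
    if path == "/robots.txt":
        # robots.txt must keep returning 200 with its normal contents,
        # NOT 503 and NOT a new Disallow rule.
        return ("200 OK", [("Content-Type", "text/plain")])
    if path.startswith(MAINTENANCE_PREFIXES):
        # Only the offline portion of the site returns 503.
        return ("503 Service Unavailable", [("Retry-After", "3600")])
    # Everything else is served normally.
    return ("200 OK", [])
```

For example, `maintenance_response("/shop/cart")` yields a 503, while `maintenance_response("/robots.txt")` and `maintenance_response("/about")` stay at 200.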
Thus, keep it simple and don't change too many things at once. In particular, don't change your DNS settings or the contents of robots.txt during the blackout; keep those settings constant.