Can I use robots.txt to optimize Googlebot’s crawl?

Can I use robots.txt to optimize Googlebot’s crawl? For example, can I disallow all but one section of a site (for a week) to ensure it is crawled, and then revert to a ‘normal’ robots.txt?

Blind Five Year Old, SF, CA
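
For concreteness, a "disallow everything except one section" file of the kind the question describes might look like the sketch below. This is only an illustration, and /archive/ is a hypothetical placeholder for the section to be crawled; Googlebot supports Allow directives and applies the most specific matching rule, so URLs under /archive/ stay crawlable while everything else is blocked.

    User-agent: Googlebot
    Allow: /archive/
    Disallow: /

Note that Google caches robots.txt (generally for up to a day), so a temporary swap like this neither takes effect nor reverts instantly.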


17 thoughts on “Can I use robots.txt to optimize Googlebot’s crawl?”

  1. After updating my robots.txt, how can I check whether my web pages will be indexed? Can I know immediately, or will it take a minimum of 7 days? Please help me figure out how to tell whether my blocked resources have been released or not. (A local way to check is sketched after these comments.)

  2. Can someone tell me how long it takes for Google to refresh robots.txt? My file has changed, but right now Google is using the old version, and it says 2,613 URLs have been blocked because, for some reason, the line with my Sitemap URL has "HTTPS" in front of it, but I don't have an SSL certificate! It is supposed to read "HTTP". I need it refreshed ASAP, please! (The refresh timing is addressed in the note after these comments.)
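
On both questions above: there is no fixed schedule such as a 7-day wait; Google generally re-fetches a site's robots.txt within about a day, so changes (including a corrected Sitemap line) should be picked up on roughly that timescale, and Search Console's robots.txt report shows the version Google last fetched. In the meantime, you can check locally which URLs a given robots.txt blocks or releases. The sketch below uses Python's urllib.robotparser with a hypothetical rules file and example.com URLs; substitute your own rules and pages.

    import urllib.robotparser

    # Hypothetical robots.txt contents -- paste your own rules here.
    ROBOTS_TXT = """\
    User-agent: Googlebot
    Allow: /archive/
    Disallow: /
    """

    parser = urllib.robotparser.RobotFileParser()
    parser.parse(ROBOTS_TXT.splitlines())

    # Report which URLs Googlebot may fetch under these rules.
    # Python's parser applies rules in file order (first match wins),
    # so keep Allow lines above the broad Disallow they carve out of;
    # Googlebot itself uses the most specific matching rule.
    for url in ("http://www.example.com/archive/page.html",
                "http://www.example.com/private/page.html"):
        verdict = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
        print(url, "->", verdict)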
