Can I use robots.txt to optimize Googlebot's crawl?



Can I use robots.txt to optimize Googlebot’s crawl? For example, can I disallow all but one section of a site (for a week) to ensure it is crawled, and then revert to a ‘normal’ robots.txt?

Blind Five Year Old, SF, CA
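
As a sketch of what the question describes, a temporary robots.txt that blocks Googlebot from everything except one section might look like the following (where `/section/` is a hypothetical path standing in for the part of the site to be crawled):

```
# Hypothetical temporary robots.txt
# Googlebot honors Allow rules; the more specific matching rule wins,
# so /section/ stays crawlable while everything else is blocked.
User-agent: Googlebot
Allow: /section/
Disallow: /
```

Reverting to the "normal" robots.txt would then mean replacing this file with the original rules after the week is over.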
