Can I use robots.txt to optimize Googlebot’s crawl? For example, can I disallow all but one section of a site (for a week) to ensure it is crawled, and then revert to a ‘normal’ robots.txt?
Blind Five Year Old, SF, CA
source
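A minimal sketch of what the questioner describes (the section path is made up for illustration): a temporary robots.txt that blocks everything except one section, which would later be swapped back for the normal file. Note that `Allow` is a Googlebot extension rather than part of the original robots.txt standard, and as the video's answer suggests, this kind of crawl-steering trick is generally not recommended.

```
User-agent: *
Allow: /priority-section/
Disallow: /
```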
Is this guy Matt Cutts?
This is an old video but this answer still holds. Thank you!
Nice explanation. Thanks for it.
nope
After updating my robots.txt, how can I check whether my web pages will be indexed? Can I know immediately, or will it take a minimum of 7 days? Please help me find out whether my blocked resources have been released.
1980
What is the root page he is talking about?
Can someone tell me how long it takes for Google to refresh the robots.txt? My robots.txt file changed. Right now Google is using the old version, and it says 2,613 URLs have been blocked because, for some reason, the line that shows my Sitemap URL has "HTTPS" in front of it, but I don't have an SSL certificate! It's supposed to read "HTTP". I need it refreshed ASAP, please!
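For commenters wondering whether a given robots.txt actually blocks their pages, here is a hedged sketch using Python's standard `urllib.robotparser`. The rules and URLs below are invented for illustration; you would normally point the parser at your site's live file with `set_url()` and `read()` instead of parsing an inline string.

```python
from urllib import robotparser

# Hypothetical robots.txt content, for illustration only.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether Googlebot may fetch specific URLs under these rules.
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))
print(parser.can_fetch("Googlebot", "https://example.com/public/page.html"))
```

This only tests the rules locally; it does not tell you when Google will re-fetch the file, which happens on Google's own schedule.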
What is the best way to convert from ASP to ASPX without losing all of my Google links?
@iLovePalestineDotCom You can always remove the already indexed pages too.
Why don't you stop answering questions from that five-year-old? Is he your cousin or something?
KINDA NOOOOOOOOOO VIDEOS fan club!!!
There were NO ROBOTS txting, as suggested by the title.
You're a nice guy Matt, but I'm done with you professionally. 🙂
Google needs brand-new Googlebot technology, just like the new search algorithm.
@Sander33333 Go to the webmaster forum. If you're lucky, Matt will answer your question in a video.
Don't scare Googlebot, come on, man 🙁
Awesome Matt you Rockkk!