I have the following robots.txt to disallow all bots except Google's. I made this change last week, but when I search for my domain name in Google I still get:
"A description for this result is not available because of this site's robots.txt."
Am I doing something wrong? How often does Google re-crawl a domain?
Your robots.txt is not doing what you want (but that’s not related to the problem you mention).
If you want to disallow crawling for every bot except Googlebot, use a robots.txt built from these two rules:
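A sketch of such a file, following those rules (Googlebot is the standard user-agent token for Google's main crawler; each crawler obeys only the most specific group that matches it):

```
# Google's crawler: the empty Disallow allows everything
User-agent: Googlebot
Disallow:

# All other bots: disallow every URL
User-agent: *
Disallow: /
```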
- "Disallow: /" means: disallow every URL
- "Disallow:" (empty value) means: disallow nothing, i.e., allow everything
To your actual problem: there is no definite answer to how often Google crawls. It depends on many factors that we, as outsiders, cannot calculate. Having to wait a week or two is not unusual.