Recently I have been having some issues with the robots.txt file on my web server, according to Google Webmaster Tools. More precisely, I get a "Crawl postponed because robots.txt was inaccessible." message. This is strange, because if you try to access it at http://www.newsflow24.com/robots.txt it looks just fine, and even Google's crawl tester reports no problem, yet the real Googlebot seems to be having issues with it.
So to find out what's happening, I would like to know how I can see a log file (or something similar) that shows exactly what happened when the Googlebot tried to access the robots.txt file at a given time.
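One place to look is the web server's access log, which records every request, including the ones Googlebot makes for robots.txt. As a minimal sketch (the log path and the sample log line below are hypothetical; the actual path depends on your server and distribution, e.g. /var/log/apache2/access.log on Debian/Ubuntu Apache setups), you could filter the log like this:

```shell
# Hypothetical sample line in combined log format, written to a temp file
# so the filtering step below can be demonstrated; on a real server you
# would point grep at your actual access log instead.
cat > /tmp/access_sample.log <<'EOF'
66.249.66.1 - - [12/May/2012:10:15:32 +0000] "GET /robots.txt HTTP/1.1" 200 154 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
EOF

# Show every request for robots.txt whose user agent mentions Googlebot,
# including the timestamp and the HTTP status code the server returned.
grep -i "googlebot" /tmp/access_sample.log | grep "robots.txt"
```

The timestamp in each matching line tells you when the fetch happened, and the status code (200, 403, 5xx, etc.) tells you what the server actually returned. Note that the user-agent string can be spoofed, so a request claiming to be Googlebot is not proof it really came from Google.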
The first paragraph is just the background to my problem; my real question is the second paragraph.
Thanks in advance.