Is it possible to tell Google not to crawl these pages?
These are basically Ajax calls that return blog post data.
I created this in robots.txt:
But now I have another page that I want to allow, which is
Is there a way to tell robots to block only pages that end with a number, e.g.
I also got the error below when I tried to validate the robots.txt file:
Best How To:
Following the original robots.txt specification, this would work (for all conforming bots, including Google’s):
This blocks all URLs whose path begins with /blog/pages/ followed by any number. So you should not append the * character (it's not a wildcard in the original robots.txt specification, and not even needed in your case for bots that do interpret it as one).
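Such a robots.txt can be sketched as follows. Since the original specification has no wildcards but Disallow values are prefix matches, one line per leading digit covers every path that starts with /blog/pages/ plus a number (the /blog/pages/ prefix is an assumption carried over from the question):

```
User-agent: *
Disallow: /blog/pages/0
Disallow: /blog/pages/1
Disallow: /blog/pages/2
Disallow: /blog/pages/3
Disallow: /blog/pages/4
Disallow: /blog/pages/5
Disallow: /blog/pages/6
Disallow: /blog/pages/7
Disallow: /blog/pages/8
Disallow: /blog/pages/9
```

Because these are prefix matches, Disallow: /blog/pages/2 blocks /blog/pages/2, /blog/pages/23, /blog/pages/2?foo=bar, and so on, while paths like /blog/pages/about remain crawlable.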
Google supports some features in robots.txt which are not part of the original robots.txt specification, and which are therefore not supported by (all) other bots, e.g., the Allow field. But as the robots.txt above would work, there is no need to use it.
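For completeness, a sketch of how the Google-specific Allow field could be used instead (the paths here are hypothetical, and this is not portable to bots that only follow the original specification):

```
User-agent: Googlebot
Disallow: /blog/pages/
Allow: /blog/pages/welcome
```

Google resolves conflicts by the most specific (longest) matching rule, so /blog/pages/welcome would stay crawlable here while other /blog/pages/... URLs are blocked.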