I want my site to be indexed by search engines, except for a few sub-directories. I currently have the following:
- a robots.txt in the root directory
- a robots.txt in the sub-directory (to be excluded)
Is this the correct way, or will the root-directory rule override the sub-directory rule?
Answer:
No, this is wrong.
You can't have a robots.txt in a sub-directory; crawlers will never look for it there. Your robots.txt must be placed in the document root of your host.
If you want to disallow crawling of URLs whose paths begin with
/foo, use this record in your robots.txt:
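Assuming /foo stands in for your actual sub-directory name, the record would look like this:

```
User-agent: *
Disallow: /foo
```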
This allows crawling of everything (so there is no need for an
Allow line), except URLs whose paths start with
/foo, such as /foo, /foo.html, or /foo/bar.
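You can verify this matching behavior with Python's standard-library robots.txt parser. This is just an illustration; example.com and the /foo paths are placeholder names:

```python
import urllib.robotparser

# Parse the same record as above (placeholder rule for illustration).
rp = urllib.robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /foo"])

# URLs whose paths begin with /foo are blocked,
# including prefix matches like /foobar.
print(rp.can_fetch("*", "https://example.com/foo/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/foobar"))         # False

# Everything else remains crawlable.
print(rp.can_fetch("*", "https://example.com/bar/page.html"))  # True
```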