We have a global client with five country sites:
domain.com/us/, domain.com/uk/, domain.com/de/, etc.
We have set up sitemap.xml files inside each folder:
domain.com/us/sitemap-us.xml, domain.com/uk/sitemap-uk.xml, domain.com/de/sitemap-de.xml
Now, will Google pick up these sitemaps automatically inside each folder, or do we have to specify each one in the parent robots.txt? I ask because, according to Google's help docs, robots.txt can only live once, at the root of the domain. So even if we put a robots.txt file inside each country's folder, it's useless -- the respective sitemap won't be picked up.
How should we tell Google to pick them up?
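For what it's worth, the robots.txt standard allows multiple Sitemap directives in the single root file, each pointing to a full absolute URL, so something like the sketch below should work (using the sitemap URLs from above):

```text
# robots.txt at domain.com/robots.txt
# Sitemap directives may appear anywhere in the file and are not tied to a User-agent group
User-agent: *
Allow: /

Sitemap: https://domain.com/us/sitemap-us.xml
Sitemap: https://domain.com/uk/sitemap-uk.xml
Sitemap: https://domain.com/de/sitemap-de.xml
```

Alternatively, each sitemap can be submitted directly in Google Search Console, which doesn't depend on robots.txt at all.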