I have a problem with the Fetch as Google tool for my AJAX website. My site is a slightly old AJAX site built with jQuery. The developers who made it didn't use hash fragments; instead they defined static routes, and AJAX calls are used only within the views (to load the page content). Now I want to make one specific page Google-friendly, and I've already implemented what Google asks here.
Since my site is not a full single-page app, I went straight to the third step. In my route file, if I see a ?_escaped_fragment_= parameter, I return a custom template file containing server-generated content. (So it should be crawlable, right?)
Here is an example: http://example.com/topic/Health/Conditions_and_Diseases
This page uses an AJAX call to get details from the server and update the view. I included the <meta name="fragment" content="!"> tag in this page, so the Google crawler should go to: http://example.com/topic/Health/Conditions_and_Diseases?_escaped_fragment_=
That version of the page generates its content on the server side, with no AJAX calls.
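As I understand the scheme, the crawler derives that ugly URL mechanically from the pretty one; roughly (my own sketch, not Google's actual code):

```javascript
// Sketch of the URL mapping Google's AJAX crawling scheme performs
// for pages carrying <meta name="fragment" content="!">.
function escapedFragmentUrl(prettyUrl) {
  // Append ?_escaped_fragment_= (or &_escaped_fragment_= if the URL
  // already has a query string).
  const sep = prettyUrl.includes('?') ? '&' : '?';
  return prettyUrl + sep + '_escaped_fragment_=';
}
```

So http://example.com/topic/Health/Conditions_and_Diseases becomes http://example.com/topic/Health/Conditions_and_Diseases?_escaped_fragment_= when the crawler requests it.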
Is this the correct setup? When I try to fetch this page in the Webmaster tool, it doesn't load anything. The fetch stays in "pending" and eventually ends in an error (it takes a long time to show as Error, and nothing is said about what the error is). I've confirmed that both versions work by manually visiting each URL. Before I implemented this, the Fetch tool actually showed an image of the page without content, so now I was expecting to see it with content. I have no idea why it takes so long and then errors out.
Can somebody please tell me where I've gone wrong? Is my understanding of ?_escaped_fragment_= correct?
Thank you in advance. Looking forward to tips from you all.