I'm using fullpage.js jQuery plugin for a Single page application.
I'm using mostly default settings and the plugin works like a charm. When I got to the SEO part, though, I couldn't get Google to crawl my website on a "per slide" basis.
All my slides are loaded at page load, so AJAX isn't even an issue, but GoogleBot doesn't seem to be able to treat each slide as a separate page (I presume because all the HTML content is always loaded and it can't tell the various slides apart).
Is there a way to make GoogleBot understand that each "/#Section/Slide" is a different page and should have a different result in the search engine? Maybe with microdata or any other semantic workaround?
Answer:
You probably won't be able to force Google to index your anchor links as different pages; they will be indexed as a single page. Google will read your page as what it really is: a single page.
There are some recommendations which suggest using the `id` attribute to separate content. The problem is that this won't work with fullPage.js, because fullPage.js doesn't allow you to use the same `id` value as one of your anchors. And if you use a different value, Google might index the URL with that `id`, which will cause fullPage.js to break once you access the link.
I believe the most you can do is use `section` elements, like so:
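A minimal sketch of the markup, assuming a wrapper element for the plugin (the wrapper `id` and the content are illustrative, not from the original post):

```html
<!-- The wrapper id "fullpage" is an illustrative assumption -->
<div id="fullpage">
  <section>
    <h1>First slide title</h1>
    <p>Content for the first slide...</p>
  </section>
  <section>
    <h1>Second slide title</h1>
    <p>Content for the second slide...</p>
  </section>
</div>
```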
And initialize fullPage.js using the `sectionSelector` option like so:
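A sketch of the initialization, assuming jQuery and a `#fullpage` wrapper (the wrapper selector and anchor names are illustrative assumptions):

```javascript
// Assumes jQuery and fullpage.js are already loaded on the page.
$(document).ready(function () {
  $('#fullpage').fullpage({
    // Treat <section> elements as sections instead of the
    // default ".section" class selector.
    sectionSelector: 'section',
    // Illustrative anchor names; note they must not match
    // the id of any element on the page, as discussed above.
    anchors: ['firstSection', 'secondSection']
  });
});
```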
And, as suggested in the previous link, use `h1` elements inside each section.
You can also take a look at this video from Google regarding single-page websites. But I would suggest testing it yourself with some single-page sites you might know, for example the main fullPage.js website. Look for portions of text displayed in different sections or slides and you'll find that Google indexes them without any problem, although the indexed link won't contain the anchor.