How to Get a Single-Page Application Indexed
“Single-Page Applications (SPAs) are Web apps that load a single HTML page and dynamically update that page as the user interacts with the app.” Because SPAs are dynamically generated, the first thing I had to do was let Google know about each URL. This is where I started a very in-depth XML Sitemap strategy, which we dubbed Ungoliant. After 3 months of intense XML optimization, log file analysis, and monitoring of Google Search Console, we created Shelob, our Title Tag, Description Tag & Alt Text script.
The XML Sitemaps are all gzip-compressed (.xml.gz) and each is capped at 50 MB or 50,000 URLs. A basic XML Sitemap structure for the client:
<!-- Sitemap Location on Robots.txt -->
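The post doesn't show the actual robots.txt entry; a minimal sketch would look like the following, where the domain and filename are placeholders rather than the client's real values:

```
# robots.txt — the Sitemap directive may point at a compressed sitemap index
User-agent: *
Sitemap: https://example.com/sitemap-index.xml.gz
```

Pointing the directive at the index file (rather than listing every individual sitemap) keeps robots.txt stable as sitemaps are added or removed.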
<!-- Multiple Sitemap Management (compressed) -->
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
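Completed, a sitemap index of this shape would look like the sketch below; the URLs and dates are placeholders, not the client's actual files:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each entry points to one compressed sitemap file -->
  <sitemap>
    <loc>https://example.com/sitemap-1.xml.gz</loc>
    <lastmod>2018-06-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-2.xml.gz</loc>
    <lastmod>2018-06-01</lastmod>
  </sitemap>
</sitemapindex>
```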
<!-- Sitemap with Single URL & Image Information -->
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:image="http://www.google.com/schemas/sitemap-image/1.1" xmlns:xhtml="http://www.w3.org/1999/xhtml">
</urlset>
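A complete entry under those namespaces might look like the sketch below — the image namespace carries image metadata, and the xhtml namespace is typically there for hreflang alternate links. The URL, image path, caption, and hreflang value are all placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/product/123</loc>
    <lastmod>2018-06-01</lastmod>
    <!-- Language alternate, declared via the xhtml namespace -->
    <xhtml:link rel="alternate" hreflang="en" href="https://example.com/product/123"/>
    <!-- Image information, declared via the image namespace -->
    <image:image>
      <image:loc>https://example.com/images/product-123.jpg</image:loc>
      <image:caption>Placeholder product caption</image:caption>
    </image:image>
  </url>
</urlset>
```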
We do not use any Priority in our XML Sitemaps.
We rely on lastmod.
Having to decide which URLs are more important than others just didn't make sense, so we left it at lastmod.
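In practice that means each url entry carries only a loc and a lastmod, with no priority element — for example (placeholder values):

```xml
<url>
  <loc>https://example.com/category/shoes</loc>
  <lastmod>2018-06-15</lastmod>
  <!-- no <priority>: freshness signalled via lastmod only -->
</url>
```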
Our next challenge was deciding how often we would generate the XML Sitemaps. Due to the large scale, we didn’t want to generate on a daily basis. If the platform expanded 10x, we might start running into problems.
After each XML Sitemap update, we submitted the XML Sitemap to Search Console and monitored our Log Files.
For the first month, Google crawled regularly with each update. After 2 months, crawling became less frequent. (I guess they could start to form a pattern and had enough information to know when to recrawl certain URLs.)
We went from updating twice a week, to updating once a week.
Every time we updated the XML Sitemaps, we saw the Google crawler requesting our INDEX file, but not all of the individual XML Sitemaps.
By then we knew once a week was where we would keep it, and we stopped submitting manually to Search Console.
Fixing some page speed errors improved our Crawl Budget. Another test I ran involved 4 new URLs: two were left untouched, and for the other two I created backlinks from Pinterest. The two that received backlinks were recrawled and indexed by Google before the other two.
Increasing Total Clicks & Total Impressions & Avg Position.
Structured Data | Currently in Test Mode (BETA)
Indexed Pages Increased by
Avg Keyword Clicks Increased by