My Experience Using a Prerender Service

JavaScript Client-side Rendering Drawbacks

I explain the differences between server-side rendering and client-side rendering in another programming blog post. Search engine bots have difficulty processing client-side rendered pages because most bots can’t execute JavaScript. Although Google handles JavaScript pages better than other search engines do, it’s still not perfect.

Why Use Prerender Services?

Prerender services crawl a website as a normal visitor would, using their own user-agent. They download the DOM (Document Object Model) and cache the resulting HTML on their servers. When a search engine bot requests a page from the web host, the request is intercepted by the prerender service, and the server-side cached version is served instead of the client-side rendered version.
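The interception step usually boils down to user-agent detection on the server. Here’s a minimal sketch of that decision logic; the bot patterns and the asset-extension filter are illustrative assumptions, not taken from any particular prerender vendor:

```javascript
// Hypothetical sketch: decide whether a request should be served the
// prerendered snapshot instead of the client-side rendered app shell.
// The bot list below is illustrative and not exhaustive.
const BOT_PATTERNS = [
  /googlebot/i,
  /bingbot/i,
  /yandex/i,
  /baiduspider/i,
  /twitterbot/i,
  /facebookexternalhit/i,
];

function shouldPrerender(userAgent, path) {
  if (!userAgent) return false;
  // Static assets are served as-is; only document requests are intercepted.
  if (/\.(js|css|png|jpe?g|svg|ico|woff2?)$/i.test(path)) return false;
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent));
}
```

When `shouldPrerender` returns true, the server would proxy the request to the prerender service’s cache; otherwise it serves the normal “index.html” shell. This is also the logic that silently fails in the routing incident described below.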

All of this sounds simple and seems ideal. However, like all technologies, this supposedly simple system has its quirks. Read on to discover some of the problems I encountered while relying on a prerender solution for a JavaScript website.

Prerender Encounters

I won’t expose the website where I encountered a prerender service for the first time. It consisted of an “index.html” shell page which would load React and client-side render the page according to the URL and backend API calls. By default, Google wasn’t able to crawl the DOM HTML; prerendering was required to index the pages in the search engine.

Based on my experiences, my conclusion is that websites should strive to serve pages to search engines themselves or use a CDN (which caches page source HTML code and CSS/JS files). You could reserve select static pages for indexing, and then use JavaScript for dynamic pages that don’t need to be indexed by search engines. This can be accomplished by creating a separate version of the site (usually a sub-domain) that’s server-side rendered and designed to be SEO-friendly.

Here are all the headaches I encountered when I used a prerender service.

Prerender Failure

There was a routing failure that caused search engine requests to not be redirected to the prerender service. As a result, our client-side rendered pages were indexed improperly: Google was only able to crawl the page source of the “index.html” shell page instead of the DOM, showing a generic title and a garbled description snippet. Once the routing was restored, Google was able to crawl the URLs correctly, and traffic returned to normal levels.

CSS Parsing Problems

When I logged into the prerender service interface and previewed a number of our pages, I discovered that their appearance differed drastically from what users were seeing. The layout looked like a giant jigsaw puzzle, and this was what was being served to Google. The problem was that the prerender service wasn’t able to parse certain CSS selectors on the page. They solved the problem by implementing a custom fix for our account.

Desktop vs. Mobile CSS Intricacies

Some React websites are programmed to dynamically render for mobile users instead of relying on CSS breakpoints. This could cause desktop pages to be served to Google smartphone bots, causing the search engine to conclude that pages aren’t mobile-friendly. I can’t say that I encountered this issue, but it’s possible for this problem to occur if a React site doesn’t depend on CSS @media breakpoints.
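To make the risk concrete, here’s a hedged sketch contrasting the two approaches. The function name and user-agent substrings are my own illustrations; a prerender crawler that doesn’t identify itself as a mobile device would take the desktop branch, so the cached snapshot handed to Google’s smartphone bot would be the desktop layout:

```javascript
// Fragile approach (hypothetical): layout branch chosen at render time
// from the user-agent string. A desktop-identifying prerender crawler
// always lands in the 'desktop' branch, so mobile bots get desktop HTML.
function layoutFromUserAgent(userAgent) {
  return /Mobile|Android|iPhone/i.test(userAgent) ? 'mobile' : 'desktop';
}

// Safer approach: render one responsive document and let CSS adapt it,
// e.g. `@media (max-width: 768px) { ... }` in the stylesheet. The same
// cached snapshot then works for desktop and smartphone crawlers alike.
```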

Modals Are Cached Instead of Page Content and Features

If a pop-up modal becomes the main focus of the window, then it’s possible for the prerender user-agent to crawl only the HTML of the pop-up. This prevents Google from crawling the content and links in the DOM, which could keep pages from ranking for search terms.

A/B Testing Experiences Get Indexed

Most A/B testing platforms exclude search engine bots from receiving experiment variations. However, prerender user-agents aren’t excluded by default. Therefore, it’s possible for Google to crawl an experiment variation and treat it as part of the page’s HTML. If the variations aren’t SEO-friendly, then the page’s rankings can decrease.
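One mitigation is to extend the experiment code’s own bot exclusion to cover prerender fetches, so the cached snapshot always contains the control version. This is a minimal sketch under my own assumptions; the excluded substrings and the bucketing rule are illustrative, not from any specific A/B testing platform:

```javascript
// Hypothetical sketch: skip experiment bucketing for known crawler and
// prerender user-agents so cached snapshots never contain a variation.
const EXCLUDED_AGENTS = ['googlebot', 'bingbot', 'prerender', 'headlesschrome'];

function chooseVariant(userAgent, bucketRoll) {
  const ua = (userAgent || '').toLowerCase();
  if (EXCLUDED_AGENTS.some((name) => ua.includes(name))) {
    return 'control'; // crawlers and prerender fetches always see control
  }
  // Real visitors are bucketed as usual (here: a simple 50/50 split).
  return bucketRoll < 0.5 ? 'control' : 'variant';
}
```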

Are There Prerender Alternatives?

There are several prerendering services that render JavaScript pages into HTML for search engines to crawl:

  • CoreRender
  • SEO4Ajax
  • SnapSearch

I discovered these services by browsing Stackshare. A CDN could be a viable alternative if the server is able to intercept the search bot request and serve a server-side rendered HTML version for crawling.

Do I Recommend Prerendering Services?

