
Main SPA SEO challenges and ways to make your web app discoverable in search

Dec 24, 2020 · 10 min read

If you are going to build a single-page application (SPA), you should think about its visibility on the web beforehand. SPA SEO is different from that of a static website because search engine and social media agents have to deal with JavaScript instead of HTML.

Since 2015, Googlebot has been able to render JavaScript and index dynamic content on SPAs; however, the process still has some peculiarities and drawbacks. As for other crawlers, neither Bing, DuckDuckGo, and Baidu nor the Facebook and Twitter bots can cope with JavaScript so far.

Does that mean SPAs are bad for SEO? No, not if they are done right. Plenty of SPA pages rank high on Google and look good when shared on social media. Yours can too. Our article and single-page app SEO tips from Pavel, Frontend Developer at Proxify, will help you avoid common missteps and get your project to the top of search results.

Let’s start with cold hard facts about SPA SEO

Classic SPAs have their content rendered client-side (in the browser). When your app loads, your server sends the browser an empty HTML shell and some JavaScript code to execute. The browser runs the code and fills the page with meaningful content. As new pieces of code execute, the content inside the shell changes dynamically, letting users navigate through different views of your app without loading new HTML pages.
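The initial server response for such an app typically looks something like the fragment below (file and element names are illustrative):

```html
<!-- The "empty shell" a classic SPA server returns for every route.
     bundle.js then renders the actual content into the #app element. -->
<!doctype html>
<html>
  <head>
    <title>My App</title>
  </head>
  <body>
    <div id="app"></div>
    <script src="/bundle.js"></script>
  </body>
</html>
```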

The way SPAs operate enhances the user experience but creates several bottlenecks that are critical for your project's SEO performance:

  • When Googlebot renders and indexes dynamic pages, success depends heavily on the app’s JavaScript rendering speed and stability.

  • To rank on search, each view of your app should have an individual clean URL despite being rendered on a single HTML page.

  • Search engine and social media bots should get metadata in the static HTML to index your pages or show previews when the pages get linked to on socials.

Now let’s zoom in on each of those facts to see why they are an issue and how you can make your web app perform fine in terms of SEO.

1. JavaScript code should run fast and flawlessly

If you choose client-side rendering for your SPA, you’ll have to keep an eye on how Googlebot processes it and try to simplify the task. As described in the official documentation, Googlebot has a three-stage workflow for processing JavaScript pages, which includes crawling, rendering, and indexing.

How Googlebot crawls, renders, and indexes SPAs

Source: JavaScript SEO on Google Search Central

SEO pitfalls of client-side rendering

Because of the extra stage needed for rendering, it takes longer for SPA content to appear or update in search compared to static HTML pages. The risk of indexing errors also goes up. You may face the following challenges:

  • Your SPA pages may have to wait in the render queue, which can be long given the number of pages created and updated on the web every day.

  • Your JavaScript code should execute fast enough to fit into Googlebot’s render budget; otherwise, some parts of your pages may not render properly and get indexed with incomplete content.

  • In case of rendering errors, the HTML generated by Googlebot may be broken or empty, which may block entire pages from indexation.

The above concerns have been confirmed by independent single-page application SEO experiments. The results showed that Googlebot reliably indexes content generated immediately after a SPA page loads. They also showed that content meant to render on demand (as users scroll or interact with the page) or after a long delay (content coming from external resources or animated text) doesn’t always get indexed because of the limited render budget.

“For SEO, it's important to have small JS code for a fast initial page load. This concerns SPAs with both client-side and server-side rendering. Implementing lazy loading solves the problem of slow web apps very well.” – Pavel, Frontend Developer

There’s one more important thing Googlebot does after rendering your JavaScript code. It parses the generated HTML for links and passes all the URLs it finds back to the crawling queue. Because links become available to the crawler only after rendering, Googlebot may miss some of them if the code execution fails. And even when they are discovered, your app’s URLs may turn out to be unfriendly to the crawler.

2. URLs and internal linking should be SEO-friendly

You might want different views of your app to be indexed as standalone pages and rank in search. For this, you should help Googlebot find all of your URLs and navigate through your app architecture. Beware that crawlers are sensitive to the way you implement routing between different views of your web app.

SPAs rely on routers that keep their UI and content in sync with the changing URLs without refreshing the page. When it comes to SPA routing modes, you have two options: hash-based routing and History API routing. Only the latter is SEO-friendly.

SEO pitfalls of SPA URLs

If your router works in hash mode, it will append #hash fragments to your home page URL to generate unique URLs for different pieces of your SPA content. Crawlers ignore hashed URLs because they indicate parts of the same page, so different app views won’t be indexed as separate pages if you use hash-based routing.

If your router works in history mode (uses the History API), crawlers will get a unique, normal URL for every separate piece of content. However, these URLs won’t correspond to real files on your app’s server, which may cause 404 errors when a user tries to access one of them directly.
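To make the difference concrete, here is a hypothetical helper contrasting the URLs the two modes produce (`buildUrl` is an illustration, not a real router API; real routers such as Vue Router or React Router build these for you):

```javascript
// Hypothetical illustration of the URLs each routing mode produces.
function buildUrl(origin, route, mode) {
  if (mode === "hash") {
    // Hash mode: the route lives after "#", so crawlers treat every
    // view as a fragment of the same home page.
    return `${origin}/#${route}`;
  }
  // History mode: the route is a real path, unique per view.
  return `${origin}${route}`;
}

console.log(buildUrl("https://example.com", "/products/42", "hash"));
// → https://example.com/#/products/42 (not indexed separately)
console.log(buildUrl("https://example.com", "/products/42", "history"));
// → https://example.com/products/42 (indexable standalone URL)
```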

To use clean and SEO-friendly URLs while avoiding the 404 problems, you’ll have to add a fallback route for your server to redirect requests to your index.html page where your app lives. Although this requires additional effort, all popular JavaScript frameworks and libraries provide such an option.

Having unique URLs is enough for indexing but not for ranking high on search. To make your URLs competitive, you’ll have to provide them with unique titles, descriptions, and preview images. Moreover, if you want them to be picked up not only by Googlebot but also other agents, you’ll need them in static HTML.

“Make sure you create an XML sitemap for your project. It is submitted to Google, Bing and other search engines to help them crawl your website better.” – Pavel, Frontend Developer

3. Meta tags should be unique for each URL

The header of your SPA belongs to the static HTML part of your app. It contains default meta tags that don't change when different page content is rendered. However, you want titles and descriptions that clarify to both users and bots what information they’ll find at each of your URLs.

You can address this issue using special utilities available in frontend frameworks and libraries. They’ll help you generate dynamic metadata and push it into your page header. However, there will still be some JavaScript code to execute before the metadata can be crawled and indexed, which makes it unavailable to search engine and social media agents that don’t process JavaScript.
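As an illustration, here is a hypothetical helper of the kind such utilities (e.g. react-helmet for React or vue-meta for Vue) implement, producing per-route head tags:

```javascript
// Hypothetical per-route metadata helper; real utilities mutate the
// live document head instead of returning a string.
function renderMetaTags({ title, description, image }) {
  return [
    `<title>${title}</title>`,
    `<meta name="description" content="${description}">`,
    `<meta property="og:title" content="${title}">`,
    `<meta property="og:description" content="${description}">`,
    `<meta property="og:image" content="${image}">`,
  ].join("\n");
}

console.log(renderMetaTags({
  title: "Blue Widget – Example Store",
  description: "A widget, in blue.",
  image: "https://example.com/img/blue-widget.png",
}));
```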

To make your metadata accessible to all bots, you should send it with the initial HTML when your page loads. The way to do this is to switch from client-side rendering to one of its alternatives.

Can all the single-page app SEO issues be solved at once?

They can. Popular JavaScript frameworks, like React, Angular, or Vue, offer pre-rendering and server-side rendering (SSR) as ways to cope with single-page application SEO challenges. These solutions are comprehensive and proven, but they have some drawbacks.

Both pre-rendering and SSR add complexity to your SPA and require hiring more experienced developers. They also extend your app's tech stack and raise the requirements for your servers. Therefore, before you opt for either option, you should decide whether the expected benefits are worth the extra development and maintenance cost.

In terms of SEO, pre-rendering and SSR allow you to offload rendering your JavaScript from the crawlers and streamline your app content indexation. Let's look closer at how those methods work and learn when to choose either of them.

How pre-rendering improves SEO for single-page apps

Pre-rendering is a simpler and lighter solution. It will work well if you have just a handful of marketing pages to optimize. If that's your case, all you need is to pre-render static HTML files for specific routes of your app at build time. This way you'll keep your marketing assets in the form of a static website that’s easy for bots to consume.

To pre-render your SPA, you’ll need a special plugin or third-party service, such as prerender.io, that is compatible with your JavaScript framework or library. These tools fetch desktop and mobile HTML versions of your pages and save them in a cache. Caching ensures fast response times to users’ and crawlers’ requests.

How dynamic rendering makes single-page apps SEO friendly

Dynamic rendering builds upon the pre-rendering method and is recommended by Google. This approach serves the same content to users and bots in different ways. Your users continue to enjoy client-side rendering, while bots get redirected to a dynamic renderer that serves them pre-rendered HTML.

To set it up, you’ll need your web server to distinguish crawlers from users by checking the user agent. You’ll have a list of agents that should get your static HTML, such as Bingbot, Googlebot, Linkedinbot, etc. Your server will redirect requests from the listed crawlers to your dynamic renderer, which will send them cached HTML pages.
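The user-agent check at the heart of this setup can be sketched like so (the bot list is illustrative and incomplete; services like prerender.io maintain production-grade ones, and the two branch functions are hypothetical):

```javascript
// Illustrative (incomplete) list of crawler user-agent substrings.
const BOT_AGENTS = [
  "googlebot", "bingbot", "duckduckbot", "baiduspider",
  "facebookexternalhit", "twitterbot", "linkedinbot",
];

function isCrawler(userAgent) {
  const ua = (userAgent || "").toLowerCase();
  return BOT_AGENTS.some((bot) => ua.includes(bot));
}

// A server middleware would then branch on the result:
// isCrawler(req.headers["user-agent"])
//   ? sendPrerenderedHtml(req, res)  // cached static HTML for bots
//   : sendSpaShell(req, res);        // normal client-side app for users
```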

Switching between client-side rendered and pre-rendered content will allow your app to take the best of both worlds. You’ll deliver as much content to bots as possible without any harm to the user experience.

The limitation of pre-rendering is that it doesn’t work well for large apps with frequent content updates. It may take too many resources to pre-render and cache each route in your SPA. For such cases, you might want to use server-side rendering instead.

How SSR boosts SEO for single-page applications

Server-side rendering is more difficult to implement. However, if you plan to have an online store or app with countless dynamic pages to be indexed by search engines, choosing SSR will be more than reasonable.

SSR will require you to set up a Node.js server. This technology allows you to create universal (isomorphic) web apps that run JavaScript code both client-side and server-side. Universal app architecture makes your app scalable and ensures fast response times even under high loads.

With SSR, all requests from users and crawlers are sent to the server. The server executes your JavaScript code in the Node.js runtime environment, renders HTML pages in real time, and sends them back in response. This removes the need for browsers and bots to process JavaScript code, making your SPA as SEO-friendly as a static HTML website.
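Conceptually, the server's job per request can be reduced to the sketch below, where `renderView` stands in for your framework's real renderer (e.g. ReactDOMServer.renderToString or Vue's renderToString) and the route data is invented:

```javascript
// Stand-in for a framework renderer: turns a route plus its data into
// the complete HTML a crawler receives, no client-side JS required.
function renderView(route, data) {
  return [
    "<!doctype html>",
    "<html>",
    `  <head><title>${data.title}</title></head>`,
    `  <body><div id="app"><h1>${data.heading}</h1></div></body>`,
    "</html>",
  ].join("\n");
}

const html = renderView("/products/42", {
  title: "Red Widget – Example Store",
  heading: "Red Widget",
});
// The response body already contains the content, so bots can index
// it without executing any JavaScript.
```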

To make the SSR implementation easier you can use out-of-the-box solutions for building isomorphic apps – Nuxt.js for Vue.js framework or Next.js for React.js library. Read more in our posts about how to make Vue and React SEO-friendly.

“SSR brings performance problems, because the server renders a particular route of your SPA instead of just sending an HTML page to a user. Your app will require a more powerful infrastructure to handle the load.” – Pavel, Frontend Developer

Need a JavaScript ninja to build an SEO-friendly SPA?

You’ll find the perfect software developer for your web app on Proxify.io. With senior-level talent in our network, we can match you with the right specialist within two weeks. Don’t miss out on the opportunity to hire JavaScript developers at rates starting from 29€ / h. Send us a talent request and get the work on your project started!
