How to improve SEO performance in a Create-React-App

I’m currently developing an app where I’ve had to delve into the world of SEO (Search Engine Optimization) performance. React apps are at an inherent disadvantage when it comes to SEO because the content is rendered by JavaScript and isn’t always present when a crawler first loads the page. This can be circumvented with SSG/SSR frameworks such as Next.js.

In my particular case, I built this app with Create-React-App, so I ended up going a different route to improve SEO performance.

To give a brief overview of SEO: search engines rely on programs called crawlers. They scour the internet, visiting and analyzing web pages to decide whether and how those pages should be indexed and positioned in search engines such as Google, Bing, etc.

Several aspects have to be taken into consideration, such as, but not limited to: the actual page content, which other pages (and how credible they are) link to yours, page performance, crawler configuration via robots.txt, metadata, image optimization, and more!

Disclaimer: this is not an end-all-be-all, just a list of tips that have helped me.

Hosting Configuration

One of the first important steps is prerendering the content for the crawlers to correctly analyze the site.

Netlify, as one example, offers a suite of plugins and settings to streamline and personalize the deployment and web hosting process. For the purposes of the guide, I’ll focus on one particular setting that can be found in Site Settings > Build & Deploy.


The prerendering option is precisely what helped me! I would recommend looking into similar options with other web hosting solutions that can prerender the site.
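Under the hood, prerendering services typically work by inspecting the User-Agent header of each request: known crawlers get a pre-rendered HTML snapshot, while regular visitors get the normal JavaScript bundle. A minimal sketch of that detection logic (the pattern list and function name are illustrative, not any particular service’s implementation):

```javascript
// Common crawler user-agent substrings (illustrative, not exhaustive).
const CRAWLER_PATTERNS = [
  'googlebot',
  'bingbot',
  'yandex',
  'duckduckbot',
  'facebookexternalhit',
  'twitterbot',
];

// Returns true when the request likely comes from a search engine
// or social media crawler rather than a regular browser.
function isCrawler(userAgent) {
  if (!userAgent) return false;
  const ua = userAgent.toLowerCase();
  return CRAWLER_PATTERNS.some((pattern) => ua.includes(pattern));
}

// A prerendering proxy would branch on this check, e.g.:
// isCrawler(req.headers['user-agent'])
//   ? serve the pre-rendered HTML snapshot
//   : serve the normal JavaScript app
```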

Robots.txt

Web crawlers typically review a file called robots.txt. It suggests to crawlers what should or should not be crawled, and which specific crawler bots those rules apply to. Create-React-App automatically generates this file in the public folder.

Example:

# https://www.robotstxt.org/robotstxt.html
User-agent: *
Disallow:

More info on robots.txt.
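The generated default allows everything. If parts of the app should stay out of search results, the same file can disallow specific paths, and it can also point crawlers at the sitemap (the /admin/ path below is a hypothetical example):

```
# Block all crawlers from a hypothetical admin section
User-agent: *
Disallow: /admin/

# Point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```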

Integration with Search Engines

At first, I felt very overwhelmed when attempting to understand how my page would be positioned in Google or Bing. Fortunately, these very same search engines offer ways to monitor your progress!

Google offers the Google Search Console

Microsoft offers the Bing Webmaster Tools for Bing

Yandex offers Yandex Webmaster

I strongly suggest submitting the webpage to each of these tools. Every one of them offers a variety of reports and tips to see how the page is performing and whether there are any SEO problems on the page itself.

Keep in mind that verification of page ownership is necessary.
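Each console usually offers several verification methods: uploading an HTML file, adding a DNS record, or placing a meta tag in the page’s head. The meta tag route is the quickest for a Create-React-App project, since public/index.html can be edited directly (the content values below are placeholders for the tokens each console generates):

```html
<!-- In public/index.html, inside <head>. Each console generates
     its own token for your property (placeholders shown here). -->
<meta name="google-site-verification" content="YOUR_TOKEN_HERE" />
<meta name="msvalidate.01" content="YOUR_TOKEN_HERE" />
<meta name="yandex-verification" content="YOUR_TOKEN_HERE" />
```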

Sitemap.xml

In every one of the tools above, one of the primary ways to improve SEO and page navigation is to submit a sitemap. For SPAs this typically isn’t strictly necessary given the small number of routes, but it doesn’t hurt. Google has a short article on how to generate one.

Example:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
  <loc>https://www.MyApp.com/</loc>
  <lastmod>2021-08-04</lastmod>
</url>
</urlset>
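For a SPA with only a handful of routes, the sitemap can even be generated by a short script rather than a third-party tool. A minimal sketch (the domain and route list are placeholders for your own app):

```javascript
// Builds a sitemap.xml string from a list of routes.
// The domain and routes below are placeholders.
const DOMAIN = 'https://www.example.com';
const ROUTES = ['/', '/about', '/contact'];

function buildSitemap(domain, routes, lastmod) {
  const urls = routes
    .map(
      (route) =>
        `<url>\n  <loc>${domain}${route}</loc>\n  <lastmod>${lastmod}</lastmod>\n</url>`
    )
    .join('\n');
  return (
    '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    urls +
    '\n</urlset>'
  );
}

const sitemap = buildSitemap(DOMAIN, ROUTES, '2021-08-04');
// Write the result into the CRA public folder so it deploys with the app:
// require('fs').writeFileSync('public/sitemap.xml', sitemap);
```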

Page Performance

The whole world of page performance is beyond the scope of this article, but one quick way to improve is to open up the developer tools and select Lighthouse. This is a tool offered by Google that analyzes various page properties on mobile and desktop: performance, Progressive Web App practices, best practices, accessibility, and SEO.

This is completely free and gives great insight into what could be quickly improved!

SEO

Title and Metadata

There are various HTML tags that can help crawlers understand the site and its content, improving SEO performance and indexing.

  • HTML Language: Indicates the primary language of the webpage
  • Meta Viewport: Gives the browser instructions on how to control the page’s dimensions and scaling
  • Meta Description: Provides a brief summary of a web page. It should typically be around 160 characters. Include important keywords for crawlers to understand the site!
  • Link Canonical: If the site is a single page that can be reached from multiple URLs, this tag can be set to consolidate them.
  • Title: Shown in the browser’s tab and a primary way for search engines to identify the page; it should typically stay within 50–60 characters
  • Optional: Open Graph — Various social media sites such as Twitter and Facebook use the Open Graph protocol to understand your site and control how it is displayed when linked to.

Example:

<head>
  <meta charset="utf-8" />
  <link rel="icon" href="%PUBLIC_URL%/favicon.ico" />
  <meta name="viewport" content="width=device-width, initial-scale=1" />
  <meta name="theme-color" content="#000000" />
  <meta name="description"
    content="Description" />
  <meta property="og:title" content="" />
  <meta property="og:type" content="" />
  <meta property="og:url" content="" />
  <meta property="og:image" content="" />
  <meta property="og:locale" content="en_US" />
  <meta property="og:description"
    content="" />
  <link rel="canonical" href="" />
  <link rel="apple-touch-icon" href="%PUBLIC_URL%/logo192.png" />
  <link rel="manifest" href="%PUBLIC_URL%/manifest.json" />
  <title>My Site Title - What it does</title>
</head>

I’ve included the auto-generated create-react-app tags for the favicon and various images.
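Since search results typically truncate descriptions at roughly 160 characters, it can help to trim programmatically when the description comes from dynamic content. A small helper sketch (the limit is a guideline, not a hard rule, and the function name is my own):

```javascript
// Trims a meta description to a character budget, cutting at the
// last word boundary before the limit and appending an ellipsis.
// 160 characters is a common guideline for search snippets.
function truncateDescription(text, maxLength = 160) {
  if (text.length <= maxLength) return text;
  const cut = text.slice(0, maxLength - 1);
  const lastSpace = cut.lastIndexOf(' ');
  return (lastSpace > 0 ? cut.slice(0, lastSpace) : cut) + '…';
}

// Example: feed the result into the description meta tag's content.
```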

Content

I’ve saved the best for last. The single most important factor for SEO performance is the actual page content! Great page content that is relevant and provides value is essential. If the page provides value, it is far more likely to be shared. This sharing creates backlinks, which crawlers treat as essentially an upvote for your webpage, greatly boosting its SEO performance.

If you have any more tips or details on what I covered, share them in the comments below.

Originally published at relatablecode.com on August 13, 2021.