What is an SEO Tech Audit, and Why Do I Need One?

Patrick Coyne, Director of Organic & Local Search

With content and links hogging all the attention, technical work is the red-headed stepchild of the SEO world. Sure, creating great content and earning powerful, contextual backlinks are essential to the Search Engine Optimization of your website, but a technically deficient website can render all of those other tactics ineffective.

It also doesn’t help that technical SEO is the least understood aspect of SEO. So, let’s start with the basics:

What is Technical SEO?

Technical SEO focuses on meeting Search Engine guidelines and requirements for crawling, indexation, and user experience. This is the SEO work that falls outside of on-page content and links. A technically sound website enables Search Engines to crawl your content and contextual backlinks efficiently and rank your webpages accordingly, improving your visibility in the SERPs.

So okay, technical SEO is clearly important. But you might still be scratching your head as to how to get started. This is where a technical audit comes into play.

What is an SEO Technical Audit?

An SEO technical audit is an in-depth assessment of every aspect of a website that may be affecting Search Engine indexation and visibility. The ultimate goal of the audit is to ensure that the site meets the requirements for indexation.

Why Do I Need an SEO Tech Audit?

Technical SEO is the foundation on which all of your on-site and off-site SEO work is built. Because what is the use of having terrific content if you’ve accidentally blocked Search Engines from crawling and indexing it?

Performing an SEO technical audit lets you give the website a checkup and explore the ways in which it may be underperforming due to an inefficient technical setup.

SEO Audit Checklist

Below is a simplified version of an SEO audit checklist. Every website is different, and there are certainly more items that could be included, but this blueprint provides a good sense of where to begin.

Infrastructure

  • Perform a Site Scrape: A site scrape tool, like the Screaming Frog SEO Spider Tool, will crawl the entire website and generate a list of all site URLs, images, CSS templates, etc. Understanding the full depth of the website is the first step in identifying and fixing technical SEO issues. Many of these tools can also generate a sitemap for you if need be.
  • Check for Errors and Redirect Chains: The site scrape will show you any 4XX/5XX errors and redirect chains happening on the site. A 4XX error is a response code that occurs when a server cannot find the requested webpage. A 5XX is a similar response code that indicates something has gone wrong at the server level. A redirect chain is a series of redirects that hop from one page to another to another; if a page needs to be redirected, it should ideally take only one hop (a minimal check for both is sketched after this list). Too many 404s and Redirect Chains can drain your Crawl Budget, the number of URLs Google can and wants to crawl on your site. Because your Crawl Budget is limited, you'll want to make sure Google crawls your most important pages first.
  • Check URL Structure: A properly structured website makes it easy for Search Engines to crawl the site and helps them understand a page’s context within the larger website, making it easier to identify the most important content.
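
Since the first two items above boil down to fetching URLs and watching what comes back, here is a minimal sketch of that check, assuming Python with the third-party requests library; the example.com URLs are hypothetical placeholders, and a dedicated crawler like Screaming Frog remains the practical choice for a full site.

```python
# Minimal status-code and redirect-chain check (a sketch, not a replacement
# for a full crawler). Assumes the third-party "requests" library; the URL
# list below is a hypothetical placeholder.
import requests

urls_to_check = [
    "https://www.example.com/",
    "https://www.example.com/old-page/",
    "https://www.example.com/missing-page/",
]

for url in urls_to_check:
    try:
        response = requests.get(url, allow_redirects=True, timeout=10)
    except requests.RequestException as exc:
        print(f"{url} -> request failed: {exc}")
        continue

    # response.history holds each redirect hop; more than one hop is a chain.
    hops = [r.url for r in response.history]
    if len(hops) > 1:
        print(f"{url} -> redirect chain ({len(hops)} hops): {' -> '.join(hops)} -> {response.url}")
    elif hops:
        print(f"{url} -> single redirect to {response.url}")

    # Flag client (4XX) and server (5XX) errors on the final URL.
    if response.status_code >= 400:
        print(f"{url} -> final status {response.status_code}")
```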

Sitemap

  • Ensure That Your Sitemap is Up to Date: An XML sitemap is a feed of the website content that you would like Search Engines to index. If you don’t have a dynamically generated sitemap, it may be missing the URLs of newer pages. Make sure that the sitemap matches the current state of the site, and that it is free of errors and old URLs (a quick sitemap check is sketched after this list).
  • Submit your Sitemap to Google Search Console and Bing Webmaster Tools: Submitting your sitemap to the major Search Engines makes it easier for them to do their job, and easier for you to get indexed quickly.
  • Examine (or Create) a Robots.txt File: The Robots.txt file tells Search Engines which sections of the website should not be crawled. Websites frequently use this file to keep things like Admin pages or Conversion Goal Success pages out of Search Engines’ reach. If a Robots.txt file exists, make sure it is not accidentally blocking something that should be visible, and vice versa (a Robots.txt check is sketched after this list).
  • Check Indexation: Google Search Console or Bing Webmaster Tools will tell you how many of your pages are indexed, and there are also third-party tools that can provide this information. You want the number of indexed pages to be close to the total number of webpages on the site, excluding intentionally blocked pages. If some pages aren’t showing up in Google’s or Bing’s index, figure out why.
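
To sanity-check a sitemap against the live site, a small script can pull every loc entry and confirm it still resolves. This is a minimal sketch, assuming Python with the third-party requests library and the standard sitemaps.org XML layout; the sitemap URL is a hypothetical placeholder, and it does not handle sitemap index files.

```python
# Fetch an XML sitemap, list its URLs, and flag entries that no longer
# return a 200. A sketch; the sitemap URL is a hypothetical placeholder.
import xml.etree.ElementTree as ET

import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NAMESPACE = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

sitemap_xml = requests.get(SITEMAP_URL, timeout=10).text
root = ET.fromstring(sitemap_xml)

# <urlset><url><loc>...</loc></url>...</urlset> is the standard layout.
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NAMESPACE)]
print(f"Sitemap lists {len(urls)} URLs")

for url in urls:
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    if status != 200:
        print(f"Stale or broken sitemap entry: {url} (status {status})")
```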
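
And to confirm that Robots.txt isn’t blocking pages you care about, Python’s standard library ships a parser you can point at the file. A minimal sketch; the domain and the list of “must be crawlable” URLs are hypothetical placeholders.

```python
# Check whether important URLs are blocked by Robots.txt, using the
# standard-library parser. Domain and URL list are hypothetical placeholders.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

must_be_crawlable = [
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/blog/",
]

for url in must_be_crawlable:
    if not parser.can_fetch("*", url):
        print(f"Blocked by Robots.txt but should be crawlable: {url}")
```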

On-Site Links

  • Check Internal Links: You never want to send users or Search Engine spiders to a broken webpage. If you’re linking to another page on your site, make sure the URL is correct; the last thing you want is for a user to hit a dead end and leave your site (a link check covering both internal and external links is sketched after this list).
  • Check External Links: Same deal, but with links pointing to other websites.
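
As a rough way to test the links on a given page, the sketch below pulls every anchor href, resolves relative paths, and reports anything that doesn’t come back healthy, split into internal and external links. It assumes Python with the third-party requests library and the standard-library HTML parser; the page URL is a hypothetical placeholder.

```python
# Extract every link from one page and report broken ones, separated into
# internal and external. A sketch; the page URL is a hypothetical placeholder.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

import requests

PAGE_URL = "https://www.example.com/services/"


class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and not href.startswith(("#", "mailto:", "tel:")):
                self.links.append(urljoin(PAGE_URL, href))


collector = LinkCollector()
collector.feed(requests.get(PAGE_URL, timeout=10).text)

site_host = urlparse(PAGE_URL).netloc
for link in sorted(set(collector.links)):
    kind = "internal" if urlparse(link).netloc == site_host else "external"
    status = requests.head(link, allow_redirects=True, timeout=10).status_code
    if status >= 400:
        print(f"Broken {kind} link: {link} (status {status})")
```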

Content

  • Check for Orphan Pages: Orphan pages are webpages on your site that are not linked to from anywhere else on your website, which makes them difficult for Search Engines to discover. A site scrape tool won’t find these pages either, since it works the same way a Search Engine spider does – by bouncing from page to page via internal links. To identify orphan pages, generate a list of URLs from your CMS and compare it against the list of crawled or indexed pages in Google Search Console or Bing Webmaster Tools (a simple comparison is sketched after this list).
  • Add Canonical Links: The rel=canonical tag is an HTML element that identifies the preferred version of a webpage. This is useful when multiple versions of a webpage exist and you want Search Engines to know which one should be indexed, which helps you avoid duplicate content issues (a canonical spot-check is sketched after this list).
  • Manage Duplicate Content: Duplicate content can waste your Crawl Budget. This includes both on-page content and meta content such as Title Tags and Meta Descriptions (a duplicate-title check is sketched after this list). If the pages have SEO value, make sure each one has unique content. If not, then you should…
  • Restrict Access to Non-SEO Value Pages: It is best to restrict access to pages that don’t need to be indexed. This can be done in a few different ways, including a noindex meta tag, a Robots.txt disallow rule, or the removal tools in Google Search Console.
  • Page Speed: Page Speed is an SEO Ranking factor and is also extremely important from a usability standpoint. If your page takes too long to load, users will quickly move on to a different website. Get started with Google’s PageSpeed Insights tool to see how fast your site is: https://developers.google.com/speed/pagespeed/insights/ (an API-based sketch follows this list).
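
For the orphan-page comparison mentioned above, a simple set difference does the job once you have both lists exported. A minimal sketch in Python; the file names are hypothetical placeholders for your own CMS export and crawl export.

```python
# Compare a CMS URL export against a crawl export; anything the CMS knows
# about but the crawler never reached is a candidate orphan page.
# File names are hypothetical placeholders for your own exports.
def load_urls(path):
    with open(path, encoding="utf-8") as handle:
        return {line.strip().rstrip("/") for line in handle if line.strip()}

cms_urls = load_urls("cms_export.txt")        # every page the CMS knows about
crawled_urls = load_urls("crawl_export.txt")  # every page the crawler found

for url in sorted(cms_urls - crawled_urls):
    print(f"Possible orphan page: {url}")
```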
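
To spot-check canonicals, a script can read each page’s rel=canonical link element and compare it to the URL that was requested. A minimal sketch, assuming Python with the third-party requests library; the URL list is a hypothetical placeholder, and the regular expression is a simplification of full HTML parsing.

```python
# Report pages whose canonical tag is missing or points somewhere unexpected.
# A sketch; the URL list is a hypothetical placeholder.
import re

import requests

pages = [
    "https://www.example.com/services/",
    "https://www.example.com/services/?utm_source=newsletter",
]

# Matches tags like <link rel="canonical" href="https://www.example.com/services/">
CANONICAL_RE = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

for page in pages:
    html = requests.get(page, timeout=10).text
    match = CANONICAL_RE.search(html)
    if not match:
        print(f"No canonical tag found: {page}")
    elif match.group(1).rstrip("/") != page.split("?")[0].rstrip("/"):
        print(f"{page} canonicalizes to {match.group(1)}")
```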
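
Duplicate Title Tags are easy to surface once you have a crawl list. A minimal sketch, again assuming Python with requests; the URL list is a hypothetical placeholder, and in practice you would feed it your full crawl export.

```python
# Group crawled pages by Title Tag to surface duplicates. A sketch; the URL
# list is a hypothetical placeholder for a full crawl export.
import re
from collections import defaultdict

import requests

pages = [
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/about/",
]

TITLE_RE = re.compile(r"<title[^>]*>(.*?)</title>", re.IGNORECASE | re.DOTALL)

titles = defaultdict(list)
for page in pages:
    match = TITLE_RE.search(requests.get(page, timeout=10).text)
    title = match.group(1).strip() if match else "(missing title)"
    titles[title].append(page)

for title, urls in titles.items():
    if len(urls) > 1:
        print(f"Duplicate title '{title}' on {len(urls)} pages: {urls}")
```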
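
If you want Page Speed numbers for many pages rather than one at a time in the browser, PageSpeed Insights also has a web API. The sketch below assumes the v5 runPagespeed endpoint and the lighthouseResult response fields; treat the exact endpoint and field names as assumptions and check the current API documentation, and note that an API key (omitted here) is recommended for regular use.

```python
# Query the PageSpeed Insights API for a performance score. A sketch: the
# endpoint and response fields are assumptions based on the v5 API, and the
# page URL is a hypothetical placeholder.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

response = requests.get(
    PSI_ENDPOINT,
    params={"url": "https://www.example.com/", "strategy": "mobile"},
    timeout=60,
)
data = response.json()

# Lighthouse reports performance as a 0-1 score; multiply for a 0-100 value.
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}")
```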

 

Got questions about conducting an SEO technical audit? Contact us now!