Mini Technical SEO Audit Checklist

Jun 25th 2019

Digital Marketing Insights

8 min read

Posted by Nicola Russell


All SEO starts with tech SEO. This Mini Technical SEO Audit is designed to help you kick-start your Tech SEO efforts. For many business websites, organic traffic is the bread and butter. If potential customers can’t find you on Google (or other search engines), it’s unlikely that they will ever become customers. This is what makes SEO enormously important. However, there are so many different aspects to SEO that it can be challenging to know where to start.

What Exactly is Technical SEO?

As the name suggests, Technical SEO deals with only the technical aspects of SEO. It’s all about optimising your website so that the search engines can crawl, assess and index the content easily. However, to fully understand the importance of Technical SEO, you need to have some understanding of how search engines work.

Crawling & Indexing: How Search Engines Work

Crawling is where it all begins.

First, automated bots called “crawlers” view pages on your site. These are also sometimes referred to as “spiders,” and their purpose is to find information online. These spiders do exactly what you’re probably picturing: they traverse the web, moving from page to page and taking in the information they come across. The next step in the process is indexing. When a user performs a search, Google doesn’t instantly crawl the web looking for the most relevant results. Instead, it searches its existing database of online content, known as its index.

When a search query is entered, Google will find what it deems to be the most relevant pages, and these will be fetched and displayed to the user in the search engine results pages (SERPs). The job of technical SEO is to make sure that these spiders find the information you want them to in the easiest way possible, so the search engine that’s employing the spider can successfully index your site.

Only then will your web pages rank well in that search engine and be returned as high as possible within the search results.

How to Help the Google Spiders Out with a Technical SEO Audit

Throughout this blog post, we’ll take a look at some simple things that you can do to ensure that the spiders see the content you want them to:

  1. Check for Crawl Errors
  2. Submit an XML Sitemap
  3. Check What Sites are Linking to your Site
  4. Make Sure Your Site is Secure with an SSL Certificate
  5. Check your Robots.txt file
  6. Check Your Page Speed
  7. Check If Your Website is Mobile-Optimised

Make Sure Google can Crawl your Site

One of the easiest ways to find out if your site can be crawled is with Google Search Console. This is an essential tool for any Technical SEO audit. You’ll need a Google account to use Google Search Console, and if you’re already using Google Analytics, you can also link the two accounts to get access to Search Console data in your Google Analytics account.

How to Check for Crawl Errors:

Google Search Console Coverage Report

When you’re logged in, go to the Coverage Report area of Search Console. From here, you can glean a range of insights. Google gives excellent guidance on what each of these statuses means, and how you should go about fixing them. We don’t have room to get into all of them here, but generally speaking, you should focus on:

  • Errors – This displays all potential site errors, including server errors, redirect errors, robots.txt errors and 404s, to name a few. These should be checked at least once a month.
  • Warnings – Here, you can see pages that have been indexed but are blocked by robots.txt. If you want to keep a page out of the index, Google prefers a “noindex” directive over robots.txt (see the example after this list): a page blocked via robots.txt can still show up in the index if another page links to it. These warnings allow you to go through and correctly de-index those pages.
  • Valid Pages – All of these pages are in the index. If you see the “Indexed, not submitted in sitemap” status, you should make sure you add those URLs to your sitemap. “Indexed; consider marking as canonical” means that the page has duplicate URLs, and you should mark it as canonical.
  • Excluded Pages – These are pages that have been kept out of the index by a “noindex” directive, a page removal tool, robots.txt, a crawl anomaly, by virtue of being duplicate content, etc.
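To illustrate the “noindex” directive mentioned above: the standard approach for a page you want kept out of the index is a robots meta tag in the page’s <head>, like this:

  <meta name="robots" content="noindex">

Unlike a robots.txt block, this tells Google explicitly not to index the page, even when other pages link to it. (For non-HTML files, the equivalent is an X-Robots-Tag: noindex HTTP header.)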

Create & Submit an XML Sitemap

First, make sure that you have an XML sitemap for your website and that it has been submitted to Google Search Console. An XML sitemap is a clearly-defined structure of how your website content is arranged. It tells Google where all your webpages are so that the spiders can crawl and index them. Here’s what an XML sitemap tells search engines:

  • URLs in use
  • Categories (if applicable)
  • Date of publication
  • Date of modification

By having this information about every URL on your website, Google knows exactly where to send its spiders. This reduces the work crawlers have to do to find your content, helping improve your pages’ rankings in the SERPs.
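To make that concrete, here’s a minimal sketch of a sitemap entry following the standard sitemaps.org protocol (the URL and date are placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/sample-page/</loc>
      <lastmod>2019-06-25</lastmod>
    </url>
  </urlset>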

Here’s How to Check your XML sitemap:

Check your XML Sitemap

If you don’t have a sitemap, you can create one easily in WordPress by using Yoast SEO. You can also use an online sitemap creation tool like XML Sitemaps. Then use the Submit button to add the sitemap URL to Google Search Console. Once you submit the sitemap, Google should crawl it automatically, but if you ever make significant changes to your site, it’s worth resubmitting.  
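If you’d like a quick, scripted sanity check that your sitemap is live and parseable, a short Python sketch along these lines will do it (requests is a third-party library, and the sitemap URL is a placeholder):

  import requests
  import xml.etree.ElementTree as ET

  # Placeholder URL – swap in your own sitemap location.
  SITEMAP_URL = "https://www.example.com/sitemap.xml"

  response = requests.get(SITEMAP_URL, timeout=10)
  root = ET.fromstring(response.content)

  # The namespace used by standard sitemaps.org XML sitemaps.
  ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
  urls = [loc.text for loc in root.findall(".//sm:loc", ns)]

  print(f"Sitemap lists {len(urls)} URLs")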

Check Who’s Linking to Your Site

Backlinks are critical for SEO success – and therefore they are central to any Technical SEO Audit. Links have played a large part in Google’s ranking algorithm for a long time. It’s reported that Google has over 200 ranking factors, and inbound backlinks are still believed to be one of the most important. However, we find many business owners do not know how to check the links to their website – or even that it’s possible.

When your site has a high number of quality links from good quality websites, search engines like Google are more likely to deem your site content relevant to users – and therefore display your site high in the SERPs. Remember, links from low-quality sites can equally hurt your SEO efforts. That’s why it’s essential to find those links as part of your off-page SEO efforts. This is easy to do with Moz.

How to Check Your Inbound Links:

Check Inbound Links with Moz

 

When you’re in the project dashboard for your site, click on Links > Inbound Links in the drop-down menu. This report shows you your links and anchor text, and tells you which of your inbound links are fine and which are classed as spam.

Check the SSL Certificate

Having an SSL-enabled website is no longer a luxury.

Thanks to the search algorithm update Google introduced in 2014, HTTPS is a confirmed ranking signal, giving a secure site the edge over a comparable non-secure website. While you are at it, also make sure that all non-HTTPS versions of your site redirect to a single HTTPS version.
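A quick way to sanity-check those redirects yourself is a short script. The Python sketch below (using the third-party requests library, with example.com as a placeholder domain) fetches each common variant of a homepage URL and reports where it ends up:

  import requests

  # Placeholder domain – swap in your own site.
  VARIANTS = [
      "http://example.com/",
      "http://www.example.com/",
      "https://example.com/",
      "https://www.example.com/",
  ]

  for url in VARIANTS:
      # Follow redirects and report the final landing URL and status.
      response = requests.get(url, allow_redirects=True, timeout=10)
      print(f"{url} -> {response.url} (status {response.status_code})")

Every variant should end up at the same single HTTPS URL.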

Here’s How to Check your SSL Certificate:

SSL Certificate Check

You can quickly check the status of your domain’s SSL certificate in the address bar, by clicking the HTTPS protocol or the secure padlock (depending on the browser you’re using). Or use a free tool like SSL-Checker.

Check Your Robots.txt File

Robots.txt is a simple text file that tells search engine bots and spiders which pages to crawl and which to leave out.

It gives webmasters more control over the appearance of their website in search results by disallowing the crawling of non-presentable pages like archives, category pages and user pages. A single robots.txt file can block crawlers from thousands of URLs with just a few rules. This, however, shouldn’t mean that only large websites need one.

As far as the ease of being indexed goes, we recommend that every website – small or large – has a well-maintained robots.txt file.
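For illustration, a minimal robots.txt along the lines described above might look like this (the paths and sitemap URL are placeholders – yours will depend on how your site is structured):

  User-agent: *
  Disallow: /wp-admin/
  Disallow: /category/
  Disallow: /author/

  Sitemap: https://www.example.com/sitemap.xml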

Here’s How To Check Your Robots.txt:

Test robots.txt with Search Console

You can check the robots.txt file in your Search Console Dashboard – remember, as of June 2019, this is only available in the old Search Console dashboard. Skim through the Disallow rules to see if any essential pages are accidentally blocked (trust me, this can happen!). Standard robots.txt files are located directly in the root folder (https://your-domain.com/robots.txt), which means you can quickly check how your competitors are using their robots.txt files. Just type in www.EXAMPLE.com/robots.txt to see.
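You can also run the same check programmatically. Python’s standard library includes a robots.txt parser, so a quick sketch like this (with placeholder URLs) will tell you whether a given page is blocked for crawlers:

  from urllib import robotparser

  # Placeholder URLs – swap in your own domain and pages.
  rp = robotparser.RobotFileParser("https://www.example.com/robots.txt")
  rp.read()

  # Check whether any crawler ("*") is allowed to fetch key pages.
  for page in ["https://www.example.com/", "https://www.example.com/blog/"]:
      print(page, "crawlable:", rp.can_fetch("*", page))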

Check Your Page Speed

People are impatient. Google knows this. This is why Google loves fast websites. Page load times are increasingly important to maintaining a good user experience, and your customers don’t want to wait around.

Here’s How to Check if your Site is Quick Enough:

For this, the best and easiest tool you could use is Google’s PageSpeed Insights.

The best place to start is your website’s home page. Just enter the home page URL in the PageSpeed Insights tool (don’t forget to type in the correct HTTP/HTTPS version of the URL), and Google will tell you whether your site is up to the mark or not.

Google PageSpeed Insights  

But what do you do with all this new information? Well, if your site speed is lacking, don’t panic! Google provides optimisation recommendations for you to implement to improve your speed. For most businesses, it is helpful to sit down with your web developer to discuss any changes that Google recommends.

Google PageSpeed Insights Results
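If you’d rather script this check, PageSpeed Insights also offers a public API. Here’s a minimal Python sketch (assuming the v5 endpoint and using a placeholder URL):

  import requests

  API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
  params = {"url": "https://www.example.com/", "strategy": "mobile"}

  data = requests.get(API, params=params, timeout=60).json()

  # Lighthouse reports performance as a score between 0 and 1.
  score = data["lighthouseResult"]["categories"]["performance"]["score"]
  print(f"Mobile performance score: {score * 100:.0f}/100")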

Check If your Website Is Mobile-Optimised

SEO is an ever-evolving process, and things are always changing. Each search engine has its own algorithm, and these algorithms get improved, updated or altered frequently (sometimes without you even knowing). As the rules change, you need to be able to adapt your SEO tactics. The mobile-friendliness factor is an excellent example of this. Just a few years ago, nearly all websites were designed to give the best user experience on desktops – any traffic from mobiles was just a bonus. However, things have changed A LOT.

Almost everyone owns a smartphone, and they all expect a seamless mobile user experience.

Here’s How to Check if Google Thinks Your Site is Mobile Friendly:

Google, once again, is your best friend here. Use the Google Mobile-Friendly Test to get an instant rendering of your website.

Google Mobile Friendly Test
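For a scripted version, Google also exposes the Mobile-Friendly Test through its Search Console API. A rough Python sketch, assuming you have an API key from the Google Cloud console (the key and URL below are placeholders):

  import requests

  API = "https://searchconsole.googleapis.com/v1/urlTestingTools/mobileFriendlyTest:run"
  params = {"key": "YOUR_API_KEY"}  # placeholder API key
  body = {"url": "https://www.example.com/"}

  data = requests.post(API, params=params, json=body, timeout=60).json()

  # Expected verdicts include MOBILE_FRIENDLY and NOT_MOBILE_FRIENDLY.
  print(data.get("mobileFriendliness"))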

 

Never Ignore Any Technical SEO Red Flags

Hopefully, this has given you a brief indication of what Technical SEO entails, but it’s far from an exhaustive list of everything you can and should be considering for your site. It’s probably fair to say that it barely scratches the surface… However, if this mini technical SEO audit reveals any warning signs or red flags, it may be time to let an expert take over.

Click here to drop us a line! Have you found any technical SEO errors on your website? Let us know about them by commenting below.




Posted by Nicola Russell

Nicola is the Digital Team Lead at GlowMetrics, preparing and executing PPC & SEO strategies for new and existing clients. Nicola specialises in integrated marketing campaigns, combining the best of paid social and paid search to get the finest results for her clients.
