If you own a small or medium-sized business, you know that organic traffic from Google is a valuable source of customers.
Most people realise that great content and high-quality backlinks will boost their organic rankings, but few pay attention to the technology the site is built on and how important it is.
A great quote is: “You wouldn’t build a million-dollar house on quicksand, so don’t build a million-dollar website on poor foundations.”
Fixing the foundations is a great way of maximising your other efforts.
In this article, I will show you some of the basic errors to look for when auditing your site and, more importantly, how to fix them.
The first stage of any audit is to crawl your website. This can be done manually but takes up a lot of valuable time, so use a third-party crawling tool to do most of the work for you.
A crawling tool saves both time and money. Compared with hiring a specialist technical audit company, it will find perhaps 80% of the issues, making it a great middle-ground option.
A good choice is something like Screaming Frog, which costs around US$179 per year.
With this tool and many others, you simply input your domain name into the field and off it goes to crawl your site.
The downside is that there is no visual reporting when you need to check the data, although you can find free dashboard templates for Screaming Frog if the raw data overwhelms you.
So, now you have crawled the site, let’s take a look at some of the issues you need to be aware of.
Rather than looking at error codes individually, I have grouped them into categories: 5xx means any response code starting with a 5, 4xx any code starting with a 4, and so on.
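The grouping above can be sketched in a few lines of Python; the helper name and the sample status codes are hypothetical, not from a real crawl:

```python
# Hypothetical helper: group raw HTTP status codes into the 2xx/3xx/4xx/5xx
# buckets used throughout this article.
from collections import Counter

def status_class(code: int) -> str:
    """Return the response-code category, e.g. 404 -> '4xx'."""
    return f"{code // 100}xx"

def summarise_crawl(status_codes):
    """Count how many crawled URLs fall into each category."""
    return Counter(status_class(code) for code in status_codes)

# Example: a handful of status codes from an imaginary crawl export.
codes = [200, 200, 301, 404, 500, 200, 302, 410]
print(dict(summarise_crawl(codes)))  # {'2xx': 3, '3xx': 2, '4xx': 2, '5xx': 1}
```

A crawler like Screaming Frog does this bucketing for you; the point of the sketch is just to make the categories concrete.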
5xx server errors
There really shouldn’t be any 5xx responses. If crawling your website causes 5xx errors, your server is struggling to cope, and Google has confirmed in the past that it will crawl less frequently if it keeps receiving 5xx responses.
How to fix:
There can be several reasons for 5xx errors, but the main one is usually a poorly configured server. Either improve your server set-up or move to better hosting.
4xx errors
A 4xx error means a visitor has landed on a dead page, which is bad for both Google and potential users, who should always land on live pages. Dead pages also drive up your bounce rate.
How to fix:
When you have found the broken links, you have two choices: remove the link entirely or, if a suitable resource exists on the site, point the link there instead.
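Triaging broken links can be sketched as a simple filter over crawl data. The crawl export format (a URL-to-status mapping) and the example URLs below are hypothetical:

```python
# A minimal sketch of finding broken links from crawl data.
def find_broken_links(links_on_page, status_by_url):
    """Return the links on a page that resolve to a 4xx status.

    Unknown URLs are treated as 404s, since the crawler never saw them."""
    return [url for url in links_on_page
            if 400 <= status_by_url.get(url, 404) < 500]

# Imaginary crawl export: URL -> HTTP status code.
status_by_url = {
    "https://example.com/":        200,
    "https://example.com/old":     404,
    "https://example.com/gone":    410,
    "https://example.com/contact": 200,
}
links = list(status_by_url)
print(find_broken_links(links, status_by_url))
# ['https://example.com/old', 'https://example.com/gone']
```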
Redirect chains
Redirects are slightly better for the user but still not great for Google. Instead of landing on a dead page, the user eventually reaches the correct page, but only after passing through a redirect chain. This slows down the page load and, more importantly, wastes Google’s crawl budget.
How to fix:
This one is super simple! You already know the final destination because a redirect is in place, so simply edit the link to point directly at it.
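Flattening a chain to its final destination can be sketched like this; the redirect map and the URLs are invented for illustration:

```python
# A sketch of resolving redirect chains from crawl data: given a map of
# source URL -> redirect target, find the final destination so internal
# links can point straight at it.
def final_destination(url, redirects, max_hops=10):
    """Follow redirects until a non-redirecting URL (or a loop) is reached."""
    seen = set()
    while url in redirects and url not in seen and len(seen) < max_hops:
        seen.add(url)
        url = redirects[url]
    return url

redirects = {
    "/old-page": "/new-page",
    "/new-page": "/final-page",   # a two-hop chain
}
print(final_destination("/old-page", redirects))  # /final-page
```

The loop guard matters: crawls occasionally turn up redirect loops, and you want a reporting script to finish rather than spin forever.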
Page speed
Google has confirmed that page speed is a ranking factor for mobile and an important factor for desktop.
Most people check the home page and assume the rest of the site is the same; instead, you should check each page individually. This used to be quite a manual process, but you can now connect the Google PageSpeed Insights API to Screaming Frog so that every page is analysed as the site is crawled.
How to fix:
This can be fairly difficult depending on your knowledge and the platform you use, but for slow pages you’ll want to apply the fixes Google recommends.
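If you want to query the PageSpeed Insights API yourself rather than through Screaming Frog, the request is a single URL. The v5 endpoint below is real, but the API key and page URL are placeholders, and the fetch itself is commented out so this sketch stays offline:

```python
# A sketch of building a PageSpeed Insights API request for one page.
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, api_key, strategy="mobile"):
    """Build the PageSpeed Insights request URL for a single page."""
    query = urlencode({"url": page_url, "strategy": strategy, "key": api_key})
    return f"{PSI_ENDPOINT}?{query}"

url = psi_request_url("https://example.com/", "YOUR_API_KEY")
print(url)

# To actually fetch the report (requires network and a valid key):
# import json, urllib.request
# report = json.load(urllib.request.urlopen(url))
# print(report["lighthouseResult"]["categories"]["performance"]["score"])
```

Run this over every URL in your crawl export and you have a page-by-page speed report instead of a single home-page score.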
Large images
Large images are usually one of the main factors slowing a page down, and they are easy to spot using tools like Screaming Frog.
How to fix:
This fix requires a bit of manual legwork but is worth it in the long run. Simply compress the images and resize them to the correct dimensions. The good news is that free tools like compressjpeg.com will do the compressing for you.
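Spotting the worst offenders first makes the legwork manageable. Here is a sketch of ranking heavy images from a crawl export; the (URL, size) pairs are made up, and 100 KB is an arbitrary threshold chosen to illustrate the idea:

```python
# A sketch of flagging heavy images from crawl data, largest first.
def oversized_images(images, max_bytes=100_000):
    """Return (url, size_in_bytes) pairs over the threshold, largest first."""
    heavy = [(url, size) for url, size in images if size > max_bytes]
    return sorted(heavy, key=lambda pair: pair[1], reverse=True)

# Imaginary image inventory from a crawl.
images = [
    ("/img/hero.jpg",      824_000),
    ("/img/logo.png",       12_000),
    ("/img/team-photo.jpg", 310_000),
]
print(oversized_images(images))
# [('/img/hero.jpg', 824000), ('/img/team-photo.jpg', 310000)]
```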
Low word count
In 2011, Google released the Panda update, an algorithm change targeting low-quality and spun pages.
While a low word count isn’t always a problem, too many pages with very little content can be a bad sign. Again, Screaming Frog calculates this information for you.
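Screaming Frog surfaces word counts for you, but the measurement itself is simple enough to sketch with the standard library: strip the HTML tags and count whitespace-separated words. The sample page below is invented, and note this rough version also counts any inline script or style text:

```python
# A rough sketch of per-page word counting using the stdlib HTML parser.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the text nodes of a page, discarding the tags."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def word_count(html: str) -> int:
    """Count whitespace-separated words in a page's visible-ish text."""
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.chunks).split())

page = "<html><body><h1>About us</h1><p>We sell widgets.</p></body></html>"
print(word_count(page))  # 5
```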
How to fix:
Unfortunately, the fix is relatively laborious, as you will need to review every page manually. If you are happy with the content and don’t think anything needs doing, leave it alone (and don’t add words for the sake of adding words: if what you need to say can be written in 90 words, write 90 words).
If, after reviewing your content, you realise it could do with an update, go ahead and rewrite the post. Things may have changed since you wrote it, giving you more to talk about, or your expertise may have grown, allowing you to add depth. Don’t pad your copy, but if you can genuinely expand on a topic, that’s great.
Other pages, you may realise, are no longer needed and serve no real purpose. Delete them and redirect their URLs somewhere more suitable, then crawl the site again to check you haven’t introduced new redirect chains. Google will continue to crawl deleted pages for a while, so it’s much better to redirect them than to leave a dead page. The key is to remove the internal links so that Google stops adding those URLs to its crawl queue.
Getting More Advanced
If you really want to move past the basics and take your auditing to the next level, start analysing your server logs to see exactly what Googlebot is doing on your website.
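A first pass at log analysis is simply filtering your access log for Googlebot requests and seeing which URLs and status codes it hits. The sketch below assumes the common Apache/Nginx combined log format; the entries are invented, and in production you would also verify Googlebot's identity via reverse DNS rather than trusting the user-agent string:

```python
# A sketch of pulling Googlebot requests out of an access log.
import re

# Matches the request and status portion of a combined-format log line,
# e.g. "GET /old-page HTTP/1.1" 404
LINE_RE = re.compile(r'"(?P<method>[A-Z]+) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

def googlebot_hits(log_lines):
    """Return (path, status) pairs for requests identifying as Googlebot."""
    hits = []
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        match = LINE_RE.search(line)
        if match:
            hits.append((match.group("path"), int(match.group("status"))))
    return hits

# Two invented log entries: one Googlebot hit on a dead page, one normal visitor.
log = [
    '66.249.66.1 - - [01/Jan/2024:00:00:01 +0000] "GET /old-page HTTP/1.1" 404 0 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.9 - - [01/Jan/2024:00:00:02 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(googlebot_hits(log))  # [('/old-page', 404)]
```

Seeing Googlebot repeatedly hitting 404s or redirect chains in your logs is exactly the crawl-budget waste the earlier sections describe.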
Whether you manage a small website or a huge ecommerce site like Amazon, Victorian Plum or any other large brand, carrying out regular audits is important. Once you have identified the issues, fixing them is what delivers the results.