Some SEO health checks can be very detailed and can require or promote paid-for tools.
Not this one.
This SEO guide is designed to help you quickly check 10 different aspects of your website that may be affecting your web presence. All the tools I use in this guide are free, easy to use, easy to understand and, most importantly, actionable.
Caveat: SEO is a vast topic! I won't cover everything, but I will cover the areas that either have a big impact on site performance or introduce you to incredibly useful tools.
The areas this SEO health check covers are:
1. Checking your title tags and meta descriptions
Title tags are important because they act as a summary of your content in the search results, help Google decide a page's relevance to a user's search, and act as a ranking factor. Meta descriptions, on the other hand, act as a brief summary of what the page contains; they are not a ranking factor, but they can reinforce a user's decision to click on your URL.
Title tags: What should you be aiming for?
- Avoid duplication – each page should be unique, so each title tag and description should be too
- Include your keyword in your title tag and your description
- Aim for your title and description to be less than 55 and 155 characters respectively
How to check your title and meta descriptions
To check your title tags and descriptions a page at a time, download the MozBar and click on the Page Analysis button. This will show you the title tag and meta description information for that page.
If your website has a fair number of pages, you can use Google Search Console (previously called Webmaster Tools) to find a summary of your site's content. This will allow you to identify any elements that are duplicated, too long or too short.
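If you're comfortable running a little Python, here is a rough sketch of the same check, assuming the third-party requests and beautifulsoup4 packages are installed and using example.com as a stand-in for your own URL:

```python
# A rough sketch: fetch a page and report its title tag and meta description
# lengths against the 55 / 155 character guidelines mentioned above.
# Assumes the third-party packages requests and beautifulsoup4 are installed.
import requests
from bs4 import BeautifulSoup

def check_page(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta["content"].strip() if meta and meta.get("content") else ""

    print(url)
    print(f"  Title ({len(title)} chars): {title}")
    print(f"  Description ({len(description)} chars): {description}")
    if len(title) > 55:
        print("  Warning: title is longer than 55 characters")
    if len(description) > 155:
        print("  Warning: description is longer than 155 characters")

check_page("https://www.example.com/")  # replace with your own URL
```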
2. Checking your on page copy
Headings (H1, H2, etc.), alt tags and keyword density should all be considered, but this shouldn't be at the expense of how engaging and useful your content is. There's no point getting content found if it sounds spammy and repetitive.
On page copy: What should you be aiming for?
- Word count – As a guide aim for 300+ words as a minimum
- Keyword density – avoid sounding spammy, but use your keyword and don't be afraid of using synonyms (Google's use of Latent Semantic Indexing means it knows the general context of a page without you having to repeat your keywords)
- Consider the use of keywords in your headings
- Use pictures – pictures are great at explaining things quickly, and alt text helps describe the pictures to visually impaired customers
What to do: go back to the MozBar and pick the Page Analysis option. The same view you used to look at your title tags and meta descriptions lets you check all of the on-page factors that go into making a web page relevant for a particular query.
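If you'd rather check these factors with a script, here is a rough Python sketch (again assuming requests and beautifulsoup4, with example.com standing in for your own URL) that reports word count, headings and images missing alt text:

```python
# A rough sketch of the on-page checks described above: word count,
# headings, and images missing alt text.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/"  # replace with your own URL
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

word_count = len(soup.get_text(" ", strip=True).split())
print(f"Approximate word count: {word_count} (aim for 300+)")

for heading in soup.find_all(["h1", "h2", "h3"]):
    print(f"{heading.name.upper()}: {heading.get_text(strip=True)}")

missing_alt = [img.get("src", "") for img in soup.find_all("img") if not img.get("alt")]
print(f"Images without alt text: {len(missing_alt)}")
```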
3. A quick check to see if Google thinks you have duplicate content on your site
Google doesn’t want multiple copies of the same information in its index, as they serve little purpose for a user’s experience. To see whether Google thinks you have duplicate content on your site, you can do a simple check using a command in the Google search box:
Type “info:www.your-domain-name.com” and click on ‘pages from the site’.
You will then need to go to the last page of the results that Google has for your site and see what message you get. If you see a message saying that some similar results have been omitted, you may have a duplicate content issue that you need to look at.
Having some pages omitted is to be expected, especially on a large site. However, if you are having trouble getting pages indexed that you think should be indexed, this could point to the cause.
4. Checking the page loading times of your site
Google wants to provide users with relevant answers quickly. That means speed. We’ve all been on websites that take an age to load, and the likely response is to leave the page and go elsewhere. The increase in mobile usage means that page load speed has become a ranking factor and should not be ignored.
There are loads of free tools for checking the speed of your site, but you might as well go straight to the horse’s mouth. Use Google’s PageSpeed Insights tool and you will get a list of suggestions to pass to your developer.
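The same data is available programmatically. Here is a hedged sketch using Google's PageSpeed Insights API (v5); the endpoint and response fields below are as documented at the time of writing, and example.com stands in for your own URL:

```python
# Fetch the mobile performance score and failing audit titles from the
# PageSpeed Insights v5 API. Assumes the requests package is installed.
import requests

url = "https://www.example.com/"  # replace with your own URL
api = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
result = requests.get(api, params={"url": url, "strategy": "mobile"}, timeout=60).json()

score = result["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}/100")

# Each audit includes a human-readable suggestion you can pass to your developer.
for audit in result["lighthouseResult"]["audits"].values():
    if audit.get("score") is not None and audit["score"] < 0.9:
        print(f"- {audit['title']}")
```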
5. Checking if your site is mobile friendly
If you compared how many times you use your phone in a day to how many times you use your desktop, you wouldn’t be surprised to learn that having a mobile-friendly website is a ranking factor. Google wants to promote sites that cater for users on mobile devices. As above, there is a very quick and simple test you can run through Google’s own mobile-friendly test.
This is a quick way to identify which page resources Google can load and which it cannot, i.e. what Google can “see”. It can highlight issues to address with your web developer.
6. Checking the size of your images
There’s no excuse to have huge image files on your site because there are so many free tools that will allow you to reduce the size and compress files without affecting image quality too much. It is a fine balance though. You need a good enough quality file to be useful to the user without them having to use all their data just to see a banner image.
First, you need to crawl your site and look for files that are too big. You can do this with one of my favourite tools, Screaming Frog (free for up to 500 URLs).
Download the free version of Screaming Frog, make sure “Check Images” is selected and the mode is set to “Spider”.
Then simply enter your website URL and click Start. Once the crawl is finished you will be able to see all the images that Google can crawl on your site. Scroll along to the image size column and sort by the largest. If you have large image files that you think you can reduce and compress, get them sorted!
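If you'd rather spot-check a single page without running a crawler, here is a rough Python sketch (assuming requests and beautifulsoup4, with example.com as a stand-in) that lists the images on a page and their file sizes:

```python
# List the images referenced by a page and their file sizes, taken from the
# Content-Length header. urljoin resolves relative image paths.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

page = "https://www.example.com/"  # replace with your own URL
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

sizes = []
for img in soup.find_all("img"):
    src = urljoin(page, img.get("src", ""))
    head = requests.head(src, allow_redirects=True, timeout=10)
    size = int(head.headers.get("Content-Length", 0))
    sizes.append((size, src))

# Biggest files first - candidates for compression.
for size, src in sorted(sizes, reverse=True):
    print(f"{size / 1024:.0f} KB  {src}")
```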
There are loads of free tools you can use to edit images and compress image files. The two I use are:
7. Checking broken pages
It’s important to keep a check on pages that were once live and now are not. Pages can break (return a 404) for many reasons, such as accidental deletion or a URL change. Although 404 pages aren’t in themselves a ranking factor, the missing pages could still be indexed, could still be generating traffic and may have inbound links. If you lose these pages, you lose the traffic and any value the links were passing to your website. It’s therefore best practice to review your broken pages and redirect them to a relevant page.
To identify broken pages, log in to Google Search Console (mentioned above) and navigate to Crawl > Crawl Errors > the “Not Found” tab. Here you will see a list of all the pages Google has crawled that are returning a 404 server response.
You can then highlight all of these pages and download them to a file. There are tools you can use to further qualify these URLs and see whether they are worth redirecting, but at this stage we just want to perform an SEO health check, not a full in-depth audit. So do a visual check of the URLs and see if any really should be working. Make a list and speak to your web developer, or if you have access, update your redirect list yourself.
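To double-check what those exported URLs return today, here is a rough Python sketch (assuming requests, and a plain text file with one URL per line; the filename is just an example):

```python
# Request each URL exported from Search Console and report its current
# status code, so you can see which ones still 404.
import requests

with open("not_found_urls.txt") as f:  # example filename: one URL per line
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = "error"
    print(f"{status}  {url}")
```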
8. Checking IP location
Although many websites use servers in foreign locations, there are still many people who believe the location of your server can affect your local search visibility. So if your website is hosted in Germany, Google is more likely to think that your website is more useful to users in Germany than to users in York in England, for example. It’s not going to be a game changer if your site is hosted elsewhere, as Google takes into account many other variables, like your TLD (.co.uk) or the territory you have specified as most relevant in Search Console.
The main thing is that your website is hosted in a way that gives your users fast access to it. One way to do that is to make sure it’s hosted locally (see Google’s FAQ).
You can check the location of your IP by going to a website that does this automatically, for example IP Location.
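If you first need the IP address itself, a tiny Python sketch will resolve it for you (example.com stands in for your own domain); you can then paste the address into an IP location lookup service:

```python
# Resolve the IP address your domain points to.
import socket

ip = socket.gethostbyname("www.example.com")  # replace with your domain
print(ip)
```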
9. HTTP Vs HTTPS
If you see HTTPS before your domain name, it means your website uses SSL to transfer data. In other words, the data is encrypted and more secure than over HTTP. As this provides more security for the user, Google is more likely to promote a site that uses HTTPS above one that uses HTTP. If your website doesn’t use HTTPS, don’t panic. It may be that there’s no business case for your website to change over to HTTPS. The acid test is to ask yourself: “if I was a customer coming to my website, would I want any data I entered to be encrypted and secure?”
Google certainly thinks so, and because of this HTTPS is fast becoming an expectation for many websites; some browsers even warn users if the site they are visiting is not on HTTPS.
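A quick way to check whether visitors who type the plain HTTP address end up on HTTPS is to follow the redirects yourself. Here is a small Python sketch (assuming requests, with example.com as a stand-in for your domain):

```python
# Request the HTTP version of the site and see where it ends up.
import requests

response = requests.get("http://www.example.com/", allow_redirects=True, timeout=10)
final_url = response.url

if final_url.startswith("https://"):
    print(f"Redirects to HTTPS: {final_url}")
else:
    print(f"Still served over HTTP: {final_url}")
```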
10. Checking your robots.txt file
Robots.txt is a file that acts as a map for Google and other bots. It tells Google how to crawl your website: which pages to crawl and which not to. Google ideally wants to access everything so it can choose what is and isn’t relevant, but you may decide that certain areas of your site should be off limits to Google, e.g. client login pages. By adding a snippet of code to this file, you can tell Google not to crawl those areas.
In the example below, Hallam Internet is requesting that any URLs starting with /events-calendar/ are not crawled.
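A rule like that looks something like this in the robots.txt file (a reconstruction based on the description above, not the site's actual file):

```
User-agent: *
Disallow: /events-calendar/
```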
This file is important! Essentially, what you are doing in this file is allowing or restricting access to the pages on your website. It is easy to accidentally write a snippet of code which tells Google not to crawl the entire site (we have seen this before).
If you have a robots.txt file, it should be located at the root of your domain, for example “www.example.com/robots.txt”. You can easily check this file by going to Search Console > Crawl > “robots.txt Tester”.
This will show you whether there is a robots.txt file and whether there are any suggestions for improvement. If you are in doubt as to whether a page is accessible, you can also type in the URL to see if it is blocked or not.
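You can also check this from a script: Python's standard library includes a robots.txt parser. Here is a minimal sketch, using example.com and the /events-calendar/ path from the example above as stand-ins:

```python
# Check whether given URLs are blocked for Google's crawler by robots.txt.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.example.com/robots.txt")  # replace with your domain
parser.read()

for url in ["https://www.example.com/", "https://www.example.com/events-calendar/"]:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'allowed' if allowed else 'BLOCKED'}  {url}")
```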
So there you go, some quick and free ways that you can check some of the important SEO factors for your website – a cheap and cheerful website SEO health check!