There are plenty of companies offering SEO audit services, particularly since so many websites have been hit by various Google algorithm updates.
Aims
The key objective of any SEO audit should be to identify any existing problems with the search engine optimisation of a site. If a site has been hit by a penalty, then a complete online marketing audit should also enable the causes of such a penalty to be identified.
Key Elements
The main elements of the SEO process can be broken down, with each being reviewed in turn. The most important aspects are as follows:
- Keyword identification
- On-site optimisation elements
- The link profile
- The way the site is viewed by the main search engines
- Social media interaction
Each of these is considered in turn in the remainder of this post.
Keyword identification
Any SEO project must start with the correct identification of target keywords. As a result, it makes sense to start an SEO audit by considering the target keywords that are being used.
This may be something that’s easy to do, if such targets are listed within the meta keywords entry. Indeed, it may be that all of the individual elements of the site reflect the list of target keywords.
It’s not always as simple as might be imagined, however. It may require a quick conversation with the client to confirm their intentions.
Having discussed the target keywords, it makes sense to look at:
- The number of searches being conducted using those phrases
- The level of competition being faced
Within the opening paragraphs of the SEO audit document, there should be a suitable discussion of the keyword targeting. Recommendations could then be made, with an explanation of why particular keywords have been selected.
On-site optimisation elements
Having considered the target keywords, it’s time to start looking at the content, tags and structure of the site.
Google Webmaster Tools
Is there a Google Webmaster Tools account in place? If not, there probably should be. Registering for an account and getting the authorisation in place won’t take long and it’s clearly time that’s well spent.
With the Webmaster Tools login, pay particular attention to crawl stats and 404 errors. These may well point to problems with the site structure that can be corrected.
Google Analytics
A quick check of the Google Analytics installation is also worthwhile. If it’s appropriate, check that conversion tracking is in place too.
Robots.txt
The robots.txt file is used to restrict access to the site by search engine crawlers. You need to make sure that you are restricting access in the right manner. Are you, for instance, inadvertently stopping the search engines from seeing pages that you would actually like to see included within the index?
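As a quick check, a short script can confirm that none of the pages you care about are being blocked. Below is a minimal sketch using Python’s standard library; the site address and the list of paths are placeholders for the site being audited.

```python
# Minimal robots.txt check: flag any important pages that Googlebot is
# blocked from fetching. The site address and paths are placeholders.
from urllib.robotparser import RobotFileParser

SITE = "http://www.example.com"
PAGES_TO_KEEP_INDEXED = [
    "/",
    "/seo-audit/",
    "/contact/",
]

parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()

for path in PAGES_TO_KEEP_INDEXED:
    if not parser.can_fetch("Googlebot", SITE + path):
        print("Blocked by robots.txt:", path)
```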
Sitemaps
If navigation is correctly in place on a site, then a sitemap should be seen as little more than a backup. You may choose to place both HTML and XML sitemaps on the site.
As far as an XML sitemap is concerned, you obviously need to check that it has been constructed correctly. Has it also been submitted via the Google Webmaster Tools account?
You also need to be sure that the sitemap is comprehensive: check that every single page is included. By the same token, every page that appears within the sitemap file should also be accessible via the standard site navigation.
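A short script can take some of the drudgery out of this. The sketch below assumes the sitemap sits at the usual /sitemap.xml location; it lists the URLs in the file and flags any that don’t respond correctly. Comparing that list against the pages reachable through the navigation is then a separate, largely manual, job.

```python
# Pull the XML sitemap, list the URLs it contains and flag any that don't
# return a 200 status. The sitemap location is an assumption.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

urls = [loc.text for loc in tree.findall(".//sm:loc", NS)]
print("URLs listed in the sitemap:", len(urls))

for url in urls:
    try:
        status = urllib.request.urlopen(url).status
    except Exception as error:
        status = error
    if status != 200:
        print("Problem URL:", url, status)
```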
Navigation checks
How easy is it to navigate around the site? If you’re using some sort of JavaScript navigation, for instance, have you considered that some website visitors will have JavaScript disabled within their browsers?
Make sure that you have allowed for such issues by providing alternative navigation for those who may not be able to use the JavaScript version.
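One simple way to test this is to fetch the raw HTML, as a crawler without JavaScript would, and confirm that the key navigation links are present in the markup. A rough sketch follows; the page address and the expected links are assumptions.

```python
# Fetch the raw HTML (as a non-JavaScript client sees it) and check that
# the expected navigation links appear in the markup. URLs are placeholders.
import urllib.request
from html.parser import HTMLParser

PAGE = "http://www.example.com/"
EXPECTED_NAV_LINKS = ["/seo-audit/", "/services/", "/contact/"]

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.add(value)

with urllib.request.urlopen(PAGE) as response:
    html = response.read().decode("utf-8", errors="replace")

collector = LinkCollector()
collector.feed(html)

for link in EXPECTED_NAV_LINKS:
    if link not in collector.links:
        print("Not reachable without JavaScript:", link)
```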
Indexing check
You can use Google’s site: command (for example, searching for site:example.com) to check that the website is being indexed correctly. When doing so, you should largely be checking for two main elements:
- Is every page being included within the index? If not, why not? This may mean making further changes to the robots.txt file, site navigation and the sitemap.
- Is duplicate content being indexed? If so, this is something that could cause problems and that needs to be corrected.
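Scraping Google’s results programmatically isn’t advisable, so the site: checks are best done by hand. For the duplicate content side, though, a crude first pass can be run against the site itself, for instance by grouping URLs that return identical HTML. The URLs below are placeholders.

```python
# Crude duplicate content check: fetch a list of URLs (for instance,
# everything in the sitemap) and group pages whose HTML is identical.
import hashlib
import urllib.request
from collections import defaultdict

URLS = [
    "http://www.example.com/seo-audit/",
    "http://www.example.com/seo-audit",
    "http://www.example.com/index.php?page=seo-audit",
]

pages_by_hash = defaultdict(list)
for url in URLS:
    body = urllib.request.urlopen(url).read()
    pages_by_hash[hashlib.sha256(body).hexdigest()].append(url)

for digest, duplicates in pages_by_hash.items():
    if len(duplicates) > 1:
        print("Possible duplicates:", duplicates)
```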
Keyword-rich URLs
If you’re looking to get improved search engine positioning, then consideration needs to be given to the URLs. As an example:
http://www.example.com/seo-audit/
looks rather better than:
http://www.example.com/articles/39/aug2013/44562.html
The first example is also more likely to produce better search engine positioning. You can use URL rewrites to correct this problem and will also find that many content management systems (including WordPress) have built-in permalink settings or plugins that can assist.
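By way of illustration, the sketch below shows how a keyword-rich slug might be derived from a page title; it’s a simplified version of what WordPress-style “pretty” permalinks do, not a drop-in implementation.

```python
# Derive a keyword-rich, hyphen-separated slug from a page title. This is a
# simplified illustration, not how any particular CMS does it internally.
import re

def slugify(title):
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

print(slugify("SEO Audit: What Should It Include?"))
# -> seo-audit-what-should-it-include
```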
Page Titles
Do your page titles contain your target keywords? Together with the URL, the title provides a critical element that can be used to send signals to the search engines. It’s often suggested that the title should be kept under 70 characters, although there is some evidence that longer page titles can also produce impressive results.
You do need to be wary of over-optimising too and it’s a good idea to avoid repetition.
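A quick way to audit this across a handful of pages is sketched below; the URLs and target keywords are placeholders, and the 70-character figure is treated as a guideline rather than a hard limit.

```python
# Check that each page title contains its target keyword and isn't overly
# long. The URL-to-keyword mapping is a placeholder.
import re
import urllib.request

PAGES = {
    "http://www.example.com/seo-audit/": "seo audit",
    "http://www.example.com/contact/": "contact",
}

for url, keyword in PAGES.items():
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    title = match.group(1).strip() if match else ""
    if keyword.lower() not in title.lower():
        print(url, "- title is missing the target keyword:", repr(title))
    if len(title) > 70:
        print(url, "- title is", len(title), "characters long")
```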
Meta Keywords
Although the main search engines ignore meta keywords entries, it’s worth including them for completeness. They also offer a useful reference when writing titles and descriptions.
Meta Description
Although the meta description entry isn’t currently used to improve search engine positioning, it does appear within the search engine results pages.
This means that it can be used to encourage search engine users to click on the listing. That’s an SEO advantage that is so often overlooked.
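In the same vein as the title check, the sketch below flags pages with a missing or overly long meta description. The regular expression is deliberately crude (it assumes the name attribute appears before content), and the 156-character figure is only a rough guide to what tends to be displayed in the results pages.

```python
# Flag pages with a missing or overly long meta description. The URL list
# is a placeholder and the regex is a deliberately simple approximation.
import re
import urllib.request

URLS = [
    "http://www.example.com/seo-audit/",
    "http://www.example.com/contact/",
]

pattern = re.compile(
    r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
    re.IGNORECASE | re.DOTALL,
)

for url in URLS:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    match = pattern.search(html)
    if not match:
        print(url, "- no meta description found")
    elif len(match.group(1)) > 156:
        print(url, "- description is", len(match.group(1)), "characters long")
```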
Content
There are numerous articles online suggesting that you should concentrate on writing content that contains the right keyword density. That’s probably less of a priority than seeking to write informative, exclusive and well-researched pieces.
Examine grammar and spelling too. Websites that contain numerous spelling errors are unlikely to be associated with quality. They may not appeal to visitors, which can lead to poor engagement metrics and, in turn, weaker positioning.
Image ALT tags
The main search engines can’t read images, so it’s important that care is taken when associating text with images. This can be achieved with the IMG ALT attribute, although over-stuffing it with keywords will do you few favours.
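A simple check for missing ALT text can also be scripted. The sketch below inspects a single page; the address is a placeholder and it would need to be run against each key page of the site.

```python
# List images on a page that have missing or empty ALT text. The page
# address is a placeholder.
import urllib.request
from html.parser import HTMLParser

PAGE = "http://www.example.com/seo-audit/"

class ImageAltChecker(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attributes = dict(attrs)
            if not attributes.get("alt"):
                print("Missing or empty ALT text:", attributes.get("src", "(no src)"))

html = urllib.request.urlopen(PAGE).read().decode("utf-8", errors="replace")
ImageAltChecker().feed(html)
```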
The link profile
The existing link profile of a site will go a long way towards explaining whether that site is currently under a penalty. With that in mind, you should look out for potential warning signals, including:
- Any warnings within Google Webmaster Tools
- A high proportion of links from poor quality websites
- A lack of anchor text variation
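On the anchor text point in particular, a quick distribution check can be run against a backlink export. The sketch below assumes a CSV file with an “anchor” column; the file name and column name are placeholders for whatever your link data tool actually produces.

```python
# Summarise anchor text distribution from a backlink export. The CSV file
# name and its "anchor" column are placeholders.
import csv
from collections import Counter

anchors = Counter()
with open("backlinks.csv", newline="", encoding="utf-8") as handle:
    for row in csv.DictReader(handle):
        anchors[row["anchor"].strip().lower()] += 1

total = sum(anchors.values())
for anchor, count in anchors.most_common(10):
    share = 100 * count / total
    print(f"{share:5.1f}%  {anchor}")
    if share > 30:
        print("        ^ heavy repetition of this anchor text is a warning sign")
```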
Search engine visibility
How is the site currently being ranked? Does it seem to be doing worse than would be expected?
You may have enough information to confirm that there’s been a loss of positioning over time. Look at whether particular pages have been hit, or whether it looks like there is a site-wide problem.
Social media interaction
There are undoubted advantages to using social media as part of a wider SEO campaign. What you need to consider is whether accounts are in place, whether they are linked to the site and whether they are regularly updated.