Keep Your Site in Good Shape with Webmaster Tools


Rob Marsden

Head of Digital - Earned and Owned


Technical SEO

A healthy website is always a desirable starting point for search engine optimisation. If you are a site owner, you can use Google Webmaster Tools (WMT), a free tool from Google that helps you configure your site and spot issues. It's through this tool that Google will contact you about any issues relating to the site. These messages include:

  • Increases in not found errors
  • Googlebot cannot access your site
  • Big traffic changes for top URL
  • Possible outages
  • Notice of suspected hacking
  • Unnatural link warnings

You can see there are a number of very important messages there, so keeping an eye on these is crucial.

To start using Webmaster Tools there are only a few simple steps.

1. Set up a Google account using the email address you wish to use to sign into WMT

2. Sign into WMT

3. Click to add a new site

4. Enter your home page URL, then press Continue

5. You will now be presented with options to verify ownership of the site.

a. If you choose the HTML verification file method, you simply upload the file to the root directory, click the link to confirm it's been uploaded correctly, then click Verify – very simple (a quick sanity check of this method and the HTML tag method is sketched after this list)

b. If you click on the Alternate methods tab there are three further options:

i. HTML tag – you simply place a snippet of code (a meta tag) into the HTML of your home page

ii. Google Analytics – if you already use GA then you can verify your site in WMT. For this to work, though, you need administrator privileges in GA and you must be using the asynchronous tracking code.

iii. Domain name provider – if you have login details for your domain name provider then, as long as the feature is available, you can add a new TXT or CNAME record. Some companies have a domain registrar verification tool which can be used instead of adding one of the records mentioned above.
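Before clicking Verify, it can save a round trip to check the file or tag is actually in place. Below is a minimal sketch in Python (standard library only); the domain and verification file name are placeholders, not real values:

```python
# A minimal sketch for sanity-checking the two HTML-based verification
# methods before you click Verify. The domain and token are placeholders.
import urllib.request

SITE = "http://www.example.com"  # placeholder - your home page
VERIFICATION_FILE = "google1234567890abcdef.html"  # placeholder file name from WMT

# Method (a): the HTML verification file should be reachable in the root
with urllib.request.urlopen(SITE + "/" + VERIFICATION_FILE) as resp:
    print("Verification file status:", resp.status)  # expect 200

# Method (b)(i): the meta tag should appear in the home page's HTML
with urllib.request.urlopen(SITE) as resp:
    html = resp.read().decode("utf-8", errors="replace")
print("Meta tag present:", 'name="google-site-verification"' in html)
```

If the file request fails or the meta tag check prints False, fix the upload or the tag before attempting verification.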

Once you have your site verified, it's time to start configuring your site and looking for errors.

There are a number of pages to look at here. [Image: the Webmaster Tools menu, showing each page.]

The sections below cover each page, with brief notes on what each feature does and how it can be used to aid the health of your site:

 

Configuration

Settings
Within this page there are three settings you can change – geographic target, preferred domain and crawl rate. Geographic target allows you to specify which country your target audience resides in. Changing this to UK can have a positive ranking effect in Google.co.uk but may have a detrimental one in Google.com, Google.de, Google.fr and so on.

If you want to target UK users only then it's wise to select UK, whereas if you want to attract worldwide users it's best not to specify a geographic target. If you had a .com domain with multiple sub-folders (e.g. /uk/, /de/), you could add each of these as a new site and geotarget each one individually.

The preferred domain option allows us to display URLs with or without the www. I tend to select www, just because I like it! In reality it doesn't matter, as long as both versions are handled correctly with 301 redirects or canonical tags to avoid duplicate content (a quick check is sketched below).

The crawl rate feature lets you limit Google's crawling of your site – in the main this is left alone and set to its default, 'Let Google optimize for my site'.
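On the preferred domain point above: whichever version you choose, it's worth confirming the other version really does 301. A minimal sketch, assuming a placeholder example.com where www is the preferred version:

```python
# A minimal sketch (standard library; example.com is a placeholder) to confirm
# the non-preferred hostname 301-redirects to the preferred www version.
import http.client

conn = http.client.HTTPConnection("example.com")  # the non-www version
conn.request("HEAD", "/")
resp = conn.getresponse()

# Expect 301 with a Location header pointing at http://www.example.com/
print(resp.status, resp.getheader("Location"))
```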

Sitelinks
If you search for your brand name you will generally see sitelinks underneath the root – these are links to deeper pages which Google sees as important enough to pull into a SERP. Sometimes Google displays links you may not want to show, so it's here that we can demote sitelinks.

 

URL Parameters
URL parameters are the parts of a URL that appear after a question mark. They can be used for filtering and tracking but can also cause duplicate content issues. If Google has discovered any parameters whilst crawling your site, they will be displayed here. If there are any displayed, you can specify to Google how they should be handled by clicking Edit, then selecting the appropriate options. Changing settings here can have a massive effect on the number of URLs being indexed by Google, so we recommend changing anything here only if you know how parameters work and the effect your changes will have.
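To see why parameters matter, here is a short sketch showing how two parameterised URLs can point at one page, and one way to normalise them; the parameter names are illustrative assumptions, not a list Google uses:

```python
# A sketch of how parameters can create duplicate URLs for one page, and one
# way to normalise them. The parameter names here are illustrative only.
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

TRACKING_PARAMS = {"sessionid", "utm_source", "utm_medium", "utm_campaign"}

def normalise(url):
    parts = urlsplit(url)
    # Keep parameters that change page content; drop pure tracking ones
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

# Both of these serve the same content, so Google may index duplicates
print(normalise("http://www.example.com/shoes?sort=price&sessionid=abc123"))
print(normalise("http://www.example.com/shoes?sort=price&utm_source=email"))
# -> http://www.example.com/shoes?sort=price (both)
```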
Change of Address
This feature is used as part of the domain migration process. When the site is live on the new domain and all the 301 redirects are working, we can tell Google that the site has moved. To do this, though, both the old and the new sites need to be verified in WMT.

 

Users
Here you can grant other users access to your data. If you give them restricted access they can only see the data, whereas giving them full access allows them to make changes.

 

Associates
An associate is a trusted user who can perform tasks relating to other Google products but can't see any data or make any changes in WMT.

 

Health

Crawl Errors
Here we can see any crawl errors Google has discovered. These could be server errors, 404 pages, soft 404 pages, not followed pages and so on. Have a look through each one and, if necessary, download the data to mine in Excel. Normally there is a pattern, and simple fixes will make a big difference to these numbers. In an ideal world there would be no crawl errors at all, but it is completely normal to have some 404 pages – if you can clean them up, though, do.
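As a starting point for that mining, a sketch like the following groups errors by top-level site section; the file name and the 'URL' column are assumptions about the export, so adjust them to match your own download:

```python
# A starting point for mining a crawl-errors export. The file name and the
# "URL" column are assumptions - check your own download's header row.
import csv
from collections import Counter

errors_by_section = Counter()
with open("crawl_errors.csv", newline="") as f:  # hypothetical file name
    for row in csv.DictReader(f):
        path = row["URL"].split("//", 1)[-1]  # strip the scheme
        section = "/" + path.split("/")[1] if "/" in path else "/"
        errors_by_section[section] += 1

# The most error-prone sections of the site usually share one fixable pattern
for section, count in errors_by_section.most_common(10):
    print(f"{count:5d}  {section}")
```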

 

Crawl Stats
An overview of when and how Googlebot has been crawling your site over the last 90 days.

 

Blocked URLs
This is where we can specify the location of the robots.txt file and also test new lines we want to add to it against actual URLs to make sure it's working properly. Check here that no important pages have been blocked.
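To run that kind of test yourself outside WMT, here is a sketch using Python's standard urllib.robotparser module; the domain and sample URLs are placeholders:

```python
# A minimal sketch to test URLs against a live robots.txt, similar to what
# the Blocked URLs page does. example.com and the URLs are placeholders.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

for url in ["http://www.example.com/", "http://www.example.com/private/page"]:
    print(rp.can_fetch("Googlebot", url), url)  # False means the URL is blocked
```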

 

Fetch as Google
Here you can fetch a page as Google to make sure it can read and index the page properly. Once the success status is shown you can then submit the URL to the index – you only have 10 submissions but 500 fetches. The submit to index feature is normally used when migrating domains, in which case the 10 URLs should be chosen wisely. I would use one of the 10 on the HTML sitemap, so that once it is crawled Google should follow all the links from that page and index the other pages more quickly.
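Before spending one of those submissions on the HTML sitemap, it's worth confirming that page really links to everything you want indexed. A sketch, assuming a hypothetical /sitemap.html location:

```python
# A sketch (standard library) for checking an HTML sitemap page actually
# links to the pages you want indexed. The sitemap URL is hypothetical.
from html.parser import HTMLParser
import urllib.request

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href" and v)

with urllib.request.urlopen("http://www.example.com/sitemap.html") as resp:
    parser = LinkCollector()
    parser.feed(resp.read().decode("utf-8", errors="replace"))

print(len(parser.links), "links found on the HTML sitemap page")
```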

 

Index Status
The basic view of the index status shows you how many URLs have been indexed, and by looking at the graph you can quickly see any trends, spikes or drop-offs. The Advanced tab shows the number of URLs blocked by robots.txt and 'not selected' – again, use the graphs to look for trends. If the number of not selected URLs is going up then it should be investigated.

 

Malware
Self-explanatory – if Google detects malware on your site it will be displayed here.

 

Traffic

Search Queries
Here we can see the keywords driving traffic to your site, the impressions they received, the click-through rate and the average position. Using the Advanced tab we can also see the change over time. At the top, a date range can be selected, and there is also an option to see the pages users landed on from the SERPs as opposed to the keywords typed.
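Once the data is downloaded, queries with high impressions but a low click-through rate are a good place to start, as they often point at titles and descriptions worth rewriting. A sketch, with the file name and column names as assumptions about the export:

```python
# A starting point for mining a search-queries export. The file name, the
# column names and the thresholds are all assumptions - adjust to your data.
import csv

with open("search_queries.csv", newline="") as f:  # hypothetical file name
    rows = list(csv.DictReader(f))

# High visibility, low click-through: likely snippet problems
candidates = [
    r for r in rows
    if int(r["Impressions"]) > 1000 and float(r["CTR"].rstrip("%")) < 1.0
]
for r in sorted(candidates, key=lambda r: -int(r["Impressions"]))[:10]:
    print(r["Impressions"], r["CTR"], r["Query"])
```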

 

Links to Your Site
We can see here which sites are linking to yours. We are now seeing a lot more data than we used to, and the data can be downloaded for analysis.

 

Internal Links
Here we can see all the pages on the site which have internal links pointing at them. Look at the ones at the bottom of the list – why do they have few or no internal links pointing at them? Would they benefit from more? URLs with no links to them may have been accidentally orphaned from the main site – if that is the case, it should be looked into.
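One way to spot orphans is to compare the URLs in your XML sitemap against the internal-links export: anything in the sitemap that never appears in the export deserves a look. A sketch, with file names, formats and the 'URL' column as assumptions:

```python
# A sketch for spotting potentially orphaned pages: URLs listed in the XML
# sitemap that never appear in the internal-links export. File names and
# column names are assumptions.
import csv
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_urls = {
    loc.text.strip()
    for loc in ET.parse("sitemap.xml").getroot().iterfind("sm:url/sm:loc", NS)
}

with open("internal_links.csv", newline="") as f:  # hypothetical export
    linked_urls = {row["URL"] for row in csv.DictReader(f)}

for url in sorted(sitemap_urls - linked_urls):
    print("No internal links found for:", url)
```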

 

Optimization

Sitemaps
Here we can specify the sitemap file and see how many of the URLs contained in the sitemap are in the index. If you don't have a sitemap you should create one and specify it here.
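A sitemap is just an XML file listing your URLs. A minimal generation sketch using the standard library, with placeholder URLs you would replace with a real list from your CMS or a crawl of the site:

```python
# A minimal sketch for generating an XML sitemap. The URL list is a
# placeholder - in practice you would pull it from your CMS or a crawl.
import xml.etree.ElementTree as ET

urls = ["http://www.example.com/", "http://www.example.com/about/"]  # placeholders

root = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in urls:
    ET.SubElement(ET.SubElement(root, "url"), "loc").text = url

ET.ElementTree(root).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```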

 

Remove URLs
Here you can submit a removal request, which means asking Google to remove a URL from their index. There are criteria which need to be met before doing this though, so please read the documentation.

 

HTML Improvements
This page shows things like duplicate meta titles and descriptions, long and/or short descriptions, and non-indexable content. The duplicate titles are the most important thing here – look for patterns and try to find out why they are duplicated. It may be that the page content is exactly the same; this is a big issue and you should look at employing canonical tags to fix it. If the page content is different but the titles are the same, then the titles simply need changing. This can happen when there are paginated sets of pages which just use the same title.
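A quick way to surface those patterns from a URL/title export (the input file and its columns are assumptions, so match them to whatever crawl data you have):

```python
# A sketch for grouping duplicate titles from a URL/title export so the
# patterns described above stand out. The input format is an assumption.
import csv
from collections import defaultdict

pages_by_title = defaultdict(list)
with open("pages_and_titles.csv", newline="") as f:  # hypothetical export
    for row in csv.DictReader(f):
        pages_by_title[row["Title"]].append(row["URL"])

for title, urls in pages_by_title.items():
    if len(urls) > 1:  # more than one page sharing a title
        print(f"{len(urls)} pages share the title {title!r}:")
        for url in urls:
            print("   ", url)
```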

 

Content Keywords
Simply tells you which words Google sees as the most relevant to your site.

 

Structured Data
As structured data is becoming more common (and important), it makes sense to keep an eye on this page to see how Google is detecting your mark-up. You can click into each data type to see the pages which contain the code.
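Structured data can be written in several syntaxes; purely to illustrate the shape of the information Google is detecting here, this sketch serialises a schema.org Event as JSON-LD, with placeholder values throughout:

```python
# Illustration only: a schema.org Event serialised as JSON-LD to show the
# shape of structured data. All values are placeholders.
import json

event = {
    "@context": "http://schema.org",
    "@type": "Event",
    "name": "Example Conference",  # placeholder values throughout
    "startDate": "2013-06-01",
    "location": {"@type": "Place", "name": "Example Venue"},
}
print(json.dumps(event, indent=2))
```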

 

Data Highlighter
This is a new and very exciting feature. Previously, implementing structured data meant lots of code and developer time. Now, with the highlighter, you can specify this data in WMT itself. It currently only supports events, but it should be rolled out to support other rich data types.

 

Other Resources
This page just contains three links to other Google sites which may help when looking into search verticals.

 

Labs

Author Stats
This shows impressions and clicks on content which contained your authorship mark-up. This is a good place to test whether changing your profile picture had a positive or negative effect on click-through rates.

 

Custom Search
Allows your users to perform searches on your site which are powered by Google.

 

Instant Previews
Instant previews appear in SERPs to give the user a glimpse of what a page looks like before clicking through to the site. With this tool we can see what Google is showing users for any page on the site you specify.

 

Site Performance
You will still see this in WMT but the functionality has been removed and replaced with links to other resources. I recommend you follow these links, as site speed is a ranking factor, so it's important for SEO (and usability) to ensure you don't have a sluggish website.

 

Once you have gone through each page, corrected any errors and amended the settings so they are right for your site, your website's health will be looking pretty good. Don't just leave it there though – perform regular checks and keep up to date with any errors. Your settings are likely to stay the same, but they may need to change if any on-site work has been done (maybe new parameters will be reported which need dealing with).
