On-page SEO glossary

Daniel Frank

Technical SEO

A while back Rhiannon put together an introductory glossary for Off-Page SEO. I moved over to On-Page recently and thought it might be useful to have a similar glossary for the more technical aspects of On-Page SEO. Aside from gaining a number of tools that sound like early 90s video games (Xenu and Screaming Frog, anyone?), there's a lot to learn, and a handy guide to tell your XML sitemap from your HTML sitemap could be useful.

Spider/Web Crawler:

These bots are the key to understanding whichever website you are working on. They are the same kind of programs as Googlebot, but the ones we work with run on a much smaller scale. We run them in the background to get a list of all the pages on a website so we can look for problems and areas for improvement.

Keyword Density:

This is exactly what it sounds like: the relative frequency of a keyword or phrase on a webpage. Back in the prehistory of the web (1997), keyword density was the best indicator search engines had of a page's relevance, but because spammers abused the technique it is now fairly irrelevant as a ranking signal. It's still worth keeping an eye on, however, to make sure you aren't guilty of keyword stuffing. A good rule of thumb is between 2% and 4%, but it's far more important to have good content that reads naturally than to obsess over whether you have too many or too few mentions of a keyword.
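To put a number on that rule of thumb: a keyword that appears 10 times in a 500-word page has a density of 10 ÷ 500 = 2%, at the lower end of the range.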

Title Tag:

If you are going to worry about keywords, this is where to do it. The title tag can be found in the <head> section of the page's source code. This is the title that search engines show in their results and that browsers show at the top of the page or tab.

The title tag tells the search engines what your page is about, so make sure your keyword is mentioned as early as possible. But be careful: you only have around 70 characters, sometimes fewer, to play with.
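As an illustration, an optimised title tag for the hypothetical kittens page used in the examples below might look like this (the wording is made up):

<title>Cute Kittens for Sale | Search Laboratory</title>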

Meta Description:

If the title tells the search engines what a page is about, the meta description tells the searchers. This is a short description of around 160 characters that also sits in the <head> section. Search engines will usually display it below the title in the search results.
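In the source code it looks something like this (again, the wording is invented for the example):

<meta name="description" content="Browse our range of cute kittens, updated daily with pictures and prices." />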

Keyword Tag:

You can safely ignore this one. Early spammers killed the keyword tag as a useful signal, and the major search engines no longer use it for ranking.
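For reference, this is what the now-ignored tag looks like (the values are just an example):

<meta name="keywords" content="kittens, cute kittens, buy kittens" />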

Duplicate Content:

Possibly the second most talked about subject in SEO (after links), duplicate content is less worrying for the on-page team than for the off-page team. The basic issue is that if you have two pages with the same or very similar content, it is harder for search engines to choose between them, and if people link to different versions the link juice can be split between them, making each less authoritative. It's important to remember that to search engines every separate URL is a different page, even when, to developers, different URLs point to the same page. So changing parameters or navigation paths can create duplicate content. For example,

https://www.searchlaboratory.com/animals/cutekittens and https://www.searchlaboratory.com/animals/cutekittens.html?sort=price

would count as separate pages.

Canonical Tag:

The canonical tag was created to solve the duplicate content problem. Inserting this tag into the <head> section tells the search engine which page is the original. If we take the duplicate example from above, putting the following tag in all versions of the page would solve the duplicate content issue:

<link rel="canonical" href="https://www.searchlaboratory.com/animals/cutekittens" />

301/302 redirect:

301 and 302 redirects tell search engines and browsers that land on one page to go to a different page instead. So a redirect could be placed on

https://www.searchlaboratory.com/animals/cutekittens/ to send people to https://www.searchlaboratory.com/animals/cutekittens.

The difference between the two is that a 301 redirect tells the search engines that the change is permanent, which means that most of the link juice and authority of the old page is passed to the new page, whereas a 302 is temporary, meaning that the old page retains these attributes. You should nearly always use a 301.
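As a rough sketch, on an Apache server (one common setup; other servers have their own equivalents) the redirect above could be set up with a single mod_alias directive:

Redirect 301 /animals/cutekittens/ https://www.searchlaboratory.com/animals/cutekittens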

Sitemap.xml and Robots.txt:

The XML sitemap and the robots.txt file are, respectively, a list of the pages on your website and a set of instructions for any web crawlers that visit it. You can also create a sitemap of the videos on your site. These XML sitemaps are separate from the normal HTML sitemaps designed for humans, and can be submitted via webmaster tools.
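As a rough illustration (the paths and URLs are made up), a minimal robots.txt and sitemap.xml for our hypothetical site might look like this:

robots.txt:

User-agent: *
Disallow: /admin/
Sitemap: https://www.searchlaboratory.com/sitemap.xml

sitemap.xml:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.searchlaboratory.com/animals/cutekittens</loc>
  </url>
</urlset>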
So that's our overly long rundown of some of the things we look at in the on-page department. If you'd like to find out more, you could always commission an SEO Report.