Diagnosing Your Website Using Google’s Webmaster Tools

diagnostics - webmaster tools

Google’s Webmaster Tools provides a set of diagnostics that enable you to make necessary repairs to your website. The information covers issues like the presence of malware, duplicate pages, broken links (404 errors) and suggestions for improving your HTML. So it seems the Google Bots are not so lazy after all!

Some of the information provided by Webmaster Tools gets very technical, so you may want to ask your Webmaster to have a look at the diagnosis (or get a knowledgeable friend or family member to do so). However, you should be able to act on some of the information yourself, and I will confine myself to a couple of these aspects.

Identifying and removing Malware

Webmaster Tools, as indicated in the above image, provides information about malware that is affecting your website. According to Malwarebytes, malware represents 97% of all online threats. Malware enables the perpetrator to steal your personal information (passwords, logins, etc.) and disrupt your computer. Malware includes computer invasions such as ‘viruses, worms, trojans, rootkits, dialers and spyware’. Webmaster Tools will identify the threats and give you links to advice on how to remove them. A better way to go is to be proactive and use a program that will detect, prevent and destroy malware. The anti-malware program that I use, and that is recommended by many technical experts (100 million downloads worldwide), is Malwarebytes.

If you take the opportunity to protect your website with an anti-malware program, then you will see the following message from Google’s Webmaster Tools (reported about my Small Business Odyssey site):

Google has not detected any malware on this site.

Clearing out duplicate webpages

Webmaster Tools also advises you of other problems with your website that can be readily rectified. One of these is where you have duplicate pages on your website. Sometimes you may have created a new version of an old page and accidentally left the old version on the website, resulting in duplication of the page. This happened recently with the website for my own human resources business. Google’s Webmaster Tools will advise you of this duplication under the ‘HTML suggestions’ menu item (see image above). The duplicate pages I identified were listed under the sub-menu item, ‘Pages with duplicate meta descriptions’ (in other words, pages whose descriptions were substantially the same).
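To see what is involved here: a meta description is a short summary that sits in the HTML head of each page and often appears as the snippet under your listing in Google’s results. The snippet below is purely illustrative (the page title and wording are made up), but it shows the kind of unique, page-specific description that avoids the ‘Pages with duplicate meta descriptions’ warning:

  <head>
    <title>Career Transition Coaching | Example HR Consultancy</title>
    <!-- Give every page its own description; copying the same tag onto
         several pages is what triggers the duplicate warning -->
    <meta name="description" content="One-on-one career transition coaching for employees affected by restructuring.">
  </head>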

Dealing with 404 errors – page not found

These ‘page not found’ errors (404) are reported under ‘Crawl errors’ (see image above – it means you’ve upset the Google Bots because they can’t find a webpage that was supposed to be present). 404 errors can occur because you have removed a page which is still showing in Google’s version of your sitemap, or because the page you linked to no longer exists for some reason. The Google Bots try to follow the links on your site and, when they hit a dead end, they report it as a 404 error.

I recently found a series of 404 errors reported in Webmaster Tools for the website of my human resource consultancy business. These errors resulted from the fact that we had changed the default ‘permalink’ structure for our blog posts (from an auto-generated number to the title of the post). The result was the following type of error reported in Webmaster Tools under ‘Crawl errors’:

Webmaster Tools - 404

Where possible, you should clean up broken links and, where appropriate, resubmit your sitemap so that Google has a more up-to-date version to follow.
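Where a page has simply moved or been renamed, one common way to fix the broken link (rather than leave the Google Bots at a dead end) is a 301 ‘permanent’ redirect, which sends both visitors and the Bots to the new address. The sketch below assumes an Apache web server (which most WordPress hosts use) and made-up page names – your own paths will differ, and a WordPress redirection plugin can do the same job if you prefer not to edit files:

  # In the .htaccess file at the root of your site (hypothetical page names)
  Redirect 301 /old-services.html http://www.example.com/services/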

Through its Webmaster Tools, Google gives you a lot of information to help you keep your website functioning properly and to make it easy for the Google Bots to index your website effectively.

Tell the Google Bots Where To Go!

site map with links

When I discussed Webmaster Tools earlier, I mentioned the need to create a sitemap for your website and submit it to Google. I will discuss how to create and submit a sitemap in this post because it is critical to the indexing of your website by Google and determines how your website will be found through search queries on Google (and other search engines).

A sitemap is basically, as the name suggests, a map or directory of your website, so that the structure and priority of the files on your website can be displayed for easy access by the search engines.   You can see from the sitemap extract above that the sitemap for Small Business Odyssey has a hyperlinked list of files, a priority rating (percentage) and a frequency rating (to tell the search engines how frequently to index that part of the website).  
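For those who like to look under the hood, an entry in an XML sitemap looks something like the sketch below. The page addresses, priorities and frequencies shown are illustrative only – they follow the standard sitemaps.org format rather than being copied from my actual file:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://smallbusinessodyssey.com/</loc>
      <changefreq>daily</changefreq>
      <priority>1.0</priority>
    </url>
    <url>
      <loc>http://smallbusinessodyssey.com/an-older-post/</loc>
      <changefreq>weekly</changefreq>
      <priority>0.6</priority>
    </url>
  </urlset>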

Why create a sitemap and why submit it to Webmaster Tools?

Well, in non-technical language, it seems that the Google Bots (robots that crawl your website) are lazy ‘creatures’ and do not go out of their way to properly index your site for the Google search engine. They take the easy way out – they only go where the path is clearly laid out for them. They don’t like dead ends (broken links) or confused pathways (disconnected files randomly located). When I look at how Google is currently indexing my Small Business Odyssey blog, I am even more convinced of how lazy the Google Bots are – it seems that they need to be spoon-fed the information, otherwise they do a poor job of indexing your website.

So the primary reason for creating a sitemap for Google is to enable the Google Bots to comprehensively index your website.   Otherwise, a lot of your website may not appear in Google’s index and will not be found by Internet searchers.  Google admits as much by this comment on Webmaster Tools:

Submit a Sitemap to tell Google about pages on your site we might not otherwise discover.

Creating an XML Sitemap

This brings us to the creation of a sitemap. I am suggesting that you create this sitemap as an XML file because it is easy for the lazy Bots to read completely. This sitemap format basically lets the Google Bots into the back engine room of your site and shows them around – where files are located and how they are linked by type (home page, static pages, dynamic pages, categories, tags).

If this post appears too technical for you, just make sure that your Webmaster has created an XML sitemap for your website and submitted it to Google.  

Here are the steps for creation of your XML sitemap:

  1. Download the free WordPress Plugin for the Google XML Sitemap Generator.
  2. Upload the XML Sitemap Generator to your website (via your WordPress Admin panel).
  3. Make adjustments to the default settings (if you wish).
  4. Click ‘create sitemap’ and you will very quickly have a sitemap and a stated location (URL) for your sitemap.

The beauty of this WordPress Plugin for creating Google XML Sitemaps is that it offers multiple options in terms of settings, automatically submits the sitemap to Google, Ask.com and Bing search engines and updates automatically when you change a file on your website.  So it is comprehensive and dynamic.

In terms of adjustments to default settings, most commentators suggest that you leave the defaults as they are – it certainly makes life simpler.  However, I would suggest that you may want to change the default for ‘priority’ – the default setting tells the Google Bots to give priority to the posts that have the most comments.  This may not be meaningful if you have a really new site.  I have set up my priorities in the following order –  home page, recent posts, static pages, older posts, categories and tags.  I will change this as the Small Business Odyssey site becomes more established and generates more traffic and comments. 

The other default setting you may want to change before you click the ‘create sitemap’ button is ‘Change Frequency’. For example, the default setting tells the Google Bots to index your posts weekly. However, if you are creating blog posts on a daily basis, you should change the frequency to daily. The Google Bots may ignore this suggestion (remember they are basically lazy), but it is better to at least express your wishes. Google’s own experts, such as Matt Cutts, tell us that the more frequently you update your site with relevant information, the more often the Google Bots will crawl your site and the deeper (more thoroughly) they will index your website.

I’ve made a few adjustments to the priority and frequency default settings for my XML sitemap and you can see the result here:

http://smallbusinessodyssey.com/sitemap.xml

Here’s a YouTube video that simplifies the whole process and shows you exactly what to do (there are no adjustments to defaults and the WordPress Plugin is downloaded directly to the Admin panel via the built-in Plugin search facility): 

 

How to submit your XML sitemap to Google’s Webmaster Tools 

You might wonder why we need to do this extra step as the WordPress XML Sitemap Generator automatically submits your sitemap to Google (and to Bing and Ask.com).  Well, I think it comes back to our lazy Google Bots again – they don’t go out of their way to find the sitemap, so you have to put it in front of them!   If you check out the screenshots below, you will also see how Google takes up the information from the sitemap on Webmaster Tools and begins to integrate it into its index.  So submitting the sitemap to Google’s Webmaster Tools is a way to get direct access to Google’s index (although it may take some time for all of the information to be indexed).
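Another simple way to put the sitemap right under the Bots’ noses is to reference it in your site’s robots.txt file, which search engine crawlers read whenever they visit. The single line below follows the standard sitemaps.org convention (substitute your own domain and sitemap location):

  # robots.txt at the root of your site
  Sitemap: http://smallbusinessodyssey.com/sitemap.xml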

The process of submission of your sitemap to Google’s Webmaster Tools is very simple:

  1. Log in to your Webmaster Tools site.
  2. Click on the web address (URL) for your verified website.
  3. Click the ‘site configuration’ menu item.
  4. Click the ‘sitemaps’ menu item.
  5. Enter your sitemap address where indicated (see image below).

  sitemap submission to webmaster tools

When you first submit your sitemap, the above image will appear with the messages ‘submitted URLs – 0’ and ‘index count pending’ (and status shown as ‘in progress’). Take heart, this is Google trying to identify all your files from the sitemap and integrate them into its index. After some processing time, you will see the following image that indicates successful submission:

Google indexing sitemap on Webmaster Tools

So this indicates that Google has taken on board your website pages (URLs) and has loaded them into its index. The actual indexing in terms of search terms (keywords) will occur over an unspecified period (you can’t rush the Google Bots).

Creating and submitting an XML sitemap to Google’s Webmaster Tools is critical for small business marketing because it ensures effective indexing of your website so that Internet searchers can find your website through your targeted search terms (keywords).

What If You Don’t Like the Search Queries Results You Are Getting?

search queries webmaster tools

In a previous post, I discussed how to register with Google’s Webmaster Tools and how to identify the results for search queries that bring people to your website.   The search queries results may not be what you were expecting and may, in fact, be quite disappointing.  So why would this be, given all the hard work you have put into writing content for your site?

The search queries results provide invaluable data about your site and how it is viewed by Internet searchers. Before you do anything else, just check the filters (e.g. geography) that you have used to report the data. It may be that the filters you have applied are excluding some search queries from your results.

One of the core reasons that your search queries results will differ from your expectations is the relevance of both your content and your site description.

How to improve your search queries results in Webmaster Tools

Let’s focus on relevance because that is the key issue determining whether your site gets included in Google’s search queries results for a particular query.

It may be that Google has not been able to find enough content on your site that is both useful and relevant to Internet searchers who have used a particular search query. If your site is unfocused, trying to cover multiple topics or unrelated topics, both Internet searchers and Google will be confused. As I mentioned in an earlier post, focus is so critical to your online results and your small business marketing success.

So here are some hints to improve the relevance of your content:

  • Focus on your reader – what is their level of understanding of what you are discussing on your site?
  • What language do your readers use to describe their problems/issues? – it is easy to outpace your readers with your own understanding of a particular issue
  • Find out where your customers are conversing – join in the conversation
  • Be conscious of your marketing style, particularly if you are in the coaching or consulting business
  • Check out Google AdWords Keyword Tool to see what terms people are using to search for information in your niche area – you will often be surprised! (make sure you sign in with your Google account details to get more complete results)
  • Visit blogs that are related to your niche to see what topics people are discussing, the language they are using and the problems/issues they are experiencing (take particular note of the blogs that have lots of comments)
  • Put yourself in your reader’s shoes – what would you be experiencing, what kind of help would you need, what would you be talking about?
  • Check out your site’s description as it appears on the Google search queries results – is your site description relevant to what you write about? (if not, change your site’s ‘description meta tag’ or get someone to change it for you).

Just focusing on the relevance of what you write, and how you write, can go a long way to improving your search queries results.