Analyzing Google Webmaster Statistics with Spreadsheets

The Google Tools

There are two Google tools that you can use to analyze the performance of your web site.  One is Google Analytics, the other is Google Webmasters.  These two systems give different views and statistics on your web site.

When you set up Google Webmasters you have to make sure that the URLs are set up properly, and you need to set up canonical references so that Google knows which version of each URL is the preferred one.

Once Google is providing statistics into the webmaster tool, you can start analyzing the data.  However, you may have to wait several months before you can do a proper analysis and see trends.

Google Webmasters provides statistics on how your web site is performing in Google Search.  In particular, it includes information on the queries that users entered into Google to arrive at a given page, click-through rates and information on the search position.  All of these give you an understanding of search engine performance for your web site.

Downloading Your Data

The following assumes that you have already set up your web site and that data is available.

1) Log into Google Webmasters using your Google account.

The Search Console will be displayed.

2) Click on the web site that you want to analyze.

In the left-hand navigation column, click on “Performance”, and on the right-hand side scroll down until you see a list of “Queries” with the number of clicks and impressions.

3) Go to the “Download” icon and click on it.

4) Save the download either as a CSV file or as a Google Sheets file.  I normally download in Google Sheets format and then open it in an old version of Microsoft Excel to do the calculations.  However, you can just work with Google Sheets as it does the job just as well.

5) Sort the complete spreadsheet in order of “Impressions”, highest to lowest.

6) Add in a new column called %Impressions.

We want to calculate each entry as a percentage of the total impressions.  So for each row:
%Impressions = ( Impression Cell Value ) / SUM( All Impression Values ) * 100

7) Add in a new column called %Cumulative

We want to build a running total of the percentages using the formula:
%Cumulative = %Impressions + ( previous row’s %Cumulative )

8) From the cumulative percentage we can look at the top 50% of queries, which are the most important, and do our analysis on those.  A worked example of the formulas is sketched below.
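As a rough sketch, assuming the Impressions figures are in column C with a header in row 1 and the new columns are D (%Impressions) and E (%Cumulative) – adjust the cell references and row count to match your own download – the formulas for steps 6 to 8 could look like this:

D2: =C2 / SUM($C$2:$C$1000) * 100     (%Impressions, copied down the column)
E2: =D2                               (first %Cumulative value)
E3: =E2 + D3                          (copied down the rest of the column)

You can then filter or highlight the rows where %Cumulative is still below 50 to isolate the most important queries.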

Doing The Analysis

We can now look at our listing and see which of the keywords in the top 50% have a low CTR (click-through rate).  These keywords are generating a high number of impressions for our site, but some of them are not returning a good CTR.

We can look at these queries and then try to work out how we can improve the click-through rate.

The Other Downloads

From the same location we can download “Queries”, “Countries”, “Pages” and “Devices”.  From each one we can investigate different issues. 

The first thing to notice is that if you add up the impressions in each download, the totals are different.  So what is going on here?

Well, Google has a page at:

https://support.google.com/webmasters/answer/7576553

There is a section called “Data Discrepancies” which gives some explanations for the differences.  Essentially, the issue with the queries download is that it is limited to 1000 rows, so there will be some loss of data, although the missing rows will be from queries that are rarely made.  Data is also lost for other reasons, such as the exclusion of queries containing “private” data, time lags in data processing, different processing techniques for different data sources and so on.

Comparing Across Dates

Luckily for us, Google has provided us with features to compare one set of data with another over time.

If you click on the Date Compare button, you can select Filter or Compare. In the Compare list you can then choose the date range for comparison.

The resulting CSV file provides a wealth of information on how the data changes over time.
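As a rough illustration, here is a small Python sketch that reads a comparison download and sorts the queries by the change in clicks.  The file name and column headers below are only placeholders – Google has changed the export layout over time, so check the header row of your own CSV and adjust the names to match.

import csv

# Placeholder names - adjust these to match the header row of your own download.
QUERY_COL = "Top queries"
CLICKS_BEFORE = "Clicks (previous period)"
CLICKS_AFTER = "Clicks (current period)"

changes = []
with open("Queries-comparison.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        try:
            delta = int(row[CLICKS_AFTER]) - int(row[CLICKS_BEFORE])
        except (KeyError, ValueError):
            continue  # skip rows that do not parse cleanly
        changes.append((delta, row[QUERY_COL]))

# Queries that lost the most clicks come first - these are the ones to investigate.
for delta, query in sorted(changes)[:20]:
    print(f"{delta:+d}  {query}")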

Example Spreadsheet File

If you click on the following link you can download an example xls spreadsheet which shows the spreadsheet formulas.

Click here for example spreadsheet.

 


Google introduces AdWords Express

Google AdWords Express is a cut-down version of the Google AdWords system.  The express version has fewer options, which can be an advantage for those who have a small business and don’t want to get into the complexities of setting individual parameters for keywords, or get involved in the many other options that are available.

The user has to have a Google+ local page and then they can create their campaign.  There is no concept of bidding for keywords; you just set up a budget.  Google AdWords Express was introduced to improve the local listing experience.


Links to your site using Google Webmaster

If you want to find out which web sites link to your site, one way is to use Google Webmaster Tools (http://www.google.com/webmasters).

The site has to be verified in the Webmaster Tools site, and then:

On the Webmaster Tools Home page, click the site you want.
On the left-hand menu, click Traffic, and then click Links to Your Site.


The robots.txt file

When it comes to SEO, most people understand that a Web site must have content, “search engine friendly” site architecture/HTML, and metadata such as title tags, image alt tags and so on.

However, some web sites totally disregard the robots.txt file.  When optimizing a Web site, don’t disregard the power of this little text file.

What is a Robots.txt File?

Simply put, if you go to www.domain.com/robots.txt, you should see a list of directories of the Web site that the site owner is asking the search engines to “skip” (or “disallow”). However, if you’re not careful when editing a robots.txt file, you could be putting information in your robots.txt file that could really hurt your business.

There’s tons of information about the robots.txt file available at the Web Robots Pages, including the proper usage of the disallow feature, and blocking “bad bots” from indexing your Web site.

The general rule of thumb is to make sure a robots.txt file exists at the root of your domain (e.g., www.domain.com/robots.txt). To exclude all robots from indexing part of your Web site, your robots.txt file would look something like this:

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/

The above syntax would tell all robots not to index the /cgi-bin/, the /tmp/, and the /junk/ directories on your Web site.

There are also situations where the robots.txt file can cause issues with your site optimisation.  For instance, if you include Disallow: / under User-agent: * in your robots.txt file, you are telling the search engines not to crawl any part of the web site, giving you no web presence – not what you want.
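That is, a robots.txt file of just these two lines tells every crawler to stay away from the entire site:

User-agent: *
Disallow: /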

Another point to watch out for: if you modify your robots.txt file to disallow old legacy pages and directories, you should really also do a 301 permanent redirect to pass the value from the old Web pages to the new Web pages.

Robots.txt Dos and Don’ts

There are many good reasons to stop the search engines from indexing certain directories on a Web site while allowing others, for SEO purposes.

Here’s what you should do with robots.txt:

* Take a look at all of the directories in your Web site. Most likely, there are directories that you’d want to disallow the search engines from indexing, including directories like /cgi-bin/, /wp-admin/, /cart/, /scripts/, and others that might include sensitive data (a combined example is sketched after this list).
* Stop the search engines from indexing certain directories of your site that might include duplicate content. For example, some Web sites have “print versions” of Web pages and articles that allow visitors to print them easily. You should only allow the search engines to index one version of your content.
* Make sure that nothing stops the search engines from indexing the main content of your Web site.
* Look for certain files on your site that you might want to disallow the search engines from indexing, such as certain scripts, or files that might contain e-mail addresses, phone numbers, or other sensitive data.
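Putting those points together, a robots.txt along these lines would block the sensitive and duplicate areas while leaving the main content crawlable.  The directory names here are only examples – use the ones that actually exist on your own site:

User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /cart/
Disallow: /scripts/
Disallow: /print/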

Here’s what you should not do with robots.txt:

* Don’t use comments in your robots.txt file.
* Don’t list all your files in the robots.txt file. Listing the files allows people to find files that you don’t want them to find.
* There’s no “/allow” command in the robots.txt file, so there’s no need to add it to the robots.txt file.

By taking a good look at your Web site’s robots.txt file and making sure that the syntax is set up correctly, you’ll avoid search engine ranking problems.  By disallowing the search engines from indexing duplicate content on your Web site, you can potentially overcome duplicate content issues that might hurt your search engine rankings.

Test a robots.txt file

Google provides a facility as part of their Webmaster Tools system to enable you to test a robots.txt file.

Test a site’s robots.txt file:

On the Webmaster Tools Home page, click the site you want.
Under Health, click Blocked URLs.
If it’s not already selected, click the Test robots.txt tab.
Copy the content of your robots.txt file, and paste it into the first box.
In the URLs box, list the URLs of the site that you want to test against.
In the User-agents list, select the user-agents you want.

Any changes you make in this tool will not be saved. To save any changes, you’ll need to copy the contents and paste them into your robots.txt file.
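If you would rather check a rule outside of Webmaster Tools, Python’s built-in urllib.robotparser module can parse a robots.txt file and tell you whether a given URL would be blocked.  A minimal sketch, where the domain and paths are just placeholders:

from urllib.robotparser import RobotFileParser

# Example robots.txt content - paste in your own file here.
robots_txt = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# can_fetch() reports whether the given user-agent may crawl the URL.
print(rp.can_fetch("*", "http://www.domain.com/index.html"))    # True
print(rp.can_fetch("*", "http://www.domain.com/cgi-bin/form"))  # False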


How to deal with duplicate content on your web site

Duplicate content within one website

This is often unintentional and can be the result of sites having pages for similar products where the content has been only slightly changed, or because landing pages have been created for PPC campaigns.

In this case, Google recommends that webmasters include the preferred version of the URL in their sitemap file, which will help the search engine’s crawlers find the best version.
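For example, if two near-identical pages exist, you would list only the preferred URL in the sitemap, and you can also point the alternate pages at it with a canonical tag in their page head (the URL here is just a placeholder):

<link rel="canonical" href="http://www.domain.com/preferred-page/" />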

Duplicate content across domains

This refers to content identical to that on your website appearing on third party domains, often when sites use scrapers to copy your text and use it to push themselves up the rankings.

Google claims that it manages to determine the original source of the content “in most cases”, and that having your content copied shouldn’t impact on your search rankings.

Google offers the following tips if sites with scraped content are ranking higher than the original website:

• Make sure your site’s content is being crawled by Google.

• Check the Sitemap file to see if you made changes for the particular content which has been scraped.

• Make sure your site is in line with Google’s webmaster guidelines.


Getting more site hits

Here are a few basic points on getting more hits for your site.

* Make sure body text is optimised for keywords.

* Make sure title tags are well written and different for each page and reflect the page content.

* Submit your site to the search engines, although once it has been submitted, the search engines are very good at re-crawling at regular intervals.  One way of submitting to the search engines is to create an XML sitemap file of your site and submit that to them (a minimal sitemap is sketched after this list).

* Look for other sites where you can advertise your products, which may mean paying.

* Look at Google AdWords or other similar forms of advertising.

* Create a Google Analytics account to monitor traffic and keyword success.

* Provide content that will encourage people to return to your site, e.g. free stuff, regular updates of information.

* Provide a newsletter or other means of keeping in contact with people.

* Provide some kind of two-way interaction, e.g. a forum discussion system.
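As a minimal sketch of the XML sitemap file mentioned above, it only needs to list the pages you want crawled.  The URLs and date below are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.domain.com/</loc>
    <lastmod>2020-01-01</lastmod>
  </url>
  <url>
    <loc>http://www.domain.com/products/</loc>
  </url>
</urlset>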


Google Webmaster Tools

You can learn so much from Google Webmaster Tools, such as:

* Understanding how often GoogleBot visits your Web site.
* Finding errors on your site (404s, etc.).
* Content analysis (which will show you if you have duplicate title tags or meta description tags or content that isn’t indexable).
* Statistics, including top search queries, crawl stats, subscriber stats (if you publish an RSS feed of your content and people subscribe to these feeds using iGoogle, Google Reader, or Orkut).
* Viewing your site from the search engine’s perspective (What Googlebot Sees).
* Learning which pages of your site are indexed in Google and which other Web sites link to your site (Index Stats).

If you look at the “What GoogleBot Sees” report you can see the top 200 phrases that other Web sites have used when linking to your site, and a comparison of how the content on your Web site may or may not “jive” with those links. Obviously, you would prefer that the links pointing to your Web site are consistent with the content on your Web site.

For those who don’t already know, doing a “link:www.sitename.com” check on Google isn’t reflective of the entirety of links that Google actually knows about. Going into Google Webmaster Tools will give you much more accurate data.

Two important concepts result from Google Webmaster Tools:

1. You can find 404 errors on your Web site and you may have had other Web sites linking to these 404 pages. If you can 301 redirect those pages to the new location of the content that used to exist on this page (or perhaps to something closely related to the content that was on this page), you’re recovering a lost link.
2. You can view pages with external links by looking at the Links section. You may find some Web sites that you already have a personal relationship with that are linking to you. Perhaps they would be open to you suggesting that they change the anchor text of their link from “Company Name” to “Keyword Here.”


Using different currencies in PayPal

What currency should you sell your products in?

PayPal is able to handle quite a wide range of currencies, and you have to decide which currency you should sell your products in.

I guess the question really is “are you selling to the world or are you expecting payments from customers who are based in your locality?”  This will depend on the type of products you are selling.  Digital products have the advantage that you have no shipping to worry about, so you can choose whatever currency you want.

The only issue that I have discovered with selling in different currencies is to do with the setup of your associated PayPal account. If your ‘native’ PayPal currency is GBP and you want to sell in Euros, you have to tell PayPal to automatically accept ‘foreign’ currency transactions (the default is to ask).  If you do not do this and you receive a purchase through your shopping cart, you will receive an email saying something like “PayPal purchase verified and order is waiting to be processed” with body text of: “Unknown pending reason was received.”

Telling PayPal to accept all currencies and convert them to GBP resolves the issue.