Google introduces AdWords Express

Google AdWords Express is a cut-down version of the Google AdWords system.  The Express version has fewer options, which can be an advantage for those who run a small business and don’t want to get into the complexities of setting individual parameters for keywords, or get involved in the many other options that are available.

The user has to have a Google+ Local page, and then they can create their campaign.  There is no concept of bidding on keywords; you just set a budget.  Google AdWords Express was introduced to improve the local listing experience.


Links to your site using Google Webmaster Tools

If you want to find out which web sites link to your site, one way is to use Google Webmaster Tools (http://www.google.com/webmasters/).

The site has to be verified in Webmaster Tools first, and then:

On the Webmaster Tools Home page, click the site you want.
On the left-hand menu, click Traffic, and then click Links to Your Site.


The robots.txt file

When it comes to SEO, most people understand that a Web site must have content, “search engine friendly” site architecture/HTML, and metadata such as title tags, image alt tags and so on.

However, some web sites totally disregard the robots.txt file. When optimizing a Web site, don’t disregard the power of this little text file.

What is a Robots.txt File?

Simply put, if you go to www.domain.com/robots.txt, you should see a list of directories of the Web site that the site owner is asking the search engines to “skip” (or “disallow”). However, if you’re not careful when editing a robots.txt file, you could be putting information in your robots.txt file that could really hurt your business.

There’s tons of information about the robots.txt file available at the Web Robots Pages, including the proper usage of the disallow feature, and blocking “bad bots” from indexing your Web site.

The general rule of thumb is to make sure a robots.txt file exists at the root of your domain (e.g., www.domain.com/robots.txt). To exclude all robots from indexing part of your Web site, your robots.txt file would look something like this:

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/

The above syntax would tell all robots not to index the /cgi-bin/, the /tmp/, and the /junk/ directories on your Web site.

It is also possible to cause problems for your site optimisation with the robots.txt file.  For instance, if you put “Disallow: /” under “User-agent: *” in your robots.txt file, you are telling the search engines not to crawl any part of the web site, giving you no web presence – not what you want.

Another point to watch out for: if you modify your robots.txt file to disallow old legacy pages and directories, you should really set up a 301 permanent redirect to pass the value from the old Web pages to the new Web pages.
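If you are wondering what a 301 permanent redirect actually involves, here is a rough Python sketch of a server answering requests for retired pages with a 301 pointing at their replacements. The old and new paths are invented for illustration; on a real site you would normally configure redirects in your web server or CMS rather than with a script like this.

# Minimal sketch of issuing 301 permanent redirects for retired pages.
# The old -> new mapping below is hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer

REDIRECTS = {
    "/old-products.html": "/products/",
    "/2009/price-list.html": "/prices/",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            # 301 tells the search engines the move is permanent, so the
            # value of the old page should pass to the new location.
            self.send_response(301)
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), RedirectHandler).serve_forever()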

Robots.txt Dos and Don’ts

For SEO purposes, there are many good reasons to stop the search engines from indexing certain directories on a Web site while allowing others.

Here’s what you should do with robots.txt:

* Take a look at all of the directories in your Web site. Most likely, there are directories that you’d want to disallow the search engines from indexing, including directories like /cgi-bin/, /wp-admin/, /cart/, /scripts/, and others that might include sensitive data (there’s an example robots.txt after this list).
* Stop the search engines from indexing certain directories of your site that might include duplicate content. For example, some Web sites have “print versions” of Web pages and articles that allow visitors to print them easily. You should only allow the search engines to index one version of your content.
* Make sure that nothing stops the search engines from indexing the main content of your Web site.
* Look for certain files on your site that you might want to disallow the search engines from indexing, such as certain scripts, or files that might contain e-mail addresses, phone numbers, or other sensitive data.
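Putting these points together, a robots.txt file following this advice might look something like the example below. The directory names are only illustrations; use the ones that actually exist on your site:

User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /cart/
Disallow: /scripts/
Disallow: /print/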

Here’s what you should not do with robots.txt:

* Don’t use comments in your robots.txt file.
* Don’t list all your files in the robots.txt file. Listing the files allows people to find files that you don’t want them to find.
* The original robots.txt standard has no “Allow” command (anything you don’t disallow is allowed by default), so there’s no need to try to add one to the robots.txt file.

By taking a good look at your Web site’s robots.txt file and making sure that the syntax is set up correctly, you’ll avoid search engine ranking problems.  By stopping the search engines from indexing duplicate content on your Web site, you can potentially overcome duplicate content issues that might hurt your search engine rankings.

Test a robots.txt file

Google provides a facility as part of their Webmaster Tools system that enables you to test a robots.txt file.

Test a site’s robots.txt file:

On the Webmaster Tools Home page, click the site you want.
Under Health, click Blocked URLs.
If it’s not already selected, click the Test robots.txt tab.
Copy the content of your robots.txt file, and paste it into the first box.
In the URLs box, list the URLs to test against.
In the User-agents list, select the user-agents you want.

Any changes you make in this tool will not be saved. To save any changes, you’ll need to copy the contents and paste them into your robots.txt file.
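If you would rather sanity-check a robots.txt file locally, Python’s standard library includes a parser for the same rules. This is only a rough sketch, using the example rules from earlier in this post:

# Rough local check of robots.txt rules using Python's standard library.
# The rules and URLs are just the examples used in this post.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

for url in ("http://www.domain.com/cgi-bin/form.cgi",
            "http://www.domain.com/products/widget.html"):
    # can_fetch() answers the same question as the Test robots.txt tab:
    # is this user-agent allowed to crawl this URL under these rules?
    print(url, "->", "allowed" if rp.can_fetch("*", url) else "blocked")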


How to deal with duplicate content on your web site

Duplicate content within one website

This is often unintentional and can be the result of sites having pages for similar products where the content has been only slightly changed, or because landing pages have been created for PPC campaigns.

In this case, Google recommends that webmasters include the preferred version of each URL in their sitemap file, which will help the search engine’s crawlers find the best version.
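As a rough illustration, here is a small Python sketch that writes a sitemap listing only the preferred version of each URL. The URLs, and the decision about which version is “preferred”, are made up for the example:

# Sketch: write a sitemap.xml containing only the preferred version of
# each page, so crawlers know which duplicate to treat as the best one.
# The URLs below are hypothetical.
from xml.etree import ElementTree as ET

PREFERRED_URLS = [
    "http://www.domain.com/widgets/",        # rather than /widgets/print/
    "http://www.domain.com/widgets/blue/",   # rather than a PPC landing copy
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for address in PREFERRED_URLS:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = address

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)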

Duplicate content across domains

This refers to content identical to that on your website appearing on third party domains, often when sites use scrapers to copy your text and use it to push themselves up the rankings.

Google claims that it manages to determine the original source of the content “in most cases”, and that having your content copied shouldn’t impact on your search rankings.

Google offers the following tips if sites with scraped content are ranking higher than the original website:

* Make sure your site’s content is being crawled by Google.

* Check your sitemap file to see whether you have made changes for the particular content that has been scraped.

* Make sure your site is in line with Google’s webmaster guidelines.


Getting more site hits

Here are a few basic points on getting more hits for your site.

* Make sure body text is optimised for keywords.

* Make sure title tags are well written and different for each page and reflect the page content.

* Submit your site to the search engines, although once it has been submitted, the search engines are very good at re-crawling at regular intervals.  One way of submitting to the search engines is to create an XML sitemap of your site and submit that to them.

* Look for other sites where you can advertise your products which may mean paying.

* Look at Google AdWords or other similar forms of advertising.

* Create a Google Analytics account to monitor traffic and keyword success.

* Provide content that will encourage people to return to your site, e.g. free stuff, regular updates of information.

* Provide a newsletter or other means of keeping in contact with people.

* Provide some kind of two-way interaction, e.g. a forum discussion system.


Google Webmaster Tools

You can learn so much from Google Webmaster Tools, such as:

* Understanding how often GoogleBot visits your Web site.
* Finding errors on your site (404s, etc.).
* Content analysis (which will show you if you have duplicate title tags or meta description tags or content that isn’t indexable).
* Statistics, including top search queries, crawl stats, subscriber stats (if you publish an RSS feed of your content and people subscribe to these feeds using iGoogle, Google Reader, or Orkut).
* Viewing your site from the search engine’s perspective (What Googlebot Sees).
* Learning which pages of your site are indexed in Google and which other Web sites link to your site (Index Stats).

If you look at the “What Googlebot Sees” report you can see the top 200 phrases that other Web sites have used when linking to your site, and a comparison of how well the content on your Web site matches those links. Obviously, you would prefer that the links pointing to your Web site are consistent with the content on your Web site.

For those who don’t already know, doing a “link:www.sitename.com” check on Google isn’t reflective of the entirety of links that Google actually knows about. Going into Google Webmaster Tools will give you much more accurate data.

Two important opportunities come out of Google Webmaster Tools:

1. You can find 404 errors on your Web site, and other Web sites may have been linking to these 404 pages. If you can 301 redirect those pages to the new location of the content that used to exist on each page (or perhaps to something closely related to it), you’re recovering a lost link (see the sketch after this list).
2. You can view pages with external links by looking at the Links section. You may find some Web sites that you already have a personal relationship with that are linking to you. Perhaps they would be open to you suggesting that they change the anchor text of their link from “Company Name” to “Keyword Here.”
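Once the redirects from point 1 are in place, a quick script can confirm that each old URL really does answer with a 301 pointing at its replacement. The sketch below uses the third-party requests library, and the URLs are made up:

# Sketch: confirm that old 404 pages now 301-redirect to their new homes.
# Uses the third-party "requests" library; the URLs are hypothetical.
import requests

REDIRECTS = {
    "http://www.domain.com/old-page.html": "http://www.domain.com/new-page/",
    "http://www.domain.com/2008/widget-guide.html": "http://www.domain.com/guides/widgets/",
}

for old, expected in REDIRECTS.items():
    response = requests.head(old, allow_redirects=False, timeout=10)
    location = response.headers.get("Location")
    status = "OK" if response.status_code == 301 and location == expected else "CHECK"
    print(status, old, "->", response.status_code, location)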


Using different currencies in PayPal

What currency should you sell your products in?

PayPal is able to handle quite a wide range of currencies, and you have to decide which currency you should sell your products in.

I guess the question really is “are you selling to the world, or are you expecting payments from customers who are based in your locality?”  This will depend on the type of products you are selling.  Digital products have the advantage that you have no shipping to worry about, so you can choose whatever currency you want.

The only issue that I have discovered with selling in different currencies is to do with the setup of your associated PayPal account. If your ‘native’ PayPal currency is GBP and you want to sell in Euros, you have to tell PayPal to automatically accept ‘foreign’ currency transactions (the default is to ask).  If you do not do this and you receive a purchase through your shopping cart, you will receive an email saying something like “PayPal purchase verified and order is waiting to be processed” with body text of: “Unknown pending reason was received.”
 
Telling PayPal to accept all currencies and convert them to GBP resolves the issue.

WordPress plugins

The following is a list of some of the more interesting WordPress plugins.

* http://akismet.com/ Akismet – Kills spam comments, dead.

* http://www.1pixelout.net/code/audio-player-wordpress-plugin/ Audio player – An easy to embed audio player (possibly for use with podcasts)

* http://semperfiwebdesign.com/portfolio/wordpress/wordpress-plugins/all-in-one-seo-pack/ All in One SEO Pack – All your search engine optimization needs in one package

* http://jamesmckay.net/code/comment-timeout/ Comment Timeout – Stop getting comments on posts older than X amount of days.

* http://www.arnebrachhold.de/projects/wordpress-plugins/google-xml-sitemaps-generator/ Google XML Sitemaps – Easily create a Google-ready sitemap for your blog.

* http://www.smackfoo.com/plugins/sig2feed/ RSS Feed Signature – Add an extra line to your RSS feed (great for ads or plugs)

* http://txfx.net/code/wordpress/subscribe-to-comments/ Subscribe To Comments – Allow your users to get updates on when more comments are left on a particular story.
