Sunday, December 12, 2010

Have you heard of a '404 error' before?

If you visit a page of a website that has been moved, deleted or renamed, or if you typed part of the address in wrong yourself, or even clicked on a broken link, you'll see an error telling you that there's no page at that address. The technical name for this is a '404' error - Page Not Found - and it's very important to handle these correctly.

When your website doesn't have its own custom 404 error page, visitors will see the hosting company's default error screen. This isn't all that useful, creates a sense of frustration, and in many cases becomes the point at which the visitor abandons your website. Once we've driven traffic to a website, we certainly don't want to see users leave without spending time on your site!

It's actually possible to customise the 404 error page that visitors will see on your website. This allows you to make the page look exactly like any other page of your website, maintaining your branding and professional appearance along with access to your site's navigation menu, while displaying a clear message telling visitors that the page they were looking for has not been found. A simple example is: "Sorry, it looks like we couldn't find the page you were looking for". I also recommend including a simple on-page site map or search box, so that users can quickly locate the area or page of the website they want and you can direct them on their way. Talk to your website designer about any options they may be able to provide in this regard.
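The exact mechanism depends on your hosting. On Apache-style hosting, for example, a custom 404 page is often wired up with a single directive in a .htaccess file (a minimal sketch; the /404.html path is a placeholder for wherever your designer saves the custom page):

```apache
# Serve the branded "page not found" template for any missing URL.
# /404.html is a placeholder path - use the location of your own custom page.
ErrorDocument 404 /404.html
```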

Remember that this is the first page a user may see of your website and it’s an error page. They might assume the entire site is down, where actually the only problem is that a page has been renamed.

Some websites even use the 404 page as another place to push their products onto visitors. “We couldn’t find the page you were looking for… so why not have a look at these other items instead…”

I've even seen a few funny customised 404 error messages that companies have used to generate 'viral' marketing and encourage discussion about their company online.

Every website hosting operating system and development application has a different manner in which customised 404 error pages can be implemented, and some are definitely more complex than others. If you think having one sounds like a good idea, then I encourage you to shoot through an email to your website designer to see if they can offer you this feature.

If you want any further details on this process and how to implement it, or have suggestions, please let me know.

Website Validation Check... is there a noise coming from your engine?

If you are a business website owner and would be interested in receiving a detailed website validation check of the front page of your website, drop me an email.

A bit like a car that may gradually, or even suddenly, start to make an unusual noise from under the hood, you have choices...
- keep driving and hope for the best
- pop the hood and take a look around yourself
- or head over to your trusted mechanic and get it sorted

We regularly provide ongoing reports and expand upon our initial website SEO audit, where we outline any concerns that may exist under the hood of your website - particularly when it comes to performance, both for the user experience and for the interaction of search engines across your website.

In my role performing reviews across websites, I continue to notice that many clients have invalid websites that do not conform to web standards or meet the necessary quality checks.

The following short and simple reference explains why it is beneficial to validate your website, and answers some of the common misconceptions around this:

Having a website that has errors or even warnings can be hazardous to its performance.

Just like when you drive behind another car that has a broken brake light, you, the website owner, are not always aware of the problem because the site seems to just work for you... but this may not be the same experience for everyone else on the internet. This is why website validation checks exist: to help find problems and provide an opportunity to fix them once and for all.

I typically find that if a problem exists on the front page of your website, it carries through the rest of the website. This is NOT the time for you to learn HTML, so don't panic! The validation reports that we provide include a summary of concerns, outlining what is not 100% correct for your website. The top section shows the code of your website, with comments amongst the code wherever problems exist.

We can then work with your in-house design and development team, or external web agency, on resolving these problems. Our aim is a perfect health score on the test, but even a reduction in the number of errors and warnings I will consider a win! And you will have a win too: a website without errors makes for a better user experience and a much happier search bot, with more opportunity of actually crawling through your entire website, finding all your great content and getting it included within the Search Engine Results.

If you don't currently have a website designer available to assist you, or need a helping hand, please let us know, as our in-house design and development team Star3Media can provide support.

It really is the little things that can make your website stand out...

Whilst performing a lot of website reviews recently, I have noticed that many website owners are not taking advantage of a simple and important web design feature known as the favicon. The favicon (abbreviation for favourite icon), is a visual representation of your business, and is normally the logo of your business.

So where is the favicon seen?

When users bookmark or save your website to their favourites, the favicon image appears next to your link in the user's browser, alongside the stored title of the website. If they put your website into the links bar within the browser, or drag the favourite to their desktop for easy access, this favicon is used as the icon representing your business website. Quite simply, every time users go to your website they will see your logo in the URL bar of the browser where they type the web address in - AND when they use tabbed browsing, the favicon sits next to the website name on each tab.

So as you can see, this simple, effective visual reminder really helps your website stand out from the crowd, and should encourage users to come back and revisit your site.

Some search engines actually check to see if your website has a favicon, and may use this as part of their ranking algorithm as they determine how professional your website is, which could help decide whether your website should appear above another. And when they check and don't find one in the default location, it may also trigger unnecessary 404 errors (Page Not Found), which we would like to avoid.

Now that you can see how important it is to have this little thing in place, how do you get one?

If you already have a website designer, please contact them directly and request that they implement this for your website: creating the icon logo, placing it on the server in the default location, and ensuring each page of your site has the necessary meta tag pointing to it.
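For reference, the wiring usually amounts to a small icon file plus one line in the head of each page (a sketch; favicon.ico in the site root is the conventional default location that browsers and search engines check):

```html
<!-- favicon.ico in the site root is the default location browsers look for.
     This link tag makes the icon explicit on every page. -->
<link rel="shortcut icon" href="/favicon.ico" type="image/x-icon">
```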

If you need assistance getting this done, let me know, as our internal design/development team Star3Media can assist. They can have your own customised favicon created and implemented properly across your site.

Saturday, December 11, 2010

Competitive Intelligence For Brand Protection

Protecting your brand online is an ongoing activity. It requires hard work and due diligence. One area of concern is when others bid on your brand, or use your brand name in their ads. The current understanding of Google's policy is that it's OK to bid on a competitor's brand name, but it is not OK to use it in your ad. Yahoo's policy appears to be a bit stricter, and allows for the possibility of prohibiting bidding on a trademarked term.
Finding out who is bidding on your brand may not be hard. You can go to the search engine, type your brand name in, and then see who shows up. This, of course, assumes that they are not geo-targeting their bid to only show up in certain areas (and perhaps not the area you are checking from). This is one of the areas where tools like Hitwise and comScore are useful, as they avoid these types of issues.
It's also interesting to see just how important brand name bidding is to a competitor's business:
You can see from this table that Smart Fares receives 4.78% of its traffic from the Orbitz brand name. A deeper look shows that brand name bidding is a fundamental part of their business model:
Acting on Infringement
Direct bidding on your brand is not really something you can prevent. However, if your competitor takes the next step and starts using your brand in their ad text, that's another matter. The search engines do provide a way to address that. You can use the following links to find the Google Trademark Complaint Policy and the Yahoo Trademark Complaint Policy. Google also provides a Trademark Complaint Form.
Most of the information you need to provide in the Google complaint form is pretty basic. However, if your trademark is registered, you will need to have the application or registration number which they can use for verification. You can also apply even if all you are doing is claiming use rights. However, this is a bit more difficult to prove, so be prepared for a longer wait on resolution.
Trademarks can cover a word, a design, or both; these are equally protectable. The Google form accommodates up to 10 trademarks, but Google also provides a way to submit more than 10.
Yahoo does not provide a form, but does provide an email address or a mailing address for sending in your complaint. They request information on the search term for which the complaint is related, the trademark info (much the same as Google), registration information if your trademark is registered, any evidence of consumer confusion (hmm - that Hitwise data looks pretty good for providing that), and any information on communications you have had with the advertiser.
Perhaps this goes without saying, but don't go down this path unless you really have a legitimate trademark to protect. Also, this is not an overnight process. It can take several months for you to get resolution. Just be patient. It is well worth it.

Wednesday, November 24, 2010

Introducing you and your website to RSS

RSS, which stands for Really Simple Syndication, is a great way for your online business to keep in touch with your website visitors once they have been to your website, and to keep your website 'sticky'... meaning that once you have had a visitor to your site, you can encourage them to receive ongoing information about your company - news, new deals or services - and keep your clients engaged.

It is actually very easy to implement RSS on your site! By including the following single line of HTML code in the head section of your web pages, you'll be all set:
<link rel="alternate" type="application/rss+xml" title="RSS" href="ENTER_RSS_URL">

By doing this, visitors to your website will see the RSS icon in their web browser glow orange and become active - meaning they can subscribe directly to your feed. It's like bookmarking your specials or info page, so that they continue to receive this information from you each time you post an RSS update.
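The feed itself that the href points to is just an XML file. A minimal sketch of an RSS 2.0 feed (every name and URL here is a placeholder, with example.com standing in for your own domain):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Business News</title>
    <link>http://www.example.com/</link>
    <description>Specials, news and new services</description>
    <item>
      <title>New summer special</title>
      <link>http://www.example.com/specials/summer.html</link>
      <description>Details of the offer go here.</description>
    </item>
  </channel>
</rss>
```

Each new `<item>` you add to the file is what subscribers (and search engines) see as a new update.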

So RSS is a bit like having an email newsletter subscription service for your website, but only better.

RSS was created a few years back as an alternative to email delivery, to get around some of the problems related to email marketing. RSS is automatically opt-in, as the user chooses to subscribe. There are no spam filters blocking your message, and none of the delivery delays that can happen with email - RSS is immediate, so it's a great way to get speed to market for your business information or new deals.

A lot of people actually have RSS available in their email clients, like Outlook 2007 and above, so they can subscribe and follow your business information from the comfort of their email application. Once you have RSS set up, you can also provide your clients with a long list of other marketing tools that can receive and display RSS. You could create downloadable Google Gadgets, Yahoo! Widgets, iPhone and iPad applications, even screen savers that receive and show your company information via RSS. These are great brand-building and marketing tools to help disseminate a brand awareness campaign and attract more eyeballs to your products and services. If you would like some further information about any of these, just drop me an email.

So why am I harping on about RSS for your website?

Quite simply, I'm all about driving more traffic to your online business, retaining client interaction, and pointing out ways to create better leverage with search engines. We're here to work for you and improve your online business.

RSS feeds can be crawled by search engines, and can even be submitted to search engines, along with a long list of RSS aggregator websites and online directories. This generates more exposure and better interaction with search engines, encouraging them to find all your content - particularly any new content you create - and get it added into their search indexes so your site is found more often.

And here is a simple tip to actually manage and update your website's RSS feed...

Use Twitter! Yup, sign up for a free Twitter account for your business, and Twitter then provides an RSS feed of your tweets. This RSS address can be put into that one line of code I provided at the top of this email, and you are away. This means you are using Twitter to get your business information and website links out and seen there, while having the same feed pulled into your web page at the same time. A bit like two for the price of... nothing!

Did you know that Twitter has a content relationship deal with Google? Meaning that Google trawls through all the data generated on Twitter. So when you tweet a new deal, a special, a new page added to your site, or information about your business, it will be found by Google and added very quickly into Google's own search index. This is one of the key reasons I recommend the use of Twitter for your business!

Okay, so a bit of information for you to digest this morning. I hope it makes sense. Of course if you have any further questions shoot me through an email.

Saturday, November 20, 2010

Careful with your choice of Domain Name

Such a simple thing, and yet one of the most important and fundamental decisions a business will make in their initial foray into the online space is the selection of an appropriate domain name.

The search engines look upon this as one of the most important factors in determining what your website is going to be about, and hence in presenting your site in the Search Engine Result Pages relevant to audience queries.

Unfortunately, since the internet has now been around for so long, there is relatively limited choice of good domain names still available. Be careful with the choice of top-level domain and ensure it is appropriate to your location (a localised search factor), e.g. for Australia, rather than just going for a domain extension because it is available... as this may lead to consumer confusion, since consumers expect the 'norm'.

Keyword-rich domain names that match your business segment are the ideal choice. Hyphen-separated domains are okay, but they can add confusion when consumers type in the name, so they aren't an ideal first choice - though a good second when nothing else exists.

But beware the acronym - the abbreviated version of your business name based purely on its initials - particularly if this is not a brand naming convention that users are already aware of. Firstly, it is almost certainly not keyword rich, so it's a step backwards from the very beginning. I can understand the use if you have a very long business name. But my concern comes from what is sometimes overlooked: what might that acronym or abbreviation mean in another context, or even another country? Do your research - search particularly across the social media sphere just in case your particular acronym has sinister connections and associations already established, which may make you reconsider your choice... Research really is an important element of your initial steps into the online space, so step carefully and wisely.

Sunday, October 31, 2010

Foursquare as a geo-marketing tool for local businesses


Foursquare on your phone gives you and your friends new ways of exploring your city. Earn points and unlock badges for discovering new things. Users check into local places like cafes, restaurants and hotels on Foursquare via their smartphones, provide tips, and share their location with friends across Foursquare as well as Twitter and Facebook. Users with the most check-ins to a venue become the "Mayor", and can receive loyalty rewards from the venue.

Foursquare co-founders Dennis Crowley and Naveen Selvadurai met in 2007 while working in the same office space (at different companies) in New York City. Foursquare was launched in Austin, Texas in March 2009. As of August 2010, Foursquare had close to 3 million users worldwide.

For local businesses it is an opportunity to communicate with your customers, provide coupons as incentives, and promote your business in the social media space.

Saturday, October 23, 2010

Online Notability Guidelines

Wikipedia remains one of the most reputed sources of editable content in the world.
Within hours of posting content on Wikipedia, it can be indexed in Google and Yahoo!
However, the editorial guidelines are very strict and it is critical to understand that editors police postings. It is vital to have “reliable secondary sources” as references in any posting to ensure it survives the human editorial test.
The Notability Guidelines are detailed below.

Sunday, October 17, 2010

List of Web Server Error Messages

Errors on the internet - those annoying error messages - occur quite frequently and can be quite frustrating, especially if you do not know the difference between a 404 error and a 502 error. Many times they have more to do with the web server you're trying to access than with something being wrong with your computer. Here is a list of error messages you might encounter while surfing the web, with their respective meanings, to help you figure out just what the problem is:
200 If a page is missing, it's replaced with the custom error page (served with a success status).
302 If the page is missing, it's replaced with a temporary redirect to a custom error page.
301 Redirects errors permanently to either a custom error page or some other page in the site (i.e. sitemap, homepage or best guess).
400 Bad File Request, usually means the syntax used in the URL is incorrect (e.g. an uppercase letter should be lowercase; wrong punctuation marks).
401 Unauthorized, the server is looking for some encryption key from the client and is not getting it, or the wrong password may have been entered. Try again, paying close attention to case sensitivity.
403 Forbidden/Access Denied, similar to 401; special permission is needed to access the site - a password and/or username if it is a registration issue. Other times you may not have the proper permissions set up on the server, or the site's administrator just doesn't want you to be able to access the site.
404 File Not Found, the server cannot find the file you requested. The file has either been moved or deleted, or you entered the wrong URL or document name. Look at the URL: if a word looks misspelled, correct it and try again. If that doesn't work, backtrack by deleting information between each slash until you come to a page on that site that isn't a 404. From there you may be able to find the page you're looking for.
408 Request Timeout, the client stopped the request before the server finished retrieving it. A user will either hit the stop button, close the browser, or click on a link before the page loads. Usually occurs when servers are slow or file sizes are large.
500 Internal Error, the server couldn't retrieve the HTML document because of server-configuration problems. Contact the site administrator.
501 Not Implemented, the web server doesn't support a requested feature.
502 Service Temporarily Overloaded, server congestion; too many connections; high traffic. Keep trying until the page loads.
503 Service Unavailable, the server is busy, the site may have moved, or you lost your internet connection.
Connection Refused by Host, either you do not have permission to access the site or your password is incorrect.
File Contains No Data, the page is there but is not showing anything. The error occurs in the document, attributed to bad table formatting or stripped header information.
Bad File Request, your browser may not support the form or other coding you're trying to access.
Failed DNS Lookup, the Domain Name Server can't translate your domain request into a valid internet address. The server may be busy or down, or an incorrect URL was entered.
Host Unavailable, the host server is down. Hit reload or go to the site later.
Unable to Locate Host, the host server is down, your internet connection is lost, or the URL was typed incorrectly.
Network Connection Refused by the Server, the web server is busy.
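As a rule of thumb when scanning the numeric codes above, the first digit tells you who is responsible: 3xx is a redirect, 4xx is a client-side problem, 5xx is a server-side problem. A small sketch of that grouping (an illustrative helper, not part of any standard library):

```python
# Classify an HTTP status code by its standard class.
# 2xx = success, 3xx = redirect, 4xx = client error, 5xx = server error.
def status_class(code: int) -> str:
    if 200 <= code < 300:
        return "success"
    if 300 <= code < 400:
        return "redirect"
    if 400 <= code < 500:
        return "client error"
    if 500 <= code < 600:
        return "server error"
    return "unknown"

print(status_class(404))  # client error: the page, not the server, is the problem
print(status_class(502))  # server error: try again later
```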

Saturday, October 16, 2010

Sorting out Canonical Issues for SEO

Information on how to use the canonical link reference and 301 (permanent) redirects to fix duplicate content issues in SEO.

I am still surprised at the number of websites I come across daily with this problem. The following information is supplied to look at resolving this issue.


Set the canonical source reference - indicating the primary page where ranking is to be delivered if you have duplicate content - by including the following meta tag. The tag tells the search engines to treat the page they are on as a copy of the listed domain/page, so that all of the link and content metrics the engines apply should technically flow back to that URL (example.com is used below as a placeholder for your own preferred URL):

<link rel='canonical' href='http://www.example.com/page/' />

Note: 301s carry cross-domain functionality, meaning you can redirect a page on one domain to another domain and carry over those search engine metrics. This is NOT THE CASE with the canonical URL tag, which operates exclusively on a single root domain (though it will carry over across subfolders and subdomains).

The “301 Permanent Redirect” is the most efficient and search engine friendly method for redirecting websites. You can use it in several situations, including:

    * to redirect an old website to a new address
    * to set up several domains pointing to one website
    * to enforce only one version of your website (www. or no-www)
    * to harmonize a URL structure change


ASP Sites – 301 Redirect

<%@ Language=VBScript %>
<%
' example.com below is a placeholder for your new destination address
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.example.com/"
%>


ASP Single Page 301 Redirect (a variation on the above)

This redirect method is used with the Active Server Pages platform (example.com is a placeholder destination):

<%
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.example.com/new-page.asp"
%>


ASP Canonical Redirect

The Canonical Redirect with ASP must be located in a script that is executed in every page on the server before the page content starts.

If InStr(Request.ServerVariables("SERVER_NAME"), "www") = 0 Then
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", "http://www." _
        & Request.ServerVariables("HTTP_HOST") _
        & Request.ServerVariables("SCRIPT_NAME")
End If


PHP Single Page 301 Redirect

In order to redirect a static page to a new address, simply enter the code below at the top of the old page's PHP file (example.com is a placeholder for the new destination):

<?php
header("HTTP/1.1 301 Moved Permanently");
header("Location: http://www.example.com/new-page/");
exit;
?>


PHP Canonical Redirect

The canonical 301 redirect will add (or remove) the www. prefix on every page of your domain. The code below redirects visitors from the non-www version to the www version:

<?php
if (substr($_SERVER['HTTP_HOST'], 0, 3) != 'www') {
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: http://www.' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI']);
    exit;
}
?>


PHP Sites – 301 Redirect: Add to .htaccess file (example.com is a placeholder for your own domain)

Options +FollowSymlinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]
RewriteRule ^index\.html$ / [R=301,L]


Apache .htaccess Single Page Redirect for Linux Servers

In order to use this method you will need to create a file named .htaccess (not supported by Windows-based hosting) and place it in the root directory of your website, then just add the code below to the file.

Redirect 301 /old/oldpage.htm /new/


Apache .htaccess Canonical Redirect for Linux Servers

Follow the same steps as before but insert the code below instead (it will redirect all visitors accessing the non-www address to the www address; example.com is a placeholder for your own domain):

Options +FollowSymlinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,NC]


Wednesday, September 29, 2010

Search Group - Perth SEO company

Hi all,

Just an update to let you all know that I am now working as a Search Engine Optimization specialist with Perth SEO company Search Group.

Search Group is a leading Perth SEO Company specialising in website search engine optimisation (SEO), Search Engine Marketing (SEM), and online marketing activities with a large and growing portfolio of Australian clients.

Address: Suite 90, City West Business Centre, 102 Railway Street, West Perth, WA 6005

T: 08 9278 8899 
F: 08 9278 8890

So if you would like a free introductory SEO training session, website appraisal and review or just simply to discuss your search optimisation requirements to improve the visibility of your website on Google, Yahoo! and Bing then please drop me an email.

Monday, September 27, 2010

404 Error Pages... have you customised yours?

I generally recommend that client websites include a customised 404 error page (the 'page not found' alert) for users who mistype an address or arrive via a broken link. Best practice indicates that a route should be provided back to the website home page or site map. The custom error page has become quite a fashion statement and a feature of discussion. Though you can set a meta refresh to redirect the user to the home page after a set period of time (say 20 seconds), I prefer to include a site map or site search feature on this page as a usability improvement: the page clearly indicates that the user has arrived here because the page they were trying to reach doesn't exist, and offers some options to help them get where they were going.
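If you do want the timed redirect, it amounts to one line in the error page's head (a sketch; the 20-second delay and the home page destination are just example values):

```html
<!-- After 20 seconds, send the visitor from the 404 page to the home page -->
<meta http-equiv="refresh" content="20; url=/">
```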

It is important to note the most common reasons that visitors to the website land on a 404 page:

• A mistyped website address or URL (or an out of date bookmark or favourite)
• A search engine link that is out of date
• An internal broken link that the webmaster or web developer is not aware of.

Saturday, September 25, 2010

Factors to consider for local search SEO - using GEO tags

This post combines information on what GEO tags are, how they can be implemented, and factors that may contribute in my opinion to improvements in local search for Search Engine Optimisation.

Firstly the Geo Tag Elements

<META NAME="geo.position" CONTENT="latitude; longitude">
<META NAME="geo.placename" CONTENT="Place Name">
<META NAME="geo.region" CONTENT="Country Subdivision Code">

<META NAME="geo.position" CONTENT="-31.9667;115.8167">
<META NAME="geo.region" CONTENT="AU-WA">
<META NAME="geo.placename" CONTENT="Perth">

Element Description

Longitude is conventionally given in degrees of arc relative to the Greenwich Meridian, a great circle passing through the poles and Greenwich in London, England. Longitude is either qualified as East or West, or given as a signed numeric value of degrees East. The geo.position tag uses a signed numeric value.

On a Mercator map, Longitude is plotted left to right as the X coordinate, while Latitude is plotted bottom to top as the Y coordinate, with (0,0) in the middle. Thus, positive values for Longitude correspond to locations East of Greenwich, e.g. in Asia. Locations West of Greenwich, e.g. in the United States, correspond to negative values of Longitude.
(Latitude;Longitude) in quadrants

Latitude is conventionally given in degrees of arc relative to the Equator. Latitude is either qualified as North or South, or given as a signed numeric value of degrees North. The geo.position tag uses a signed numeric value.

The geo.region tag is taken from a controlled list, and may be used for resource discovery. It may also be used as a bounding check to validate the geo.position tag.

For the geo.region tag, the ISO 3166-2 country subdivision codes are used. These codes are formed from the 2-character "domain-name" country codes (for example, as listed in iso3166-countrycodes.txt), together with a regional code. The GeoTag Generator includes a 2-stage script to pick appropriate codes.

For the USA and Canada, the subdivision code is the familiar 2-character state/province abbreviation. See for instance United States Postal Service, Official Abbreviations - States and Possessions and Canada Postal Guide (Province and Territory Symbols)

If the regional code is not known, the 2-character country code may be used instead.
ISO 3166 codes are reproduced with permission from ISO

Place Name:
The geo.placename tag is provided primarily for resource recognition; it is anticipated that this field will be harvested by automated agents and presented to the user in search engine results, in a similar manner to the description META tag. This field is free text, and would typically be used for city, county and state names. It could, however, be used for resource discovery, particularly if names from a controlled vocabulary such as the Getty Thesaurus of Geographic Names are used.

Required Accuracy of Position
The accuracy with which positions need to be determined depends largely on the character of the resource being described.

While a position given in a gazetteer or thesaurus such as the Getty Thesaurus of Geographic Names is convenient, and in many cases adequate, in other cases it is clearly not. For instance, it is probably sufficient for placing a branch office of a multinational corporation on a map of the world or a map of a country. It is, however, clearly insufficient for placing a public telephone, filling station or fast-food restaurant. In these cases, the position must be measured or researched to greater accuracy.

Accuracy of Elements
Properly expressing the accuracy of geographic positions is too complex for the simple geo.position META tag, yet some clue to the accuracy may be given by the number of digits in each element. One minute of arc of latitude corresponds to one nautical mile (1852 metres, or about 1.8 km).

Latitude   Accuracy   Miles   Km
49         1°         60      111
49.1       0° 6'      6       11
49.30      36"        0.6     1
49.320     3"         0.06    0.1
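The accuracy figures above follow directly from the nautical-mile relationship; a quick sketch of the arithmetic (an illustrative helper, not part of the geotag specification):

```python
# 1 minute of arc of latitude is approximately 1 nautical mile = 1852 m.
NAUTICAL_MILE_M = 1852.0

def latitude_accuracy_km(decimal_places: int) -> float:
    """Approximate positional accuracy implied by a latitude given
    to `decimal_places` decimal places of a degree."""
    step_degrees = 10 ** (-decimal_places)  # smallest representable step
    return step_degrees * 60 * NAUTICAL_MILE_M / 1000.0

print(round(latitude_accuracy_km(0)))   # whole degrees: ~111 km
print(round(latitude_accuracy_km(1)))   # one decimal place: ~11 km
```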

In a commercial GPS set, the standard accuracy is now better than 15 metres, since Selective Availability (SA) was turned off in May 2000. DGPS sets may allow accuracy better than this.

As currently defined, the geo.position tag describes a point, not a region. It is thus unsuitable for describing an extended area. What constitutes a region may depend on the scale of map or geographic search used; for instance, the country of Andorra (geo.position 42.5;1.5) may reasonably be represented by a point on a map of the world but not on a map of Andorra. It is the responsibility of the user to determine if a point representation is meaningful given the intended audience of the tagged document.

Future versions of geotags may incorporate a region element, as do other metadata standards.

Position Datum
In cases where the resource position is less accurate than a few kilometres, datum issues may be ignored. For accurate positions, however, the WGS-84 datum should be used. In North America, this corresponds to the NAD-83 datum. Many maps and charts still use the older NAD-27 datum, and there is a difference in coordinates of as much as a couple of hundred metres, depending on the exact position. Conversion software is available, and most recently issued topographic maps and charts will give the offsets.

Tag Placement
In accordance with the HTML 4.0 specification, META tags (including geo tags) should be placed in the HTML document header, between the <HEAD> and </HEAD> elements, for instance:
<html><head><title>My Document</title>
<meta name="geo.region" content="US-WA">
</head>
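To illustrate how an automated agent might harvest these tags (as anticipated for geo.placename above), here is a minimal sketch using Python's standard html.parser module; the GeoTagExtractor class and the sample page are my own illustration, not part of any specification:

```python
from html.parser import HTMLParser

class GeoTagExtractor(HTMLParser):
    """Collect geo.* META tags from a page, as a harvesting agent might."""
    def __init__(self):
        super().__init__()
        self.geo = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            name = d.get("name", "")
            if name.startswith("geo."):
                self.geo[name] = d.get("content", "")

page = ('<html><head><title>My Document</title>'
        '<meta name="geo.region" content="US-WA">'
        '<meta name="geo.placename" content="Seattle">'
        '</head><body></body></html>')

p = GeoTagExtractor()
p.feed(page)
print(p.geo)  # {'geo.region': 'US-WA', 'geo.placename': 'Seattle'}
```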

TGN etc. Cut/Paste
The Geo Tag Generator allows geographic positions from certain popular sites to be pasted in via the browser clipboard. Bold text is required. Examples:

TGN: Lat: 49 21 N Long: 123 05 W
Tiger: Scale: 1:218074 (Centered at Lat: 38.89000 Lon: -77.02000)
CPCGN:  Latitude - Longitude : 49° 16' 00" N - 122° 57' 00" W
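As a sketch of what such a cut/paste converter has to do, the hypothetical helper below converts the degrees-and-minutes format of the TGN example above into signed decimal degrees (north and east positive):

```python
import re

def dm_to_decimal(text):
    """Convert a 'degrees minutes hemisphere' string such as '49 21 N'
    (the TGN paste format) into signed decimal degrees."""
    m = re.match(r"(\d+)\s+(\d+)\s*([NSEW])", text.strip())
    deg, minutes, hemi = int(m.group(1)), int(m.group(2)), m.group(3)
    value = deg + minutes / 60.0
    return -value if hemi in "SW" else value  # south/west are negative

print(dm_to_decimal("49 21 N"))   # about 49.35
print(dm_to_decimal("123 05 W"))  # about -123.0833
```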

Check Map
The Geo Tag Generator incorporates an optional check map which may be used for a quick sanity check on geographic co-ordinates. Co-ordinates may also be checked against country and region codes, if given.

Pages which have included geotags may include a geotag icon to indicate that geographic search capability may be available. To use the icon, copy the image to your system (e.g. right-click "Save Image As") and include the following HTML code:

<a href="" target="_top">
<img src="geo2t.png" alt="Geo Tagged for Geographic Discovery"></a>

The tags describe the position of the resource described on the page, for instance a beach or restaurant, not the company hosting the page, the company managing the resource, or the server hosting the page (cf. RFC 1876)

The tags are described in terms of current HTML practice, which does not preclude them being represented in another manner such as RDF or XML.

The tags are intended for use by a wide base of authors who are probably not well versed in GIS or formal metadata techniques. They are not intended to supplant formal standards such as FGDC, GILS, TC211, Dublin Core etc., which address many more issues and should be used where applicable.

Hierarchical Position:
     World (facet) 
  ....  Oceania (continent) (P) 
  ........  Australia (nation) (P) 
  ............  Western Australia (state) (P) 
  ................  Perth (inhabited place) (P) 

Place Types:
 inhabited place (preferred, C)  ............  settled 1829
city (C)  ............  established 1856
regional capital (C) 
commercial center (C) 
financial center (C) 
transportation center (C) 
university center (C)

Unfortunately, search engines determine a website's location by the location of its web host (server).

So, if your focus is a specific locality, then it's vital that your site is recognised by search engines as being from that territory. Local websites are featured more prominently in local versions of the search engines, where web surfers are given the option to see only pages from their location, excluding foreign-based sites.

Hosting in Australia, up until recently, tended to be fairly expensive in comparison to overseas hosting in the US. For this reason many Australian-based sites found themselves running into problems when they tried to save costs by hosting their sites in the USA.

Also, don’t take it for granted that by hosting with a local Perth based company that their servers will also be based here in Perth! Many Perth hosting companies locate their servers either over East or overseas. When signing up for hosting contracts, if location is an issue for you, always check that the servers are located geographically where you would expect them to be.

There are several factors that I believe affect the location of a site in terms of search engines:
• The top level domain extension (e.g. .fr, .de)
• The location (IP address) of the website host
• The geographic location of the domain registrar
• The language that the site is written in
• The location of incoming links
• On-page factors (addresses, telephone numbers)
• Registering with Google Local

Obviously some of these factors hold more importance than others, some I believe are used only marginally, others perhaps not at all, or their effect is too minimal to test.

Using The Correct TLD (Top Level Domain)

This is your best case scenario. If you have a site that you're targeting to local Australian consumers, registering an Australian (.com.au) domain will pretty much guarantee that you'll be found in the Australian search results, even if you choose to host out of the country. If, however, you were unable to obtain your business name in the Australian domain space and went with a .com (maybe because it was cheaper!) or another variation, this may not be the case!
Website Host Location

There are of course many instances of websites that are targeted to a specific country but use a generic top level domain such as .com or .net. In these cases, simply making sure that your host's servers are geographically located in your marketplace should ensure that the site is recognised as being local.

Geographic Location Of The Domain Registrar

I feel this is a factor that is sometimes overlooked by many webmasters, but as Google has access to geographic information such as the location of the domain registrar, it would make sense for them to make use of this as well. This, along with other registrar information such as Whois data, could well be used as a 'tie-breaker' when country-specific TLDs are hosted elsewhere. For example, country-specific TLDs such as .fm, .cc and .tv are now being used because of the brandability of the domain extension. In cases like these, where the TLD extension indicates one location and the hosting location indicates another, it would be a logical step to make use of the information available from the domain registrar. Take this a step further by ensuring that your actual business name and address are accurately reflected in the domain registration process, and thereby in the Whois details for your site. This, I believe, is a factor now being used by Google Local in their lookup for local search.

Site Language

Obviously it makes sense to write your site in the language of the search visitors you're looking for; language may also be one of the factors a search engine uses to help determine the location of a site. It certainly isn't a defining factor, though, as it's relatively easy to find foreign-language sites within the AU-only search results.

Location Of Incoming Links

As above, the idea is that a search engine can use the location of incoming links to determine the site location. Again, I'm sceptical that this is any more than of marginal importance. I have seen lots of sites with low-quality link profiles consisting largely of overseas-located links, and of course news sites with very few links seem to have little problem getting geographically placed before backlinks have had a chance to develop.

Site Addresses/Telephone Numbers

It is simply good practice to have local contact details for local markets. There is also speculation that this may be used to place a site's location. Again, this is difficult to test, but I'm doubtful it would be anything more than of marginal importance. (Consider including both the local and the international prefix in your phone contacts, for example.)

Registering With Google Local

Again, hard to test, but it would make sense for Google to make as much use of all the information that was made available to them. Yahoo and MSN also have similar local services.

Monday, September 6, 2010

Robots.txt - informing search engines

The robots.txt file is stored in the root level directory of the website to inform search engines how to interact with the site: what to, and what not to, crawl and list in their index. The following example shows the format that is used.

The file must reside in the root directory of your web. The URL path (web address) of your robots.txt file should look like this: www.yoursite.com/robots.txt

User-agent: *
Sitemap: http://www.yoursite.com/sitemap.xml.gz
Disallow: /secure/

To exclude ALL robots from the server:

User-agent: *
Disallow: /

To exclude a single robot from parts of a server:

User-agent: NamedBot
Disallow: /private/
Disallow: /images-saved/
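The example files above can be sanity-checked with Python's standard urllib.robotparser module; the bot names and paths below are just the placeholders from the examples:

```python
from urllib import robotparser

# Combine the two example rule sets: one group for NamedBot,
# a catch-all group for every other robot.
rules = [
    "User-agent: NamedBot",
    "Disallow: /private/",
    "Disallow: /images-saved/",
    "",
    "User-agent: *",
    "Disallow: /secure/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("NamedBot", "/private/file.html"))      # False
print(rp.can_fetch("NamedBot", "/public/file.html"))       # True
print(rp.can_fetch("SomeOtherBot", "/secure/file.html"))   # False
```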

The Robot Tag in Source Code

Where a robots.txt file can’t be uploaded onto a website server, the following robot tags can be included on the individual html pages:
<META NAME="robots" CONTENT="index,follow">

Robots.txt resources:

Robots.txt File Generator

Analyze robots.txt

Improving on Robots Exclusion Protocol

About /robots.txt

Tip: the robots.txt file is not the place to include code comments, as these can sometimes be misinterpreted and cause problems for a search spider.

Thursday, September 2, 2010

Comparing 301 redirects and the canonical url tag

The Canonical URL tag attribute
<link rel="canonical" href="...">
is very similar to the use of a 301 redirect from an SEO perspective.

Effectively you are telling search engines that several pages should be considered as one (which a 301 also does), but without actually redirecting visitors to the new URL.

Note, however: a 301 redirect redirects all traffic (search bots and human visitors), while the Canonical URL tag is only seen by the search engines, meaning you can still separately track visitors to the unique URL versions. A 301 is also a stronger indicator that multiple pages have a single, canonical source. The power of a 301 redirect is its cross-domain functionality: you can redirect a page from one domain to another and carry over the search engine metrics. This is not something you can do with the Canonical URL tag, which operates exclusively within a single root domain (it will carry over across subfolders and subdomains).
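A toy sketch of the contrast (the example.com URL, function names and response shapes are my own illustration, not a real server):

```python
# Hypothetical canonical URL used by both techniques below.
CANONICAL = "http://www.example.com/widgets"

def respond_with_301(path):
    """A 301: every visitor, bot or human, is sent on to the canonical URL."""
    return 301, {"Location": CANONICAL}

def respond_with_canonical_tag(path):
    """A canonical tag: the visitor stays on the duplicate URL; only the
    engines are told which URL should receive the link metrics."""
    body = ('<html><head><link rel="canonical" href="%s"></head>'
            '<body>page content</body></html>') % CANONICAL
    return 200, body

# A duplicate URL (e.g. with a session id) handled both ways:
status, headers = respond_with_301("/widgets?sessionid=123")
print(status, headers["Location"])
```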

Monday, August 30, 2010

The Canonical URL tag

Yahoo!, Bing & Google all support the "canonical url tag", with the aim of assisting webmasters and site owners to eliminate self-created duplicate content in the index.
The tag is part of the HTML header on the web page:
<link rel="canonical" href="...">

This tells Yahoo!, Bing & Google that the page in question should be treated as though it were a copy of the canonical URL, and that all of the link & content metrics the engines apply should flow back to that URL.

Sunday, August 29, 2010

Open Directory Project Meta Tags

Sometimes the search engine will display information about your site taken from the Open Directory Project (ODP) instead of your description meta tag. You can force the search engine to ignore the ODP information by including a robots meta tag like this: <meta name="robots" content="noodp">.

The "noodp" robots meta tag is fully supported by Google, Yahoo!, and MSN.

If your site is listed in the Yahoo! Directory, Yahoo! search results will display information about your site taken from their directory instead of the contents of your description meta tag. As above, you can force Yahoo! to ignore the directory information by including this robots meta tag: <meta name="robots" content="noydir">.

The "noydir" robots meta tag is only supported by Yahoo!

Saturday, August 28, 2010

Robots metatag

Robots: <meta name="robots" content="index,follow">

Many web pages have this tag wrong. An example of the wrong usage is content="index, follow, all" - wrong because some spiders can't handle spaces between the words in the tag or the word "all". Most engines by default assume that you want a web page to be indexed and links followed, so using the wrong syntax can actually result in the spider coming to the wrong conclusion and penalizing, or worse, ignoring the page outright.
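A minimal sketch of checking a tag's content against the strict syntax described above; the helper name and token list are my own illustration:

```python
# Recognised robots meta directives for this toy check.
VALID_TOKENS = {"index", "noindex", "follow", "nofollow"}

def robots_content_ok(content):
    """Return True if the content attribute uses only recognised tokens,
    comma-separated with no spaces and no 'all' keyword."""
    if " " in content:
        return False  # some spiders can't handle spaces in the tag
    return all(token in VALID_TOKENS for token in content.split(","))

print(robots_content_ok("index,follow"))        # True
print(robots_content_ok("index, follow, all"))  # False (spaces and 'all')
```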

If you do not want your links followed, or the page indexed, then you would substitute "nofollow" and/or "noindex" into the tag. You can also make an individual link no-follow with the following code:
<a href="" rel="nofollow">Visit My Page</a>

<meta name="revisit-after" content="7 days">

<meta name="rating" content="General">

<meta name="distribution" content="Global">

<meta name="classification" content="PRIMARY KEYWORD PHRASE">

<meta name="author" content="Company Name">

<meta name="copyright" content="Company Name © 2010">

<meta name="cache-control" content="Public">

Friday, August 27, 2010

SEO with different languages - a cultural fit

Who is your audience? If the answer to that is an international audience beyond the borders of the local area of delivery, then don’t forget it, as you need to consider the impact of reaching the masses!

Just by looking at your website tracking stats it should be clear whether your site's visitors have an international flavour. If so, go to 'their level': consider localisation factors in the way that international visitors may search on your site.

They don’t watch our TV, they don’t listen to our radio, they don’t read the front page of our newspapers and they say potato when we say potatoe. Use the available language converters to check your main 3-5 keyword/phrases and consider using the results as part of your keyword group.

Language Resources:

Does your domain name (Top Level Domain) say international (.com) or local (e.g. .com.au)?

Thursday, August 26, 2010

Typographical Errors - Capturing misspelt words in your SEO program

Fremantle or Freemantle? Typographical errors due to rushing or fat fingers on a keyboard, difficult-to-spell words, or foreign speakers attempting words that they may have only heard spoken all provide an avenue for picking up stray search attempts.

When preparing your list of keywords consider the variations and attempts that people may make. Particularly review your search stats for the website/page after a few months to determine the attempts that have been made and cater for these if you sense a fair proportion of searchers may continue to err.

This is particularly useful in a tightly contested SEO field – where millions of dollars may be spent on obtaining top position for a particular phrase or keyword, you can skirt the outside of the pack and pick up the many stragglers attempting to access that phrase who just didn't quite get it right.
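As one small illustration of generating typo variants programmatically, this toy helper produces the "accidentally doubled letter" class of misspelling, which covers the Fremantle/Freemantle example above (real keyword research would also cover swapped, omitted and adjacent-key letters):

```python
def doubled_letter_variants(word):
    """Generate every variant of a word with one letter accidentally
    typed twice, a common 'fat fingers' misspelling."""
    return {word[:i + 1] + word[i] + word[i + 1:] for i in range(len(word))}

print("Freemantle" in doubled_letter_variants("Fremantle"))  # True
```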

Tuesday, August 24, 2010

Keyword Competition Analysis

Analysing your competition is an important step in the search engine optimisation process. It takes into consideration what keywords your competition has chosen and how they use them on their website.

If you want to see who your competition is for the targeted keywords you've chosen, then these tools will assist:

Competition Tool - SEO Digger

Competition Analysis Tool - Seoscorecard

Top Competitor Tool - Webuildpages

Monday, August 23, 2010

Keyword Suggestion Resources

Wordtracker Free Keywords:

Google AdWords Keyword Suggestion Tool:

Google Suggest Tool - shows frequently searched-for phrases starting with the words and letters in your query:

Keyword Discovery - free keyword tool from Trellian:

Microsoft Keyword Forecast - shows predicted Microsoft search impression count and historical trends:

Keyword Mutation Detection - Detect frequent misspellings or alternative spellings of the same keyword in search query logs:

Wordze - nice for generating decently large keyword lists quickly and inserting them into paid search campaigns:

Metaspy - keyword suggestion tool; see what others are currently searching for

Use KeywordPad to Clean, Modify & Multiply (Long Tail) Your Keyword Lists

Keyword Research Tool - Webmaster toolkit

Keyword Suggestions Overture - SEO Chat

Website Keyword Suggestions - Webconfs

Keyword Suggestion Tool - Self SEO

Keyword Valuation Tools

Google Traffic Estimator - shows the estimated bid prices and ad clicks for the top ranked AdWords ad. Allows you to check for [exact match], "phrase match", or broad match.

SEO & Internet Marketing Google Gadget Tools

Search engine keyword position tools

When done correctly, search engine positioning can increase web traffic by a tremendous amount. Use these tools to check the current position of your keywords in major search engines like Google, Yahoo and MSN:

Search Engine Keyword Position - SEO Chat  

Keyword Analysis Tool - Mcdar

SERPS Position Checker - LinkVendor

Website Position Tool - Rnk1

Dictionary Resources:

Using Dictionaries & Thesauruses: Here's a great starting point for considering variations on a theme – use an online thesaurus to generate words associated with and similar to your key focus words.

Sunday, August 22, 2010

Keywords - plurals, singular, misspelt...

Consider the benefit of adding plurals and/or slight variations, e.g. both Australia and Australian. The vast majority of search results seem to focus upon returning results for the plural version of a word instead of the singular version. Providing both singular and plural results gives a searcher more relevant results than if the search engine had just returned results for the version that was entered into the search box.

Also worth noting is how Google's algorithm highlights synonyms. The degree to which synonyms are understood changes and improves over time, and this affects which terms are highlighted – the main point is that the algorithm does change. Google mostly highlights stems, not synonyms, though the distinction is lost on most people. Stemming on Google isn't new – the company has been doing it since back in 2003, and it was highlighting stemmed words back then too. If you searched for "running" and it found a page with the word "run", the word "run" would be bolded in the search listing description.
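To make the stemming idea concrete, here is a deliberately naive toy stemmer (nothing like Google's actual implementation) that reduces "-ing" forms the way the "running" → "run" example describes:

```python
def naive_stem(word):
    """A toy '-ing' stemmer, purely to illustrate the idea of stemming.
    Strips the suffix and collapses a doubled final consonant."""
    if word.endswith("ing") and len(word) > 4:
        stem = word[:-3]
        if len(stem) > 1 and stem[-1] == stem[-2]:  # "runn" -> "run"
            stem = stem[:-1]
        return stem
    return word

print(naive_stem("running"))    # run
print(naive_stem("searching"))  # search
```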

Consider using keywords that are misspelled or mistyped. With a business name or product name, consider how people may say it, or sound it out, and thus interpret the spelling of that word.

Do use a combination of very unique and more general words to describe the contents of your page.

Consider using keywords related to something else your prospect would buy or search for (related phrases).

Once you have completed your brainstorming and have compiled your list of 5-10 core keywords, it's time to move on and expand that list. A list of 5-10 search phrases will generally not bring the amount of search engine traffic needed to make your website successful. However, that list will be a vital tool when determining which phrases to add to the mix. At this point, you need to turn to the search engines themselves and research which search phrases are actually being typed into Google, Yahoo, Bing et al. While few search engines will openly tell you which search phrases are the most often searched, there are several very useful tools you can use to research keyword popularity and expand your list.