Monday, August 30, 2010

The Canonical URL tag

Yahoo!, Bing & Google all support the "canonical URL tag", which aims to help webmasters and site owners eliminate self-created duplicate content from the index.
The tag is part of the HTML header on the web page:
<link rel="canonical" href=http://www.yoursite.com/page>

This would tell Yahoo!, Bing & Google that the page in question should be treated as though it were a copy of the URL www.yoursite.com/page and that all of the link & content metrics the engines apply should technically flow back to that URL.
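
As a hypothetical illustration (the URLs are placeholders), a product page reachable under several parameterised addresses can declare one clean version as the canonical:

<!-- Placed in the <head> of the duplicate URLs, e.g.
     http://www.yoursite.com/page?sessionid=123 and
     http://www.yoursite.com/page?sort=price -->
<link rel="canonical" href="http://www.yoursite.com/page">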

Sunday, August 29, 2010

Open Directory Project Meta Tags

Sometimes the search engine will display information about your site taken from the Open Directory Project (ODP) instead of your description meta tag. You can force the search engine to ignore the ODP information by including a robots meta tag like this: <meta name="robots" content="noodp">.

The "noodp" robots meta tag is fully supported by Google, Yahoo!, and MSN.

If your site is listed in the Yahoo! Directory, Yahoo! search results will display information about your site taken from their directory instead of the contents of your description meta tag. As above, you can force Yahoo! to ignore the directory information by including this robots meta tag: <meta name="robots" content="noydir">.

The "noydir" robots meta tag is only supported by Yahoo!

Saturday, August 28, 2010

Robots metatag

Robots: <meta name="robots" content="index,follow">

Many web pages get this tag wrong. An example of incorrect usage is content="index, follow, all" - wrong because some spiders can't handle spaces between the words in the tag, or the word "all". Most engines assume by default that you want a web page indexed and its links followed, so using the wrong syntax can actually lead the spider to the wrong conclusion and result in it penalising, or worse, ignoring the page outright.

If you do not want your links followed, or the page indexed, then substitute "nofollow" and/or "noindex" into the tag. You can also make an individual inline link nofollow with the following code:
<a href="http://www.site.com/page.html" rel="nofollow">Visit My Page</a>

revisit-after
<meta name="revisit-after" content="7 days">

rating content
<meta name="rating" content="General">

distribution
<meta name="distribution" content="Global">

classification
<meta name="classification" content="PRIMARY KEYWORD PHRASE">

author
<meta name="author" content="Company Name">

copyright
<meta name="copyright" content="Company Name © 2010">

cache-control
<meta name="cache-control" content="Public">

Friday, August 27, 2010

SEO with different languages - a cultural fit

Who is your audience? If the answer to that is an international audience beyond the borders of the local area of delivery, then don’t forget it, as you need to consider the impact of reaching the masses!

Just by looking at your website tracking stats it should be clear whether your site visitors have an international flavour. Go to 'their level': consider localisation factors in the way that international visitors may search for your site.

They don't watch our TV, they don't listen to our radio, they don't read the front page of our newspapers, and they say "potato" when we say "potatoe". Use the available language converters to check your main 3-5 keywords/phrases and consider using the results as part of your keyword group.

Language Resources:
http://www.freetranslation.com/
http://www.google.com.au/language_tools?hl=en

Does your domain name (Top Level Domain) say international (.com) or local (.com.au)?

Thursday, August 26, 2010

Typographical Errors - Capturing misspelt words in your SEO program

Fremantle or Freemantle? Typographical errors caused by rushing or fat fingers on a keyboard, words that are difficult to spell, or speakers of other languages attempting words they may only have heard spoken, all provide an avenue for picking up stray search attempts.

When preparing your list of keywords consider the variations and attempts that people may make. Particularly review your search stats for the website/page after a few months to determine the attempts that have been made and cater for these if you sense a fair proportion of searchers may continue to err.

This is particularly useful in a tightly contested SEO field – where millions of dollars may be spent on obtaining top position for a particular phrase or keyword, you can skirt the outside of the pack and pick up the many stragglers attempting to access that phrase who just didn't quite get it right.

Tuesday, August 24, 2010

Keyword Competition Analysis

Analysing your competition is an important step in the search engine optimisation process. It takes into consideration what keywords your competition has chosen and how they use them on their website.

If you want to see who your competition is for the targeted keywords you've chosen, then these tools will assist:

Competition Tool - SEO Digger
http://seodigger.com/

Competition Analysis Tool - Seoscorecard
http://www.seoscorecard.com/

Top Competitor Tool - Webuildpages
http://www.webuildpages.com/seo-tools/top-competitor-tool.php

Monday, August 23, 2010

Keyword Suggestion Resources

Keyword Suggestion Resources:

SEObook:
http://tools.seobook.com/keyword-tools/seobook/

Wordtracker Free Keywords:
http://www.wordtracker.com/

Google AdWords Keyword Suggestion Tool:
https://adwords.google.com/select/KeywordToolExternal

Google Suggest Tool - shows frequently searched-for phrases starting with the words and letters in your query:
http://tools.seobook.com/general/keyword-information/

Keyword Discovery - free keyword tool from Trellian:
http://www.keyworddiscovery.com/search.html

Microsoft Keyword Forecast - shows predicted Microsoft search impression count and historical trends:
http://adlab.msn.com/Keyword-Forecast/default.aspx

Keyword Mutation Detection - Detect frequent misspellings or alternative spellings of the same keyword in search query logs:
http://adlab.msn.com/Keyword-Mutation-Detection/

Wordze - nice for generating decently large keyword lists quickly and inserting them into paid search campaigns:
http://www.wordze.com/

Metaspy - see what others are searching for currently
http://www.metaspy.com/info.metac.spy/metaspy/

7search.com keyword suggestion tool
http://conversion.7search.com/scripts/advertisertools/keywordsuggestion.aspx

Use KeywordPad to Clean, Modify & Multiply (Long Tail) Your Keyword Lists
http://www.goodkeywords.com/

Keyword Research Tool - Webmaster toolkit
http://www.webmaster-toolkit.com/keyword-research-tool.shtml

Keyword Suggestions Overture - SEO Chat
http://www.seochat.com/seo-tools/keyword-suggestions-overture/

Website Keyword Suggestions - Webconfs
http://www.webconfs.com/website-keyword-suggestions.php

Keyword Suggestion Tool - Self SEO
http://www.selfseo.com/keyword_suggestion_tool.php

Keyword Valuation Tools

Google Traffic Estimator - shows the estimated bid prices and ad clicks for the top ranked AdWords ad. Allows you to check for [exact match], "phrase match", or broad match.

https://adwords.google.com/select/TrafficEstimatorSandbox

SEO & Internet Marketing Google Gadget Tools

http://tools.seobook.com/google-gadgets/

Overture Keyword Selector Tool:
http://inventory.overture.com/d/searchinventory/suggestion/

Search engine keyword position tools

When done correctly, search engine positioning can increase web traffic by a tremendous amount. Use these tools to check the current positions of your keywords in the major search engines, such as Google, Yahoo and MSN:

Search Engine Keyword Position - SEO Chat
http://www.seochat.com/seo-tools/search-engine-keyword-position/  

Keyword Analysis Tool - Mcdar
http://www.mcdar.net/KeywordTool/keywordtool.asp

SERPS Position Checker - LinkVendor
http://www.linkvendor.com/seo-tools/serps-position.html

Website Position Tool - Rnk1
http://www.rnk1.com/

Dictionary Resources:

Using dictionaries and thesauruses: here's a great starting point for considering variations on a theme. Use an online thesaurus to generate words that are associated with, or similar to, your key focus words.

http://www.lexfn.com/

http://dictionary.reference.com/  

Sunday, August 22, 2010

Keywords - plurals, singular, misspelt...

Consider the benefit of adding plurals and/or slight variations, e.g. both "Australia" and "Australian". The vast majority of search results seem to focus on returning results for the plural version of a word instead of the singular version. Providing results that cover both singular and plural forms gives a searcher more relevant results than if the search engine had returned results only for the exact version entered into the search box.

Also worth noting is how Google's algorithm highlights synonyms. The degree to which search engines understand synonyms changes and improves over time, and this affects how terms are highlighted; the main point is that the algorithm does change. Google mostly highlights stems, not synonyms, but the distinction is lost on most people. Stemming on Google isn't new: the company has been doing it since back in 2003, and it was highlighting stemmed words back then too. If you searched for "running" and Google found a page with the word "run", the word "run" would be bolded in the search listing description.

Consider using keywords that are misspelled or mistyped. With a business name or product name, consider how people may say it, or sound it out, and thus interpret the spelling of that word.

Do use a combination of very unique and more general words to describe the contents of your page.

Consider using keywords related to something else your prospect would buy or search for (related phrases).

Once you have completed your brainstorming and have compiled your list of 5-10 core keywords, it's time to move on and expand that list. A list of 5-10 search phrases will generally not bring the amount of search engine traffic needed to make your website successful. However, that list will be a vital tool when determining which phrases to add to the mix. At this point, you need to turn to the search engines themselves and research which search phrases are actually being typed into Google, Yahoo, Bing et al. While few search engines will openly tell you which search phrases are searched most often, there are several very useful tools you can use to gauge keyword popularity and expand your list.

Saturday, August 21, 2010

Gone but not forgotten - Benefits of the Keyword Metatag

<keywords> Most search engines compare your meta keywords with what is actually on your page, and if they don't match, your web site can get penalised and suffer in search results. The keywords metatag should contain no more than 16 keywords, which usually works out to around 20-30 words. The maximum character length is technically 1000, but aim for less than 300; really, given the negatives surrounding keywords, less is usually more.

Use plural variations (as the singular is included), separate keywords with a comma and a space, and design the keyword list around your primary 5 keywords and their variations... but again, put the MOST important at the beginning of the list. Keyword PHRASES are important, and more realistic, as most people will search using 2 or 3 word combinations rather than a single word! Also determine keywords that searchers may use when looking for products or services that relate to your site, and use keywords that match the content within the page.
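
A minimal sketch of that format (the phrases are invented placeholders; substitute your own researched terms):

<meta name="keywords" content="peru tours, peru adventure tours, bolivia tours, south america travel, inca trail trekking">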

The meta keywords tag was once the be-all and end-all of SEO. These days, its importance has fallen drastically, as the ease with which it can be spammed has made it a less than perfect means of categorising pages. That said, Yahoo still uses the keywords tag, and as such it is still worth having, just not worth spending as much time over as was previously the case. In most cases, one keywords tag per site, or one per section, is usually more than adequate. The keywords tag should have terms separated by commas.

Keywords represent the key terms that someone might enter into a search engine. Choose only relevant keywords. If the terms are going to appear in your keywords tag, they must appear in the content of your site, or be a synonym to a term on your site. Most search engines compare your meta content with what is actually on your page, and if it doesn't match, your web site can get penalised, and suffer in search results.

The Keyword tag should include the keywords/phrases that you are targeting in order of importance. Start with the most important and then proceed to less important but still relevant keywords. It is extremely important that you do not repeat any word more than 3 times within the Keyword tag. This includes words within target sentences. Also do not place repeated keywords close together.

So even with the relative demise of this tag due to black-hat spammy use, the benefits in a structural sense, for organising your page and determining your focus, should be enough to consider its use.

Friday, August 20, 2010

Describing the Description Meta Tag

The <description> Description Meta Tag directly represents what will appear as the "summary" of your web page, displayed after the title of your page in a search result listing (SERP). The content of the description should clearly communicate what a visitor to your web site can expect to find when clicking through, and you should include keywords in the description to help boost the ranking of the page even further.

Avoid simply pasting your keywords here, as most search engine users will not respond well to a search result that lacks a short explanation of the page. Instead, try to 'hide' the keywords within the description's natural wording so that you achieve more favourable results within an engine listing. This makes the Description Meta Tag the perfect opportunity to repeat keywords and add more key phrases for your web page!

The description tag should be about 250 characters maximum, and should include the primary five keywords repeated two or three times each (but NOT more than three times). Description tags should contain between 12 and 24 words and be written as a call to action to encourage users to click. Many search engines will display this summary along with the title of your page in their search results, so keep it reasonably short, concise and to the point, but make sure it is an appropriate reflection of your site content.
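
A hedged illustration of those guidelines (the travel copy is invented for the example):

<meta name="description" content="Peru tours and Bolivia adventure travel with expert local guides - Inca Trail, Machu Picchu and Lake Titicaca. Request your free Peru tours itinerary today.">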

Here's a HINT: it is beneficial to reflect the description content very similarly in the first section of text that appears on your actual page. Try to make your description very user friendly for humans to read; it tells them what the page is all about. It should reflect the title, but NOT repeat it word for word. Google does not use the description verbatim, preferring instead to return what it considers a relevant snippet from the page's content. In any case, the description is still a valuable on-page factor, as Yahoo does use the description as intended, and Google will sometimes return relevant parts of the description in snippets.

Every page should, if possible, have a unique description. Descriptions should always reflect what a user can expect from a page, be proper sentences, and read well. A description should not repeat the title, but instead expand upon it.

Thursday, August 19, 2010

Behind the Scenes - a look at the Title Meta Tag

The <title> tag in the metatags should aim to include ALL 5 of the page's primary focus keywords in the title text. The title should be about 14-16 words maximum in length, and the first two to three words should be the PRIMARY focus keyword for the page. You can use punctuation, but aim to remove wasted information and unnecessary characters such as -, : and !!!, all of which waste space in your valuable SEO real estate. Make the title readable for humans as well as optimised for search engines! Market the title; make it compelling to click on, information rich, and relevant to that page.

Keep the page title brief: including too many keywords (keyword stuffing) may dilute your overall relevancy rating. Aim to use a grammatically correct sentence as the title, rather than just a list of keywords. Most search engines crop titles at around 70-90 characters, and Google shows no more than 65 characters, so the more specific information you can place in the first 8-10 words of a title, the better. It is important to remember that titles should provide a 'call to action': the more compelling the call to action, the more likely the user will be to click on your SERPs result compared to competitors'.

Think of the title as a mini advertisement for a page; try to entice the user to click through. Titles should be constructed from the content on the page to ensure maximum benefit and correlation for relevancy rankings in result pages. All of the keywords that appear in the title should appear within the content of the page. Make titles as specific as possible, and promote differentiating content: e.g. if a site has several pages about South Africa, including separate pages for “South Africa Tours” and “Adventure Travel South Africa”, differentiate those two pages by using the appropriate phrases.
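
A hypothetical title following those rules (the destination and brand are placeholders, with the primary phrase leading):

<title>Peru Tours - Machu Picchu & Inca Trail Travel | Andes Travel Co</title>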

Wednesday, August 18, 2010

Increasing Keyword Density through bottom/footer links

Increase the navigational link opportunity throughout all pages across the website by creating a series of bottom footer tags.

Creating these visible hyperlinked text links on each webpage improves user navigation whilst supplementing the website with the high audience keywords being used in pay per click advertising. For sites that may have been designed without an initial SEO focus, and have navigational elements embedded in Flash or scripts that leave a visiting search bot scratching its head as to how to follow and explore deeper into your site, this is an effective solution that provides usability and SEO rich features.

Develop bottom Footer Tags that are standardised across every webpage throughout the website. These hyper-text links are keywords that provide tunnels for search engine spiders, crawlers and robots to access every page on the website, whilst adding to the keyword density of each page – without compromising the simple sign-up functionality of the website that users are looking for. They can also be kept off the main pages and exposed only behind a link, to avoid competitors detecting a search engine linking strategy.

With the constant modification of Search Engine algorithms, future algorithm updates may do away with all repetitive and similar looking links to websites. Therefore, it is important to create a range of links and not one standard link text. By doing this, the search engines are not likely to detect a pattern and this may assist in preserving your PageRank if the algorithms change.


Partly off-topic tip: Change the anchor text of a “more” link to the keyword the “more” page deals with!
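
A sketch of that tip, with invented page names:

<!-- Generic anchor text wastes the keyword opportunity: -->
<a href="/peru-tours.html">more</a>

<!-- Keyword-rich anchor text tells engines what the target page covers: -->
<a href="/peru-tours.html">Peru tours and itineraries</a>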

Tuesday, August 17, 2010

Some considerations in linking strategy

The following are some thoughts and cautions around a linking strategy...


Sub Domains

Sub domains are subject to a lot of industry debate as to their importance in the process of natural search engine optimisation. Multiple sub-domains were once used as a way to increase the number of URLs without providing additional content, a practice now largely ignored by search engines. Beware of duplicate content issues that may arise through this practice.

Re-Directed Links

A link that is first re-directed to another page within your partner site before pointing to the client website is referred to as a re-directed link. It is important that clients monitor these links, as search engines do not give weight to re-directed links. It is very unlikely that the client would draw any benefit from a re-directed link.

Dynamic Link Pages

The client should also be aware of any link pages that are generated dynamically. These pages are not frequently indexed meaning that a link from such a page would not benefit the client site.

Flash Link Pages

It is also important to identify pages that are generated through Macromedia Flash, as many search engines cannot read and index flash pages or the links embedded within flash.

Directory Depth

It is also important to evaluate the depth of the directory of the linking page. Avoid getting links from pages that are embedded in a very deep directory or pages that are more than two directories deep.

For example, www.yoursite.com/level1/level2/level3/linkpage.htm is not a good link page. It is important to remember that deep directories seldom earn high PageRank, as they are also slow in getting indexed, if they are indexed at all!

Warning: Avoid Frame Sites

It is important to avoid receiving links from framed sites. A link placed on a website with frames would not provide the client website any benefit, as search engines would not be able to recognise such a link.

Monday, August 16, 2010

Google PageRank formula

What is all this fuss about PageRank?

Google's founders, Sergey Brin and Larry Page, created a search engine algorithm that shifted the ranking weight to off-page factors. The formula, called PageRank (named after co-founder Larry Page), evolved so that the algorithm scores the "value" of each website out of ten in comparison to other websites on the Internet. It is based on two key factors: the number of links pointing to your website, and the value of those links. The value is calculated from the PageRank of the page linking to your website and the relevancy of that page to your site.

In order to do this, you should begin by understanding Google’s PageRank. Although Google PageRank is the heart of the Google algorithm, it is not the absolute consideration in determining the most relevant website and hence the highest listing in the search engine results pages. A definition of PageRank as provided by Google is included below:

Defining Google PageRank

PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page's value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at more than the sheer volume of votes, or links a page receives; it also analyses the page that casts the vote. Votes cast by pages that are themselves "important" weigh more heavily and help to make other pages "important." Importantly, high quality sites receive a higher PageRank, which Google remembers each time it conducts a search. Of course, important pages mean nothing to you if they don't match your query.

So, Google combines PageRank with sophisticated text matching algorithms to find pages that are both important and relevant to your search. Google goes far beyond the number of times a term appears on a page and examines all aspects of the page's content (and the content of the pages linking to it) to determine if it's a good match for your query.
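
For reference, the formula as originally published by Brin and Page can be written as follows, where d is a damping factor (usually set to 0.85), T1...Tn are the pages linking to page A, and C(T) is the number of links going out of page T:

PR(A) = (1 - d) + d\left(\frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)}\right)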

PageRank of the Linking Page

Firstly, the PageRank of the linking page is one of the most important factors, as it determines how much importance is passed on to your web page. The higher the PageRank of the linking page, the more value your website will receive.

Sunday, August 15, 2010

Outward Link Benefits

Search engines not only count the number of external websites that link back to the client website; their algorithms also consider the number of outward links from the client website to external websites.

Search engines consider a website desirable for higher search engine indexing where it contains easy-to-navigate pathways from each page within the website.

Higher value outward hyper text links should be spread throughout each webpage to the following types of organizations:
- Government departments (.gov)
- Associations and Business networks (.net)
- Related Trade Organizations (.org)
- Information websites and directories (.info)
- Independent Commercial organizations such as news, media and magazines websites (.com)

Include search engine friendly info: although search engines do not specifically state that there are particular content types or file extensions that they rank higher in their search engine results pages, experience in many industry sectors suggests they have a strong preference for particular formats and content such as PDFs, FAQs, case studies and glossaries.

Saturday, August 14, 2010

Internal Links - confirming your intentions

Google's algorithm, and those of many other search engines, assume that the page with the most internal links is the most important one on your site. So if you link from all your pages to the contact page, it will look like the most important page to Google. You really have to tell Google "do not mistake this page for the most important one", either by using the nofollow attribute (not tag!) on the links leading to the contact page, or by balancing the power, stacking the odds, and deliberately shaping your internal link strategy in favour of your intentions.
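
A minimal sketch of the attribute in place (the URL is a placeholder):

<a href="/contact.html" rel="nofollow">Contact Us</a>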


Friday, August 13, 2010

Design with Accessibility in mind

I would encourage webmasters to design pages for users, not just search engines. When you're designing your AJAX site, think about the needs of your users, including those who may not be using a JavaScript-capable browser (for example, people who use screen readers or mobile devices). One of the easiest ways to test your site's accessibility is to preview it in your browser with JavaScript turned off, or to view it in a text-only browser such as the Lynx browser. Viewing a site as text-only can also help you identify other content which may be hard for Googlebot to see, such as text embedded in images or Flash.

Whilst in testing mode, consider reviewing browser screen resolutions as well. It is important to test a web site in as many ways as possible: view it in different screen resolutions, on various computer platforms, and in different browsers. Find out what your web page looks like with the viewer's monitor set to different screen resolutions, as you want to be sure your visitor can see everything, or at least the important navigation links:

Browser Shots - My Favourite
http://browsershots.org/

Browser Screen Resolution Checker - Markhorrell
http://www.markhorrell.com/tools/browser.html

Screen Size Tester - AnyBrowser
http://www.anybrowser.com/ScreenSizeTest.html

Screen Resolution Checker - AndyLangTon
http://andylangton.co.uk/stuff/screen-resolution-checker

Thursday, August 12, 2010

AJAX in your site, should this be a concern for your SEO efforts?

We can see in much current website architecture that webmasters have discovered the advantages of using AJAX to improve the user experience on their sites, creating dynamic pages that act as powerful web applications. However, just as Flash was a stumbling block for SEO opportunities, AJAX can make a site difficult for search engines to index if the technology is not implemented carefully. There are two main search engine issues around AJAX: making sure that search engine spiders can see your content, and making sure they can see and follow your navigation.

While Googlebot is great at understanding the structure of HTML links, it can have difficulty finding its way around websites which use JavaScript for navigation. Google continues to work on doing a better job of understanding JavaScript, but your best bet for creating a site that's crawlable by Google and other search engines is to provide HTML links to your content.

If you use AJAX to create your links, format them so that they offer a static link as well as calling a JavaScript function. That way you'll have the AJAX functionality for JavaScript users, while non-JavaScript users (and search engine spiders) can ignore the script and follow the link. For example:

<a href="ajax.htm?foobar=32" onClick="navigate('ajax.html#foobar=32');
return false">foobar 32</a>

Note that the static link's URL has a parameter (?foobar=32) instead of a fragment (#foobar=32), which is used by the AJAX code. This is important, as search engines understand URL parameters but often ignore fragments. Since you now offer static links, users and search engines can link to the exact content they want to share or reference.

While Google is constantly improving its crawling capability, using HTML links remains a strong way to help Google (as well as other search engines, mobile devices and users) better understand your site's structure. If you need to balance the benefits of dynamic delivery using AJAX, ask what your static sitemap page provides as an alternative to ensure visibility of all the pages in your site.

Use your Rights to better effect - The Right Hand Column SEO technique

Whether your business is Amazon, eBay or a software development company, it is important to utilise all the available screen real estate of your website for SEO gain, in particular the Right Hand Side column of each webpage to create navigational quick links for customers to find high trafficked sections of the website.

The ‘Right Hand Side Column’ is an often under-utilised piece of search engine optimisation real estate on a website. Consider a ‘RHS Column’ to increase keyword density across your web page layout. The right hand side is a perfect place to add high audience keywords that search engines index whilst providing improved usability for clients.

HTML and CSS Validation

Although it has not been proved beyond doubt that W3C validation gives you better search engine rankings, errors in your code can certainly cause you big problems. Converting your website pages to XHTML may help you reach more customers, as your site will work better in more browsers and non-traditional devices.

To see whether your CSS files or HTML code are valid, use these tools:

HTML

W3C Validator
http://validator.w3.org/

WDG HTML Validator - Web Design Group
http://www.htmlhelp.com/tools/validator/

CSE HTML Validator Lite
http://onlinewebcheck.com/

Validation Services for your HTML / XHTML / WML - Validome
http://www.validome.org/

CSS

CSS Validator - Jigsaw
http://jigsaw.w3.org/css-validator/


On a side note, the ongoing experiment with the Blogger platform continues as I test uptake, design/coding issues using default settings, spamming issues, and the management of posts, as well as the organic growth and seeding of content via this medium. Extending past the bare-bones approach and adding modules to the template layout, as well as experimenting with advert positioning, has been a useful exercise. I have appreciated users' feedback along the way as the experiment continues.

The Absolute Positioning Technique - The loading sequence

Continuing on from my posts on the benefits of absolute positioning: getting the content, and all those beneficial keywords and phrases, in front of the search engine spiders, instead of having them get lost in heavy code and give up. Today's post asks you to consider the ‘order’ in which certain ‘content blocks’ appear in the source code. It is well understood that cross-browser programming remains a challenge to this day, which can also restrict the extreme use of positioning.

Our goal is simple - move the main content up to the very top of the source code.

This search engine optimisation technique, called ‘absolute positioning’, is designed to avoid the situation where search engine crawlers visit the website but fail to index the low-positioned content.

Absolute positioning will allow clients to write pages in the order required to maximise the likelihood of search engines spiders crawling the source code and to provide control over the order of content elements appearing in the visitor’s browser when accessing the various sections of the client website. The recommended order of delivery:

Loading 1st
- Main Body Content

Loading 2nd
- Keyword Footer

Loading 3rd
- Navigation Menu

Loading 4th
- Image Files

Absolute positioning is done through the use of <DIV> tags and classes inside Cascading Style Sheets (CSS) files. The blocks are given sizes (through width and height) and are placed either at a set distance from the top left hand corner of the page, or relative to a parent <DIV> tag which contains all the visual elements of the page. Absolute positioning can still be used on pages whose content sits in the middle of the page rather than resting on the left hand side of the browser.
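
A minimal sketch of the idea (all IDs, sizes and offsets are invented for illustration): the main content block comes first in the source code, while the CSS places the navigation and header where the visitor expects to see them.

<style type="text/css">
  #container { position: relative; width: 960px; margin: 0 auto; }
  /* First in the source, so crawlers meet the body copy immediately */
  #content { position: absolute; top: 140px; left: 200px; width: 760px; }
  /* Second in the source: the keyword footer */
  #keyword-footer { position: absolute; top: 900px; left: 0; width: 960px; }
  /* Menu and banner images come last in the source, yet render at the top and left */
  #nav { position: absolute; top: 140px; left: 0; width: 180px; }
  #header { position: absolute; top: 0; left: 0; width: 960px; height: 120px; }
</style>
<div id="container">
  <div id="content">Main body content: the keyword-rich copy...</div>
  <div id="keyword-footer">Keyword footer links...</div>
  <div id="nav">Navigation menu...</div>
  <div id="header">Logo and banner images...</div>
</div>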

Wednesday, August 11, 2010

Load Images Last... a SEO technique using css

Search engine optimisation often requires techniques to ensure the search engine crawlers can see content higher in the source code, instead of it being hidden deep below source code and images. The ‘Load Images Last’ technique aids in higher natural search engine rankings.

Many websites load webpages from top to bottom, with images and navigational scripts loading first and the content following.

Moving non-indexable content below the main text through the use of absolute positioning can greatly increase search engine keyword density and page rank.

This ensures that the first content block to load and the first content encountered by crawlers is the main body text, which has been optimised with keyword phrases and keyword density methodologies. Ensure that optimised text is placed higher up the page than navigation and graphical images.

Style sheets need to be used to full effect for search engine optimisation best practice.

Tuesday, August 10, 2010

Analyse your code efficiency for better SEO

A search engine code efficiency analysis looks at how many lines of code appear in a website's source before the Body tag. The Body tag is where the body copy content is located, and that content is what search engines typically look for.

Reducing the number of lines of code before the Body tag improves the efficiency with which the search engines will index the website.

Designers are recommended to consider placing content immediately after the Body tag across the website to increase its visibility to search engines. This can be achieved through the technique called ‘absolute positioning’.

So what is the importance of Search Engine Code Efficiency?

Search engines interpret the HTML code of websites from the top of the page to the bottom, reading the content from left to right, and the spider or crawler often gives up before discovering content that is buried deep within the website.

Search engines will open the <table> tag and look for the first "table row" <tr> and begin to read each "data set" <td>"data"</td> inside the "table row" from left to right until they find the closing </tr> tag.

Often, if there are too many code lines before the <body> tag, the main body content may not be indexed because it is situated too far down the page.

It is also known that search engines usually look at many page identifiers, such as the page headings, chapter headings and the main paragraphs of the body copy, to determine the consistency of the first 200 words of content after the opening <body> tag. Generally, the first 200 words at the top of the page are considered the most important text. It is commonplace for website developers to have unknowingly pushed the indexable content below the radar of the search engine crawlers.

As an SEO best practice, you will want to make sure your code is as error-free as possible, from a W3C validation standpoint, and that you follow guidelines for semantically correct markup. Testing shows that good, clean, semantically correct code not only allows your site to load faster in major browsers, but also allows for faster indexing by the search engines.

Monday, August 9, 2010

Benefit from style sheets to control content positioning

The use of css (Cascading Style Sheets) in the control of absolute positioning has the following benefits:

- Smaller file sizes and faster page loads.
- A single style sheet can control hundreds of pages across your website, enabling simple and quick updates/changes.
- Improved keyword density advantages.
- Utilise style sheets for different media such as mobile (see the sketch below).
- Finer control over your page layouts.

An advantage of using style sheets to control positioning is that this technique has no effect on the GUI (Graphical User Interface) or how the viewer sees content in their browser.
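
On the 'different media' point above, a small sketch of the era's approach (the file names are placeholders): the media attribute hands mobile devices a separate stylesheet without any change to the markup.

<link rel="stylesheet" type="text/css" media="screen" href="main.css">
<link rel="stylesheet" type="text/css" media="handheld" href="mobile.css">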

Code Efficiency for search engine visibility

To maximise search engine visibility for a website, the ideal location for the main body text is immediately after the <BODY> tag. This location ensures maximum visibility and pick-up for your content by the search engine spiders and website crawlers.

Consider the following to reduce code bloat (a sketch follows this list):
- Use of external style sheets for design elements
- Use of external JavaScript where possible
- Use of a CSS based navigation menu
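
A hedged sketch of what that looks like in the <head> (file names are placeholders); moving design and behaviour into external files lets the body copy sit as close to the <body> tag as possible:

<head>
  <title>Page Title</title>
  <!-- Design rules live in an external file instead of inline <style> blocks -->
  <link rel="stylesheet" type="text/css" href="styles.css">
  <!-- Behaviour lives in an external file instead of inline <script> blocks -->
  <script type="text/javascript" src="menu.js"></script>
</head>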

This process also assists in page management, code stability, debugging and a better end-user experience via streamlined HTML code. Consider the search engine optimisation technique called ‘absolute positioning’ to avoid the situation where search engine crawlers visit the website but fail to index the low-positioned content.

Absolute positioning will allow you to write pages in the order required to maximise the likelihood of search engines spiders crawling the source code and to provide control over the order of content elements appearing in the visitor’s browser when accessing the various sections of your website.

Sunday, August 8, 2010

Absolute Positioning - giving your content a lift up

The higher your content is on a given page, the more it counts for Google. Google does not see a page like a human being does: Googlebot crawls the code, so the higher your content is in the code, the better. If you have a complex site with lots of menus, scripts and other gimmicks, you should consider absolute positioning, otherwise Google might stop crawling your page before it even reaches the main content. You can place the actual content high up in the code, at the top, while users will see it in the middle of the page, below the menus.

Saturday, August 7, 2010

Invisible Text - Search engines can see you, and don't like you for it!

Simply put, do NOT use invisible text to keyword-stuff the content on your page! Using 'invisible' text is the practice of making the colour of text on the page the same as the background colour; this is an 'old' black-hat SEO technique. As an example, white letters on a white background qualify as hidden text. The reason many search engines have banned this kind of text is that it is VERY hard to detect, largely because of the vast number of colour combinations that have to be recognised when an engine 'spiders' the page.

As an example: You have a web page that uses a background colour of #FFFFFF (white). You then place text on the page and slightly alter the colour to make the colour of the text #FFFFF0 (off-white), which would now be ALMOST totally invisible.

Many, but not all, of the major search engines now have a policy against this type of technique, and have set up their indexing programs to hunt it down and blacklist the site because of it. When spotted by a bot, the penalty can range anywhere from ignoring the text, to refusing to index the page at all, or even worse. This practice is regarded as a black-hat SEO SPAM technique by most of the search engines, and it is highly recommended that you avoid including it on your page.

Of course, you may come across a well ranked page here and there that is still using hidden text. This should be taken more as an indication that the page has not been re-indexed since the tactic was made 'illegal' than as an indication that the search engine allows the practice. Basically, most SEOs have come to regard hidden text as too risky, and have moved on from it.

Friday, August 6, 2010

Avoid Duplicate Content

A big no-no. To avoid delisting, if you procure content from other websites you must rewrite it; cutting and pasting is not acceptable, and original content must be written by you. Search engines can detect the cutting and pasting of the same word sequencing across websites and punish the rankings of websites that do this. So your aim is to always have fresh and original text for your copy.

Duplicate Content Checker Resource:
http://training.seobook.com/duplicate-content-checker

Thursday, August 5, 2010

Keyword Density in SEO on-page copy

Our goal: achieving 7% or more keyword density per 250 words for each webpage.

Where possible, each web page should have high pay-per-click and high-traffic keywords included in the body copy at a density level of approximately 7% per 250 words, which works out to about 17-18 keyword occurrences. It is important that the keywords appear in the first line of each paragraph. (Just like in school, the starting sentence of the paragraph grasps our attention, and the idea then unfolds through the rest of the paragraph as more is unveiled to support it.)
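
As a quick check of that arithmetic: keyword density is the number of keyword occurrences divided by the total word count, so a 7% density over 250 words means 0.07 × 250 = 17.5, i.e. roughly 17-18 keyword mentions on the page.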

Use keyword density analysers to analyse the keyword density of the keywords on your web site pages, or measure the keyword density of your competitors' websites, to determine the optimum keyword density for your own keywords. The links provided below allow you to enter the URL and the search word or phrase (without quotes) that you would like to run the keyword density analyser on. The aim is to have your 3-5 focus keywords/phrases appear predominantly in the results.

Keyword Density Analyser Resources:

Keyword density and word depth calculator
http://www.keyworddensity.com/

Webjectives Keyword Density Analyser Version 2.0
http://www.webjectives.com/keyword.htm

Keyword Density - SEO Chat
http://www.seochat.com/seo-tools/keyword-density/

Keyword Density & Prominence - Ranks
http://www.ranks.nl/tools/spider.html

Analyse Keyword Density - Google rankings
http://googlerankings.com/ultimate_seo_tool.php

Keyword Density Checker - Webconfs
http://www.webconfs.com/keyword-density-checker.php

Keyword Density Analyser Tools - SEO Book
http://tools.seobook.com/general/keyword-density/


Wednesday, August 4, 2010

Tips for Good Copywriting

Here is a list of considerations when reviewing or writing the copy for your website:

Mention what you offer exactly on your site/page, use brands and exact product names
Mention where you offer it “Travel Services Australia”.
Mention why you offer it “We offer recycling solutions because we believe that clean business is profitable business”.
Mention to whom you offer it: “Web hosting solutions for small businesses”
Explain one key term on the page, e.g. your homepage contains: SEO (Definition: search engine optimisation, the process of making websites more search and search-user friendly)
Replace your homepage images with smaller ones in byte size (below 50 kb) so that non-broadband users stick with you. (Okay off topic slightly, but just an ongoing reminder on the importance of this one). 
Convince! Start the first or second sentence of your page with “We will make you number one in…” instead of solely describing “Product X offers…”
Add your name to the text: Bart, CEO of Simpson Industries... people trust people, not companies.
Understand your audience by researching business related keyword phrases.
Write informative titles and descriptions.
Link the title, description and keyword metatags to the content on the page.
Use language that a user/customer expects and understands.
Good content - emphasise the subject matter within the content.
Did you remember your USP (Unique Selling Proposition)? What makes your company or product different from, and better than, everything else out in the market? Why should people buy yours?!

Tuesday, August 3, 2010

Copy Writing - Page Content for SEO

Optimise Text on the Page - The text on your home page is crucial to maintaining the attention of fresh viewers. In fact, copy is so important that many companies prefer to have a professional copywriter create this content. For many small business website owners, however, this may not be an option. What you need to do is keep your text engaging as well as smartly optimised, presenting an obvious topic to visiting search engine spiders and visitors alike. To do this, keep your mind focused on the keyword(s) that you have chosen to target on the search engines while you write the content for the page. (Think of the consumer when choosing these initial-level keywords... keywords refine through the buying-process cycle and become more specific as the consumer prepares to purchase.) Implement the keywords within the text without sacrificing the true intent of the information - to engage and retain your viewers.

Now, on the Home Page... specifically;

Often the first 25-30 words of your home page are what each search engine will use for the description of your web site. Try to utilise your target keyword(s) within this area, but be certain the resulting sentence is legible and descriptive. Also repeat your main focus keyword(s) in the last section of wording on your page. The index (or default) page is the first page loaded after the root URL is entered into a browser (e.g. index.htm or default.htm). This is the most important page of any site, as a number of search engines will only start indexing from the home page. (Do you have a textual link navigation that branches out from the home page to touch each and every page on your website?... We'll save this for another article.)

The body text on the opening page must contain keywords, and keyword phrases (don't forget the benefit of the long tail search), that relate to the theme of the site as a whole. The copy on the home page should convey to a user, and thus search engines, what the site is about. It is not enough to rely on images and other visual clues, it must be stated in plain text. Search Engines are essentially blind (remember what I was saying about the need for ALT tags on images in your code?), using the text within the source code of the page, to understand what the page is about, and how to navigate around the site.

Many sites make the mistake of using a redirect to move the users to a new page depending on their browser. Mostly, this is done with JavaScript which cannot be followed by a Search Engine. Other sites use the opening page for a Flash introduction or simply a company logo, both of which are barriers to a search engine indexing your site. JavaScript is a language that Search Engines do not understand fully, so many do not bother to read it, or struggle with it.

Content on a page should be around 500 words. Keep in mind that content is for the broader internet community, and remember that the internet is a unique medium. Reinforce the keywords you are targeting within the content by repeating them. A keyword or phrase mentioned several times within the content is likely to be given high relevancy when ranking results within search engines, whereas a word used once or twice is less likely to deliver traffic. (This should be done appropriately and not just 'keyword stuffed', or via other black-hat SEO techniques that may be detrimental to your site's performance.)

To achieve high rankings in search engines for specific key phrases, a site needs a page on which that phrase is prominent. Often, for sites with dynamic content, there is no logical page to attract relevant traffic, e.g. a flights site has a lot of flights, but as these change frequently, there is no logical landing point for many, frequently searched terms such as “cheap flights”. By building a few such pages, a site can effectively capture such traffic.


By developing pages that Search Engines can index, that are always live and based around commonly searched phrases, particularly specific, longer phrases for which there is currently no logical landing page, the breadth of search terms for which a site can be found can be increased significantly. In terms of traffic volume, a couple of dozen pages built in this way will collectively deliver significant volumes of traffic, and traffic that is currently purchased at premium prices.

From a usability perspective, this provides users with a highly bookmarkable page that they can return to time and again, with information tailored to their specific interests or needs. This is particularly true for sites such as property sites, job sites and any other sites with a user base with an action in common, but little else. Giving users exactly what they are looking for and nothing else, is the best way to ensure return visitors.

There is so much information that can be gleaned from users simply by what they search for or which pages they look at. By utilising this information, and offering people a convenient page containing only the information that is relevant to them, not only will Search Engine traffic increase, but so will repeat traffic, which ultimately is the best traffic of all.

Consistency in Page and Paragraph Headings

In search engine copywriting, headlines are as sacred as in most other communication mediums. As search engines weight text more heavily when it appears in the page and paragraph headings, it is important that the client utilises the targeted phrase for each page. There should be consistency between the navigational menu, the Title/Description/Keywords meta tags, the page headings and the chapter headings on every page throughout the website.
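
A sketch of that consistency on a hypothetical 'Peru tours' page, with the targeted phrase echoed from the meta tags down through the headings:

<title>Peru Tours - Inca Trail Adventure Travel</title>
<meta name="description" content="Small-group Peru tours covering the Inca Trail and Machu Picchu.">
<meta name="keywords" content="peru tours, inca trail, machu picchu">
...
<h1>Peru Tours</h1>
<h2>Inca Trail Peru Tours for Small Groups</h2>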

SEO using PDF files

PDF files greatly assist visibility throughout search engine results pages, and PDF documents often receive high natural search engine rankings.

Search Engine Visibility and Adobe

Adobe Acrobat allows descriptions to be added into the document, which lets search engine optimisation extend to off-page elements such as PDF documents. Each document should have high audience keywords, and the descriptive phrase in the Title Tag, Page Heading and Paragraph heading provides keyword density for search engines.

PDF files greatly assist indexation because search engines perceive these documents to be very helpful to users. Although search engines read and store some PDF documents as HTML files on their servers, if the PDF file name is consistent with the content, they will index it and return these files with higher rankings. To improve your SEO capabilities through PDFs on your own website, have your design team consider deploying PDF documents into various sections of the website, and classify these PDFs with a distinct nomenclature, as this will improve search engine visibility.
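
As one hedged alternative for setting those document properties outside of Acrobat, the open-source ExifTool can write PDF metadata from the command line (the file name and values here are invented):

exiftool -Title="Peru Tours Guide" -Author="Andes Travel Co" -Subject="Peru tours and itineraries" peru-tours-guide.pdf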

For Acrobat Reader files, Google SERPs now show information such as the author details from the .pdf in results. This is also an area that can be optimised and searched upon.

What can Google actually index?

Google can index most types of pages and files. The most common file types include:

Adobe Portable Document Format (.pdf)
Adobe PostScript (.ps)
Atom and RSS feeds (.atom, .rss)
Autodesk Design Web Format (.dwf)
Google Earth (.kml, .kmz)
Lotus 1-2-3 (.wk1, .wk2, .wk3, .wk4, .wk5, .wki, .wks, .wku)
Lotus WordPro (.lwp)
MacWrite (.mw)
Microsoft Excel (.xls)
Microsoft PowerPoint (.ppt)
Microsoft Word (.doc)
Microsoft Works (.wks, .wps, .wdb)
Microsoft Write (.wri)
Open Document Format (.odt)
Rich Text Format (.rtf)
Shockwave Flash (.swf)
Text (.ans, .txt)
Wireless Markup Language (.wml, .wap)

Monday, August 2, 2010

SEO using Google Maps

The use of Google Maps is believed to lead to increased natural search engine indexation. The rationale is that Google is more likely to index pages that contain its Google Maps product than otherwise-equal competitor pages which do not.

What are Google Maps?

Google has an Application Programming Interface (API) that lets website developers integrate Google Maps into web pages. The API allows developers to create an administration interface where non-technical people can browse the map, using the zoom in and zoom out functions, to select a location. This user-friendly navigation automatically generates the geographical coordinates of the location, rendering the map in real time in the user's Internet browser.
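
A minimal sketch using the Google Maps JavaScript API (version 3 at the time of writing); the element ID and the Sydney coordinates are placeholders:

<div id="map" style="width: 500px; height: 300px;"></div>
<script type="text/javascript" src="http://maps.google.com/maps/api/js?sensor=false"></script>
<script type="text/javascript">
  // Centre the map on the placeholder location at a suburb-level zoom
  var map = new google.maps.Map(document.getElementById("map"), {
    center: new google.maps.LatLng(-33.8688, 151.2093),
    zoom: 13,
    mapTypeId: google.maps.MapTypeId.ROADMAP
  });
</script>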

The same rationale applies, of course, to the other map providers, Yahoo! and Microsoft. Would site variations that use other map providers rank higher because of this? An interesting hypothesis to test, I think.

ALT tags for images - usability and SEO

The hidden text. Remember that Google and the other search engines are smart, but they don't have eyes looking at the page itself; they use the code to determine what is happening. NEVER, ever forget to put ALT tags on the images in your pages, and again, be descriptive and keyword focused.

ALT tags are of course a very good design methodology to utilise as they are required for disabled visitors to your site who may be viewing with a speech device that reads the content on the web page out aloud. And of course there are still other web browsers that allow users to “turn off” images when browsing… this is an approach used by users on very poor line connections or bandwidth starved who just want to get to the good stuff!

ALT tags are designed to ensure that the Internet is usable for everyone, irrespective of handicap or impediment. The Australian Government has been vocal about enforcing the use of ALT tags in the past, and it is always a good idea to ensure that they are used throughout a site.

Every image should have an ALT tag, and these should either describe what an image is, or where an image leads to if it is a link. E.g., if an image is a picture of a man standing on a mountain and links to a page about South American tours, the image's ALT tag should read as follows: <img src="image_name.gif" width="10" height="10" align="middle" border="0" alt="Information on Peru & Bolivia tours" />

ALT tags should be kept short, no more than 10 words, and should not repetitively use keywords. This is referred to as “keyword stuffing” and is likely to get a site either flagged for spamming, or outright banned.

Google will only read ALT tags for images that are links, and many other search engines are aware of the past strategy of keyword stuffing in ALT tags. For Organic SEO purposes, ALT tags are often very important due to the extensive use of image-based navigation. For sites that use such a navigation structure, it is important that ALT tags are added to the images in order to maximise the benefit of Link Reputation.

Sunday, August 1, 2010

How do you rank... for your images?

Users online search for many things, and hence use many different types of specialised search tools to filter out and find what they are looking for, whether it is images, music, books or videos... Getting visitors to your website through other than the standard methods may get you thinking outside of the square. Google has an image-crawling robot (Googlebot-Image), which prowls the Internet for images to place in Google's image search and delivers the user to your website to see that image - all you have to do is capture the user's attention and interest from there!

Google ranks and indexes images based upon their filename, surrounding text, alt text and page title.
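
A hedged example of those signals working together (the file name and copy are invented):

<img src="machu-picchu-sunrise-tour.jpg" alt="Sunrise over Machu Picchu on a small-group Peru tour" />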

So this is where your designer needs to consider including the keywords in the naming convention (referred to as the nomenclature) of the images across the website. Get your heads together and think about how you can attract interest and clicks to your website with imagery.