Friday, July 30, 2010

Using Google Advanced Search to diagnose your website


Using Google itself, armed with a few of its advanced search operators, can give you real insight into how your own site is performing from an SEO perspective.

Firstly, check that your site is free from any filtering issues by searching [yourdomain.com] in Google; if your site comes up #1 for this query, you are doing alright. If you see other sites mentioning your domain ranked first, that could be a bad sign.

Next, check for duplicate content problems. Search for a reasonably long extract of text from your site as an exact match (i.e. in quotes). Blogs, for example, often find the category page (which summarises the post) ranking instead of the corresponding post page.

Determine how many URLs from your site have actually been indexed by Google by searching [site:yourdomain.com], then "dig deeper" into the results by searching [site:yourdomain.com/subdirectory1], then [site:yourdomain.com/subdirectory2] and so on; the deeper you dig, the more accurate the counts become.

It is also worth checking whether your site has canonical problems, where non-www URLs are listed even though you have chosen to present the www version as primary. Do this by searching [site:yourdomain.com -inurl:www] and see if any non-www URLs are picked up.
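If that check does turn up stray non-www URLs, one lightweight remedy (a minimal sketch, assuming www is your preferred version and the page path is a placeholder) is to add a rel="canonical" link element in the head of the affected pages, pointing at the preferred URL:

<link rel="canonical" href="http://www.yourdomain.com/page.html" />

A server-side 301 redirect from the non-www host to the www host is the stronger fix; the canonical link is simply the lightest-touch hint.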

The following are search queries to use in identifying the more powerful pages of your site:
[www site:yourdomain.com]
[tld site:yourdomain.tld]
[inurl:domain site:yourdomain.com]
[domain site:yourdomain.com]

Leading on from this, you can then identify the most powerful pages of your website (keyword-dependent) by using the search query: [site:yourdomain.com inanchor:keyword]

To then determine sites with the most potential for promotional opportunity use:
[site:yourdomain.com inanchor:”key * word”]
[site:yourdomain.com intitle:”key * phrase”]

To find the pages of your site most relevant to a specified term, and therefore best suited to additional promotion for it, use the search query [site:yourdomain.com keyword] or [site:yourdomain.com key * phrase].

Crawl and index frequency by Google can be gauged using the search query [site:yourdomain.com] and then playing around with the "date range" advanced search option.

Check who, and what, your site is associated with in the SERPs by using the search query [related:yourdomain.com]. This helps identify your site's co-citation, i.e. who else your promoters link to.

Review your competition's backlinks by using the linkdomain command: [linkdomain:competitor.com -site:competitor.com]

To narrow this down further and to see who links to all of the competitors you're targeting, you can use a search query such as: [linkdomain:competitor1.com linkdomain:competitor2.com linkdomain:competitor3.com].

It is probably worth refining this query to exclude your own site, giving something like: [linkdomain:competitor1.com linkdomain:competitor2.com linkdomain:competitor3.com -linkdomain:yoursite.com].

Thursday, July 29, 2010

RSS Marketing

RSS (Really Simple Syndication) - How RSS Will Increase Your Business Success

Improve Your Search Engine Rankings

RSS will improve your rankings in the most important search engines and, at the same time, generate a new stream of traffic for your web site through RSS-specific search directories and engines.

RSS can and will generate completely new traffic for you and help you increase the power of your existing traffic sources.

This medium is also beneficial in that it will get you listed almost immediately, even at directories like Yahoo!, and will get you highly ranked positions for your most important keywords and phrases.

And the best part: it's free and quick to set up.

100% Content Delivery - Avoiding the perils of email marketing

Forget about spam filters that are keeping your content away from your subscribers. RSS gets 100% of your content delivered. This works for direct marketing messages, e-zine publishing, customer support and so on.

With RSS you can even easily deliver daily or hourly news to your subscribers, and everything else as well. RSS will help you expand your content delivery to daily content updates, content updates by interest, content updates for different target audiences and so on. Use one RSS feed to deliver your daily news and the other to deliver in-depth articles, and so on...
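For anyone who has not looked inside a feed before, here is a minimal sketch of an RSS 2.0 feed with a single item; the titles and URLs are placeholders, not a real feed:

<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Site News</title>
    <link>http://www.example.com/</link>
    <description>Daily news and in-depth articles from Example Site</description>
    <item>
      <title>Example article title</title>
      <link>http://www.example.com/articles/example-article.html</link>
      <description>A short summary of the article for feed readers.</description>
      <pubDate>Fri, 30 Jul 2010 09:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>

Each separate feed (daily news, in-depth articles, and so on) is simply another file like this with its own channel title and items.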

So when speed to market is an important consideration for your marketing message, RSS can deliver!

And don't forget about the power of RSS personalization and autoresponder messages, which are already possible.

Discover New Marketing Opportunities

New times bring new marketing opportunities, and RSS is the leader in this area as well.

Use RSS to increase the sales of your affiliates by providing them with RSS feeds to better promote your products.

Launch your own RSS product feeds and digital catalogues (Amazon started publishing those not long ago) that bring your products directly to your recipients' desktops. Amazon is doing this, but not many others are yet. Be among the first and get the lion's share of your target market. It is great for content-specific subject areas that users wish to focus upon without all the other distractions.

Market through branded RSS aggregators and establish a constant connection with your subscribers. It's just like having a branded e-mail client that your customers and prospects are using every day ... but much cheaper.

Find innovative ways of delivering your ads and direct marketing messages directly to your audiences, making sure they are actually read.

Autoresponders are a great marketing tool, but are becoming ineffective because your prospects just don't want to give you their e-mail addresses. RSS gives you this very same power, but without the fear people have with e-mail.

Use RSS to market to your existing customers, affiliates, business partners and employees/team members.

Publish your own "podcast", a special RSS feed which carries audio. Perfect for your own audio "radio station" or for getting your voice to your readers, without having to worry about overly large e-mail attachments. Podcasting has become so big that not only is every respected internet marketer using it, but so are huge corporations.

Use RSS to deliver latest posts and topics from your forums directly to your readers, to increase forum popularity and the quantity & quality of conversations.

Get Your Content Published On Other Web Sites

Get your content easily published on other web sites to reach new audiences and use their traffic to increase your own sales, as well as achieve greater recognition as an expert in your field.

Again, it's free and easy, and has the potential of bringing your content to thousands and thousands of new prospects, who are ready to buy now.

Generate New Subscribers More Easily

Internet users are reluctant to subscribe to any more e-mail newsletters, but because RSS is so easy to control they don't have any problems with subscribing to new RSS feeds that match their interests.

That means that by having an RSS feed you can raise your visitor-to-subscriber conversion easily.

Just imagine how converting more visitors into subscribers will improve your long-term sales ... every additional subscriber you get means a new potential sale and a long-term relationship that could lead to dozens of sales over the long term, especially if you convert that subscriber into an affiliate.

If however you're not using RSS, these sales could easily be lost!

Don't Worry About Messages From Your Customers Not Reaching You

E-mail messages, yes, even those from your customers and their business enquiries, are often lost due to spam filters.

With RSS, no messages from your customers or prospects will ever get lost again. That means that you'll now capture every business enquiry and respond accordingly, turning it into a real sale.

Don't forget to measure and optimise your feeds!

Wednesday, July 28, 2010

Business Research - Brand names in Social Media

Are you in the early stages of a business start-up that will be considering online activities now or in the future? There are many factors to weigh in your business research, but one area that should get some attention is brand management in the online space, particularly in social media.

If your brand name is not available on Twitter or Facebook, and the domain name is not available in the .com and local domain space, I would suggest that your naming conventions need some serious rethinking.

Two useful resources to assist in your research are:

Knowem? http://knowem.com/  
Namechk http://namechk.com/  

Promote your brand consistently by registering a username that is still available on the majority of the most popular sites.

Pre-emptive action for online brand management

35 percent of the biggest companies own negative domains such as COMPANYNAMEsucks.com or IhateCOMPANYNAME.com (.net, .org or similar). This is a pre-emptive strike for online brand management damage prevention.

FairWinds' analysis of 1,058 domain names for companies on the Global 500 and Fortune 500 lists showed that, of the companies surveyed, 35% own the domain name for their brand followed by the word "sucks." They include Wal-Mart Stores, Coca-Cola, Toys"R"Us, Target and Whole Foods Market, according to FairWinds. Some 45% of these domains have yet to be registered by anyone.

Many companies that own these domain names publish no content on them, but simply park and hold them; others take a more proactive approach and include support and customer help details or FAQs.

Some companies have been much more aggressive than others. Xerox, for example, has bought or registered about 20 unflattering domain names, including xeroxstinks.com, xeroxcorporationsucks.com and ihatexerox.net.

In an ideal world we would never have to consider these options, but unfortunately comments on the internet last forever, particularly in search engines, so anyone who knows how to use a search engine to research a company or individual is bound to find negative sentiments in the SERPs.

NoScript tag for Javascript

The <noscript> tag can be a great way to provide both users and search engines with some information about your page when JavaScript is not available. The contents of the noscript tag could provide a description of what the JavaScript element does, along with a link to a text-only version of the same content. For sophisticated AJAX applications, I would recommend using progressive enhancement techniques (like Hijax).
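A minimal sketch of the idea (the script name, description and text-only URL below are placeholders, not real files):

<script type="text/javascript" src="gallery.js"></script>
<noscript>
  <p>This page uses JavaScript to display an interactive product gallery.
  <a href="/products/text-only.html">View the text-only product list</a>.</p>
</noscript>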

Tuesday, July 27, 2010

SEO Map Listings on Search Engines

http://maps.live.com/
http://au.maps.search.yahoo.com/search/maps
http://maps.google.com.au/maps?hl=en&tab=wl

Five Ways to Use Windows Live Search Maps

Windows Live Search Maps is a no-cost global mapping and search service that lets you search specific locations for businesses, people, and detailed maps. You can see overhead or 3-D views of many major cities.

Here are five ways Windows Live Search Maps can help your business.

1. Add your business to Live Search Maps results. Say you run an art supply shop. When map users type in " art supplies " and the name of your city, they should be able to find you. So add your business listing to the online directory used by the mapping service. Live Search Maps provides an online form that you can fill out and submit that includes your address, telephone number, and other basic business information.

After your information is added to the Live Search Maps business directory database, users searching it for your type of business in a certain locale will see your business marked on the map. When they place the cursor on the marker, your address and telephone number are displayed.

2. Add a driving directions link on your website. Your website may have general instructions on how to get to your place of business. However, these directions can never take into account where every customer is coming from. The interactive capabilities of Live Search Maps enable the map user to pinpoint your location and their location and then find detailed driving instruction between the two points. This can be a great comfort and convenience for those trying to find your workplace in an efficient manner.

3. Locate potential customers. Use Live Search Maps to locate the types of business in your area that use your services and products. Many companies prefer to do business with suppliers and service providers that are nearby.

4. Discover where your competitors are. Every business needs to keep track of its competition. Use Live Search Maps before you launch or expand your business to search for similar businesses near where you want to set up shop. Even after you have established your location, search your area periodically for new competition that has moved in. Visit their locations to discover how you might differentiate your business from theirs.

5. Discover areas that need your service. If you want to add new business, use Live Search Maps to pinpoint areas where your type of service is in limited supply or not offered at all.

Live Search Maps not only offers a great way to add visibility for your business, but also to gain insight into new business opportunities.

Adjust crawl speed for MSNBot

If you feel that MSNBot is crawling your site too frequently, you can use the crawl-delay directive in robots.txt, placed within the msnbot user-agent record. Please refer to the MSNBot support page for more information. Here are a few recommended settings:

Slow (wait 5 seconds between each request):

User-agent: msnbot
Crawl-delay: 5

Really Slow (wait 10 seconds between each request):

User-agent: msnbot
Crawl-delay: 10

Note that setting the crawl delay reduces the load on your servers, but it also increases the amount of time it will take MSNBot to index your website (proportional to the length of the delay), and may make it more difficult for your customers to find your site on Live Search.

Another great way to reduce the impact of MSNBot on your website is to enable HTTP Conditional GET and HTTP Compression as outlined in our prior blog post.

Monday, July 26, 2010

Search Engine Optimization and LSI

Many people are confused by search engine optimization and LSI. What is LSI and what has it to do with SEO? In fact nothing. That’s right – nothing. Zilch. Absolutely nothing whatsoever. So why all the fuss? Through ignorance and possible subterfuge.

LSI stands for Latent Semantic Indexing, a term that has no meaning whatsoever. There is a term called 'latent semantic analysis', which is the analysis of the hidden meaning of text by examining the way the other words in the passage are used. Each word is interpreted in the context of the words around it, and its intended meaning identified from that context. There are certain known juxtapositions of certain words that provide a meaning to those words, although all may not be what it seems.

For example, if I 'bought an apple', did I buy a computer or a piece of fruit? That only becomes apparent when the rest of the semantics of the passage is analyzed. The word semantics means the meaning of words in the way that they are used. For example, "I bought a dog lead for my German shepherd" indicates that the German shepherd is a dog, and not a Teutonic gentleman looking after sheep. It is the rest of the semantics in the text that makes it clear what the term means.

Take another example: “spiders crawl the web looking for ….”. What does this actually mean? Is the passage referring to arachnids seeking flies or search engine spiders looking for specific keywords? Only the remainder of the text will make this clear. What does the term ‘the history of locks’ refer to? If you came across a website with that title, what topic do you think it would cover? The history of canal locks? The history of door locks? Only an analysis of the rest of the text in the page would tell you that, and that is what LSI is. It is not ‘latent semantic indexing’ but ‘latent semantic analysis’.

What the term relates to is the ability of search engine algorithms to determine the relevance of your content to the search term used by the search engine user. Someone searching for the "history of English canal locks" does not want to be presented with a website entitled "the history of locks" that is an authority on jail locks. The algorithm analyses the text on the page and decides the topic from the words used.

It also ensures that the content of the page is relevant to the keyword, and does not only consist of a continuous repetition of the keyword, as many sites did prior to Google’s algorithm change. In fact, this type of analysis put paid to the software that generated thousands upon thousands of websites and page content from a single template containing no content at all to speak of but a form of words into which any keywords could be entered. Thus, “information on dogs is available all over the internet, and dogs are a popular topic with many websites providing information on dogs”.

Any keyword you can think of could be used in place of 'dogs', and the content provides no information whatsoever. That is typical of the reason why Google introduced this kind of semantic analysis, even though the term is meaningless to the webmaster. Apart from the fact that it should really be termed latent semantic analysis, LSA, and not LSI, it is an analytical technique, and analytical techniques cannot be used by anybody to improve their website. You cannot make a website LSI compliant, but you can write a web page with good content that is relevant to your page title. You can include varied information and good English, rather than endless repetitions of the keyword, and so allow the search engine spiders to understand what your web page is about.

Sunday, July 25, 2010

Twitter Tools, Clients, Plug-ins and Analytics

Twitter Clients

TweetDeck
http://www.tweetdeck.com/beta/
Sleek Adobe Air desktop client for Twitter offering a good overview with several panes.

twhirl

The social software client
http://www.twhirl.org/
One of the more popular Twitter clients.

Spaz: An Open-Source Twitter Client for Mac OS X, Windows and Linux
http://funkatron.com/spaz
Cross platform open source Twitter client.

Hahlo

http://hahlo.com/resources/
Web based iPhone (and iPod Touch) optimized Twitter app.

Twinkle - New iPhone Twitter Client Uses Locate Me Features!

Just Another iPhone Blog
http://justanotheriphoneblog.com/wordpress/2008/04/09/twinkle-new-iphone-twitter-client-uses-locate-me-features/
Twitter client for the iPhone that allows you to socialize with people near you. Great for conferences it seems.

Twobile: A Twitter client for Windows Mobile - Download Squad

http://www.downloadsquad.com/2008/04/02/twobile-a-twitter-client-for-windows-mobile/
Twitter client for Windows Mobile powered phones. Most smartphones other than the iPhone, BlackBerry and Google Phone use Windows Mobile.

Misc. Twitter Tools

Twellow :: Twitter Search Directory, Twitter Search Engine

http://www.twellow.com/
Twellow is the “Yellow Pages” of Twitter.

iTweet 2 : Web

http://itweet.net/web/
This is an alternative web-based interface for Twitter, and indeed it's a little more usable than the default one; for instance, it offers one-click retweets and makes bio links clickable.

TwitterCounter: How popular are you on Twitter?

http://twittercounter.com/
This is a Twitter followers counter similar to the Feedburner count for blog subscribers.

Magpie: Make Money on Twitter

http://mashable.com/2008/10/31/magpie/
Magpie is an ad network for Twitter. It boasts that users can make something like $50 to $200 a month just by tweeting.

About crowdstatus :: Crowdstatus.com

http://crowdstatus.com/about.aspx
This tool allows you to create address groups of people at Twitter and notify all of them at once.


Twitter WordPress Plugins

Twitter for WordPress - Rick’s HideOut

http://rick.jinlabs.com/code/twitter/
Very basic but unobtrusive way of including your Tweets in your WordPress blog.

Wordpress Twitter Widget

http://xavisys.com/2008/04/wordpress-twitter-widget/
Clean and simple Twitter widget for the WordPress sidebar.


WP to Twitter
Joe Dolson Accessible Web Design
http://www.joedolson.com/articles/wp-to-twitter/
Twitter updater plugin using the Cli.gs short URL service for tweeting your posts.

Adnan`s Crazy Blogging World » Blog Archive » My blog gets twitterized

http://kadnan.com/blog/2007/11/12/my-blog-gets-twitterized/
Basic and quite ugly but very popular Twitter plugin many bloggers use, even TechCrunch.

AJAX Twitter plugin for Wordpress

http://kish.in/ajax-wordpress-twitter-plugin/
Advanced AJAX powered widget for your blog not only displaying tweets but letting you send updates from your blog.

Twitter Updater » Fireside Media Development Blog

http://www.firesidemedia.net/dev/software/wordpress/twitter-updater/
This tool lets you tweet your blog posts automatically.

Firefox Extensions for Twitter

TwitterFox – naan studio

http://www.naan.net/trac/wiki/TwitterFox
Simple but effective and popular Twitter Firefox add on.

TwitBin - twitter your browser - twitbin.com

http://www.twitbin.com/
Even simpler Twitter add on for Firefox.

TwitKit

http://engel.uk.to/twitkit/features/
Another Twitter Firefox client with more features though.

TwitterBar :: Firefox Add-ons

https://addons.mozilla.org/en-US/firefox/addon/4664/
Lets you post from the address bar of your browser.

Twitter Social News

Twitturly - Real-time Link Tracking on Twitter

http://twitturly.com/
Digg-like interface for the currently hot tweets.

MicroBlogBuzzes of the last 24 hours

http://www.microblogbuzz.com/
Shows you what’s most popular today, this week etc. on Twitter and across the other common microblogging platforms.

Twitturls - Popular Twitter Links Tweeted err Twittered err Twhatever

http://twitturls.com/
Shows the latest and most popular links on Twitter.

ReadBurner: What’s Shared on the Web

http://readburner.com/
Lets you monitor the buzz around Twitter elsewhere among many other memes.

POPrl.com / Shrinking popular URLs since 1973 / What’s POPular

http://poprl.com/
Short URL service like TinyURL but better. Offers not only stats but also a Digg-like interface for the most popular URLs shared.

Twitter Analytics

TweetStats :: Graphin’ Your Stats

http://tweetstats.com/
This statistic tool measures everything from when you tweet (weekdays, time of day) to who your real friends are by counting how often you address people.

Twitstat.com - Twitter Analytics

http://www.twitstat.com/cgi-bin/view.pl?search=interface
Both a Twitter search engine and popularity stats at the same time (e.g: showing most active users).

Twitterverse

http://www.twitterverse.com/
A Twitter keyword tag cloud for quick overview on what’s going on.

Twist - see trends in twitter

http://twist.flaptor.com/?tz=1
Twist allows you to quickly view and compare popularity trends on Twitter. It's similar to Google Trends.


FriendOrFollow.com - Who’s not following you back? Who aren’t you following?

http://friendorfollow.com/
This tool compares your list of friends with your followers and shows you who does not follow you back.

Twitter Twerp Scan

http://twerpscan.com/
Gives you a quick overview about your followers so that you don’t have to click each one.

Saturday, July 24, 2010

Useful tips for running a Travel PPC Campaign

Nearly half of all travel searches are brand related; can you afford to miss out on all this traffic? 36% of people who buy holidays use a brand search first, and use a brand search immediately before purchasing, so bid on branded keywords in your PPC campaigns.

Use day parting for PPC; people are 30% more likely to purchase a holiday on a Monday or Tuesday. Increase your bids then to capture this traffic and lower them at the weekend. Only 7% of purchasers buy a holiday on a Saturday.

Get them early; 15.9% of purchasers buy their holiday from the first site they visit. Only 1.6% will buy immediately, but around 14.3% will return at some point for a conversion (Don't forget to get the user subscribed to your regular newsletter!). Forget what you’ve learnt about the buying cycle; bidding on keywords that customers use in the research phase can get you a 15.9% conversion rate!

Make sure your URLs are memorable; 35% of transactions occur without a search on the same day. These people must have seen something they liked, then gone away to think about it. Make sure they can remember where they were.

Destinations aren’t as important as you think. 45% of online travel purchases are made without a destination search. Of course this means that 55% do use a destination related search term but I used to think that just about everybody would search for their destination at some point.

Save some money for January. For the last few years there has been a massive peak in travel searches every January. Look on Google Trends with the travel query of your choice. Or don’t; trust me, there will be a peak in January.

Ad variations are always a bit of a mystery. Test everything. Misspellings can garner some superb traffic – an ad targeting "hotels" misspelt as "hotsel" turned out to have a (statistically) significantly better CTR. I thought I'd found something great, so I rolled similar variations out across other ad groups. A few weeks later I checked to see what was going on, using splittester [http://www.splittester.com/] to judge which results were significant. In some ad groups it was better, in some it was worse. I have no idea why. Test everything all the time.

Most purchasers will visit your site at least twice before purchasing; make repeat visits more likely by including new and interesting content for them.

Be patient. You’ve made all these changes, but on average it takes 29 days between first search and transaction for a holiday buyer. 30% of purchases occur more than 6 weeks after the initial search.

Don’t want to be patient? Want to get the 17% of users who purchase after only one search? Then ideally you’re from easyjet, ryanair or some other well known airline. Branded searches tend to convert quicker (63% of single search transactions are branded) so build your brand if you want the shortest gap between click and conversion.

Website Review Tools

Website Grader from HubSpot

Website Grader is one of my favorite tools because of how helpful and usable it is. You’ll get a lengthy report broken into various sections with an evaluation of the page and recommended changes. While the grade is helpful to know where you stand, the suggestions are more valuable because they help you to identify areas for improvement, and many of them can be pretty simple.

http://www.websitegrader.com/tabid/6956/Default.aspx


Trifecta from SEOmoz

Quite a unique tool, Trifecta will analyze a page, a blog, or an entire domain based on slightly different criteria. Trifecta will produce numbers based on a variety of factors and it will give you an overall score. Without a pro membership you’re limited to one report per day.

http://www.seomoz.org/trifecta


Spider Simulator from Summit Media

This tool will give you a good idea of how search friendly your site is, and it will also give you a percentage rating. It bases the rating on factors like meta tags, use of headers, images and alt tags, load time, and links.

http://tools.summitmedia.co.uk/spider/


Web Page Analyzer from WebsiteOptimization.com

This free tool will give you plenty of information to work with. It will test how long the page takes to load, how many objects are on the page, the size of the objects and more. The most helpful part of the report that is produced is the “Analysis and Recommendations” section where it will list 11 aspects of the page and give you a rating. Red items are warnings, yellow items are cautions, and green items are good.

http://www.websiteoptimization.com/services/analyze/

Friday, July 23, 2010

Image Naming Conventions

Ensure that the images you save for a page have the relevant keywords in their file names.

Keywords for a page should be focused on a cluster of 5 keywords that map to the primary focus or goal of that page. Image search does bring people to the site, although realistically only in small numbers, and it also forms part of the branding exercise. So, for image naming conventions: keep text lowercase, use hyphens (or underscores) to separate words, and put location-based or product-based keywords into the name of the image.

The Design team should use user friendly readable names for content display images, and shortened names for design images.

For instance the name and file structure for an image that is the flag of France should be:

images/flags/flag-france.jpg

Whereas an image that is of the background for a div should be:

images/content/bg-div.gif
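A small illustrative example of how the content image might be referenced, with keyword-bearing alt text (the alt text here is an assumption, not from the original):

<img src="images/flags/flag-france.jpg" alt="Flag of France" />

Design-only images such as bg-div.gif are typically referenced from the CSS as a background-image rather than with an <img> tag, so they need no alt text.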

Tagging

There were two significant early adopters in tagging. These are del.icio.us and furl.net. These sites make it possible for users to 'tag' any web page.

Del.icio.us describes itself as "a social bookmarks manager". Del.icio.us describe tags themselves as "one-word descriptors that you can assign to any bookmark." Furl.net doesn't define tags, but says it is "a free service that saves a personal copy of any page you find on the Web, and lets you find it again instantly by searching your archive of pages. It's your Personal Web."

Both sites do basically the same thing, allowing users to put a label on a webpage that they have visited, so that they can easily find it again. Users have the option of making their tags public or private (where only the person themselves can see what they have previously tagged) or they can share tagged site information with other individual members. Where the tags are public, other visitors can then see the tags that have been assigned to particular sites by users.

From the perspective of page design this becomes a useful way of summarising the content of a page and developing the keyword cluster focus for its content. Ideally the page design or CMS should allow the administrator to tag pages with relevant keywords. These keywords can then act as a primary search parameter for site searches, can be adjusted as the page changes, and act as a search mechanism in themselves by surfacing similarly tagged pages. The specified tags should appear at the end of the content on the page as such:

“Tags: seo, sem, optimisation, search, organic search”
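In the page markup those tags would typically be output as links to tag (or site search) pages, for example (a sketch only; the /tags/ URLs are hypothetical):

<p class="tags">Tags:
  <a href="/tags/seo/">seo</a>,
  <a href="/tags/sem/">sem</a>,
  <a href="/tags/optimisation/">optimisation</a>,
  <a href="/tags/search/">search</a>,
  <a href="/tags/organic-search/">organic search</a>
</p>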

List of Semantic Code Elements

ABBR and ACRONYM: For abbreviations and acronyms. They take a title attribute which you can use to give the full meaning of the abbreviated word or acronym.

CITE: Citation, used to cite a source of information.

CODE: Computer or Programming code.

DEL: Deleted word or phrase.

DFN: Definition.

DL: Definition List. Similar to UL and OL, but uses DT (Definition term) and DD (definition description) to show terms and definitions.

EM: Emphasis, displayed as italicized text.

INS: Insert, used to display text you have inserted due to an edit at a later date.

KBD: Keyboard instructions.

OL: Ordered List.

SAMP: Sample output, used to show sample output from programming code.

STRONG: Strong, or bold, emphasis on a word or phrase.

UL: Unordered List.

VAR: Variable, used to represent a variable in programming code.
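A short example pulling a few of these elements together (the content is invented purely for illustration):

<p><abbr title="Search Engine Optimisation">SEO</abbr> relies on well-structured, semantic markup;
see <cite>the HTML 4.01 specification</cite> for the full element list.</p>
<dl>
  <dt>Crawling</dt>
  <dd>The process of a search engine requesting and downloading a URL.</dd>
  <dt>Indexing</dt>
  <dd>The result of successful crawling: the URL appears in the engine's index.</dd>
</dl>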

Unordered Lists

Dot points (<ul>) on the page are great, as they accentuate the words you are highlighting, so again, try to use them in your page design and include your primary 5 keywords within them.

Make an unordered list which reflects the 3 most relevant topics of the page you're tweaking and put it at the top, introduced with the words "this article deals with…"

Don't use <br> to separate list items. Instead use the <ol> tag with <li> elements for ordered lists, and the <ul> tag with <li> elements for unordered lists.
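A brief sketch of that pattern (the topics are placeholders chosen for illustration):

<p>This article deals with:</p>
<ul>
  <li>cheap flights to Bali</li>
  <li>Bali accommodation deals</li>
  <li>Bali travel insurance</li>
</ul>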

Thursday, July 22, 2010

Bold Text and Links

A keyword that appears in emphasized text is often assumed to be a heading or a sub-title, and is thus often assigned a higher relevancy value.

Consider putting some of your keywords within the body text in-between <b>...</b> or <strong>...</strong> tags, as text within links and formatting tags like bold (<b>) and strong (<strong>) is given a minor boost.

As appropriate, bolding certain words, and ensuring keywords are in links, is a useful way of boosting a site’s rankings for certain keywords.

Commonly used link text such as 'click here' is meaningless to search engines; it says nothing about the content of the following page. It is far better to use specific, keyword-oriented text links, replacing the 'click here' with meaningful text instead. Relevant everyday search phrases should be used in links where possible.

Hyperlink keywords to other relevant pages within the website: search engines look for keywords that are hyperlinked on each webpage. It is important to have high-value pay per click keywords hyperlinked to other pages within the website that relate to that specific keyword. This is known as internal linking, and it allows search engine crawlers to easily find other pages within the website during the crawling process.

Please note that for bold or emphasized text it is preferred to use <strong> or <em> instead of the less descriptive <b> and <i> tags. Wrap paragraphs in <p> tags, and never use <p> or <br> tags just for spacing. Use the margin and/or padding properties on the <p> element in your CSS to add visual spacing.
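A small hedged example contrasting the two approaches (the page names and keyword are invented):

<!-- Weak: generic anchor text, spacing forced with <br> -->
<p>To see our cheap flights to Perth, <a href="deals.html">click here</a>.<br /><br /></p>

<!-- Better: keyword emphasised and used as the link anchor text -->
<p>We list <strong>cheap flights to Perth</strong> daily; see the current
<a href="cheap-flights-perth.html">cheap flights to Perth deals</a>.</p>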

Wednesday, July 21, 2010

SEO with Breadcrumb Navigation

Breadcrumb navigation shows the user's path to their current location, and should be located at the top of the page.

Breadcrumb navigation is wonderful for both usability and for SEO. This text-based navigation shows where in the site hierarchy the currently viewed web page is located and your location within the site, while providing shortcuts to instantly jump higher up the site hierarchy.

Example from Google's Webmaster section:

Google Help > Help Center Home > My site and Google > Creating a Google-friendly site > Working with AJAX-enhanced sites

If the breadcrumb contains text links with relevant keywords in the anchor text, that is a significant SEO benefit. The anchor text provides the search engines with an important, contextual clue as to the topic of the linked page. That equates to improved rankings.

One throwaway phrase that's used almost universally within breadcrumbs is "home." Try revising that link to a more search-optimal version of the anchor text, using site-relevant words like "computing", "IT" or "technology", along with perhaps "store" or "products".

You should also consider the amplifying effect of breadcrumb navigation. A link in the breadcrumb will be “voted for” more times if that linked page is higher up in the site hierarchy and if there are more pages underneath that page in the hierarchy. A supercategory page receives more internal links than a subcategory page. A category page covering hundreds of products will receive more internal links than one with only a dozen products in the category.
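A hedged sketch of a keyword-rich breadcrumb trail for such a site (the category names and URLs are illustrative only):

<p class="breadcrumb">
  <a href="/">Computing Store</a> &gt;
  <a href="/laptops/">Laptops</a> &gt;
  <a href="/laptops/netbooks/">Netbooks</a> &gt;
  10-inch Netbook
</p>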

Breadcrumb Separator (for Horizontal Breadcrumbs)

Here is a summary of what designers tend to use as separators:

67% Right Arrow
10% Pipe
9% Colon
5% Slash
3% Left Arrow
3% Bullet
3% Text Treatment

Orientation of Breadcrumbs

95% Horizontal
5% Vertical

Heading Tags

Heading tags should be used, in descending priority of importance, to group content on the page, and the headings need to include one or more of the primary focus 5 keywords within the heading text.

Heading tags (<h1>, <h2>, etc) are given an additional boost in rankings by search engines. The reason for this is that headings traditionally carry important information about the structure of a page. For this reason, it is preferable to use headings to mark important elements in a document and to utilise keywords within the heading tags:
<h1>
<h2>
<h3>
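For example, a sketch of a keyword-carrying heading hierarchy (the keywords are invented for illustration):

<h1>Cheap Flights to Bali</h1>
<h2>Bali Flight Deals by Airline</h2>
<h3>Last-minute Bali Flights</h3>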

SEO for Page Headings

Page Headings need to contain keywords from the URL.

As search engines calculate the combined weight of keywords on the webpage, it is important to ensure that the keywords used in the URL are also used consistently in the page heading. This provides a compound effect that increases the relevancy of the webpage to search engines, which produces higher search engine rankings.

The following list summarises the important points for consideration:

1. Search engines generally ignore common words like "the", "a", "are", "is", "of", etc - your URLs will therefore do just fine without them.

2. Phrases like "How to do this" or "Which is the best" or "What are the options" or "When did this happen" make titles attractive but you need not put these words in the URL. For instance, the URL is "stop-junk-mail" for the article "How to Stop Junk Mail."

3. If something important doesn’t fit in the page title, put it in the URL. For instance, the URL of a page titled "Get your search fix with two videos" is like "free-search-seo-video" - so you have two new words ("free" and "seo") in the URL that were not in the title but do help in describing the underlying web page.

4. Use hyphens (or dashes) to separate keywords in URLs though Google can also read underscores.

Google probably assigns some fixed weight to your URLs which gets distributed across different words used in that URL. Now the weight per keyword will obviously dilute when you have long URLs.

Therefore, it will help if you can manually create URLs with a smaller number of keywords, while keeping them relevant to the context of your content. It requires a little extra effort at the time of writing your blog post, but it may reap good benefits in the long run.

Tuesday, July 20, 2010

Local Search using Phone Numbers

The use of telephone numbers to increase organic rankings in search engines is a technique that is being increasingly implemented.

Place the company phone number on web pages in the following locations:

1. Title, Description and Keyword Meta Tag.
2. Within the Website Address.
3. Within the body copy of the web-page.
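A minimal sketch covering placements 1 and 3 (the business name, city and number are placeholders):

<head>
  <title>Business Name | City | 0812345678</title>
  <meta name="description" content="Business Name in City. Phone 0812345678." />
  <meta name="keywords" content="business name, city, 0812345678" />
</head>
<body>
  <p>Call Business Name in City on 0812345678 to place an order.</p>
</body>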

An example of using the phone number, including the area code, is detailed below on Yellowbook.com for the Empress Food Products page at: http://www.yellowbook.com/5137711441/empress/ where the phone number is 513-771-1441.

Best practice here is to include the “business name”, “city” and “state” in the URL as well as the phone number so the URL would be as follows:

http://www.YourURL.com/Business-Description/City/Telephone-0812345678

Keywords in URLS - Information Architecture

I suggest the inclusion of high audience and pay per click keywords into the naming convention (referred to as the nomenclature) within the website. Aim to include real search phrases as part of a page’s URL structure.

Search engines like Google and Yahoo directly match keywords typed in by users and return relevant URLs in the search engine results. URLs which have keywords which match those typed into the search engine typically are returned in priority.

The Keyword Information Architecture approach has proven to increase search engine indexation for an entire website.

Methodology:


Add keywords to the URLs throughout the website according to keywords that are either:

1. High Price and Converting Keywords purchased on search engines such as Google.com
2. Keywords which refer users to the client website that lead to high time spent viewing
3. Keywords competitors are optimizing on
4. Keywords that result in qualified leads and closed orders from lead sources

Implementation:

The rationale for having keywords twice in the URL is to ensure search engines recognize this URL is more heavily weighted to these keywords and ensure higher search engine rankings.

Here is an example of how SearchForecast implemented keywords in TuVox Inc website.

The following keyword phrases and their estimated average CPC are sourced from Google.com:

hosted IVR: $15.01
IVR: $12.43
VoiceXML: $7.82
speech recognition software: $5.40
speech recognition: $5.38
VXML: $3.75

Source: Google.com

These keywords are then threaded into the URLs as below:

Products

On Demand Speech Apps
Former URL: http://www.tuvox.com/prod_ondemand.html
New URL: http://www.tuvox.com/ivr_solutions/ivr_speech_recognition/VoiceXML_VXML.html

Call Routing
Former URL: http://www.tuvox.com/prod_apps_routing.html
New URL: http://www.tuvox.com/ivr_solutions/ivr_speech_recognition/voice_self_service/call_routing.html

Where possible, use relevant high-yielding keywords when building descriptive URLs.

It's also important not to make URLs too long – keep them to fewer than 1024 characters. Use the home page, or main category pages, to target your primary, most competitive keywords; these have the best chance of ranking highly.

Page Naming Conventions

You'll want to ensure that the most popular areas of your site (ideally the pages you are optimizing) are featured in the main navigation that appears on every page of the site. The search engines rightly assume that the most important content of your site is in your main navigation, and therefore give extra weighting to those pages in their ranking formulas. Use your page's primary focus keyword as the filename for that page.

Search engines always take into account the title text and keywords used in the URL of a web page while determining rankings of that page in search results. The influence may be small but keywords mentioned in the URL do carry some weight.

For instance, if all other factors remain the same, a web page at abc.com/iphone-review may rank higher for the search query "iPhone review" than, say, xyz.com/best-phone or xyz.com/apple-phone-review, because of the keyword iPhone that's present in the URL.
*Most blogging platforms allow you to write custom URLs (aka post slugs in WordPress).

If you're keen to master the art of writing good URLs that are descriptive and search friendly, without getting into black hat SEO tricks like keyword stuffing: use descriptive titles for blog posts, keep URLs short and neatly written, and use only relevant keywords.

Some examples:

Title: I do not wish my screensaver to lock my computer, thank you.
URL: disable-screensaver-password

Title: Get your search fix with two videos
URL: free-search-seo-videos

Title: Protect yourself: get a free credit report
URL: free-credit-report

Title: How to back up your Gmail on Linux in four easy steps
URL: backup-gmail-in-linux-with-getmail

Title: How to stop junk mail
URL: stop-junk-mail

Title: I love my pedometer
URL: best-pedometer

Title: Crap. My Ubuntu machine won’t boot
URL: ubuntu-freeze-no-resume

Title: What are the best iPhone applications?
URL: best-iphone-application

URLs must contain keywords - www.yourURL.com/keywords1/keywords2.php

Search engines like Google and Yahoo directly match keywords typed in by users and return URLs which contain the keyword in the search engine results. URLs which have keywords which match those typed into the search engine typically are returned in priority.

Monday, July 19, 2010

Crawling and Indexing explained

The terms crawling and indexing (and indexing's cousin, caching) are frequently used together, but you should not consider them synonyms.

Crawling is the process of an engine requesting, and successfully downloading, a unique URL. Obstacles to crawling include no links to a URL, server downtime, robots exclusion, or using links (such as some JavaScript links) from which bots cannot find a valid URL.

Indexing is the result of successful crawling. I would consider a URL to be indexed (by Google) when an info: or cache: query produces a result, signifying the URL's presence in the Google index.

Obstacles to indexing can include duplication (the engine might decide to index only one version of content for which it finds many nearly identical URLs), unreliable server delivery (the engine may decide to not index a page that it can access during only one-third of its attempts), and so on.

What's the difference between crawling and indexing in terms of time? Watching a newly introduced URL to see when it would be indexed, the cache: query only showed results after 15 days, when it finally stopped saying "Your search - cache:[URL] - did not match any documents." What was interesting is that the cached copy showed the URL "as retrieved on" a date seven days earlier. So make special note that the URL was crawled and cached over a week before it appeared in the index.

A better, more comprehensive test would be to watch server logs and see how many times the file was requested, and with what frequency, between the original request date and the date at which the cache query showed results. Additional testing would try to detect ways to shorten that time by increasing the number (and prominence) of incoming links and so on.

Spider Simulators:

What you see as a visitor in your browser while watching any web site differs a lot from what the search engines spiders see when indexing your pages.

Find out what spiders see when they crawl your website by using these simulators:

Spider Simulator - SEO Chat
http://www.seochat.com/seo-tools/spider-simulator/

Spider View - Iwebtool
http://www.iwebtool.com/spider_view

Search Engine Spider Simulator - Anownsite
http://www.anownsite.com/webmaster-resources/search-engine-spider-simulator.php

SE Bot Simulator - XML Sitemaps
http://www.xml-sitemaps.com/se-bot-simulator.html

SE Spider - LinkVendor
http://www.linkvendor.com/seo-tools/se-spider.html

Spider Simulator from Summit Media
http://tools.summitmedia.co.uk/spider/

Depth of Page Location

Generally, the deeper a page’s location in the site’s structure, the less frequently it will be indexed (and thus Google’s index of that page’s content will be less up to date).

“Depth” of location refers to how deep the page is in the site, for example, if a page is found directly from a Home page link, it is not deep. If a Search Engine spider (or website user) has to drill down through four pages before it finds a link to the page, then that page is a “deep page”.

Also watch the use of JavaScript or server-redirect links: search engines cannot follow JavaScript pop-ups or server redirects (other than 301 server redirects). If you are linking to a page, choose standard HTML links wherever the CMS allows you to.

Caution on Content Automation spam

Are there “suspiciously large” page volumes or does content look suspiciously similar across a large number of pages? Many search spammers use Content automation, which is frowned upon by the Search Engines. It may result in many pages not being indexed.

How "fresh" is the site's content? Is it stagnant, or does it change regularly? The more frequently it changes, the more regularly the site will be indexed. However, pages that have not changed in years can still be the #1 result for hotly-competitive terms. This is usually the case where the page's information is not particularly time sensitive.

A look at how Search Engines work

Search Engines heavily weigh content for searches that use a combination of words, i.e. three word searches and above. The content on any page is a major factor in determining how any page is ranked. The more words a user types into a search engine, the more likely that the user will be directed to a page deep within the site.

Obscure, or "deep", search accounts for approximately 60% of all searches made via search engines. Therefore it is vitally important that copy is written in a way that is both search engine friendly and human friendly for readability, and is most likely to deliver traffic to the most relevant page within your site.

Search Engines use automated robots, also known as “spiders”, "bots", "crawlers" or "indexers" to find content on a site. They “spider” a site by following links to individual web pages and thus “index” all of the individual page content of a website. Webmasters are also able to submit to Search Engines specific site map lists of the content that they want indexed in the directory via .xml or .txt feeds that list each URL on the site. When indexing a site via the robot method, the Search Engine spiders look mainly at the following factors when deciding on how deeply and frequently to index a site:

Are the site’s individual pages unique in their content? Is each page uniquely different from other internal pages on the website? Is the site’s content duplicated from other pages on the internet?

Search Engines look at the quantity and quality of inbound links to your site. As a rule of thumb, the more links from other high-quality sites, the more frequently your site will be deep-indexed.

Sunday, July 18, 2010

SEO - What is it?

SEO is the process of identifying the keyword phrases that potential customers are using in search engines to find products and services, and ensuring that there are no obstacles preventing the website from capturing the maximum possible volume of relevant, income-producing traffic.

Organic traffic is the traffic derived from the natural search engine indexes; it cannot be bought in the way that Search Engine Marketing (SEM) campaigns such as Google's AdWords are. How a website ranks is determined by complex algorithms that each search engine applies (differently) to an index of all the known pages on the web. The purpose of these algorithms is to provide searchers with the most relevant pages for a particular search.

To achieve organic listings, as many pages of a site as possible need to be indexed by the search engine, and each page should be well optimised, through the words on the page (content) and the meta content behind the page, to match the criteria of the search.

All search engine algorithms currently employed, no matter how sophisticated and no matter how many additional factors they take into account, come back to indexing the words used on pages. If a site doesn't have one page that uses a specific word, it is virtually impossible for that site to rank organically for that phrase.

We measure success from targeted traffic and conversions. SEOs have traditionally measured success by tracking the rankings in the search engines for various keyword phrases. However, due to numerous factors such as personalized search, geo-targeted search and multiple search engine data centres, no two searches will show the same results.

In fact, it's common to do a Google search using a particular phrase in the morning, then perform the same search in the afternoon and see different results. Rankings are simply not a good measure of success. All the #1 rankings in the world won't mean a thing if a) you're the only one seeing those rankings, b) you're ranked for keyword phrases nobody is searching on, or c) the rankings bring web site traffic, but not from people interested in what you are selling. The fact of the matter is that rankings do not help your bottom line. Today, SEO success is measured by how much targeted traffic is delivered and, more importantly, how much of that traffic converts from visitors to buyers.

Setting the Scene

Hello! And welcome to the Blog of Matt Lynch.

I thought I would take this opportunity with my first post to introduce myself and set the scene...

Located in Perth, Western Australia, I have over twenty years' IT experience covering a range of disciplines, with a speciality in SEO (Search Engine Optimisation), which I'll focus my blog posts on, but don't be surprised to hear me talk about design and usability, coding, OSs, iPhones and sysadmin... and a few other geeky things in between.

You are also most welcome to follow along with my IT tweets on twitter: http://twitter.com/mattrlynch

My recent jobs have been with:

Best Flights
March 2001 - June 2010
IT Manager
Perth, Western Australia
Managed the Search Engine Optimisation (SEO), Search Engine Marketing (SEM) and Online Marketing, as well as the Information Technology & Communication requirements as the Senior System Administrator (Windows/Linux), for BestFlights.com.au, an online and call centre based Australian travel agency (100+ staff).

Dyma Designs
January 1996 - March 2001
Managing Director
Perth, Western Australia
Web development and IT company, specialising in SEO, SEM, Online marketing, Internet Marketing, emarketing, e-marketing and business start-ups. Client base of 500+ clients.

If you choose to follow my Blog you may find some of the information I post very useful as I explore the state of SEO and musings on my own technology experiences.