Monday, July 19, 2010

A look at how Search Engines work

Search Engines weigh page content heavily for searches that combine several words, i.e. queries of three words or more. The content of a page is a major factor in determining how that page is ranked, and the more words a user types into a search engine, the more likely the user is to be directed to a page deep within the site.

Obscure or “deep” searches account for approximately 60% of all searches made via search engines. It is therefore vitally important that copy is written to be readable by search engines as well as by humans, and in a way that is most likely to deliver traffic to the most relevant page within your site.

Search Engines use automated robots, also known as “spiders”, “bots”, “crawlers” or “indexers”, to find content on a site. They “spider” a site by following links from page to page, and in this way “index” the content of each individual page on the website. Webmasters can also submit site maps to Search Engines directly, via .xml or .txt feeds that list each URL on the site that they want indexed. When indexing a site via the robot method, Search Engine spiders look mainly at the following factors when deciding how deeply and how frequently to index a site (a simplified sketch of the crawl-and-index loop follows the factors below):

Is each of the site’s individual pages unique in its content, and clearly distinct from the other pages within the website? Is any of the site’s content duplicated from other pages on the internet?

Search Engines also look at the quantity and quality of inbound links to your site. As a rule of thumb, the more links you have from other high-quality sites, the more frequently and deeply your site will be indexed.
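
To make the crawl-and-index loop described above more concrete, here is a minimal sketch in Python of how a spider can follow a site’s internal links and record the content of each page it reaches. This is an illustration only, not how any real search engine is implemented: the start URL (https://example.com/), the page limit, and the idea of storing raw HTML in a dictionary are placeholder assumptions made purely for the example.

from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=50):
    """Breadth-first crawl: fetch a page, record its content, follow its internal links."""
    domain = urlparse(start_url).netloc
    queue = deque([start_url])
    seen = set()
    index = {}  # URL -> raw HTML; a real engine would tokenise, score and store this

    while queue and len(index) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip pages that fail to load
        index[url] = html

        extractor = LinkExtractor()
        extractor.feed(html)
        for link in extractor.links:
            absolute = urljoin(url, link)
            # Stay on the same site, mirroring how a spider maps a site's internal pages
            if urlparse(absolute).netloc == domain and absolute not in seen:
                queue.append(absolute)

    return index


if __name__ == "__main__":
    pages = crawl("https://example.com/")
    print(f"Indexed {len(pages)} pages")

The sketch restricts itself to links on the same domain, which is the part webmasters control: the more internal links that point to a page, and the more external links that point to the site, the more likely a real spider is to reach and re-visit that page.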
