Wouldn't it be nice if search engines could comprehend our impressions of search results and adjust their databases accordingly? Properly optimized web pages would show up well in contextual searches and be rewarded with favorable reviews and listings. Pages that were spam, or whose content did not properly match the query, would get negative responses and be pushed down in the search results. Well, this reality is much closer than you might think.
To date, most webmasters and search engine marketers have ignored or overlooked the importance of traffic as part of a search engine algorithm, and thus have not taken it into consideration as part of their search engine optimization strategy. That might soon change, however, as search engines explore new methods to improve their search result offerings. Teoma and Alexa already employ traffic as a factor in the presentation of their search results. Teoma incorporated the technology used by Direct Hit, the first engine to use click-through tracking and stickiness measurement as part of its ranking algorithm. More about Alexa below.
How can Traffic be a Factor?
Click popularity sorting algorithms track how many users click on a link and stickiness measurement calculates how long they stay at a website. Properly used and combined, this data can make it possible for users, via passive feedback, to help search engines organize and present relevant search results.
Click popularity is calculated by measuring the number of clicks each web site receives from a search engine's results page. The theory is that the more often a search result is clicked, the more popular the web site must be. For many engines the click-through calculation ends there. But for search engines that have enabled toolbars, the possibilities are enormous.
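To make the calculation concrete, here is a minimal sketch of click-popularity scoring in Python. The click-log format, query, and site names are illustrative assumptions, not any engine's actual implementation.

    from collections import defaultdict

    def click_popularity(impressions, clicks):
        """Click-through rate for each (query, url) pair shown on a results page."""
        shown = defaultdict(int)
        clicked = defaultdict(int)
        for query, url in impressions:
            shown[(query, url)] += 1
        for query, url in clicks:
            clicked[(query, url)] += 1
        return {key: clicked[key] / shown[key] for key in shown}

    # Hypothetical log: both sites were shown 100 times for the same query.
    impressions = [("financial planner", "siteA.example")] * 100 + \
                  [("financial planner", "siteB.example")] * 100
    clicks = [("financial planner", "siteA.example")] * 40 + \
             [("financial planner", "siteB.example")] * 5
    rates = click_popularity(impressions, clicks)
    print(sorted(rates, key=rates.get, reverse=True))  # siteA.example ranks first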
Stickiness measurement is a great idea in theory. The premise is that a user will click the first result and either spend time reading a relevant web page, or click the back button and look at the next result. The longer a user spends on each page, the more relevant it must be. This measurement goes a long way toward fixing the problem of "spoofing" click popularity results. A great example of a search engine that uses this type of data in its algorithms is Alexa.
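A rough sketch of how dwell time could feed such a score follows. The visit-log format and the 10-second "bounce" cut-off are assumptions made for illustration only.

    from collections import defaultdict

    BOUNCE_THRESHOLD = 10  # seconds; an assumed cut-off for "clicked straight back"

    def stickiness_scores(visits):
        """Average dwell time per URL, discounted by the share of quick bounces."""
        total_time = defaultdict(float)
        count = defaultdict(int)
        bounces = defaultdict(int)
        for url, seconds in visits:
            total_time[url] += seconds
            count[url] += 1
            if seconds < BOUNCE_THRESHOLD:
                bounces[url] += 1
        return {url: (total_time[url] / count[url]) * (1 - bounces[url] / count[url])
                for url in count}

    visits = [("casino-spam.example", 4), ("casino-spam.example", 6),
              ("planner.example", 180), ("planner.example", 240)]
    print(stickiness_scores(visits))  # the on-topic site scores far higher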
Alexa's algorithm is different from those of other search engines. Its click popularity algorithm collects traffic pattern data from its own site, from partner sites, and from its own toolbar. Alexa combines three distinct concepts: link popularity, click popularity, and click depth. Its directory ranks related links based on popularity, so if your web site is popular, it will be well placed in Alexa.
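One simple way those three signals might be blended into a single score is sketched below. The weights and field names are assumptions for illustration and do not reflect Alexa's actual formula.

    def combined_score(site, w_link=0.4, w_click=0.4, w_depth=0.2):
        """Weighted blend of the three signals; weights are illustrative only."""
        return (w_link * site["link_popularity"]
                + w_click * site["click_popularity"]
                + w_depth * site["click_depth"])

    sites = {
        "on-topic.example":  {"link_popularity": 0.7, "click_popularity": 0.6, "click_depth": 0.8},
        "off-topic.example": {"link_popularity": 0.2, "click_popularity": 0.5, "click_depth": 0.1},
    }
    print(sorted(sites, key=lambda s: combined_score(sites[s]), reverse=True))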
The Alexa toolbar doesn't just allow searches; it also reports on people's Internet navigation patterns. It records where people who use the Alexa toolbar go. For example, the technology can build a profile of which web sites are popular in the context of which search topic, and display the results sorted according to overall popularity on the Internet.
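The kind of per-topic profile this makes possible could look something like the sketch below, assuming hypothetical toolbar records of (search topic, site visited) pairs.

    from collections import Counter, defaultdict

    def topic_profiles(toolbar_records):
        """Group toolbar visits by search topic and rank sites within each topic."""
        profiles = defaultdict(Counter)
        for topic, url in toolbar_records:
            profiles[topic][url] += 1
        return {topic: [url for url, _ in counts.most_common()]
                for topic, counts in profiles.items()}

    records = [("financial planner", "planner.example"),
               ("financial planner", "planner.example"),
               ("financial planner", "bank.example"),
               ("online casino", "casino.example")]
    print(topic_profiles(records))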
For example, a user clicks a link to a "financial planner", but the web site's content is an "online casino". They curse for a moment, sigh, click back to the search results, and look at the next result; that web site gets a low score. The next result is on topic, and they read four or five pages of content. This pattern is clearly identifiable, and Alexa uses it to help sort results by popularity. The theory is that the more page views a web page has, the more useful a resource it must be.
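A toy scoring rule for a single session, assuming hypothetical inputs of seconds before the back button was pressed and pages viewed on the site, might look like this.

    def score_session(seconds_before_back, pages_viewed):
        """Turn one user session into a relevance vote for the clicked result."""
        if seconds_before_back is not None and seconds_before_back < 10 and pages_viewed <= 1:
            return -1   # quick bounce back to the results page: likely off topic
        if pages_viewed >= 4:
            return 2    # deep read, like the four-or-five-page visit described above
        return 1        # stayed on the page; mildly positive

    print(score_session(seconds_before_back=5, pages_viewed=1))     # the "online casino" case: -1
    print(score_session(seconds_before_back=None, pages_viewed=5))  # the on-topic case: 2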
For example, follow this link today - http://www.alexa.com/data/details/traffic_details?q=&url=http://www.metamend.com/ - look at the traffic details chart, and then click the "Go to site now" button. Repeat the procedure again tomorrow and you should see a spike in user traffic. This shows how Alexa ranks a web site for a single day.