Google’s sweeping changes confirm that the search giant has launched a full-out assault against artificial link inflation and declared war on search engine spam in a continuing effort to provide the best search service in the world… and if you thought you had cracked the Google code and had Google all figured out, guess again.
Google has raised the bar against search engine spam and artificial link inflation to unrivaled heights with the filing of United States Patent Application 20050071741 on December 31, 2003. It became available online for the first time on March 31, 2005.
The filing unquestionably provides SEOs with valuable insight into Google’s tightly guarded search intelligence and confirms that Google’s information retrieval is based on historical data.
What exactly do these changes mean to you? Your credibility and reputation online are going under the Googlescope! Google defines its patent abstract as follows:
A system identifies a document and obtains one or more types of history data associated with the document. The system may generate a score for the document based, at least in part, on the one or more types of history data.
Google’s patent specification reveals a significant amount of information, both old and new, about the possible ways Google can (and likely does) use your web page updates to determine the ranking of your site in the SERPs.
Unfortunately, the patent filing does not prioritize or conclusively confirm any specific method one way or the other.
Here’s how Google scores your web pages.
In addition to evaluating and scoring web page content, the ranking of web pages is admittedly still influenced by the frequency of page or site updates. What’s new and interesting is what Google takes into account in determining the freshness of a web page.
For example, if a stale page continues to procure incoming links, it will still be considered fresh, even if the page’s Last-Modified header (which reports when the file was most recently changed) hasn’t been updated and the content itself is stale.
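To see what a crawler sees, you can inspect a page’s Last-Modified header yourself. Here is a minimal sketch using Python’s standard library; the URL is a placeholder:

```python
import urllib.request

def last_modified(url):
    """Fetch only the headers of a page and return its Last-Modified value."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        # The header may be absent on dynamically generated pages.
        return resp.headers.get("Last-Modified")

print(last_modified("https://www.example.com/"))
```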
According to their patent filing, Google records and scores the following web page changes to determine freshness (a purely illustrative scoring sketch follows the list):
·The frequency of all web page changes
·The actual amount of change itself (whether it is a substantial change or merely redundant and superfluous)
·Changes in keyword distribution or density
·The actual number of new web pages that link to a web page
·The change or update of anchor text (the text that is used to link to a web page)
·The number of new links to low-trust web sites (for example, a domain may be considered low trust for having too many affiliate links on one web page)
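The patent does not disclose how these signals are weighted or combined, but conceptually they could feed into a single freshness score. The sketch below is my own assumption-laden illustration of that idea; the signal names and weights are invented, not Google’s:

```python
# Purely illustrative: these weights and signal names are assumptions, not Google's.
FRESHNESS_WEIGHTS = {
    "update_frequency": 0.30,    # how often the page changes
    "change_magnitude": 0.25,    # substantial vs. superfluous edits
    "keyword_drift": 0.10,       # shifts in keyword distribution/density
    "new_inbound_links": 0.20,   # new pages linking to this one
    "anchor_text_changes": 0.10, # updated link text pointing here
    "low_trust_links": -0.25,    # new links to low-trust sites count against you
}

def freshness_score(signals):
    """Combine normalized signals (each 0.0 to 1.0) into one weighted score."""
    return sum(FRESHNESS_WEIGHTS[name] * value
               for name, value in signals.items()
               if name in FRESHNESS_WEIGHTS)

print(freshness_score({
    "update_frequency": 0.8,
    "change_magnitude": 0.6,
    "new_inbound_links": 0.9,
    "low_trust_links": 0.7,  # many new affiliate-heavy links drag the score down
}))
```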
Although no specific number of links is indicated in the patent, it might be advisable to limit affiliate links on new web pages. Caution should also be used in linking to pages with multiple affiliate links.
Developing your web pages for freshness.
Now, I’m not suggesting that it’s always beneficial or advisable to change the content of your web pages regularly, but it is very important to keep your pages fresh, and that may not necessarily mean a content change.
Google states that decayed or stale results might be desirable for information that doesn't necessarily need updating, while fresh content is good for results that require it.
How do you unravel that statement and differentiate between the two types of content?
An excellent example of this methodology is the roller coaster ride that seasonal results may experience in Google’s SERPs based on the actual season of the year.
A page related to winter clothing may rank higher in winter than in summer… and the geographical area the end user is searching from will now likely be considered and factored into the search results.
Likewise, specific vacation destinations might rank higher in the SERPs in certain geographic regions during specific seasons of the year. Google can monitor and score pages by recording click-through rate changes by season.
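One plausible reading is that Google compares a page’s current click-through rate against its historical baseline for the same season. The following sketch is my own simplification of that idea; the CTR figures are fabricated for illustration:

```python
from statistics import mean

# Fabricated click-through rates by season for a winter-clothing page.
historical_ctr = {
    "winter": [0.14, 0.15, 0.13],  # CTRs observed in past winters
    "summer": [0.03, 0.04, 0.02],  # CTRs observed in past summers
}

def seasonal_boost(season, current_ctr):
    """Ratio of current CTR to the seasonal baseline; > 1.0 suggests the
    page is unusually relevant right now."""
    baseline = mean(historical_ctr[season])
    return current_ctr / baseline

print(seasonal_boost("winter", 0.16))  # ~1.14: modest in-season lift
print(seasonal_boost("summer", 0.16))  # ~5.33: anomalous out-of-season spike
```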
Google is no stranger to fighting spam and is taking serious new measures to crack down on offenders like never before.
Section 0128 of Google’s patent filing claims that you shouldn’t change the focus of multiple pages at once.
Here’s a quote from their rationale:
"A significant change over time in set of topics associated with a document may indicate that document has changed owners and previous document indicators, such as score, anchor text, etc., are no longer reliable.
Similarly, a spike in number of topics could indicate spam. For example, if a particular document is associated with a set of one or more topics over what may be considered a 'stable' period of time and then a (sudden) spike occurs in number of topics associated with document, this may be an indication that document has been taken over as a 'doorway' document.
Another indication may include sudden disappearance of original topics associated with document. If one or more of these situations are detected, then [Google] may reduce relative score of such documents and/or links, anchor text, or other data associated document."
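Stated as code, the "topic spike" test described in that passage might look like the sketch below. This is my reconstruction, not Google’s implementation, and the spike threshold is an assumption since the patent gives no numbers:

```python
def looks_like_doorway_takeover(topic_history, spike_factor=2.0):
    """Flag a document whose topic set suddenly balloons or whose original
    topics vanish, per the behavior described in section 0128.
    topic_history is a chronological list of topic sets, one per crawl."""
    if len(topic_history) < 2:
        return False
    stable, latest = topic_history[-2], topic_history[-1]
    spiked = len(latest) > spike_factor * max(len(stable), 1)
    originals_gone = bool(stable) and not (stable & latest)
    return spiked or originals_gone

history = [{"winter coats", "ski gloves"},
           {"winter coats", "ski gloves"},
           {"casino", "pills", "loans", "ringtones", "mortgages"}]
print(looks_like_doorway_takeover(history))  # True: topic spike + originals gone
```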
Unfortunately, this means that Google’s sandbox phenomenon and/or aging delay may apply to your web site if you change too many of your web pages at once.
From the case studies I’ve conducted, it’s more likely the rule and not the exception.
What does all this mean to you?
Keep your pages themed, relevant and most importantly consistent. You have to establish reliability! The days of spamming Google are drawing to an end.
If you require multi-page content changes, implement the changes in segments over time, as in the scheduling sketch below. Continue to use your original keywords on each page you change to maintain theme consistency.
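For example, rather than pushing every edit live at once, you could release the updated pages in small weekly batches. A simple scheduling sketch, where the batch size, interval, and page paths are all arbitrary choices of mine:

```python
from datetime import date, timedelta

def rollout_schedule(pages, batch_size=5, start=None, interval_days=7):
    """Split page updates into fixed-size batches released at regular intervals."""
    start = start or date.today()
    schedule = {}
    for i in range(0, len(pages), batch_size):
        release = start + timedelta(days=(i // batch_size) * interval_days)
        schedule[release] = pages[i:i + batch_size]
    return schedule

pages_to_update = ["/catalog/page-%d.html" % n for n in range(1, 13)]
for day, batch in rollout_schedule(pages_to_update).items():
    print(day, batch)  # twelve pages spread across three weekly releases
```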
You can easily make significant content changes by implementing lateral keywords to support and reinforce your vertical keyword(s) and phrases. This will also help eliminate keyword stuffing.
Make sure you determine whether the keywords you’re using require static or fresh search results, and update your web site content accordingly. On this point, RSS feeds may play a more valuable and strategic role than ever before in keeping pages fresh and at the top of the SERPs.
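As an illustration, a minimal RSS 2.0 feed announcing page updates can be generated with Python’s standard library; the channel details and URLs here are placeholders:

```python
from email.utils import formatdate  # RFC 822 dates, as RSS requires
from xml.sax.saxutils import escape

def rss_item(title, link, description):
    """Build one <item> element with a current publication date."""
    return (f"<item><title>{escape(title)}</title>"
            f"<link>{escape(link)}</link>"
            f"<description>{escape(description)}</description>"
            f"<pubDate>{formatdate()}</pubDate></item>")

items = rss_item("New winter coat guide",
                 "https://www.example.com/winter-coats",
                 "Freshly updated sizing and fabric advice.")
feed = ('<?xml version="1.0"?><rss version="2.0"><channel>'
        "<title>Example Store Updates</title>"
        "<link>https://www.example.com/</link>"
        "<description>Latest page updates</description>"
        f"{items}</channel></rss>")
print(feed)
```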
The bottom line here is that webmasters must look ahead, plan, and manage their domains more tightly than ever before or risk plummeting in the SERPs.
Does Google use your domain name to determine the ranking of your site?
Google’s patent references specific types of ‘information relating to how a document is hosted within a computer network’ that can directly influence the ranking of a specific web site. This is Google’s way of determining the legitimacy of your domain name.
Therefore, the credibility of your host has never been more important to ranking well in Google’s SERPs.
Google states that it may check the information of a name server in multiple ways.
Bad name servers might host known spam sites, adult sites, and/or doorway domains. If you’re hosted on a known bad name server, your rankings will undoubtedly suffer… if you’re not blacklisted entirely.
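You can at least find out which name servers answer for your domain and screen them against any reputation list you maintain. A sketch assuming the third-party dnspython package; the bad-server list is fabricated purely for illustration:

```python
import dns.resolver  # third-party package: dnspython

# Fabricated example list; real reputation data would come from your own research.
KNOWN_BAD_NAMESERVERS = {"ns1.spammy-host.example", "ns2.spammy-host.example"}

def nameserver_risk(domain):
    """Return the domain's name servers that appear on the bad list."""
    answers = dns.resolver.resolve(domain, "NS")
    servers = {str(record.target).rstrip(".") for record in answers}
    return sorted(servers & KNOWN_BAD_NAMESERVERS)

flagged = nameserver_risk("example.com")
print("flagged name servers:", flagged or "none")
```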
What I found particularly interesting are the criteria that Google may consider in determining the value of a domain or identifying it as a spam domain. According to their patent, Google may now record the following information: