Issues such as the Google Sandbox theory tend to distract webmasters from core ‘good’ SEO practices and inadvertently push them towards black-hat or quick-fix techniques that exploit search engines’ weaknesses. The problem with this approach is its short-sightedness. To explain what I'm talking about, let's take a small detour and discuss search engine theory.
Understanding Search Engines
If you're looking to do some SEO, it helps to understand what search engines are trying to do. Search engines want to present the most relevant information to their users. Two problems stand in the way: the inaccurate search terms that people use, and the information glut that is the Internet. To counteract these, search engines have developed increasingly complex algorithms to deduce the relevancy of content for different search terms.
How does this help us?
Well, as long as you keep producing highly targeted, quality content that is relevant to the subject of your website (and acquire natural inbound links from related websites), you will stand a good chance of ranking high in SERPs. It sounds ridiculously simple, and in this case, it is. As search engine algorithms evolve, they will continue to do their jobs better, becoming more adept at filtering out trash and presenting the most relevant content to their users.
While each search engine has its own methods of determining placement (Google values inbound links quite a lot, while Yahoo has recently placed additional value on title tags and domain names), in the end all search engines aim to achieve the same goal, and by working to fulfill that goal you will always be able to ensure that your website can achieve a good ranking.
Escaping from the Google Sandbox
Now, from our discussion of the Sandbox theory above, you know that at best the Google Sandbox is a filter in the search engine's algorithm that has a dampening influence on websites. While most SEO experts will tell you that this effect decreases after a certain period of time, they mistakenly attribute it to website aging, that is, to when the website is first spidered by Googlebot. In fact, the Sandbox does ‘hold back’ new websites, but more importantly, its effects diminish over time on the basis not of website aging, but of link aging.
This means that the time your website spends in the Google Sandbox is directly linked to when you start acquiring quality links for it. Thus, if you do nothing, your website may never be released from the Google Sandbox.
However, if you keep your head down, stick to a low-intensity, long-term link building plan, and keep adding inbound links to your website, you will be released from the Google Sandbox after an indeterminate period of time (probably six months, and almost certainly within a year). In other words, the filter will stop having such a massive effect on your website.
As the ‘Allegra’ update showed, websites that were constantly being optimized during their time in the Sandbox began to rank quite high for targeted keywords once the Sandbox effect ended.
This and other observations of the Sandbox phenomenon, combined with an understanding of search engine philosophy, have led me to pinpoint the following strategies for minimizing your website's ‘Sandboxed’ time.
SEO strategies to minimize your website's ‘Sandboxed’ time
Despite what some SEO experts might tell you, you don't need to do anything different to escape from the Google Sandbox. In fact, if you follow the ‘white hat’ rules of search engine optimization and work on the principles I've mentioned many times in this course, you'll not only minimize your website's Sandboxed time but also ensure that your website ranks in the top 10 for your target keywords. Here's a list of SEO strategies you should use when starting out a new website:
Start promoting your website the moment you create it, not when it is ‘ready’. Don't make the mistake of waiting for your website to be ‘perfect’. The motto is to get your product out on the market as quickly as possible, and then worry about improving it. Otherwise, how will you ever start to make money?
Establish a low-intensity, long-term link building plan and follow it religiously. For example, you can set yourself a target of acquiring 20 links per week, or of contacting 10 link partners a day (of course, with SEO Elite, link building is a snap). This ensures that as you build your website you also start acquiring inbound links, and those links will age properly, so that by the time your website exits the Sandbox you will have both a large quantity of inbound links and a thriving website.
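To see how a low-intensity plan compounds, here's a back-of-the-envelope sketch in Python. The weekly target and the six-month window are simply the figures mentioned above, not numbers Google publishes; treat this as illustrative arithmetic, nothing more:

```python
# Toy projection of a low-intensity link building plan.
# Assumes a steady 20 new links per week, as suggested above;
# real acquisition rates will obviously vary week to week.
LINKS_PER_WEEK = 20

def links_after(weeks: int, per_week: int = LINKS_PER_WEEK) -> int:
    """Total inbound links accumulated after `weeks` of steady effort."""
    return weeks * per_week

# Roughly six months (26 weeks), the plausible Sandbox window discussed above.
print(links_after(26))  # 520
```

The point of the arithmetic: even a modest, sustained pace leaves you with hundreds of aged links by the time the dampening effect fades, which is exactly when they start paying off.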
Avoid black-hat techniques such as keyword stuffing or ‘cloaking’. Google's search algorithm evolves almost daily, and penalties for breaking the rules may keep you stuck in the Sandbox longer than usual.
Save your time by remembering the 20/80 rule: 80 percent of your optimization can be accomplished with just 20 percent of the effort. After that, any tweaking left to be done is specific to current search engine tendencies and liable to become ineffective once a search engine updates its algorithm. So don't waste your time optimizing for each and every search engine; just get the basics right and move on to the next page.
Remember, you should always optimize with the end-user in mind, not the search engines.
As I mentioned earlier, search engines are continuously refining their algorithms to improve on one key criterion: relevancy. By ensuring that your website content is targeted on a particular keyword, and is judged to be ‘good’ content based on both on-page optimization (keyword density) and off-page factors (lots of quality inbound links), you will also guarantee that your website keeps ranking highly for your search terms no matter what changes are brought into a search engine's algorithm, whether it's a dampening factor à la Sandbox or any other quirk the search engine industry throws up in the future.
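Keyword density, the on-page factor mentioned above, is conventionally figured as the share of words on a page that match the target keyword. Here is a minimal sketch; the tokenizer and the single-word-keyword restriction are simplifying assumptions of mine, and real search engines weigh far more than this one ratio:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that are occurrences of `keyword`.

    Simplified model: case-insensitive matching, single-word keywords only.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

page = "SEO tips: good SEO content beats tricks. Write content for users."
print(round(keyword_density(page, "seo"), 3))  # 0.182 (2 of 11 words)
```

A figure like 18% would be far too high for a real page, which is the practical point: density is easy to measure, and pushing it up artificially is exactly the kind of keyword stuffing warned against above.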
by Brad Callen
Search Engine Optimization Expert
Learn How To Get A Top Google Ranking In Under 28 Days With This Breakthrough New SEO Software! http://www.seoelite.com