Search Engine Optimization For Google
Written by SEO Expert
A group of Google employees recently filed a patent application (#20050071741) with the United States Patent and Trademark Office that gives insight into how to optimize a website to rank well in Google. The filing verifies that Google uses, or intends to use, historical data in its ranking algorithm. The patent also lends credence to the Google Sandbox Theory for new websites.

Under the Google Sandbox Theory, new websites are placed in a sort of holding tank for observation for a period of time (6-9 months at present) until the website has proved that it is not a fly-by-night operation. Once the Sandbox period is over, new websites can climb rapidly in the rankings. The Google Sandbox Theory is an unofficial theory based on observation and anecdotal evidence from those within the search engine optimization industry.

Based on the new Google patent, here are the top 5 suggestions to better optimize a website for Google:

1. Build links slowly to your website. Websites that put up a bunch of links quickly send up a red flag that links are being added in order to boost rankings. According to Google, natural links accumulate slowly over time, so your link-building strategy also needs to work slowly over time.

2. The anchor text in back links to a website needs to be natural as well. If a website has lots of great content of interest to visitors, other webmasters will naturally link to it. Content is still king when it comes to building natural links. In fact, having great content is the best natural linking strategy.
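To illustrate the anchor-text point above, a natural back link uses descriptive text rather than a generic phrase. This is a minimal sketch; the domain and page in it are hypothetical placeholders, not from the original article:

```html
<!-- Generic anchor text: tells search engines little about the target page -->
<a href="https://www.example.com/page1.html">click here</a>

<!-- Descriptive, natural anchor text (example.com and the path are placeholders) -->
<a href="https://www.example.com/slow-cooker-recipes.html">easy slow cooker recipes</a>
```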
Design A Spider Friendly Site
Written by Matt Colyer
To be successful in search engines it's important to design your web site with spiders in mind. Using the latest in web page design is not generally the best way to go. Spiders don't view web pages the way humans do; they must read the HTML in the page to see what it's about. Below you will find tips on how to best design your web site with search engines in mind.

Do not use frames at all. Some search engines cannot spider web pages with frames at all. Even the search engines that can may have problems spidering framed pages, and sometimes they too fail to index them.

Do not rely on images alone to link out. Always use text links to link to important content on your web site. Spiders can follow image links, but they handle text links better.

Use external JavaScript files instead of placing JavaScript code in the HTML document; inline JavaScript makes the page much larger. Moving the code to an external JavaScript file reduces page size and makes the page easier for both spiders and browsers to download.

Using Cascading Style Sheets can reduce page size and make download times much faster in most cases. This allows the spider to index your web page faster and can help your ranking.

Avoid using web page creators such as FrontPage, Dreamweaver or other WYSIWYG editors. Such software often adds scripting code that is not needed, making the page larger than it needs to be and harder to crawl. It can also add code that search engines can't read, causing the spider to skip the page or index only part of it. It is better to write standard HTML: code that spiders can't read, or have a hard time reading, can lead to major problems with your ranking.
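The advice above about external JavaScript and CSS files and text links can be sketched as a minimal page skeleton. The file names styles.css and script.js and the page products.html are placeholders, not files mentioned in the article:

```html
<!DOCTYPE html>
<html>
<head>
  <title>Spider-Friendly Page</title>
  <!-- Styles live in an external file instead of inline <style> blocks -->
  <link rel="stylesheet" href="styles.css">
  <!-- Script lives in an external file instead of inline code in the document -->
  <script src="script.js" defer></script>
</head>
<body>
  <h1>Welcome</h1>
  <!-- A plain text link: the easiest kind for a spider to follow and interpret -->
  <p>Browse <a href="products.html">our product catalog</a>.</p>
</body>
</html>
```

Keeping the HTML this lean means the markup a spider downloads is mostly content, with presentation and behavior fetched separately.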