If your code is not clean, spiders stop indexing your pages. If pages are not indexed, they do not appear in search engine results. What stops spiders? The following are the most common problems:
1. Frames – Don't use them.
2. Dynamic Pages – Your pages need to be static. If they are dynamic, spiders often will not index them because they cannot be sure what the content will be.
3. Bottlenecks – Make sure each page of your site links to every primary page at a minimum. You do not want a spider to get stuck on a page and miss your key pages.
4. Bad URLs – A huge mistake is to put database parameters in the URL. The URL should contain only the domain name and the keywords for the page. A good URL reads: http://www.marketingtitan.com/internet_marketing_services. A bad URL with parameters would read: http://www.marketingtitan.com/id#us57486&095783
5. Images – Don't overuse images, and don't put text in images. Images slow down your site, so keep them small and optimized. Robots do not read text inside images, so any text needs to live outside your images.
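The clean-URL rule in item 4 can be sketched as a small helper that turns a page title into a keyword slug. This is an illustration only: the function name `keyword_url` and its details are assumptions, not something from the original article.

```python
import re

def keyword_url(domain: str, page_title: str) -> str:
    """Build a spider-friendly URL from a domain and a page title.

    Lowercases the title, collapses runs of non-alphanumeric
    characters into underscores, and strips leading/trailing
    underscores, so no database parameters leak into the URL.
    """
    slug = re.sub(r"[^a-z0-9]+", "_", page_title.lower()).strip("_")
    return f"http://{domain}/{slug}"

# A title like "Internet Marketing Services" becomes the kind of
# keyword URL the article recommends:
print(keyword_url("www.marketingtitan.com", "Internet Marketing Services"))
# → http://www.marketingtitan.com/internet_marketing_services
```

In practice this mapping would live in your server's URL-rewriting layer, so visitors and spiders see only the keyword form while the database parameters stay internal.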
Evaluating The Layout
Once the site is designed, TEST IT! Surf the pages and see whether you can flow through the site. Add internal links wherever possible. Finally, test your site's load time on a 56k dial-up modem. If the site loads in under 20 seconds, you are headed in the right direction.
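The 20-second budget above can be turned into a rough page-weight target. This is a back-of-the-envelope sketch that assumes the modem's nominal 56,000 bits per second with no compression or protocol overhead; real-world dial-up throughput is usually lower, so treat the numbers as an upper bound.

```python
def load_time_seconds(page_bytes: int, modem_bps: int = 56_000) -> float:
    """Estimate download time for a page on a dial-up modem.

    Assumes the modem's nominal bit rate; real throughput
    is typically lower, so this is a best-case estimate.
    """
    return page_bytes * 8 / modem_bps

# A 20-second budget at 56k works out to roughly 140 KB of total
# page weight (HTML plus images):
budget_bytes = 20 * 56_000 // 8
print(budget_bytes)  # → 140000

# A 100 KB page comfortably fits the budget:
print(load_time_seconds(100_000) < 20)  # → True
```

Keeping total page weight (markup plus every image) under that figure is what makes the 20-second target achievable for dial-up visitors.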
Your site layout is important. Make sure it caters to the needs of your visitors, whether human or spider.
Halstatt Pires is with Marketing Titan, an Internet marketing and advertising company comprised of a search engine optimization specialist providing meta tag optimization services and an Internet marketing consultant providing Internet marketing solutions through integrated design and programming services.