Keyword Density
Written by Kristy Meghreblian
We can't emphasize enough the importance of including keyword-rich content on your site to increase your ranking potential. Simply put, keywords are the words and phrases people use when searching. As we've mentioned throughout the site, search engine spiders love content. Therefore, the more keyword-rich content you have, the better. When a search engine spider crawls your site, it won't recognize pictures or images. So, if you have limited amounts of text (or none at all) and a lot of beautiful pictures or Flash animation, the spider may deem your site unworthy of listing.

What Is Keyword Density?

Keyword density is the ratio of a keyword or key phrase to the total number of words on a page. It is one of the most critical aspects of successful search engine optimization. To improve your search engine ranking potential, your keyword density must be just right. To calculate your keyword density, divide the number of times your primary keyword or key phrase appears by the total number of words on the page. Keyword density is critical when outlining the keyword portion of your search engine optimization strategy.

Naturally, there is a fine line between strategically scattering these keywords throughout your content and grouping them all together, separated by commas. The latter is known as spamming, and you will be penalized for doing it. Don't think you can fool the search engines -- they have the technology to figure out these little tricks.

Using Keyword Density To Improve Your Search Engine Ranking

The best way to increase your search engine ranking potential is to develop your keyword strategy by researching the most relevant (and most searched-for) keywords or keyword phrases before you even begin building your site. So, you've already built your site? No worries -- you should still review the keywords you have selected and make any necessary changes to your meta tags and site content. No matter how nice your site looks,
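The density calculation described above can be sketched in a few lines of Python. The article doesn't specify how words should be tokenized, so the simple word-splitting here is an assumption, and the sample page text is purely illustrative:

```python
import re

def keyword_density(text, phrase):
    """Keyword density as a percentage: occurrences of the keyword
    or key phrase divided by the total word count of the page."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    # Non-overlapping occurrences of the (possibly multi-word) phrase.
    occurrences = len(re.findall(re.escape(phrase.lower()), text.lower()))
    return 100.0 * occurrences / len(words)

# Hypothetical page copy, used only to demonstrate the calculation.
page = "Our widgets are the best widgets. Buy widgets today from the widget experts."
print(round(keyword_density(page, "widgets"), 1))  # 3 occurrences in 13 words -> 23.1
```

A density far above a few percent usually means the copy reads like the comma-separated keyword lists the article warns against.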
How to Optimize Your Website for Both Google & Inktomi
Written by Jim Hedger
The search engine environment continues to evolve rapidly, easily outpacing the ability of consumers and SEO practitioners to adapt to the new landscape. With the ascension of Inktomi to a level of importance that until recently was held solely by Google, SEO practitioners need to rethink several strategies, tactics and, perhaps, even the ethics of technique. Assuming this debate will unfold over the coming months, how does an "ethical SEO firm" optimize websites for two remarkably unique search engines without falling back on the old-fashioned spammy tactics of leader pages or portal sites?

Recently, another SEO unrelated to StepForth told me that he was starting to re-optimize his websites to meet what he thought were Inktomi's standards as a way of beating his competition to what looks to be the new main driver. That shouldn't be necessary if you are careful and follow all the "best practices" developed over the years.

The answer to our puzzle is less than obvious, but it lies in the typical behaviors of the two search tools. While there are a number of similarities between the two engines, most notably in the behaviors of their spiders, there are also significant differences in the way each engine treats websites. For the most part, Google and Inktomi place the greatest weight on radically different site elements when determining eventual site placement. For Google, strong and relevant link popularity is still one of the most important factors in achieving strong placements. For Inktomi, titles, meta tags and text are the most important factors in getting good rankings. Both engines consider the number and arrangement of keywords, incoming links, and the anchor text used in links (though Google puts far more weight on anchor text than Inktomi tends to). That seems to be where the similarities end, and the point where SEO tactics need revision. Once Inktomi is adopted as Yahoo's main listing provider, both Google and Inktomi will drive relatively similar levels of search engine traffic.
Each will be as important as the other, with the caveat that Inktomi powers two of the big three while Google will only power itself.

2004 - The Year of Spider-Monkey

The first important factor to think about is how each spider works.

Entry to Inktomi Does Not Mean Full Indexing

Getting your site spidered by Inktomi's bot "Slurp" is essential. Like "Google-bot", "Slurp" will follow every link it comes across, reading and recording all information. A major difference between Google and Inktomi is that, when Google spiders a new site, there is a good chance of getting placements for an internal page without paying for that specific page to appear in the index. As far as we can tell, that inexpensive rule of thumb does not apply to Inktomi. While it is entirely possible to get entire sites indexed by Inktomi, we have yet to determine if Inktomi will allow all pages within a site to achieve placements without paying for those pages to appear in search engine results pages (SERPs). Remember, Inktomi is a paid-inclusion service which charges webmasters an admission fee based on the number of pages in a site they wish to have spidered.

From the information we have gathered, Slurp will follow each link in a site and, if provided a clear path, will spider every page in the site, but pages within that site that are paid for during submission will be spidered far more frequently and will appear in the indexes months before non-paid pages. We noted this when examining how many pages Inktomi lists from newer clients versus how many from older clients. We have noticed that the older the site, the more pages appear in Inktomi's database and on the SERPs of search engines using the Inktomi database. (This assumes the webmaster only paid for inclusion of their index page.) Based on Inktomi's pricing, an average-sized site of 50 pages could cost up to $1289 per year to have each page added to the paid-inclusion database, so it is safe to assume that most small-business webmasters won't want to pay that much.
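The $1289 figure above is consistent with a tiered fee schedule of roughly $39 per year for the first URL plus $25 per year for each additional URL (an index page plus 50 content pages). The article gives only the total, so the per-URL fees in this sketch are an assumption used to show how such a cost estimate is built:

```python
# Hypothetical fee schedule -- an assumption; the article states only
# the $1289 total for an average-sized 50-page site.
FIRST_URL_FEE = 39  # USD per year for the first URL submitted
EXTRA_URL_FEE = 25  # USD per year for each additional URL

def annual_inclusion_cost(extra_pages):
    """Yearly paid-inclusion cost for an index page plus extra_pages more."""
    return FIRST_URL_FEE + EXTRA_URL_FEE * extra_pages

print(annual_inclusion_cost(50))  # prints 1289, matching the article's estimate
```

At these rates the cost grows linearly with page count, which is why the article expects small-business webmasters to pay for the index page only and let Slurp find the rest.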