Analysing And Creating Highly Popular Web Pages
Written by David Gikandi
Today's webmaster faces a common yet disturbing problem: getting a good position on the major search engines. How many times have you wondered why, no matter what you do, you can't seem to find your site when you search for your keywords on HotBot or AltaVista? You know, therefore, that no one else is finding your site either, and that you are missing out on heaps of traffic. It is a frustrating feeling common to webmasters.

According to a 1999 NEC Research Institute report, the Web has over 800 million pages, and most major engines index only about 10 per cent of them. To make matters worse, just getting indexed doesn't mean much unless you also rank highly for your search terms, because most people never bother to drill down beyond the first 30 links returned on a search.

The good news is that you can tune up your pages to get that top ranking. It is all a matter of carefully analysing the current top-ranking pages to figure out what text proportions and arrangements you need to use on your own pages for them to earn the same high rank. It is that simple, and many professional webmasters employ this technique very successfully.

The first step is to analyse the pages that currently rank at the top of searches for keywords related to your business. Search engines look at almost all parts of a web page to calculate its rank: the title, meta tags, body text, links, alt tags, comments, hidden form fields and headings all usually count. By looking at the exact number of words and keywords in each of these sections of a page that currently ranks highly, then applying those statistics to your own pages, you stand a very good chance of achieving a similar rank. You may not get exactly the same rank, primarily because search engines also use other factors, such as a page's popularity, to adjust their ranking scores. But you will still rank very close to the page you analysed.
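As a rough sketch of the counting step described above, the following Python script (standard library only; the sample page and the keyword are made up purely for illustration) tallies how often a keyword appears in a page's title, meta tags and body text, and what share of the body it takes up:

```python
import re
from html.parser import HTMLParser

class PageStats(HTMLParser):
    """Collects the title, meta keyword/description text, and body words."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta = ""
        self.body_words = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") in ("keywords", "description"):
            self.meta += " " + attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        else:
            self.body_words += re.findall(r"[a-z']+", data.lower())

def keyword_stats(html, keyword):
    """Return where and how often `keyword` occurs on the page."""
    parser = PageStats()
    parser.feed(html)
    hits = parser.body_words.count(keyword.lower())
    total = len(parser.body_words)
    return {
        "in_title": keyword.lower() in parser.title.lower(),
        "in_meta": keyword.lower() in parser.meta.lower(),
        "body_count": hits,
        "body_total": total,
        "density": round(hits / total, 3) if total else 0.0,
    }

# A hypothetical top-ranking page:
page = """<html><head><title>Real Estate Listings</title>
<meta name="keywords" content="real estate, homes, property">
</head><body>Find real estate bargains. Real estate agents welcome.</body></html>"""
print(keyword_stats(page, "estate"))
```

Run against the page that ranks first for your target phrase, these counts give you the proportions to mirror on your own page.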
What you need to do is run a search on a keyword or phrase in a search engine and see which page ranks highest for it. Make sure the actual page is the same one displayed in the search results, not a redirected page or a newer version. You do this by comparing the file date, file size, and the wording of the title and description as they appear in the search engine results against the actual page. If it isn't the same page that was indexed, move on to the next highest-ranking page.

Search engines do not always have the most recent copy of a page in their index. For example, an engine may have indexed a page on, say, June 12, 1998, and that page ranked second on your search. However, the page may have been changed, perhaps extensively, by its webmaster after that indexing was done, on maybe July 1, 1998. That change may not be indexed yet, because the engine might not revisit the page for another two months. So if you did your search and analysis on July 25, 1998, you would get the old version appearing as a top-ranking page, but when you clicked through to it, you would retrieve the new version of the page. The problem is that the new version most likely would not have the same ranking as the old one, so if you take its statistics and use them, your pages will rank poorly.

What you should always do is look a little closer at the information you get from your search results. Many engines provide extra information about each page in their results list, such as the file size. Look at the reported file size in the search result, then go to the actual page and see whether its file size is about the same. In Internet Explorer, you do this by right-clicking on the page and choosing the Properties item from the popup menu. Another way is to check whether there are any differences between the title and description of the page in the search engine results and on the actual page itself. Most engines use the page title as the title of the search listing, and the meta description (or the first few words on the page) as the description in the results.
You might find, for example, that the title in the search result reads 'Super Real Estate Page' while on the actual page it reads 'A Big Super Real Estate Page', meaning that the page currently available is a modified version of the one originally indexed by the search engine.
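This comparison can be automated. The sketch below (the function name, the 5% size tolerance, and the sample page are all assumptions for illustration) checks a listing's reported title and file size against the page you actually retrieved:

```python
import re

def likely_same_version(listed_title, listed_size, live_html, tolerance=0.05):
    """Heuristic: does the live page still match what the engine indexed?

    listed_title and listed_size come from the engine's results listing;
    live_html is the page fetched just now. Returns True only when the size
    is within the tolerance and the <title> text still matches, suggesting
    the page has not changed since it was indexed.
    """
    live_size = len(live_html.encode("utf-8"))
    match = re.search(r"<title[^>]*>(.*?)</title>", live_html, re.I | re.S)
    live_title = match.group(1).strip() if match else ""
    size_ok = abs(live_size - listed_size) <= tolerance * max(listed_size, 1)
    title_ok = live_title.lower() == listed_title.lower()
    return size_ok and title_ok

page = ("<html><head><title>A Big Super Real Estate Page</title></head>"
        "<body>Listings and agents.</body></html>")
# The engine's listing still shows the old, shorter title:
print(likely_same_version("Super Real Estate Page", len(page), page))  # False
```

If the check fails, skip that result and analyse the next highest-ranking page instead, just as described above.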
Search Engine Spam: Useful Knowledge for the Web Site Promoter
Written by David Gikandi
Before you start using gateway pages and other HTML techniques to improve your search engine ranking, you need to know a little about spam and spamdexing. Spamming search engines (or spamdexing) is the practice of using unethical or unprofessional techniques to try to improve search engine rankings. You should be aware of what constitutes spamming so as to avoid trouble with the search engines.

For example, if you have a page with a white background, and on it a table with a blue background containing white text, you are actually spamming the Infoseek engine without even knowing it! Infoseek sees white text and a white page background, and concludes that your text color and your page background color are the same, so you must be spamming. It cannot tell that the white text is actually inside a blue table and is perfectly legible. It is silly, but it will cause that page to be dropped from the index. You can get it back in by changing the text color in the table to, say, a light gray and resubmitting the page to Infoseek. See what a difference that makes? Yet you had no idea your page was considered spam!

Generally, it is easy to know what not to do to avoid being labeled a spammer and having your pages or your site penalized. By following a few simple rules, you can safely improve your search engine rankings without unknowingly spamming the engines and being penalized for it.

What constitutes spam? Some techniques are clearly considered an attempt to spam the engines. Where possible, you should avoid these:

Keyword stuffing. This is the repeated use of a word to increase its frequency on a page. Search engines can now analyse a page and determine whether a word's frequency is above a "normal" level in proportion to the rest of the words in the document.

Invisible text. Some webmasters stuff keywords at the bottom of a page and make the text color the same as the page background. This too is detectable by the engines.

Tiny text. The same as invisible text, but using tiny, illegible text.
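The frequency test behind keyword-stuffing detection can be sketched in a few lines of Python. The 20% cutoff here is purely an illustrative guess; real engines keep their thresholds secret and weigh many other signals:

```python
import re
from collections import Counter

def looks_stuffed(text, threshold=0.2):
    """Flag text where one word takes an abnormal share of all words.

    The threshold is an illustrative assumption, not a published engine
    limit; the point is only that repetition is easy to measure.
    """
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return False
    _, top_count = Counter(words).most_common(1)[0]
    return top_count / len(words) > threshold

normal = ("Our agents list homes, condos and land across the state, "
          "with photos, maps and open house dates updated every morning.")
stuffed = "real estate " * 20 + "homes for sale"
print(looks_stuffed(normal))   # False
print(looks_stuffed(stuffed))  # True
```

A page whose most frequent word sits far above the share it would take in ordinary prose is exactly what the engines are looking for.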
Page redirects. Some engines, especially Infoseek, do not like pages that take the user to another page without his or her intervention, e.g. using META refresh tags, CGI scripts, Java, JavaScript, or server-side techniques.

Meta tag stuffing. Do not repeat your keywords in the meta tags more than once, and never use keywords that do not apply to your site's content.

Doorway flooding. Do not create too many doorway pages with very similar keywords. Do not submit the same page more than once on the same day to the same search engine. Do not submit virtually identical pages, i.e. do not simply duplicate a web page, give the copies different file names, and submit them all. That will be interpreted as an attempt to flood the engine.

Code swapping. Do not optimize a page for a top ranking, then swap another page into its place once the top ranking is achieved.

Do not submit doorway pages to submission directories like Yahoo!

Do not submit more than the allowed number of pages per engine per day or week. Each engine limits how many pages you can manually submit using its online forms. Currently the limits are: AltaVista, 1-10 pages per day; HotBot, 50 pages per day; Excite, 25 pages per week; Infoseek, 50 pages per day, but unlimited when using e-mail submissions. Note that this is not the total number of pages that can be indexed, only the total that can be submitted. If you can only submit 25 pages to Excite, for example, and you have a 1,000-page site, that is no problem: the search engine will come crawling your site and index all the pages, including those you did not submit.

Gray Areas

There are certain practices that a search engine may consider spam when they are actually just part of honest web site design. For example, Infoseek does not index any page with a fast page refresh. Yet refresh tags are commonly used by web site designers to produce visual effects or to take people to the new location of a page that has moved.
Also, some engines compare the text color and the background color, and if they match, the page is considered spam. But you could have a page with a white background and, somewhere on it, a black table with white text in it. Although perfectly legible and legitimate, that page will be ignored by some engines. Another example: Infoseek advises against (but does not seem to drop from its index) having many pages with links to one page. Even though this rule is meant to discourage spammers, it also places many legitimate webmasters in the spam region (almost anyone with a large web site, or a site with an online forum, has pages linking back to the home page). These are just a few examples of the gray areas in this business. Fortunately, because the search engine people know that they exist, they will not penalize your entire site just because of them.
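The color-matching gray area comes down to how the comparison is done. This simplified model (colors are plain strings, and both function and data are made up for illustration) contrasts a naive engine that only checks text color against the page background with a check that uses the nearest enclosing background:

```python
def flagged_colors(page_bg, regions):
    """Each region is (text_color, container_bg); container_bg is None when
    the text sits directly on the page background.

    Returns (naive, correct): 'naive' mimics an engine comparing text color
    only with the page background; 'correct' compares it with the nearest
    enclosing background, so legible text is not flagged.
    """
    naive = [text == page_bg for text, _ in regions]
    correct = [text == (bg if bg else page_bg) for text, bg in regions]
    return naive, correct

# A white page containing white text inside a blue table (the Infoseek
# example above) plus genuinely hidden white-on-white text:
regions = [("white", "blue"), ("white", None)]
naive, correct = flagged_colors("white", regions)
print(naive)    # [True, True]  -> the legible table text is wrongly flagged
print(correct)  # [False, True] -> only the truly hidden text is flagged
```

The naive comparison is what produces the false positives described above, which is why changing the table text to light gray is enough to get the page re-indexed.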