In short, your task is to find web-sites with the highest SE listing positions and/or PageRank (as shown by the Google Toolbar) and negotiate a link to your site in return for some service or product, or simply propose an exchange of links. As you can see, this "manual" work is the most time-consuming, but it pays off if you stay focused on getting as many relevant links as possible.
You may also apply viral strategies by offering some free or paid service that requires putting a link back to your site.
Google has developed its own link popularity measure called PageRank. It is calculated based on a constantly changing set of rules: the current rank of the site the link to your page is coming from, its relevance to your web-site topic, the presence of targeted words, etc.
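For the curious, the originally published PageRank formula (a simplified view; the live algorithm weighs many more signals) looks roughly like this, where d is a damping factor of about 0.85, T1...Tn are the pages linking to page A, and C(T) is the number of outgoing links on page T:

    PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

In plain words: a link from a page that is itself well ranked and has few other outgoing links passes on more weight than a link buried among hundreds of others.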
SE spam tactics, as I call them, are used by some webmasters in much the same way certain "marketers" use e-mail spam to promote their businesses.
Unfortunately, ordinary internet users don't have the ability to "ban" spammers the same way SEs penalize those "smart" webmasters. I don't recommend using any of these tactics, even on someone's "advice".
They include excessive use of related and totally unrelated keywords, comment tags, hidden layers, text on a background of the same color, artificial link farms, numerous entry pages, etc. The game simply won't be worth the candle if your web-site is banned for good.
The robots.txt file is a very important file every web-site should have. It allows you to literally direct the SE spider to the "proper" places, explaining what should be scanned and where, instead of blindly waiting for your lucky day. With its help you can also protect confidential web-pages and/or directories from being scanned and shown in SE search results, a very important feature many webmasters try to achieve with "tons" of Java or even Perl coding instead of a one-line entry in the robots.txt file that forbids scanning of "download" pages, so-called "thank you" pages or anything else you want!
You can find the general rules for creating a robots.txt file here: http://www.robotstxt.org/wc/robots.html
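As a minimal sketch (the directory and page names here are only examples), a robots.txt file placed in the root of your site that hides a "download" directory and a "thank you" page from all spiders could look like this:

    # Rules for every SE spider
    User-agent: *
    # Keep the download directory out of SE results
    Disallow: /download/
    # Keep the "thank you" page out of SE results
    Disallow: /thankyou.html

One such file at the site root replaces all those scripting tricks and works for every spider that respects the standard.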
Design & Layout issues
The next point is to have textual info. Simply declaring that you have a content-rich web-site is not enough; SEs need text to scan.
Clear, easy-to-follow links. If you have a Flash or Java applet navigation menu, make sure to duplicate it somewhere with plain HTML links as well. Most SE spiders also cannot make sense of web-pages created dynamically with the help of ASP, Perl, PHP or other languages. It is equally clear that any web-pages to which the administrator has forbidden access (no matter how) will be left unnoticed. The same relates to HTML frame sites. What frames actually do is complicate the way a web-site is scanned, no more, no less. When I see a web-site made of frames, it is as if the webmaster were telling me: "I want a lower SE position."
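Returning to the navigation menu point: a plain HTML duplicate of a Flash menu could be as simple as a text footer like this (the page names are only placeholders for your own):

    <!-- plain HTML duplicate of the Flash navigation menu -->
    <p>
      <a href="index.html">Home</a> |
      <a href="products.html">Products</a> |
      <a href="contact.html">Contact</a>
    </p>

The spider now has a text path through the site even if it ignores the Flash menu completely.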
Because of the excessive work spiders have to do in order to scan as many pages as possible, their scanning "accuracy", if we can call it that, has dropped. They will hardly scan each and every one of your pages from the very top to the bottom; selective scanning is more likely. To ease this process you should try to arrange your most valuable info, including header tags and text, at the very top of your web-pages. Having a "site map" page with all the link connections of your site helps not only your potential visitors, but SEs as well.
All link names inside your informational content should contain your related keywords or phrases, not just "click here" or "download here".
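For instance (the file name and keywords are only illustrations), instead of a generic link that tells the spider nothing:

    <a href="seo-guide.html">click here</a>

use a link name that carries your keywords:

    <a href="seo-guide.html">free search engine optimization guide</a>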
Let the Internet market get to know your business better.
Pavel Lenshin is a devoted Internet entrepreneur, founder of ASBONE.com, where you can find everything to make your business prosper: discounted Internet services, FREE ebooks http://ASBONE.com/ebooks/ and FREE reports http://ASBONE.com/reports/