Using your web browser, view the page code as in step 2 above and check for the robots meta tag at the top of the page, between the <head> and </head> tags.
If it says: <meta name="robots" content="index,follow"> or <meta name="robots" content="all"> then all is OK.
If the tag says: <meta name="robots" content="noindex,follow"> or <meta name="robots" content="index,nofollow"> or <meta name="robots" content="noindex,nofollow"> or <meta name="robots" content="none">, then this page is not giving search engines full access. Do not link to this sort of page.
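If you check a lot of pages, this test can be scripted. Here's a minimal sketch in Python (the function name and example URL are mine, not part of any standard tool) that fetches a page and reports what its robots meta tag says:

from html.parser import HTMLParser
import urllib.request

class RobotsMetaFinder(HTMLParser):
    # Remembers the content of any <meta name="robots"> tag it sees.
    def __init__(self):
        super().__init__()
        self.robots_content = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.robots_content = (attrs.get("content") or "").lower()

def check_robots_meta(url):
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    finder = RobotsMetaFinder()
    finder.feed(html)
    if finder.robots_content is None:
        # No tag at all: search engines treat this as index,follow.
        return "OK: no robots meta tag (defaults to index,follow)"
    if any(word in finder.robots_content for word in ("noindex", "nofollow", "none")):
        return "WARNING: restrictive tag found: " + finder.robots_content
    return "OK: " + finder.robots_content

# The example URL from this article; substitute the page you are checking.
print(check_robots_meta("http://www.domainyourlinkingto.com/dir/web-design.html"))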
If the robots meta tag you checked is OK but you still suspect a problem because of a low PR, then you should check the site's robots.txt file. To do this, type the main URL of the site into a web browser and add robots.txt, for example: "http://www.domainyourlinkingto.com/robots.txt"
The robots.txt file is read by search engines, and it tells them which directories and files they can access. A simple robots.txt file might look something like:

User-agent: *
Disallow: /cgi-bin
Disallow: /forms
Disallow: /contact.html
If the URL of the page you were linking to was "http://www.domainyourlinkingto.com/dir/web-design.html", then you would want to be sure that in the robots.txt file you do NOT see:

Disallow: /dir
Disallow: /dir/web-design.html
This tells the search engine robot not to index or follow the links in the link directory called dir, and to ignore the links page web-design.html.
And you should not see:

User-agent: *
Disallow: /

If you see:

User-agent: *
Disallow:

then that's OK.
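You don't have to eyeball the robots.txt file yourself; Python's standard library ships a parser that applies these rules for you. A minimal sketch, reusing the example domain above:

from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://www.domainyourlinkingto.com/robots.txt")
rp.read()  # fetches and parses the robots.txt file

# can_fetch() applies the Disallow rules for the given user agent:
# "Disallow: /dir" would make this return False, while an empty
# "Disallow:" (or no matching rule at all) returns True.
page = "http://www.domainyourlinkingto.com/dir/web-design.html"
print(rp.can_fetch("*", page))  # True means the page is open to robots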
It all sounds a bit complicated, I know, but there is no easier way to explain this sort of thing. Some reciprocal link manager checking software will also detect incorrect use of the robots meta tag and check the robots.txt file.
However, some link manager software I have used incorrectly reported a link page as blocked by the robots.txt file, because it read "Disallow:" as prohibiting the search engine when in fact it means allow (see above). It is "Disallow: /" that would tell the search engine not to index the site.
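If you ever write or debug such a checker yourself, the whole distinction comes down to whether the Disallow path is empty. A hedged sketch of that one test (the function name is mine):

def blocks_whole_site(disallow_line):
    # Extract whatever follows "Disallow:" and strip the whitespace.
    path = disallow_line.split(":", 1)[1].strip()
    return path == "/"  # only "Disallow: /" shuts out the whole site

print(blocks_whole_site("Disallow:"))    # False - empty path means allow all
print(blocks_whole_site("Disallow: /"))  # True  - blocks the entire site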
Once you've completed your link exchange and done the checks to ensure you're not being cheated, you must then check your links at regular intervals. Once you have more than about 50 links, you will soon find link checking becomes a time-consuming process. It's far better to build your link directory using some form of link manager software that will automatically check your links at intervals you specify.
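Until you have such software in place, even a short script can do the rounds for you. A minimal sketch (MY_SITE and the partner URLs are placeholders you would fill in) that checks each partner page still contains a link back to you:

import urllib.request

MY_SITE = "http://www.yoursite.com"  # placeholder: your own domain
PARTNER_PAGES = [
    "http://www.domainyourlinkingto.com/dir/web-design.html",
    # one line per link partner page
]

for page in PARTNER_PAGES:
    try:
        html = urllib.request.urlopen(page, timeout=15).read().decode(
            "utf-8", errors="replace")
    except OSError as err:
        print("FAILED:", page, "-", err)
        continue
    # Crude but effective: is your URL still anywhere in the page source?
    if MY_SITE in html:
        print("OK:", page)
    else:
        print("LINK MISSING:", page)

Schedule it with cron or Windows Task Scheduler at whatever interval suits you.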
To help you make the right choices about setting up a link directory, read my article "10 Mistakes to Avoid When Setting Up a Link Directory", which you can find at http://www.webpageaddons.com/link-manager-mistakes.htm
Of course, not all link theft is intentional; sometimes it's just the webmaster not knowing that the way he has set up his link directory will not provide search engine link benefit to anyone who links to him. However, some link theft is intentional; the webmaster knows exactly what they are doing, and by following this advice you can avoid being their next victim.

Tony Simpson has been involved in Web Site Design, Promotion and Optimization for 5 years. He provides advice, product reviews and products at Web Page Add Ons to Make Automation of Your Web Site Work for You.