Saturday, November 15, 2008

Five mistakes that keep search engine robots away from your website

Search engine robots are very simple software programs. If your website is not being indexed, you should consider whether you are making mistakes that prevent an indexing robot from finding your content. If a robot cannot find the content of your website immediately, it will skip your site and move on to the next link in its list. For that reason, it is important to make sure that search engine robots can index your website without any problems.
Here are the top five reasons:
Reason 1: Your URLs contain too many variables
URLs with many variables can cause problems with search engine robots. If your URLs contain too many variables, search engine robots might ignore your pages.
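As a rough illustration (a minimal sketch using Python's standard library; the example URL and the idea of simply counting variables are assumptions for demonstration, not a documented robot rule), you can count the query-string variables in a URL to spot pages that may give robots trouble:

```python
from urllib.parse import urlparse, parse_qs

def count_url_variables(url):
    """Return the number of query-string variables in a URL."""
    return len(parse_qs(urlparse(url).query))

# A URL like this carries many variables, which some robots may skip:
url = "http://example.com/page.php?id=7&cat=3&ref=home&sort=asc&lang=en"
print(count_url_variables(url))  # 5
```

Fewer, more descriptive URLs (for example rewritten paths like /products/widgets/) are generally easier for robots to crawl.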
Reason 2: Your robots.txt file is damaged or contains a typo
If search engine robots misinterpret your robots.txt file, they might completely ignore your web pages.
Double check your robots.txt file and make sure that you use the disallow parameter only for web pages that you really don't want to have indexed.
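One way to double-check a robots.txt file (a sketch using Python's standard urllib.robotparser module; the rules and URLs below are made-up examples) is to parse it and test which URLs would actually be blocked:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt contents; substitute the lines from your own file.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

parser = RobotFileParser()
parser.parse(rules)

# Pages under /private/ are blocked; everything else stays crawlable.
print(parser.can_fetch("*", "http://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "http://example.com/index.html"))         # True
```

If a page you want indexed comes back as blocked here, a stray Disallow line is the likely culprit.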
Reason 3: Your web pages contain too much code
Of course, your web pages can contain JavaScript code, CSS code and other script code that is not directly related to your content. Visit your website with a web browser and select "View source" or "View HTML source".
If it is difficult for you to spot the actual content of your website in that source, then search engines might also have difficulty parsing your pages.
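A rough way to gauge this (a sketch with Python's built-in html.parser module; the sample page is made up, and comparing visible text to total page size is an informal heuristic, not a published search engine rule) is to measure how much of the page is actual text:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the visible text of an HTML page, skipping scripts and styles."""
    def __init__(self):
        super().__init__()
        self.text = []
        self._skip = False

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False

    def handle_data(self, data):
        if not self._skip:
            self.text.append(data)

def text_to_code_ratio(html):
    """Return visible-text length divided by total HTML length."""
    extractor = TextExtractor()
    extractor.feed(html)
    visible = "".join(extractor.text).strip()
    return len(visible) / len(html)

page = "<html><head><script>var x=1;</script></head><body>Hello</body></html>"
print(text_to_code_ratio(page))
```

A very low ratio suggests the real content is buried under script and markup; moving JavaScript and CSS into external files usually helps.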
Reason 4: You use session IDs in your URLs
Many search engines don't index URLs that contain session IDs because they can lead to duplicate content problems. If possible, avoid session IDs in your URLs; it is better to store session IDs in cookies instead.
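If you cannot avoid sessions entirely, you can at least strip session parameters from the URLs you expose to robots. A minimal sketch using Python's standard library (the parameter names checked, such as sid and PHPSESSID, are common examples, not an exhaustive list):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Common session-ID parameter names (an illustrative, not exhaustive, set).
SESSION_PARAMS = {"sid", "sessionid", "phpsessid", "jsessionid"}

def strip_session_id(url):
    """Remove common session-ID parameters from a URL's query string."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(strip_session_id("http://example.com/page?id=7&PHPSESSID=abc123"))
# http://example.com/page?id=7
```

Serving the cleaned URL to robots means every visit sees the same address, which avoids the duplicate content problem.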
Reason 5: Your website navigation causes problems
Fancy JavaScript or DHTML menus cannot be parsed by most search engine robots. Flash or AJAX menus are even worse when it comes to website navigation.
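One common workaround (a markup sketch; the link targets are placeholders) is to offer a plain HTML alternative alongside the script-based menu, for example inside a noscript block or a simple footer link list:

```html
<!-- Plain text links that any robot can follow,
     in addition to a JavaScript or Flash menu -->
<noscript>
  <ul>
    <li><a href="/products.html">Products</a></li>
    <li><a href="/about.html">About us</a></li>
    <li><a href="/contact.html">Contact</a></li>
  </ul>
</noscript>
```

Robots that cannot execute scripts will still find and follow these links.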