Search engine robots are relatively simple software programs. If your website is not being indexed, the problem may be on your side: when an indexing robot cannot find the content of a page quickly, it skips the site and moves on to the next link in its list. For that reason, it is important to make sure that search engine robots can index your website without any problems.
Here are the top 5 reasons:
Reason 1: Your URLs contain too many variables
URLs that contain many variables can cause problems for search engine robots: if the query string carries too many parameters, robots may skip the page rather than crawl every possible variant.
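As a rough self-check, you can count the variables in your own URLs. This is only a sketch: the threshold of 3 below is an arbitrary assumption for illustration, not an official limit published by any search engine.

```python
from urllib.parse import urlparse, parse_qsl

# Arbitrary illustrative threshold, not a documented search-engine limit.
MAX_VARIABLES = 3

def too_many_variables(url: str) -> bool:
    """Return True if the URL's query string has more than MAX_VARIABLES parameters."""
    return len(parse_qsl(urlparse(url).query)) > MAX_VARIABLES

print(too_many_variables("https://example.com/shop?cat=2"))
# False
print(too_many_variables(
    "https://example.com/shop?cat=2&sort=asc&page=5&ref=home&sid=x1"))
# True
```

A URL that fails this kind of check is usually a candidate for URL rewriting into a shorter, static-looking address.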
Reason 2: Your robots.txt file is damaged or contains a typo
If search engine robots misinterpret your robots.txt file, they might completely ignore your web pages.
Double-check your robots.txt file and make sure that you use the Disallow directive only for pages that you really do not want indexed.
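One way to verify how your rules are actually interpreted is Python's standard-library robots.txt parser. The snippet below uses a hypothetical robots.txt in which one directive is misspelled ("Disalow"); parsers silently ignore the broken line, so the rule the author intended is never applied.

```python
from urllib import robotparser

# Hypothetical robots.txt for illustration: the misspelled
# "Disalow" line is silently ignored by the parser.
robots_txt = """\
User-agent: *
Disalow: /private/
Disallow: /tmp/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# /tmp/ is correctly blocked, but the typo lets /private/
# through even though the author meant to hide it.
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # True
print(rp.can_fetch("*", "https://example.com/tmp/cache.html"))     # False
```

Running the same check on your real robots.txt (via `rp.set_url(...)` and `rp.read()`) will tell you whether robots see the rules you think you wrote.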
Reason 3: Your web pages contain too much code
If it is difficult for you to spot the actual content of a page in its source code, search engines might also have difficulty parsing your pages.
Reason 4: You use session IDs in your URLs
Session IDs create a different URL for every visit to the same page, so robots either index countless duplicates or avoid such URLs altogether. Where possible, keep session information in cookies instead of in the URL.
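A common remedy is to serve robots one canonical URL with the session parameter stripped. The sketch below shows the idea with Python's standard library; the parameter names in SESSION_PARAMS are assumptions for illustration, so adjust them to whatever your site actually uses.

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical session-parameter names (assumed for this example).
SESSION_PARAMS = {"sid", "sessionid", "phpsessid"}

def strip_session_ids(url: str) -> str:
    """Return the URL with session-ID query parameters removed,
    so every visitor (and robot) sees one canonical address."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(strip_session_ids(
    "https://example.com/product?id=42&PHPSESSID=abc123"))
# https://example.com/product?id=42
```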
Reason 5: Your website navigation causes problems
If your navigation relies on JavaScript menus, Flash, or image maps without plain HTML links, robots may be unable to follow the links and will never reach your inner pages. Provide simple text links to every important page.