Why Is Google Not Indexing Your Site?

“Why is my site not indexed on Google?!”

Google must index your website before it can send you any organic traffic. If your site is not indexed, you are losing business: nobody will find your content organically, because it isn’t part of Google’s search index.

The first step to fixing an indexing issue is diagnosing it. The following list will help you do just that.

I’ve roughly ordered this list from most to least common. Work through it from top to bottom, and you should find your exact issue.

Google Hasn’t Found Your Site Yet

This is usually a problem with new sites. Give it a few days (at least), but if Google still hasn’t indexed your site, make sure your sitemap is uploaded and working properly. If you haven’t submitted a sitemap, this could be your problem. You should also request that Google crawl and fetch your site. These are Google’s instructions on how to do that:

  • In Webmaster Tools, on the Home page, click the site you want.
  • On the Dashboard, under Crawl, click Fetch as Google.
  • In the text box, type the path of the page you want to check.
  • In the dropdown list, select Desktop. (You can select any type of page, but Google only accepts submissions for its Web Search index.)
  • Click Fetch. Google fetches the URL you requested. It may take 10 or 15 minutes for the Fetch status to update.
  • Once the status shows “Successful”, click Submit to Index, and then click one of the following:
    • To submit the individual URL to Google’s index, select URL and click Submit. You can submit up to 500 of these requests per week.
    • To submit the URL and all pages linked from it, click URL and all linked pages. You can submit up to 10 of these requests per month.

Site Indexed Under a www / Non-www Domain

Technically, www is a subdomain. Thus, http://example.com is not the same as http://www.example.com. Make sure you add both sites to your Google Webmaster account to ensure they are both indexed. Make sure to set your preferred domain, but verify ownership of both.
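If you decide the www version is your preferred domain and your server runs Apache, a 301 redirect in .htaccess consolidates the two versions so Google only ever sees one. This is a sketch, with example.com standing in for your own domain:

```apache
# Sketch: permanently redirect non-www requests to the www version of the site.
# Replace example.com with your own domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

After this, a visit to http://example.com/page lands on http://www.example.com/page with a 301 status, which tells Google which version to index.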

The Site or Page(s) Are Blocked by robots.txt

Another possibility is that a developer or editor has blocked the site using robots.txt. This is an easy fix: just remove the entry from robots.txt, and your site will reappear in the index. Read more about robots.txt here.
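The entry to look for usually takes this form. A robots.txt at your site root that blocks everything looks like:

```text
# This robots.txt blocks the entire site from all crawlers:
User-agent: *
Disallow: /
```

Deleting the `Disallow: /` line (or changing it to `Disallow:` with no path) allows crawlers back in.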

You Don’t Have a sitemap.xml

Every website should have a sitemap.xml. It’s a simple list of directions that Google can follow to index your site. You can read about Google’s Sitemap policy.

If you have any indexation issues on any portion of your website, I recommend that you revise and resubmit your sitemap.xml just to make sure.
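For reference, a minimal sitemap.xml following the sitemaps.org protocol looks like this; the URL and date below are placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is required; lastmod is optional -->
    <loc>http://www.example.com/</loc>
    <lastmod>2016-01-01</lastmod>
  </url>
</urlset>
```

Add one `<url>` entry per page you want crawled, then submit the file under Crawl → Sitemaps in Webmaster Tools.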

Crawl Errors

In a few cases, Google will not index some pages of your website because it can’t crawl them.

To identify these types of errors, go to Google Webmaster Tools → select your site → click “Crawl” → click “Crawl Errors”. If you have any issues, i.e., unindexed pages, you will see them in the list of “Top 1,000 pages with errors.”

Lots of Duplicate Content

Lots of duplicate content on a site can confuse search engines and make them give up on indexing your site. If multiple URLs on your website are returning the same content, then you have a duplicate content issue on your site. To solve this problem, pick the page you want to keep and 301 the rest.
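On Apache, the 301s can be a one-line rule per duplicate in .htaccess. This is a sketch with hypothetical paths; substitute the pages you are consolidating:

```apache
# Sketch: permanently redirect each duplicate URL to the one page you kept.
# The paths below are placeholders for illustration.
Redirect 301 /old-duplicate-page/ http://www.example.com/page-to-keep/
Redirect 301 /another-duplicate/ http://www.example.com/page-to-keep/
```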

It makes sense to canonicalize pages, but be careful: some site owners have reported that a misconfigured canonical tag prevented indexation.
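When done correctly, canonicalization is a single tag in the `<head>` of each duplicate page, pointing at the version you want indexed. A sketch, with a placeholder URL:

```html
<!-- Placed in the <head> of each duplicate page; the href is the page to keep -->
<link rel="canonical" href="http://www.example.com/page-to-keep/">
```

The mistake to avoid is pointing every page's canonical at the homepage (or at itself inconsistently), which is the kind of confusion that can suppress indexation.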

The Site is Blocked by .htaccess

Your .htaccess file is part of your website’s presence on the server: it configures how Apache serves your site on the web. Although .htaccess is handy and useful, it can also be used to block crawlers and prevent indexation.
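If you suspect this, look for rules that match crawler user agents and return a Forbidden response. One common pattern looks like this (a sketch of the kind of rule to remove, not something to add):

```apache
# A rule like this blocks Googlebot with a 403 Forbidden — remove it to allow crawling
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteRule .* - [F,L]
```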

Hosting Down Times

If the crawlers cannot access your site, they won’t index it. This is obvious, but why does it happen? Check your connectivity. If your host has frequent outages, the site may not be getting crawled. Time to go shopping for a new host.

NOINDEX in the Meta Tag

Another way of saying “no” to the robots, and thus preventing indexation, is a stray noindex meta tag. It often looks like this:

<meta name="robots" content="noindex, nofollow">

This is one of those issues where you’re like, “Oh, shit, I can’t believe I didn’t see that!”.

Remove this line of code, and you’ll be back in Google’s index.

AJAX/JavaScript Issues

Google does index JavaScript and AJAX, but these are not as easily indexable as plain HTML. So if you configure your AJAX pages or JavaScript execution incorrectly, Google will not index the page.

Takes Forever to Load

Google doesn’t like it if your site takes an eternity to load. If the crawlers encounter interminable load times, they will likely not index the site at all.

Deindexed

This is the worst case of all.

If you got hit with a manual penalty and removed from the index, you probably already know about it. But if your site has a dark history that you don’t know about, a lurking manual penalty could be preventing indexation.

If your site has been removed from the index, you’re going to have to work very hard to get it back in.

Conclusion

Indexation is the keystone of good SEO. If your site or certain pages of your site are not indexing, you need to figure out why.
