
1st Steps for SEO Site Audit – Indexing Problems in Google

You own a website. What is the one thing you would wish for? Yes, you want Google to index your site. Why? So you can rank better than other sites, increase viewership and, ultimately, earn more. But is it really that simple? Only quality content, backed by some effort, can make your site rank better than others.

Many site owners now face difficulty in getting their site indexed by Google and are looking for a reliable way to fix it.

Follow these steps to boost your Google ranking

Suppose you find that your site is not being indexed by Google. What could be the reason? Your site is not yet being read by the search engines, i.e. Google, Bing and so on.

There is no way your site will rank well if you are not even aware of your site's indexing status.

Know whether your site is being indexed or not

A number of tools are available online to check whether your site is being indexed or not.

During indexing, search engines read your pages and store them so they can be returned in search results.

A simple way to check whether your site is being indexed is the site: search operator.

Type site: (the colon operator) followed by your domain into the Google search box.

For example: site:linkskorner.com

Then press Enter. Doing so will display all the pages Google has indexed for that particular domain. You can also enter a specific page URL of the domain to check whether that page is indexed.
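For quick reference, the two query forms look like this (the page path below is only a placeholder):

    site:linkskorner.com
    site:linkskorner.com/some-page-url/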


What if your site is not being indexed?

There are not many, but two important reasons why your site may not be indexed. They are

  • Meta robots tags on your pages
  • Inappropriate use of Disallow rules in your robots.txt file

These two, the meta robots tag at page level and the robots.txt file, send instructions to search engines on how your site or page should be treated.

Do you know the difference between the two? Robots meta tags apply to an individual page of a site, while the robots.txt file governs the site as a whole.

In the robots.txt file, you can single out particular pages or directories and decide how they should be handled during crawling and indexing.

Robots.txt

Not sure whether your site uses a robots.txt file? Here is a quick way to find out.

Just enter your domain in the browser followed by /robots.txt, for example linkskorner.com/robots.txt.

Are you aware of the robots.txt Tester in Google Search Console? It is a more convenient and user-friendly tool that helps you spot errors in your robots file.

An additional advantage of this tool is that it lets you check whether the robots file is currently blocking Googlebot from a specific page of your site.

You have to enter the specific URL of the page and click ‘Test’.
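If you prefer a programmatic check, here is a minimal sketch using Python's standard library robotparser module; the domain and page URL below are only placeholders:

    from urllib import robotparser

    # Fetch and parse the live robots.txt file (placeholder domain).
    rp = robotparser.RobotFileParser()
    rp.set_url("https://linkskorner.com/robots.txt")
    rp.read()

    # Ask whether Googlebot is allowed to crawl a specific (placeholder) page.
    allowed = rp.can_fetch("Googlebot", "https://linkskorner.com/lp/offer-page")
    print("Allowed" if allowed else "Blocked by robots.txt")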


When a page or directory on the site is blocked, it appears after Disallow: in the robots file. In the example below, I have disallowed the landing page folder (/lp/) from indexing using the robots file.

This prevents any pages of that directory from being indexed by search engines.
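A minimal robots.txt along those lines might look like this (the Sitemap line is optional and the domain is a placeholder):

    User-agent: *
    Disallow: /lp/
    Sitemap: https://linkskorner.com/sitemap.xml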

There are many other ways you can use the robots file. Google's developer site has a complete overview of all the directives you can use in robots.txt.

Robots meta tag

Robots meta tags are placed in the head section of a page. Note that there is no need to include both a robots meta tag and a robots.txt rule to disallow a page.

If you have already disallowed the landing page folder (/lp/) using the robots.txt file, you don't need to add a robots meta tag to every landing page in that folder to block Google from indexing them.

The robots meta tag has further functions too.

For example, you can tell search engines that links on the entire page should not be followed for search engine optimization purposes. That could come in handy in certain situations, like on press release pages.


The commonly used directives for SEO with this tag are noindex / index and nofollow / follow (see the example after this list):

  • index, follow:- This is the default behaviour. This directive tells search engine robots to index the specified page. It also tells them to follow the links on this page.
  • noindex, nofollow:- This directive tells search engine robots NOT to index the information on this page. It also tells them NOT to follow the links on this page.
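As a sketch, the tags sit in the page's head section like this (a real page would normally carry only one robots meta tag):

    <head>
      <!-- Default behaviour: index the page and follow its links (can be omitted) -->
      <meta name="robots" content="index, follow">

      <!-- Keep the page out of the index and do not follow its links -->
      <meta name="robots" content="noindex, nofollow">
    </head>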

For more information about robots meta tags, refer to Google's developer site.

XML Sitemaps

An XML sitemap acts as a roadmap to all the important pages of a website.

When you create a new page on your site, you want search engines to crawl and index it. You can list it in an eXtensible Markup Language (XML) sitemap and register that sitemap with the search engines to help it get indexed.

With an XML sitemap, Google can find and crawl all the pages you consider essential on your site. The sitemap also helps Google understand how the important pages of the site are structured.

An XML sitemap helps even more when you have a new page with no inbound links to it; without a sitemap, search engine robots would find it difficult to discover that page by following links.

To make it easier for search engine robots to find your pages, many content management systems now offer XML sitemap capability either built in or via a plugin, such as the Yoast SEO plugin for WordPress.

Note: You should have an XML sitemap registered with Google Search Console and Bing Webmaster Tools. This lets Google and Bing know the location of the sitemap so they can crawl and index the pages listed in it.
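For illustration, a minimal XML sitemap looks something like this (the URLs and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://linkskorner.com/</loc>
        <lastmod>2018-06-01</lastmod>
      </url>
      <url>
        <loc>https://linkskorner.com/some-important-page/</loc>
      </url>
    </urlset>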

Believe it or not, with this method you can find out whether your page is indexed or not within 8 seconds. Really, it's very quick.


JavaScript

Google has announced that it executes JavaScript and indexes certain dynamic elements. However, Google cannot always execute and index all JavaScript.

In Google Search Console, the Fetch and Render tools can help you determine whether Googlebot (Google's robot) can see content that is generated by JavaScript.

If the Fetch and Render results show that Googlebot does not see the page and its links the way we do, it means Googlebot cannot follow the JavaScript-generated links to deeper pages on the site.
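To illustrate, here is a hypothetical snippet in which a link exists only after the script runs; a crawler that does not execute the JavaScript never discovers that link:

    <div id="latest-posts"></div>
    <script>
      // This link is injected client-side; a crawler that does not run
      // JavaScript sees only the empty div above.
      document.getElementById("latest-posts").innerHTML =
        '<a href="/lp/offer-page">Latest offer</a>';
    </script>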

Conclusion

Your ultimate goal in creating a page or a site is to get it indexed by search engine robots. Just keep in mind the few steps mentioned in this blog and check your site's indexing status. With quality content and effort, you can definitely make your page rank.

Ritvik

Ritvik is a passionate blogger at LinksKorner. He loves to share his knowledge of the latest and most productive link building resources through his blogs. Apart from writing, he enjoys reading books on digital marketing.
