How to Index a Website on Google? 4 Proven Ways


The journey of a website does not start with the first post or the first webpage. The journey to a thousand posts and hundreds of thousands of visitors per month starts with something much simpler: an index.

A website, including this one and yours, is a collection of databases, images, text, and other packaged content (such as HTML, CSS, and JS files). All this data sits on a server, also called the website’s host. But the server is only where the website lives. Getting on Google requires you to complete another task: indexing your website.

So what is indexing, and why is it so important for a website to get indexed? This article will explain all the process details, the methods of getting your site indexed, and some bonus steps that you need to follow to get the best results. First, let’s understand what indexing is.

What is Indexing?

A website is stored on a server. All the files, images, videos, and text that you access are served from there. So where does indexing come into the picture? Users do not search for websites on the raw internet, which is a jumbled mess of files and links. Instead, people use special programs called search engines to reach websites. Some examples of search engines include Google, Yahoo, Bing, etc.

Put simply, if servers are where websites reside, indexes are the address book for them. With the help of search engines, users can navigate to these server locations and access the files (text, photos, videos, etc.). So you can imagine how crucial indexing is for getting discovered.

The index is built from an XML file that contains hyperlinks to all the different parts of a website. People navigate a website through its web pages, but most of the visual elements are unnecessary for search engines.

They want just the skeletal structure of the website: how many pages there are, how many posts there are, and some other metadata. With that, they can guide users to specific pages or posts when needed.

Below you can see an example of a raw index. This is the language of search engines: no images, no graphics, just addresses. Don’t worry, not all indexes are this messy.

What is Indexing

Most indexes are stored in the sitemap file of the website. The name is very apt, as it is literally a map of the website. For example, here’s a well-organized sitemap (index) of another website with everything listed and sorted.

What is Indexing 1
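Stripped of the styling that sitemap tools add, a sitemap is just a short XML list of page addresses. A minimal sketch, with hypothetical example URLs, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2022-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/about/</loc>
  </url>
</urlset>
```

Each url entry points to one page; the optional lastmod date tells crawlers when that page last changed.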

As you can see, all the index entries are links to some page or post of the website. So now we know what the index of a website is. But what is the need for indexing a website?

Why is Indexing Necessary?

You can have a website, a server (hosting plan), and upload posts, articles, images, and so on. But as long as the website is not indexed, it will not get a single visitor, no matter how great the posts and articles are. This is because Google (and the other search engines) are unaware of its existence. Out of billions of web pages and sites, how would Google know about a new website?

This is where indexing comes into play. Google and all the other search engines keep a repository of indexes of all the websites you can access through them. Google’s super index is called “Caffeine.” It is a library of files just like the ones shown in the pictures above. Any website you see on Google’s search results page, including ours, has a spot in Google’s Caffeine.

Whenever a user searches for something, Google has to present relevant answers in the form of search results. But how would Google know where the web pages are? Since millions of websites are hosted on hundreds of thousands of servers, Google needs an address book to reach all these websites when needed. That is what Caffeine is.

So every time a search query is made, small programs called “bots,” “crawlers,” or, more aptly, “spiders” scour this repository of indexes. They compare different sites through the indexes, look at relevant posts, and finally present the best pages for the query on the search results page (after SEO evaluation). If a website is not present in the index, none of these processes even begin.

This is why indexing your website is the first and most important step to being discoverable. If your site is not indexed, your website is just a personal cloud storage service for your text and media files.

Now that we are clear on indexing and everything associated with it, let’s look into the process of indexing and everything you’ll need for it. We’ll describe the easiest and fastest method (the one recommended by Google), along with some other methods in case the first one does not work.

Things Needed For Indexing

For the easiest indexing method, you’ll need a website (of course), and if you are using a popular CMS such as WordPress, a plugin called “Yoast” will do most of the job. It is a free plugin, and it is not only great for generating sitemaps but has become a staple for website SEO.

Another important part is Google Search Console. It is a service by Google for monitoring your website: you can regularly check the index’s status, see how often Google crawls your website, and spot any indexing issues. It is free, and registering your website is easier than you might imagine. No matter how you create the index, registering on Google Search Console is a must if you want to be seen on Google.

1. Registering on Google Search Console

Here’s a quick guide to registering your website on Google Search Console, since this is a mandatory step to getting your site indexed. We shall also describe the same process for the Bing search engine, though the process is very similar for both.

In some cases, if your website is registered and verified in Google Search Console, Bing can automatically detect it and register your site in its own index repository.

First and foremost, visit Google Search Console and create an account using your primary email address, since Google will use this address to communicate all important website information. Once on the homepage, look at the top left side to find the option to add a property. This is where you add your websites. Click on it.

Google Search Console

The next step will bring up a dialog box something like this:

Registering on Google Search Console1

This is where you enter the domain name of your website. While the URL-prefix method can be faster and easier, we recommend going for the domain verification method. It might take a little longer, but Google will verify all the subdomains of your website across HTTP and HTTPS. So enter your domain name and click continue.

Registering on Google Search Console2

Here, you can see that a code is presented to you for verification. You will need to add this code on your domain registrar’s website so that Google can verify your ownership.

For automatic verification, you can also select from the list of supported registrars, such as NameSilo, GoDaddy, etc. But do note that the registered email address must be the same for Google Search Console and your domain registrar. Then, copy the site verification code.

NOTE: You just need the code, not the text part that says “google-site-verification.”

2. Moving to The Domain Registrar

Moving to The Domain Registrar

Log in to wherever your domain’s nameservers are managed, usually your domain registrar. If you are using Cloudflare, you will make these changes in Cloudflare instead. All you need to access is the DNS (Domain Name System) records. Here’s an example from Namecheap.

In the DNS management section, create a new record. For the record type, select TXT. In the host field, enter “@,” which points to the root domain. In the value field (sometimes labeled as the IP address or content field), paste the code copied from Google Search Console. Set the TTL to automatic and save the record.

While we have shown Namecheap as the example, almost all other registrars have the same process. Just look for the domain you want to work with, open the DNS management section, and edit the records.
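In DNS zone-file terms, the record you just created looks something like this (the token after the equals sign is a hypothetical placeholder; use the one Google gave you):

```
@    3600    IN    TXT    "google-site-verification=abc123placeholdertoken"
```

Here “@” stands for the root domain, 3600 is a one-hour TTL, and TXT is the record type.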

3. Waiting For Google 

Waiting For Google

After you are done with the DNS records, come back to Google Search Console and press verify. Now comes the waiting part, since DNS updates take a while to propagate. So verifying on GSC might initially report that Google’s crawlers could not verify your property. Don’t worry. Give it a day or two, and your site will be verified.

Along with this, you can log in to your website dashboard (backend), install the free Yoast plugin, paste the verification code in the General > Webmaster section, and save the changes. While this is not strictly necessary, it is recommended as an extra step.

4. Alternate Methods

Alternate Methods

The method we showed for getting your site verified is the easiest, fastest, and recommended one. But in case you do not want to take that approach, there are other ways of getting your site verified.

For example, if you do not want to access your domain registrar and would rather use your website’s files directly for verification, then the HTML file method would be best for you. This method also suits people who do not use a CMS like WordPress and are more familiar with cPanel.

Download the HTML verification file from Google Search Console. To get the file, enter the URL of your website in the second option (URL prefix) instead of using the domain verification method. This will present you with a host of options, the HTML file being one of them.

The file will have a name like “google….html.” After getting the file, open the cPanel of your website and click on the File Manager link. This will bring up the folders containing your website files.

Alternate Methods 1

From the list, select the public_html folder and then your domain name folder. In the folder for your domain, upload the HTML verification file, and that’s it. Once done, go back to Google Search Console and click on the verify button. Don’t worry if it does not happen instantly; Google Search Console will verify your site.

Alternate Methods 2

There is also the HTML tag method, where you add a meta tag to your website’s source, inside the head section of your pages. While this method is safe in principle, we do not advise trying it, since most WordPress users would have to edit the theme’s code or tinker with the website’s HTML, which could break the site. If you do go this route, ask a website developer to access and edit your website’s source files.
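For reference, the tag used in that method is a single line placed inside the head section of your homepage; the content value below is a hypothetical placeholder for the token Google gives you:

```html
<head>
  <!-- Google's verification tag; replace the content value with your token -->
  <meta name="google-site-verification" content="YOUR_TOKEN_HERE" />
</head>
```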

5. Creating The Sitemap

After your website has been verified by Google using one of these methods, you can access your website’s data in Google Search Console. Just select your website from the top left corner.

Now comes the part for indexing your website. Note that how fast Google will index your website depends on its availability. Once your website has been crawled and indexed, you are good to go.

So how to tell Google to reach your website?

By creating a sitemap.

Google can access all the pages and posts of your website via the sitemap of your website.

Creating a sitemap is very important and, fortunately, very easy too. For people using Yoast, generating a sitemap is a breeze. Here’s how to do it.

Log in to your website dashboard. In the SEO (Yoast settings) section, click on General > Features. Here, you’ll see all the features that come with Yoast. Near the bottom, you’ll find the XML sitemaps option. Click on the question mark beside it, then click on the “See the XML sitemap” link. This link takes you to the sitemap of your website, a page that lists all the important links of your site, from pages to posts. Your sitemap has been generated.

6. Adding it to Google Search Console

With the sitemap ready, go back to Google Search Console and click on the Sitemaps tab in the left-hand side menu. This section contains the “Add a new sitemap” field. Take a look at the picture for a better understanding.

Adding it to Google Search Console

Your website’s domain name will already be filled in in the “Add a new sitemap” section. All you have to do is type sitemap_index.xml, which is the path of the sitemap page created by Yoast. This is where Google will send its crawlers to get more information about your website.

Enter the link and then click on submit. The process won’t be instant, but your sitemap’s status will soon turn to “Success,” as shown in the picture. We will discuss the submitted and last read columns later in the article.

A very important check is to make sure your website is not stopping Google’s bots from crawling it. If it is, your site might not get indexed even after submitting the sitemap. Here’s how to check for it.

Open your website dashboard, then, in the left-hand side menu, click on Settings and go to the “Reading” section. At the bottom of that page, you’ll find an option that says “Search engine visibility.” DO NOT check the box, as it will discourage search engines from indexing your website. If the box is checked, uncheck it.

Adding it to Google Search Console 1

Many websites have this turned on, and as a result, Google cannot access the site. With too many roadblocks, Google might stop trying to index your website altogether. So this is a very important thing to check.

An Alternate Way of Indexing 

People who are not using a CMS such as WordPress cannot use the Yoast plugin. For them, there is another, somewhat longer process for creating and submitting a sitemap.

Remember that you do not upload the sitemap file to Google Search Console; you just submit a link to it. The sitemap itself stays in your website’s folders, on the server where it is hosted. So submitting the link will NOT work if the sitemap file has not been created.

To create a sitemap easily, you can visit any sitemap generator website, such as the XML Sitemaps Generator, which we found simple and easy to use. First, visit the website and enter your domain.

Depending on the size of your website, sitemap generation can take from a few seconds to a few minutes. Assuming you started your website recently (why else would you be waiting for indexing?), the site will generate your sitemap within a minute or so.

Download the sitemap file, which will be in XML format. From this step, the process is the same as uploading the verification file. Using cPanel, open your website’s root files: go to File Manager, click on the public_html folder, then click on the domain folder and upload the file there. That’s it; your sitemap has been created and uploaded.
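Under the hood, a sitemap generator does something very simple: it wraps each page URL in a bit of standard XML. Here is a minimal Python sketch of that idea, using hypothetical example URLs (an illustration, not any particular tool's actual code):

```python
# Minimal sitemap generator sketch: wraps a list of page URLs in the
# standard sitemap XML structure. The URLs are hypothetical placeholders.
import xml.etree.ElementTree as ET

def build_sitemap(page_urls):
    # The xmlns attribute marks this as a standard sitemap file.
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in page_urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page  # one <loc> entry per page
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    "https://example.com/",
    "https://example.com/about/",
])
print(sitemap_xml)
```

You would save this output as sitemap.xml and upload it exactly as described above.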

Now back to Google Search Console and the same process of adding a sitemap: add the link to your sitemap and click on submit. After a while, your sitemap will be successfully added to Caffeine. Congratulations, your website just got indexed by Google.

For Bing, the process is the same. All you have to do is go to Bing Webmaster Tools and log in or sign up using the same email address used in Google Search Console. The user interface is also very similar. On the top left side, click on add a new site, and you’ll get the same options for verifying the site as in GSC. The better option would be to import the site from Google Search Console for a hassle-free experience.

An Alternate Way of Indexing

That’s it. Bing will index your website as well. You generally don’t need separate indexing for most other search engines; DuckDuckGo, for instance, primarily sources its results from Bing, so once you’re on Google and Bing, you are good to go.

How to Check if a Website is Indexed or Not?

After your sitemap has been read and stored, it is important to check that indexing has actually completed. To check whether Google has taken your website into the Caffeine repository and the search results show your website, all you need is a simple Google search.

Go to the Google search bar and type site: followed by your domain. So if your website is example.com, the search query would be site:example.com. Click search, and if you see your website on the search results page, the site has been indexed and everything is working properly. If you do not see your site there, wait a few hours or even a day or two. It will be indexed soon.

1. Indexing Intervals

A very important part of indexing is how often Google visits, or crawls, your website. This matters because if frequently updated, important pages of your website are not getting crawled, those updates will never be shown to readers and hence will not rank. This poses a big problem, and while there is no sure fix, since Google selects the crawling frequency, there are some methods to improve it.

There is a detailed study (with an accompanying YouTube video) of the indexing intervals of some popular websites and how much traffic they lost due to indexing delays. The gist of it is that some large websites, such as Eventbrite, did not get their sites completely indexed, with almost half of their web pages missing from the index. You surely do not want that to happen to your site.

After your website has been indexed, Google crawls it at regular intervals to look for any changes or updates. It is at Google’s discretion when it will crawl which part of your website.

While the search engine has gotten smarter at understanding which parts of your website need frequent crawling, in some cases important pages are not crawled for months. Let’s see how to improve this situation.

2. Check Coverage

Go to Google Search Console and click on the Coverage tab on the left side. This section shows all the important indexing data for your website. If there are any indexing errors, or if Google cannot access certain links, everything will be shown here.

Check Coverage

Here, if you notice, the top right corner shows the last updated date as of yesterday. This means that Google is crawling parts of this website frequently, which is a good thing. You can check how often your website is getting crawled from this section.

In the Sitemaps section, you can see the last read date of your site. This is what Google considers the overall site crawling date, and it should usually be no more than a week or two behind the current date. Anything longer than that suggests there may be some indexing issues with your website.

Based on how frequently parts of your website get updated, Google decides the crawling frequency. If you see a longer waiting period for crawling, it could be due to some internal problems or an update at Google. So do not panic as long as the last read date is within the last two weeks. But what if you want to get some parts of your website crawled right away?

3. Manual Crawling Request

In Google Search Console, there is a section that allows users to ask Google to crawl specific parts of their website. Note the word “ask,” since Google will still decide whether those parts need crawling. But since that is all you can do from your end, this is the option to use.

Click on the URL inspection tool in GSC and enter the link to the web page you want crawled. If the web page is not indexed, you’ll get a message saying the page is not indexed and the URL has not been discovered. Here’s an image of that:

Manual Crawling Request

You can request that Google index that particular page from here, and Google will consider crawling it. But that is not all. There can be cases where this tool says the page has been discovered but not indexed. This usually happens for pages of your site that are not needed in the search results.

For example, settings, admin, and similar pages are usually discovered by Googlebot but not indexed. But if you see this “discovered, but not indexed” message for important pages, then request indexing as soon as possible.

4. The Tag-Along Method

The tag-along method for getting certain pages crawled works like this: take a page from your website that gets a lot of clicks and is frequently crawled, and add a link to the page you want crawled inside that popular page. Since Google crawls the popular page frequently, finding the link there will lead Googlebot to crawl the new page as well.

This is not a guaranteed fix, as Google might find the page and still decide that it is not necessary to crawl it. But that is a rare scenario. So the first method, along with this one, should be enough to make Google crawl your page.

Go to the Performance tab in Google Search Console, where you can get a list of all the popular pages and posts of your website. Then go to the backend of your website, edit a popular post, and contextually add the link to the page you want crawled. Remember to add the link contextually: it must be somehow connected to the page you are putting the link in, as unrelated linking is a negative SEO factor.

So far, we have covered verifying your domain with different methods and adding your website to Google Search Console and other search engines. Creating sitemaps, submitting them, and requesting specific pages to be crawled are also covered. But what if nothing works? What if your site is not getting indexed or crawled at all? The next section talks about solving this issue and ensuring your website gets indexed.

Robots.txt And Everything About it

While sitemaps are the maps search engines use to navigate your website, the navigation itself is done by small programs called bots, also known as Googlebots, crawlers, or spiders. They are responsible for reading your site, crawling it, and presenting it to people when a search is made. But did you know that there is a way to communicate with these bots?

Just as we communicate with text and voice, these bots take their instructions from a file called robots.txt. Through it, we can communicate with the crawlers and give them directives about crawling and indexing the website.

If there is a persistent error in indexing your website, the error may well be in the robots.txt file of your website. So let’s see how to create a robots.txt file and cover all the important settings for it.

1. Create Robots.txt

If you are using CMS like WordPress, then again, Yoast will create this file easily. All you have to do is go to your website dashboard and follow these steps:

Go to the SEO settings, from there click on Tools, and then click on the File editor. Here’s an image to understand the process better:

Create Robots.txt 2

On the file editor page, you’ll see an option to create a robots.txt file for your website. Click on it, and Yoast will create your robots.txt file. That’s all for creating the file. To check whether the file has been created and served properly, just open yourdomain.com/robots.txt in your browser. So if your site is example.com, the address would be example.com/robots.txt.

Here’s an example of a robots.txt file for a simple website.

Create Robots.txt 1
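In plain text, a typical robots.txt for a small WordPress site reads something like this (the sitemap URL is a hypothetical placeholder):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap_index.xml
```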

We’ll explain what these directives mean. But first, let’s see how to create a robots.txt file if your website does not use a CMS like WordPress. The process is again very simple.

Visit any robots.txt generator website. We recommend SEOptimer, as it is powerful and simple to use. All you need to do is allow the robots with no crawl delay and enter the link to your sitemap, as shown in the image. Also, allow all the search engines, and then generate the file.

Create Robots.txt 3

Once the file is generated, download it and then upload it to the public_html folder using File Manager. After the upload has finished, open yourdomain.com/robots.txt again, and it should be up and running. Now that we know how to get the robots.txt file, let’s try to understand its importance and how it can affect crawling and indexing.

2. Understanding Robots.txt

Understanding Robot.txt 2

Take a look at this image. There are four important directives here that the crawlers will look for and act on. Let’s try to understand each one of them, as this is crucial for indexing your website and solving errors.

The Sitemap line tells the crawlers where your sitemap lives, and thus which pages to crawl. As you can see, the entire sitemap is referenced so that the bots will look at the whole site for indexing. If you want to include only specific sections, you can reference specific sub-sitemaps instead. However, this is not recommended; it is best to include the entire sitemap.

The User-agent line is very important. This is the name of the crawler the following rules apply to. The star symbol “*” is a wildcard that matches every crawler, be it Google’s, Bing’s, Baidu’s, etc. With it, any bot from any search engine you’ve submitted your sitemap to can access your site.

The Disallow and Allow directives work as expected. The Allow section can list all the parts of the website you want Google to crawl and show to the public. It can also be left blank since, by default, the bots are allowed to crawl the site.

The Disallow directive is used to stop the bots from crawling the private parts of the website meant for the administrator. This includes the backend panel, login page, and other pages that only you or the website owner can access. The Disallow rules tell the bots not to crawl those pages or include them in search results. You can add as many paths as you want.

Take a look at how many pages Apple has stopped Google from indexing here:

Understanding Robot.txt 1

So even if you search for these pages or links, you won’t be able to find them, since these pages are not indexed by Google.

Remember that if your website’s robots.txt file is improperly formatted, it might block search engines from indexing the site. Paths in this file are case-sensitive, so make sure they exactly match your site’s URLs. If your website is not getting indexed, look at its robots.txt file, and if you find something like this:

User-agent: * (or Googlebot)

Disallow: /

This would tell Google not to crawl or index your website at all. Change it to “Allow: /” (or simply remove the “Disallow: /” line) to let Google and other search engines index your site.
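Before uploading your rules, you can sanity-check them with the robots.txt parser in Python's standard library. A small sketch, using hypothetical rules and URLs:

```python
# Check which URLs a robots.txt would allow, using Python's standard-library
# parser. The rules and URLs below are hypothetical examples.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow: /wp-admin/",   # keep the admin area out of crawling
    "Allow: /",               # everything else may be crawled
]

parser = RobotFileParser()
parser.parse(rules)

blog_ok = parser.can_fetch("*", "https://example.com/blog/post")    # True
admin_ok = parser.can_fetch("*", "https://example.com/wp-admin/")   # False
print(blog_ok, admin_ok)
```

If a URL you care about comes back False, the rules would block crawlers from it, exactly the "Disallow: /" problem described above.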


Indexing is the most important step if you want your website to rank, for this is how you get into Google’s index. If your website is not indexed, there is no point in adding content to it, or even hosting it, for that matter.

The indexing process might look complicated, but once you understand it, it is one of the easiest things to do. We hope this article helped you understand and index your website.
