SEO is the most important factor in internet publishing success. No website can skip SEO and still attract organic visitors. And with each algorithm update from Google, search engine optimization becomes more deeply embedded in how Google ranks the web.
However, while literally hundreds of factors, small and large, affect the SEO of a website, people often forget about the tools to do these optimizations. One of these tools is by Google itself, and it’s called Google Search Console.
This article presents a complete SEO guide to Google Search Console: how to use it to optimize your website, how to leverage its power, and how to act on the insights it gives.
Google Search Console is a free service from Google for web admins, developers, and publishers to get deeper insights into the website, its components, performance, and a host of other statistics.
People often confuse it with Google Analytics, thinking the two services are the same. The only similarities are that both are free and both are from Google. Analytics tells you about the people who visit your website; Google Search Console tells you about the website itself.
Apart from Google Search Console being a free service, the other great benefit of using this is it’s from Google. Who knows website data better than Google?
There are similar services from other companies such as Moz Pro, SEMrush, Ahrefs, etc. However, there are two problems with these services. First, all of them are paid, and by paid, we mean very expensive.
The second issue is that the data they provide is sourced from Google itself. Yes, these pro services offer more features and functionality, but Google Search Console is more than sufficient for an average user, even at advanced stages of website data requirement.
So let’s dive into the powerful world of GSC and see how it can be used to improve your website’s SEO. We’ll present details about each of the different sections of the console, what it does, how it can help you in optimizing your website, and some other tips and tricks to keep your website ahead in the ranking, literally! Let’s get started.
We have skipped the first option that you get in Google Search Console, and we'll explain why later in this article. First, let's start with the Performance tab in GSC. As the name implies, the Performance section shows your website's search performance, or more precisely, how it performs for different keywords. This section has some of the most important insights about the website, helping you understand whether all your other SEO practices are bringing any benefit to your site.
The top section shows a graph of your website's performance over the months or years, covering both clicks and impressions. The data is quite granular: you can see the clicks and impressions for each individual day. And it's not just web pages; there are also options to view the statistics for images, videos, and news.
This section also shows the average CTR and average position over the selected time period. CTR stands for Click-Through Rate, the metric you get by dividing the number of times people clicked on your website by the total number of times it was shown to them.

In simple words, it is clicks divided by impressions. Anything between 1.7% and 1.9% is generally considered a good click-through rate.
The average position shows the average position your website holds across all the keywords it ranks for. The smaller the number, the better. This is where you can tell whether the SEO practices you have employed are working.

Select and compare two time frames (say, July to September and October to December). The SEO practices are working well if you see the graph rising and the average search position falling.
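Both metrics are straightforward to compute from raw clicks, impressions, and position data. Here is a minimal Python sketch with invented numbers; the impression-weighted average is an approximation of how GSC aggregates position across queries:

```python
def click_through_rate(clicks: int, impressions: int) -> float:
    """CTR as a percentage: clicks divided by impressions, times 100."""
    if impressions == 0:
        return 0.0
    return round(clicks / impressions * 100, 2)

def average_position(rows) -> float:
    """Impression-weighted average ranking position.

    `rows` is a list of (position, impressions) pairs, one per query.
    """
    total_impressions = sum(imp for _, imp in rows)
    if total_impressions == 0:
        return 0.0
    return sum(pos * imp for pos, imp in rows) / total_impressions

print(click_through_rate(clicks=45, impressions=2500))   # 1.8 -> a "good" CTR
print(average_position([(3.0, 1000), (12.0, 500)]))      # 6.0
```

With these invented numbers, 45 clicks out of 2,500 impressions gives a 1.8% CTR, right in the "good" band mentioned above.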
Metrics under Performance tab
There are some other metrics in the performance tab that further detail the website performance data. These tabs include:
This section shows the best-performing keywords of your website. Remember that these might not be the exact keywords you have worked on, since Google ranks keywords according to its own methods and selection. This section helps you understand what type of keywords you rank for, and even lets you choose related keywords that will give your site a better chance of ranking.
This section shows the web page that’s ranking and how many clicks and impressions it is bringing to the site. This helps you tag other posts or link them to other similar web pages to get some of that “ranking juice.”
This section shows which countries most of your traffic comes from in the time frame you have selected. These numbers are dynamic and keep changing. Knowing which countries you rank in lets you create content more suitable for those places.
Or, if you want to target a specific country for your website particularly, then a rising number of visitors from that country can indicate that your SEO practices are helping.
Which type of device is the biggest source of your website traffic? This section shows that, across mobile, desktop, and tablet devices. How is this beneficial to your SEO? If, for example, you are getting more traffic from mobile devices, make your pages more mobile-centric, with better fonts, easier navigation, and a simple, light theme.
The Search Appearance and Dates sections show the same data as the graph does. The four metrics mentioned above are the most helpful ones for your website's SEO.
2. URL Inspection
The URL inspection is a great tool for your website’s sitemap and indexing information. We have covered the importance of indexing and everything related to it. You can read that article here. To get the gist of it, indexing is the process of Google registering your website and all the various pages inside it to its own directory.
So when someone searches for any keyword, it can look up this directory, and if your webpage suits the SEO, it gets ranked. If your website is not indexed, or even some pages of your site are not indexed, it will not get ranked or even show up in a search.
The URL inspection tool in GSC checks whether particular pages of your website are indexed by Google or not. First, go to the web page you want to check on your website. Then copy the public link (not the backend one) and paste it into the URL Inspection search bar. If the web page with that link is indexed, GSC will tell you exactly that. If not, the console will report that the URL is not on Google.
If the web page exists and says that the URL is not on Google, you might ask Google to index it by clicking on the Request Indexing button. This is a great tool for your SEO because if some of your website’s most important web pages are not indexed, you are losing potential rankings and visitors.
A great way to check whether all your web pages are indexed is this: open the Pages list from the Performance tab and make a note of all the pages that are not ranking (with zero impressions and clicks). Then test the URLs of those pages. How does this help?

Sometimes, Google can be very selective about which pages it indexes and which it excludes. Some of these pages may have been crawled (checked by Google) but not indexed, which would explain why they are not being shown to visitors. You can request indexing for these pages, improving your SEO efforts.
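As a sketch of that triage, here is how you might filter an exported performance report for pages worth running through URL Inspection. All URLs and numbers here are invented for illustration:

```python
# Hypothetical GSC performance rows; in practice you would export
# this data from the Performance tab (or the Search Console API).
pages = [
    {"url": "https://example.com/guide", "clicks": 120, "impressions": 4300},
    {"url": "https://example.com/old-post", "clicks": 0, "impressions": 0},
    {"url": "https://example.com/about", "clicks": 3, "impressions": 90},
    {"url": "https://example.com/draft-notes", "clicks": 0, "impressions": 0},
]

def pages_to_inspect(rows):
    """Return URLs with zero clicks AND zero impressions:
    candidates to paste into the URL Inspection bar."""
    return [r["url"] for r in rows if r["clicks"] == 0 and r["impressions"] == 0]

print(pages_to_inspect(pages))
# ['https://example.com/old-post', 'https://example.com/draft-notes']
```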
The Pages tab is the doctor of your website, always looking out for errors, missing pages, compatibility issues, server issues with some pages, readability issues, and much more. This page has four sections: Errors, Valid with Warnings, Valid, and Excluded. Here's what each of these means:

Errors

This section is where all the errors on your website are shown. All possible errors, from indexing failures to 404s to readability issues where the text is too small to read, will be listed here. This section should show 0 issues; if there are any, clicking on one will show the details of the error, with a link to the page and the type of error it is experiencing.
Valid with Warnings
This section shows the pages that are good to go for now but could become a problem later; in other words, the problematic pages of your website that might be causing underlying ranking and performance issues. Click on any entry to get more details about the issue. This count should also be 0.
Valid

This section is where all the valid pages of your website are shown. These pages need no technical work or improvement. Remember, this reflects the indexing and technical health of these pages, not their ranking, so they might still need better content and other search engine optimization.

Excluded

This is the most interesting and confusing part of the Coverage page. You'll see that many of your web pages or links end up in the Excluded section. This, according to us, is one of the most powerful indexing tools you can find. Here's what the different subsections mean:
Crawled, currently not indexed
This part will mostly contain the RSS feed links of your website. These are just bits of information that do not need to be indexed, so if you see a lot of links here, do not panic: Google has seen these parts of your website and decided not to index them.

However, if you see anything other than "feed" at the end of the links here, or any link to an actual post, you should request indexing for that link in the URL Inspection section.
Alternate page with the proper canonical tag
This section shows the pages of your website that Google considers duplicates of another page, with a canonical tag pointing to that other page. Canonicalization is a complex process that needs a separate article to explain completely, but for now, here's what you need to know.

While a few pages in this section are normal, if you see a very large number, check the canonicalization of these pages and, where appropriate, request indexing for them. Leaving important pages unindexed can result in a huge loss of website traffic, so make sure none of your important pages are on this list.
To request crawling, you can either copy the link into the URL Inspection tab or just click on the link and choose "Inspect URL." You can also select "Test robots.txt blocking," which shows whether the page is blocking Google from visiting and indexing it.
Pages with Redirect
This section shows the pages on your website that take visitors to a web page other than the one they requested. A broken link or a moved page could be the cause of this. Visit these pages, look at the links, and check whether everything is working. If not, fix it there. And if things are working fine, you can request validation in GSC for that particular page.
Discovered, currently not indexed
This section mostly shows your website's tag and/or category pages. There is no need to worry about those. But if you see any page other than these, request indexing and fix it.
Not found (404): Webmasters need to fix this issue ASAP. These are pages that no longer exist on the website but are still linked to from somewhere. This is improper website structuring and can be a negative SEO factor.
Excluded by the ‘noindex’ tag
Check all the links in this section to make sure none of the important pages of your website are here. This section shows all the pages that specifically request not to be indexed. You can find random strings and other obscure elements of the website here, but no important page with articles or anything that is meant for the visitors must be here.
Soft 404

This section shows pages that have been deleted or are not present on the website but still do not return a proper 404 status; they just show a "page not found" message. All you need to do is properly fix these pages and their broken links, and everything will be fine. Beyond that, this section does not warrant much attention.
We have already written a very detailed article about sitemaps, their importance, and the necessity of indexing your website. A sitemap, which is very aptly named, is the map of a website, listing the addresses of all the different pages within it. It is a collection of all the links on your website.
So what’s the importance of it? Google, and pretty much all the search engines, need a sitemap to index the website in their directory. Since it is the map with which search engines navigate your website, a missing link or improper sitemap can exclude important pages on your website from getting indexed.
The Sitemaps tab in Google Search Console is where you submit your website's sitemap to Google, which by itself explains the importance of GSC. The entire process is very easy. All you have to do is add the link to the sitemap, which usually looks like this: yourwebsite.com/sitemap_index.xml. Just submit it, and after a few minutes or hours, Google will be able to crawl and index your website. But there's more.
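To get a feel for what Google reads from that file, here is a sketch that parses a made-up sitemap with Python's standard library and lists the URLs in it:

```python
import xml.etree.ElementTree as ET

# A minimal, invented sitemap; real ones typically live at
# yourwebsite.com/sitemap_index.xml or similar.
sitemap_xml = """\
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc><lastmod>2022-05-01</lastmod></url>
  <url><loc>https://example.com/blog/seo-guide</loc><lastmod>2022-06-12</lastmod></url>
</urlset>
"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> list:
    """Extract every <loc> entry from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall(".//sm:loc", NS)]

print(sitemap_urls(sitemap_xml))
# ['https://example.com/', 'https://example.com/blog/seo-guide']
```

A missing `<loc>` entry here is exactly the "missing link" problem described above: a page Google will never be guided to.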
GSC also shows the most recent date your website was crawled. This is an important indicator for your website's SEO: if your site is not being crawled frequently, it is missing out on ranking. Knowing the crawl frequency and the latest crawl date allows web admins to run their SEO better.

Do not panic if the last crawl date of your website was around five days ago. It is normal for Google to crawl websites every 4-5 days, and even a week or two is usually nothing to worry about. But if the last crawl was more than a month ago, and you cannot find any news of an indexing error affecting all websites, then it is time to check your sitemap and robots.txt status, or to test your website's URL with the URL Inspection tool.
The Sitemaps tab is a useful part of GSC and should be visited frequently, preferably every month, to ensure that crawling is going well. And it's not only about adding pages to Google's index: you can also remove the entire sitemap, or parts of it (pages), from Google search results.
This tab allows you to temporarily remove or block certain parts of your indexed website or some web pages that are live on Google. From this tab, you can select either to temporarily remove some pages, get rid of outdated content, or tell Google some of the pages in the website might not be safe for everyone.
The outdated content
The outdated content tool gives you two options, and both are very helpful for SEO and user experience: outdated cache removal or entire web page removal. Outdated cache removal deletes only the irrelevant or removed content from Google's stored copy of the page, while the page itself stays in the index.

This matters because Google stores a snapshot of each web page, called the cache. In some cases, even after you have removed the unneeded bits from a page, the search results may not reflect it because the cache has not yet been updated.
The SafeSearch filtering
The SafeSearch filtering tool lets you tell Google that some of your web pages may be inappropriate because their content could be against Google's rules and policies. This is important because if Google finds this out on its own, the website can get penalized and even deranked from the search results. So it is always better to let Google know and have the page filtered out yourself.
The tab shows a list of all the URLs that have been submitted or requested and the current status of each request. Remember that the URLs here are not removed from your website; these pages can still be accessed directly via their URL. They are only de-indexed, meaning they will no longer be shown in the search results.
This is the latest addition to Google Search Console, and thankfully so. The Page Experience report was much needed in GSC, and its addition shows that Google wants to make GSC an SEO powerhouse. Earlier, web admins had to use Google PageSpeed Insights to see these parameters and metrics; now, with the integration, all the data is presented in one useful dashboard. This is perhaps the most powerful SEO tool in GSC, as Google has rolled out the Core Web Vitals algorithm update, one of the most important algorithm updates to date.
The Page Experience tab shows a list of information about the website’s vital scores and how it performs on mobile and desktop devices. For all those who do not know, these vitals are not about the content of your website. Instead, these vitals are about how fast your website loads, how much time it takes to become interactive, and how visually stable the entire contents of the website are. The vitals are indicative of the user experience, and it’s all technical.
There's a section that shows all the failing URLs (web pages) in the Core Web Vitals. Clicking on it takes you to a section that shows the issue these URLs are facing, which could be any of these: LCP, FID, and CLS.
LCP stands for Largest Contentful Paint, the time your website takes to load the largest element of the web page. Usually, large images are the most common reason behind a failing LCP score.
FID stands for First Input Delay, the time the web page takes between the user's input and the page registering it. In other words, when a web page loads, FID is how long you have to wait before it becomes scrollable or interactive. A complicated website with too many elements, or a slow server, is usually why this section fails.
CLS, or Cumulative Layout Shift, represents the visual stability of the website: a measure of how much the elements of a page shift around from the moment it first loads until loading is complete. Reasons for a failing CLS score include excessive ads, too many functional plugins, poorly coded themes, etc.
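Google publishes "good / needs improvement / poor" thresholds for each vital (LCP good up to 2.5 s, FID up to 100 ms, CLS up to 0.1). Here is a small sketch that classifies a page's measurements against those thresholds; the sample page numbers are invented:

```python
# Published Core Web Vitals thresholds:
# (upper bound for "good", upper bound for "needs improvement")
THRESHOLDS = {
    "lcp": (2.5, 4.0),    # seconds
    "fid": (100, 300),    # milliseconds
    "cls": (0.1, 0.25),   # unitless layout-shift score
}

def rate(metric: str, value: float) -> str:
    """Classify one vital as good / needs improvement / poor."""
    good, ok = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= ok:
        return "needs improvement"
    return "poor"

# An invented page measurement.
page = {"lcp": 3.1, "fid": 80, "cls": 0.31}
print({m: rate(m, v) for m, v in page.items()})
# {'lcp': 'needs improvement', 'fid': 'good', 'cls': 'poor'}
```

A URL fails the report when any one of the three vitals falls outside "good", which is why GSC lists the specific failing metric next to each URL.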
Apart from the Core Web Vitals, this page also shows the mobile usability status. Mobile usability is more important than you think. Since Google is a mobile-first indexing search engine, it wants web admins to make the website responsive and usable with mobile devices such as smartphones and tablets. So make sure there are no issues in this section.
An HTTPS section also shows whether the website still has some HTTP pages. HTTP is the protocol websites use to transfer data, and the added "S" stands for "secure": an SSL/TLS certificate is how you make your website use the HTTPS protocol. Make sure your website uses HTTPS instead of HTTP, since this has become a ranking factor.
Core Web Vitals
Core Web Vitals are very important for the success of any website. Google has taken this update very seriously. It is only going to get more complex and detailed in the coming years, with Google adding more and more metrics to enhance user experience.
Apart from this, there is a similar section for desktop URLs with the same features. For failing URLs, there is an option to validate the fix: once you have removed the cause of the vitals failure, you can validate the fix, which asks Google to check the URL again.
Core Web Vitals
The Core Web Vitals report below the Page Experience tab is just a graphical representation of the data shown in the Page Experience tab, showing the three types of URLs: good, needs improvement, and poor. The goal for SEO should be to keep the green line in the graph as high as possible and the orange and red lines as low as possible, preferably at 0.
The Mobile Usability tab is likewise a separate section for the same feature present in Page Experience. You can see all the valid and invalid (error) URLs for mobile usability, along with the reason for each error. Overall, the Experience tab, with Page Experience, Core Web Vitals, and Mobile Usability, is crucial for understanding and improving the website and its individual web pages.
The Enhancement tab contains two sections: Breadcrumbs and Sitelinks Searchbox. Let's see in detail what these two are and how they affect SEO:
Breadcrumbs are all about the structure of your website; they are how users, and sometimes crawlers, navigate it. A breadcrumb trail may look something like this: Yourwebsite.com>Blog>Writing>Cars>Mercedes. Here, the breadcrumb shows where the reader is. As you can see, breadcrumbs are very important for Google to see how well-structured your website is.

While most breadcrumbs are managed by the theme you choose for your website, properly defining the breadcrumb structure is very important if your website runs custom theme code. If it is not done right, the Breadcrumbs section of the Enhancement tab is where you'll find all the errors.
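Breadcrumbs are exposed to Google as schema.org BreadcrumbList structured data (JSON-LD) embedded in the page. Here is a sketch that builds such markup for a trail like the example above; the URLs are invented:

```python
import json

def breadcrumb_jsonld(trail):
    """Build schema.org BreadcrumbList JSON-LD from (name, url) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }

trail = [
    ("Blog", "https://yourwebsite.com/blog"),
    ("Writing", "https://yourwebsite.com/blog/writing"),
    ("Cars", "https://yourwebsite.com/blog/writing/cars"),
    ("Mercedes", "https://yourwebsite.com/blog/writing/cars/mercedes"),
]

# This JSON goes inside a <script type="application/ld+json"> tag on the page.
print(json.dumps(breadcrumb_jsonld(trail), indent=2))
```

Most WordPress themes and SEO plugins emit this markup for you; the Breadcrumbs section of the Enhancement tab flags pages where it is malformed.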
Sitelinks Searchbox is another new (around 2015, so not that new) feature in Google Search Console. Many people are confused about this section, even some webmasters. The sitelinks search box is structured data added to the code of the website that helps Google show a search box for your site directly in the search results.

If you search for the name of your website and it does not have this structure in its code, no search box will appear with the result. The Sitelinks Searchbox section is where you'll find any errors relating to it.
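The markup in question is a schema.org WebSite object with a potentialAction of type SearchAction. A sketch, with an invented domain and search URL pattern:

```python
import json

def sitelinks_searchbox_jsonld(site_url: str, search_pattern: str):
    """Build schema.org WebSite/SearchAction JSON-LD for the sitelinks search box.

    `search_pattern` must contain the placeholder {search_term_string},
    which Google substitutes with the user's query.
    """
    return {
        "@context": "https://schema.org",
        "@type": "WebSite",
        "url": site_url,
        "potentialAction": {
            "@type": "SearchAction",
            "target": search_pattern,
            "query-input": "required name=search_term_string",
        },
    }

markup = sitelinks_searchbox_jsonld(
    "https://yourwebsite.com/",
    "https://yourwebsite.com/search?q={search_term_string}",
)
print(json.dumps(markup, indent=2))
```

Like the breadcrumb markup, this goes in a `<script type="application/ld+json">` tag, usually on the homepage.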
Clearly, these two are not the most powerful or important SEO tools, but they certainly help push the website further up the rankings. Remember that successful SEO only happens when all the little, even seemingly insignificant, parts of the website are optimized. Every optimization counts.
6. Security And Manual Actions
This tab shows any security issues present on the website, as well as any manual actions that might be blocking your website from appearing in the search results or getting rankings. This section generally stays calm and quiet, with no notifications. Any security breach or issue will be shown here, and all you can do is pray that it never needs your attention!
7. Legacy Tools
The legacy tools are some very useful tools that allow deeper and better control over your website and its search engine optimization. There are four options with different functions under this tab. Let's see what each of them does:
International Targeting allows you to target your website to a specific country based on location or language. If you publish in a language other than English, or if there is a version of your website that offers the same content in different languages, the hreflang tag will be very useful for you.

How to implement it needs a separate article of its own, but this tag can help you tell Google about the other language versions of your website, or simply ask Google to target your website to speakers of a specific language.
The Geo-targeting feature in this section allows you to ask Google to target your website to visitors from one specific country. Notice how we use the word “ask.” This is because targeting your website to a specific country does not mean that your website will start getting traffic from that country. Instead, Google will consider it, and if the content seems relevant to Google, only then will your website be recommended to that country.
This section is just the old notification inbox, where web admins would get messages about their website, including errors and general announcements. Nothing particularly helpful or important lives here anymore.
For websites with a very large page count (more than 1,000 pages), web admins can set URL parameters to tell Google how to handle the crawling of parameterized URLs. Take an eCommerce website, for example: there could be ten different pages for the same product, one per color variation. The same T-shirt can have different colors and therefore different URLs, so web admins can tell Google not to crawl certain parameter combinations, because crawling them all increases the crawl load.

This is what URL parameters are for, but webmasters can use them for any reason, including excluding specific parts of the website from crawling or having it crawled in a specific way.
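Conceptually, the tool tells Google which query parameters matter. Here is a sketch of the same idea in Python, collapsing invented product URLs that differ only in a `color` parameter down to their crawl-worthy pages:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def canonical_url(url: str, ignored_params=frozenset({"color"})) -> str:
    """Drop parameters (here the invented `color`) that only create duplicates."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in ignored_params]
    return urlunparse(parts._replace(query=urlencode(kept)))

urls = [
    "https://shop.example.com/tshirt?id=42&color=red",
    "https://shop.example.com/tshirt?id=42&color=blue",
    "https://shop.example.com/tshirt?id=43&color=red",
]

# Three URLs collapse to two distinct pages worth crawling.
unique = sorted({canonical_url(u) for u in urls})
print(unique)
# ['https://shop.example.com/tshirt?id=42', 'https://shop.example.com/tshirt?id=43']
```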
Web Tools help the SEO of your website indirectly. Since a great user experience is one of the pillars of SEO in 2022, ensuring that visitors have a smooth, easy experience on the website is compulsory for ranking, and the Web Tools section helps with just that.
This section allows you to monitor the advertisements shown on your website. All you have to do is enable the reports for your website, and every time a visitor gets an intrusive or abusive ad, you'll know about it too.

Without this information, you might never be able to take action and report such ads on your website to Google, and they hamper the user experience. The section lets you check the ad reports for both mobile and desktop.

Enabling this is a must if you don't want your bounce rates to be sky-high, as users will not enjoy visiting a website littered with ads and, worst-case scenario, abusive or inappropriate ads.
Remember how we skipped the Overview part at the beginning of this article? That is because it is just a tab that shows the data from all the previously mentioned tabs on one page. That's what Overview means: all the data in one place, just not as detailed. In the Overview section, for example, you can see the Performance graph, the Coverage graph, the Experience graph, and the Enhancement report.
Finally, towards the end, GSC has the links and settings section. Here’s what comes in both of them:
This section gives you some great insight into your backlink profile, showing the top sites that link to your website, the total number of external links, the top linked pages, and much more. You can also see your top internally linked pages.
This section is meant to view your site’s internal and external linking, a very important SEO factor for ranking.
The settings section shows some important information such as site verification, other services Google Search Console is linked to, and the address of the owner/webmaster.
It also shows some vital stats such as Crawl stats, the indexing crawler’s name, and the option to remove the property from GSC. Since we are talking about removing the property from GSC, let’s look briefly at the process of adding a website.
11. Adding a Website in GSC
We will keep it short and to the point. Here’s a condensed version of the process that will give you an idea of adding your website to Google Search Console.
First, you need to log in or create an account using your primary email address since this is the email address Google will use to communicate with you about the website. After creating the account, look at the top left corner. An option to add a property will be present. Click on it to open another box.
From this box, you get two options: either enter the domain name, which will cover all the subdomains and URLs, or choose the URL prefix method. The domain method covers all sub-parts of the domain but requires DNS verification (meaning you'll have to add a record in the domain settings on your domain registrar's website).

The URL prefix method offers different ways of verifying the website, but we recommend the domain method as it covers everything.
After selecting, all you have to do is add a verification code to your website, and after a few hours (or a day or two in some cases), Google will verify your website. Remember that you cannot get any information about your website if it has not been verified.
So verification is a must if you want to use Google Search Console. In addition, you can add multiple websites to one account, making managing multiple websites much easier.
Google Search Console is a powerful tool for SEO and understanding your site better. The biggest advantage it has over the competitors is that it is from Google, so all the future updates in the search algorithm will bring new features here first.
This close-knit relationship between search results and Google Search Console makes it the best choice for web admins and website owners. That, along with the fact that it is totally free to use, makes Google Search Console the powerhouse of SEO.
So do not miss out on it. GSC might seem a little complicated, but it is not once you understand it, and we hope that this article helped you understand it.