Monday, April 15, 2019

Google Search Console


Google Search Console is a free web service offered by Google to webmasters. It helps webmasters monitor the indexing status of their websites and optimize their visibility in search results. On May 20, 2015, Google rebranded Google Webmaster Tools as Google Search Console.

Some of the major features of Google Search Console are:

1) Search Analytics

Search Analytics is one of the most popular features of Google Search Console. It shows how much organic traffic a site receives from Google, and the data is easy to filter in multiple ways: by page, query, device, and more. SEO professionals never fail to check the Queries report, as it helps them identify the organic keywords people commonly use to search for the products or services a website offers. We can also find out how many visitors reach our website through Image search.
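
The same query data can also be pulled programmatically through the Search Console (Webmaster Tools) API. The following is a minimal Python sketch, not an official recipe: it assumes you have already obtained OAuth credentials (the creds object) and that https://example.com/ is a property you have verified.

    # Minimal sketch of a Search Analytics query via the Webmasters API v3.
    # Assumes `creds` holds valid OAuth 2.0 credentials for a verified property.
    from googleapiclient.discovery import build

    service = build('webmasters', 'v3', credentials=creds)

    response = service.searchanalytics().query(
        siteUrl='https://example.com/',  # hypothetical verified property
        body={
            'startDate': '2019-03-01',
            'endDate': '2019-03-31',
            'dimensions': ['query'],     # group the results by search query
            'rowLimit': 10,
        },
    ).execute()

    # Each row carries clicks, impressions, CTR, and average position.
    for row in response.get('rows', []):
        print(row['keys'][0], row['clicks'], row['impressions'])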

2) HTML Improvements

HTML Improvements helps us improve how our pages appear in search results. If there are any SEO-related issues, this feature helps us identify them: errors such as missing metadata, duplicate content, and more. When identical content is available in multiple places on the internet, search engines find it harder to decide which version is most relevant to a specific query. Missing metadata, such as meta descriptions or title tags, can also be spotted easily here.
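
To illustrate the kind of problem this report flags, here is a small Python sketch (not part of Search Console itself) that checks a single page for a missing title tag or meta description; the URL is a placeholder.

    # Sketch: detect a missing <title> and meta description on one page.
    # Standard library only; the URL below is a placeholder.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class MetaCheck(HTMLParser):
        def __init__(self):
            super().__init__()
            self.has_title = False
            self.has_description = False

        def handle_starttag(self, tag, attrs):
            if tag == 'title':
                self.has_title = True
            if tag == 'meta' and dict(attrs).get('name') == 'description':
                self.has_description = True

    html = urlopen('https://example.com/').read().decode('utf-8', 'replace')
    checker = MetaCheck()
    checker.feed(html)
    if not checker.has_title:
        print('Missing title tag')
    if not checker.has_description:
        print('Missing meta description')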

3) Crawl Errors

Checking the crawl error report periodically allows us to solve various crawling problems. All the errors Googlebot encounters while crawling the website are shown here: every site URL that could not be crawled successfully is listed along with the HTTP error code Google received. Site-wide problems such as DNS errors, robots.txt fetch failures, and server errors are shown in an individual chart as well.
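
Outside of Search Console, the per-URL part of this check can be approximated with a short script. A sketch, with a hypothetical list of URLs to test:

    # Sketch: report the HTTP status of a few URLs, similar in spirit
    # to the per-URL entries in the crawl error report.
    from urllib.error import HTTPError, URLError
    from urllib.request import urlopen

    urls = ['https://example.com/', 'https://example.com/missing-page']  # placeholders

    for url in urls:
        try:
            status = urlopen(url).getcode()       # 200 on success
            print(url, 'OK', status)
        except HTTPError as e:
            print(url, 'HTTP error', e.code)      # e.g. 404 or 500
        except URLError as e:
            print(url, 'unreachable:', e.reason)  # e.g. a DNS failure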

4) Fetch as Google

Fetch as Google helps us make sure that our web pages are search engine friendly. Google crawls every page on the site in order to index it for the search engine results page, and this tool fetches a URL exactly the way Googlebot would, so we can verify changes to the content, title tag, and so on. It lets us find out whether a page can be indexed, and it indicates when the site is not being crawled because of coding errors or is blocked by robots.txt.
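
There is no public API for Fetch as Google itself, but the basic idea of "fetch the page the way the bot does" can be sketched in Python by sending a Googlebot-style User-Agent header and inspecting the response (the URL is a placeholder).

    # Sketch: fetch a page with a Googlebot-style User-Agent to compare
    # what a crawler receives with what a browser receives.
    from urllib.request import Request, urlopen

    req = Request(
        'https://example.com/page.html',  # placeholder URL
        headers={'User-Agent': 'Mozilla/5.0 (compatible; Googlebot/2.1; '
                               '+http://www.google.com/bot.html)'},
    )
    body = urlopen(req).read().decode('utf-8', 'replace')
    print(body[:500])  # inspect the first part of what the "bot" sees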

5) Sitemaps & Robots.txt Tester

An XML sitemap helps search engines understand a website better while their robots crawl it. In the Sitemaps section we can submit a sitemap and check whether its URLs can be crawled and how many of them have been indexed; without a sitemap, Google may take much longer to discover new pages. Robots.txt is a text file that instructs search engine bots what to crawl and what not to crawl, and the robots.txt Tester lets us check which URLs are blocked or disallowed by robots.txt.
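
The robots.txt check can be reproduced locally with Python's standard library. A minimal sketch, with placeholder URLs:

    # Sketch: test whether robots.txt blocks a URL, much like the
    # robots.txt Tester. All URLs here are placeholders.
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url('https://example.com/robots.txt')
    rp.read()  # fetch and parse the live robots.txt

    # can_fetch(user_agent, url) -> True if crawling is allowed
    print(rp.can_fetch('Googlebot', 'https://example.com/blog/post.html'))
    print(rp.can_fetch('Googlebot', 'https://example.com/admin/'))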
