The Search Console, or “Webmaster Tools”, is a free service from Google for monitoring and analyzing your own websites. In early 2015, Google renamed the service from “Webmaster Tools” to “Search Console”. “Webmaster Tools” is still widely used as a synonym, however, and will be used as such in the following.
The Search Console lets you check a website for errors in the code, the sitemap, internal links, or the URL structure. In addition, pages can be marked up with structured data; the links and search queries that bring users to the site are displayed, as are the crawl behavior of Google’s bots, the indexing status, and much more. The Search Console is also used to communicate with Google.
The Google Webmaster Tools are often called “Google Analytics Light” because they offer, in rudimentary form, some functions that Google Analytics covers in far greater depth. In addition, the data in the Webmaster Tools are often a few days out of date. Nevertheless, the Webmaster Tools are of great value for search engine optimization.
This article discusses Google’s Webmaster Tools. Other search engines, such as Bing or Yandex, offer comparable services.
Activate / set up Search Console
To use the Search Console, you first need an account with any Google service (Google Plus, Gmail, Google My Business, etc.). You must then confirm ownership of the website to be monitored, typically by uploading a simple HTML file to the web server.
Alternative verification methods, such as an HTML meta tag, linking with Google Analytics, or the Google Tag Manager, are also possible.
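For the HTML tag method, Google provides a meta tag to place in the head of the start page. A sketch of what this looks like; the verification token below is a made-up placeholder, not a real value:

```html
<head>
  <!-- Verification tag from the Search Console; the content value is a placeholder -->
  <meta name="google-site-verification" content="EXAMPLE_TOKEN" />
</head>
```

Once the tag (or the uploaded HTML file) is in place, clicking “Verify” in the Search Console completes the confirmation.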
Every website to be monitored is a “property”. A total of 100 properties can be created in the Search Console. Properties do not have to be entire websites, but can also represent subdirectories of a website.
The dashboard gives an overview of four metrics:
New and important: Recent errors or messages from Google are displayed. A message can be a manual penalty (more on this under “Disavow Tool”), confirmation of a URL move, or a change of address.
Crawling errors indicate whether a page could not be found under the URL known to Google (404 error).
The search analysis gives a good insight into the number of visitors over time and which search terms in Google led to a visit to the page.
The sitemaps display shows how many URLs have been submitted in the sitemap and which have been indexed, as well as any errors in the sitemap.
Clicking on the respective category in the dashboard takes you to the full view of this point.
Search Appearance
Under Search Appearance you can check structured data and the Data Highlighter, HTML improvements, and sitelinks.
The Structured data item shows whether schema.org markup on the website is correct and can be read by Google. With the Data Highlighter, elements on the website can be tagged for Google according to schema.org types without touching the source code. Markings are made under “Start marking”.
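As an illustration of such markup, here is a minimal schema.org snippet in JSON-LD form (one of the formats Google can read); the business name and address are invented example values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "Musterstraße 1",
    "addressLocality": "Berlin"
  }
}
</script>
```

If this markup is read correctly, the Structured data report lists the page under the LocalBusiness type.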
After successfully activating the Search Console and waiting around 24 hours, errors in the title, meta information and any non-indexable content can be viewed under HTML improvements.
Sitelinks are indicated in the SERPs below the actual hit:
Google determines which links these are. If a link is not desired as a sitelink, it can be “devalued” under this point. Google will then no longer display these links as a sitelink. There is no guarantee for this, however.
Under the Search queries item you will find analyses of user behavior, links to the page, and other useful data.
The search analysis shows how many search queries on Google led to a visit to the website and which search terms were used. Various filters can also be applied, e.g. by device (smartphones), country, or sub-page.
The “search analysis” function is currently (July 6, 2015) still in beta. As soon as there is a final version, it will be presented here. You can find more information on search analysis in our blog: Webmaster Tools with a new function – search analysis.
As the name suggests, backlinks to the website are listed under Links to your website. The number of links is not always complete – Google reserves the right to only display some of the links, especially with a large number of backlinks. Nevertheless, the view offers a good overview.
As the name suggests, all internal links on the website are displayed under Internal Links.
If the website has received a manual penalty, the corresponding message is displayed under Manual actions, together with possible responses and solutions (see also: What is the disavow tool).
If a website uses the hreflang tag for an international orientation, errors or inconsistencies are displayed under this option.
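For reference, hreflang annotations of the kind this report checks look like the following; the domain and language versions are illustrative, and each language version must link back to all the others:

```html
<link rel="alternate" hreflang="de" href="https://example.com/de/" />
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Missing return links are the most common inconsistency the Search Console flags here.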
Since every website should run smoothly on smartphones and similar mobile devices, the page’s code can be examined under Mobile usability.
Options relating to the indexing of the page on Google are displayed under the heading Google Index.
The number of indexed pages and any URLs blocked by the robots.txt are displayed under Index status.
Content keywords shows the frequency of certain keywords within the domain.
Blocked resources shows resources that cannot be read by the crawler. These are usually login URLs. CSS or JavaScript files should not be blocked!
If you want to remove certain URLs of your website from the Google index, you can do this under Remove URLs.
The Crawling rubric offers the webmaster the opportunity to examine the Google crawler more closely and, if necessary, to influence the behavior of the crawler.
URLs that have produced an error are displayed under Crawl errors; these are usually 404 status codes.
The crawl statistics show how often the Google crawler has visited the website.
Fetch as Google offers the possibility to see the website through the eyes of the crawler, in order to detect discrepancies between what users see and what is indexed. If a URL is entered here, it can be submitted to the index:
This should definitely be done when pages have been changed. This ensures that Google always has the latest version of the page in the index.
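A rough way to approximate this check yourself is to request a page with Googlebot’s published user-agent string and compare the response with what a normal browser receives. A minimal sketch using only the Python standard library; the URL is a placeholder:

```python
import urllib.request

# Google's published Googlebot user-agent string; the URL below is a placeholder.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

req = urllib.request.Request("https://example.com/",
                             headers={"User-Agent": GOOGLEBOT_UA})

# urllib stores header names in "User-agent" form, so read it back that way.
print(req.get_header("User-agent"))

# To actually fetch the page:
# html = urllib.request.urlopen(req).read()
```

Note that a server serving different content to this user-agent than to real users is exactly the kind of discrepancy Fetch as Google is meant to reveal.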
The robots.txt tester shows any errors in the robots.txt.
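Similar checks can be run locally with Python’s standard library, which is handy for trying out rules before uploading them. A sketch with invented example rules and URLs:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt rules: block the login area, allow everything else.
rules = """\
User-agent: *
Disallow: /login/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Login pages are blocked, regular content pages are not.
print(parser.can_fetch("Googlebot", "https://example.com/login/"))    # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))  # True
```

This mirrors what the robots.txt tester does interactively: enter a URL and a user-agent, and see whether the current rules would block it.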
Under Sitemaps there is the possibility to send your own sitemap directly to Google. This is not always necessary, but it is particularly worthwhile for larger pages!
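For orientation, a minimal XML sitemap of the kind submitted here looks like this; the URL and date are illustrative examples:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2015-07-06</lastmod>
  </url>
</urlset>
```

After submission, the Sitemaps view reports how many of the listed URLs Google has indexed.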
Under URL parameters you can control how the crawler should handle duplicate content. This option should only be used by experienced webmasters. Further information on the URL parameters can be found at: https://www.search-one.de/parameter-gwt/
Possible hacks, Trojans, or viruses on a website are flagged under Security issues. Other resources include links to Google training programs, Google’s domain service, and Google Merchant Center.
Also worth mentioning is the Structured Data Testing Tool, with which the schema.org markup of a website can be examined in more detail: https://developers.google.com/structured-data/testing-tool/