17 Google Search Console Reports: Common Errors and Tips for Fixing Them


Google Search Console (GSC) is a free service that allows website owners to analyze their site's performance in Google's organic search. Until May 20, 2015, the service was called Google Webmaster Tools. Since the rebranding, GSC has gained many reports that contain helpful information and allow you to evaluate your website's effectiveness in various areas:

  • crawling and indexing of website pages;
  • validity of structured data markup for rich results;
  • website usability on mobile devices;
  • Core Web Vitals (CWV) performance;
  • validity of the SSL certificate (HTTPS protocol);
  • validity of XML sitemaps.

In this article, we will analyze the most common errors that arise in GSC reports and consider possible ways to fix them.

We also advise you to read a useful article about Google Analytics & Search Console integration in Netpeak Spider.

1. Report ‘Not found (404)’

The ‘Not found (404)’ report contains a list of URLs of website pages that return a 404 (page not found) server response.

To view the report, go to the ‘Pages’ → ‘Not found (404)’ section:

Report ‘Not found (404)’

The report shows a graph of the number of pages with a 404 server response and examples of such pages on the website (no more than 1,000 URLs):

Example of the ‘Not found (404)’ report

Such pages waste Google's crawl budget and negatively affect user behavior.

To fix pages with a 404 server response, you need to:

  • restore inaccessible pages if they were deleted by mistake;
  • set up 301 redirects to similar pages if they were deleted intentionally or as a result of a URL change; after the fix, you can re-check the affected URLs, as in the sketch below.
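After restoring pages or setting up redirects, it is worth re-checking the affected URLs before asking Google to validate the fix. Below is a minimal Python sketch using the requests library (the file name is hypothetical) that prints the current status code of each URL exported from the report:

# Re-check URLs exported from the 'Not found (404)' report.
# Assumes a plain-text file with one URL per line (hypothetical name).
import requests

with open("gsc_404_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    # allow_redirects=False so a newly added 301 shows up as 301, not 200
    response = requests.head(url, allow_redirects=False, timeout=10)
    print(response.status_code, url)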

2. Report ‘Server error (5xx)’

The ‘Server error (5xx)’ report contains a list of URLs of website pages that return a server response with a 5xx code. This means the error occurs on the server side, and fixing it may require help from a programmer or system administrator. The following server error codes are included:

  • 500 Internal Server Error means the server encountered an unexpected condition that prevents it from correctly processing a user request. As a rule, such errors are temporary and may appear and disappear on their own without a system administrator's involvement;
  • 501 Not Implemented occurs when the server does not support the functionality required to process a user request. Most likely, fixing this error requires changing the server configuration;
  • 502 Bad Gateway occurs when a server acting as a gateway or proxy receives an invalid response from an upstream server; it usually happens if the site uses a proxy or gateway server;
  • 503 Service Unavailable often occurs when the server is undergoing maintenance or is overloaded;
  • 504 Gateway Timeout is returned by a server acting as a gateway when it does not receive a timely response from the upstream server needed to fulfill a user request;
  • 505 HTTP Version Not Supported occurs when the HTTP version used in the request is not supported by the server;
  • 506 Variant Also Negotiates occurs when, due to a server configuration error, the resource chosen during content negotiation is itself configured to negotiate and is therefore not a proper endpoint in the negotiation chain;
  • 507 Insufficient Storage occurs when the server does not have enough disk space to fulfill the request. This error is often temporary and can be resolved by adding storage or freeing up space by deleting unnecessary or outdated files;
  • 508 Resource Limit Is Reached is a non-standard, mostly temporary code used by some hosting panels that can occur when the available server limits (the number of processes, RAM, CPU load, etc.) are exceeded;
  • 509 Bandwidth Limit Exceeded is a non-standard, temporary server error that occurs when traffic to the website significantly exceeds the server's bandwidth limit. If this error occurs quite often (daily or several times a week), it is a signal to increase the server limits or move to a more powerful server;
  • 510 Not Extended occurs when the server requires further extensions to the request in order to fulfill it; in response, the server can send the information needed to construct a new request.

To view the report, go to ‘Pages’ → ‘Server error (5xx)’. Example:

Report ‘Server error (5xx)’

The GSC ‘Server error (5xx)’ report only contains a list of URLs (no more than 1,000 pages) where the errors occur, but by downloading the server logs for those pages, you can get more detailed information on how to fix them. To get the server logs, contact the programmer who developed your website or the system administrator who maintains your server. An example of a GSC ‘Server error (5xx)’ report:

Server error (5xx)

Such pages waste Google's crawl budget and negatively affect user behavior. Pages that regularly return a 5xx server error may be marked by Google as low-quality and excluded from the Google index, so do not ignore them.
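If you have access to the raw access logs yourself, a quick way to locate the failing requests is to filter the log for 5xx status codes. Here is a rough Python sketch for the common combined log format (the log file name is hypothetical, and the regular expression may need adjusting to your server's format):

# Count 5xx responses per URL in an Apache/Nginx combined-format access log.
import re
from collections import Counter

# Matches the request and status portion, e.g.: "GET /path HTTP/1.1" 503
pattern = re.compile(r'"[A-Z]+ (\S+) [^"]*" (5\d\d) ')

errors = Counter()
with open("access.log") as log:  # hypothetical file name
    for line in log:
        match = pattern.search(line)
        if match:
            path, status = match.groups()
            errors[(status, path)] += 1

for (status, path), count in errors.most_common(20):
    print(f"{count:>5}  {status}  {path}")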

3. Report ‘Page with redirect’

The ‘Page with redirect’ report contains a list of URLs of website pages from which the server redirects users to other pages. Often, such redirects are not errors and do not require additional actions. For example, redirects from HTTP to HTTPS or from WWW to non-WWW are not errors.

However, sometimes redirects can occur due to incorrect server or site configuration. Therefore, it is worth checking all the pages in this report to make sure that none of them are problematic.

To view the report, go to the ‘Pages’ → ‘Page with redirect’ section:

Report ‘Page with redirect’

If you find pages that inappropriately redirect users, you should take steps to cancel such redirects. To do this, use the administrative panel interface or contact your system administrator. Read more about setting up redirects in this article.
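To see where each reported URL actually leads, you can trace its redirect chain, for example with this minimal Python sketch using the requests library (the example URL is hypothetical):

# Trace the redirect chain of a URL from the 'Page with redirect' report.
import requests

def trace_redirects(url):
    response = requests.get(url, allow_redirects=True, timeout=10)
    for hop in response.history:  # each intermediate redirect response
        print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
    print(response.status_code, response.url)  # final destination

trace_redirects("http://website.com/old-page")  # hypothetical URL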

4. Report ‘Excluded by ‘noindex’ tag’

The ‘Excluded by 'noindex' tag’ report contains a list of URLs of website pages that are excluded from indexing by a robots meta tag with the ‘noindex’ value. Often, such pages are not an error, and the report is purely informational. For example, content sorting pages, internal search pages, or user account pages may be deliberately blocked from indexing.

Sometimes, however, due to malfunctioning scripts on the site or incorrect server settings, important pages that generate a lot of traffic can be mistakenly hidden from search engines. As a result, such pages will drop out of the search index, and the website will lose traffic. Therefore, check all the pages in this report to make sure it contains no pages intended to receive traffic from organic search.

To view the report, go to the ‘Pages’ → ‘Excluded by 'noindex' tag’ section:

Report ‘Excluded by ‘noindex’ tag’

If you find pages that are erroneously blocked from being indexed by search engines, you should take steps to unblock them. To do this, use the administrative panel interface or contact your system administrator.
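Note that a page can be excluded from indexing both by a robots meta tag in the HTML and by an X-Robots-Tag HTTP header, so it is worth checking both. A minimal Python sketch (the example URL is hypothetical, and the regex assumes the name attribute comes before content):

# Check whether a URL is blocked from indexing by meta robots or X-Robots-Tag.
import re
import requests

def check_noindex(url):
    response = requests.get(url, timeout=10)
    header = response.headers.get("X-Robots-Tag", "")
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        response.text, re.IGNORECASE)
    print("X-Robots-Tag header:", header or "(none)")
    print("Robots meta tag:   ", meta.group(1) if meta else "(none)")

check_noindex("https://website.com/category")  # hypothetical URL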

5. Report ‘Alternate page with proper canonical tag’

The ‘Alternate page with proper canonical tag’ report contains a list of URLs of website pages whose canonical tag points to the URL of another (canonical) page. Often, such pages are not errors, and the report is purely informational. For example, it may contain pagination pages whose canonical tag points to the URL of the category they belong to: https://website.com/category/page-2 and https://website.com/category/page-3 with a canonical tag pointing to https://website.com/category.

However, sometimes, due to incorrect website configuration, canonical tags may be set incorrectly. For example, the canonical tag on every page may point to the home page URL. Therefore, check all the pages in this report to make sure there are no pages with incorrect canonical tags.

To view the report, go to the ‘Pages’ → ‘Alternate page with proper canonical tag’ section:

Report ‘Alternate page with proper canonical tag’

If you find pages with incorrect URLs in canonical tags, you should correct them. To do this, use the administrative panel interface or contact your system administrator. Read more about setting up the canonical tag in this article.
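To spot-check canonical tags in bulk, you can extract the canonical URL from each page and compare it with the expected value. A rough Python sketch (the URL is hypothetical; the regex assumes rel comes before href in the link tag):

# Extract the canonical URL from a page and compare it with the page's own URL.
import re
import requests

def get_canonical(url):
    html = requests.get(url, timeout=10).text
    match = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
        html, re.IGNORECASE)
    return match.group(1) if match else None

url = "https://website.com/category/page-2"  # hypothetical URL
canonical = get_canonical(url)
print(url, "->", canonical,
      "(self-canonical)" if canonical == url else "(points elsewhere)")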

6. Report ‘Duplicate without user-selected canonical’

The ‘Duplicate without user-selected canonical’ report contains a list of URLs of duplicate pages on a website that do not specify the main page's URL in a canonical tag. For example, this report may contain pages with UTM tags and no canonical tag: https://website.com/category?utm_source=website.com and https://website.com/category?utm_source=domain.com duplicate the page https://website.com/category but lack a canonical tag pointing to https://website.com/category.

To view the report, go to the ‘Pages’ → ‘Duplicate without user-selected canonical’ section:

Report ‘Duplicate without user-selected canonical’

If you find duplicate pages without a proper canonical tag, identify the main page among them and point the canonical tag of all duplicate pages to its URL. To do this, use the administrative panel interface or contact your system administrator.
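For duplicates that differ from the main page only by tracking parameters, the canonical URL can usually be derived by stripping those parameters. A minimal Python sketch of such normalization (the parameter list is an assumption; adjust it to your site):

# Derive the canonical URL by stripping tracking parameters from a URL.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign",
                   "utm_term", "utm_content"}  # assumption

def canonical_url(url):
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(query), ""))

print(canonical_url("https://website.com/category?utm_source=website.com"))
# -> https://website.com/category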

7. Report ‘Duplicate, Google chose different canonical than user’

The ‘Duplicate, Google chose different canonical than user’ report contains a list of URLs of website pages for which Google has chosen a canonical URL different from the one specified in the pages' canonical tags. For example, the pages https://website.com/category/page-2 and https://website.com/category/page-3 duplicate the page https://website.com/category/page-1 and have a canonical tag pointing to https://website.com/category/page-1, but Google has chosen https://website.com/category/page-2 as the canonical instead of https://website.com/category/page-1.

See the screenshot below for more details:

Google chose different canonical than user

This is not a common error, but it can occur if the duplicate page has more internal and external links than the main page. It may also occur if the duplicate page loads faster or has a better user experience than the main page.

To view the report, go to the ‘Pages’ → ‘Duplicate, Google chose different canonical than user’ section:

Report ‘Duplicate, Google chose different canonical than user’

If you have found pages for which Google has independently determined the canonical URL, you should review all internal and external links to these pages and check the XML sitemaps. If possible, remove all internal and external links to duplicate pages or replace them with the URL of the main page. To do this, use the page editor in the administrative panel or contact your system administrator for help.

8. Report ‘Blocked by robots.txt’

The ‘Blocked by robots.txt’ report contains a list of URLs of website pages that are blocked from crawling by search engine crawlers. Often, such pages are not an error, and the report is purely informational. For example, content sorting pages, pages with UTM tags and other GET parameters, and other pages not intended for organic search promotion may be deliberately blocked from crawling.

However, sometimes, due to incorrect settings of the robots.txt file, important pages that are supposed to receive organic traffic can be mistakenly blocked from crawling by search engine crawlers. As a result, they will be dropped from the search index and your website will lose traffic. Therefore, it is worth checking all the pages in this report to make sure that it doesn’t contain any pages created to get traffic from organic search.

To view the report, go to the ‘Pages’ → ‘Blocked by robots.txt’ section:

Report ‘Blocked by robots.txt’

If you find pages that are incorrectly blocked from crawling by search engine crawlers, change the settings of the robots.txt file. To do this, open the file with any text editor (we recommend simple editors that work with TXT files) or a code editor and delete the Disallow directive that blocks crawling of the needed pages. An example of how such directives may look:

Disallow: /category/*
Disallow: /product/*
Disallow: /post/*
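After editing the file, you can verify which URLs are still blocked using Python's built-in robots.txt parser (the site and URLs are hypothetical):

# Check whether specific URLs are allowed for Googlebot by robots.txt.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://website.com/robots.txt")  # hypothetical site
parser.read()

for url in ["https://website.com/category/page-2",
            "https://website.com/product/item-1"]:
    print(parser.can_fetch("Googlebot", url), url)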

Read more about the robots.txt file and how to customize it in this article.

9. Report ‘Soft 404’

The ‘Soft 404’ report contains a list of URLs of website pages that show users a ‘Page not found’ template but return a success (200) server response instead of a 404. Such pages are called ‘soft’ 404 errors.

To view the report, go to the ‘Pages’ → ‘Soft 404’ section:

Report ‘Soft 404’

Such pages mislead search engine crawlers, waste crawl budget, and negatively affect user behavior.

To fix soft 404 pages, you need to:

  • restore inaccessible pages if they were deleted by mistake;
  • set up 301 redirects to similar or equivalent pages if they were deleted intentionally or as a result of a URL change;
  • set up a 410 server response if a page has been deleted but there is no alternative or similar page to redirect to; a sketch for finding such pages in bulk follows below.
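Likely soft 404 pages can be found by the combination of a success status code and ‘not found’ wording in the page body. A rough Python sketch (the URL and the list of phrases are assumptions; tune the phrases to your site's templates):

# Flag likely soft 404s: pages that return 200 but show a 'not found' template.
import requests

NOT_FOUND_PHRASES = ("page not found", "nothing found", "404")  # assumption

def looks_like_soft_404(url):
    response = requests.get(url, timeout=10)
    body = response.text.lower()
    return response.status_code == 200 and any(
        phrase in body for phrase in NOT_FOUND_PHRASES)

print(looks_like_soft_404("https://website.com/removed-product"))  # hypothetical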

10. Report ‘Indexed, though blocked by robots.txt’

The ‘Indexed, though blocked by robots.txt’ report contains a list of URLs of website pages that Google has indexed despite the fact that they are blocked from crawling by a Disallow directive in the robots.txt file.

You can find such pages in the GSC report, which can be opened in the section ‘Pages’ → ‘Indexed, though blocked by robots.txt’:

Report ‘Indexed, though blocked by robots.txt’

This sometimes happens when the same pages are simultaneously blocked from crawling in the robots.txt file and blocked from indexing with a robots meta tag set to ‘noindex’. To fix it, choose only one method: if a page is blocked from indexing with the ‘noindex’ robots meta tag, do not also block it from crawling in robots.txt, and vice versa. The reason is that Google cannot see the ‘noindex’ directive on a page it is not allowed to crawl.
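To detect this conflict programmatically, you can combine the two checks: whether robots.txt blocks the URL and whether the page carries a ‘noindex’ directive. A minimal Python sketch (the site and URL are hypothetical):

# Detect the robots.txt + noindex conflict for a URL.
import re
import requests
from urllib.robotparser import RobotFileParser

def check_conflict(url):
    parser = RobotFileParser("https://website.com/robots.txt")  # hypothetical
    parser.read()
    blocked = not parser.can_fetch("Googlebot", url)
    html = requests.get(url, timeout=10).text
    noindex = bool(re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
        html, re.IGNORECASE))
    if blocked and noindex:
        print("Conflict: blocked in robots.txt AND marked noindex:", url)

check_conflict("https://website.com/internal-search")  # hypothetical URL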

Also, pages that are blocked from crawling can still be crawled and indexed if they have external links or a large number of internal links. Therefore, it is recommended not to place links anywhere on the site to pages that are blocked from crawling in the robots.txt file.

11. Report ‘Sitemaps’

In the ‘Sitemaps’ section, you can submit a link to the XML sitemap (or several sitemaps) of the site containing the pages prioritized for crawling. If an XML sitemap contains errors, a corresponding notification will be displayed in the ‘Sitemaps’ section:

Report ‘Sitemap’

By opening an XML sitemap with an error, you can see a detailed description of the error and recommendations for fixing it:

sitemaps report

The most common errors in XML sitemaps are as follows:

  • syntax error;
  • the number of URLs in the XML sitemap exceeds 50,000, or the sitemap file is larger than 50 MB;
  • URLs specified in the XML sitemap belong to a different domain (a sketch for checking these limits follows below).
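The limits listed above are easy to check before submitting a sitemap. A minimal Python sketch (the sitemap URL is hypothetical) that counts URLs, measures the file size, and flags URLs from a different domain:

# Basic XML sitemap checks: URL count, file size, and foreign domains.
import xml.etree.ElementTree as ET
from urllib.parse import urlsplit
import requests

SITEMAP_URL = "https://website.com/sitemap.xml"  # hypothetical
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

response = requests.get(SITEMAP_URL, timeout=10)
root = ET.fromstring(response.content)
locs = [el.text for el in root.findall(".//sm:loc", NS)]

print("URLs in sitemap:", len(locs), "(limit: 50,000)")
print("File size, MB:", round(len(response.content) / 1024 / 1024, 2), "(limit: 50)")

site_host = urlsplit(SITEMAP_URL).netloc
foreign = [u for u in locs if urlsplit(u).netloc != site_host]
print("URLs on a different domain:", len(foreign))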

Read more about how to create a correct XML sitemap in this article.

12. Report ‘Core Web Vitals’

The ‘Core Web Vitals’ report contains a list of URLs of website pages that have problems with the Core Web Vitals metrics: CLS (Cumulative Layout Shift), LCP (Largest Contentful Paint), and FID (First Input Delay). To view the report, go to the ‘Core Web Vitals’ section and open the report for mobile or desktop devices:

Report ‘Core Web Vitals’

Each report will contain a list of problematic metrics with sample pages:

core web vitals report

This report only contains a list of problematic metrics and pages, but does not provide specific recommendations for optimizing them. You can learn more about ways to improve Core Web Vitals in this article.
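One way to get concrete field numbers for a problematic page is the PageSpeed Insights API, which returns Chrome UX Report data alongside lab results. A minimal Python sketch (the page URL is hypothetical; an API key is only needed for heavier use):

# Fetch field data for Core Web Vitals from the PageSpeed Insights API.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://website.com/category",  # hypothetical URL
          "strategy": "mobile"}

data = requests.get(API, params=params, timeout=60).json()
metrics = data.get("loadingExperience", {}).get("metrics", {})
for name, values in metrics.items():
    print(name, "-", values.get("percentile"), values.get("category"))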

13. Report ‘Mobile Usability’

The ‘Mobile Usability’ report contains a list of URLs of website pages that have problems displaying content conveniently on mobile devices. The most common errors are:

  • the text is too small to read;
  • content blocks are too close to each other;
  • buttons and other interactive blocks are too small or too close to each other;
  • content blocks are not adapted to the width of the mobile device screen and extend beyond it, which causes horizontal scrolling;
  • the viewport meta tag is missing, which helps to adjust the content to the size of the mobile device screen.

To view the report, go to the Mobile Usability section and select a specific error from the displayed list:

Report ‘Mobile Usability’

Each error page has a link to the official help document with tips on how to solve the problem:

help document

For example, to fix text that is too small to read, configure the viewport meta tag and set the font size for body text to at least 12 px.
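The presence of the viewport meta tag is easy to verify programmatically. A minimal Python sketch (the URL is hypothetical):

# Check whether a page declares a viewport meta tag.
import re
import requests

def has_viewport(url):
    html = requests.get(url, timeout=10).text
    return bool(re.search(r'<meta[^>]+name=["\']viewport["\']',
                          html, re.IGNORECASE))

print(has_viewport("https://website.com/category"))  # hypothetical URL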

14. Report ‘HTTPS’

The ‘HTTPS’ report contains a list of URLs of website pages that have problems with the SSL certificate and are loaded using the HTTP protocol instead of HTTPS.

To view the report, go to the ‘HTTPS’ section:

Report ‘HTTPS’

If all pages of the website have no SSL certificate problems and load via the HTTPS protocol, they will be marked in green.

If some pages have problems loading via the HTTPS protocol, check that the SSL certificate is valid and that the redirect from HTTP to HTTPS is configured.
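You can verify the redirect with a simple request to the HTTP version of a page, as in this minimal Python sketch (the URL is hypothetical):

# Verify that the HTTP version of a URL redirects to HTTPS.
import requests

response = requests.get("http://website.com/", allow_redirects=True, timeout=10)
print("Final URL:", response.url)
print("Redirects to HTTPS:", response.url.startswith("https://"))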

15. Report ‘Product snippets’

The ‘Product snippets’ report contains a list of website pages with Product structured data markup errors. To view the report, go to the Product snippets section and select a specific error from the displayed list:

Report ‘Product snippets’

To fix errors, use the instructions in the official Google manual. You can learn more about structured data markup and its impact on website promotion in this article.
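Before requesting re-validation in GSC, you can inspect a page's structured data yourself, for example by extracting its JSON-LD blocks and printing the declared types. A rough Python sketch (the URL is hypothetical; it assumes the markup uses JSON-LD rather than microdata):

# Extract JSON-LD structured data from a page and print the declared types.
import json
import re
import requests

def jsonld_types(url):
    html = requests.get(url, timeout=10).text
    blocks = re.findall(
        r'<script[^>]+type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
        html, re.DOTALL | re.IGNORECASE)
    for block in blocks:
        try:
            data = json.loads(block)
        except json.JSONDecodeError:
            print("Invalid JSON-LD block found")  # a syntax error GSC would flag
            continue
        items = data if isinstance(data, list) else [data]
        for item in items:
            print(item.get("@type"), "-", item.get("name", ""))

jsonld_types("https://website.com/product/item-1")  # hypothetical URL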

16. Report ‘Breadcrumbs’

The ‘Breadcrumbs’ report contains a list of website pages with Breadcrumb List structured data markup errors. To view the report, go to the Breadcrumbs section and select a specific error from the displayed list:

Report ‘Breadcrumbs’

To fix errors, use the instructions in the official Google manual.

17. Report ‘Review snippets’

The ‘Review snippets’ report contains a list of website pages with errors in Review snippet structured data markup. To view the report, go to the ‘Review snippets’ section and select a specific error from the displayed list:

Report ‘Review snippets’

To fix errors, use the instructions in the official Google manual.

Conclusion

Google Search Console has many useful reports that can help you identify and fix important errors, improving the quality of your website and boosting its position in organic search. Not all reports indicate errors; some are for informational purposes only. To find even more errors that hinder website promotion, we recommend conducting a technical audit with Netpeak Spider. You can learn more about how to do it in this article.