
Crawl Error

There are two main reasons why your pages may not be shown by Google: either your site has exhausted its crawl budget, or specific pages are affected by crawl errors.

You learned about crawl budget in the previous lesson; in this module you'll learn what crawl errors are, how they impact SEO, and how to fix them.


What is Crawl Error?

A crawl error is an issue that a search engine discovers when trying to crawl a website.

The error stops the search engine from reading the page's content.

The easiest way to check whether your website has crawl errors is through Google Search Console.

Just go to the main dashboard > Crawl > Crawl Errors.

[Image: crawl errors report in Google Search Console]


Types of Crawl Error

When you look at your Google Search Console account, you'll notice that Google separates crawl errors into two sections:

  • Site errors – your entire site has an issue and can't be crawled.
  • URL errors – specific URLs return errors, while the rest of the site is fine.

Let's look at each in depth.

1. Site Errors

Site errors are crawl errors that prevent the search engine bot from accessing your entire website.

These are some of the common causes:

DNS Errors

This happens when a search engine can't communicate with your server. It may be caused by a server being down or by anything else that makes your website unreachable.

If you see DNS errors reported under Crawl Errors in your Google Search Console, don't worry: it is usually a temporary issue. Googlebot will come back to your website later and crawl it again.
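If you want to check DNS resolution yourself before blaming Googlebot, you can try resolving your hostname, for example with Python's standard library. This is just a quick sketch; the hostnames below are placeholders:

```python
import socket

def dns_resolves(hostname: str) -> bool:
    """Return True if the hostname resolves to an IP address."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        # DNS lookup failed: the name does not resolve,
        # which is roughly what a DNS crawl error means
        return False

# "localhost" resolves on any machine
print(dns_resolves("localhost"))  # True
```

If this returns False for your domain, the problem is on the DNS side, not with Google.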

Server Errors

Server errors happen when the bot can't access your website because the request timed out.

The search engine attempted to visit your website, but it took so long to load that the server returned an error message.

Server errors also occur when flaws in your code prevent a page from loading. They can also mean your site had so many visitors that the server couldn't handle all the requests.
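In HTTP terms, server errors show up as 5xx status codes, while URL errors like "not found" are 4xx. A small helper like this hypothetical one can classify status codes the way a crawler roughly sees them:

```python
def classify_status(code: int) -> str:
    """Roughly classify an HTTP status code from a crawler's point of view."""
    if 200 <= code < 300:
        return "ok"            # page crawled successfully
    if 300 <= code < 400:
        return "redirect"      # crawler follows to the new URL
    if code == 404:
        return "url error"     # page not found: a URL-level crawl error
    if 500 <= code < 600:
        return "server error"  # server failed, timed out, or was overloaded
    return "other"

print(classify_status(200))  # ok
print(classify_status(404))  # url error
print(classify_status(503))  # server error
```

A 503 under heavy traffic is exactly the "too many visitors" case described above.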

Robots Failure

Before crawling, Googlebot reads your robots.txt file first, just to see if there are any areas of your website you'd rather not have indexed. If the bot can't reach the robots.txt file, Google will postpone the crawl until it can. So always make sure it's available.
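You can preview how a bot interprets your robots.txt with Python's standard urllib.robotparser. In this sketch the rules are parsed from a string rather than fetched from a live site, and the paths are made-up examples:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content: block every bot from /private/
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A well-behaved bot checks each URL against the rules before crawling it
print(parser.can_fetch("*", "https://example.com/private/page"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post"))     # True
```

Running this against your own rules is a quick way to spot pages you are blocking by accident.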

2. URL Errors

URL errors happen when the search engine fails to crawl a specific page of a website.

This can be caused by several factors, such as 404 pages and broken internal links.

For example, suppose page A has an internal link to page B, and then you suddenly decide to delete page B.

If you forget to remove the internal link on page A, the result is a URL error.

This is why you should always keep an eye on these.
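The page-A-to-page-B scenario can be sketched as a tiny link check: given which pages still exist and which internal links each page contains, flag any link that points to a deleted page. All the paths and names here are illustrative:

```python
# Pages that still exist on the site
existing_pages = {"/page-a", "/contact"}

# Internal links found on each page ("/page-b" was deleted)
internal_links = {
    "/page-a": ["/page-b", "/contact"],
}

def find_broken_links(pages, links):
    """Return (source, target) pairs whose target no longer exists."""
    return [
        (source, target)
        for source, targets in links.items()
        for target in targets
        if target not in pages
    ]

print(find_broken_links(existing_pages, internal_links))
# [('/page-a', '/page-b')]
```

Real link checkers crawl the site to build this data, but the core logic is the same: every link target should be a page that still resolves.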

Another cause of URL errors is submitting a page for indexing in Google Search Console while at the same time marking that page as "noindex" or blocking it in robots.txt.


Is Crawl Error a Ranking Factor?

Crawl errors are not a ranking factor, but a page with a crawl error won't rank.

Think of it this way: if Google can't even discover your page, it won't show up in search results and won't rank at all.

That's why fixing crawl errors is an important SEO task.