Page Errors

A page error occurs when a visitor or search engine bot attempts to view a page on your site, but the page is unavailable or inaccessible. The most common page error is 404 Not Found, which usually means the page was deleted or its URL was changed.
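
If you want to check many links at once, a script can flag them by status code. Below is a minimal sketch in Python, assuming the third-party requests library; the URLs are placeholders.

```python
import requests

urls = [
    "https://example.com/",
    "https://example.com/old-page",  # hypothetical link to audit
]

for url in urls:
    response = requests.get(url, timeout=10)
    # Any status code in the 4xx range is a page error (400, 401, 403, 404, ...).
    if 400 <= response.status_code < 500:
        print(f"Page error {response.status_code}: {url}")
    else:
        print(f"OK {response.status_code}: {url}")
```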

How to fix the most common types of page errors

There are several different types of page errors, all of which fall into the 4xx class of HTTP status codes.

404 Not Found

This is the most common page error. As stated above, it’s usually caused by the deletion of a page or a change to its URL. Aside from potentially creating a poor user experience (UX), 404 errors do not adversely affect SEO.

There are a few options for dealing with 404s. They include:

  1. Finding the page and fixing the URL if the content still exists.
  2. Creating a 301 (permanent) redirect from the broken URL to a relevant page on the site (see the sketch after this list).
  3. Ignoring the 404 if the content is no longer available and there isn’t a relevant page to link to.
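
For option 2, here is a minimal sketch of a 301 redirect using Flask; the framework choice and both routes are assumptions, and the same redirect can usually be set up in your web server’s configuration instead.

```python
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-product-page")  # hypothetical broken URL
def old_product_page():
    # A 301 tells browsers and search engines that the move is permanent,
    # so they should use (and credit) the new URL going forward.
    return redirect("/new-product-page", code=301)

if __name__ == "__main__":
    app.run()
```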

403 Forbidden

The 403 error typically occurs when there are links to pages that are not publicly accessible. There are a couple of options for dealing with 403s. They include:

  1. Fix the permissions on the web server to allow public access to the page (see the sketch after this list).
  2. If the page should remain inaccessible, find and remove any links to it on your site. If an external site is linking to it, you may want to contact them and ask them to remove the link.
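
For option 1, here is a minimal sketch of loosening file permissions in Python, assuming a Unix-style host and a hypothetical path; on many servers the equivalent shell fix is `chmod 644`.

```python
import os
import stat

# Hypothetical document root path; adjust to your server's layout.
page = "/var/www/html/pricing.html"

# Add read permission for group and others, keeping the existing bits.
# Note: any parent directories also need the execute (search) bit set
# for the web server to reach the file.
mode = os.stat(page).st_mode
os.chmod(page, mode | stat.S_IRGRP | stat.S_IROTH)
```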

400 Bad Request

The 400 error means the server can’t (or won’t) process the request, usually because the request itself is malformed. If it’s happening in one browser, then there’s a good chance it’s happening in other browsers too.

400 errors are typically caused by incorrect syntax in the URI (aka the link). Finding and correcting the link should resolve the error, as shown in the sketch below. If it doesn’t, you may need a webmaster or system administrator to troubleshoot further.
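
As an illustration, here is a minimal sketch of repairing one common syntax problem, unencoded characters in the path, using Python’s standard library; the path itself is made up.

```python
from urllib.parse import quote

# Spaces and braces are not valid in a URL path and can trigger a 400.
broken_path = "/search results/{category}/summer sale"

fixed_path = quote(broken_path)  # "/" is kept as-is by default
print(fixed_path)
# -> /search%20results/%7Bcategory%7D/summer%20sale
```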

401 Unauthorized

The 401 error is similar to the 403 error. Both are permission issues, but a 401 is returned when the page requires the user to authenticate (enter a username and password) in the browser before it can be accessed.

If you want the page to continue to require authentication, then you may want to block search engines from crawling the page(s) via your robots.txt file and remove any links to it from publicly accessible pages. Otherwise, remove the authentication requirement. The sketch below shows how a server produces a 401 in the first place.
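
Here is a minimal sketch of a 401 response using Flask and HTTP Basic authentication; the framework, route, and credentials are all assumptions for illustration.

```python
from flask import Flask, Response, request

app = Flask(__name__)

@app.route("/members-only")  # hypothetical protected page
def members_only():
    auth = request.authorization
    # No credentials (or the wrong ones): answer 401 and ask the browser
    # to prompt for a username and password via WWW-Authenticate.
    if auth is None or (auth.username, auth.password) != ("editor", "s3cret"):
        return Response(
            "Authentication required.",
            status=401,
            headers={"WWW-Authenticate": 'Basic realm="Members area"'},
        )
    return "Welcome to the members area."
```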