Google crawls your content URLs at regular intervals to check whether they still exist. If Googlebot has trouble fetching a link, it reports the problem in your Google webmaster account.
Under what circumstances will you get 404 errors?
- The post was deleted or moved to a new location
- The website was overloaded and unable to handle the number of visitors at the time
- The server was down
404 errors generally won’t hurt your website’s search performance, but handling them improves the user experience.
Sign in to your Google webmaster account first.
Expand the Crawl menu in the left pane and open Crawl Errors to see your error statistics.
Google shows 404 errors for both desktop and smartphone crawls. A server error occurs when Googlebot can’t access your URL because the request timed out or the site was busy. When it tries to open a non-existent page, the request returns a 404 Page Not Found error.
Switch between the platforms and error types to monitor all of them.
Click a listed URL to see its details.
At the top, you will find the URL where Google encountered the error, along with the last crawl date and when the issue was first detected. Click the URL link to confirm whether it is still active. If the page opens without problems, hit the blue Mark as fixed button at the bottom. This tells Google that the error is resolved and it can crawl the link again.
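If you have many listed URLs to check, clicking each one gets tedious. A small script can fetch them and report the status code a crawler would see. Here is a minimal sketch in Python using only the standard library; the throwaway local demo server stands in for your site, and the paths are hypothetical:

```python
import http.server
import threading
import urllib.error
import urllib.request

def check_url(url: str) -> int:
    """Return the HTTP status code for a URL, as a crawler would see it."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # urlopen raises on 4xx/5xx; the code is still the answer we want
        return err.code

# Throwaway demo server: "/" exists, every other path is a 404.
class DemoHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/":
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")
        else:
            self.send_error(404)

    def log_message(self, *args):  # keep the demo output quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), DemoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

live_status = check_url(base + "/")          # a page that still exists
dead_status = check_url(base + "/old-post")  # a deleted page
server.shutdown()
```

URLs that come back as 200 can safely be marked as fixed; the ones returning 404 need a redirect or an update on the linking pages, as described below.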
What if the listed URL doesn’t open from your end either?
Open the second tab on the same page, titled Linked from. It lists the websites and social media pages that link to the broken URL. Contact those page owners, tell them about your URL change, and ask them to update their posts accordingly.
Redirection is another way to send readers from the old page to its new location and resolve 404 Page Not Found errors. The same method fixes crawl errors on posts, pages, categories, tags, and media attachments.
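If your site runs on an Apache server, for example, a permanent redirect can be set up in the site’s .htaccess file without any plugin. A minimal sketch; the old and new paths shown here are hypothetical placeholders:

```apacheconf
# Permanently (301) redirect a moved post to its new location
Redirect 301 /old-post/ https://www.example.com/new-post/
```

A 301 status tells both visitors and Googlebot that the move is permanent, so the error clears once the URL is recrawled.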
If you’d rather not use plugins, you can still address the errors via robots.txt: disallow Googlebot from crawling your old URLs, files, and directories.
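A robots.txt entry like the following asks Googlebot to skip the removed locations; the paths shown are hypothetical examples, so substitute your own:

```
User-agent: Googlebot
Disallow: /old-directory/
Disallow: /deleted-post.html
```

The file must live at the root of your domain (e.g. https://www.example.com/robots.txt). Note that this only stops the URLs from being crawled; a redirect is still the cleaner fix for pages that moved rather than disappeared.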