After the developer removed the block on Google’s IP and we felt the issue was solved, we wanted to force a recrawl.
To do this, you first need to use the “Fetch as Googlebot” feature to submit URLs and get diagnostic feedback on Google’s success or failure when attempting to fetch each URL.
There are plenty of reasons why you’d want Googlebot to recrawl your website ahead of schedule.
Maybe you’ve cleaned up a malware attack that damaged your organic visibility and want a clean bill of health so rankings recover faster; maybe you’ve implemented site-wide canonical tags to eliminate duplicate content and want those changes picked up quickly; or maybe you want to accelerate indexing for a brand new resources section on your site.
Search marketers should know that Submit URL to Index works as advertised: it is very effective at forcing a Google recrawl and yields almost immediate indexing results.
Recently, a client started receiving a series of notifications from Webmaster Tools about a big spike in crawl errors, including 403 errors and file errors.
Now it’s worth noting that there may be a system lag with some of these notices.
So even after you’ve made fixes, you may still get technical error notices.
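Because the notices can lag behind your fixes, a quick sanity check is to re-fetch the affected URLs yourself with a Googlebot-style user-agent string and see what status codes your server actually returns. This is only a rough approximation, sketched here in Python: the user-agent string and the error-bucket labels are assumptions, and a UA-based request from your own machine will not reproduce IP-based blocks like the one described above.

```python
# Hedged sketch: approximate a Googlebot fetch to spot-check crawl errors
# before (or while) waiting on Webmaster Tools notices to catch up.
# Caveat: this only reveals user-agent-based blocks; blocks on Google's
# actual IP ranges will not reproduce from your own machine.
import urllib.request
import urllib.error

# Googlebot's published user-agent string (assumed current at time of writing).
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch_as_googlebot(url: str) -> int:
    """Return the HTTP status code the server gives a Googlebot-like request."""
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def classify(status: int) -> str:
    """Map a status code onto rough crawl-error buckets (labels are ours,
    not Webmaster Tools' exact terminology)."""
    if status == 403:
        return "access denied"
    if status == 404:
        return "not found"
    if 500 <= status < 600:
        return "server error"
    if 200 <= status < 300:
        return "ok"
    return "other"
```

For example, `classify(fetch_as_googlebot("https://example.com/page"))` returning `"access denied"` after you believe a block was lifted suggests the fix has not actually taken effect, regardless of what the notices say.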
But these new crawl error notifications were a real godsend, saving us a ton of time and effort in isolating and diagnosing the issues.