SEO 101: What to Do When Google Search Console Can’t Crawl Your Site Because of Robots.txt
The first rule of Search Engine Optimization is simple: Make sure that Google – and the other major search engines – can properly access your site! If they can’t “crawl” your site, it doesn’t take a rocket scientist to figure out that your search rankings aren’t going to be very good. In fact, you won’t show up at all. So if Google Search Console is telling you that your robots.txt file is blocking them from crawling the site… you’ve got a problem. The good news is that it’s not a difficult fix, and that’s what we’re going to walk you through here today.
First, some background. What does it mean to “crawl” your site in the first place? Search engines like Google use automated scripts to browse websites across the world wide web. These scripts are referred to as “spiders” and the process of reviewing your website content is known as crawling.
Simply put, crawling is the process search engines use to gather information about your website. So if your website won’t allow search engines to crawl it, your site is basically invisible.
Which is not a good place to be!
If your robots.txt file is blocking Google from crawling the site, it’s usually because you’ve launched the site but forgot to update the robots.txt file beforehand. When a site is in development, it’s standard procedure to block search engines from finding it, because you don’t want website visitors stumbling across your half-built website.
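The leftover development file usually looks something like this (a typical example; your actual file may differ):

```text
# robots.txt left over from development — blocks ALL crawlers
User-agent: *
Disallow: /
```

The `Disallow: /` line tells every crawler (`User-agent: *`) to stay away from the entire site, which is exactly what you want before launch and exactly what you don’t want after.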
But if you or your developer fails to update the robots.txt file before launch, the site will remain blocked to search engines, and you’ll see a “Blocked by robots.txt” error in Google Search Console.
So let’s talk about how to fix it.
The first step is to update your robots.txt file. You can generally access this file through a website plugin or the source code itself. If you’re working with a web developer, he or she can do this for you very easily.
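Once the site is live, the file should allow crawling. A common minimal version looks like the sketch below (the sitemap URL is a placeholder — swap in your own domain, and add `Disallow` rules only for paths you genuinely want to keep private):

```text
# robots.txt for a live site — allows all crawlers
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

An empty `Disallow:` value means nothing is blocked, and listing your sitemap here helps search engines find your pages faster.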
But simply fixing the file doesn’t solve the problem. Google Search Console won’t know that you’ve changed your settings unless you take action. Here’s what to do:
Step One: Open up Google Search Console and select “Go to the Old Version” on the left-hand menu.
Step Two: Choose “Crawl” and then “robots.txt Tester.”
Step Three: Click “Submit.”
Step Four: In the dialogue box, choose “Ask Google to Update.”
Step Five: Return to the new Google Search Console, inspect your home page, and click “Request Indexing.”
It will take a few moments to process, and then your site will be placed into Google’s indexing queue.
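While you wait, you can sanity-check your updated rules yourself with Python’s built-in robots.txt parser. This is a minimal sketch: the two rule sets below are illustrative examples, not your actual file, and in practice you would fetch your live robots.txt rather than hard-code it.

```python
from urllib.robotparser import RobotFileParser

# Rules as they often look while a site is in development (everything blocked)
dev_rules = [
    "User-agent: *",
    "Disallow: /",
]

# Rules after the fix (nothing blocked)
live_rules = [
    "User-agent: *",
    "Disallow:",
]

def is_crawlable(rules, user_agent="Googlebot", url="/"):
    """Parse a list of robots.txt lines and ask whether the
    given user agent may fetch the given URL path."""
    parser = RobotFileParser()
    parser.parse(rules)
    return parser.can_fetch(user_agent, url)

print(is_crawlable(dev_rules))   # False: the dev file blocks Googlebot
print(is_crawlable(live_rules))  # True: the updated file allows crawling
```

If the second check prints `True` for your own rules, the file itself is no longer the problem and the re-crawl request is just a matter of time.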
Check back a few moments later and you should be good to go! Once you’ve gone through this process, Google (and other search engines) will be able to access your site and include it in their search results. If you have any further questions, we’d be happy to help. Simply click here to get in touch with us! And if you’d like to learn more about the technical aspects of SEO, click here to subscribe to our YouTube channel.
See you next time!