Lots of folks think that SEO is about “gaming the system.” Well… that’s true of “black hat” SEO, but those of us who are trying to make pages easy for Google to crawl and evaluate are working toward what I like to think of as “natural” SEO. If you’re new to this niche, I recommend starting with our Search Engine Optimization guide.
We put in all the right meta tags, make sure that each page is actually about what we tell search engines it’s about in the description, and generally streamline things so that spiders won’t be caught in traps or abandon pages entirely.
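If you want a quick spot-check that a page’s meta description actually says what you think it says, here’s a minimal sketch using Python’s standard library (the URL is a placeholder; swap in a page of your own):

```python
import urllib.request
from html.parser import HTMLParser

PAGE = "https://www.example.com/"  # hypothetical page to check

class MetaDescriptionFinder(HTMLParser):
    """Grabs the content of <meta name="description" content="...">."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content")

html = urllib.request.urlopen(PAGE, timeout=10).read().decode("utf-8", "replace")
finder = MetaDescriptionFinder()
finder.feed(html)
print(finder.description or "No meta description found!")
```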
So, let’s say that you have recently built a new website. Is it search-friendly? Or more importantly, is it Google-friendly? No, Google certainly doesn’t pay me, and I don’t worship at the Google altar either, but let’s face it: Google brings the most traffic, and for some reason that traffic seems to convert. That’s why we want to please the gods of Google as much as we possibly can.
Connect Google Webmaster Tools for Your Domain
No matter how much you profess to hate Google, you have to agree that they have given us all the tools we need to be successful in search and much, much more. Parts of my business run on Google tools, and they give us plenty to help us figure things out. Let’s take Google’s “Fetch” and “Fetch and Render” tools, for example.
If you’re new to SEO and/or online marketing, and don’t yet have your site registered with Google Webmaster Tools, do it. NOW!
First, go to google.com/webmasters. Then verify your site: you can upload a small file from Google to your root directory or add a snippet of code to your page. After your site is verified as yours (or one you manage), all you have to do (assuming you’re already signed into Google) is click “Sign in to Webmaster Tools.”
Once inside, you’ll find a wealth of information to help you see how your site is faring in Google’s eyes (not Bing or Yahoo or DuckDuckGo, only Google). They don’t just tell us how many keywords a site is ranking for, but also how many backlinks they’re counting for that particular Web property, whether something has put your site on Google’s bad side, and, importantly right now, whether your site is “mobile friendly.”
So much of this information is critical to getting Google traffic. If you don’t go inside at least once a month to see whether you’re getting any “you have issues” messages, including “manual actions” by Google, you could be living in a fool’s paradise, eh? You need to clear those things up right away.
However, if you’re doing things right, and not following anyone else’s scam tactics to rank well, you will probably never see such messages. But then again, you might be doing things wrong that you think are right. It’s best to check periodically to know you’re cool. It only takes minutes and it’s worth the effort.
How to Pinpoint Crawl Weakness
One element that many webmasters overlook is the “fetch” page function. I’m not just an SEO; I’m also a designer/developer, and I check every website I build to make sure that Google loves it by running a “fetch” in the client’s account. (These functions only work for individual sites that have already been verified, so you can’t go into your account and try to fetch someone else’s site.)
During a fetch, Google sends out a robot to scan the page you designate in the query box, and you can tell it which platform to fetch for, too, whether desktop or various forms of mobile:
You can fetch your homepage by leaving the box blank, or designate any other directory or page in your site’s interior. This helps you see whether there are any issues with connectivity or security, and it’s pretty quick: it takes less than 30 seconds and spits back your page as code, which is what Googlebot sees. A simple fetch never touches the “render” part of the equation, i.e., what your visitors see.
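If you’re curious what that code view looks like without opening Webmaster Tools, here’s a rough sketch in Python that fetches a page with a Googlebot-style user agent and times the response. This is my own approximation, not Google’s tool, and the URL is a placeholder:

```python
import time
import urllib.request

URL = "https://www.example.com/"  # hypothetical; use a page on your own site

# Googlebot identifies itself with a user-agent string like this one.
request = urllib.request.Request(
    URL,
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                           "+http://www.google.com/bot.html)"},
)

start = time.time()
with urllib.request.urlopen(request, timeout=30) as response:
    body = response.read()
elapsed = time.time() - start

print(f"Status: {response.status}")           # 200 means the page was served
print(f"Fetched in {elapsed:.2f}s")           # slow pages risk timing out
print(body[:300].decode("utf-8", "replace"))  # raw HTML: the "code" Googlebot sees
```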
When you use the “Fetch and Render” option, Google will tell you about elements that spiders can’t see or are blocked from seeing. Googlebot runs through the page (or the set of pages you designate by entering a directory) and all the links on or in it (“in it” meaning behind the face of your page, in the code). Those links can point to images, separate CSS files, JavaScript, or other resources, for example.
When the fetch and render is complete, Webmaster Tools then shows you two images — one that Google sees and another that your viewers see.
You’ll notice that the images for my homepage are different. Visitors are asked to “Like” my page on Facebook, which comes up in a lightbox pop-up. But in this fetch and render, Google was happy: everything looked good, and there were no crawl errors or other issues. (There had better not be. Ha!)
Fixing Trouble Spots
But let’s take a site with issues. After Google completes the fetch, you’ll see this:
“Partial” means that Google was able to crawl part of your site with no issues, but there were other parts they couldn’t get near.
If you click the listing (/ in this case), you can find out which pages are at issue.
Happily, this error is only a small image that is unreachable. It probably took too long to load or is broken. Easily fixed. But it’s issues like this that you need to watch for and repair, especially if you’re posting elements from other websites (videos, slideshows, text links, etc.).
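If you’d like to catch broken or slow images yourself before Google flags them, here’s a minimal sketch along those lines using Python’s standard library. It’s my own approximation (the URL is a placeholder): it collects every img src on a page and HEAD-checks each one:

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

PAGE = "https://www.example.com/"  # hypothetical page to audit

class ImageCollector(HTMLParser):
    """Collects src attributes from <img> tags."""
    def __init__(self):
        super().__init__()
        self.sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.sources.append(urljoin(PAGE, src))

html = urllib.request.urlopen(PAGE, timeout=10).read().decode("utf-8", "replace")
collector = ImageCollector()
collector.feed(html)

for src in collector.sources:
    try:
        request = urllib.request.Request(src, method="HEAD")
        with urllib.request.urlopen(request, timeout=10) as response:
            print(f"OK  {response.status}  {src}")
    except Exception as exc:  # broken, blocked, or unreachable image
        print(f"BAD {src} ({exc})")
```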
Other issues will have to do with scripts that might be running. You can also check your robots.txt and edit it to be sure that Google is seeing everything you want it to see while blocking the things you don’t want crawled. (There’s a robots.txt checker in Webmaster Tools, too.)
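If you’d rather sanity-check a robots.txt from a script, Python’s standard library ships a parser for exactly this. A small sketch (the site and paths are placeholders) that asks whether Googlebot may fetch the resources it needs for rendering:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical site; point this at your own robots.txt.
parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

# Resources Googlebot needs for a full render: pages, CSS, JavaScript, images.
for path in ("/", "/assets/style.css", "/js/app.js", "/images/logo.png"):
    allowed = parser.can_fetch("Googlebot", path)
    print(f"{'allowed' if allowed else 'BLOCKED'}: {path}")
```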
Other codes that Google might return are:
- Redirected: Your server or your JavaScript redirected Googlebot to another page.
- Not found: Though Google can contact your site, they can’t find the page you specified.
- Not authorized: Google can contact your site, but you have blocked the page from being crawled.
- DNS not found: The domain name that you entered isn’t registered or reachable.
- Blocked: Your robots.txt file has blocked Google from reaching the page.
- Unreachable robots.txt: Your robots.txt file is missing or couldn’t be reached.
- Unreachable: Google reached your server, but was sent home without seeing the page. Probably timed out.
- Temporarily unreachable: As above, but this is only a temporary issue.
- Error: An unspecified error prevented Google from fetching the page.
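For a rough feel of how those statuses line up with ordinary HTTP failures, here’s a sketch that classifies a fetch attempt using Python’s standard library. This is my own approximation of the buckets above, not Google’s logic, and the URL is a placeholder:

```python
import socket
import urllib.error
import urllib.request

def classify_fetch(url, timeout=30):
    """Roughly bucket a fetch attempt the way the status list above does."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            if response.url != url:
                return f"Redirected to {response.url}"
            return f"Complete (HTTP {response.status})"
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return "Not found"
        if err.code in (401, 403):
            return "Not authorized"
        return f"Error (HTTP {err.code})"
    except socket.timeout:
        return "Unreachable (timed out)"
    except urllib.error.URLError as err:
        if isinstance(err.reason, socket.gaierror):
            return "DNS not found"
        if isinstance(err.reason, socket.timeout):
            return "Unreachable (timed out)"
        return f"Unreachable ({err.reason})"

print(classify_fetch("https://www.example.com/"))  # hypothetical URL
```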
The Bottom Line
The best thing to do is to keep your pages crawlable. If you make a change to a specific page, run it through the fetcher at least once to be sure that it’s available and A-OK for Googlebot, unless you don’t want it to be. (Content in a membership site is one reason that comes to mind.) The more accessible and crawlable your pages are, the more Google and all the other search engines will love you.
And we all know what happens then, right? TRAFFIC. Without that, you have nada.