How Important is the new Googlebot Announcement from Last Week for your SEO Ranking? (HINT – don’t ignore it!)

Last week you and I and millions of other site owners received the Googlebot announcement warning that “Googlebot Cannot Access Your JavaScript and CSS Files.”

A week has gone by and does it really matter?

For your sake, I did some research and found that it does matter, or at least that it is not worth the risk to assume otherwise, especially when Google says that “rendering without certain resources can impair the indexing of your web pages.”

I have fixed it for our company site and am working to fix it for my top clients. (If you would like me to take a look at it for you, just reach out to me, whether you are a current client or not.)

What is the fix?

Basically, there is a lot of conflicting advice among the top experts in the SEO and WordPress worlds.

Yoast, the software company that develops the Yoast SEO plugin, which we use on all of our sites, says to simply use their plugin and delete the default WordPress robots.txt file, essentially allowing Googlebot to crawl the /wp-admin directory.

When I read that, it made me a little nervous; I also saw some hesitation in the comments section below the post. I reached out to our systems administration team at the datacenter, and they did not think it a good idea to give Google carte-blanche access to index system files in the /wp-admin and /wp-includes folders. Perhaps Yoast knows something we are unaware of, but we decided not to go that way at this time.
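For context, the kind of restrictive WordPress robots.txt that triggers this warning typically looks something like the lines below. The exact contents vary by install and host, so treat this as an illustrative sketch rather than your actual file:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
```

Because theme and plugin scripts and stylesheets often live under those blocked paths, Googlebot cannot fetch them when it tries to render the page.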

Then I read an article by SearchEngineLand and another by TheSEMPost. The TheSEMPost article had the solution that made the most sense: they suggest putting the following lines of code into the robots.txt file.

User-Agent: Googlebot
Allow: .js
Allow: .css

This seems to be the least invasive way to address the issue, and it worked for our site. Basically, the code tells Googlebot to go ahead and crawl all JavaScript and CSS files. After implementing it, we checked the site in the Google Fetch and Render tool and saw it rendering correctly for Googlebot, where it was not before. We also looked at the list of blocked resources that Google Search Console reports and saw all non-third-party resources drop off that list.
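If you want to sanity-check robots.txt rules before deploying them, Python's built-in urllib.robotparser is a quick way to do it. One caveat: that parser only does simple path-prefix matching, not Google's wildcard matching, so the sketch below uses directory-prefix rules (with a hypothetical example.com domain) to illustrate the same idea of an Allow entry carving an exception out of a Disallow:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: Googlebot gets an Allow carve-out for
# scripts, while everything else stays blocked for all crawlers.
rules = """\
User-agent: Googlebot
Allow: /wp-includes/js/
Disallow: /wp-admin/

User-agent: *
Disallow: /wp-includes/
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot may now fetch bundled scripts, but still not admin pages.
print(parser.can_fetch("Googlebot", "https://example.com/wp-includes/js/jquery.js"))  # True
print(parser.can_fetch("Googlebot", "https://example.com/wp-admin/options.php"))      # False
```

The rules are checked top to bottom for the first matching entry, which is why the Allow line sits above the Disallow lines for Googlebot.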

Feel free to add to this discussion if you see anything I missed in this article, and don’t hesitate to reach out to us if you would like our team to take a look at your site. Our commitment on this blog is to reach out to you only with what matters and the solutions that get you there.