How to Fix the 'Googlebot cannot access your JavaScript and/or CSS files' for Concrete5

Don’t be alarmed if you received a warning from Google in your email today — many webmasters were recently alerted that “Googlebot cannot access your JavaScript and/or CSS files.”

Google sent this warning via Search Console, reminding site owners that Googlebot’s inability to access those files may result in “suboptimal rankings.”


Although this sounds scary, there is a very simple fix for Concrete5 websites.  In the root directory of your website is a robots.txt file, which tells Googlebot which folders it is and is not allowed to crawl.  By default, Concrete5 creates this robots.txt file during installation.  Google wants access to your theme, package, block, and JavaScript folders so that it can render the website exactly the way a user would see it.  To fix the crawl issue, remove the following four lines from the robots.txt file, and Googlebot should be able to see everything it needs to:

Disallow: /themes
Disallow: /packages
Disallow: /blocks
Disallow: /js
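If you want to confirm the edit worked before waiting on Google to re-crawl, you can test the rules yourself. Below is a minimal sketch using Python's standard urllib.robotparser module; the file contents and the example CSS path are assumptions for illustration, not taken from a real Concrete5 site.

```python
# Check whether Googlebot may fetch a path under a given robots.txt,
# using only the Python standard library.
from urllib.robotparser import RobotFileParser

# Simulated robots.txt that still contains the problematic rules.
blocked_rules = """User-agent: *
Disallow: /themes
Disallow: /packages
Disallow: /blocks
Disallow: /js
"""

# Simulated robots.txt after the four Disallow lines are removed.
fixed_rules = """User-agent: *
"""

def can_fetch(rules, path, agent="Googlebot"):
    """Parse robots.txt text and report whether `agent` may crawl `path`."""
    parser = RobotFileParser()
    parser.parse(rules.splitlines())
    return parser.can_fetch(agent, path)

# Hypothetical stylesheet path used only for this check.
print(can_fetch(blocked_rules, "/themes/mytheme/main.css"))  # False
print(can_fetch(fixed_rules, "/themes/mytheme/main.css"))    # True
```

Running this against your live file (fetched from yoursite.com/robots.txt) gives a quick sanity check that the assets Google complained about are no longer blocked.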

We have found that Googlebot renders pages much like the Chrome browser, and Google wants to see your site the way users do so it can rank it appropriately.  If you need help fixing this, let us know.



For more information about this blog or Concrete5 please contact Jamie Johnson.