Google sent out this warning via Search Console, while also reminding site owners that Googlebot's inability to access those files may result in "suboptimal rankings."
Although this sounds scary, there is a very simple fix for Concrete5 websites. In the root directory of your website is a robots.txt file. This file tells Googlebot which folders it is allowed to crawl and which folders it is not. By default, when Concrete5 is installed, this robots.txt file is created with several folders blocked. To fix the crawl issue, remove the blocking lines from the robots.txt file. Google wants access to those folders so that it can render the website exactly the way a user would see it. With those lines removed, Googlebot should be able to see everything it needs to.
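As an illustration, here is what the change might look like. The exact Disallow entries vary by Concrete5 version, so treat the paths below as examples and check your own file for the actual entries:

```text
# Before: a default Concrete5 robots.txt blocking core directories
# (example paths -- your file may list different folders)
User-agent: *
Disallow: /blocks
Disallow: /concrete
Disallow: /updates

# After: the Disallow entries are cleared so Googlebot can fetch
# the CSS and JavaScript files it needs to render your pages
User-agent: *
Disallow:
```

After saving the updated file, you can confirm the fix with the robots.txt Tester and Fetch as Google tools in Search Console.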
We have found that Googlebot renders pages much like the Chrome browser, and Google wants to know what your site looks like so it can rank your website appropriately. If you need help fixing this, let us know.