Administrator
If one externalizes all CSS style definitions and JavaScript files, and disallows all user agents from accessing these external files (via robots.txt), would this cause problems for Googlebot? Does Googlebot need access to these files?
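For reference, the kind of blocking described here would look something like this in robots.txt (the /assets/ paths are just hypothetical examples):

    User-agent: *
    Disallow: /assets/css/
    Disallow: /assets/js/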
It is recommended not to block these files. For instance, the White House recently released a new robots.txt that blocked its images directory, CSS, and JavaScript. It is better not to block such resources, because access to them is very helpful to Google when something spammy is going on with a site's JavaScript.
So, it is really good to let Googlebot go ahead and crawl them. And notably, these files are not huge, so allowing access does not drain a lot of bandwidth.
In short, just let Googlebot have access to all such resources. Most of the time Google will not fetch them, but in rare cases, such as when they run a quality check on a site or receive a spam report, they will fetch those files to verify that the site is clean and does not have any sort of problem.
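A minimal sketch of the recommended approach, assuming the externalized files live under a hypothetical /assets/ directory: even if that directory is blocked for other reasons, you can explicitly allow the CSS and JavaScript inside it, since Googlebot supports Allow rules and wildcards:

    User-agent: Googlebot
    Disallow: /assets/
    Allow: /assets/*.css
    Allow: /assets/*.js

With rules like these, Googlebot can still fetch the style and script files it needs to render the pages, while the rest of the directory stays blocked.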