For technical SEOs, this rule comes as no surprise – yet we still see this mistake being made on websites today.
If this is a persistent sitewide issue, you might receive a message in your Google Search Console: ‘Googlebot cannot access CSS and JS files’. We have covered this issue in a previous article and we strongly recommend that you unblock access. As previously demonstrated, this simple action can dramatically change your SEO rankings.
The test works on a page-by-page basis; however, these issues are generally site-wide, so checking the main templated pages (category landing page, product landing page, blog post, etc.) will usually highlight whether there is a site-wide issue.
First, enter a URL and wait for the test result.
Second, if you see “Page Loading Issues”, click on “View Details” and a new page will open.
The URL Inspection tool has now replaced the Fetch and Render tool, which was deprecated along with the old Search Console, and it runs on the same principle as the Mobile-Friendly Test.
When you inspect a URL from the same domain as the verified property, you will see the inspection result for that URL.
To find out if there are any blocked resources, run a Live Test. If any resources are blocked, a warning will appear under Page resources in the More info tab in the right-hand corner.
There are two courses of action, depending on whether or not the blocked resources are hosted on your own domain. If they are, resolving the issue simply involves identifying the robots.txt rules causing the problem and amending them to allow crawling and rendering of the resources, as in the sketch below.
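For example, a blanket disallow on an assets directory can be narrowed with explicit Allow rules. A minimal sketch, assuming the blocked files live under a hypothetical /assets/ path:

# Before: Googlebot cannot fetch the CSS and JS needed to render the page
User-agent: *
Disallow: /assets/

# After: the directory stays blocked, but rendering resources are allowed
User-agent: *
Disallow: /assets/
Allow: /assets/*.css$
Allow: /assets/*.js$

Googlebot supports the * and $ wildcards in robots.txt, and the more specific Allow rules take precedence over the broader Disallow.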
It’s a little trickier if you do not host the resources yourself, as it is likely that a third-party tool has blocked Googlebot in its own robots.txt. While some of these tools may not affect content rendering, others will. In these cases, it’s important to contact the third party and ask them to unblock the .js and .css files in their robots.txt.
Since render-blocking resources were first flagged in the PageSpeed Insights tool, the industry has learnt a lot about how Google renders pages.
Having good page speed is a crucial aspect of SEO; content needs to load within the five-second threshold for the web page to be indexed by Google.
Small critical scripts can be inserted directly into the HTML document (known as inlining): this unblocks rendering while ensuring that style is applied to the above-the-fold content straight away.
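A minimal sketch of the pattern, with hypothetical file names and selectors: the critical styles are inlined, while the full stylesheet loads without blocking rendering.

<head>
  <style>
    /* Critical above-the-fold styles, inlined so no extra request blocks first paint */
    .hero { margin: 0; font-family: sans-serif; }
  </style>
  <!-- Load the full stylesheet without blocking rendering -->
  <link rel="preload" href="/css/main.css" as="style" onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/css/main.css"></noscript>
</head>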
The simplest way to check is to use “view page source” in a web browser and look for any large <script> blocks inlined in the <head> above attributes that are important for SEO.
Large scripts like this take a long time to load and, combined with other external render-blocking scripts in the <head>, make it likely that important SEO attributes will not be rendered.
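For example, a contrived sketch of the problem: a heavy inlined script pushes the SEO-critical tags far down the <head>.

<head>
  <script>
    // ...thousands of lines of bundled application code inlined here...
  </script>
  <title>Example Product</title>
  <link rel="alternate" hreflang="en" href="https://www.example.com/en/product">
</head>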
It is recommended to place your important SEO attributes as close as possible to the beginning of the <head>. This is especially true for hreflang tags:
When it comes to using <link rel="alternate" hreflang>, Google recommends the following as best practice:
Put your <link> tags near the top of the <head> element. At minimum, the <link> tags must be inside a well-formed <head> section, or before any items that might cause the <head> to be closed prematurely, such as <p> or a tracking pixel. If in doubt, paste code from your rendered page into an HTML validator to ensure that the links are inside the <head> element.
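Putting that advice together, a well-ordered <head> might look like this (all URLs are hypothetical):

<head>
  <title>Example Product</title>
  <link rel="canonical" href="https://www.example.com/en/product">
  <link rel="alternate" hreflang="en" href="https://www.example.com/en/product">
  <link rel="alternate" hreflang="fr" href="https://www.example.com/fr/product">
  <!-- Large inline or third-party scripts belong after the SEO-critical tags -->
  <script src="/js/tracking.js" defer></script>
</head>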
If you’ve found a lot of script in the <head>, move your hreflang attributes above it, near the top of the <head>. The best way to confirm success is to check the “International Targeting” report in Google Search Console, where a sudden rush of detected hreflang tags should appear.
We’ve covered three hreflang mistakes and how to fix them here.
The W3C describes the DOM as follows: “The Document Object Model is a platform- and language-neutral interface that will allow programs and scripts to dynamically access and update the content, structure and style of documents. The document can be further processed, and the results of that processing can be incorporated back into the presented page.”
Google renders JavaScript in a second wave of indexing, and with some websites waiting weeks for that second crawl to occur, they may suffer the consequences of reduced online visibility long before the problem is identified.
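To illustrate why this matters, a minimal sketch (the URL and element id are hypothetical): the link below exists only in the rendered DOM, so Google will not see it until that second wave.

<div id="category-links"></div>
<script>
  // The link is written into the DOM at runtime; it is absent from the raw HTML source
  document.getElementById('category-links').innerHTML =
    '<a href="/category/shoes">Shoes</a>';
</script>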
Best practice tip: implement links with standard <a href> elements rather than JavaScript event handlers like this one:
<button onclick="myFunction()">Click me</button>
After this method of linking was introduced, the website showed a dramatic drop in crawl activity, followed by a huge loss of traffic.
Note: the xlink:href method (used for links within SVG) is now deprecated, and it is recommended to use a standard <a href> instead.
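A sketch of the crawlable alternative (the URL and function name are hypothetical): Googlebot can follow the href even if the JavaScript never runs.

<!-- Crawlable: the href is followed by Googlebot; the onclick is an optional enhancement -->
<a href="/category/shoes" onclick="myFunction(event)">Click me</a>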
Using the Chrome extension Link Redirect Trace, we can see that a catch-all 302 redirect was implemented on this website, sending users and robots to a new URL containing a “?url=” parameter and creating thousands of new URLs.
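For illustration, the pattern behaves like this (domain hypothetical):

https://example.com/any-old-path
302 redirect to: https://example.com/?url=/any-old-path

Every distinct path spawns a distinct parameterised URL, so the redirect multiplies crawlable URLs instead of consolidating them.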
To fix this issue, we recommend removing the catch-all 302 redirect.