I have implemented "Eliminate render-blocking resources" as recommended by Google, using a deferring snippet, but for some reason the recommendation still keeps appearing in Google's suggestions. Any idea why?
I have cleared the cache.
And this is the code I have used to defer:
Normally those warnings can be removed or reduced if you use a caching plugin.
It also sometimes helps to move the scripts from the header to the footer using this plugin: https://wordpress.org/plugins/scripts-to-footerphp/
Firstly, whatever method you found there is not a good idea. What if JavaScript is turned off in your visitor's browser? They would see an unstyled page.
I find it surprising that Google would recommend this (I am guessing you found a very old article). See 'BONUS' below for a better way to defer CSS.
Anyway, render-blocking resources are resources that are needed to paint the 'above the fold' content (everything you can see without scrolling when a page first loads).
The reason your site is failing is that the browser still needs to download those resources to render the initial content (if anything, what you have done will slow it down, I am afraid!).
Getting around this is difficult, as you have to inline the CSS in style tags within the document body for every item above the fold.
That way the page can be rendered without waiting for any external CSS files to download -> this is what 'render-blocking resources' is about.
There are plugins that are supposed to be able to do this; in my experience none of them work, though, as this is a complex problem to solve.
The only way to do this is to either design the theme yourself with this in mind (bit late for that) or...
Use the coverage checker in Google's developer tools (or another coverage-checking tool - there are a few good ones on npm if you can use Node), load the page and, without scrolling, find all the used CSS.
Then add every item of the used CSS into a <style> block within the main body of the page, and remove those styles from your external CSS to save duplication (this step is not actually necessary to fix the issue).
As you will soon discover, this is a nightmare to reverse engineer, but with a little patience it is possible.
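For illustration only, here is a rough sketch of what the end result tends to look like; the selectors and rules are placeholders, not taken from any real site:
<body>
    <!-- critical, above-the-fold rules inlined so the first paint needs no external CSS -->
    <style>
        /* only the rules the coverage tool showed as used before scrolling */
        .site-header { background: #fff; }
        .hero { min-height: 100vh; font-family: sans-serif; }
    </style>
    <!-- ...rest of the markup; the full stylesheet can then be deferred as in the BONUS section below -->
</body>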
Example
See the source code of klu.io (this is my site, for transparency) to see how this is done. You will notice there are two <style> blocks at the top of the page; every item in there is needed to render 'above the fold' content. (There are two blocks because one is site-wide and one is page-specific.)
Clarity on 'above the fold', using my site as an example
On my site, 'above the fold' is the actual visible content when the page loads, as the home page is not scrollable on first load.
If you click the 'See What We Can Do For You' button, a load of content slides in from the right. That content is not visible initially, so the styles for it live in main.css.
BONUS
Also, for how to defer CSS properly so it still works without JS, try the following (you may need to adjust this for multiple CSS files, but that is easy once you get the concept):
<noscript id="ds">
<link rel="stylesheet" href="your-css.css" media="all">
</noscript>
JS
var dfr = function () {
    // grab the <noscript> block that holds the stylesheet link
    var n1 = document.getElementById("ds");
    // copy its text into a new div's innerHTML so the <link> gets parsed and the CSS downloads
    var r1 = document.createElement("div");
    r1.innerHTML = n1.textContent;
    document.body.appendChild(r1);
    // remove the now-redundant <noscript> block
    n1.parentElement.removeChild(n1);
};
// use requestAnimationFrame where available so the stylesheet is requested after the first paint
var raf = window.requestAnimationFrame || window.mozRequestAnimationFrame ||
          window.webkitRequestAnimationFrame || window.msRequestAnimationFrame;
if (raf) {
    raf(function () {
        window.setTimeout(dfr, 0);
    });
} else {
    // fall back to the load event in older browsers
    window.addEventListener("load", dfr);
}
The CSS is located within a <noscript> block as a fallback.
The JS then moves this CSS into a div that it creates as soon as the page has loaded.
Related
Where is the best place to put jQuery code (or a separate jQuery file)? Will pages load faster if I put it in the footer?
Put Scripts at the Bottom
The problem caused by scripts is that they block parallel downloads. The HTTP/1.1 specification suggests that browsers download no more than two components in parallel per hostname. If you serve your images from multiple hostnames, you can get more than two downloads to occur in parallel. While a script is downloading, however, the browser won't start any other downloads, even on different hostnames. In some situations it's not easy to move scripts to the bottom. If, for example, the script uses document.write to insert part of the page's content, it can't be moved lower in the page. There might also be scoping issues. In many cases, there are ways to workaround these situations.
An alternative suggestion that often comes up is to use deferred scripts. The DEFER attribute indicates that the script does not contain document.write, and is a clue to browsers that they can continue rendering. Unfortunately, Firefox doesn't support the DEFER attribute. In Internet Explorer, the script may be deferred, but not as much as desired. If a script can be deferred, it can also be moved to the bottom of the page. That will make your web pages load faster.
EDIT: Firefox has supported the DEFER attribute since version 3.6.
Sources:
http://www.w3schools.com/tags/att_script_defer.asp or better:
http://caniuse.com/#feat=script-defer
All scripts should be loaded last
In just about every case, it's best to place all your script references at the end of the page, just before </body>.
If you are unable to do so due to templating issues and whatnot, decorate your script tags with the defer attribute so that the browser knows it can keep parsing the HTML and execute your scripts once the document has been parsed:
<script src="my.js" type="text/javascript" defer="defer"></script>
Edge cases
There are some edge cases, however, where you may experience page flickering or other artifacts during page load which can usually be solved by simply placing your jQuery script references in the <head> tag without the defer attribute. These cases include jQuery UI and other addons such as jCarousel or Treeview which modify the DOM as part of their functionality.
Further caveats
There are some libraries that must be loaded before the DOM or CSS, such as polyfills. Modernizr is one such library that must be placed in the head tag.
Only load jQuery itself in the head, via CDN of course.
Why? In some scenarios you might include a partial template (e.g. an Ajax login-form snippet) with embedded jQuery-dependent code; if jQuery is loaded at the bottom of the page, you get a "$ is not defined" error. Nice.
There are ways to work around this, of course (such as not embedding any JS and appending it to a load-at-bottom JS bundle), but why lose the freedom of lazily loaded JS, of being able to place jQuery-dependent code anywhere you please? The JavaScript engine doesn't care where the code lives in the DOM, as long as its dependencies (like jQuery being loaded) are satisfied.
For your common/shared JS files, yes, place them before </body>, but for the exceptions, where it really just makes sense maintenance-wise to stick a jQuery-dependent snippet or file reference right there at that point in the HTML, do so.
There is no performance hit from loading jQuery in the head; what browser on the planet does not already have the jQuery CDN file in its cache?
Much ado about nothing: stick jQuery in the head and let your JS freedom reign.
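To make the argument concrete, here is a minimal sketch; the CDN version and the snippet itself are just placeholders:
<head>
    <!-- jQuery loaded once from a CDN so that $ is available to any snippet further down -->
    <script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>
</head>
<body>
    <!-- a partial template embedded anywhere in the page can safely use jQuery -->
    <form id="login-form">...</form>
    <script>
        $(function () {
            $('#login-form').on('submit', function (e) {
                e.preventDefault();
                // jQuery-dependent logic here
            });
        });
    </script>
</body>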
Nimbuz provides a very good explanation of the issue involved, but I think the final answer depends on your page: what's more important for the user to have sooner - scripts or images?
There are some pages that don't make sense without the images, but only have minor, non-essential scripting. In that case it makes sense to put scripts at the bottom, so the user can see the images sooner and start making sense of the page. Other pages rely on scripting to work. In that case it's better to have a working page without images than a non-working page with images, so it makes sense to put scripts at the top.
Another thing to consider is that scripts are typically smaller than images. Of course, this is a generalisation and you have to see whether it applies to your page. If it does, then that, to me, is an argument for putting them first as a rule of thumb (i.e. unless there's a good reason to do otherwise), because they won't delay the images as much as the images would delay the scripts. Finally, it's just much easier to have scripts at the top, because you don't have to worry about whether they're loaded yet when you need to use them.
In summary, I tend to put scripts at the top by default and only consider whether it's worthwhile moving them to the bottom after the page is complete. It's an optimisation - and I don't want to do it prematurely.
Most jQuery code executes on document ready, which doesn't happen until the end of the page anyway. Furthermore, page rendering can be delayed by JavaScript parsing/execution, so it's best practice to put all JavaScript at the bottom of the page.
Standard practice is to put all of your scripts at the bottom of the page, but I use ASP.NET MVC with a number of jQuery plugins, and I find that it all works better if I put my jQuery scripts in the <head> section of the master page.
In my case, there are artifacts that occur when the page is loaded, if the scripts are at the bottom of the page. I'm using the jQuery TreeView plugin, and if the scripts are not loaded at the beginning, the tree will render without the necessary CSS classes imposed on it by the plugin. So you get this funny-looking mess when the page first loads, followed by the proper rendering of the TreeView. Very bad looking. Putting the jQuery plugins in the <head> section of the master page eliminates this problem.
Although almost all websites still place jQuery and other JavaScript in the header :D, even check stackoverflow.com.
I also suggest putting it just before the closing body tag. You can compare the loading time with the script in either place; a script tag pauses the rest of the page from loading.
After moving JavaScript to the footer you may get an unusual-looking page until the JavaScript loads, so keep your CSS in the header section.
For me, jQuery is a little bit special, maybe an exception to the norm. So many other scripts rely on it that it is quite important it loads early, so the scripts that come later will work as intended. As someone else pointed out, even this page loads jQuery in the head section.
Just before </body> is the best place according to Yahoo Developer Network's Best Practices for Speeding Up Your Web Site; it makes sense.
The best thing to do is to test it yourself.
My webpage (webpage link) uses three JavaScripts: a TabSlideOut script, a SmoothDivScroll script and the TN3 Image Gallery script.
When the page is loaded for the first time, or reloaded, the TN3 Image Gallery script runs for a while, because many images have to be loaded and this takes time.
During this time the SmoothDivScroll script waits and only executes when the TN3 Image Gallery script has finished. Because of that, the page looks very ugly for a while: the images of the SmoothDivScroll script are shown one after another instead of scrolling smoothly, as they do once the SmoothDivScroll script has executed. You can see this when you reload the page.
What I would like to achieve is that the SmoothDivScroll script is executed first, and only then the TN3 Image Gallery script. Or anything else that could stop the webpage from looking ugly when it is reloaded.
I am not a very experienced web implementer and I don't have JavaScript programming knowledge. I have tried for two days to find a solution but I am struggling. I hope somebody can help solve my problem. Thanks.
I'm going to call this a FOUC problem; e.g., a "Flash of Unstyled Content." Very common. Been around since the late 1900's, and Safari is notorious for this.
Short Answer: Initially set visibility:hidden on your elements with an inline style. Then use JavaScript to set visibility:visible after they've loaded.
Generally, the solution is to hide content until the content is loaded, and then display it when it's ready. Often while content is loading, you will display a spinner of some kind.
Technically, there are many ways to do this. You can toggle the CSS display, visibility, and/or opacity settings. You can show an overlay div with a high z-index (which I call a "veil", with id="veil") and then remove it when the content is loaded, using a spinner as the veil. You can also move things completely off screen until they have loaded and then move them into place. You can combine these methods.
Personally I've had the best success cross-browser and cross-device with the CSS visibility property. I like how it reserves space for the object in the layout. The other solutions sometimes flake on mobile and some older browsers. Here are a couple of snippets to get you started.
First, set visibility to hidden with an inline style.
- DISCLAIMER: I am NOT a fan of inline styles, and understand the concept of separation of concerns. In this case, I deem it necessary because this must have the highest cascade priority, be applied as quickly as possible, and I've had good success with it cross-environment. Purists will argue that this should go in a CSS file, but I believe that we should not follow any guideline dogmatically; sometimes we must have the courage to break convention in the presence of strong justification. Let the reader decide.
<div id="pan-content" class="clearfix" style="visibility:hidden">
Then, on page load (using jQuery):
$(window).load(function () {
    // everything, including images, has finished loading, so reveal the content
    $('#pan-content').css({ visibility: 'visible' });
});
This might prove to be a little slow, because you're waiting until the whole window has loaded before you display the banner. You can also attach the event to specific resources, which will speed things up. See the following post:
Detect image load
Hope this helps!
You should never rely simply on the order of scripts to determine your execution order. Put your TN3 calls in a function that is called from the SmoothDivScroll complete method.
It might be easiest to use the non-minified version to do this.
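A sketch of the pattern being suggested; note that the option and method names below are hypothetical placeholders, so check the SmoothDivScroll and TN3 documentation for the real initialisation calls and callback hooks:
// wrap the gallery set-up in a function...
function startGallery() {
    $('#gallery').tn3({ /* gallery options */ }); // hypothetical init call, see the TN3 docs
}

// ...and only call it once the scroller reports it has finished setting up
$('#scroller').smoothDivScroll({
    setupComplete: startGallery // hypothetical option name, see the SmoothDivScroll docs
});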
On a website, I'm experiencing a "flash" of white that occurs between page loads. It looks bad because I'm using a background image, and when the page loads the background image flashes before it comes onto the screen (take a look for yourself). This issue occurs in Chrome and IE but not in Firefox.
The site has a way of preloading things. Every element on the page is in a wrapper div, #website, which is initially display:none, and every image is in a wrapper div, #website-images, which is also hidden. The site (using a jQuery plugin) then checks whether all the images in #website-images have finished loading; once they have, a cookie is set to remember that this user has already loaded the images, so the preloading process is skipped when they go to another page or reload the current one, and $("#website").show() is called to display the page.
So what could be causing this flickering between page loads? Is it my way of preloading images? I've added different doctypes and changed meta information, but NOTHING has worked. I'm really lost here; does anyone have any ideas or insights?
This is happening because the DOM-loaded (DOMContentLoaded) event fires some milliseconds before the page actually renders.
In a nutshell, this means you have to optimise your website's speed. That doesn't just mean making it download faster; it means downloading things in the correct order, in a non-blocking way.
Step one: Your markup
1)
It seems there is a lot you can do to optimise your markup. Firstly, the order of stylesheets and JavaScript files can be optimised. To ensure CSS files are downloaded asynchronously, you always have to include external CSS before external JavaScript files. At the moment, style.css is downloaded after some/all of your JavaScript calls.
There is one script block in the head between an external CSS file and another resource. To allow parallel downloading, move the inline script before the external CSS file, or after the next resource.
2)
Your main JavaScript is inline within your markup. Not only does this block the rest of the page until the script has run, but having it before your content is probably causing (or adding to) the white flash.
Loading your script asynchronously in the head is my preferred method. You will then have to trigger your script when the DOM has finished loading, or you can achieve the same result by placing the script at the bottom of the body tag.
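As a rough sketch of that approach (the file path is a placeholder), a tiny loader in the head injects the main script without blocking parsing:
<head>
    <script>
        // inject main.js without blocking HTML parsing (path is a placeholder)
        var s = document.createElement('script');
        s.src = '/js/main.js';
        s.async = true;
        document.getElementsByTagName('head')[0].appendChild(s);
    </script>
</head>
Then, inside main.js, wait for the DOM before touching the page:
document.addEventListener('DOMContentLoaded', function () {
    // safe to read and modify the DOM from here
});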
Step two: Harness the browser's capabilities
1) Looking at the HTTP headers, there are 28 items being served as separate HTTP calls that are not being cached by the browser (including the HTML pages, JPG images, stylesheets and JavaScript files).
These items are explicitly non-cacheable, and this can be easily fixed by editing your webserver's configuration.
2) Enable gzip compression. Most web browsers (yes, even IE) support gzip decompression, and most (if not all) web servers can compress with gzip. You could even go overkill and look into SPDY, an alternative, lighter HTTP protocol (supported in Chrome and Firefox).
Step three: Content serving
There are around 30 individual items being served from your domain. Firstly, consider how you can reduce that number of requests; 30 HTTP requests per page view is a lot. You can combat this using the following methods:
1) Parallelise downloads across multiple hostnames. Browsers currently limit the number of concurrent connections to a single domain. Serving your images from a separate domain (for example, img.bigtim.ca) allows them to be downloaded in parallel with other content.
2) Combine multiple items into one. Many of the items being downloaded are purely style content, such as the logo, menu elements, etc. These can be combined into a single image (downloaded only once) and split up using CSS. This is called CSS spriting, and Stack Overflow does it too; a rough sketch follows this list.
3) If you cannot reduce the number of items that need downloading, you could reduce the load on your server (and, in turn, the client's browser) by serving static content from a cookieless domain. Stack Overflow does this with all its static content, such as images, stylesheets and scripts.
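The spriting idea from point 2, sketched with a made-up file name and offsets:
/* icons.png holds the logo and the menu icons in one image, downloaded once */
.icon        { background: url('images/icons.png') no-repeat; }
.icon-logo   { width: 120px; height: 40px; background-position: 0 0; }
.icon-search { width: 16px; height: 16px; background-position: -120px 0; }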
Step four: Optimise your own code
There's only so much that HTTP and browser technology can do to help your website's speed. This last step is down to you.
1) Is there any reason you chose to host jQuery yourself? jQuery's download page lists multiple CDNs you can point to for speedy, cached script downloads.
2) There are currently over 20 unused CSS rules within your stylesheets (36% of your entire CSS file). Have a rethink about what is really needed.
3) The main chunk of JavaScript (at the top of your body tag) seems to be a hack to attempt to speed things up, but is probably not helping anything.
A cookie is being set to specify whether or not the page has faded in yet. Not only are you using JavaScript to perform a transition which can happily be performed by CSS, but more than half of the script is used to define the functionality for reading and writing the cookie.
Seeing things like this: $("body").css ("background-image", "url('images/background.png')"); and $("#website").show (); usually gets me ranting about "separation of concerns", but this answer is long enough now so hopefully you can see that it is bad practice to mix style and functionality in the same code.
Addendum: Looking at the code, there is no need for jQuery at all to perform what you are doing. But then again, there is no need to perform what you are doing, so you could probably do better without any JavaScript at all.
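Purely as an illustration of that last point, the fade currently done with jQuery and a cookie could be expressed in CSS alone, using the #website wrapper from the question (older browsers may need vendor prefixes):
/* fade the wrapper in on every page load, no script or cookie required */
#website { animation: fade-in 0.5s ease-in; }
@keyframes fade-in {
    from { opacity: 0; }
    to   { opacity: 1; }
}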
Move your JavaScript to the end of the HTML, just before the closing body tag. Sometimes it helps.
I know this is an old thread, but here is a hack I tried that works.
The idea is not to display anything until the CSS has loaded completely.
In the HTML file:
<body style="display:none">
In your CSS, as the last line:
body{display:block !important}
CSS is render-blocking.
Divide your CSS into two parts:
Critical CSS
Non-critical CSS
Make the critical CSS load with the page; it should come embedded within the head tag.
Lazy-load the non-critical CSS via Ajax (a sketch follows below).
This will result in a serious performance improvement for your webpage, leading to less white-screen time.
Also, consider loading your JavaScript with async/defer.
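A minimal sketch of the lazy-loading step; this version injects a link element after the load event rather than making a literal Ajax request, and the file path is a placeholder:
window.addEventListener('load', function () {
    // fetch the non-critical stylesheet only after the page has rendered
    var link = document.createElement('link');
    link.rel = 'stylesheet';
    link.href = '/css/non-critical.css';
    document.getElementsByTagName('head')[0].appendChild(link);
});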
I am about to embark on a new web project and I plan to put some JavaScripts in the <head> and also some before </body>, using the following scheme:
Scripts that are essential for the UX of the page: in the <head>. As I've picked up perusing the web, scripts in the <head> are loaded before the page content, so it would make sense to put scripts that are essential to the user experience there.
Scripts that are non-essential to the design and UX (Google Analytics scripts etc.): before the </body>.
Is this a sensible approach?
Another approach would be to put all the scripts in the <head> and add defer attributes to the non-essential scripts. However, I read that older versions of Firefox don't pick up the defer attribute.
I think a lot of developers run JavaScript just before the </body> so that it is run after all the elements have been rendered.
However, if you organise your code correctly, the position on the page doesn't matter.
For example, when using jQuery, you can ensure the code isn't run until the page and its elements are fully rendered by doing the following:
$(document).ready(function(){
//Code here
});
Then the script reference can be put in the head tag.
Script tags should be referenced just before </body>. This prevents render blocking while the scripts load and is much better for site perception speed.
No obtrusive JavaScript should be used when using this technique.
JavaScript code should be placed at the end of the document so that it doesn't delay the parallel loading of page elements. This does then require that the JavaScript code is written in a specific way, but it does improve the speed of page loads.
Also, ideally you could host references like this under a different (sub)domain. References to jQuery should be pointed to Google's CDN too.
See Best Practices for Speeding Up Your Web Site for more information.
One of the reasons you'd want to put scripts before </body> is if they manipulate the DOM without user interaction, so the DOM needs to be loaded before it can be manipulated. Another way to do that is to add an event listener and run the scripts when the page has loaded, but this requires additional code, which can get complicated if you have a lot of scripts, especially ones you haven't written yourself. Putting them at the end of the page will also speed up page load, though in the case of DOM-manipulating scripts you might get some not-so-pretty results from that.
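The event-listener variant mentioned above looks roughly like this in plain JavaScript (DOMContentLoaded fires once the HTML has been parsed; use the load event instead if you also need images and other resources to be ready):
document.addEventListener('DOMContentLoaded', function () {
    // DOM-manipulating code can run here even if this script sits in the <head>
});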
I'd say that's perfectly sensible. As you said, as long as you don't move essential scripts (e.g. jQuery, Modernizr, etc., etc.) out from the <head>, you shouldn't have problems.
Moving non-essential scripts to the bottom of the page should help with the perceived loading speed (that and minimizing / concatenating scripts).
It all depends on what you mean by "essential for UX". I agree with having Modernizr appear early for example, but not everything needs to load straight away. If you're trying to avoid a flash of unstyled text (FOUT), that's a good reason. Similarly, if you have scripts that affect how the page looks before the user does anything, you should load those early.
Don't forget though, speed is part of UX. There's no advantage in having some jQuery interaction ready to run when the user can't see the content it applies to yet. The difference between loading the scripts at the start or the end is a matter of seconds. If you let the page load first, the user will be using those seconds to take the page in, allowing you to load scripts unobtrusively.
Your page will load faster if you move scripts to the bottom of the page, and that makes a difference to your pagerank these days.
Also, some versions of Internet Explorer will throw errors if you try to run a script before the element it refers to has loaded.
Like Ed says, your scripts should be stored in a separate file, and in as few files as possible.
Put the JavaScript code in a separate file and place a link to it in the head part of the HTML.
In order to stop JavaScript from blocking webpage rendering, can't we just put all our JS files/code to be loaded/executed just before the closing </body> tag?
All JS files and code would then be downloaded and executed only after the whole page has been rendered, so what's the need for tricks like the one suggested in this article about non-blocking techniques to load JS files? He basically suggests using code like:
document.getElementsByTagName("head")[0].appendChild(script);
in order to defer the script load while letting the webpage render, thus resulting in a faster rendering speed for the webpage.
But without using this type of non-blocking technique (or other similar techniques), wouldn't we achieve the same non-blocking result by simply placing all our JS files (to be loaded/executed) before the closing </body> tag?
I'm even more surprised because the author (in the same article) suggests putting his code before the closing </body> tag (see the "Script placement" section of the article), so he is basically loading the scripts before the closing </body> tag anyway. What's the need for his code then?
I'm confused; any help appreciated, thanks!
UPDATE
FYI, Google Analytics uses a similar non-blocking technique to load its tracking code:
<script type="text/javascript">
...
(function()
{
var ga = document.createElement('script');
ga.type = 'text/javascript';
ga.async = true;
ga.src = 'your-script-name-here.js';
var s = document.getElementsByTagName('script')[0];
s.parentNode.insertBefore(ga, s); //why do they insert it before the 1st script instead of appending to body/head could be the hint for another question.
})();
</script>
</head>
Generally speaking, no. Even if the scripts are loaded after all the content of the page, loading and executing them can still block the page. One reason is the possible presence of document.write calls in your scripts.
However, if all you want is for the page content to appear quickly, placing script tags right before the </body> tag gives the same result as creating script tags dynamically. The most significant difference is that statically included scripts are executed one by one, in order; in other words, there is no parallel execution of script files (and in old browsers the same is true for downloading the scripts).
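To make the contrast concrete (file names are placeholders): statically included scripts run strictly in document order, while dynamically created script elements download without blocking the parser and, by default, execute in whatever order they arrive:
<!-- static: executed one after another, in document order -->
<script src="a.js"></script>
<script src="b.js"></script>
</body>
versus
<script>
    // dynamic: both files download without blocking parsing,
    // but execution order is no longer guaranteed
    ['a.js', 'b.js'].forEach(function (src) {
        var s = document.createElement('script');
        s.src = src;
        document.getElementsByTagName('head')[0].appendChild(s);
    });
</script>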
If you want asynchronous scripts:
Use the (HTML5) async attribute if it is available in the browsers you target. This is what Google Analytics is doing in the code you posted (specifically the line ga.async = true; see the MDN link, scroll down to async).
However, this can cause your script to load at an arbitrary point during the page load, which might be undesirable. It's worth asking yourself the following questions before choosing to use async:
Don't need user input? Then use the async attribute.
Need to respond to buttons or navigation? Then you need to put the scripts at the top of the page (in the head) and not use async.
Async scripts run in any order, so if your script depends on (say) jQuery, and jQuery is loaded in another tag, your script might run before the jQuery script does, resulting in errors.
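In the simplest case the trade-off looks like this (file names are placeholders):
<!-- self-contained code such as analytics: fine to load async, it runs whenever it arrives -->
<script async src="analytics.js"></script>

<!-- app.js depends on jQuery, so keep the guaranteed order and skip async -->
<script src="jquery.js"></script>
<script src="app.js"></script>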
Why don't people just put things at the bottom of the body tag? If a script is taking long enough to load that it slows or pauses the loading of the website, it's quite possible that it will also pause or hang the website after it has loaded (expect different behaviour in different browsers), making your website appear unresponsive (click on a button and nothing happens). In most cases this is not ideal, which is why the async attribute was invented.
Alternatively, if your script is taking a long time to load, you might want to (after testing) minify and concatenate your scripts before deploying them to the server.
I recommend using require.js for minifying and concatenation, it's easy to get running and to use.
Minifying reduces the amount of data that needs to be downloaded.
Concatenating scripts reduces the number of "round-trips" to the server (for a far away server with 200ms ping, 5 requests takes 1 second).
One advantage of asynchronous loading (especially with something like the analytics snippet), at least if you place it at the top, is that the script is requested as soon as possible without costing any time in rendering the page. So with analytics, the chance of actually tracking a user before they leave the page (maybe before the page has fully loaded) is higher.
And insertBefore is used instead of append because, if I remember correctly, there was a bug (I think in some IE versions; see also the link below, there is something about that in the comments).
For me this link:
Async JS
was the most useful I have found so far, especially because it also brings up the issue that even with Google's analytics code the onload event will still be blocked (at least in some browsers). If you don't want that to happen, it is better to attach the function to the onload event.
As for putting the asynchronous snippet at the bottom, that is actually explained in the link you posted. He seems to do it just to make sure the DOM is completely loaded, without using the onload event. So it may depend on what your scripts are doing; if you're not manipulating the DOM, there should be no reason to add it at the bottom of the body. Besides that, I personally would prefer attaching it to the onload event anyway.