Prevent displaying unstyled page when doing a page-refresh - javascript

How can I prevent my page from displaying an unstyled view while it is loading?
I think it's probably because of the order in which the different JavaScript files are loaded.
Is there a best practice, for example loading plugins before my own code?
Should every jQuery/JS function be called after document.ready or window.load?
link to page
Thanks

YSlow will give you some good ideas for starting points on this page. Quoting from a run against your site, www.cewas.org:
Grade D on Reduce the number of DOM elements. There are 1489 DOM elements on the page. A complex page means more bytes to download, and it also means slower DOM access in JavaScript. Reduce the number of DOM elements on the page to improve performance.
Grade E on Make fewer HTTP requests. This page has 11 external JavaScript scripts. Try combining them into one. This page has 5 external stylesheets. Try combining them into one. Decreasing the number of components on a page reduces the number of HTTP requests required to render the page, resulting in faster page loads. Some ways to reduce the number of components include: combine files, combine multiple scripts into one script, combine multiple CSS files into one style sheet, and use CSS Sprites and image maps.
Grade F on Compress components with gzip. There are 15 plain text components that should be sent compressed ... Compression reduces response times by reducing the size of the HTTP response. Gzip is the most popular and effective compression method currently available and generally reduces the response size by about 70%. Approximately 90% of today's Internet traffic travels through browsers that claim to support gzip.
Grade F on Add Expires headers. There are 61 static components without a far-future expiration date.... Web pages are becoming increasingly complex with more scripts, style sheets, images, and Flash on them. A first-time visit to a page may require several HTTP requests to load all the components. By using Expires headers these components become cacheable, which avoids unnecessary HTTP requests on subsequent page views. Expires headers are most often associated with images, but they can and should be used on all page components including scripts, style sheets, and Flash.
To add my own 2 cents: you might want to hide all elements until the entire page loads. This seems to be what you intend with the progress bar, but the sheer number of elements and scripts/styles on your page seems to be preventing it. You could load the bare minimum subset of CSS/JS/HTML needed to set up the progress bar, then load the rest of the elements with some asynchronous JavaScript, only showing the full page once everything has loaded (à la Gmail).
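A minimal sketch of that idea, assuming jQuery is already on the page and using hypothetical #progress and #content elements:

<div id="progress">Loading...</div>
<div id="content" style="display: none;">
    <!-- the heavy part of the page -->
</div>

<script>
// $(window).load fires only after all images, scripts and styles have
// finished, so the styled content is revealed in one go instead of
// flashing unstyled.
$(window).load(function () {
    $('#progress').hide();
    $('#content').fadeIn();
});
</script>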

The main problem with your site is that you are attempting to load everything in one go at initial page load. The Web Developer toolbar's document size report shows a total of 1.1 MB of content: nearly 750 KB of images and 385 KB of scripts. Loading this much content in one go is really not recommended, especially for slower connection speeds.
The obvious solution would be to hide everything and only display it when the scripts are ready, but this is a really bad solution: your visitors would be looking at upwards of 10 seconds of blank page. What you should be doing is restructuring the site; rethink your architecture. Websites aren't meant to be downloaded in one go, there's too much data, and this is one of the reasons users dislike Flash sites: Flash has to download all of its assets up front, so users are forced to sit through long waiting times.
You should either break the pages up into normal HTML documents, which will load traditionally, or use Ajax to load the contents sequentially. Have your script intelligently reorder the loading of contents: front-page contents first, then, as the user chooses where to go, the site loads the assets for those pages in the background. Images are the big problem here: you have 0.7 MB of them, and loading all of them blocks the loading of the rest of the site, including scripts, for a very long time.
These aren't easy tasks by any means, but they are the best way to rectify the problem. A rough sketch of the on-demand approach follows.
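Something along these lines, assuming jQuery, a #content container, and per-page HTML fragments under a hypothetical /pages/ directory:

// Ship only the front page up front; fetch other sections on demand.
$(document).ready(function () {
    $('nav a').click(function (e) {
        e.preventDefault();
        // e.g. "/pages/projects.html" -- a hypothetical fragment URL
        var url = $(this).attr('href');
        // Swap the fragment in via Ajax; its images only start
        // downloading now, instead of at the initial page load.
        $('#content').load(url);
    });
});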
Some more immediate solutions to your problems include most of what matt b said: enable gzip compression, combine all your scripts and stylesheets into a single file, etc. You should also consider using CSS sprites for your images to reduce the number of HTTP requests. There are a large number of jQuery plugins that are prime candidates for consolidation, and you're also loading jQuery twice, adding nearly 100 KB to the amount you need to load.

I assume you are using an external stylesheet?
If you simply embed all your styles in the page, it will not show up unstyled.
Also, place the stylesheet definition before any JavaScript code in the head.
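For example (file names are placeholders):

<head>
    <!-- CSS first, so the page never renders unstyled -->
    <link rel="stylesheet" type="text/css" href="/css/style.css" />
    <!-- scripts after the CSS (or, better, at the bottom of the body) -->
    <script type="text/javascript" src="/js/jquery.js"></script>
    <script type="text/javascript" src="/js/site.js"></script>
</head>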

This is commonly known as FOUC: flash of unstyled content. The options in the answer above make eminent sense, but it may still be impossible to eliminate in all circumstances. Initially it was an IE problem, but it has recently happened a lot more in Safari; it is probably down to the underlying rendering method used in the browser itself and therefore may not be remediable.
Sometimes add-ins such as jQuery and Typekit can exacerbate the problem, hence it pays to look closely at, and test, different scenarios for loading them.

Related

Advantage of loading javascript files in footer instead of header? [duplicate]

What are the real benefits (if any) of loading your JS at the bottom of the document, as opposed to the top? There seems to be a brief delay in page loading and in JS-dependent functionality.
I am using html5boilerplate as the starting point for all my templates, but am not really sure how beneficial loading the JS at the bottom is.
Does it really make that much of a difference, and if so, why?
If you include external JS files at the bottom of your page, you give the priority of your HTTP requests to the visual display that will be presented to the client, instead of to the logic of interaction or dynamics. I believe that, if you do not use a content delivery network to deliver images to the client, the browser will only process a maximum of two HTTP requests to your host at a time. You do not want to waste those requests on logic, because we all know how impatient the end user is.
By loading JS at the end of the file you can (most of the time) access the DOM without having to call a document.ready() handler. You know that if the page render has finally made it down to your JavaScript code, the necessary page elements have usually loaded already.
There are a few more reasons out there, but these are the important ones I try to remember when it feels so awkward to place all the JS at the bottom.
While a referenced script is downloading, browsers typically won't download other files in parallel, thus slowing the page load.
Reference: Best Practices for Speeding Up Your Web Site
A Google search will return a large number of results as to why you would want to do so and what improvement you'll see. Check out some of the following links:
High Performance Web Sites: Rule 6 - Move Scripts to the Bottom
Rails Best Practices: Scripts at Bottom
Basically, the main reason for doing so is that you'll improve render times of your page. From the first article:
[I]t’s better to move scripts from the top to as low in the page as possible. One reason is to enable progressive rendering, but another is to achieve greater download parallelization.
It depends on what is in the JS. If you only want it to 'go' when the page loads, either surround your code with jQuery's $(function(){ ... }) or place it at the bottom of the page. Both variants are sketched below.
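As a sketch (#menu is a placeholder):

<!-- Option 1: anywhere in the page, deferred until the DOM is ready -->
<script>
$(function () {
    $('#menu').show();
});
</script>

<!-- Option 2: at the bottom of the body, where the DOM above it already exists -->
<script>
$('#menu').show(); // safe here without a ready handler
</script>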

Is using inline JavaScript preferred to an external include if the script is really short?

I use external JavaScript files on a website, as I always try to keep JavaScript at the bottom and external.
But Google Page Speed gives me this suggestion:
The following external resources have small response bodies. Inlining the response in HTML can reduce blocking of page rendering.
http://websiteurl/ should inline the following small resources:
http://websiteurl/script.js
This external JS file has only this content:
$(document).ready(function() {
    $("#various2").fancybox({
        'width': 485,
        'height': 691
    });
});
But in YSlow I get this suggestion:
Grade n/a on Make JavaScript and CSS external
Only consider this if your property is a common user home page.
There are a total of 3 inline scripts
JavaScript and CSS that are inlined in HTML documents get downloaded each time the HTML document is requested. This reduces the number of HTTP requests but increases the HTML document size. On the other hand, if the JavaScript and CSS are in external files cached by the browser, the HTML document size is reduced without increasing the number of HTTP requests.
Which is right, Google or Yahoo?
This is a bit of a problematic example, on quite a few fronts.
You can organise your scripts in such a way that you do not need to inline that JS. For example, you could have a common.js file that runs that snippet and other similar snippets, simplifying your code.
Additionally, this seems to have awoken the "never inline any JavaScript EVER" architecture police. It turns out that sometimes it is a best practice to inline JavaScript; for example, look at the common snippet from Google Analytics.
Why are Google suggesting you should inline this tiny script?
Because 20% of the page visits you get have an unprimed cache.
If you have a cache miss, it is likely a new connection to your site will need to be opened (one round trip) and then the data delivered in a second round trip (if you are lucky, you get to use a keep-alive connection and it is cut to one round trip).
For a general "global" English web application you are looking at a typical 110ms round trip time for a service hosted in the US. If you are using a CDN the number would probably be halved.
Even if the resource is local, the web browser may still need to access the disk to grab that tiny file.
JavaScript script tags without async or defer are blocking: if this script is somewhere in the middle of your page, rendering will be stuck there until the script downloads.
From a performance perspective, if the only two options are:
Place a 50-char bit of JavaScript inline.
Place the 50 chars in a separate file and serve it.
Considering that you are a good web citizen and compress all your content, the additional payload this adds is negligible compared to the 20 percent risk of giving people a considerable delay. I would always choose #1.
In an imperfect world, it is very rare to have such a clear and easy set of options. There is an option 3 that involves async-loading jQuery and grabbing this functionality from a common area.
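A sketch of what that option 3 could look like; the CDN URL and the common.js file are assumptions, not something prescribed here:

// Inject a script tag without blocking rendering.
function loadScriptAsync(src, callback) {
    var s = document.createElement('script');
    s.src = src;
    s.async = true;
    s.onload = callback; // older IE would need onreadystatechange as well
    document.getElementsByTagName('head')[0].appendChild(s);
}

// Load jQuery asynchronously, then the shared file that holds
// the fancybox snippet and similar glue code.
loadScriptAsync('//ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js', function () {
    loadScriptAsync('/js/common.js'); // hypothetical shared file
});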
Making scripts inline can have some detrimental effects:
a) Code organization: your code gets scattered in between your markup, hurting readability.
b) Code minification and obfuscation become difficult.
It's best to keep your JS in separate files, and then at build time integrate all of them into a single file, and minify and obfuscate that.
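As a sketch of such a build step, in Node.js with made-up file names (in practice you would also pipe the result through a minifier such as UglifyJS):

// concat.js -- naive build step: bundle all JS source files into one.
var fs = require('fs');

var sources = ['plugins.js', 'menu.js', 'gallery.js']; // hypothetical inputs
var bundle = sources.map(function (file) {
    return fs.readFileSync('src/' + file, 'utf8');
}).join(';\n'); // the semicolon guards against files missing a trailing one

fs.writeFileSync('build/site.js', bundle);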
This is not quite true. You can configure the web server (well, at least Apache) to inline the scripts/CSS when they are served.
Here is a useful link
http://www.websiteoptimization.com/speed/tweak/mod_pagespeed/
There are two factors to consider here. One is download time, the other is maintainability. Both of these are impacted by how many times a piece of Javascript is needed.
With respect to download time, you obviously have two choices: include the JS in the body of the page, or as an external file. Including the JS in the body saves an extra HTTP request, although it also bloats the HTML a bit and can be a pain to maintain if you have several scripts inlined on several different pages.
Another important consideration is whether or not the JS is needed immediately on the page. If a small piece of JS is needed as soon as the page loads, then putting it inline may be a good idea. If it's being used for something asynchronous later on, then putting it in an external file may still be a good choice.
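For example (both snippets are hypothetical):

<!-- Needed before first paint: inline it so it runs immediately -->
<script>
document.documentElement.className += ' js'; // let the CSS know JS is available
</script>

<!-- Only needed after user interaction: an external, cacheable file is fine -->
<script src="/js/comments-widget.js"></script>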
I usually write JavaScript inline, especially if the script is this small. I would say just paste it into your code; it won't increase the HTML document size by a lot.
While inlining the script will save a request, as YSlow suggests it increases the HTML document size and mixes content/markup with code/logic, which you generally want to avoid as much as possible.
The reason Yslow gives this caveat:
Only consider this if your property is a common user home page.
Is that if the page is loaded frequently, it's worth having the JavaScript external, since the files will be cached by the browser. So, if you combine your JS into one file, you incur one extra request on the first page load, and on subsequent requests the file is loaded from the cache.
Aaron Peters' talk from last year's Velocity EU gives a good insight into the options and the course you should choose: http://www.slideshare.net/startrender/fast-loading-javascript
For really small snippets of JS, it's really not worth putting them in an external file, as the network overhead of retrieving them will dwarf the benefits.
Depending on the latency, it may even be worth inlining large scripts; e.g. Bing mobile includes loads of JS in the first page loaded, which it then caches in localStorage for later pages.
Addy Osmani recently put together an experimental library to help people play with caching scripts in localStorage: http://addyosmani.github.com/basket.js/
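The underlying trick is simple enough to sketch by hand; the key name and URL below are placeholders, and error handling (quota limits, cache busting) is omitted:

// Fetch a script once, keep its source in localStorage, and run
// the cached copy on later page views without a network request.
function loadCachedScript(key, url) {
    var src = localStorage.getItem(key);
    if (src === null) {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', url, false); // synchronous only to keep the sketch short
        xhr.send(null);
        src = xhr.responseText;
        localStorage.setItem(key, src);
    }
    var script = document.createElement('script');
    script.text = src; // execute via a script element rather than eval()
    document.getElementsByTagName('head')[0].appendChild(script);
}

loadCachedScript('common-js-v1', '/js/common.js'); // hypothetical file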

Speed optimizing a JavaScript function

I have a number of JavaScript functions like the following on my page:
function fun1(){...}
function fun2(){...}
function fun3(){...}
function fun4(){...}
I may use fun1 on one or two pages, but the other functions only on specific pages.
My question is: should I include all the functions in one file like script.js, or include specific functions for specific pages? Which is better for speed optimization?
I guess your question is about optimizing page loading speed.
I would suggest grouping them as much as possible into a single JS file.
Otherwise, you would have to load a lot of small JS files, increasing the page loading time.
Consider minifying your JS files too.
Depends on the size of the functions, your visitors' access patterns, your cache settings, and other circumstances. The speed of downloading a file depends on how many TCP packets the server has to send (packet sizes tend to be around 1.5 KB). Increasing the file size only matters if it means the file needs to be broken into more packets; the client-side delay of processing a script which needs not be run is negligible. So if your scripts are short (you should of course minify them first), it's best to always bundle them. If you expect the average visitor to need all the scripts eventually, it's again best to send them in one file. If, however, the average visitor won't need some of the larger scripts (for example, one part is only needed at upload, and only 0.1% of visitors ever upload anything), it might be better to send those separately, pulling them in only when needed, as sketched below.
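A sketch of that conditional approach, with hypothetical names:

// Only the ~0.1% of visitors who actually upload pay for upload.js.
document.getElementById('upload-button').onclick = function () {
    var s = document.createElement('script');
    s.src = '/js/upload.js'; // large, rarely-needed script (hypothetical)
    s.onload = function () {
        startUpload(); // assumed to be defined by upload.js
    };
    document.body.appendChild(s);
};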
The .js files are cached by your browser, so you can include as many functions as you like in a single file. If you split them into separate files, that many additional calls are made by the browser, which slows down page loading. You can also compress the JS files if you are concerned about the size of the .js file: http://javascriptcompressor.com/
It depends a lot on how your server is sending out these files. If you have Firebug, open up the Net tab and inspect your JS files. If you see a Last-Modified entry in the Headers tab, it means that you are better off putting all your JS into one file. If you don't see it, it's best to split things up into page-specific files.
In my opinion, there are four main methods of speeding up your page-load times:
server headers -- this one is more complex to set up, but if you control your server settings, or if you are willing to serve your JS via a dynamic page (PHP or ASP), you can send extra instructions to the browser to cache specific content for specific periods. Since your JS files are likely to change quite infrequently, it's usually pretty safe to do this for them. You basically just need to set the Expires header to some point well into the future; the browser then will not need to request the file at all if it has it in the cache. This makes the most sense if you have visitors who come back again and again; if you get a lot of one-hit visitors, it won't make a difference. It does mean that if you change these files, many browsers won't pick up the change, so you should either change the file name or append something to the query string, like this: <script type="text/javascript" src="/sitescript.js?v=1.1"></script>. This can be a maintenance problem if you have more than a few static HTML pages.
numbers of files -- in my opinion, this is where you get the biggest bang-for-buck savings. I'm nearly certain that most browsers still support only four active requests at a time. That means that if your web page has five images, the last image won't be requested until one of the previous images completes loading. And if your site has 50 images, 3 CSS files, and 10 JS files, it's going to take a while to clear all those requests. Remember, even if you are sending Last-Modified headers, the browser still needs to check whether the content has changed, so that check occupies one of those request slots. If you can combine all your images into a single image (using CSS sprites) and all your JS into a single file, your pages will load significantly faster.
file size -- as the web speeds up, this gets less and less important. If your server does not support content compression, it's a pretty good idea to minify your JS, though the time savings are overrated in my opinion. This does make maintenance somewhat more time-consuming and live debugging nearly impossible, but it definitely brings file size down quite a bit. If you have a LOT of JavaScript (maybe ~150KB+?) or if you know your visitors are coming from slower networks (for example, people on a corporate network), I would recommend doing it. If your server DOES support compression, the savings are actually negligible.
script placement -- when the browser hits a <script src="..."> tag, it halts all rendering until the script has loaded and executed, which means an inevitable delay. If you put your scripts in the middle of your page, you'll notice that half the page loads and then pauses. To speed up rendering, place as many of your <script> references as you can at the dead bottom of the page. Scripts that you need at the top of the page can go there, but the more <script> clutter you have up there, the slower the page will render. Any code that is executed by onLoad or DOMReady can safely go at the bottom of the page.
Yahoo has a really quite amazing list of optimization tips at their Best Practices page.

Don't external JavaScript files lead to more client-side processing?

I was thinking about external JavaScript files, and it occurred to me that if I group the functions from several HTML pages into one JavaScript file, this will lead to extra client-side processing.
Basically, I would like some idea of whether this is correct.
Here is my thinking. Suppose I have one JavaScript file for five pages. If the user goes to each page, then for each page he has to load not only the JavaScript for that page, but also the JavaScript for the other four pages. The final sum is that the user's browser loaded about five times as much JavaScript as it would have normally.
I think most people group their JavaScript by common functionality. So you can have one JavaScript file shared by several pages, yet you may not use all of its JavaScript on every page. So all the JavaScript you don't use on a given page is loaded/run without need.
I have a sub-question. I know you don't have to re-download the JavaScript file for each page. Is the JavaScript file run each time? Is the JavaScript reloaded? By reloaded, I mean: what kind of overhead is there each time the browser has to get a file out of the cache?
Thanks,
Grae
If I have a file of 200 lines and separate it out into 5 files of 40 lines each, the total number of lines remains at 200. BUT remember that if I pulled files 1-4 on the previous page, I now only need to pull file 5, since 1-4 are in my cache. Additionally, most modern browsers are going to thread those requests, so instead of one large download for a single file, I get 5 threaded downloads of smaller files.
The overhead in the browsers would be pretty browser-specific in how they handle it, and above my head on exact implementation.
the user goes to each page, for each page he has to load not only the JavaScript for that page, but the JavaScript for the other four pages
If caching is set up correctly, the contrary will be true: The file will be loaded only once, at the beginning of the user's visiting the site. The overall amount of data to load will be reduced in most cases.
The JavaScript code for all four pages will be loaded in the browser's memory somehow, maybe even pre-parsed (I don't know the exact specifics of this), but that part of script processing is totally negligible.
It could still be wise to split your JS library into chunks if they are totally separate on each of the four pages and really huge; it will depend on your script's structure. But mostly, having one external file, and therefore one slightly bigger request the first time but none afterwards, is the preferable way.
For your sub-question, take a look at Firebug's "Net" tab. It will show you which resources it loads and from where, and how long it takes to process them.
It's better to pack the JavaScript for all pages into one file. The file will be cached and not downloaded again by the browser on consecutive requests. The reason is that making a web request is far more expensive for your server and the client than it is for the client to parse the JavaScript file.
Browsers are so fast these days that you don't have to worry about the client having to load some extra javascript that might not be used for that specific page.
To make your site fast, you should focus on keeping the amount of requests to an absolute minimum.

What is the real benefit of using external css and js in terms of page loading speed?

What is the real benefit of using external CSS and JS in place of placing the code directly in ... in terms of page loading speed?
What if we are controlling the whole site from one header.php/aspx file? Does the use of external files make page loading faster?
My question is only related to page loading speed.
On a per-request basis (that is, looking at the load performance of just one page), you actually take a small speed hit by separating the files. But you get a performance boost when loading multiple pages that use the same JS or CSS: in those cases, the content of the JS/CSS is only loaded once for all the requests.
Although you didn't ask this as part of your question, it also helps code maintainability. If you embed the CSS/JS in the page and it is used across multiple pages, you have to make every change across all of your pages; with external files you make it once.
External files can be cached, and have to be loaded only once even when referenced from multiple locations. This usually outweighs the performance hit caused by the one additional request when the resource is first loaded.
I know you asked about page load speed, and from that standpoint I'd say the greatest benefit is caching, though I wouldn't break it into multiple external files (because, as the first answer said, each request takes time). But you also get a benefit from an SEO standpoint: crawlers will only index up to a certain point in a page, and keeping JS and CSS out of the top lets them see content higher in the page.
I just found this article where someone ran the tests:
http://articles.sitepoint.com/article/indexing-limits-where-bots-stop
Yes, the CSS and JS files will be cached by the browser, so they only get loaded once.
