Uncompressing content in browser on client side - javascript

I am interested in the possibility of reducing HTTP requests to the server by sending different kinds of content in a single compressed file, then uncompressing it in the client's browser and placing the pieces (images, CSS, JS) where they belong.
I read somewhere that Firefox is working on a plan to offer such a feature in a future release, but it has not been done yet, and it would not be a standard feature anyway.
Can you guys suggest any solution for this? Can Flash be used to uncompress compressed files on the client side for later use?
Thanks

We did more or less what you describe on our site and are extremely happy with the response time.
The original files are all kept separate (HTML, CSS, JS, images) and we develop against them.
Then when moving to production we have a shell script that:
runs YUI Compressor to minify the CSS and JS
reads every image and converts it to a data:image/png;base64,... URI (a sketch of this step follows below)
strips all blank space and comments from the HTML
puts all these resources inline in the HTML
The page is ~300 kB and usually cached. The server gzips it, so the real size travelling over the network is lower. We don't use any additional compression.
Then there is a second call to get the data (JSON for us) and rendering starts client side.
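For illustration, a minimal sketch of the image-inlining step in Node.js (the real script described above is shell-based, and the file names here are hypothetical):

    // build.js - hypothetical sketch of the image-inlining step
    var fs = require('fs');

    var html = fs.readFileSync('index.html', 'utf8');

    // Replace every local PNG reference with an inline data: URI
    html = html.replace(/src="([^"]+\.png)"/g, function (match, file) {
      var base64 = fs.readFileSync(file).toString('base64');
      return 'src="data:image/png;base64,' + base64 + '"';
    });

    fs.writeFileSync('dist/index.html', html);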

I had to read your question a few times before I got what you were asking. It sounds like you want to basically combine all the elements of your site into a single downloadable file.
I'm fairly confident in saying I don't believe this is possible or desirable.
Firstly, you state that you've heard that Firefox may be supporting this. I haven't heard about that, but even if it does, how will you be able to use the feature while still supporting other browsers?
And even if you can do it: you've tagged this 'performance-tuning' on the grounds that you'll be saving a few HTTP requests, but in your effort to save requests you need to be cautious that you don't actually end up slowing things down.
Combining all the files may cut you down to one HTTP request, but your site may then load more slowly, because the whole thing has to download before any of it can be displayed (as opposed to a normal page load, where the full load may take time but at least parts of the page are ready for display quite quickly).
What you can do right now, and what will genuinely reduce HTTP requests, is combine your stylesheets into a single CSS file, your scripts into a single JS file, and groups of related images into single image files (google "CSS sprites" for more info on this technique).
Even then, you need to be careful about which files you combine - the point of the exercise is to reduce HTTP requests, so you also need to take advantage of caching, or you'll end up making things worse rather than better. Browsers can only cache files that are the same across multiple pages, so you should only combine files that won't change between page loads. For example, only combine the JavaScript files which are in use across all the pages on your site (a sketch of such a grouping follows).
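To illustrate that grouping, here is a hypothetical build configuration (all file names invented) that keeps a cache-stable sitewide bundle separate from page-specific scripts:

    // build-config.js - hypothetical: group scripts by cache stability.
    // common.js is identical on every page, so browsers cache it once;
    // page-specific code stays out of it so it never invalidates that cache.
    module.exports = {
      bundles: {
        'common.js':   ['jquery.js', 'nav.js', 'analytics.js'], // used on every page
        'checkout.js': ['cart.js', 'payment.js'],               // checkout pages only
        'gallery.js':  ['lightbox.js', 'slideshow.js']          // gallery pages only
      }
    };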
My final comment would be to re-iterate what I've already said: Be cautious about over-optimising to the point that you actually end up slowing things down.

Related

Does using more external files, as opposed to cramming everything into one file, reduce run-time efficiency?

I am relatively new to web design and the world of jQuery, JavaScript and PHP. I guess this question applies to CSS stylesheets as well. Is it better to have everything stuffed into one "external document"? Or does this not affect run-time speed?
Also, to go along with this: is it wrong, or less efficient, to use PHP in places where jQuery/JavaScript could be used instead? Which of the two languages is generally faster?
The way to look at it is to initially load only the minimum resources needed on page load, not everything. Make sure you group all of these resources together into a single file, and minify them.
Once your page is loaded, you can load other resources on demand. For example, a modal which does not need to be immediately visible can be loaded at a later point in time, when the user does the action that shows it. This is called lazy loading. But when you do load any module on demand, make sure you load all of its resources together and minified as well.
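A minimal sketch of that lazy-loading idea (the element ID, file path and showModal() function are all hypothetical):

    // Load a modal's script only when the user first asks for it
    function loadScript(src, onLoad) {
      var script = document.createElement('script');
      script.src = src;
      script.onload = onLoad;
      document.head.appendChild(script);
    }

    var modalLoaded = false;
    document.getElementById('open-modal').addEventListener('click', function () {
      if (modalLoaded) {
        showModal();                      // assumed to be defined by modal.min.js
        return;
      }
      loadScript('/js/modal.min.js', function () {
        modalLoaded = true;
        showModal();
      });
    });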
It's important to structure your code well and to define how you batch files together for concatenation and minification. It will help performance by optimizing the number of calls made to the server.
About PHP and JavaScript: I would say in general JavaScript is faster than PHP, but it depends on your application, since one runs on the server and the other on the client. If you are doing very heavy, memory-intensive operations, the browser might limit your capabilities. If that is not a problem, go ahead with JavaScript.
There are a lot of different factors that come into play here. Ultimately, calling the fewest resources possible makes the site run faster, and many tools that check page speed will dock points if you call a ton of resources. However, you don't want to go insane condensing things and try to cram everything into a single file either... The best way to approach it is to use as few files as possible while maintaining a logical organization.
For example, maybe you're using a few different JS libraries; merging those all into one file would eventually get confusing and hard to update, so it makes sense to keep them separate. However, you can keep all the custom JS where you call those libraries in one separate file. This can even be applied to images. Let's say you're uploading 5 different social media icons and 5 different hover states for them. Instead of making the site call 10 different files, use a sprite and just call one.
You can also use Google's hosted libraries: https://developers.google.com/speed/libraries/ Many sites use these, so many users already have these resources cached and don't need to freshly load the libraries when visiting your site. It's very helpful for things like jQuery.
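A common, minimal pattern for using a hosted library safely (the local path is hypothetical): include the Google-hosted jQuery with a normal script tag, then check right afterwards whether it actually loaded and fall back to a local copy if not:

    // Runs immediately after the <script> tag for the Google-hosted jQuery;
    // if the CDN copy failed to load, window.jQuery is undefined.
    if (!window.jQuery) {
      document.write('<script src="/js/jquery.min.js"><\/script>'); // hypothetical local copy
    }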
Another thing to keep in mind is minifying those files. Any library you use should have a minified version, and you should use that rather than the full version. While you should keep unminified copies of your work around, whatever ends up on the live site should be minified to help with page speed. Here are a few resources for that: https://cssminifier.com/ https://javascript-minifier.com/ If you're using WP, there are tons of plugins out there with similar functions, like WP Fastest Cache.
Your PHP/JS/jQuery question I can't really weigh in on too heavily. As mentioned, the base difference between PHP and JS is whether the requests are client-side or server-side. Personally, I use whatever is prevalent in the project and whatever works best for the change at hand. For example, if you're working with variables and transferring data, PHP can be a really great choice.

Page speed when appending external JS and CSS files

I would simply like to hear from someone who has documented this well: if I have 10 external JS/CSS files to append to my site, is it better to combine them into only 1 file, or is it fine to have 10 or 20 external source links in a page, from a page-speed point of view?
I also ask this because the Firebug add-ons Google Page Speed and Yahoo YSlow conflict on this point: Google says to separate the files, Yahoo says to combine them all into one :| ... Normally I would trust big G, but who knows :|
One of the key performance enhancements you can make is to reduce the number of HTTP requests. Each external resource means one extra request, so grouping them together will have a positive impact on page performance.
If you want to learn more about front-end performance, check out Steve Souders' books. You can find a simplified overview of the topics in the books on the Yahoo! Developer Network (he was at Yahoo! when he wrote the first book).
First of all, compressing files is (nearly) always a good idea, both using dedicated JS and/or CSS minifiers and using gzip compression at the HTTP level.
Deciding whether to combine files or not is not so easy; you need to juggle different goals:
Minimize the total number of bytes loaded; this includes making sure files can be retrieved from cache
Make sure the files loaded arrive in as few requests as possible.
Combining files optimizes for #2, but can be at the cost of #1. If different pages use different CSS / JS, then every page might get a different combined file (permutation of component files), making caching impossible.
A quick-and-dirty solution is to include all generic JavaScript and all CSS used on all the pages in two single compressed files (one JS, one CSS). If your visitors stay on your site for a longer time they will have the best experience, since all CSS/JS needs to be loaded only once, and that one time is as quick as possible.
If you combine your scripts into one script on one host and this host is slow -- your page will be slow. If you have your scripts broken up into a few files hosted on a CDN with different subdomains for the scripts, your browser will download more of them in parallel. Read up on boosting download times; one study concluded that "boosting parallel downloads can realize up to a 40% improvement in web page latency. You can use two or three hostnames to serve objects from the same server to fool browsers into multithreading more objects." A sketch of that hostname trick follows.
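For illustration (the domains are invented), here is one way to pin each asset to a stable shard so downloads parallelize and caching still works:

    // Spread static assets across a few subdomains pointing at the same server,
    // so the browser opens more parallel connections.
    var SHARDS = ['static1.example.com', 'static2.example.com', 'static3.example.com'];

    function shardUrl(path) {
      var hash = 0;
      for (var i = 0; i < path.length; i++) {
        hash = (hash * 31 + path.charCodeAt(i)) >>> 0; // unsigned 32-bit hash
      }
      // The same path always hashes to the same host, so it stays cacheable
      return 'http://' + SHARDS[hash % SHARDS.length] + path;
    }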
Make sure you have the Google Page Speed, YSlow and Firebug addons installed, then use them. They will help you make your website faster.

Good structuring of JavaScript in a project?

So, I'm mostly done with the JS on my website now. It involves jQuery (and as such a .ready() init).
The page has two parts: the upper is mainly Google Maps and the lower is input forms.
All of this is currently in one .js file: functions, inits, iterations, all of it.
It's well structured and all that; everything is properly done.
My question is however: What is a good structure?
Should I be putting the upper half in one file and the lower in another?
Or should I put all the needed initializations under the .ready() and place all functions in another file?
Or should I keep everything in an ever growing file?
From a performance perspective, supplying all your JS to the browser in a single file, as suggested by the other answers, is sensible. However, having your code built that way is not: each "class" should be in a separate file, splitting things up into entities plus a control file or two to handle the actual page calls - the same as in any other language. These can then be combined into one file for delivery, either in advance or dynamically - preferably minified as well.
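A sketch of such a layout under hypothetical names, matching the question's map-plus-forms page; at release time the files are concatenated and minified into the one file the pages actually reference:

    // map-panel.js - one "entity" per file during development
    function MapPanel(containerId) {
      this.container = document.getElementById(containerId);
    }
    MapPanel.prototype.init = function () {
      // set up the Google Maps widget here
    };

    // form-panel.js
    function FormPanel(containerId) {
      this.container = document.getElementById(containerId);
    }
    FormPanel.prototype.init = function () {
      // wire up the input forms here
    };

    // main.js - the control file: all inits live under .ready()
    $(document).ready(function () {
      new MapPanel('map').init();
      new FormPanel('forms').init();
    });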
Keeping the script calls to a minimum improves load time -- each script include is a round-trip that adds time. So -- personal preference, as long as you don't care about page load times.
Yahoo's Best Practices for Speeding Up Your Web Site starts off with Minimize HTTP Requests:
80% of the end-user response time is spent on the front-end. Most of this time is tied up in downloading all the components in the page: images, stylesheets, scripts, Flash, etc. Reducing the number of components in turn reduces the number of HTTP requests required to render the page. This is the key to faster pages.
One way to reduce the number of components in the page is to simplify the page's design. But is there a way to build pages with richer content while also achieving fast response times? Here are some techniques for reducing the number of HTTP requests, while still supporting rich page designs.
Combined files are a way to reduce the number of HTTP requests by combining all scripts into a single script, and similarly combining all CSS into a single stylesheet. Combining files is more challenging when the scripts and stylesheets vary from page to page, but making this part of your release process improves response times.
Personally, I prefer to keep all code in a single file, so the browser has to fetch only one file and not two/three/four/whatever.
But I think it's up to personal preference.

Javascript errors / bad practice

Is it bad practice to have a single JavaScript file that gets loaded across all pages, even if certain functions are only needed on certain pages? Or should the files be split up according to the functionality on a given page and loaded only by the pages that need them?
According to YSlow, fewer files is better, but try to keep each file under 25 kB. Also make sure you minify or otherwise reduce the size of the JS (and CSS) files. If possible, turn on gzip for JS on the webserver and set a far-future Expires header.
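For illustration only, a minimal Node.js sketch of those last two suggestions (real setups would normally configure gzip and Expires in the web server; the file path is hypothetical):

    var http = require('http');
    var zlib = require('zlib');
    var fs = require('fs');

    // Compress the combined, minified script once at startup
    var gzipped = zlib.gzipSync(fs.readFileSync('dist/app.min.js'));

    http.createServer(function (req, res) {
      if (req.url === '/js/app.min.js') {
        res.writeHead(200, {
          'Content-Type': 'application/javascript',
          'Content-Encoding': 'gzip',
          // Far-future Expires header: one year from now
          'Expires': new Date(Date.now() + 365 * 24 * 60 * 60 * 1000).toUTCString()
        });
        res.end(gzipped);
      } else {
        res.writeHead(404);
        res.end();
      }
    }).listen(8080);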
See here for the Yahoo Developer performance best practices.
If this file is really large, it could impact certain user's perceived performance (i.e. download and render times). IMHO you should split it up into reasonable groups of functions, with each group having similar functions (such that pages only reference the files they need).
Depends on the size and complexity of the unused functions.
The JavaScript parser anyway only stores the location and the signature of each function; as far as I know, a function body is only fully parsed when it is executed.
If traffic is a problem for you, rather include only those you need...
Regards
Since the JS files are cached once they are downloaded, and the JS parser shows no noticeable performance difference between a big JS file (not a HUGE one ;)) and a small one, you should go with the single-file approach.
Also, multiple JS files mean multiple requests, which is known to reduce performance.
You're best off with a single JS file, as browsers will cache it after the first request for it (unless they've turned that off, in which case they deserve what they get). Another thing that will vastly, vastly increase your perceived performance on page load is turning on gzip compression in the web server - most JS files compress very well.
I would recommend using one big file: the browser launches a web request for each file, and most browsers (I'm not quite sure of the limits in the newest versions of the well-known browsers) only launch a few concurrent web requests, waiting until files have been downloaded before launching the next ones.
The one big file will be cached and other pages will load faster.
As @Frozenskys mentions, YSlow states that fewer files is better; one of the major performance enhancements proposed by the Yahoo team is to minimize the number of HTTP requests.
Of course, if you have a HUGE JavaScript file that literally takes seconds to download, it's better to split it up so the user doesn't have to wait seconds before the page loads.
A single file means a single download; as this article explains, most browsers will only allow a limited number of parallel requests to a single domain. Although your single file will be bigger than multiple small ones, as the other answers have pointed out:
The file will be cached
Techniques like minification and server-side gzip compression will help to reduce the download time.
You can also include the script at the end of the page to improve the perceived load time.

Javascript and CSS parsing performance

I am trying to improve the performance of a web application. I have metrics that I can use to optimize the time taken to return the main HTML page, but I'm concerned about the external CSS and JavaScript files that are included from these HTML pages. These are served statically, with HTTP Expires headers, but are shared between all the pages of the application.
I'm concerned that the browser has to parse these CSS and JavaScript files for each page that is displayed and so having all the CSS and JavaScript for the site shared into common files will negatively affect performance. Should I be trying to split out these files so I link from each page to only the CSS and JavaScript needed for that page, or would I get little return for my efforts?
Are there any tools that could help me generate metrics for this?
Context: While it's true that HTTP overhead is more significant than parsing JS and CSS, ignoring the impact of parsing on browser performance (even if you have less than a meg of JS) is a good way to get yourself in trouble.
YSlow, Fiddler, and Firebug are not the best tools to monitor parsing speed. Unless they've been updated very recently, they don't separate the time required to fetch JS over HTTP (or load it from cache) from the time spent parsing the actual JS payload.
Parse speed is slightly difficult to measure, but we've chased this metric a number of times on projects I've worked on, and the impact on page loads was significant even with ~500 kB of JS. Obviously older browsers suffer the most... hopefully Chrome, TraceMonkey and the like will help resolve this situation.
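A crude way to approximate parse cost in the browser (the script path is hypothetical): load the file dynamically and time it; if the file is already in cache, the measured time is dominated by parsing and executing rather than the network:

    function timeScript(src, done) {
      var start = new Date().getTime();
      var script = document.createElement('script');
      script.onload = function () {
        // onload fires only after the script has been parsed AND executed
        done(new Date().getTime() - start);
      };
      script.src = src;
      document.head.appendChild(script);
    }

    timeScript('/js/app.min.js', function (ms) {
      if (window.console) console.log('fetch + parse + execute: ' + ms + 'ms');
    });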
Suggestion: depending on the type of traffic you have at your site, it may be well worth your while to split up your JS payload so that large chunks of JS which will never be used on the most popular pages are never sent down to the client. Of course, this means that when a new client hits a page where that JS is needed, you'll have to send it over the wire.
However, it may well be the case that, say, 50% of your JS is never needed by 80% of your users due to your traffic patterns. If this is so, you should definitely use smaller, packaged JS payloads only on the pages where the JS is necessary. Otherwise 80% of your users will suffer unnecessary JS parsing penalties on every single page load.
Bottom line: it's difficult to find the proper balance between JS caching and smaller, packaged payloads, but depending on your traffic pattern it's definitely worth considering a technique other than smashing all of your JS into every single page load.
I believe YSlow does, but be aware that unless all requests are over a loopback connection you shouldn't worry: the HTTP overhead of split-up files will impact performance far more than parsing, unless your CSS/JS files exceed several megabytes.
To add to kamen's great answer, I would say that in some browsers the parse time for larger JS resources grows non-linearly: a 1 MB JS file will take longer to parse than two 500 kB files. So if a lot of your traffic is people who are likely to have your JS cached (return visitors), and all your JS files are cache-stable, it may make sense to break them up even if you end up loading all of them on every page view.
