Page Speed in appending external js and css files - javascript

I'd simply like to know, from anyone who has documented this well: if I have 10 external JS/CSS files to append to my site, is it better to combine them into a single file, or is it fine to keep 10 or 20 external source links on a page, from a page speed point of view?
I also ask because the Firebug add-ons Google Page Speed and Yahoo YSlow conflict on this: Google says to separate the files, Yahoo says to combine them all into one :| Normally I would trust big G, but who knows :|

One of the key performance enhancements you can make is to reduce the number of HTTP requests. Each external resource means one extra request, so grouping them together will have a positive impact on page performance.
If you want to learn more about front-end performance, check out Steve Souders' books. You can find a simplified overview of the topics in the books on the Yahoo! Developer Network (he was at Yahoo! when he wrote the first book).

First of all, compressing files is (nearly) always a good idea, both with dedicated JS and/or CSS minifiers and with gzip compression at the HTTP level.
Deciding whether to combine files or not is not so easy; you need to juggle different goals:
1. Minimize the total number of bytes loaded; this includes making sure files can be retrieved from cache.
2. Make sure the files loaded arrive in as few requests as possible.
Combining files optimizes for #2, but can come at the cost of #1. If different pages use different CSS/JS, then every page might get a different combined file (a permutation of component files), making caching impossible.
A quick-and-dirty solution is to include all generic JavaScript and all CSS used across the pages in two single compressed files (one JS, one CSS). If your visitors stay on your site for a while, they will get the best experience, since all CSS/JS needs to be loaded only once, and that one load is as quick as possible.
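As a rough sketch, such a single combined, minified file can be produced with a tiny Node.js build step like the one below. The file names are placeholders, and it assumes the terser npm package for minification:

    // build.js - concatenate the site-wide scripts into one minified file.
    // Assumes: npm install terser; the source file names are placeholders.
    const fs = require("fs");
    const { minify } = require("terser");

    const sources = ["jquery.plugins.js", "site.js", "forms.js"];
    const combined = sources
      .map(f => fs.readFileSync(f, "utf8"))
      .join(";\n"); // the ";" guards against missing semicolons between files

    minify(combined).then(result => {
      fs.writeFileSync("all.min.js", result.code);
      console.log("Wrote all.min.js (" + result.code.length + " bytes)");
    });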

If you combine your scripts into one script on one host, and that host is slow, your page will be slow. If you instead break your scripts up into a few files hosted on a CDN under different subdomains, your browser will download more of them in parallel. See this article on boosting download times; its conclusion is that "boosting parallel downloads can realize up to a 40% improvement in web page latency. You can use two or three hostnames to serve objects from the same server to fool browsers into multithreading more objects."
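To illustrate what "two or three hostnames" means in practice, here is a small sketch; the hostnames are placeholders for aliases of the same server, and note that dynamically injected scripts are not guaranteed to execute in order:

    // Spread script downloads across two hostnames pointing at the same
    // server, so HTTP/1.x-era browsers open more parallel connections.
    var files = ["a.js", "b.js", "c.js", "d.js"];
    var hosts = ["http://static1.example.com/js/",
                 "http://static2.example.com/js/"];

    for (var i = 0; i < files.length; i++) {
      var s = document.createElement("script");
      s.src = hosts[i % hosts.length] + files[i];
      document.getElementsByTagName("head")[0].appendChild(s);
    }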

Make sure you have the Google Page Speed, YSlow and Firebug addons installed, then use them. They will help you make your website faster.

Related

Does using more external files, as opposed to cramming everything into one file, reduce run-time efficiency?

I am relatively new to web design and the world of jQuery, JavaScript and PHP. I guess this question applies to CSS style sheets as well. Is it better to have everything stuffed into one "external document"? Or does this not affect run-time speeds?
Also, to go along with this: is it wrong, or less efficient, to use PHP in places where jQuery/JavaScript could be used? Which of the two languages is generally faster?
The way to look at it is to initially load only the minimum resources needed on page load, not everything. Make sure you group all of these resources together into a single file, and minify them.
Once your page is loaded, you can load other resources on demand. For example, a modal that does not need to be immediately visible can be loaded at a later point in time, when the user performs some action and it needs to be shown. This is called lazy loading. But when you do load any module on demand, make sure you load all of its resources together, minified as well.
It's important to structure your code correctly and define the way you batch files together for concatenation and minification. It will help you save on performance by optimizing the number of calls made to the server.
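Here is a minimal sketch of that on-demand pattern; the element id, the file path and the showModal function are all placeholders:

    // Load the modal's script only when the user first asks for it.
    var modalScriptLoaded = false;

    document.getElementById("open-modal").onclick = function () {
      if (modalScriptLoaded) { showModal(); return; }
      var s = document.createElement("script");
      s.src = "/js/modal.min.js"; // minified bundle for the modal only
      s.onload = function () {
        modalScriptLoaded = true;
        showModal(); // assumed to be defined by modal.min.js
      };
      document.body.appendChild(s);
    };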
About PHP and JavaScript: I would say JavaScript is generally faster than PHP, but it depends on your application, as one runs on the server and the other on the client. If you are doing very heavy, memory-intensive operations, the browser might limit your capabilities; if that is not a problem, go ahead with JavaScript.
There are a lot of different factors that come into play here. Ultimately, it is better to call the smallest number of resources possible to make the site run faster. Many sites that check page speed will dock points if you call a ton of resources. However, you don't want to go insane condensing things and try to cram everything into a single file either. The best way to approach it is to use as few files as possible while maintaining a logical organization.
For example, maybe you're using a few different JS libraries; merging those all into one would eventually get confusing and hard to update, so it makes sense to keep them separate. However, you can keep all your custom JS that calls those libraries in one separate file. This can even be applied to images. Let's say you're uploading 5 different social media icons and 5 different hover states for them. Instead of making the site call 10 different files, use a sprite and call just one.
You can also use Google's hosted libraries: https://developers.google.com/speed/libraries/ Many sites use these, so many users already have the resources cached, which means they don't need to freshly load the libraries when visiting your site. It's very helpful for things like jQuery.
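The usual pattern is to reference the library on Google's CDN from a script tag, then add an inline script right after it that falls back to a local copy if the CDN request failed; the local path here is a placeholder:

    // Runs immediately after the CDN script tag for jQuery. If the CDN
    // request failed, window.jQuery is undefined and we load a local copy.
    if (!window.jQuery) {
      document.write('<script src="/js/jquery.min.js"><\/script>');
    }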
Another thing to keep in mind is minifying those files. Any library you use should have a minified version, and you should use that as opposed to the full version. While you should keep unminified copies of your work around, whatever ends up on the live site should be minified to help with page speed. Here are a few resources for that: https://cssminifier.com/ https://javascript-minifier.com/ If you're using WP, there are tons of plugins with similar functions, like WP Fastest Cache.
Your PHP/JS/jQuery question I can't really weigh in on too heavily. As mentioned, the base difference between PHP and JS is whether the requests are handled server-side or client-side. Personally, I use whatever is prevalent in the project and whatever works best for the changes at hand. For example, if you're working with variables and transferring data, PHP can be a really great option.

Aggressive Caching vs Combining of Javascript files for an Intranet Site

I'm working on improving the page performance of my company's intranet page. We're looking to (dynamically) combine our javascript files as well as cache them for 30+ days. The page launches on login for everyone.
One of my coworkers asked if it's worth the time to combine the JS files if we're already caching them for a month. He's hesitant to do both because the combining tool is server side and doesn't run on our desktop, requiring a somewhat hacky workaround.
I've been doing some research and most of the recommendations for performance I've seen are for external sites. Since we're in a closed system it would seem like we wouldn't get much benefit from combining the files once everyone's cache is primed. What would combining the files buy us that aggressive caching wouldn't?
We're on IE8 if that makes any difference.
The most notable impact with having multiple JavaScript files is the time required to render the page. Each script tag is processed separately and adds time to the overall render process.
A pretty good answer can be found in this question on multiple versus single script tags.
If we are talking about a large number of scripts then you may see an improvement in render time; if it is just two or three files then it likely won't bring about a noticeable difference once the files have been cached.
I would suggest testing the page render time in both cases and see how much improvement you see in your case and decide based on that information.
As a useful example, here are some stats from Xpedite (a runtime minification tool I created a while back); note the difference in time from load to ready for combined versus uncombined scripts.
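If you want to test it on your own page, a crude harness like the following works even in IE8 (Date marks rather than the Navigation Timing API); where you send the resulting number is up to you:

    // In an inline script as early in <head> as possible:
    var pageStart = new Date().getTime();

    // In a script at the end of the page:
    window.onload = function () {
      var elapsed = new Date().getTime() - pageStart;
      // IE8 only defines console while the developer tools are open.
      if (window.console) console.log("start-to-onload: " + elapsed + "ms");
    };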
Combine all your JavaScript files into a single big file (thus minimizing the number of requests made to the server), and name it something like "application_234234.js"; the number represents the last time you changed the file and helps the browser recognize it as a new file (so no stale cache when you change it).
Add an Expires or a Cache-Control Header (set it really far into the future). Since the file name will change each time you'll modify it, you don't have to worry.
Last but not least, compress (minify) and gzip the JavaScript file.
This is important advice, but you can learn more about best practices at: http://developer.yahoo.com/performance/rules.html
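For what it's worth, here is a minimal sketch of the caching and gzip steps in a Node/Express server; it assumes the express and compression npm packages, and the paths are placeholders:

    // server.js - serve the versioned bundle with far-future caching + gzip.
    // Assumes: npm install express compression
    const express = require("express");
    const compression = require("compression");

    const app = express();
    app.use(compression()); // gzip responses, including the JS bundle

    // Safe to cache for a year: the file name (application_234234.js)
    // changes whenever the content changes, so clients never see stale code.
    app.use("/js", express.static("public/js", {
      maxAge: "365d",
      immutable: true
    }));

    app.listen(8080);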

How to optimize one large compressed JS file for different pages? (code not needed everywhere)

I have one JS-file that is merged and compressed from several separate files.
So when we start our app, there is only one http request for the JS-file which makes loading it really fast for all pages.
But a lot of the code is jQuery inside document.ready(). It binds events and other code to elements on certain pages, but not all of the code is needed on all pages.
Now I'm wondering: is this heavy on the browser? Or is there a way to make the code specific to the pages it targets, without having to split the files up again?
Your initial approach is sound. Concatenating all JS and limiting the number of server requests is a great way to optimize for front end speed.
Your next step in terms of JS performance would be:
How big is your concatenated script file? If it's relatively small, don't bother with any further optimizations. If it's huge, involving many jQuery plugins and page-specific handlers, read on:
Some of the heaviest scripts out there are frameworks like jQuery, jQuery.ui, YUI, Prototype, etc. These can all be loaded from high-speed CDNs like Google CDN. This will keep your local script's file size down AND (usually) speed up the browser's fetching of the framework.
Analyze the way your users navigate your site; are there any script-heavy pages or sections which are only visited by a small fraction of users? If so, moving the page-/section-specific code into a separate script just for those pages/sections can speed up the other pages.
Look into the visitor profile of your site. How 'sticky' is your site? You want to look at the ratio of new visitors to returning visitors. If you have a lot of new visitors, but not a lot of returning visitors (this is the case for most small-to-medium web sites), it makes more sense to optimize for the first pageview (splitting resources into section-specific parts). If your site has a lot of returning traffic, this is less important.
Remember to lazyload as many resources as you possibly can, including nonessential Javascript.
One of the ways concatenated Javascripts tend to grow quickly is excessive use of jQuery plugins. In a lot of cases, the same feature or effect can be achieved with a lot less code if you create a custom function or strip down an existing plugin to the very minimal feature set.
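On the second part of the question (running only page-relevant code without splitting the file back up), one common pattern is to key the document.ready work off a page identifier. A sketch, where the data-page attribute and the init names are placeholders, and jQuery is assumed since the code already uses document.ready():

    // Each page's <body> carries an identifier, e.g. <body data-page="checkout">.
    // The combined file registers one initialiser per page and runs only
    // the matching one, so unused bindings never execute.
    var pageInit = {
      home:     function () { /* bind home page widgets */ },
      products: function () { /* bind product list handlers */ },
      checkout: function () { /* bind checkout validation */ }
    };

    $(document).ready(function () {
      var page = document.body.getAttribute("data-page");
      if (pageInit[page]) pageInit[page]();
    });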

Uncompressing content in browser on client side

I am interested in the possibility of reducing HTTP requests to servers by sending different kinds of content in a single compressed file, then uncompressing it in the client's browser and placing the contents (images, CSS, JS) where they belong.
I read somewhere that Firefox is planning such a feature for a future release, but it has not been done yet, and it would not be standard anyway.
Can you suggest any solution for this? Can Flash be used to uncompress compressed files on the client side for later use?
Thanks
We did more or less what you describe on our site and are extremely happy with the response time.
The original files are all separated (HTML, CSS, JS, images) and we develop on them.
Then when moving to production we have a shell script that:
uses YUI Compressor to compress the CSS and JS
reads all images and converts them to data:image/png;base64,... URIs
strips all blank spaces and comments from the HTML
puts all of these resources inline in the HTML
The page is ~300 KB and usually cached. The server gzips it, so the real size travelling over the network is lower. We don't use any additional compression.
Then there is a second call to get the data (JSON for us) and start rendering it client-side.
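The original script was shell-based around YUI Compressor; purely as an illustration of the image-inlining step, a rough Node.js equivalent might look like this (the paths and the regex are placeholders):

    // inline-images.js - replace PNG references in the HTML with data: URIs
    // so the page needs no separate image requests.
    const fs = require("fs");

    let html = fs.readFileSync("index.html", "utf8");

    html = html.replace(/src="([^"]+\.png)"/g, (match, path) => {
      const b64 = fs.readFileSync(path).toString("base64");
      return 'src="data:image/png;base64,' + b64 + '"';
    });

    fs.writeFileSync("index.build.html", html);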
I had to read your question a few times before I got what you were asking. It sounds like you want to basically combine all the elements of your site into a single downloadable file.
I'm fairly confident in saying I don't believe this is possible or desirable.
Firstly, you state that you've heard that Firefox may be supporting this. I haven't heard about that, but even if they do, how will you be able to use the feature while still supporting other browsers?
But even if you can do it, you've tagged this as 'performance-tuning', on the grounds that you'll be saving a few http requests. But in your effort to save http requests to speed things up, you need to be cautious that you don't actually end up slowing things down.
Combining all the files may cut you down to one http request, but your site may then load slower as the whole thing would need to load before any of it would be ready for display (as opposed to a normal page load where your page load may take time but at least some of it may be ready for display quite quickly).
What you can do right now, and which will be useful for reducing http requests, is combine your stylesheets into a single CSS, your scripts into a single JS file, and groups of related images into single image files (google CSS Sprites for more info on this technique).
Even then, you need to be careful about which files you combine; the point of the exercise is to reduce HTTP requests, so you need to take advantage of caching, or you'll end up making things worse rather than better. Browsers can only cache files that are the same across multiple pages, so you should only combine files that won't change between page loads. For example, only combine the JavaScript files which are in use across all the pages on your site.
My final comment would be to re-iterate what I've already said: Be cautious about over-optimising to the point that you actually end up slowing things down.

Javascript errors / bad practice

Is it bad practice to have a single JavaScript file that gets loaded across all pages, even if certain functions are only needed on certain pages? Or should the files be split up according to functionality on a given page and loaded only by pages that need them?
According to YSlow, fewer files are better, but try to keep each file under 25 KB. Also make sure you minify or otherwise reduce the size of the JS (and CSS) files. If possible, turn on gzip for JS on the web server and set a far-future Expires header.
See here for the Yahoo Developer performance best practices.
If this file is really large, it could impact some users' perceived performance (i.e. download and render times). IMHO you should split it up into reasonable groups of functions, with each group containing similar functions (so that pages only reference the files they need).
It depends on the size and complexity of the unused functions.
The JavaScript parser initially only stores the location and signature of each function; as far as I know, a function is only fully parsed when it is executed.
If traffic is a problem for you, rather include only the functions you need...
Regards
Since JS files are cached once they are downloaded, and the JS parser shows no noticeable performance difference between a big JS file (not a HUGE one ;)) and a small one, you should go with the single-file approach.
Also, it is known that multiple JS files reduce performance.
You're best off with a single JS file, as browsers will cache it after the first request for it (unless they've turned that off, in which case they deserve what they get). Another thing that will vastly increase your perceived performance on page load is turning on gzip compression in the web server - most JS files compress very well.
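For illustration, this is roughly what gzip at the HTTP level looks like in a bare Node.js server; a sketch only (a real server would negotiate Accept-Encoding more carefully), and the file name is a placeholder:

    // Serve the JS bundle gzipped when the client advertises support.
    const http = require("http");
    const fs = require("fs");
    const zlib = require("zlib");

    http.createServer((req, res) => {
      const raw = fs.createReadStream("all.min.js");
      if (/\bgzip\b/.test(req.headers["accept-encoding"] || "")) {
        res.writeHead(200, { "Content-Type": "application/javascript",
                             "Content-Encoding": "gzip" });
        raw.pipe(zlib.createGzip()).pipe(res);
      } else {
        res.writeHead(200, { "Content-Type": "application/javascript" });
        raw.pipe(res);
      }
    }).listen(8080);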
I would recommend using one big file: for each file, the browser launches a web request. Most browsers (I'm not quite sure about the limits in the newest versions) only launch a few concurrent web requests, and the browser will wait until earlier files have downloaded before launching the next requests.
The one big file will be cached and other pages will load faster.
As @Frozenskys mentions, YSlow states that fewer files are better; one of the major performance enhancements proposed by the Yahoo team is to minimize the number of HTTP requests.
Of course, if you have a HUGE JavaScript file that literally takes seconds to download, it's better to split it up so the user doesn't have to wait seconds before the page loads.
A single file means a single download; as this article explains, most browsers will only allow a limited number of parallel requests to a single domain. Although your single file will be bigger than multiple small ones, as the other answers have pointed out:
The file will be cached
Techniques like minification and server-side gzip compression will help to reduce the download time.
You can also include the script at the end of the page to improve the perceived load time.
