How much time does calling CDN jquery add? - javascript

I'm wondering if anyone has any insight into how much time per page load is added by calling out to:
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.1.0/jquery.min.js"></script>
Specifically, I want to know how much time is added the second time someone visits the same page by the call to the Google-hosted jQuery, as opposed to:
Not loading jQuery at all
Loading jQuery from the same server that hosts the HTML page
Loading jQuery from a locally stored file (if the HTML page is itself loaded from a local file, such as in a Chrome extension).
So, reading between the lines of my question, what I really want to know is whether calling out to the CDN-hosted jQuery is faster or slower than loading the script locally.
I've always heard that the CDN jQuery is fast because it is cached. My question is aimed at understanding how this caching actually works.
Edit in response to downvotes:
I am interested in this answer regardless of whether it has any noticeable or "practical" significance. I am trying to develop a better mental model of how caching works in this context, along with how the browser loads and parses locally hosted JavaScript.

The second time someone visits your page the file will be cached, so in your scenarios the time added would be roughly:
A couple of milliseconds
Same
Same
The nice thing about CDNs is that if you use a popular one, the user will likely already have the file cached, meaning that their first page load will also be faster. The CDN is also likely to have a server closer to the user.
TL;DR
Use the CDN.
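If you do use the CDN, a common defensive pattern (not part of this answer, but widely used, e.g. by HTML5 Boilerplate) is to fall back to a local copy when the CDN is unreachable. A minimal sketch, where the local path js/jquery.min.js is hypothetical:
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.1.0/jquery.min.js"></script>
<script>
  // If the CDN request failed, window.jQuery is undefined, so document.write
  // a tag pointing at a locally hosted copy (hypothetical path).
  window.jQuery || document.write('<script src="js/jquery.min.js"><\/script>');
</script>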

That all just depends on how fast the server hosting your HTML is. Yes, CDNs are pretty fast. But as you indicated, once the browser has cached the resource (the first time a user visits your page), it will be loading it from the cache anyway. In addition, jQuery is small. So small, in fact, that most hosting platforms will yield similar results.

Considering how heavily the file is cached by CDNs, delivery will be consistently fast.
These are download timings taken from a metro area:
time_namelookup: 0.005
time_connect: 0.042
time_appconnect: 0.203
time_pretransfer: 0.203
time_redirect: 0.000
time_starttransfer: 0.216
----------
time_total: 0.248 seconds
size_download: 86351 Bytes
speed_download: 347767.000 B/sec
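If you want to measure this from inside a page rather than from the command line, the Resource Timing API exposes per-resource timings. A minimal sketch; note that for a cross-origin resource the size fields are only populated when the CDN sends a Timing-Allow-Origin header, otherwise they read as 0:
// Log how long the jQuery request took and whether it came over the wire.
var url = 'https://ajax.googleapis.com/ajax/libs/jquery/3.1.0/jquery.min.js';
var entry = performance.getEntriesByName(url)[0];
if (entry) {
  console.log('duration (ms):', entry.duration);
  // For a completed request, a transferSize of 0 usually indicates it was
  // served from the HTTP cache rather than the network.
  console.log('bytes transferred:', entry.transferSize);
}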

Related

Multiple files on CDN vs. one file locally

My website uses about 10 third-party JavaScript libraries (jQuery, jQuery UI, prefixfree, a few jQuery plugins) and also my own JavaScript code. Currently I pull the external libraries from CDNs like Google's CDN and Cloudflare's. I was wondering which is the better approach:
Pulling the external libraries from CDNs (like I do today).
Combining all the files into a single JS and a single CSS file and storing them locally.
Any opinions are welcome as long as they are explained.
Thanks :)
The value of a CDN lies in the likelihood of the user having already visited another site that requests the same file from that CDN, and it grows with the size of the file. That likelihood in turn increases with the ubiquity of the file being requested and the popularity of the CDN.
With this in mind, pulling a relatively large and popular file from a popular CDN makes absolute sense. jQuery, and, to a lesser degree, jQuery UI, fit this bill.
Meanwhile, concatenating files makes sense for smaller files which are not likely to change much — your commonly used plugins will fit this bill, but your core application-specific code probably doesn't: it might change from week to week, and if you're concatenating it with all your other files, you'd have to force the user to download everything all over again.
The HTML5 boilerplate does a pretty good job of providing a generic solution for this:
Modernizr is loaded locally in the head: it's very small and differs quite a lot from instance to instance, so it doesn't make sense to source it from a CDN, and it won't hurt the user too much to load it from your server. It's put in the head because CSS may be making use of it, so you want its effects to be known before the body renders. Everything else goes at the bottom, to stop your heavier scripts blocking rendering while they load and execute.
jQuery comes from the CDN, since almost everyone uses it and it's quite heavy. The user will probably already have it cached before they visit your site, in which case it loads from cache instantly.
All your smaller third-party dependencies and code snippets that aren't likely to change much get concatenated into a plugins.js file loaded from your own server. This gets cached with a distant expiry header the first time the user visits and is loaded from cache on subsequent visits.
Your core code goes in main.js, with a closer expiry header to account for the fact that your application logic may change from week to week or month to month. This way, when you've fixed a bug or introduced new functionality and the user visits a fortnight from now, main.js is fetched fresh while all the content above is brought in from cache.
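A minimal sketch of that layout (the file names follow the boilerplate's conventions and are otherwise hypothetical):
<head>
  ...
  <script src="js/modernizr.min.js"></script> <!-- small and page-specific: serve locally, in the head -->
</head>
<body>
  ...
  <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.1.0/jquery.min.js"></script> <!-- big and ubiquitous: CDN -->
  <script src="js/plugins.js"></script> <!-- concatenated, rarely-changing third-party code, far-future expiry -->
  <script src="js/main.js"></script> <!-- your application logic, shorter expiry -->
</body>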
For your other major libraries, you should look at them individually and ask yourself whether they should follow jQuery's lead, be loaded individually from your own server, or get concatenated. An example of how you might come to those decisions:
Angular is incredibly popular, and very large. Get it from the CDN.
Twitter Bootstrap is on a similar level of popularity, but you've got a relatively slim selection of its components, and if the user doesn't already have it, it might not be worth getting them to download the full thing. Having said that, the way it fits into the rest of your code is pretty intrinsic, and you're not likely to be changing it without rebuilding the whole site — so you may want to keep it hosted locally but keep its files separate from your main plugins.js. This way you can always update your plugins.js with Bootstrap extensions without forcing the user to download all of Bootstrap core.
But there's no imperative — your mileage may vary.

Is using inline JavaScript preferred to an external include if the script is really short?

I use external JavaScript files on a website, as I always try to keep JavaScript external and at the bottom of the page.
But Google PageSpeed gives this suggestion:
The following external resources have small response bodies. Inlining the response in HTML can reduce blocking of page rendering.
http://websiteurl/ should inline the following small resources:
http://websiteurl/script.js
This external JS file has only this content:
$(document).ready(function() {
    $("#various2").fancybox({
        'width': 485,
        'height': 691
    });
});
But in YSlow I get this suggestion:
Grade n/a on Make JavaScript and CSS external
Only consider this if your property is a common user home page.
There are a total of 3 inline scripts
JavaScript and CSS that are inlined in HTML documents get downloaded each time the HTML document is requested. This reduces the number of HTTP requests but increases the HTML document size. On the other hand, if the JavaScript and CSS are in external files cached by the browser, the HTML document size is reduced without increasing the number of HTTP requests.
Which is right, Google or Yahoo?
This is a bit of a problematic example, on quite a few fronts.
You can organise your scripts in such a way that you do not need to inline that JS. For example, you could have a common.js file that runs that snippet and other similar snippets, simplifying your code (a sketch follows).
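For instance, a hypothetical common.js gathering such small ready-handlers into one external, cacheable file:
// common.js: one cached file holding the page's small snippets.
$(document).ready(function() {
    // Only wire up widgets that actually exist on the current page.
    if ($("#various2").length) {
        $("#various2").fancybox({ 'width': 485, 'height': 691 });
    }
});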
Additionally, this seems to have awoken the "never inline any JavaScript EVER" architecture police. It turns out that sometimes inlining JavaScript is a best practice; for example, look at the common snippet from Google Analytics.
Why are Google suggesting you should inline this tiny script?
Because 20% of the page visits you get have an unprimed cache
If you have a cache miss, it is likely a new connection to your site will need to be opened (1 round trip) and then the data delivered in the 2nd round trip (if you are lucky, you get to use a keepalive connection and it is cut to 1 round trip).
For a general "global" English web application you are looking at a typical 110ms round trip time for a service hosted in the US. If you are using a CDN the number would probably be halved.
Even if the resource is local, the web browser may still need to access the disk to grab that tiny file.
Script tags without async or defer are blocking: if such a script sits somewhere in the middle of your page, the page is stuck there until the script downloads and executes.
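For reference, a sketch of the three variants (app.js is a placeholder name):
<script src="app.js"></script>        <!-- blocking: parsing halts until fetched and run -->
<script src="app.js" defer></script>  <!-- fetched in parallel, runs after parsing finishes -->
<script src="app.js" async></script>  <!-- fetched in parallel, runs as soon as it arrives -->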
From a performance perspective if the only 2 options are:
Place a 50 char JavaScript bit inline
Place the 50 chars in a separate file and serve it.
Considering that you are a good web citizen and compress all your content, the amount of additional payload this adds is negligible compared to the 20 percent risk of giving people a considerable delay. I would always choose #1.
In an imperfect world it is very rare to have such a clear and easy set of options. There is an option 3 that involves async-loading jQuery and pulling this functionality in from a common area.
Making scripts inline can have some detrimental effects:
a) Code organization: your code gets scattered in between your markup, which hurts readability.
b) Code minification and obfuscation become difficult.
It's best to keep your JS in separate files, then at build time concatenate them into a single file, and minify and obfuscate that.
This is not quite true. You can configure the web server (well, at least Apache) to inline the scripts/CSS when they are served.
Here is a useful link
http://www.websiteoptimization.com/speed/tweak/mod_pagespeed/
There are two factors to consider here. One is download time, the other is maintainability. Both of these are impacted by how many times a piece of Javascript is needed.
With respect to download time, you obviously have two choices: include the JS in the body of the page, or as an external file. Including the JS in the body does save an extra HTTP request, although it also bloats the HTML a bit and can be a pain to maintain if you have several scripts you're putting inline on several different pages.
Another important consideration is whether or not the JS is needed immediately on the page. If a small piece of JS is needed as soon as the page loads, then putting it inline may be a good idea. If it's being used for something asynchronous later on, then putting it in an external file may still be a good choice.
I usually write JavaScript inline, especially if the script is this small. I would say just paste it into your code; it won't increase the HTML document size by much.
While inlining the script will save a request, as YSlow suggests it increases the HTML document size and mixes content/markup with code/logic, which you generally want to avoid as much as possible.
The reason Yslow gives this caveat:
Only consider this if your property is a common user home page.
Is that if the page is loaded frequently, it's worth having the JavaScript external, since the files will be cached in the browser. So if you combine your JS into one file, you incur one extra request on the first visit, and on subsequent visits the file is loaded from the cache.
Aaron Peters' talk from last year's Velocity EU gives a good insight into the options and which course you should choose - http://www.slideshare.net/startrender/fast-loading-javascript
For really small snippets of JS it's not worth putting them in an external file, as the network overhead of retrieving them will dwarf the benefits.
Depending on the latency it may even be worth inlining large scripts, e.g. Bing mobile includes loads of JS in the first page loaded, which it then caches in localStorage for later pages.
Addy Osmani recently put together an experimental library to help people play with caching scripts in localStorage - http://addyosmani.github.com/basket.js/
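The localStorage idea can be sketched in a few lines. This is only the general shape of the technique, not basket.js's actual API, and it assumes the script URL is same-origin (or CORS-enabled) so XHR can read its text:
// Load a script, caching its source text in localStorage for later visits.
function loadScriptCached(url) {
    var key = 'script:' + url;
    var cached = localStorage.getItem(key);
    if (cached !== null) {
        runScript(cached);              // cache hit: no network round trip
        return;
    }
    var xhr = new XMLHttpRequest();     // cache miss: fetch once, then store
    xhr.open('GET', url);
    xhr.onload = function() {
        if (xhr.status === 200) {
            localStorage.setItem(key, xhr.responseText);
            runScript(xhr.responseText);
        }
    };
    xhr.send();
}
function runScript(source) {
    var s = document.createElement('script');
    s.text = source;                    // inline the source so it executes now
    document.head.appendChild(s);
}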

extjs, is it possible to compress load ext-all.js?

I have a website that uses the ExtJS library. I only need the grid, ajax and tree components.
My project is used nationally, and to avoid problems due to low bandwidth in some regions, I must make it as light as possible.
When I use the developer tools in Chrome, my site is too heavy, especially when loading ext-all.js. It takes 3.9 minutes to load (at 512 kbps), even when I remove my own images and CSS from the website.
Is there a way to compress it? Or to load just the tree, grid and ajax components?
I was googling and found this:
<script type="text/javascript" src="js/ext-all.js?compression=gzip"></script>
but it didn't help much.
This is the page that shows you how to build custom versions of ext-js. http://www.sencha.com/learn/Tutorial:Building_Ext_From_Source
They had a link to an online builder that would customize the download, but it's been taken down. The page mentioned still points to good resources like JsBuilder, the tool they use to generate ext-all.js and the other packages in the distribution. Just open the ext.jsb file to see how it works.
You'll need to figure out the dependencies on your own, though. Good luck!
I'd estimate that at 512 kbps, the ExtJS download should take around 30 seconds to 1 minute.
If you're looking at a 4 minute load time, your time is probably spent somewhere other than the download of the library. Are you sure it's the size of your download, or even extjs that is the problem? Could it be that your webserver is under heavy load, or that you're dealing with a latency issue?
As far as reducing the size of the library goes, there isn't much more you can do. The library is provided in minified format, and stripping it further is not recommended. Zipping it up only means you'll have to unzip it at the other end once downloaded, and doesn't buy you that much load time with a file that's already minified.

How can I optimize my web-page (which is quite large)?

I'm working on a web application.
The application is running fine, but the first time I open it in the browser it shows a blank page; I have to hit refresh three or four times to load the page completely and correctly.
I think my application is too heavy to load. However, once it is loaded, it's good to go.
I have about 5 JavaScript files, around 1.3 MB in size, and also some UI components.
Is there a way to control this so that when I load the application it returns the entire application without hitting refresh again and again? Is there a way to optimize this page?
Thank you in advance.
Edit: is there a way to automatically reload the page if it didn't load the first time?
Check whether you can optimize your JavaScript code. Do you need all the functions defined in those 5 JavaScript files? If not, you can split them and load each part only on the pages that need that functionality.
Try to find out which part of the code is making it slow.
1.3 MB of JavaScript is too much. Try compressing it:
http://jscompress.com/
After compression, try delay-loading the JavaScript files wherever possible (a sketch of the technique appears after the links below):
http://www.websiteoptimization.com/speed/tweak/defer/
Run the YSlow add-on to gather more information about possible optimizations:
http://developer.yahoo.com/yslow/
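The delay-loading technique from the link above boils down to injecting non-critical scripts only after the page has finished loading, so they never block the initial render. A minimal sketch, where deferred.js is a placeholder for one of your non-critical files:
// Inject a script after window load so it cannot block initial rendering.
window.onload = function() {
    var script = document.createElement('script');
    script.src = 'deferred.js';   // placeholder: your non-critical script
    document.body.appendChild(script);
};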
The easiest method is to run YSlow in Firebug on Firefox.
You should also compress your JavaScript files using YUI Compressor.
Have you minified your JavaScript? This makes it more difficult for humans to understand but can significantly reduce the file size. If one of those scripts is jQuery, you might consider referencing the copy hosted by Google rather than hosting it on your own server. Google's server is probably faster than yours, and a lot of users will already have a cached copy of jQuery from Google.
If the site is image-heavy and uses PNGs, you might consider stripping some data from them to make them smaller, using tools like pngcrush.
As mentioned by a few others, running the page through YSlow is very likely to help find issues that could cause slow performance.

Javascript file inclusion in html pages- what happens underneath in the browser?

I think this may be a browser-dependent question. Suppose I have 10 JavaScript files and several HTML pages. Suppose HTML pageA needs only JS1.js and JS3.js, and similarly HTML pageB needs JS4.js and JS1.js. I want to know what the effect of including all 10 JavaScript files in all HTML pages would be. Will it directly relate to the browser's memory consumption?
I am facing this problem particularly with the YUI JavaScript library. There are several components like datatable, event, container, calendar, dom-event etc. The order in which they are included also seems to matter a lot; for example, the dom-event JS should be included before the rest for them to work. So to avoid all this confusion, I thought of including all these JS files in a header file that gets included in all HTML pages.
The thing that I am worried about is the memory bloat and performance problems that it may cause. Please provide your suggestions on the same..
Thanks,
-Keshav
Any script you load into your page, even once downloaded and cached, must still be parsed before the rest of the page can load. So in that sense there is a memory penalty, and there's still a potential for something in the script to significantly delay rendering.
However, in the case of a conscientiously designed library such as YUI I would expect the parsing time to be minimised.
If you can load all your scripts in at the end of the page, that can vastly improve performance as the entire page can render before being blocked by javascript execution, and your site will feel a lot snappier.
I would suggest investigating the Firebug Net panel and the YSlow extension to get specific performance stats for your website.
External scripts delay the display of the following html until they have loaded and executed. The impact is much less after the first page load, since they're already cached, though browsers will occasionally check for new versions, which still carries a delay. I try to limit the number of scripts and move the script tags to the bottom of the page when possible. Users won't notice the script loading delay if the page has already fully displayed.
If a given script does nothing, it will not affect the performance.
Obviously the first page will load slowly, but the rest will not need to load all the scripts again because they will be cached. So the subsequent pages will load faster.
Tips:
1) Load the script at the bottom of the page (just before the closing BODY tag).
2) Use a non-blocking way of loading the scripts. This is the one I'm using:
<script type="text/javascript">
function AttachScript(src) {
    // Create a script element, point it at src, and append it to the body
    // so it loads without blocking the HTML parser. Note that scripts
    // injected this way may execute out of order in some browsers.
    var script = document.createElement("script");
    script.type = "text/javascript";
    script.src = src;
    document.getElementsByTagName("body")[0].appendChild(script);
}
AttachScript("/js/jquery.min.js");
AttachScript("/js/ndr.js");
AttachScript("/js/shadowbox.js");
AttachScript("/js/libraries/sizzle/sizzle.js");
AttachScript("/js/languages/shadowbox-es.js");
AttachScript("/js/players/shadowbox-img.js");
AttachScript("/js/adapters/shadowbox-jquery.js");
</script>
Can't find the source web page though :-(
Memory Consumption:
Assuming the scripts are well written, memory consumption and performance issues should be nominal. Your biggest problem with including all scripts at once will be the latency in the user experience the first time through (or when you make changes), because the browser will have to download all of them in one hit. I think you should only include the scripts you need per page, not all scripts at once.
You can assess the impact yourself using simple tools like Task Manager's process list in Windows to monitor memory/processor usage, or plug-ins like Firebug for Firefox.
You can also look into something called minification to help make your script files as small as possible.
Dependencies:
The order in which you include the scripts is important, as some scripts may depend on functionality in other scripts. If the code in one script attempts to run and it requires code in another script that has not been downloaded yet, it will fail. My advice would be to actually understand those dependencies in your script files rather than just downloading everything at once because it seems easier.
Use the YUI Configurator to help determine the required file includes and order, as well as how to use the Yahoo! CDN combo service to combine all YUI files into a single script tag.
http://developer.yahoo.com/yui/articles/hosting/
External assets to the HTML page are typically cached by the browser. External assets are anything requested from the HTML, such as images, CSS, JavaScript, and anything else. So if you load all 10 script files up front, you are forcing a one-time massive download hit on your user. After this one time the user does not need to download the scripts again unless the files' modification timestamps change.
Your page will only use what it requires. If a particular page requests js4.js and js5.js, then all the functions in those files will be loaded into the interpreter, ordered first by when they are requested from the HTML and second by the order in which they appear in each of those files. If there are any namespace conflicts, whatever is loaded into the interpreter last wins. The interpreter will clear out the functions once the page is unloaded from the browser.
For efficiency, I would suggest using a server-side inclusion process to read each of the JS files and combine their contents into a single new JS file (sketched below). This will reduce the number of HTTP requests to the server and save your users a good deal of bandwidth spent on HTTP headers and GET requests. Also, put the request for this one script file directly before the closing body tag of your HTML: downloading scripts blocks parallel downloads in IE, so you want to load scripts at the lowest possible point in the page.
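As a sketch of that inclusion step, here is a hypothetical Node build script that concatenates the files into one bundle (file names taken from the question):
// build.js: concatenate a page's scripts into a single cacheable file.
// The ';' separator keeps one file's trailing expression from swallowing the next.
var fs = require('fs');
var files = ['JS1.js', 'JS3.js'];   // the per-page list from the question
var bundle = files.map(function(f) {
    return fs.readFileSync(f, 'utf8');
}).join('\n;\n');
fs.writeFileSync('bundle.js', bundle);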
Scriptaculous implements a nice way to handle js dependencies. Guess you could check it out and "re-implement" it. ;D
As for memory bloat and performance issues: as long as your JS doesn't leak a lot (YUI probably doesn't), memory won't be much of a problem, although loading everything will make your pages slower, especially if it's loaded in the header.
You can read up on caching methods that use PHP to serve several JavaScript files as one big JS file that includes everything you need. For additional performance gains, you can make the browser cache the file locally, in addition to sending it gzipped (if the browser supports the encoding, using something like ob_start("ob_gzhandler");). By using gzip encoding you can severely reduce the size of the main JS file that includes all your JS code (since plain text compresses very well). I recently had to do this on my own website and it has worked like a charm for both JS and CSS files.
http://www.ejeliot.com/blog/72
Note that by following the instructions in that tutorial, your JS file will only be sent once, and the client's browser will keep a local copy, which will also improve the performance of every visit thereafter.
Also, consider googling "Minify" which should be hosted on Google Code.
