Google CDN for Angular Dependencies?

Is there a way to reduce the following includes down to one?
<script src="//ajax.googleapis.com/ajax/libs/angularjs/1.2.1/angular.min.js"></script>
<script src="//ajax.googleapis.com/ajax/libs/angularjs/1.2.1/angular-route.min.js"></script>
<script src="//ajax.googleapis.com/ajax/libs/angularjs/1.2.1/angular-sanitize.min.js"></script>
<script src="//ajax.googleapis.com/ajax/libs/angularjs/1.2.1/angular-animate.min.js"></script>
<script src="//ajax.googleapis.com/ajax/libs/angularjs/1.2.1/angular-cookies.min.js"></script>
I cannot find a combined version of these hosted on Google's CDN.

I have been looking for a bundle myself, but haven't found one yet. It seems you have to bundle them manually if you want them all in one .js file.
I was thinking about creating a grunt task (or similar) to fetch all the dependencies and merge them into one file. I know you want to use a CDN, but I just wanted to share that thought.
Update
For anyone interested in the latter, I just came across the grunt-fetch-from-cdn plugin. I haven't tried it myself yet, but it looks interesting.
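For anyone wanting to sketch that grunt idea out, here is a minimal concat-only setup using the grunt-contrib-concat plugin; the source paths are assumptions (adjust them to wherever your local copies of the Angular modules live):
// Gruntfile.js - a minimal sketch; the vendor/ paths are assumptions
module.exports = function (grunt) {
    grunt.initConfig({
        concat: {
            angular: {
                src: [
                    'vendor/angular.min.js',
                    'vendor/angular-route.min.js',
                    'vendor/angular-sanitize.min.js',
                    'vendor/angular-animate.min.js',
                    'vendor/angular-cookies.min.js'
                ],
                dest: 'dist/angular-bundle.min.js'
            }
        }
    });
    grunt.loadNpmTasks('grunt-contrib-concat');
    // running "grunt" now produces the single combined file
    grunt.registerTask('default', ['concat']);
};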

I would argue that the main benefit of a CDN is that everyone uses the same files, so for most visitors caching removes the need to load the file at all, thanks to its widespread use across other sites.
Presumably, the number of permutations required to cover the various combinations of Angular dependencies would negate this benefit entirely, and you would be better off packaging the bundle with all of your other JS for the lowest possible number of requests and serving it yourself.
However, Angular does seem to update rather frequently, which, while good for bug fixes, means there are probably many different versions (and thus files) in use in production environments at the moment. This also lowers the benefit of caching across various sites.
When in doubt, test both methods across devices from friends/family/work/etc. that have seen normal internet usage on sites other than your own.
I would guess that in most cases it would be smarter to just include each module's CDN link separately like you did above and let caching take care of reducing the actual number of requests. If that becomes common practice then the extra number of files won't have much impact on load time.

I agree with Colt, but the following can be useful if used wisely (see "Load multiple files with a single HTTP request"): JSDelivr
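For example, jsDelivr's file-grouping syntax can collapse the five includes from the question into a single request. A sketch of what that looks like (treat the exact project/file names and URL format as assumptions and check jsDelivr's documentation, as the syntax has changed over time):
<script src="//cdn.jsdelivr.net/g/angularjs@1.2.1(angular.min.js+angular-route.min.js+angular-sanitize.min.js+angular-animate.min.js+angular-cookies.min.js)"></script>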

You can either use a gulp task to build them into a single script, or use Bower to install all of these dependencies at once.
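As a sketch of the gulp route, using the gulp-concat plugin (the bower_components paths assume the modules were installed with Bower):
// gulpfile.js - minimal bundling sketch; paths are assumptions
var gulp = require('gulp');
var concat = require('gulp-concat');

gulp.task('bundle', function () {
    return gulp.src([
            'bower_components/angular/angular.min.js',
            'bower_components/angular-route/angular-route.min.js'
            // ...plus whichever other module files you need
        ])
        .pipe(concat('angular-bundle.min.js')) // merge into one file
        .pipe(gulp.dest('dist'));              // write dist/angular-bundle.min.js
});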

Related

Website load performance

I was planning to improve my website's load performance by serving files from public CDNs via RequireJS, so that I could have a local fallback when a public CDN fails. But soon I found out that RequireJS has an optimize function, which combines all required modules into one big JS file.
So, which is the better practice and has better performance: loading multiple JS files from multiple public CDNs, or one big JS file served locally?
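For reference, the CDN-with-local-fallback arrangement described in the question can be expressed directly in RequireJS: a module's paths entry may be an array of locations tried in order, and enforceDefine lets the loader detect a failed CDN load (the local path here is illustrative):
require.config({
    enforceDefine: true, // lets RequireJS tell when the CDN script failed to load
    paths: {
        jquery: [
            '//ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min',
            'lib/jquery.min' // local fallback
        ]
    }
});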
You forgot option 3: one big file from a CDN. It's usually best to make as few requests as possible, because network latency is more of a problem than raw network speed (kB/s).
But if your code changes a lot (and users often return), it might be reasonable to concatenate all your libraries and plugins into one file and your own code into another, instead of having one big file altogether.
This way, if you make a small change, it's not one big file that changes and has to be downloaded again; a user will only need to re-fetch your script file, not the (unchanged) libraries. And those outweigh the custom code by 10:1 in my experience.
Also, if you use a CDN link for jQuery, in the best case it's already in the user's cache.
In the end I'd also advise you to test the different options (using the Chrome DevTools network tab). I had a case where a CDN link was perceivably slower than a "local" link (maybe due to my location, Australia).
It will depend on the nature of your website.
For instance, if most of the modules are required on most pages, you'll benefit more from the bundled option.
If you have many modules and most of them appear on only one page or a few pages, then you'll benefit more from the CDN option.
If you expect most traffic to come from repeat users, as opposed to unique visitors, you might also benefit more from the bundled option.

RequireJS - versioning

Quick note: by versioning I mean for the purposes of cache busting. The common practice of adding query params to the end of the script request does not work in all browsers. The easiest, and also the messiest, way I have found to date is to version my entire deploy folder name.
-- scripts.v1
-- scripts.v2
But this is incredibly messy and mucks up the deploy times too (I use S3 as my CDN). Does anyone know of an alternative method?
EDIT
It seems I have not been very clear. Let me be a bit more explicit.
I use RequireJS on my site. It is quite a JavaScript-heavy application with frequent updates and iterations. With RequireJS in place, the only way I can reliably make sure that browsers are serving the latest version is to version my whole JavaScript deploy folder name and upload the whole lot of files to S3 again. I then use the data-main method to set the base path of the project.
For many reasons, this is quite cumbersome. Even if the code change is just a few lines, the whole process has to be repeated. Is there some other decent method to let RequireJS know that files have versions? As in, if I call
require(["superImportantJSFile"], function(){})
it will know that the current version is superImportantJSFile.v4.js or something along those lines.
I hope I have been clearer now. Any suggestions as to how the community in general does this? I'm pretty sure this has to be a common scenario, but I haven't been able to find a good solution to it yet.
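For reference, the data-main setup mentioned above ends up looking something like this, with the versioned folder name baked into both URLs (folder name illustrative, matching the scripts.v2 example above):
<script data-main="/scripts.v2/main" src="/scripts.v2/require.js"></script>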
I like to use a post-build step that puts static resources into a folder whose path includes the version-control revision number. For example, source-control revision 1234 leads to the creation of the path /1234/scripts/*. These directories are also created on the CDN, with the correct version of the assets within.
In our require.js config in a template, we replace the baseUrl with the appropriate revision, which is controlled via a config file, e.g.:
var require = {
    baseUrl: "%%resDir%%",
    ...
};
This makes it easy to change the asset versions between a few different releases, which can all stay on the CDN without causing any conflicts. It also solves the browser cache busting issue.
The HTML5 Boilerplate offers one of the most graceful solutions I have seen. They have configs available for Apache and nginx. From there you can just add a timestamp to the filename within your script tags, like so:
<script src="scripts/app.20130728.js"></script>
Which the web server would rewrite to scripts/app.js.
You can add aliases to your RequireJS configuration by using map (see http://requirejs.org/docs/api.html#config-map), for example:
require.config({
    // ... other config ...
    map: {
        '*': { 'superImportantJSFile': 'superImportantJSFile.v4' }
    }
});
So you only have one place to update :)
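With that mapping in place, the call from the question works unchanged:
require(["superImportantJSFile"], function () {
    // RequireJS resolves the alias and actually loads superImportantJSFile.v4.js
});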
You mentioned the use of a CDN, which is a good reason not to put those files in your minified r.js bundle (if you are using that tool). But if those files are updated frequently, maybe it makes sense to pack your modules with r.js and update the whole bundle.

How to utilize JS minification for a big organization's website

I am looking for a JS minification tool (maybe CSS as well) to use on our website. The site is fairly big and we can't manually minify files individually. We are also planning to use long-term caching for files, and need to append something like a version number to each file. I am afraid this will be very hard to keep track of when publishing frequently.
I know tools like YUI Compressor and so on exist, but I am not sure how they are used for a big project like mine. Essentially, I am looking for a script or an app that can be called after our development is finished to produce the minified versions of the files.
What are the common practices big companies follow these days? Any help is appreciated. I am just not sure what to search for.
Thank you.
I advise you to use a kind of makefile toolchain (there are many, for example Ant or Maven) to:
concatenate your JS files into one file
then minify the resulting file (I use the Google Closure Compiler, called from an Ant target)
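Stripped of the build-tool wrapper, those two steps boil down to something like the following sketch (file names are placeholders; --js and --js_output_file are the Closure Compiler's standard flags):
cat src/a.js src/b.js src/c.js > build/all.js
java -jar compiler.jar --js build/all.js --js_output_file build/all.min.js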
Note that producing one file is the most important operation: on modern networks, the latency from the number of requests is much more of a burden than the total size. This way you can easily work with dozens or hundreds of JS (or CSS) files, and you needn't hesitate to create a new one as soon as it helps keep the source code readable and maintainable.
This also eliminates the need for (manual or otherwise) management of visible file versioning for caching purposes.
As said recently in another answer, to help with debugging, my deployment scripts always build two versions in parallel: one non-concatenated/minified and one concatenated/minified. The uncompressed version enables development and testing on-site without any deployment step.

Bundling .js files vs CDN

In order to improve the performance of our web pages, we are advised to use CDNs to serve .js files. That makes sense.
We are also advised to bundle our .js files in order to reduce the number of requests made to the server on load.
So we need to sit down and decide whether to use a CDN or bundle our .js files.
What are the pros and cons? Which ones make more sense?
Why can't you bundle them and place them on the CDN? It should hardly be a decision of one or the other.
If you do have to choose one or the other, it depends on how many .js files you are including. For a small number of files, I'd suggest that a CDN would be quicker, whereas for a greater number of files, a bundle of .js files would definitely be quicker. Where the switch-over lies is something for you to experiment with.
My answer: both. Bundle them and place them on a CDN.
The downside of doing this? It depends. What does your build process look like? Can you easily automate the bundling and minification? Are you using Yahoo YUI or Google Closure or something else?
Also, if there is a lot of GUI-dependent jQuery, there might be some time-consuming friction due to constantly changing elements/effects/CSS.
Testing is important too, because of possible minification quirks.
Bottom line: 5 javascript files safely bundled into 1 file === 4 fewer requests.
A page with just plain old Html and one external javascript reference === 2 requests to your server. However, a page with just plain old Html and one external javascript reference on a CDN === 1 request to your server.
Currently we are using the Google Closure tools. The Google Closure Inspector helps with the following:
Closure Compiler modifies your original JavaScript code and produces code that's smaller and more efficient than the original, but harder to read and debug. Closure Inspector helps by providing a source mapping feature, which identifies the line of original source code that corresponds to the compiled code.
As others have already stated, the answer is both if possible. Bundling (and minifying) benefits your users because it decreases the page weight. The CDN benefits your servers because you are offloading work. Generally speaking, you need not optimize either unless you have observed performance issues or you simply have nothing better to do.
There are a few things you need to think about...
How much of the JS do you need to load early in the page load, and how much can you delay until later?
If you can delay loading JS (e.g. put it at the bottom of the page) or load it asynchronously as Google Analytics does, then you will minimise the amount of time downloading the JS spends blocking the UI thread.
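The asynchronous pattern referred to here has the same shape as the Google Analytics snippet: create a script element, mark it async, and inject it so the download doesn't block rendering (the URL is a placeholder):
(function () {
    var s = document.createElement('script');
    s.src = '//example-cdn.com/js/app.min.js'; // placeholder URL
    s.async = true;
    // insert before the first existing script tag
    var first = document.getElementsByTagName('script')[0];
    first.parentNode.insertBefore(s, first);
}());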
After working out how the load of the JS can be split, I'd deal with the merge / minify of the various JS files - cutting down HTTP requests is key to improving performance.
Then look at moving to the CDN, and make sure the CDN can serve the JS content compressed and allows you to set headers so it's "cached forever" (you'll need to version the files if you cache forever). A CDN helps reduce latency, and will also reduce request size by being cookieless.
Another thing you might want to consider is setting up a separate domain for static content: point it at your server(s) while you sort things out, and then switch to a CDN if it looks worthwhile.
Andy

Managing jQuery Plugins

Often, when working with jQuery, the need arises to include multiple plugins. This can quickly become messy work, especially when some plugins require additional components (images and CSS files).
What are some of the "recommended" ways to:
a. Manage the required files/components (.js, .css and images) in a way that is easy to maintain, and;
b. Keep these plugin packages updated to the latest versions
I'm not necessarily looking for a tool to do this (although one that could perform this management would be useful, I suppose), but more of a way of thinking.
Update: these days there are Bower, Component and Browserify, which take care of all of the following for us automatically.
I'm surprised no one has covered what I do yet. So here's how I manage scripts and resources.
I have each project I work on set up with SVN. Nearly all of the scripts I include have an SVN mirror (GitHub supports SVN these days), which means I can use SVN externals to fetch whatever branch or version of a project I want directly into the project's scripts folder. As we are using SVN, it is easy to track, manage and update these scripts.
If a project is not on SVN, then I just add it to a common SVN project I have made. So, for instance, if Project A and Project B both use jquery-project-not-in-svn, we stick jquery-project-not-in-svn into our common project's SVN repository and then use SVN externals in Projects A and B to reference it, as explained before.
Now that covers managing, fetching and updating.
Here is how I cover script inclusions and requests.
As each project now has its own scripts directory containing all the scripts it needs (managed by SVN externals), we have to worry about minifying them to reduce the load on our server. Each project has a Makefile in its root which provides an update command. This command will:
Perform an SVN update (this updates all SVN externals appropriately)
Once that is done, pack and minify all the JS files into scripts/all.js and scripts/all.min.js
I can't share the exact Makefile, but I can share a public one that handles packing/merging and minification of CSS and JavaScript. Here is the link:
http://github.com/balupton/jquery-sparkle/blob/9921fcbf1cbeab7a4f2f875a91cb8548f3f65721/Makefile
By doing these things, we have achieved:
Management of external script resources over multiple projects
Updating of appropriate script resources automatically
Packing all used script resources of the project into one file
Minifying that file, such that only one JS request and one CSS request are performed.
So good luck, mate; feel free to post a comment if you would like to learn more.
I would recommend not updating them unless you are experiencing a problem with the version you have or you would like to use a new feature available in the updated plugin. As the saying goes, if it ain't broke, don't fix it.
My own personal "recommended" way is to keep all my JavaScript files in one include folder, all CSS files in another, and all images in a third directory. I write shortcut functions for my projects that I can then use like <?php scriptlink( 'jquery.tooltip' ); ?> or <?php stylelink( 'jquery.thickbox' ); ?>. Each shortcut function takes a filename (only) as an argument and outputs the full HTML tag for that resource type, i.e. (respectively):
<script type="text/javascript" src="/includes/js/jquery.tooltip.js"></script>
<link rel="stylesheet" href="/includes/css/jquery.thickbox.css" />
Most jQuery plugins I've run across that require images allow either specifying a configuration variable in the script itself or in the code used to invoke the plugin. Stylesheets are quite easily included without mucking about with the script.
So far this method has kept me pretty sane, so I think it works rather well. I don't tear my hair out over where I stuck a particular plugin; I just include it with a function. (The system also supports subdirectories of the include directory, so e.g. <?php scriptlink( 'ui/accordion' ); ?> equals <script type="text/javascript" src="/includes/js/ui/accordion.js"></script>.)
YMMV of course, but the only issue I've had at all is with upgrades, when plugin authors start distributing a jquery.plugin.pack.js version instead of jquery.plugin.min.js or vice versa, because I then have to remember to change the filenames I look for.
(Since I've omitted the implementation of those simple functions, perhaps your version will check for different variants of the file name given. If the argument to scriptlink() is jquery.plugin, the function might check the file system to see if jquery.plugin.pack.js exists, and if not look for jquery.plugin.min.js, and if not look for jquery.plugin.js, etc.)
CDNs are great, but not for debugging. Sometimes debugging really requires local access to the scripts, and CDNs are of no use until you are in production mode. For this reason I still like to keep both debug and minified versions around, then compare results and benchmark response times until we shift to production.
All of my jQuery plugins are organised into subfolders which include the version number e.g.
/assets/js/plugin.1.4.1/plugin.1.4.1.min.js
/assets/js/plugin.1.4.1/images/image.gif
If I need to update to 1.4.2, I can drop it into a new folder without too many problems; I can even use a specific version of the plugin in different parts of the site if needed. When a site is large and you're using a few different plugins, it's helpful to be able to see version numbers quickly without digging around for source comments in a plugin.js file.
If a plugin requires CSS, I will take the base styles out of the plugin's CSS and bundle them with my main stylesheet; requesting additional CSS files is expensive, and 9 times out of 10 the styles will be customised anyway. Likewise with images: if I'm doing any image customisation I will bundle the images into my main image sprite; otherwise I'll just link to the images in that plugin.1.4.1 directory.
Yes, you end up with a few more files in your repo but it means:
you can easily upgrade plugins just by updating your paths
you can debug plugin issues easier because you can see how out of date you are
you can roll back to an earlier version if everything breaks
You could utilize the Google CDN (Content Delivery Network) for the more popular plug-ins. Google keeps them up to date, you can quickly choose or switch between versions, and you also get the benefit of caching from other websites that use the same CDN.
Example for jQuery:
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.0/jquery.min.js"></script>
And if you want to pick up newer versions automatically, change the version to 1.4 (automatic 1.4.x updates) or even 1 (automatic 1.x.x updates). Unfortunately not all plug-ins are available, but many of the major ones are.
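For instance, the wildcard forms described above look like this (these are the patterns Google's loader historically supported; verify they still resolve before relying on them):
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js"></script>
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js"></script>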
