RequireJS - versioning - JavaScript

Quick note - by versioning I mean versioning for the purposes of cache busting. The common practice of adding query params to the end of the script request does not work on all browsers. The easiest, and also the messiest, way I have found to date is to version my entire deploy folder name:
-- scripts.v1
-- scripts.v2
But this is incredibly messy and it drags out deploy times too (I use S3 as my CDN). Does anyone know of an alternative method?
EDIT
It seems I have not been very clear. Let me be a bit more explicit.
I use RequireJS on my site. It is quite a JavaScript-heavy application with frequent updates and iterations. With RequireJS in place, the only way I can reliably make sure that browsers are serving the latest version is to version my whole deploy folder name (javascript) and upload the whole lot of files to S3 again. I then use the data-main attribute to set the base path of the project.
For many reasons, this is quite cumbersome. Even if the code change is just a few lines, the whole process has to be repeated. Is there some other decent method to let RequireJS know that files have versions? As in, if I call
require(["superImportantJSFile"], function(){})
it will know that the current version is superImportantJSFile.v4.js or something along those lines.
I hope I have been clearer now. Any suggestions as to how the community in general handles this? I'm pretty sure this has to be a common scenario, but I haven't been able to find a good solution yet.

I like to use a post-build step that puts static resources into a folder whose path includes the version control revision number. For example, revision 1234 would lead to the creation of the path /1234/scripts/*. These directories are also created on the CDN with the correct version of the assets inside.
In our require.js config template, we replace the baseUrl with the appropriate revision, which is controlled via a config file, e.g.:
var require = {
    baseUrl: "%%resDir%%",
    ...
};
This makes it easy to change the asset versions between a few different releases, which can all stay on the CDN without causing any conflicts. It also solves the browser cache busting issue.
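As an illustration of that post-build step, a small Node script can stamp the revision into the template. This is only a sketch; the file paths and the BUILD_REVISION variable are assumptions, not part of the setup described above:
// replace-revision.js – hypothetical post-build step
var fs = require('fs');

// e.g. exported by your CI from the source control revision number
var revision = process.env.BUILD_REVISION || '1234';

var template = fs.readFileSync('templates/require-config.tpl.js', 'utf8');
fs.writeFileSync(
    'build/require-config.js',
    template.replace('%%resDir%%', '/' + revision + '/scripts/')
);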

The HTML5 Boilerplate offers one of the most graceful solutions I have seen. They have configs available for Apache and nginx. From there you can just add a timestamp to the filename within your script tags, like so:
<script src="scripts/app.20130728.js"></script>
The web server then rewrites that request to scripts/app.js.
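For reference, the Apache version of that rewrite looks roughly like the following (trimmed and adapted here; the extension list and exact rule in the Boilerplate's server configs may differ, so treat this as a sketch):
<IfModule mod_rewrite.c>
    RewriteEngine On
    # only rewrite if the versioned file does not actually exist on disk
    RewriteCond %{REQUEST_FILENAME} !-f
    # scripts/app.20130728.js -> scripts/app.js
    RewriteRule ^(.+)\.(\d+)\.(js|css)$ $1.$3 [L]
</IfModule>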

You can add aliases to your RequireJS configuration by using map (see http://requirejs.org/docs/api.html#config-map) for example:
require.config({
    /* ... other config ... */
    map: {
        '*': { 'superImportantJSFile': 'superImportantJSFile.v4' }
    }
});
So you only have one place to update :)
You mentioned the use of a CDN, which is a good reason not to put those files in your minified r.js bundle (in case you are using that tool). But if those files are updated frequently, maybe it makes sense to pack your modules with r.js and redeploy the whole bundle.
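If you do go the r.js route, a minimal build profile is enough to get started; the paths and module name below are illustrative:
// build.js – minimal r.js build profile (run with: node r.js -o build.js)
({
    baseUrl: "scripts",        // where your modules live
    name: "main",              // entry module
    out: "dist/main.built.js"  // single optimized output file
})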

Related

Including node modules on my page

What's the best way of including a node module on my webpage?
Should I use absolute paths like <script src="../../node_modules/bootstrap/dist/js/bootstrap.min.js"></script> or is there an easier and better way of doing it?
Thank you.
Add this to your app.js file:
app.use('/placeholder', express.static(__dirname + '/node_modules/'));
This allows you to write:
<script src="/placeholder/bootstrap/dist/js/bootstrap.min.js"></script>
Express will then serve files from the node_modules folder at that path.
You can change placeholder to whatever you want; mine is named scripts.
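For context, here is a minimal sketch of the Express app that line lives in (the port and placeholder name are just examples):
var express = require('express');
var app = express();

// serve the contents of node_modules at /placeholder
app.use('/placeholder', express.static(__dirname + '/node_modules/'));

app.listen(3000);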
Node modules are designed for server-side execution in the NodeJS environment, not for use in a browser. So the best way to include them in a web page is not to.
I will note that Bootstrap is a client-side framework, so it makes no sense as a NodeJS module in the first place.
If you're looking for an npm-like tool for client-side packages, the flavor of the week is Bower.
My answer to this question is a bit different and a little more advanced.
It may not be helpful to you right now, but it definitely will be in the future. It should also help out others.
Yes, the first answer is absolutely correct: you can add a placeholder and then add links relative to it in your index file.
Now, are you using a task runner like gulp or grunt? If not, I would recommend you start using one, because these tools cut out a lot of manual work and eventually save a heap of time.
You might be wondering why I am talking about gulp or grunt here. I will answer that question shortly.
Since you are using node.js, you already know how modules are loaded in Node.
e.g. require('express');
What if we could use this approach for our client applications? You would only have to include one JS file in your HTML, and that JS file would require all the other JS libraries for you.
Great: suddenly you can reduce the number of script tags in your HTML page from around 20-30 to 1.
This is where module loaders come into the picture.
But browsers do not understand the require statement.
To deal with this we use a tool called Browserify, and we can use gulp (which I talked about earlier) to configure a task that runs Browserify over our files.
When you use this, you require all your JS libraries and your own JS files from a single entry file (say app.js). But as we said, browsers do not understand require. This is where Browserify comes in: it takes that app.js file, pre-processes it, and spits out a single bundled file that you include in your HTML.
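As a rough sketch of such a task (assuming browserify and vinyl-source-stream are installed; the file names are illustrative):
var gulp = require('gulp');
var browserify = require('browserify');
var source = require('vinyl-source-stream');

gulp.task('bundle', function () {
    return browserify('./src/app.js')   // the entry file containing your require() calls
        .bundle()                       // resolve every require() into one stream
        .pipe(source('bundle.js'))      // name the output file
        .pipe(gulp.dest('./dist'));     // the single script you include in your HTML
});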
You can follow this article to get a clear picture on how to achieve this.
Scalable app using Gulp and browserify.
Pretty neat right! :)
Some of the other module loaders are System.js and webpack.

Google CDN for Angular Dependencies?

Is there a way to reduce the following includes down to one?
<script src="//ajax.googleapis.com/ajax/libs/angularjs/1.2.1/angular.min.js"></script>
<script src="//ajax.googleapis.com/ajax/libs/angularjs/1.2.1/angular-route.min.js"></script>
<script src="//ajax.googleapis.com/ajax/libs/angularjs/1.2.1/angular-sanitize.min.js"></script>
<script src="//ajax.googleapis.com/ajax/libs/angularjs/1.2.1/angular-animate.min.js"></script>
<script src="//ajax.googleapis.com/ajax/libs/angularjs/1.2.1/angular-cookies.min.js"></script>
I cannot find a combined version of these hosted on Google's CDN.
I have been looking for a bundle myself, but haven't found one yet. It seems you have to bundle them manually if you want them all in one .js file.
I was thinking about creating a grunt task (or similar) to fetch all dependencies and merge them into one file. I know you want to use a CDN, but just wanted to share that thought.
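For the merging half of that idea, a Gruntfile using grunt-contrib-concat could look something like this (the paths are illustrative, and you would still need to download the files into vendor/ yourself or let a fetch plugin do it):
// Gruntfile.js – sketch using grunt-contrib-concat
module.exports = function (grunt) {
    grunt.initConfig({
        concat: {
            angular: {
                src: [
                    'vendor/angular.min.js',
                    'vendor/angular-route.min.js',
                    'vendor/angular-sanitize.min.js',
                    'vendor/angular-animate.min.js',
                    'vendor/angular-cookies.min.js'
                ],
                dest: 'dist/angular-bundle.js'
            }
        }
    });
    grunt.loadNpmTasks('grunt-contrib-concat');
    grunt.registerTask('default', ['concat']);
};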
update
For anyone interested in the latter, just came across this grunt-fetch-from-cdn plugin. Haven't tried it myself yet, but looks interesting.
I would argue that the main benefit of a CDN is for everyone to be using the same files, thus allowing for caching to remove the need to load the file at all for most visitors due to its widespread use across other sites.
Presumably, the number of permutations required to bundle various configurations of Angular dependencies would completely negate this benefit, and you would be better off packaging the bundle with all of your other JS for the lowest possible number of requests and serving it yourself.
However, it does seem as if Angular updates rather frequently which, while good for bug fixes, means that there are probably many different versions (and thus files) in use in production environments at the moment. This will also lower the benefit of caching across various sites.
When in doubt, test both methods across devices from friends/family/work/etc. that have seen normal internet usage on sites other than your own.
I would guess that in most cases it would be smarter to just include each module's CDN link separately like you did above and let caching take care of reducing the actual number of requests. If that becomes common practice then the extra number of files won't have much impact on load time.
I agree with Colt, but the following can be useful if used wisely: jsDelivr (see "Load multiple files with a single HTTP request").
You can either use a gulp task to build them into a single script, or use Bower to install these dependencies at once.

Why use requireJS instead of an ordered include list?

I've been using a grunt file to concatenate all my JS into a single file which is then sent to the client. What advantage do I have in using require calls then? The dependencies are inherent from the concatenation order and I don't have to muddy all my JS with extra code and another third-party library.
Further, backbone models (for example) clearly state their inheritance in their definitions. Not to mention that they simply wouldn't work if their dependencies weren't included anyway.
Also, wouldn't maintenance be easier if all comments related to dependencies were in one place (the grunt file) to prevent human error and having to open every JS file to understand its dependencies?
EDIT
My (ordered) file list looks something like:
....
files: [
    "js/somelib.js",
    "js/somelib2.js",
    "js/somelib3.js",
    "js/models.js",
    "js/views.js",
    "js/controllers.js",
    "js/main.js"
], ...
So perhaps requireJS isn't worth it for small projects anyway.
Using require.js allows you to break each part of your application down into reusable modules (AMD) and to manage those dependencies easily. It is not easy to manage dependencies in a JavaScript application with 100 classes, for example.
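For illustration, an AMD module declares its own dependencies inline, so the load order takes care of itself; the module and file names here are just examples:
// views.js – a hypothetical AMD module
define(['jquery', 'models'], function ($, models) {
    // this factory only runs once jquery and models have been loaded
    return {
        render: function () {
            return $('<div>').text(models.title);
        }
    };
});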
Also, if you don't want all the overhead of require, check this out (developed by the same guy who created require.js): https://github.com/jrburke/almond
The answer depends on the size of your app and the end use case.
A single site.min.js payload for the front end (client) generally aims for small file sizes and a simple architecture (one file generated from maybe 10).
Back-end (server) apps are usually much bigger and more complicated, and therefore may warrant the use of another tool to help manage large code libraries and dependencies (50 files, for example).
In general, RequireJS is worthwhile but only if you have many files and dependencies. An alternative for use in the client would be almond. Again, using a tool like this must warrant the need (many files and dependencies).
The answer from orourkedd is also worth reading.

Managing jQuery Plugins

Often, when working with jQuery, the need arises to include multiple plugins. This can quickly become messy work, especially when some plugins require additional components (images and CSS files).
What are some of the "recommended" ways to:
a. Manage the required files/components (.js, .css and images) in a way that is easy to maintain, and;
b. Keep these plugin packages updated to the latest versions
I'm not necessarily looking for a tool to do this (although one that could perform this management would be useful, I suppose), but more of a way of thinking.
Update: these days there are Bower, Component and Browserify, which take care of all of the following for us automatically.
I'm surprised no one has covered what I do yet. So here's how I manage scripts and resources.
I have each project I work on set up with SVN. Nearly all of the scripts I include have an SVN mirror (GitHub supports SVN these days), which means I can use SVN externals to fetch whatever branch or version of that project I want directly into the project's scripts folder. As we are using SVN, it is easy to track, manage and update these scripts.
If a project is not on SVN, then I just add it to a common SVN project I have made. For instance, Project A and Project B both use jquery-project-not-in-svn, so we stick jquery-project-not-in-svn into our common project's SVN repository, and then use SVN externals on Projects A and B to reference it - as explained before.
Now that covers managing, fetching and updating.
Here is how I cover script inclusions and requests.
As each project now has its own scripts directory containing all the scripts it needs (managed by SVN externals), we now have to worry about minifying them to reduce the load on our server. Each project has a Makefile in its root, which contains an update command. This command will perform the following:
Perform an SVN update (this will update all SVN externals appropriately)
Once that is done, pack and minify all the JS files into scripts/all.js and scripts/all.min.js
I can't share the exact Makefile, but I can share a public one that handles packing/merging and minification of CSS and JavaScript. Here is the link:
http://github.com/balupton/jquery-sparkle/blob/9921fcbf1cbeab7a4f2f875a91cb8548f3f65721/Makefile
By doing these things, we have achieved:
Management of external script resources over multiple projects
Updating of appropriate script resources automatically
Packing all used script resources of the project into one file
Minifying that file, such that only one JS request and one CSS request are performed.
So good luck mate, and feel free to post a comment if you would like to learn more.
I would recommend not updating them unless you are experiencing a problem with the version you have or you would like to use a new feature available in the updated plugin. As the saying goes, if it ain't broke, don't fix it.
My own personal "recommended" way is to keep all my JavaScript files in one include folder, all CSS files in another, and all images in a third directory. I write shortcut functions for my projects that I can then use like <?php scriptlink( 'jquery.tooltip' ); ?> or <?php stylelink( 'jquery.thickbox' ); ?>. Each shortcut function takes a filename (only) as an argument and outputs the full HTML tag for that resource type, i.e. (in order) <script type="text/javascript" src="/includes/js/jquery.tooltip.js"></script> or <link rel="stylesheet" href="/includes/css/jquery.thickbox.css" />
Most jQuery plugins I've run across that require images allow either specifying a configuration variable in the script itself or in the code used to invoke the plugin. Stylesheets are quite easily included without mucking about with the script.
So far this method has kept me pretty sane, so I think it works rather well. I don't tear my hair out over where I stuck a particular plugin; I just include it with a function. (The system also supports subdirectories of the include directory, so e.g. <?php scriptlink( 'ui/accordion' ); ?> equals <script type="text/javascript" src="/includes/js/ui/accordion.js"></script>.)
YMMV of course, but the only issue I've had at all is with upgrades when plugin authors start distributing a jquery.plugin.pack.js version instead of jquery.plugin.min.js or vice versa, because I actually have to remember to change the filenames I look for.
(Since I've omitted the implementation of those simple functions, perhaps your version will check for different variants of the file name given. If the argument to scriptlink() is jquery.plugin, the function might check the file system to see if jquery.plugin.pack.js exists, and if not look for jquery.plugin.min.js, and if not look for jquery.plugin.js, etc.)
CDNs are great but not for debugging. Sometimes debugging really requires local access to the scripts, and CDNs are useless until you are in production mode. For this reason I still like to keep both debug and minified versions around, then compare results and benchmark response time until we shift to production.
All of my jQuery plugins are organised into subfolders which include the version number e.g.
/assets/js/plugin.1.4.1/plugin.1.4.1.min.js
/assets/js/plugin.1.4.1/images/image.gif
If I need to update to 1.4.2 I can drop it into a new folder without too many problems; I can even use a specific version of the plugin in different parts of the site if needed. When a site is large and you're using a few different plugins, it's helpful to be able to see version numbers quickly without digging around the source comments in a plugin.js file.
If a plugin requires CSS, I will take the base styles out of the plugin CSS and bundle them into my main stylesheet; requesting additional CSS files is expensive, and 9 times out of 10 the styles will be customised anyway. Likewise with images: if I'm doing any image customisation I will bundle them into my main image sprite, otherwise I'll just link to the images in that plugin.1.4.1 directory.
Yes, you end up with a few more files in your repo but it means:
you can easily upgrade plugins just by updating your paths
you can debug plugin issues easier because you can see how out of date you are
you can roll back to an earlier version if everything breaks
You could utilize the Google CDN (Content Delivery Network) for more popular plug-ins. Google keeps it up-to-date, you can quickly choose/switch between versions, and you also get the benefits of caching from other websites that use CDN.
Example for jQuery:
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.0/jquery.min.js"></script>
And, if you want to use a higher version automatically, change the version to 1.4 (automatic 1.4.x updates) or even 1 (automatic 1.x.x updates). Unfortunately not all plug-ins are available, but many of the major ones are.

jQuery file name

This one should be easy, and I think I know the right answer, but here goes.
For compatibility reasons, should I leave the filename of jQuery as "jquery-1.3.2.min.js" or just rename it to jquery.js?
My guess is leave it as is to avoid conflicts in case another app uses a different version of jQuery. If they've renamed it to "jquery.js" and I do the same, I see potential version conflicts.
Am I wrong or way off base?
Jeff
It's a very good idea to have version-numbered JS (and CSS) files, because that lets you configure your web server to use a far-future Expires header on such files without running into caching problems. When the file gets updated, it gets a new version number, so the browser always fetches the new version, not the old cached one.
You should do this on your other JS and CSS files, too. You want this to be automated, not something you manage by hand. Your development work happens on unversioned files, and your versioning system creates versioned copies and works out the details of updating the references to the CSS and JS files in the HTML files to point to the versioned copies. This can be a bit of work, but well worth it when it comes to speeding up your site. It took me about a day to set my system up. The improvement wasn't subtle.
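As a sketch of what that automation can look like, a small Node script could copy a file to a content-hashed name and update the reference in the HTML; every path and helper here is illustrative rather than part of the setup described above:
// version-assets.js – hypothetical build step
var fs = require('fs');
var crypto = require('crypto');

function versionFile(srcPath) {
    var contents = fs.readFileSync(srcPath);
    var hash = crypto.createHash('md5').update(contents).digest('hex').slice(0, 8);
    var versionedPath = srcPath.replace(/\.js$/, '.' + hash + '.js');
    fs.writeFileSync(versionedPath, contents);   // e.g. js/app.3f2a9c1d.js
    return versionedPath;
}

// point index.html at the versioned copy so a far-future Expires header is safe
var html = fs.readFileSync('index.html', 'utf8');
fs.writeFileSync('index.html', html.replace('js/app.js', versionFile('js/app.js')));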
I would go with jquery-1.3.2.min.js because it's more specific: you can immediately tell which version is in use if you're reviewing the site months from now, and you avoid any filename conflicts in the future.
You shouldn't have any issues with updating if you're relying on something like an include/template file for the JavaScript.
In my opinion, it's just a personal preference. If you have the version in your file name, it helps you easily identify which one you are using without actually opening the file. It also provides an indirect way of making clients download the new version of the file (since the new name is never in their cache). If you don't put the version in the name, upgrading to a newer version is easier from a coding perspective, but you then have the pain of forcing all users to download the new file.
The recommended way to use jQuery in an app is via Google's hosting:
google.load("jquery", "1.3.2");
google.setOnLoadCallback(function() {
// Place init code here instead of $(document).ready()
});
Why and how to use jQuery hosted on google
I prefer to leave the version in the file name because there are times when you are changing versions and this is very helpful. At a glance I can see which version I am using on any given webpage.
