Managing jQuery Plugins

Often, when working with jQuery, the need arises to include multiple plugins. This can quickly become messy work, especially when some plugins require additional components (images and CSS files).
What are some of the "recommended" ways to:
a. Manage the required files/components (.js, .css and images) in a way that is easy to maintain, and;
b. Keep these plugin packages updated to the latest versions
I'm not necessarily looking for a tool to do this (although one that could perform this management would be useful, I suppose), but more of a way of thinking.

Update: These days there are Bower, Component and Browserify, which take care of all of the following for us automatically.
I'm surprised no one has covered what I do yet. So here's how I manage scripts and resources.
I have each project I work on set up with SVN. Nearly all of the scripts I include have an SVN mirror (GitHub supports SVN these days), which means I can use SVN externals to fetch whatever branch or version of that project I want directly into the project's scripts folder. Because we are using SVN, it is easy to track, manage and update these scripts.
If a project is not on SVN, I just add it to a common SVN project I have made. For instance, Project A and Project B both use jquery-project-not-in-svn, so we stick jquery-project-not-in-svn into our common project's SVN repository, and then use SVN externals on Projects A and B to reference it, as explained before.
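To illustrate, wiring up such an external is a one-off property change on the scripts folder (a sketch: the repository URL and folder name are only examples, using the jquery-sparkle project linked below):

# Old-style svn:externals syntax: <local-dir> <repository-URL>
svn propset svn:externals 'jquery-sparkle https://github.com/balupton/jquery-sparkle/trunk' scripts
svn commit -m "Pin jquery-sparkle as an SVN external" scripts
svn update    # fetches the external into scripts/jquery-sparkle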
Now that covers managing, fetching and updating.
Here is how I cover script inclusions and requests.
As each project now has its own scripts directory containing all the scripts it needs (managed by SVN externals), we now have to worry about packing and minifying them to reduce the load on our server. Each project has a Makefile in its root, which contains an update command. This command will perform the following:
Perform an SVN update (this will update all SVN externals appropriately)
Pack and minify all the JS files into scripts/all.js and scripts/all.min.js
I can't share the exact Makefile, but I can share a public one that handles packing/merging and minification of CSS and JavaScript. Here is the link:
http://github.com/balupton/jquery-sparkle/blob/9921fcbf1cbeab7a4f2f875a91cb8548f3f65721/Makefile
By doing these things, we have achieved:
Management of external script resources over multiple projects
Updating of appropriate script resources automatically
Packing all used script resources of the project into one file
Minifying that file, such that only one JS request and one CSS request are performed.
So good luck mate, and feel free to post a comment if you would like to learn more.

I would recommend not updating them unless you are experiencing a problem with the version you have or you would like to use a new feature available in the updated plugin. As the saying goes, if it ain't broke, don't fix it.

My own personal "recommended" way is to keep all my JavaScript files in one include folder, all CSS files in another, and all images in a third directory. I write shortcut functions for my projects that I can then use like <?php scriptlink( 'jquery.tooltip' ); ?> or <?php stylelink( 'jquery.thickbox' ); ?>. Each shortcut function takes a filename (only) as an argument and outputs the full HTML tag for that resource type, i.e. (in order) <script type="text/javascript" src="/includes/js/jquery.tooltip.js"></script> or <link rel="stylesheet" href="/includes/css/jquery.thickbox.css" />.
Most jQuery plugins I've run across that require images allow specifying a configuration variable either in the script itself or in the code used to invoke the plugin. Stylesheets are quite easily included without mucking about with the script.
So far this method has kept me pretty sane, so I think it works rather well. I don't tear my hair out over where I stuck a particular plugin; I just include it with a function. (The system also supports subdirectories of the include directory, so e.g. <?php scriptlink( 'ui/accordion' ); ?> equals <script type="text/javascript" src="/includes/js/ui/accordion.js"></script>.)
YMMV of course, but the only issue I've had at all is with upgrades when plugin authors start distributing a jquery.plugin.pack.js version instead of jquery.plugin.min.js or vice versa, because I actually have to remember to change the filenames I look for.
(Since I've omitted the implementation of those simple functions, perhaps your version will check for different variants of the file name given. If the argument to scriptlink() is jquery.plugin, the function might check the file system to see if jquery.plugin.pack.js exists, and if not look for jquery.plugin.min.js, and if not look for jquery.plugin.js, etc.)
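For instance, a minimal sketch of scriptlink() with that fallback behaviour (a hypothetical implementation, matching the directory convention above):

<?php
// Hypothetical implementation: emit a tag for the first filename variant found.
function scriptlink( $name ) {
    foreach ( array( "$name.pack.js", "$name.min.js", "$name.js" ) as $file ) {
        if ( file_exists( $_SERVER['DOCUMENT_ROOT'] . "/includes/js/$file" ) ) {
            echo '<script type="text/javascript" src="/includes/js/' . $file . '"></script>';
            return;
        }
    }
}
?>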

CDNs are great, but not for debugging. Sometimes debugging really requires local access to the scripts, and CDNs are useless until you're in production mode. For this reason I still like to keep both debug and minified versions around, then compare results and benchmark response times until we shift to production.

All of my jQuery plugins are organised into subfolders which include the version number e.g.
/assets/js/plugin.1.4.1/plugin.1.4.1.min.js
/assets/js/plugin.1.4.1/images/image.gif
If I need to update to 1.4.2 I can drop it in a new folder without too many problems; I can even use a specific version of the plugin in different parts of the site if needed. When a site is large and you're using a few different plugins, it's helpful to be able to see version numbers quickly without digging around source comments in a plugin.js file.
If a plugin requires CSS, I will take the base styles out of the plugin CSS and bundle them in with my main stylesheet; requesting additional CSS files is expensive, and 9 times out of 10 the styles will be customised anyway. Likewise with images: if I'm doing any image customisation I will bundle the images into my main image sprite, otherwise I'll just link to the images in that plugin.1.4.1 directory.
Yes, you end up with a few more files in your repo but it means:
you can easily upgrade plugins just by updating your paths
you can debug plugin issues easier because you can see how out of date you are
you can roll back to an earlier version if everything breaks

You could utilize the Google CDN (Content Delivery Network) for more popular plug-ins. Google keeps it up-to-date, you can quickly choose/switch between versions, and you also get the benefits of caching from other websites that use CDN.
Example for jQuery:
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.0/jquery.min.js"></script>
And, if you want to use a higher version automatically, change the version to 1.4 (automatic 1.4.x updates) or even 1 (automatic 1.x.x updates). Unfortunately not all plug-ins are available, but many of the major ones are.
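For example, these two tags request the latest 1.4.x release and the latest 1.x.x release respectively:

<!-- latest 1.4.x -->
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js"></script>
<!-- latest 1.x.x -->
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js"></script>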

Related

How to use individual JS files in Bootstrap 5?

I am using Bootstrap, but would like to reduce the size of the JavaScript.
I only need dropdown/collapse and sometimes carousel, so I want to include only those.
There is a folder "dist" with every single script individually.
I tried including them via <script> tags. It does not work at all and produces lots of errors in the console.
Do I need specific other script files too, or is the JS in the dist folder just not suitable for that?
Please forgive me, I have very little knowledge of JS and English is not my first language.
Simply put, how do I include only the needed JS from Bootstrap 5?
I am on Windows and do NOT have NPM or any other bundler/packager/installer.
I am surprised there is no dedicated website for configuring the JS.
I googled a lot but did not find anything related to my question.
First call the basic utilities, then call the individual components [Bootstrap 5.2.3]. For example:
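Something along these lines (a sketch: the paths assume the js/dist folder of the Bootstrap 5.2.3 package mentioned in the question, the dropdown component additionally needs Popper, and depending on the component further util files from dist may be required):

<!-- basic utilities first -->
<script src="js/dist/dom/data.js"></script>
<script src="js/dist/dom/event-handler.js"></script>
<script src="js/dist/dom/manipulator.js"></script>
<script src="js/dist/dom/selector-engine.js"></script>
<script src="js/dist/base-component.js"></script>
<!-- Popper is required by dropdown -->
<script src="https://cdn.jsdelivr.net/npm/@popperjs/core@2/dist/umd/popper.min.js"></script>
<!-- then only the components you actually use -->
<script src="js/dist/dropdown.js"></script>
<script src="js/dist/collapse.js"></script>
<script src="js/dist/carousel.js"></script>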
My first observation is that you may be heading down the premature optimisation path. The difference between the minimal Bootstrap build and the individual components isn't huge. And on top of this, the main advantage of using a CDN is that the browser will likely have already loaded and cached it (from use on another site: it's a common resource), so trying to do anything non-standard will increase load times, not reduce them.
But if you're set on using the individual components, they are available on the CDN too, as described on the bootstrap site.
Make sure to use the integrity and crossorigin attributes to protect your site from leaking information to the CDN, and also being attacked via the CDN. If you're new to this, have a read of this page on subresource integrity.
Use the Bootstrap CDN
You can simply use this link:
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.1.3/dist/js/bootstrap.bundle.min.js"></script>
Now you don't need to include anything from the folder.

Google CDN for Angular Dependencies?

Is there a way to reduce the following includes down to one?
<script src="//ajax.googleapis.com/ajax/libs/angularjs/1.2.1/angular.min.js"></script>
<script src="//ajax.googleapis.com/ajax/libs/angularjs/1.2.1/angular-route.min.js"></script>
<script src="//ajax.googleapis.com/ajax/libs/angularjs/1.2.1/angular-sanitize.min.js"></script>
<script src="//ajax.googleapis.com/ajax/libs/angularjs/1.2.1/angular-animate.min.js"></script>
<script src="//ajax.googleapis.com/ajax/libs/angularjs/1.2.1/angular-cookies.min.js"></script>
I cannot find a combined version of these hosted on Google's CDN.
I have been looking for a bundle myself but haven't found one yet. It seems to me you have to bundle them manually if you want them all in one JS file.
I was thinking about creating a grunt task (or similar) to fetch all dependencies and merge them into one file. I know you want to use a CDN, but just wanted to share that thought.
Update
For anyone interested in the latter, I just came across the grunt-fetch-from-cdn plugin. I haven't tried it myself yet, but it looks interesting.
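For reference, a minimal Gruntfile along those lines might look like this (a sketch: it assumes the files have already been fetched into lib/ and uses grunt-contrib-concat):

// Gruntfile.js - concatenate the Angular modules into one bundle
module.exports = function (grunt) {
    grunt.initConfig({
        concat: {
            angular: {
                src: [
                    'lib/angular.min.js',
                    'lib/angular-route.min.js',
                    'lib/angular-sanitize.min.js',
                    'lib/angular-animate.min.js',
                    'lib/angular-cookies.min.js'
                ],
                dest: 'dist/angular-bundle.min.js'
            }
        }
    });
    grunt.loadNpmTasks('grunt-contrib-concat');
    grunt.registerTask('default', ['concat']);
};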
I would argue that the main benefit of a CDN is for everyone to be using the same files, thus allowing for caching to remove the need to load the file at all for most visitors due to its widespread use across other sites.
Presumably, the number of permutations required to bundle various configurations of Angular dependencies would completely negate this benefit, and you would be better off packaging the bundle with all of your other JS for the lowest possible number of requests and serving it yourself.
However, it does seem as if Angular updates rather frequently which, while good for bug fixes, means that there are probably many different versions (and thus files) in use in production environments at the moment. This will also lower the benefit of caching across various sites.
When in doubt, test both methods across devices from friends/family/work/etc. that have seen normal internet usage on sites other than your own.
I would guess that in most cases it would be smarter to just include each module's CDN link separately like you did above and let caching take care of reducing the actual number of requests. If that becomes common practice then the extra number of files won't have much impact on load time.
I agree with Colt, but the following can be useful if used wisely (see "Load multiple files with a single HTTP request"): JSDelivr
You can either use a gulp task to build them into a single script, or use Bower to install these dependencies at once.

What is the preferred way to include Bootstrap, jQuery, etc. libs in a project?

I recently started using js libs and have a question regarding them.
It's possible to include their source, but then there is a problem with versions, as there are two options: add the version to the file name, but then all includes will have the version appended to the file name, which will cause trouble when you update the version; or leave the version out of the file name, but then it's not clear which version it is. That's not that big a problem, though, as you can go inside the JS source and see its version.
Another option is to link to the library's hosting URL, but that adds overhead to download it, and when the external host is unreachable, your site won't be able to load that library.
There seem to be Maven plugins for some JS libraries, but they are usually third-party, and frequently they refer to outdated versions.
The ideal solution would be something Maven-like but with official support.
Also, as a comment advises, it's possible to use some sort of bundling, but bundling happens after building, so it's still a question how to keep those JS libs before bundling.
Please advise.
For many projects it is not necessary to stay at the bleeding edge of third-party libraries. With jQuery, for example, a new version can break some of the plugins you use, so you have to check and test everything first before deploying a new version.
Having the version in the filename is considered good practice though, because it prevents caching issues and allows you to cache files for a very long time (since the browser will always download a file when the filename has changed).
Regarding the issues you pointed out with library hosting URLs, they are true as far as they go. But you also need to consider that when such libraries are widely used (which they are), the library may already be cached in your browser, and therefore the browser won't need to download it again. You can check out https://developers.google.com/speed/libraries/devguide for library hosting by Google, which I would expect to be pretty reliable.
All that being said, it depends on the project. If you need 100% reliability you need to host the library by yourself. If you're fine with Google's reliability, go for library hosting.
As your edit pointed out bundling: check out https://github.com/bower/bower. It is a package manager for installing dependencies etc. in frontend projects. It should be exactly what you're looking for.
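A minimal example (package names and versions are illustrative): declare the dependencies in bower.json and run bower install to fetch them all at once.

{
    "name": "my-project",
    "dependencies": {
        "jquery": "~2.0.3",
        "bootstrap": "~3.0.0"
    }
}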

Requirejs - versioning

Quick note - by versioning I mean versioning for the purposes of cache busting. The common practice of adding query params to the end of the script request does not work in all browsers. The easiest, and messiest, way I have found to date is to version my entire deploy folder name.
-- scripts.v1
-- scripts.v2
But this is incredibly messy and mucks up the deploy times too (I use S3 as my CDN). Does anyone know of an alternative method?
EDIT
It seems I have not been very clear. Let me be a bit more explicit.
I use RequireJS on my site. It is quite a JavaScript-heavy application with frequent updates and iterations. With RequireJS in place now, the only way I can reliably make sure that browsers are serving the latest version is to version my whole deploy folder name (javascript) and upload the whole lot of files to S3 again. I then use the data-main method to set the base path of the project.
For many reasons, this is quite cumbersome. Even if the code change is just a few lines, the whole process has to be repeated. Is there some other decent method to let RequireJS know that files have versions? As in, if I call
require(["superImportantJSFile"], function(){})
it will know that the current version is superImportantJSFile.v4.js or something along those lines.
I hope I have been clearer now. Any suggestions as to how the community in general handles this? I'm pretty sure this has to be a common scenario, but I haven't been able to find a good solution to it yet.
I like to use a post-build step that puts static resources into a folder with a path that includes the version control version number. For example source control revision number 1234 would lead to the creation of a path: /1234/scripts/*. These directories are also created in the CDN with the correct version of the assets within.
In our require.js config in a template, we replace the baseUrl with the appropriate revision, which is controlled via a config file, e.g.:
var require = {
    baseUrl: "%%resDir%%",
    ...
};
This makes it easy to change the asset versions between a few different releases, which can all stay on the CDN without causing any conflicts. It also solves the browser cache busting issue.
The HTML5 Boilerplate offers one of the most graceful solutions I have seen. They have configs available for Apache and nginx. From there you can just add a timestamp to the filename within your script tags, like so:
<script src="scripts/app.20130728.js"></script>
Which the web server would rewrite to scripts/app.js.
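The Apache flavour of that rule is a one-liner (a sketch adapted from the Boilerplate's approach; nginx has an equivalent):

<IfModule mod_rewrite.c>
    RewriteEngine On
    # scripts/app.20130728.js is served as scripts/app.js
    RewriteRule ^(.+)\.(\d+)\.(js|css)$ $1.$3 [L]
</IfModule>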
You can add aliases to your RequireJS configuration by using map (see http://requirejs.org/docs/api.html#config-map) for example:
require.config({
    /* ... other config ... */
    map: {
        '*': { 'superImportantJSFile': 'superImportantJSFile.v4' }
    }
});
So you only have one place to update :)
You mentioned the use of a CDN, which is a good reason not to put those files into your minified r.js bundle (in case you are using that tool). But if those files are updated frequently, maybe it makes sense to pack your modules with r.js and update the whole bundle instead.

Loading jQuery with RequireJS - Which is better, a local version or a CDN one?

EDITED to clarify:
In terms of performance (though that's still a wild term, I know), which is better - loading a local version, or a CDN version of jQuery, over RequireJS?
For the record, the RequireJS online docs contain a passage that seems to discourage CDN use, though I am not 100% sure what it means:
Do not mix CDN loading with shim config in a build. Example scenario:
you load jQuery from the CDN but use the shim config to load something
like the stock version of Backbone that depends on jQuery. When you do
the build, be sure to inline jQuery in the built file and do not load
it from the CDN. Otherwise, Backbone will be inlined in the built file
and it will execute before the CDN-loaded jQuery will load. This is
because the shim config just delays loading of the files until
dependencies are loaded, but does not do any auto-wrapping of define.
After a build, the dependencies are already inlined, the shim config
cannot delay execution of the non-define()'d code until later.
define()'d modules do work with CDN loaded code after a build because
they properly wrap their source in define factory function that will
not execute until dependencies are loaded. So the lesson: shim config
is a stop-gap measure for non-modular code, legacy code.
define()'d modules are better.
Theoretically, using a CDN-hosted jQuery file would result in one more HTTP request (it can't be merged with other JS files using r.js), but it has the potential benefit that your visitors may have already cached the CDN version from other sites they've visited.
However, if I understood the information I googled correctly, you still need to offer a local jQuery copy to r.js, as the resulting minified JS file would still need to contain a copy of the jQuery module to keep the dependencies consistent. That would result in loading jQuery both locally and over the CDN. (Hope I got this part right?)
So, which way is better?
Your RequireJS doc quote is specifically about using scripts that have a shim config for jQuery. Dynamically loading a base dependency from a third-party CDN is fine if all the scripts are AMD modules.
Cache hit rates are not as high as you might think (Yahoo, I believe, did a study on cached vs. non-cached state), and using a CDN means you now have to rely on another domain for loading.
The benefits probably depend on the app, profiling it will lead to the best answer. For instance, if it is a site with lots of images, then the strategy for jquery matters less as the image loading will probably be the more noticeable perf issue.
I would start out with optimizing jQuery into the built file and using AMD modules for everything, so if I want to delegate to the CDN I can. However, if using requirejs and the shim config, the base dependencies need to be inlined in the built file because the shimmed libraries do not call define() -- they do not wait for dependencies to load, they want them available immediately.
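As a sketch, an r.js build profile for that starting point could look like the following (file names are illustrative; 'empty:' is the documented way to leave a module out of the build when you later delegate it to a CDN):

// build.js - inline jQuery into the built file by default
({
    baseUrl: 'scripts',
    name: 'main',
    out: 'scripts/main-built.js',
    paths: {
        jquery: 'lib/jquery'
        // to delegate to the CDN instead, switch to: jquery: 'empty:'
    }
})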
Short answer: Avoid the extra HTTP request and DNS lookup
You're most likely better off using your own copy and letting RequireJS merge the files. In other words, I'd say it's more valuable to avoid that extra http request and DNS lookup.
While it's true that a user may already have that file in their cache from another site, they most likely will not. Even if they had been to another site recently, cache sizes are generally small enough that during the course of a normal browsing session or two, a user can easily fill up their cache, in which case older files will be discarded.
I think you'd only really be talking about 1% of your traffic, at most, that has the CDN file in cache already, so only 1% of your users are benefiting. However, by combining those resources and avoiding the extra HTTP request, you're benefiting 99% of your users. Conversely, you'd be hurting 99% of your users by not combining. Just another way of looking at this.
Another consideration is mobile users: mobile users have terrible latency, so the RTT for the additional HTTP request and DNS lookup has a larger cost.
It is not only the fact that people have cached the file. User agents can only load a couple of files from the same domain at the same time, so loading the JS file from a CDN makes sure it gets loaded in parallel.
This comes on top of the benefit of users already having a cached version of the file. So for popular files (e.g. the jQuery JavaScript) I would always load from a CDN.
You could always add a fallback to a local version in case the CDN is down for whatever reason.
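RequireJS supports this directly: give a module an array of paths and they are tried in order (the CDN URL and local path here are just examples):

require.config({
    paths: {
        // try the CDN first, then fall back to the local copy
        jquery: [
            '//ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min',
            'lib/jquery'
        ]
    }
});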
Note
Although the RFC states that user agents should make a maximum of 2 simultaneous requests, most user agents ignore this part of the spec nowadays. Also see this old (2009) question on SO. Note that it wouldn't surprise me if user agents currently make even more simultaneous requests.
