Using asynchronous module loading in a JavaScript library project - javascript

I am working on a JavaScript project that will be provided as a third-party library.
There are a lot of modules and templates, and loading them all at once would be a waste of time and performance, so using a module loader like RequireJS seems like a good idea.
However, since the project will be used by other people, what happens if a client uses RequireJS at the same time?
I have to configure the dependencies for my own library, while the client needs to configure the dependencies for his own project, and I am afraid this will cause conflicts.
Any alternatives?

There are a lot of modules and templates, and loading them all at once would be a waste of time and performance.
It sounds like you are opting to not use r.js to optimize your library. This is a choice you can make. However, you will have to document your library so that people using it can produce a RequireJS configuration that will take care of the needs of their own code and the needs of your library. You should provide an example of RequireJS configuration that satisfies your library and people wanting to use it can integrate that configuration with their own.
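For instance, the documented configuration might be a sketch like the following (the library name and paths are placeholders, not from the question); consumers would merge it into their own require.config call:

require.config({
    paths: {
        // map the library's module prefix to wherever the consumer installed it
        "mylib": "vendor/mylib/src",
        // dependencies the library itself needs
        "underscore": "vendor/underscore/underscore"
    },
    shim: {
        "underscore": { exports: "_" }
    }
});

A consumer can then write define(["mylib/util/a"], function(A) { ... }); and RequireJS resolves that id to vendor/mylib/src/util/a.js.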
If you were to optimize your library with r.js, then you could use Almond to load your library as a single unit, and hide the fact that it is a collection of AMD modules. However, this entails "loading them all at once", which you do not want.
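If you did later decide to go the r.js/Almond route, the build profile could look roughly like this (a sketch only; file names are placeholders):

// build.js -- run with: node r.js -o build.js
({
    baseUrl: "src",
    // Almond replaces the full RequireJS loader inside the bundle
    name: "../node_modules/almond/almond",
    // the library's entry module, which pulls in everything else
    include: ["main"],
    out: "dist/mylib.js",
    // wrap the AMD machinery in a closure so it stays hidden from the page
    wrap: true
})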

Related

Gradually move from including individual JS files to module bundling

In our AngularJS application we currently have a lot (400+) of files, which get included via <script> tags. The order of these files is something like this:
AngularJS script files
3rd party plugins / modules
business Logic files
modules
services
controllers / components / directives
We would like to move to a better approach utilizing a module bundler and TypeScript. New files are already written in TypeScript but don't make use of import/export. In order to make things easier, we could convert every JavaScript file into a TypeScript file and fix the resulting errors in a feasible time.
However, before we do this, we would like to have a viable strategy on how we could gradually make use of import/export. I'm thinking of something like rewriting one module from time to time, starting with modules deep down in the dependency tree.
However, I have not been able to achieve this yet, though I'm quite sure that others have had to solve this before.
Once you decide on a module bundler, you'll need to learn about its facilities for (1) allowing other JavaScript on the page to access things defined in the bundle and (2) allowing the bundle to access things defined by other JavaScript on the page. (If you're able to migrate in strict dependency order, you might never need #2.) For example, for Webpack, #1 would be the library* output options and #2 would be externals.

Then just move the code into modules little by little, adjusting the configuration as necessary so that each part of the code has access to the things it needs from the other part. Since Webpack only supports a single library export module, during the transition you may have to maintain a dummy module that just re-exports all the modules you need to access from code outside the bundle. This is a little tedious and represents extra work that you wouldn't have to do if you migrated all at once, but you may decide it's worth paying that cost in order to migrate gradually.
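A rough sketch of those two Webpack settings (the entry path, global name, and external are hypothetical, and TypeScript loader configuration is omitted for brevity):

// webpack.config.js
module.exports = {
    // during the transition, the entry can be a dummy module that just
    // re-exports everything the non-bundled code still needs
    entry: "./src/bundle-entry.js",
    output: {
        filename: "bundle.js",
        library: "appBundle",      // #1: expose the bundle's exports as window.appBundle
        libraryTarget: "var"
    },
    externals: {
        angular: "angular"         // #2: resolve imports of "angular" to the page global
    }
};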
If you have issues getting type information from TypeScript module files in non-module TypeScript files, see this answer for a workaround.

Why use requireJS instead of an ordered include list?

I've been using a grunt file to concatenate all my JS into a single file which is then sent to the client. What advantage do I have in using require calls, then? The dependencies are implicit in the concatenation order, and I don't have to muddy all my JS with extra code and another third-party library.
Further, backbone models (for example) clearly state their inheritance in their definitions. Not to mention that they simply wouldn't work if their dependencies weren't included anyway.
Also, wouldn't maintenance be easier if all comments related to dependencies were in one place (the grunt file) to prevent human error and having to open every JS file to understand its dependencies?
EDIT
My (ordered) file list looks something like:
....
files: [
    "js/somelib.js",
    "js/somelib2.js",
    "js/somelib3.js",
    "js/models.js",
    "js/views.js",
    "js/controllers.js",
    "js/main.js"
], ...
So perhaps requireJS isn't worth it for small projects anyway.
Using require.js allows you to break each part of your application down into reusable modules (AMD) and to manage those dependencies easily. It is not easy to manage dependencies in a JavaScript application with 100 classes, for example.
Also, if you don't want all the overhead of require, check this out (developed by the same guy who created require.js): https://github.com/jrburke/almond
The answer depends on the size of your app and the end use case.
A single site.min.js payload for the front end (client) generally aims for small file sizes and a simple architecture (one file generated from maybe 10).
Back-end (server) apps are usually much bigger and more complicated, and may therefore warrant another tool to help manage large code libraries and dependencies (50 files, for example).
In general, RequireJS is worthwhile but only if you have many files and dependencies. An alternative for use in the client would be almond. Again, using a tool like this must warrant the need (many files and dependencies).
The answer from orourkedd is also worth reading.

When to use Requirejs and when to use bundled javascript?

This may be a dumb question for web guys. But I am a little confused over this. Now, I have an application where I am using a couple of JavaScript files to perform different tasks. I am using a JavaScript bundler to combine and minify all the files, so at runtime there will be only one app.min.js file. Now, RequireJS is used to load modules or files at runtime. So the question is: if I already have everything in one file, do I need RequireJS? Or what is a use-case scenario where I would use RequireJS and/or a bundler?
Please let me know if any further details are needed.
Generally you only use RequireJS in its loading form during development. Once the site is done and ready for deployment, you minify the code. The advantage here is RequireJS knows exactly what your dependencies are, and thus can easily minify the code in the correct order. Here is what it says on the RequireJS website:
Once you are finished doing development and want to deploy your code for your end users, you can use the optimizer to combine the JavaScript files together and minify it. In the example above, it can combine main.js and helper/util.js into one file and minify the result.
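As a sketch, that build step can be driven by a small r.js build profile (the directory layout here is illustrative):

// build.js -- run with: node r.js -o build.js
({
    baseUrl: ".",
    name: "main",          // main.js requires helper/util, so both end up in the output
    out: "main-built.js"   // combined and minified result
})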
This is a hotly contested issue among many proficient javascript developers. Many other languages have a "compilation" phase where the entire program is bundled up for deployment (JBoss's .WAR files come to mind). Programmers that come from more traditional backgrounds often favor this approach.
Javascript has seen such growth in recent years that it is difficult to chart exact best practices, but those that appreciate the more functional nature of Javascript often prefer the module loading approach (like require.js uses).
I wrote Frame.js which works much like require.js, so my bias is towards the module loader approach.
To answer your question directly, yes, it is one or the other.
Most that argue for packing your scripts into a single file believe it enables more compression and is thus more efficient. I believe the efficiency advantages of packaging are negligible in most cases because: (1) module load times are distributed over the entire session, (2) individual modules can be compressed to nearly the same percentage, (3) individual modules can be cached by the server and routers separately, and (4) loading scripts only when they are needed ultimately allows you to load less code for some users and more code overall.
In the long run, if you can see an advantage to dynamic script loading use it. If not, bundle your scripts into a single file.
It depends on your application. If you're making a server-side app with only modest javascript (less than 100kb minified) then go for total bundling, you're probably going to be fine.
But if you're making a javascript app and have a ton of code in it, then your needs are going to be different.
For example, in my app I bundle all the core files. There's jQuery, underscore, backbone, my main app files, my user login system, my layout system, my notifications and chat system, all are part of my big initial file.
But I have many other modules as well that aren't part of the initial bundle and are loaded after those.
The forums, the wiki, the wysiwyg, color picker, drag/drop, calendar, and some animation files are part of the second category. You need to make reasonable decisions about what's commonly used and needed immediately vs what can be delayed.
If I include everything immediately I can get above a meg of javascript, which would be insane and make the initial boot unacceptably slow.
The second category starts downloading after the initSuccess event fires from the initial file.
But the second category is more intelligent than the first in that it loads what's more important first. For example if you're looking at the wiki it'll load the wiki before it loads the color picker.
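A rough sketch of that two-stage loading, assuming RequireJS handles the deferred modules (the event emitter and module ids here are illustrative, not the app's actual names):

// part of the initial bundle
app.on("initSuccess", function () {
    // second category: fetched only after the core has booted,
    // most important modules first
    require(["wiki", "wysiwyg"], function (wiki, wysiwyg) {
        // register the lazily loaded features
    });
    // even less urgent modules can wait for explicit user actions, e.g.
    // require(["colorpicker"], function (colorPicker) { ... });
});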

How can one create a reusable library structured as AMD module(s)?

We are creating a framework that we intend to use across multiple projects. All projects will use require.js to manage modules and dependencies.
Ideally I'd like to use the r.js optimizer to compile the framework into a single file that can be provided to the applications that use it. That file will contain all modules of the framework such that in my application I can write code like:
define(["framework/util/a", "framework/views/b"], function(A, B) {
var a = new A();
// etc...
});
But it appears there are two problems with this approach.
1. Depending on framework/util/a doesn't tell require.js that it needs to load framework.js, in which it will find util/a.
2. The optimizer generates names for all modules included in framework.js, such as define("util/a", function() { ... });. Even if require.js loaded framework.js, nothing tells it that the defined module util/a is relative to framework and should therefore be identified as framework/util/a.
Am I missing something or is a better approach to structure my framework as a CommonJS package and use require.js's packages configuration option?
Re: 1. It seems that r.js optimisation was indeed not designed to optimise partial dependency trees, as lazy loading hinges on file paths. E.g. asking for path/to/module to actually load path/to would seem like a hack. One solution would be to forgo lazy loading and include framework-built.js above your application code.
Re: 2. So you'll now need your framework-built.js with full paths. One way would be to build a dummy parent that requires all of the framework, say dummy-framework.js. That way your dummy-framework-built.js will have the full-path defines for the framework, and if not lazy loaded, it should work fine.
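A sketch of such a dummy parent (the module ids are illustrative):

// dummy-framework.js -- exists only to drive the build; it exports nothing
define([
    "framework/util/a",
    "framework/views/b"
    // ...every other public framework module
], function () {
    // intentionally empty
});

Pointing the r.js name option at dummy-framework then produces a dummy-framework-built.js whose defines carry the full framework/... ids.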
Disclaimer: I haven't used require.js all that much, though that's my best effort :)

Javascript dependency management and packaging

I'm trying to figure out how to best manage JavaScript file dependencies and have that drive the packaging of a 100% front-end app. In short, I'm building an application using backbone.js along with some other libraries. I want an organized codebase and would like to be able to declare dependencies within each file. Ideally those declarations would a) drive the order in which files are loaded by the browser (while in development I want the files to load separately) and b) drive the order in which the packaging scripts concatenate them (I'm aiming to serve a single file for the entire app).
I've been reading on requirejs and commonjs but I'm not convinced.
I have a simple shell script right now that uses cat <file> <file> <file> <file> > concatenated.file to do what I want, but it's a pain to keep that list of files up to date and in the right order. It'd be much easier to be able to declare the dependencies at the beginning of each JavaScript file and have the packager and loaders be smart about using that information to concatenate/load scripts.
Any suggestions?
Thank you,
Luis
I am partial to StealJS myself. It's part of JavaScriptMVC, but there's no reason why you can't use it with Backbone.js.
The nice part about this one is that it builds your app for production including minifying your css and js and neatly packing all of it into 2 files: production.css and production.js.
It can handle loading non-JS files too, so you can do things like steal('somefile.css').then(function() {...});
For files, it's very much like what you would do in other languages:
steal(dep1, dep2, dep3).then(function () {
    // code
});
For complex front-end apps, the Asynchronous Module Definition (AMD) format is the best choice, and there are a lot of great loaders that support AMD (curl.js, RequireJS).
I recommend these articles to learn about modern approaches to JavaScript dependency management:
Writing Modular JavaScript With AMD, CommonJS & ES Harmony
Why AMD?
For packaging, take the CommonJS specifications into account; there are a few implementations, and it's a matter of taste, but in any case I recommend choosing tools that are compliant with one of those specifications.
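For illustration, a minimal AMD module that declares its dependencies at the top of the file (the module ids are hypothetical):

// views/item-view.js
define(["backbone", "models/item"], function (Backbone, Item) {
    // the loader fetches backbone and models/item before this factory runs
    return Backbone.View.extend({
        initialize: function () {
            this.model = new Item();
        }
    });
});

A loader such as RequireJS or curl.js reads that dependency list to load files in the right order during development, and the r.js optimizer reads the same list when concatenating everything for production.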
It'd be much easier to be able to declare the dependencies at the beginning of each JavaScript file and have the packager and loaders be smart about using that information to concatenate/load scripts.
I had the same idea several months ago and am working on a dependency resolver for my Resource Builder, which already makes things easier for me (including the need to distinguish between the development and deployed versions, with the optional debug parameter).
JsDoc Toolkit (and related efforts), whose syntax is supported e.g. by Eclipse JSDT, provides a @requires tag, so you could use that. But resolving dependencies is still not a trivial task (as you can see in ResourceBuilder::resolveDeps()). (The ultimate goal is to use static code analysis to resolve dependencies automatically, without any tags.)
This would reduce the current
<script type="text/javascript" src="builder?src=object,types,dom,dom/css"></script>
to a mere
<script type="text/javascript" src="builder?src=dom/css"></script>
As for asynchronous loaders: the good thing about them is that they are fast. The bad thing is that – if they work at all; they are all based on a non-standard approach – they are so fast that you cannot be sure the features one script provides are available in the following scripts. So you have to have your code executed by their listeners. I recommend avoiding them unless you really have features in your application that are only needed on demand.
