How can you ensure that angular module dependencies get resolved? - javascript

Angular's documentation on modules (http://docs-angularjs-org-dev.appspot.com/guide/module) says:
Dependencies
Modules can list other modules as their dependencies.
Depending on a module implies that the required module needs to be loaded
before the requiring module is loaded. In other words, the
configuration blocks of the required modules execute before the
configuration blocks of the requiring module. The same is true for the
run blocks. Each module can only be loaded once, even if multiple
other modules require it.
I created this example (http://jsbin.com/IRogUxA/34/edit) which creates a controller module that depends on two "mid-level" modules, each of which depends on two "low-level" modules. So, I have two "mid-level" modules and four "low-level" modules.
Clearly, order does not matter in the JS source. In the example above I've defined the high level modules before the low level ones they reference. I understand that Angular makes use of dependency injection to wire up the dependencies, but the way it does so is mysterious to me.
My question: How does one ensure that the config blocks of the various modules are run in the proper order? Or more broadly, how is it that Angular can resolve all of my dependencies when they are defined in any order I choose (within the JS source code)?

All of Angular's module API methods, such as config, factory, etc., are wrapped in an "invokeLater" function. In other words, when a dependency module is evaluated, module.config, module.factory, etc. are not actually called at that time. Instead, those invocations are simply pushed into a queue.
Consider this example:
var demo = angular.module('demo', ['module1']);

demo.config(function() {
  console.log("In app.config");
}).run(function() {
  console.log("Angular run");
});

angular.module("module1", []).factory('myservice', function() {
  return {};
}).controller('mycontroller', function() {});
Each module has its own queue. For the main module "demo":
  var invokeQueue = [];
  invokeQueue.push("demo.config", xxx);
  invokeQueue.push("demo.run", xxx);
For module1:
  var invokeQueue = [];
  invokeQueue.push("module.factory", xxx);
  invokeQueue.push("module.controller", xxx);
Once all the scripts are loaded and the DOMContentLoaded event has fired, Angular actually starts loading/evaluating the modules. At this point Angular has already constructed the full module dependency tree. A dependency module is always loaded before the module that requires it, so in this case module1 is loaded first and its invokeQueue is invoked in the original order (module.factory, module.controller, etc.). Then Angular goes back to the main module demo's invokeQueue: demo.config, then demo.run.
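A simplified sketch of that "record now, invoke later" idea (this is not Angular's actual source; the names and structure are made up for illustration):

  function createModule(name, requires) {
    var invokeQueue = [];    // factory/controller/run registrations, replayed at bootstrap
    var configBlocks = [];   // config registrations, replayed first
    var moduleInstance = {
      name: name,
      requires: requires,
      _invokeQueue: invokeQueue,
      _configBlocks: configBlocks,
      factory: invokeLater(invokeQueue, 'factory'),
      controller: invokeLater(invokeQueue, 'controller'),
      run: invokeLater(invokeQueue, 'run'),
      config: invokeLater(configBlocks, 'config')
    };
    return moduleInstance;

    function invokeLater(queue, method) {
      return function () {
        // just remember what was asked for; nothing executes yet
        queue.push([method, arguments]);
        return moduleInstance; // keep the API chainable
      };
    }
  }

  var demo = createModule('demo', ['module1']);
  demo.config(function () {}).run(function () {});
  // demo._configBlocks and demo._invokeQueue each hold one entry now,
  // but none of the registered functions has run yet.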

I think it helps to think of modules as their own applications, not relying on ordering of other (external) dependencies.
If order is important, then you can introduce a module that simply composes other modules and coordinates their interactions.
We avoid taking hard module references in our angular.module({moduleName}, [deps]) calls, preferring to have those rolled up by a higher-level module. That makes testing in isolation much easier, and you can stub out the services you rely on with lighter-weight stand-ins.
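For example (module names below are made up), a thin composition module does the roll-up while the feature modules stay free of hard references to each other:

  // Feature modules declare only what they truly need.
  angular.module('app.search', []);
  angular.module('app.cart', []);

  // A higher-level module composes them and coordinates their interaction.
  angular.module('app', ['app.search', 'app.cart'])
    .run(function () {
      // cross-module wiring lives here, not inside the feature modules
    });

In tests you can then load 'app.search' on its own and supply lightweight stubs for anything it would normally get from the rest of the application.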

Related

How to make sure a module is loaded in NodeJS

This is a problem I have faced more than once. Here is an example file structure:
app.js
folder
-- index.js
-- module1.js
-- module2.js
From app.js, the entry point of my application, I require folder/index.js. This file itself is just a loader that requires all the other files in the directory. module1.js and module2.js both define some methods I want to use eventually. They are never directly required by app.js, since index.js takes care of that, adds common helper utilities, and applies some transformations to these modules.
All works well if, from app.js, I use methods defined in those files after requiring them. The problem comes when a method defined in module2.js wants to use a method defined in module1.js (or vice versa). Sometimes it works, sometimes it doesn't (in folders with multiple files I haven't yet figured out precisely when it works and when it doesn't).
If, from module2.js, I require ./index.js and then use methods in it defined by module1.js, I sometimes get Cannot read property 'myMethod' of undefined. I assume it has to do with the order in which the modules are required by index.js, but I can't seem to find a solution to this common problem (other than duplicating code or not using methods of these other modules).
Is there a common solution to this problem? I tried doing something like this :
var modules = require('./index.js');
exports.myMethod = function() {
  if (!modules.module1 || !modules.module1.myOtherMethod) {
    modules = require('./index.js');
  }
  modules.module1.myOtherMethod();
};
But it doesn't do anything; modules.module1 is still undefined.
It just sounds like module1 should require module2. That's the solution to module1 needing to call methods in module2.
If you're worried about many calls to require, don't. That's standard practice in every programming language I've used (in particular look at the import statements in any Java program. Java is verbose, I know, but this is correct practice.)
It's definitely bad practice to look at code in one module and have no idea where anything comes from, because the require statements are missing.
You have a circular dependency problem. Either move the common functions into a third file that both module1 and module2 require, or make sure the dependency only goes one way (one of them requires the other, never both).
Never ever require a file that requires the current file back.
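A minimal sketch of the third-file approach (file names and functions here are illustrative):

  // shared.js -- the code both modules need
  exports.formatName = function (name) {
    return String(name).trim().toLowerCase();
  };

  // module1.js
  var shared = require('./shared');
  exports.greet = function (name) {
    return 'Hello, ' + shared.formatName(name);
  };

  // module2.js
  var shared = require('./shared');
  exports.label = function (name) {
    return 'User: ' + shared.formatName(name);
  };

Neither module1 nor module2 requires the other (and neither requires index.js back), so the cycle disappears and require order no longer matters.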

RequireJS order of dependencies

If you have a RequireJS module like so:
define([
  '#patches',
  'backbone',
  'underscore',
  'react',
  '#allCollections',
  '#allModels',
  'app/js/routers/router',
  '#allTemplates',
  '#allControllers',
  '#allRelViews'
], function() {
  var patches = arguments[0];
});
Is there any way to know which dependency gets loaded first? In my case, '#patches' is a few window.X utility functions that I want to load before anything else. Do I need to configure this a different way to ensure this?
(In my case, '#' is just my own notation to denote a module whose path is predefined in my main config file.)
From the documentation: http://requirejs.org/docs/api.html#mechanics
"RequireJS waits for all dependencies to load, figures out the right order in which to call the functions that define the modules, then calls the module definition functions once the dependencies for those functions have been called. Note that the dependencies for a given module definition function could be called in any order, due to their sub-dependency relationships and network load order."
I think this may help: http://www.sitepoint.com/understanding-requirejs-for-effective-javascript-module-loading/ (see "Managing the Order of Dependent Files")
RequireJS uses Asynchronous Module Loading (AMD) for loading files. Each dependent module will start loading through asynchronous requests in the given order. Even though the file order is considered, we cannot guarantee that the first file is loaded before the second file due to the asynchronous nature. So, RequireJS allows us to use the shim config to define the sequence of files which need to be loaded in correct order. Let’s see how we can create configuration options in RequireJS.
requirejs.config({
  shim: {
    'source1': ['dependency1', 'dependency2'],
    'source2': ['source1']
  }
});
Hope it helps
EDIT: As noted in the comments, using shim for AMD modules is a bad idea; use shim only for non-AMD modules and manage their dependency order there.
For AMD modules, RequireJS will manage the load order itself.
A good link from the comments (thanks Daniel Tulp) ==> Requirejs why and when to use shim config
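With proper AMD modules, the usual way to guarantee that '#patches' is ready first is simply to make it a dependency of whatever needs it, or to load it before booting the rest of the app. A rough sketch (the module ids are the question's own aliases; the rest is assumed):

  // Load the patch utilities first, then everything that relies on them.
  require(['#patches'], function (patches) {
    // the window.X helpers now exist
    require(['backbone', 'app/js/routers/router'], function (Backbone, Router) {
      new Router();
      Backbone.history.start();
    });
  });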

How do you manage component dependency order with Facebook React?

Say I have two React Components, A and B, where B depends upon (makes use of) A. Say that A is in a.js and B is in b.js.
Is there a way in React to safely resolve a dependency from B to A? Thereby guaranteeing that regardless of what order I actually include a.js and b.js, things will still resolve correctly?
I believe that the Google Closure compiler effectively fakes a dependency system for you in both development and production mode. This makes the order that different code is included in the source irrelevant; is there something like this for React?
Note: the answer below is very outdated.
I stopped using RequireJS a long time ago because it gets rather slow in development, since it doesn't bundle the JS into a single file during development.
These days I’m mostly using Webpack.
You may also consider other bundlers such as Browserify and Rollup.
I use ES modules as the module system, transpiled by Babel into CommonJS. Eventually I plan to switch to ES modules completely as more bundlers with first-class support for them, such as Rollup and Webpack 2, become mainstream.
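For a sense of what that looks like, here is roughly how the login widget from the old answer below would be written as an ES module (paths and names are illustrative):

  // login_widget.js -- illustrative ES module version, bundled by Webpack
  import React from 'react';
  import JoinWidget from './common/views/join_widget';

  export default function LoginWidget(props) {
    // the bundler resolves and orders the imports above, so script
    // inclusion order in the HTML no longer matters
    return <JoinWidget />;
  }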
Old Answer
To my knowledge, React itself doesn't provide any dependency management system; if you need one, you must use some existing tool that fits your workflow.
Personally, I'm happy using RequireJS with AMD sugar syntax so my modules look like this:
/** @jsx React.DOM */
/* jshint trailing:false, quotmark:false, newcap:false */
define(function (require) {
  'use strict';
  var React = require('react'),
      _ = require('underscore'),
      JoinWidget = require('common/views/join_widget');
  var LoginWidget = React.createClass({
    // ...
  });
});
Each React view, as any other class, gets its own file.
This is fine in a development environment, but I don't want to load scripts on the fly in production, so I use the grunt-requirejs task with almond to concatenate the files in the proper order, so they are always loaded synchronously from a single file.
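The r.js build options that grunt-requirejs passes through might look roughly like this; every path and module name below is a placeholder for an assumed project layout, not the exact config:

  {
    baseUrl: 'app/js',
    // replace the dynamic loader with almond inside the built file (path is an assumption)
    name: '../vendor/almond',
    include: ['main'],          // the application entry module
    insertRequire: ['main'],    // kick the app off once the single bundle has loaded
    out: 'dist/app.min.js',
    optimize: 'uglify2'
  }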

Is it possible to set a non-module *.js file as a dependency using require.js

We do not use require.js for implementing modules in our JS source, but I want to use it for tests. And there is a problem: I couldn't set a raw *.js file as a dependency for other modules. Is that possible?
I mean: load some *.js file, and then the modules after it (in order to test it).
How define works
I use require.js for both implementation and tests. You can load any JavaScript file as a dependency before the execution of the module function using define.
define(["test/myTest.js", "test/anotherTest.js"], function(test1, test2) {
// perform your tests
});
How to use requirejs with asyncTests
You can also load code after the dependencies are loaded inside the module function using require. I use it with QUnit. Here is an example from my code.
First, make sure the QUnit test runner is stopped by default (this will be similar with other test frameworks). This way, you can decide when the tests are going to run (that is, after you have loaded the relevant code).
QUnit.config.autostart = false
Second, you define your test as a module. The module loads the dependencies, then defines the tests, then loads the code to be tested. This is only necessary when the code is self-executing and cannot be loaded beforehand (in which case you could just use define and be done with it). Here is my example using the Chaplin library (which is written in CoffeeScript).
define(['chaplin'], function(chaplin) {
  asyncTest("publish startup complete event", function() {
    chaplin.mediator.subscribe("startup~complete", function() {
      ok(true, "startup~complete event fired");
    });
    return requirejs(['modules/startup/startup'], function() {
      start();
    });
  });
});
The important part is the last requirejs call. It loads the code to be tested after the tests are defined.
dynamically loading dependencies
EDIT: Responding to comment
Assume there is a module called config that contains the configuration data. I am also assuming a certain format, so if your format is different you may need to make some minor changes. The principle holds true, though.
define(["config"], function(config) {
// assuming config.modules is an array of all development modules,
// config.devPath is the base bath to development modules,
requirejs(
config.modules.map(function(module){
return config.devPath + module
})
, function() {
// all modules loaded, now go on
// however, no reference to modules left, so need to work with `arguments` array
});
});
However, you should know you lose the reference to your modules in the callback function.
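If you do need handles to the loaded modules, one option (sketch only; moduleIds stands for the mapped array built above) is to zip the arguments object back up with the ids you requested:

  var moduleIds = config.modules.map(function(module) {
    return config.devPath + module;
  });
  requirejs(moduleIds, function () {
    var loaded = {};
    // map each positional argument back to the id it was requested with
    Array.prototype.slice.call(arguments).forEach(function (mod, i) {
      loaded[moduleIds[i]] = mod;
    });
    // loaded[someId] now holds that module's exported value
  });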

Disposing a RequireJS Required Class

I'm using RequireJS to dynamically load JS modules in my HTML5 single-page web application. I just want to know whether I should dispose of the RequireJS-loaded modules once I have finished with them (so the garbage collector can clean them up). And if so, how do you dispose of a required module in RequireJS?
Thanks in advance
It is possible to undefine a module:
There is a global function, requirejs.undef(), that allows undefining
a module. It will reset the loader's internal state to forget about
the previous definition of the module. However, it will not remove the
module from other modules that are already defined and got a handle on
that module as a dependency when they executed. So it is really only
useful to use in error situations when no other modules have gotten a
handle on a module value, or as part of any future module loading that
may use that module. See the errback section for an example. If you
want to do more sophisticated dependency graph analysis for undefining
work, the semi-private onResourceLoad API may be helpful.
http://requirejs.org/docs/api.html#undef
I'm not too sure on the internals but in my usage of the lib I've not found it necessary to do any manual clean up of modules.
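For completeness, undefining looks like this (the module id below is hypothetical):

  // Make the loader forget its cached definition of this module id.
  // Modules that already captured the old value keep it; only future
  // require() calls will re-fetch and re-execute the file.
  requirejs.undef('app/widgets/chart');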
