Building AMD code for Node and browser - javascript

I have been searching around various repositories and blogs around the web and can't seem to find what I am looking for.
The Problem
When building code via AMD using something like RequireJS you can expect to set up your modules like so:
Module one:
define(['a'], function () {
    return {
        methodOne: function () {
            return 'something';
        }
    };
});
Module two:
define(['b'], function () {
    return {
        methodTwo: function () {
            return 'something';
        }
    };
});
This helps a lot when building a project: it separates concerns into multiple files rather than having one large one.
But how do you optimize this into one file? I would assume that you would use some sort of build tool and come out with something like this for AMD environments:
define(function () {
    return {
        /* other dependency functions */
        methodOne: function () {
            return 'something';
        },
        methodTwo: function () {
            return 'something';
        }
    };
});
This would essentially just combine all of the modules into one file.
So I have a few questions:
How would I optimize my AMD code correctly in a build like I've mentioned above? Is this even the best approach?
How do I make my code portable for AMD, Node, and the browser?
What I've Tried/Found
It looks like RequireJS has its own build tool (http://requirejs.org/docs/optimization.html) to do something like this. After going through the documentation and trying out the build tool for myself, I've found that it just combines all of the defines into one file. Am I missing something here? This isn't very useful.
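For reference, the build config I tried looked roughly like this (the paths and module name here are just placeholders for my setup):

// build.js -- run with: node r.js -o build.js
({
    baseUrl: "js",
    name: "main",              // entry module whose dependency tree gets traced
    out: "dist/main-built.js",
    optimize: "none"           // or "uglify2" for a minified build
})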
The UMD pattern seems to be what normalizes your code to work with AMD, Node and the browser (http://know.cujojs.com/tutorials/modules/authoring-umd-modules, https://github.com/umdjs/umd). Does this mean I have to convert every AMD file I am using to UMD? This seems like a headache.
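For example, if I understand the UMD "returnExports" pattern correctly, converting my module two from above would look roughly like this (the browser global name is my own guess):

(function (root, factory) {
    if (typeof define === 'function' && define.amd) {
        // AMD
        define(['b'], factory);
    } else if (typeof module === 'object' && module.exports) {
        // Node / CommonJS
        module.exports = factory(require('b'));
    } else {
        // Browser globals
        root.moduleTwo = factory(root.b);
    }
}(this, function (b) {
    return {
        methodTwo: function () {
            return 'something';
        }
    };
}));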
I have stumbled upon uRequire (http://urequire.org/) which seems like it's the answer I am looking for. Being able to build code into UMD from AMD or CommonJS pattern.
This is an issue that has been eluding me. UMD looks cool, but it seems like the hangup there is making your code available for the browser easily.
Any help with this would be much appreciated.

Regarding the RequireJS optimizer's usefulness: I have been using the RequireJS r.js optimizer for over a year to create bundles of modules that have to come together (and synchronously).
This has been very helpful because I save on the number of requests and also on the asynchronous overhead of require figuring out the whole dependency tree, because that simply doesn't need to happen.
For instance, if you think in terms of broader modules, a.k.a. sub-applications or even features, you can restructure your app so that instead of loading 40+ RequireJS modules you load just 5 sub-apps/features, and each of those blocks comes bundled and loads synchronously.
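For illustration, a rough sketch of the kind of r.js build config I mean (the sub-app names are placeholders):

// build.js (sketch)
({
    baseUrl: "js",
    dir: "dist",
    modules: [
        { name: "subAppA" },
        { name: "subAppB", exclude: ["subAppA"] }
    ]
})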
Regarding UMD and uRequire, I suggest also having a look at https://github.com/systemjs/systemjs - it seems to be a clever approach. Yet uRequire seems to be very nice as well.

Related

MongoDB map-reduce (via nodejs): How to include complex modules (with dependencies) in scopeObj?

I'm working on a complicated map-reduce process for a mongodb database. I've split some of the more complex code off into modules, which I then make available to my map/reduce/finalize functions by including it in my scopeObj like so:
const scopeObj = {
    userCalculations: require('../lib/userCalculations')
};
function myMapFn() {
    let userScore = userCalculations.overallScoreForUser(this);
    emit({
        'Key': this.userGroup
    }, {
        'UserCount': 1,
        'Score': userScore
    });
}
function myReduceFn(key, objArr) { /*...*/ }
db.collection('userdocs').mapReduce(
    myMapFn,
    myReduceFn,
    {
        scope: scopeObj,
        query: {},
        out: {
            merge: 'userstats'
        }
    },
    function (err, stats) {
        return cb(err, stats);
    }
);
...This all works fine. I had until recently thought it wasn't possible to include module code into a map-reduce scopeObj, but it turns out that was just because the modules I was trying to include all had dependencies on other modules. Completely standalone modules appear to work just fine.
Which brings me (finally) to my question. How can I -- or, for that matter, should I -- incorporate more complex modules, including things I've pulled from npm, into my map-reduce code? One thought I had was using Browserify or something similar to pull all my dependencies into a single file, then include it somehow... but I'm not sure what the right way to do that would be. And I'm also not sure of the extent to which I'm risking severely bloating my map-reduce code, which (for obvious reasons) has got to be efficient.
Does anyone have experience doing something like this? How did it work out, if at all? Am I going down a bad path here?
UPDATE: A clarification of the issue I'm trying to overcome:
In the above code, require('../lib/userCalculations') is executed by Node -- it reads in the file ../lib/userCalculations.js and assigns the contents of that file's module.exports object to scopeObj.userCalculations. But let's say there's a call to require(...) somewhere within userCalculations.js. That call isn't actually executed yet. So, when I try to call userCalculations.overallScoreForUser() within the Map function, MongoDB attempts to execute the require function. And require isn't defined on mongo.
Browserify, for example, deals with this by compiling all the code from all the required modules into a single javascript file with no require calls, so it can be run in the browser. But that doesn't exactly work here, because I need the resulting code to itself be a module that I can use like I use userCalculations in the code sample. Maybe there's a weird way to run browserify that I'm not aware of? Or some other tool that just "flattens" a whole hierarchy of modules into a single module?
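One idea I have not fully explored is Browserify's standalone option, which (as far as I understand it) wraps the bundle as a single UMD module with no loose require calls. A rough, untested sketch of what I imagine the build script would look like, using the file from my example above:

// build-scope.js -- rough sketch, untested
const browserify = require('browserify');
const fs = require('fs');

browserify('./lib/userCalculations.js', { standalone: 'userCalculations' })
    .bundle()
    .pipe(fs.createWriteStream('./lib/userCalculations.bundle.js'));
// The resulting file should be a self-contained UMD module, but I'm not sure
// how (or whether) it can then be fed into scopeObj.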
Hopefully that clarifies a bit.
As a generic response, the answer to your question: How can I -- or, for that matter, should I -- incorporate more complex modules, including things I've pulled from npm, into my map-reduce code? - is no, you can not safely include complex modules in node code you plan to send to MongoDB for mapReduce jobs.
You mentioned the problem yourself - nested require statements. Now, require is synchronous, but if you have it nested inside functions, those require calls will not be executed until call time, and the MongoDB VM will throw at that point.
Consider the following example of three files: data.json, dep.js and main.js.
// data.json - just something we require "lazily"
true

// dep.js -- equivalent of your userCalculations
module.exports = {
    isValueTrue() {
        // The problem: nested require
        return require('./data.json');
    }
};

// main.js - from here you send your mapReduce to MongoDB.

// require dependency instantly
const calc = require('./dep.js');
// require is synchronous; the effect is the same if you do:
// const calc = (function () { return require('./dep.js'); })();
console.log('Calc is loaded.');

// Let's mess with unwary devs
require('fs').writeFileSync('./data.json', 'false');

// Is calc.isValueTrue() true or false here?
console.log(calc.isValueTrue());
As a general solution, this is not feasible. While the vast majority of modules will likely not have nested require statements, HTTP calls, internal service calls, global variables and the like, there are those that do. You cannot guarantee that this would work.
Now, as for your local implementation: if, for example, you require exact, specific versions of npm modules that you have tested well with this technique and know will work, or that you published yourself, it is somewhat feasible.
However, even in this case, if this is a team effort, there's bound to be a developer down the line who does not know where or how your dependency is used, who uses globals (not on purpose but by omission, e.g. they wrongly calculate this), or who simply does not know the implications of what they are doing. If you have a strong integration testing suite, you could guard against this, but the point is that it's unpredictable. Personally, I think that when you can choose between unpredictable and predictable, you should almost always choose predictable.
Now, if you have an explicitly stated purpose for a certain library to be used in MongoDB mapReduce, this could work. You would have to guard well against omissions and problems and have a strong testing infrastructure, but I would make certain the purpose is explicit before feeling safe enough to do this. And of course, if what you are doing is so complex that you need several npm packages for it, maybe you can have those functions directly on the MongoDB server, or do your mapReducing in something better suited to the purpose.
To conclude: for a purposefully built library with an explicit mission statement that it is to be used with Node and MongoDB mapReduce, I would make sure my tests cover all mission-critical and important functionality, and then import such an npm package. Otherwise I would not use nor recommend this approach.

Easy optional dependency on jQuery for RequireJS between Node.JS and browser

I'm using RequireJS, and trying to pack up a jQuery widget for easy usage into one file. Inside the widget's JavaScript code are a certain number of non-UI functions that don't call $-anything, that I'd like to export and be able to use on the server side.
(The shared routines that don't depend on jQuery used to be in a separate module called client-server-common.js. But like I said, I'm looking to reduce the number of files...and there's no real reason to be hung up on including the dead code for the widget on the server. So the widget can just subsume the common code.)
I'd like my only dependencies for the widget to be jQuery and underscore, with jQuery optional (and it degrades into what was client-server-common.js in that case). So the interface I'm looking for would be like:
require(['jquery', 'underscore'], function ($, _) {
    var exports = {...};
    if ($) {
        // Do the stuff that only makes sense in the client.
        // Totally fine if there is no jquery; in that case
        // all we probably care about are the exports.
        ...
    }
    // Do the stuff for both client and server
    ...
    // Return the exported functions
    return exports;
});
Reading up on what others have asked, I notice this answer to this question from January says "You cannot really set it optional":
requireJS optional dependency
That's not what I want. Is that the last word? :-/
It seems that I can get around this by installing the jquery NPM package, which apparently does stuff with the DOM somehow:
Can I use jQuery with Node.js?
My understanding is that if I added this dependency and just ignored it, I'd be okay. I might even find that there is some good reason to do DOM manipulation on the server (though I haven't thought about it enough to figure out why or how that would be useful; is there one?).
So should I add the dependency for simplicity's sake, and just not use it? Or can I rig it up so that I just give a jQuery of null on the server configuration, through some magic that doesn't involve waiting for timeouts and errors and hacks? Could someone make a "jquery-null" package that somehow just came back and gave you a {} or null for jQuery to smooth over situations like this?
Advice appreciated! :-/
The answer you mention says "You cannot really set it optional" and then gives a solution to operate even in the absence of modules. You can use an errback that will do nothing if running server-side.
The following code assumes that RequireJS' require call is available as requirejs. When loading RequireJS in a browser, this is the default (that is, after loading RequireJS requirejs === require is true.) Server-side, you would have to make it available with something like:
if (typeof window === "undefined")
    var requirejs = require('requirejs');
(The above code would obviously fail if there is something in Node that sets window globally. I've never run into this problem.)
Once the above is taken care of we can do:
requirejs(['underscore'], function (_) {
    var exports = {...};
    requirejs(['jquery'], function ($) {
        // This will execute only if jquery is present.
        // Do the stuff that only makes sense in the client.
        // Totally fine if there is no jquery; in that case
        // all we probably care about are the exports.
        ...
    }, function (err) {
        // This will execute if there is an error.
        // If server-side, do nothing. If client-side, scream!
    });
    // Do the stuff for both client and server
    ...
    // Return the exported functions
    return exports;
});
The answer you mentioned and RequireJS' documentation (which I linked above) mention checking the id of the failed module and undefining it. I'm quite certain that you would not need to undefine it since you won't try to load it again from a different place. Checking the module id would be a way to future-proof your code if someday jQuery depends on some other thing. Right now, loading jQuery just loads jQuery so if there is a failure it cannot be any other module than jQuery.
I would not include the actual code of jQuery server-side unless I'd have an actual substantial reason for it. What I would do if I wanted to get rid of the errback, for whatever reason, would be to have a build of my code for server-side use that includes a fake jQuery module. Something like:
define(function () {
    return "I'm totally fake";
});
And then test it:
requirejs(['jquery'], function ($) {
    if ($ !== "I'm totally fake") {
        // Do the real deal.
    }
});
If you do eventually need jQuery server-side, you'll have to also install something like jsdom. It used to be that installing jQuery with npm install jquery would include jsdom in the installation but I think this has changed recently.
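For completeness, a rough sketch of what that server-side setup could look like with a recent jsdom and the jquery npm package (the exact API has changed between versions, so treat the details as an assumption):

// Node-only sketch
const { JSDOM } = require('jsdom');
const window = new JSDOM('<!doctype html><html><body></body></html>').window;
// When there is no global window, the jquery package exports a factory
// that must be given one explicitly.
const $ = require('jquery')(window);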
I'm using RequireJS, and trying to pack up a jQuery widget for easy usage into one file. [..] I'd like my only dependencies for the widget to be jQuery and underscore, with jQuery optional [..] Reading up on what others have asked, I notice this answer to this question from January says "You cannot really set it optional" [..] That's not what I want. Is that the last word? :-/
No, it's not the last word.
This can actually be done simply and elegantly. I've described how to use an AMD module only when it's loaded in my answer to that question, but I'll repeat the gist of it here:
define(['require'], function (require) {
    if (require.defined('jquery')) {
        var $ = require('jquery');
        $.fn.something = function () {};
    }
});
We don't add the dependency directly, but instead manually require it only when it's already defined so we don't trigger any requests.
This form, in which we declare a dependency on require and then use it inside our module, is actually the one recommended by the RequireJS author, but my experimentation indicates that this also works:
define(require.defined('jquery') ? ['jquery'] : [], function ($) {
    if ($) {
        // Yippee, all jQuery awesomeness is available
    }
});

Approaches to modular client-side Javascript without namespace pollution

I'm writing client-side code and would like to write multiple, modular JS files that can interact while preventing global namespace pollution.
index.html
<script src="util.js"></script>
<script src="index.js"></script>
util.js
(function () {
    function helper() {
        // Performs some useful utility operation
    }
})();
index.js
(function () {
    console.log("Loaded index.js script");
    helper();
    console.log("Done with execution.");
})();
This code nicely keeps utility functions in a separate file and does not pollute the global namespace. However, the helper utility function will not be executed because 'helper' exists inside a separate anonymous function namespace.
One alternative approach involves placing all JS code inside one file or using a single variable in the global namespace like so:
var util_ns = {
    helper: function () {
        // Performs some useful utility operation.
    }
    // etc.
};
Both these approaches have cons in terms of modularity and clean namespacing.
I'm used to working (server-side) in Node.js land where I can 'require' one Javascript file inside another, effectively injecting the util.js bindings into the index.js namespace.
I'd like to do something similar here (but client-side) that would allow code to be written in separate modular files while not creating any variables in the global namespace while allowing access to other modules (i.e. like a utility module).
Is this doable in a simple way (without libraries, etc)?
If not, in the realm of making client-side JS behave more like Node and npm, I'm aware of the existence of RequireJS, Browserify, AMD, and the CommonJS standardization attempts. However, I'm not sure of the pros and cons and actual usage of each.
I would strongly recommend you go ahead with RequireJS.
The modules support approach (without requires/dependencies):
// moduleA.js
var MyApplication = (function (app) {
    app.util = app.util || {};
    app.util.hypotenuse = function (a, b) {
        return Math.sqrt(a * a + b * b);
    };
    return app;
})(MyApplication || {});

// ----------
// moduleB.js
var MyApplication = (function (app) {
    app.util = app.util || {};
    app.util.area = function (a, b) {
        return a * b / 2;
    };
    return app;
})(MyApplication || {});

// ----------
// index.js - here you have to include both moduleA and moduleB manually
// or write some loader
var a = 3,
    b = 4;
console.log('Hypotenuse: ', MyApplication.util.hypotenuse(a, b));
console.log('Area: ', MyApplication.util.area(a, b));
Here you're creating only one global variable (namespace) MyApplication, all other stuff is "nested" into it.
Fiddle - http://jsfiddle.net/f0t0n/hmbb7/
One more approach that I used earlier in my projects: https://gist.github.com/4133310. But anyway, I threw out all that stuff when I started to use RequireJS.
You should check out browserify, which will process a modular JavaScript project into a single file. You can use require in it as you do in node.
It even gives a bunch of the node.js libs like url, http and crypto.
ADDITION: In my opinion, the pro of Browserify is that it is simple to use and requires no code of its own - you can even use your already-written Node.js code with it. There's no boilerplate code or code change necessary at all, and it's as CommonJS-compliant as Node.js is. It outputs a single .js file that allows you to use require in your website code, too.
There are two cons to this, IMHO: first, two files that were compiled by Browserify can override each other's require functions if they are included in the same page, so you have to be careful there. Second, you of course have to run Browserify every time you change the code. And the module system code is always part of your compiled file.
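For concreteness, a minimal sketch of how the question's util.js/index.js pair might look as CommonJS modules under Browserify (the file names are the question's; the build command shown in the comment is the standard CLI invocation):

// util.js
exports.helper = function () {
    // Performs some useful utility operation
};

// index.js
var util = require('./util');
console.log("Loaded index.js script");
util.helper();
console.log("Done with execution.");

// Build step (shell): browserify index.js -o dist/bundle.js
// index.html then references only: <script src="dist/bundle.js"></script>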
I strongly suggest you try a build tool.
Build tools will allow you to have different files (even in different folders) when developing, and concatenate them at the end for debugging, testing or production. Even better, you won't need to add a library to your project: the build tool resides in separate files and is not included in your release version.
I use GruntJS, and basically it works like this. Suppose you have your util.js and index.js (which needs the helper object to be defined), both inside a js directory. You can develop both separately, and then concatenate both to an app.js file in the dist directory that will be loaded by your html. In Grunt you can specify something like:
concat: {
    app: {
        src: ['js/util.js', 'js/index.js'],
        dest: 'dist/app.js'
    }
}
This will automatically create the concatenation of the files. Additionally, you can minify them, lint them, and apply any other processing you want. You can also have them in completely different directories and still end up with one file packaged with your code in the right order. You can even trigger the process every time you save a file, to save time.
At the end, from HTML, you would only have to reference one file:
<script src="dist/app.js"></script>
Adding a file that resides in a different directory is very easy:
concat: {
    app: {
        src: ['js/util.js', 'js/index.js', 'js/helpers/date/whatever.js'],
        dest: 'dist/app.js'
    }
}
And your html will still only reference one file.
Some other available tools that do the same are Brunch and Yeoman.
-------- EDIT -----------
RequireJS (and some alternatives, such as Head JS) is a very popular AMD (Asynchronous Module Definition) loader which lets you simply specify dependencies. A build tool (e.g., Grunt), on the other hand, allows managing files and adding more functionality without relying on an external library. On some occasions you can even use both.
I think having the file dependencies / directory issues / build process separated from your code is the way to go. With build tools you have a clear view of your code and a completely separate place where you specify what to do with the files. It also provides a very scalable architecture, because it can work through structure changes or future needs (such as including LESS or CoffeeScript files).
One last point, having a single file in production also means less HTTP overhead. Remember that minimizing the number of calls to the server is important. Having multiple files is very inefficient.
Finally, this is a great article on AMD tools vs build tools, worth a read.
So called "global namespace pollution" is greatly over rated as an issue. I don't know about node.js, but in a typical DOM, there are hundreds of global variables by default. Name duplication is rarely an issue where names are chosen judiciously. Adding a few using script will not make the slightest difference. Using a pattern like:
var mySpecialIdentifier = mySpecialIdentifier || {};
means adding a single variable that can be the root of all your code. You can then add modules to your heart's content, e.g.
mySpecialIdentifier.dom = {
    /* add dom methods */
};
(function (global, undefined) {
    if (!global.mySpecialIdentifier) global.mySpecialIdentifier = {};
    /* add methods that require feature testing */
}(this));
And so on.
You can also use an "extend" function that does the testing and adding of base objects so you don't replicate that code and can add methods to base library objects easily from different files. Your library documentation should tell you if you are replicating names or functionality before it becomes an issue (and testing should tell you too).
Your entire library can use a single global variable and can be easily extended or trimmed as you see fit. Finally, you aren't dependent on any third party code to solve a fairly trivial issue.
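For illustration, a rough sketch of what such an "extend" helper could look like (this is hypothetical, not taken from any particular library):

var mySpecialIdentifier = mySpecialIdentifier || {};

mySpecialIdentifier.extend = function (name, members) {
    var target = mySpecialIdentifier[name] = mySpecialIdentifier[name] || {};
    for (var key in members) {
        if (members.hasOwnProperty(key)) {
            if (target.hasOwnProperty(key)) {
                // surface name clashes during development instead of silently overwriting
                throw new Error('Duplicate member: ' + name + '.' + key);
            }
            target[key] = members[key];
        }
    }
    return target;
};

// util.js
mySpecialIdentifier.extend('dom', {
    byId: function (id) { return document.getElementById(id); }
});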
You can do it like this:
-- main.js --
var my_ns = {};
-- util.js --
my_ns.util = {
    map: function () {}
    // .. etc
};
-- index.js --
my_ns.index = {
    // ..
};
This way you occupy only one variable.
One way of solving this is to have your components talk to each other using a "message bus". A message (or event) consists of a category and a payload. Components can subscribe to messages of a certain category and can publish messages. This is quite easy to implement, but there are also some out-of-the-box solutions out there. While this is a neat solution, it also has a great impact on the architecture of your application.
Here is an example implementation: http://pastebin.com/2KE25Par
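For a rough idea of the shape of such a bus (a minimal sketch, not the linked implementation; it assumes string categories and synchronous handlers):

var bus = (function () {
    var handlers = {};
    return {
        subscribe: function (category, fn) {
            (handlers[category] = handlers[category] || []).push(fn);
        },
        publish: function (category, payload) {
            (handlers[category] || []).forEach(function (fn) {
                fn(payload);
            });
        }
    };
})();

// util.js
bus.subscribe('log', function (msg) { console.log(msg); });

// index.js
bus.publish('log', 'Loaded index.js script');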
http://brunch.io/ should be one of the simplest ways if you want to write node-like modular code in your browser without async AMD hell. With it, you’re also able to require() your templates etc, not just JS files.
There are a lot of skeletons (base applications) which you can use with it and it’s quite mature.
Check the example application https://github.com/paulmillr/ostio to see some structure. As you may notice, it's written in CoffeeScript, but if you want to write in JS, you can; Brunch doesn't care about languages.
I think what you want is https://github.com/component/component.
It's synchronous CommonJS just like Node.js,
it has much less overhead,
and it's written by visionmedia who wrote connect and express.

Is it a good idea to use conditional dependencies in AMD modules?

I'm thinking of using conditions for specifying module dependencies in the AMD module system, for example to load libraryA in the browser and libraryB on the server.
This could look like this:
define([window?"libraryA":"libraryB"],function(library){
//do some stuff
});
This would allow me to build an abstraction layer for 2 modules. But is this really a good idea? Are there any drawbacks in doing this?
That approach could cause problems for the build tool.
Update:
After further research, I find that config settings in your main JS file are not read by default by the optimizer. So, a cleaner solution would be to use a different map config for client and server.
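A rough sketch of that idea (module names are the question's; how you load the two configs is up to your build setup):

// client config
requirejs.config({
    map: { '*': { libraryAB: 'libraryA' } }
});

// server config
requirejs.config({
    map: { '*': { libraryAB: 'libraryB' } }
});

// dependent modules stay environment-agnostic
define(['libraryAB'], function (library) {
    // do some stuff
});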
Original:
A safer approach may be to define a module that adapts itself to the environment, thus keeping all the conditional code within your module definitions, and leaving all dependency lists in the most reliable format.
// dependent module
define(["libraryAB"], function (library) {
    // do some stuff
});

// libraryAB.js dependency module
define([], function () {
    return window ?
        defineLibraryA() :
        defineLibraryB();
});
You could alternatively keep the libraryA and libraryB code separate by defining libraryAB this way.
// libraryAB.js dependency module
define(["libraryA", "libraryB"], function (libraryA, libraryB) {
    return window ? libraryA : libraryB;
});

// define libraryA.js and libraryB.js as usual
If you want to avoid executing libraryA on the server or libraryB on the client, you could have these modules return functions and memoize the result if necessary.
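A rough sketch of that idea, assuming libraryA and libraryB each export a factory function:

// libraryAB.js
define(["libraryA", "libraryB"], function (createA, createB) {
    var instance;
    return function () {
        if (!instance) {
            instance = (typeof window !== "undefined") ? createA() : createB();
        }
        return instance;
    };
});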
The moral is that it's safest to keep all your non-standard code inside module definitions, keeping dependency lists nice and predictable.

Javascript modularity

I am studying the possibility of building Javascript applications in the Java / OSGi modularity style. I am sure I will not manage to actually reach the kind of flexibility that OSGi provides, but I would need to get to at least the following list:
split javascript code as modules, each module would lay in its own git repository
ideally, as few dependencies between modules as possible, and definitely no circular dependencies between modules
have 2 or more "main" javascript applications that will use the modules described above
If I manage to set up the above, then I will probably want to be able to organize the modules into layers: a core layer with several modules, a UI layer, and an applications layer.
Are there any JavaScript libraries that help build the above setup? Is something like this possible in JavaScript?
Note: When I say javascript, I don't actually mean plain javascript. I am going to use a library like ExtJS or jQuery for the UI part at least.
Check out RequireJS, an implementation of Asynchronous Module Definition (AMD).
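For a rough idea of how the layering described in the question could look with AMD modules (the layer and module names here are made up; each layer could live in its own repository and be mapped in via the paths config):

// core/math.js -- core layer, no dependencies
define([], function () {
    return { add: function (a, b) { return a + b; } };
});

// ui/widget.js -- ui layer, depends only on the core layer
define(['core/math'], function (math) {
    return { render: function (a, b) { return 'Sum: ' + math.add(a, b); } };
});

// apps/main.js -- one of the "main" applications
require(['ui/widget'], function (widget) {
    console.log(widget.render(1, 2));
});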
Since this question was answered, JavaScript has gained native implementations of classes and modules that can give you what you need without a third-party library.
ES modules allow you to have a module of code and import individual methods (or everything) from it, a bit like in Python and Ruby:
// utilities.js
function something() {
    // some great code
}

export function util() {
    something();
}

// main.js
// import a single method
import { util } from "./utilities.js";
// or import everything as a namespace object
import * as utilities from "./utilities.js";
you can find more on es6 modules here
JS classes are syntactic sugar over JavaScript's prototypal structure and can be used much the same way as classes in more traditional OOP languages:
class MyClass {
    constructor() {
        // constructor code
    }
    add() {
        // some method
    }
}
var myObject = new MyClass();
You can find out more about es6 classes here
