Packaging up browser/server CommonJS modules with dependencies - javascript

Let's say I'm writing a module in JavaScript which can be used in both the browser and on the server (with Node). Let's call it Module. And let's say that Module would benefit from methods in another module called Dependancy. Both of these modules have been written to be used by both the browser and the server, à la CommonJS style:
module.js
if (typeof module !== 'undefined' && module.exports)
  module.exports = Module; /* server */
else
  this.Module = Module; /* browser */
dependancy.js
if (typeof module !== 'undefined' && module.exports)
  module.exports = Dependancy; /* server */
else
  this.Dependancy = Dependancy; /* browser */
Obviously, Dependancy can be used straight out of the box in a browser. But if Module contains a var dependancy = require('dependancy'); statement in it, it becomes more of a hassle to 'maintain' the module.
I know that I could perform a global check for Dependancy within Module, like this:
var dependancy = this.Dependancy || require('dependancy');
But that means my Module has two added requirements for browser installation:
the user must include the dependancy.js file as a <script> in their document
and the user must make sure this script is loaded before module.js
Adding those two requirements throws out the idea of an easy-going modular framework like CommonJS.
The other option for me is that I include a second, compiled script in my Module package with dependancy.js bundled in using browserify. I then instruct users who are using the script in the browser to include this script, while server-side users use the un-bundled entry script outlined in the package.json. This is preferable to the first way, but it requires a pre-compilation process which I would have to run every time I changed the library (for example, before uploading to GitHub).
Is there any other way of doing this that I haven't thought of?

The two answers currently given are both very useful, and have helped me to arrive at my current solution. But, as per my comments, they don't quite satisfy my particular requirements of portability versus ease of use (both for the client and the module maintainer).
What I found, in the end, was a particular flag in the browserify command-line interface that bundles the modules and exposes them as a global variable AND lets them be used within RequireJS (if needed). Browserify (and others) call this the Universal Module Definition (UMD). Some more about that here.
By passing the --standalone flag in a browserify command, I can set my module up for UMD easily.
So...
Here's the package.json for Module:
{
  "name": "module",
  "version": "0.0.1",
  "description": "My module that requires another module (dependancy)",
  "main": "index.js",
  "scripts": {
    "bundle": "browserify --standalone module index.js > module.js"
  },
  "author": "shennan",
  "devDependencies": {
    "dependancy": "*",
    "browserify": "*"
  }
}
So, when at the root of my module, I can run this in the command line:
$ npm run-script bundle
This bundles the dependencies into one file and exposes them as per the UMD methodology, which means I can bootstrap the module in three different ways:
NodeJS
var Module = require('module');
/* use Module */
Browser Vanilla
<script src="module.js"></script>
<script>
  var Module = module;
  /* use Module */
</script>
Browser with RequireJS
<script src="require.js"></script>
<script>
  requirejs(['module.js'], function (Module) {
    /* use Module */
  });
</script>
Thanks again for everybody's input. All of the answers are valid and I encourage everyone to try them all as different use-cases will require different solutions.

Of course you can use the same module, with a dependency, on both sides. You just need to specify it better. This is the approach I use:
(function (name, definition){
  if (typeof define === 'function'){ // AMD
    define(definition);
  } else if (typeof module !== 'undefined' && module.exports) { // Node.js
    module.exports = definition();
  } else { // Browser
    var theModule = definition(), global = this, old = global[name];
    theModule.noConflict = function () {
      global[name] = old;
      return theModule;
    };
    global[name] = theModule;
  }
})('Dependency', function () {
  // return the module's API
  return {
    'key': 'value'
  };
});
This is just a very basic sample - you can return a function, instantiate a function or do whatever you like. In my case I'm returning an object.
Now let's say this is the Dependency class. Your Module class should look pretty much the same, but it should declare a dependency on Dependency, like:
function (require, exports, module) {
  var dependency = require('Dependency');
}
In RequireJS this is called Simplified CommonJS Wrapper: http://requirejs.org/docs/api.html#cjsmodule
Because there is a require statement at the beginning of your code, it will be matched as a dependency, and therefore it will either be lazy loaded or - if you optimize it - marked as a dependency early on (the optimizer converts define(definition) to define(['Dependency'], definition) automatically).
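As a rough sketch (not the full UMD wrapper above, just the plain RequireJS form from the linked docs), Module could then look something like this:
define(function (require, exports, module) {
  // RequireJS scans this function body for require('...') calls and
  // loads those dependencies before running the body.
  var dependency = require('Dependency');

  function Module() {
    /* use dependency here */
  }

  module.exports = Module;
});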
The only problem here is keeping the same path to the files. Keep in mind that nested requires (inside if-else) won't work in RequireJS (read the docs), so I had to do something like:
var dependency;
try {
  dependency = require('./Dependency'); // node module in the same folder
} catch(err) { // it's not node
  try {
    dependency = require('Dependency'); // requirejs
  } catch(err) { }
}
This worked perfectly for me. It's a bit tricky with all those paths, but at the end of the day, you get your two separate modules in different files which can be used on both ends without any kind of checks or hacks - they have all their dependencies and work like a charm :)

Take a look at the webpack bundler.
You can write your module and export it via module.exports. On the server you can then use it directly, and for the browser you build it with webpack. Using a configuration file would be the best option:
module.exports = {
  entry: "./myModule",
  output: {
    path: "dist",
    filename: "myModule.js",
    library: "myModule",
    libraryTarget: "var"
  }
};
This will take myModule and export it to the myModule.js file. Inside, the module will be assigned to a var (the libraryTarget option) named myModule (the library option).
It can be exported as a CommonJS module, var, this, or a function (see the libraryTarget option).
Since the bundling step is a Node script, these option values can be set programmatically.
Take a look at the externals option. It is used if you want special behaviour for some dependencies. For example, if you are creating a React component, you want to require React in your module, but not bundle it when you are bundling for the web, because it is already there.
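As a rough sketch of the externals option (assuming the React example above, with React already loaded globally on the page):
module.exports = {
  entry: "./myModule",
  output: {
    path: "dist",
    filename: "myModule.js",
    library: "myModule",
    libraryTarget: "var"
  },
  // Don't bundle React; at runtime, resolve require("react")
  // to the React global that is already on the page.
  externals: {
    "react": "React"
  }
};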
Hope it is what you are looking for.

Related

Application modularity with Vue.js and local NPM packages

I'm trying to build a modular application in Vue via the vue-cli-service. The main app and the modules are separate projects living in different folders; the structure is something like this:
-- app/
     package.json
     src/**
-- module1/
     package.json
     src/**
-- module2/
     package.json
     src/**
The idea is to have the Vue app completely agnostic about the application modules that can be there at runtime; the modules themselves are compiled with vue-cli-service build --target lib into a local moduleX/dist folder, referenced by the package.json "main" and "files" fields.
My first idea (now just for development speed purposes) was to add the modules as local NPM packages to the app, building them with a watcher and serving the app with a watcher itself, so that any change to the dependent modules would (I think) be distributed automatically to the main app.
So the package.json of the app contains dependencies like:
...
"module1": "file:../module1",
"module2": "file:../module2",
...
These dependencies are meant to be removed at any time, or in general be composed as we need; the app should just be recompiled and everything should work.
I'm trying to understand now how to dynamically load and activate the modules in the application, as I cannot use the dynamic import like this:
import(/* webpackMode: "eager" */ `module1`).then(src => {
  src.default.boot();
  resolve();
});
Because basically I don't know the 'module1', 'module2', etc...
In an OOP world I would just use dependency injection retrieving classes implementing a specific interface, but in JS/TS I'm not sure it is viable.
Is there a way to accomplish this?
Juggling with package.json doesn't sound like a good idea to me - doesn't scale. What I would do:
Keep all available "modules" in package.json
Create a separate js file (or its own prop inside package.json) with all available configurations (for different clients, for example):
module.exports = {
  'default': ['module1', 'module2', 'module3'],
  'clientA': ['module1', 'module2', 'module4'],
  'clientB': ['module2', 'module3', 'module4']
}
Tap into the Vue CLI build process - the best example I found is here - and create a js file which will run before each build (or "serve") and, using a simple template (lodash, for example), generate a new js file which will boot the configured modules based on the value of some ENV variable. See the following (pseudo)code (remember this runs inside Node during the build):
const fs = require('fs')
const _ = require('lodash')
const modulesConfig = require(`your module config js`)
const configurationName = process.env.MY_APP_CONFIGURATION ?? 'default'
const modules = modulesConfig[configurationName]
const template = fs.readFileSync('name of template file', 'utf8')
const templateCompiled = _.template(template)
const generatedJS = templateCompiled({ modules: modules })
fs.writeFileSync('bootModules.js', generatedJS)
Write your template for bootModules.js. Simplest would be:
<% _.forEach(modules, function(module) { %>import * as <%= module %> from '<%= module %>'; <% }); %>
import bootModules.js into your app
Use the MY_APP_CONFIGURATION ENV variable to switch the desired module configuration (see the example below) - this works not just during development; you can also set up different CI processes targeting the same repo with just different MY_APP_CONFIGURATION values
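For example, something along these lines (the script names assume a standard Vue CLI setup):
# build a bundle containing clientA's module configuration
MY_APP_CONFIGURATION=clientA npm run build

# or serve the default configuration during development
npm run serve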
This way you have all configurations in one place, you don't need to change package.json before every build, you have a simple mechanism to switch between different module configurations, and every build (bundle) contains only the modules needed.
In an OOP world I would just use dependency injection retrieving classes implementing a specific interface, but in JS/TS I'm not sure it is viable.
Why not?
More than this, with JS/TS you are not restricted to using classes implementing a specific interface: you just need to define the interface (i.e. the module.exports) of your modules and respect it in the library entries (vue build lib).
EDIT: after reading the comments, I think I understand the request.
Each module should respect the following interface (in the file which is the entry of the Vue library):
export function isMyAppModule() {
  return true;
}
export function myAppInit() {
  return { /* what you need to export */ };
}
Then in your app:
require("./package.json").dependencies.forEach(name => {
const module = require(name);
if(! module.isMyAppModule || module.isMyAppModule() !== true) return;
const { /* the refs you need */ } = module.myAppInit();
// use your refs as you need
});

browserify - exclude code blocks?

I'm building an app with shared React components in the browser and server-side Node.
Right now, I'm using Marty.js to do this:
function getUser() {
  if (Marty.isBrowser) {
    /* Get user using some client method */
  } else {
    /* otherwise, use some secret server code */
  }
}
I'm bundling those functions up via Browserify, so they can run on the client as well as the server.
What I'd like to do is remove the else block from the bundle entirely, so I'm not leaking sensitive server-side code.
Is there a way to exclude blocks of code from the bundle?
I would create separate modules, one for the browser and one for the server. Then in your package.json, you tell browserify to use the browser module:
"browser": {
"./path/to/node-module.js": "./path/to/browser-module.js"
}
Now, wherever you call require('path/to/node-module'), browserify will load the other module instead.
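As an illustration (the file contents here are hypothetical, just to show that both files expose the same interface):
// path/to/node-module.js - used when running/bundling for Node
var fs = require('fs');
module.exports = function loadConfig(callback) {
  fs.readFile('./config.json', 'utf8', function (err, data) {
    callback(err, err ? null : JSON.parse(data));
  });
};

// path/to/browser-module.js - swapped in by browserify via the "browser" field
module.exports = function loadConfig(callback) {
  var xhr = new XMLHttpRequest();
  xhr.open('get', '/config.json');
  xhr.onload = function () { callback(null, JSON.parse(xhr.responseText)); };
  xhr.onerror = function () { callback(new Error('request failed')); };
  xhr.send();
};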
More information from the docs:
browser field
There is a special "browser" field you can set in your package.json on a per-module basis to override file resolution for browser-specific versions of files.
For example, if you want to have a browser-specific module entry point for your "main" field you can just set the "browser" field to a string:
"browser": "./browser.js"
or you can have overrides on a per-file basis:
"browser": {
"fs": "level-fs",
"./lib/ops.js": "./browser/opts.js"
}
Note that the browser field only applies to files in the local module, and like transforms, it doesn't apply into node_modules directories.
While I'm not sure if it's possible with Browserify, you can do it with webpack using its DefinePlugin.
From the docs (slightly modified):
Example:
new webpack.DefinePlugin({
  DEBUG: false,
  PRODUCTION: true,
  ...
})
...
Example:
if(DEBUG)
  console.log('Debug info')
if(PRODUCTION)
  console.log('Production log')
After passing through webpack with no minification results in:
if(false)
  console.log('Debug info')
if(true)
  console.log('Production log')
and then after a minification pass results in:
console.log('Production log')
You can use an environment variable, envify and uglify to do this.
if ('browser' === process.env.ENVIRONMENT) {
  ...
} else {
  ...
}
Set process.env.ENVIRONMENT = 'browser' when doing your browser build, use the envify transform to substitute references to process.env with their current values and uglify will then perform dead code elimination to remove the branches which will never be hit.
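A minimal sketch of that build step (the file names are just examples):
$ ENVIRONMENT=browser browserify -t envify index.js | uglifyjs -c > bundle.js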
Be more explicit about your intent, and put your code in its own files:
function getUser(options, callback) {
  var fn;
  if (Marty.isBrowser) {
    fn = require("./lib/users/get.browser");
  } else {
    fn = require("./lib/users/get.server");
  }
  fn(options, callback);
}
and then as a browserify option you can say "replace require("./lib/users/get.server") with this variable instead, when you see it: ..." so that you don't build in that server file when you build for the browser.
However, if getUser can do different things based on where it's running, it feels far more likely that you're doing something wrong here: maybe that getUser should be a REST call to your server from the browser instead, but without more information, that's always hard to determine.
What about putting the code in a module, for example UserServer, and then excluding that module when you are compiling for the client? Your code becomes:
function getUser() {
  if (Marty.isBrowser) {
    /* Get user using some client method */
  } else {
    require('UserServer').getUser();
  }
}
Browserify provides the following option to exclude files from the bundle:
--exclude, -u Omit a file from the output bundle. Files can be globs.
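For example (assuming UserServer lives in its own file; adjust the path to your layout):
$ browserify main.js --exclude ./UserServer.js > bundle.js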

best practices for cross commonjs/browser development

Currently, I use a few defines via the Google Closure Compiler along the lines of IS_CJS and IS_BROWSER, and just have different files that get built (browser.myproject.js, cjs.myproject.js, etc).
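For example, one of those defines looks roughly like this, and gets flipped at compile time with --define='IS_BROWSER=false':
/** @define {boolean} */
var IS_BROWSER = true;

if (IS_BROWSER) {
  /* browser-specific code */
} else {
  /* CommonJS-specific code */
}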
Is this the standard way of doing things? If not, what is it and what are the advantages?
I've been using the following preamble in all my projects, for libraries that are loaded by both browser and server code:
if (define === undefined) {
  var define = function(f) {
    require.paths.unshift('.');
    f(require, exports, module);
  };
}

define(function(require, exports, module) {
  ...
  // main library here
  ...
  // use require to import dependencies
  var v = require(something);
  ...
  // use exports to return library functions
  exports.<stuff> = { some stuff };
  ...
});
This works to load the library with a require(<library>) call running on my node server, as well as with a require(<library>) call with RequireJS. In the browser, nested require calls are pre-fetched by RequireJS prior to library execution; on Node these dependencies are loaded synchronously. Since I'm not using my libraries as stand-alone scripts (via a script tag in the html), and only as dependencies for scripts loaded via the script tag, this works well for me.
However, looking at stand-alone libraries, the following preamble seems to be the most flexible (cut and pasted from the Q promise library):
(function (definition, undefined) {
  // This file will function properly as a <script> tag, or a module
  // using CommonJS and NodeJS or RequireJS module formats. In
  // Common/Node/RequireJS, the module exports the Q API and when
  // executed as a simple <script>, it creates a Q global instead.

  // The use of "undefined" in the arguments is a
  // micro-optimization for compression systems, permitting
  // every occurrence of the "undefined" variable to be
  // replaced with a single-character.

  // RequireJS
  if (typeof define === "function") {
    define(function (require, exports, module) {
      definition(require, exports, module);
    });
  // CommonJS
  } else if (typeof exports === "object") {
    definition(require, exports, module);
  // <script>
  } else {
    Q = definition(undefined, {}, {});
  }
})(function (serverSideRequire, exports, module, undefined) {
  ...
  main library here
  ...

  /*
   * In module systems that support ``module.exports`` assignment or exports
   * return, allow the ``ref`` function to be used as the ``Q`` constructor
   * exported by the "q" module.
   */
  for (var name in exports)
    ref[name] = exports[name];
  module.exports = ref;
  return ref;
});
While wordy, it's impressively flexible, and simply works, no matter what your execution environment is.
You can use uRequire that converts modules written in either AMD or CommonJS to either AMD, CommonJS or UMD through a template system.
Optionally, uRequire builds your whole bundle as a combinedFile.js that runs in ALL environments (Node.js, AMD or a module-less browser <script/>), using the r.js optimizer and almond under the hood.
uRequire saves you from having to maintain any boilerplate in each module - just write plain AMD or CommonJS modules (as .js, .coffee, .coco, .ls etc) without gimmicks.
Plus you can declaratively add standard functionality, such as exporting a module to a global like window.myModule along with a noConflict() method, selectively adding runtime info (e.g. __isNode, __isAMD), replacing/removing/injecting a dependency while building, automatically minifying, manipulating module code and much more.
All of these configuration options can be turned on and off per bundle OR per module, and you can have different build profiles (development, test, production etc) that derive(inherit) from each other.
It works great with grunt through grunt-urequire or standalone and it has a great watch option that rebuilds ONLY changed files.
Have you tried this: https://github.com/medikoo/modules-webmake#modules-webmake ?
It's the approach I'm taking, and it works really well. No boilerplate in the code, and you can run the same modules on both the server and the client side.

requireJS an entire folder

Is it possible to "require" an entire folder using requireJS.
For example, I have a behaviors folder with a ton of behavior js files. I'd really like to be able to simply use require(['behaviors/*'], function() {...}); to load everything in that folder rather than having to keep that list up to date. Once compressed and optimized I'd have all those files lumped together, but for development it's easier to work with them individually.
JavaScript in the browser has no filesystem access, so it can't scan a directory for files. If you are building your app with a scripting language like PHP or Ruby, you could write a script that scans the directory and adds the file names to the require() call.
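Or, since you already have Node around, a small build script along these lines (file names hypothetical) can generate that list for you:
// generate-behaviors.js - run with Node at build time, not in the browser
var fs = require('fs');
var path = require('path');

var dir = path.join(__dirname, 'behaviors');
var modules = fs.readdirSync(dir)
  .filter(function (file) { return file !== 'all.js' && /\.js$/.test(file); })
  .map(function (file) { return 'behaviors/' + file.replace(/\.js$/, ''); });

// emit a module that simply depends on everything else in the folder
fs.writeFileSync(
  path.join(dir, 'all.js'),
  'define(' + JSON.stringify(modules) + ', function () {});\n'
);
After that, require(['behaviors/all'], function () { ... }); pulls in the whole folder.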
I don't know if I can recommend this approach anymore. I think the more explicit way to do this is by manually "requiring"/"exporting" the functionality you need. The exception, I think, is if you have a "namespace" of files that you want exported - see "Babel and ES6 Module Import Declarations (plugin-export-namespace-from)" or "Babel and ES6 Module Import Declarations (plugin-wildcard)" below.
These solutions also assume that you have a meaningful file structure - where file names become part of that "require" * definition.
However, if you still need to do this there are a few existing tools and methods that might provide the behavior that you're looking for.
Possible Solutions
Babel and ES6 Module Import Declarations (plugin-export-namespace-from)
Have a setup that is ES6 compliant.
You need to update your .babelrc file to include babel-plugin-proposal-export-namespace-from.
Use export namespace plugin by writing syntax like the following:
common/index.js
export * from './common/a'; // export const a = false;
export * from './common/b'; // export const b = true;
main.js
import { a, b } from './common';
console.log(a); // false
console.log(b); // true
Babel and ES6 Module Import Declarations (plugin-wildcard)
Have a setup that is ES6 compliant.
You need to update your .babelrc file to include babel-plugin-wildcard.
Use wildcard namespace plugin by writing syntax like the following:
main.js
import { a, b } from './common/*'; // imports './common/a.js' and './common/b.js'
console.log(a); // false
console.log(b); // true
RequireJS (Now Outdated)
Download and install require-wild: npm install require-wild
Configure the declaration as follows
grunt.initConfig({
  requireWild: {
    app: {
      // Input files to look for wildcards (require|define)
      src: ["./**/*.js"],
      // Output file contains generated namespace modules
      dest: "./namespaces.js",
      // Load your require config file used to find baseUrl - optional
      options: { requireConfigFile: "./main.js" }
    }
  }
});

grunt.loadNpmTasks("require-wild");
grunt.registerTask('default', ['requireWild']);
Then run the grunt task. Your file will be generated. Modify your setup to load namespaces.js
require(['namespaces'], function () { ... });
This now allows modules under src to use glob pattern matching in their dependencies:
require(['behaviors/**/*'], function (behaviors) { });
I know this is old, but I'd like to share my solution:
For this solution you need JQuery
1) Create a bash script that will list all the js files in
"MyDirectory/", and save it to "directoryContents.txt":
#!/bin/bash
# Find all the files in that directory...
for file in $( find MyDirectory/ -type f -name "*.js" )
do
  fileClean=${file%.js} # Must remove .js from the end!
  echo -n "$fileClean " >> MyDirectory/directoryContents.txt
done
File will look like this:
MyDirectory/FirstJavascriptFile MyDirectory/SecondJavascriptFile
MyDirectory/ThirdJavascriptFile
Problem with my script! Puts an extra " " at the end, that messes things up! Make sure to remove the excess space at the end of directoryContents.txt
2) Then in your Client side JS code:
do a "GET" request to retrieve the text file
For each entry (split by the space), 'require' that file:
$.get( "MyDirectory/directoryContents.txt", {}, function( data ) {
var allJsFilesInFolder = data.split(" ");
for(var a=0; a<allJsFilesInFolder.length; a++)
{
require([allJsFilesInFolder[a]], function(jsConfig)
{
//Done loading this one file
});
}
}, "text");
I was having a problem with this code not finishing before my other code, so here's my extended answer:
define([''], function() {
  return {
    createTestMenu: function()
    {
      this.loadAllJSFiles(function(){
        //Here ALL those files you need are loaded!
      });
    },
    loadAllJSFiles: function(callback)
    {
      $.get( "MyDirectory/directoryContents.txt", {}, function( data ) {
        var allJsFilesInFolder = data.split(" ");
        var currentFileNum = 0;
        for(var a=0; a<allJsFilesInFolder.length; a++)
        {
          require([allJsFilesInFolder[a]], function(jsConfig)
          {
            currentFileNum++;
            //If it's the last file that needs to be loaded, run the callback.
            if (currentFileNum==allJsFilesInFolder.length)
            {
              console.log("Done loading all configuration files.");
              if (typeof callback != "undefined"){callback();}
            }
          });
        }
      }, "text");
    }
  }
});
What I ended up doing was this: every time my Node server boots, it runs the bash script, populating directoryContents.txt. Then my client side just reads directoryContents.txt for the list of files, and requires each file in that list.
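That boot step can be as simple as something like this (assuming the bash script above is saved as listFiles.sh next to the server entry point):
// near the top of the Node server's entry point
var execSync = require('child_process').execSync;

// regenerate directoryContents.txt before serving any clients
execSync('bash ./listFiles.sh', { stdio: 'inherit' });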
Hope this helps!
There isn't really a way to do this conceptually on the fly (that I know of).
There's a few work arounds though:
Use grunt and concat and then just require that behemoth...I know, kinda sucky.
What I think is a better solution... use a require hierarchy like so:
require(['js/controllers/init'], function(ctrls){
  ctrls(app, globals);
});

// /js/controllers/init.js
define(['js/controllers/index', 'js/controllers/posts'], function(index, posts){
  return function protagonist(app, globals){
    var indexModule = index(app, globals);
    var postsModule = posts(app, globals);
    return app || someModule;
  };
});

// /js/controllers/index.js
define(function(){
  return function protagonist(app, globals){
    function method1(){}
    function method2(){}
    return {
      m1: method1,
      m2: method2
    };
  };
});
Note that "protagonist" function. That allows you to initialize modules before their use, so now you can pass in a 'sandbox' -- in this case app and globals.
Realistically, you wouldn't have /js/controllers/index.js... It should probably be something like /js/controllers/index/main.js or /js/controllers/index/init.js so that there is a directory adjacent to (sibling of) /js/controllers/init.js called "index". This will make your modules scalable to a given interface -- you can simply swap modules out and keep your interface the same.
Hope this helps! Happy coding!
I wrote a library to solve this problem. Eventually someone else came along and improved my library, here it is:
https://github.com/smartprocure/directory-metagen
You can use my lib with Gulp or whatever - it generates metadata for your project and RequireJS can use that metadata to require the desired files from the filesystem.
Using this lib will produce a RequireJS module that looks something like this:
define(
[
"text!app/templates/dashboardTemplate.ejs",
"text!app/templates/fluxCartTemplate.ejs",
"text!app/templates/footerTemplate.ejs",
"text!app/templates/getAllTemplate.ejs",
"text!app/templates/headerTemplate.ejs",
"text!app/templates/homeTemplate.ejs",
"text!app/templates/indexTemplate.ejs",
"text!app/templates/jobsTemplate.ejs",
"text!app/templates/loginTemplate.ejs",
"text!app/templates/overviewTemplate.ejs",
"text!app/templates/pictureTemplate.ejs",
"text!app/templates/portalTemplate.ejs",
"text!app/templates/registeredUsersTemplate.ejs",
"text!app/templates/userProfileTemplate.ejs"
],
function(){
return {
"templates/dashboardTemplate.ejs": arguments[0],
"templates/fluxCartTemplate.ejs": arguments[1],
"templates/footerTemplate.ejs": arguments[2],
"templates/getAllTemplate.ejs": arguments[3],
"templates/headerTemplate.ejs": arguments[4],
"templates/homeTemplate.ejs": arguments[5],
"templates/indexTemplate.ejs": arguments[6],
"templates/jobsTemplate.ejs": arguments[7],
"templates/loginTemplate.ejs": arguments[8],
"templates/overviewTemplate.ejs": arguments[9],
"templates/pictureTemplate.ejs": arguments[10],
"templates/portalTemplate.ejs": arguments[11],
"templates/registeredUsersTemplate.ejs": arguments[12],
"templates/userProfileTemplate.ejs": arguments[13]
}
});
You can then require modules in your front-end like so:
var footerView = require("app/js/jsx/standardViews/footerView");
however, as you can see this is too verbose, so the magic way is like so:
name the dependency above as allViews!
now you can do:
var allViews = require('allViews');
var footerView = allViews['standardViews/footerView'];
There are two advantages to requiring directories whole:
(1) in production, with the r.js optimizer, you can point to one dependency (module A) and it can then easily trace all of A's dependencies that represent an entire directory
(2) in development, you can require whole directories up front and then use synchronous syntax to require dependencies because you know they have already been loaded
enjoy "RequireJS-Metagen"
https://github.com/smartprocure/directory-metagen
https://www.npmjs.com/package/requirejs-metagen
https://github.com/ORESoftware/requirejs-metagen

JavaScript require() on client side

Is it possible to use require() (or something similar) on client side?
Example
var myClass = require('./js/myclass.js');
You should look into require.js or head.js for this.
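With RequireJS, for example, the snippet from the question becomes an asynchronous call along these lines (assuming myclass.js is, or is shimmed as, an AMD module):
require(['./js/myclass'], function (myClass) {
  /* use myClass here */
});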
I've been using browserify for that. It also lets me integrate Node.js modules into my client-side code.
I blogged about it here: Add node.js/CommonJS style require() to client-side JavaScript with browserify
If you want to have Node.js style require you can use something like this:
var require = (function () {
  var cache = {};

  function loadScript(url) {
    var xhr = new XMLHttpRequest(),
        fnBody;
    xhr.open('get', url, false);
    xhr.send();
    if (xhr.status === 200 && xhr.getResponseHeader('Content-Type') === 'application/x-javascript') {
      fnBody = 'var exports = {};\n' + xhr.responseText + '\nreturn exports;';
      cache[url] = (new Function(fnBody)).call({});
    }
  }

  function resolve(module) {
    //TODO resolve urls
    return module;
  }

  function require(module) {
    var url = resolve(module);
    if (!Object.prototype.hasOwnProperty.call(cache, url)) {
      loadScript(url);
    }
    return cache[url];
  }

  require.cache = cache;
  require.resolve = resolve;

  return require;
}());
Beware: this code works but is incomplete (especially url resolving) and does not implement all Node.js features (I just put this together last night).
YOU SHOULD NOT USE THIS CODE in real apps but it gives you a starting point. I tested it with this simple module and it works:
function hello() {
  console.log('Hello world!');
}

exports.hello = hello;
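Using the shim above, loading that module (served as, say, hello.js with an application/x-javascript Content-Type) looks like this:
var hello = require('hello.js'); // fetched synchronously via XHR, then cached
hello.hello();                   // logs "Hello world!"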
I asked myself the very same question. When I looked into it I found the choices overwhelming.
Fortunately I found this excellent spreadsheet that helps you choose the best loader based on your requirements:
https://spreadsheets.google.com/lv?key=tDdcrv9wNQRCNCRCflWxhYQ
Take a look at requirejs project.
I have found that in general it is recommended to preprocess scripts at compile time and bundle them in one (or very few) packages with the require being rewritten to some "lightweight shim" also at compile time.
I've Googled up the following "new" tools that should be able to do it:
http://mixu.net/gluejs/
https://github.com/jrburke/almond
https://github.com/component/builder2.js
And the already mentioned browserify should also fit quite well - http://esa-matti.suuronen.org/blog/2013/04/15/asynchronous-module-loading-with-browserify/
What are the module systems all about?
Older Stack Overflow explanation - Relation between CommonJS, AMD and RequireJS?
Detailed discussion of various module frameworks and the require() they need is in Addy Osmani - Writing Modular JavaScript With AMD, CommonJS & ES Harmony
You can create elements in the DOM, which load the scripts. Like so:
var myScript = document.createElement('script'); // Create new script element
myScript.type = 'text/javascript';               // Set appropriate type
myScript.src = './js/myclass.js';                // Load javascript file
document.head.appendChild(myScript);             // Append it so the browser actually fetches and runs it
Simply use Browserify, which is something like a compiler that processes your files before they go into production and packs them into bundles.
Say you have a main.js file that requires the files of your project; when you run browserify on it, it processes everything and creates a bundle with all your files, allowing the require calls to be used synchronously in the browser without HTTP requests and with very little overhead in performance and bundle size.
See the link for more info: http://browserify.org/
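In its simplest form that build step is a single command (the output name is up to you), after which you include bundle.js with a plain script tag:
$ browserify main.js -o bundle.js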
Some answers already - but I would like to point you to YUI3 and its on-demand module loading. It works on both server (node.js) and client, too - I have a demo website using the exact same JS code running on either client or server to build the pages, but that's another topic.
YUI3: http://developer.yahoo.com/yui/3/
Videos: http://developer.yahoo.com/yui/theater/
Example:
(precondition: the basic YUI3 functions in 7k yui.js have been loaded)
YUI({
  //configuration for the loader
}).use('node','io','own-app-module1', function (Y) {
  //sandboxed application code
  //...

  //If you already have a "Y" instance you can use that instead
  //of creating a new (sandbox) Y:
  //  Y.use('moduleX','moduleY', function (Y) {
  //  });
  //difference to YUI().use(): uses the existing "Y"-sandbox
});
This code loads the YUI3 modules "node" and "io", and the module "own-app-module1", and then the callback function is run. A new sandbox "Y" with all the YUI3 and own-app-module1 functions is created. Nothing appears in the global namespace. The loading of the modules (.js files) is handled by the YUI3 loader. It also uses (optional, not shown here) configuration to select a -debug or -min(ified) version of the modules to load.
Here's a solution that takes a very different approach: package up all the modules into a JSON object and require modules by reading and executing the file content without additional requests.
https://github.com/STRd6/require/blob/master/main.coffee.md
STRd6/require depends on having a JSON package available at runtime. The require function is generated for that package. The package contains all the files your app could require. No further http requests are made because the package bundles all dependencies. This is as close as one can get to the Node.js style require on the client.
The structure of the package is as follows:
entryPoint: "main"
distribution:
main:
content: "alert(\"It worked!\")"
...
dependencies:
<name>: <a package>
Unlike Node, a package doesn't know its external name. It is up to the package including the dependency to name it. This provides complete encapsulation.
Given all that setup here's a function that loads a file from within a package:
loadModule = (pkg, path) ->
  unless (file = pkg.distribution[path])
    throw "Could not find file at #{path} in #{pkg.name}"

  program = file.content
  dirname = path.split(fileSeparator)[0...-1].join(fileSeparator)

  module =
    path: dirname
    exports: {}

  context =
    require: generateRequireFn(pkg, module)
    global: global
    module: module
    exports: module.exports
    PACKAGE: pkg
    __filename: path
    __dirname: dirname

  args = Object.keys(context)
  values = args.map (name) -> context[name]

  Function(args..., program).apply(module, values)

  return module
This external context provides some variables that modules have access to.
A require function is exposed to modules so they may require other modules.
Additional properties such as a reference to the global object and some metadata
are also exposed.
Finally we execute the program within the module and given context.
This answer will be most helpful to those who wish to have a synchronous node.js style require statement in the browser and are not interested in remote script loading solutions.
I find the component project giving a much more streamlined workflow than other solutions (including require.js), so I'd advise checking out https://github.com/component/component . I know this is a bit late answer but may be useful to someone.
Here's a lightweight way to use require and exports in your web client. It's a simple wrapper that creates a "namespace" global variable, and you wrap your CommonJS compatible code in a "define" function like this:
namespace.lookup('org.mydomain.mymodule').define(function (exports, require) {
  var extern = require('org.other.module');
  exports.foo = function foo() { ... };
});
More docs here:
https://github.com/mckoss/namespace
The clientside-require library provides an asynchronous load() function that can be used to load any JS file or NPM module (which uses module.exports), any .css file, any .json, any .html, and any other file as text.
e.g.,
npm install clientside-require --save
<script src="/node_modules/clientside-require/dist/bundle.js"></script>
<script>
  load('color-name') // an npm module
    .then(color_name => {
      console.log(color_name.blue); // outputs [0, 0, 255]
    })
</script>
A really cool part of this project is that inside of any load()ed script, you can use the synchronous require() function the same way you would expect in node.js!
e.g.,
load('/path/to/functionality.js')
and inside /path/to/functionality.js:
var qs = require("qs") // an npm module

module.exports = function(name){
  return qs.stringify({
    name: name,
    time: new Date()
  })
}
That last part, implementing the synchronous require() method, is what enables it to utilize NPM packages built to run on the server.
This module was designed to implement the require functionality as closely as possible in the browser. Disclaimer: I have written this module.
Yes, it is very easy to use, but you need to load the JavaScript file in the browser with a script tag:
<script src="module.js"></script>
and then use it in a js file like:
var module = require('./module');
I am making an app using Electron and it works as expected.
