I'm working on multiple browser extensions / add-ons that usually need to work at least in Chrome and Firefox (and sometimes in Safari as well).
The biggest issue is staying DRY while keeping the source clean.
Conceptually, such a project usually has the following parts:
background script for Chrome
background script for Firefox
common background code
content script for Chrome
content script for Firefox
common content script code
other scripts (for example: option page).
To reduce code duplication I currently keep a single content script for both browsers and preprocess it (removing the other browser's specific parts) during the build process.
Unfortunately this makes the content scripts really long and ugly (and hard to lint).
I would like to use Browserify basically for the whole JS code in my projects.
Still, to do that I need a solution for handling this kind of flow:
Browser specific entry script -> cross-browser code -> browser specific low-level code.
I would imagine this kind of hierarchy:
- Entry scripts
  - Browser A
  - Browser B
  - ...
- Common code
- Low-level code
  - Browser A
  - Browser B
  - ...
So, for example, during the build process I would like Browserify to take the entry script for browser A and bundle it together with the common code and with the low-level code for browser A only. This should happen without this kind of switching in the common code:
if(isBrowserA()) {
    var lowLevelModule = require("../lowLevel/browserA/module");
} else {
    var lowLevelModule = require("../lowLevel/browserB/module");
}
I would like the Browserify build process to do exactly that for me -- replace the "root path" of the low-level code depending on the target.
Hacking it around with package.json wouldn't work, because I need a flexible number of targets (and possibly an even deeper dependency tree).
Try using the factor-bundle or partition-bundle Browserify plugins. They both help split code into different entry files and a common modules file. partition-bundle also includes scripts that enable asynchronous loading of your different bundles.
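For example, a minimal factor-bundle sketch using the Browserify API could look like the following (the entry and output paths are my own placeholders, not taken from your project); the page or extension would then load common.js followed by the bundle for the target browser:
var browserify = require('browserify');
var fs = require('fs');

// Two browser-specific entry points; everything they share is factored out into common.js
var b = browserify(['./entry/browserA.js', './entry/browserB.js']);
b.plugin('factor-bundle', {
  outputs: ['./build/browserA.js', './build/browserB.js'] // one bundle per entry point
});
b.bundle().pipe(fs.createWriteStream('./build/common.js'));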
One possible way to do this would be to drop the if/else require() calls from your code and use a fixed path instead:
var lowLevelModule = require("../lowLevel/module")
Then run a separate build for each browser, using Browserify's expose option to change what that path resolves to for each build.
So in a gulpfile.js (just for example -- the Browserify API is the important bit; you could do the same from a shell using the -r and -x flags to browserify, with : separating the require/expose values), run the build once for each browser, passing in a different --browser= argument each time.
var browserify = require('browserify')
var gulp = require('gulp')
var gutil = require('gulp-util')
var source = require('vinyl-source-stream')

var browser = gutil.env.browser // browserA, or browserB, or...

// You might want to configure paths up-front separately, just
// hardcoding below for brevity.
gulp.task('bundle-app', function() {
  var b = browserify('./entry/' + browser + '/module', {detectGlobals: false})
  b.require('./path/to/lowLevel/' + browser + '/module', {expose: '../lowLevel/module'})
  return b.bundle()
    .pipe(source('app.js'))
    .pipe(gulp.dest('./build'))
})
For common dependencies, you could bundle them into an external file and add external calls to your browser-specific bundles:
var commonModules = ['module1', 'module2']

gulp.task('bundle-common', function() {
  var b = browserify({detectGlobals: false})
  commonModules.forEach(function(module) {
    b.require(module)
  })
  return b.bundle()
    .pipe(source('common.js'))
    .pipe(gulp.dest('./build'))
})

gulp.task('bundle-app', function() {
  var b = browserify('./entry/' + browser + '/module', {detectGlobals: false})
  commonModules.forEach(function(module) {
    b.external(module)
  })
  b.require('./path/to/lowLevel/' + browser + '/module', {expose: '../lowLevel/module'})
  return b.bundle()
    .pipe(source('app.js'))
    .pipe(gulp.dest('./build'))
})
I usually put chains of builds in a package.json script for convenience:
"scripts": {
"build": "gulp bundle-common && gulp bundle-app --browser=browserA && gulp bundle-app --browser=browserB"
}
Finally:
npm run build
This is not something Browserify supports out of the box, but you could in theory achieve what you want by writing a custom source code transform.
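For what it's worth, such a transform could be little more than a string replace over each source file. A minimal sketch, assuming a TARGET_BROWSER environment variable and the ../lowLevel/ path convention from the question (both are illustrative assumptions, not an existing transform):
var through = require('through2');

module.exports = function (file) {
  var chunks = [];
  return through(
    function (chunk, enc, next) { chunks.push(chunk); next(); }, // buffer the whole file
    function (done) {
      var src = Buffer.concat(chunks).toString('utf8');
      // e.g. require("../lowLevel/module") -> require("../lowLevel/browserA/module")
      src = src.replace(/require\((['"])\.\.\/lowLevel\//g,
                        'require($1../lowLevel/' + process.env.TARGET_BROWSER + '/');
      this.push(src);
      done();
    }
  );
};
You would then run something like TARGET_BROWSER=browserA browserify -t ./lowlevel-transform.js entry/browserA/main.js once per target.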
As an alternative, the RaptorJS Optimizer provides exactly the feature you are looking for out of the box. (Disclaimer: I am the author of this tool, and it is the tool we use at eBay for all our Node.js applications.) The RaptorJS Optimizer allows you to remap one module to another based on a set of arbitrary flags that are enabled during optimization. We use this feature at eBay to conditionally send down different code for different web browsers, devices, experimentation groups, etc. For more details on that feature, please see:
https://github.com/raptorjs/optimizer-require#conditional-remap
https://github.com/raptorjs/optimizer#conditional-dependencies
FYI, the RaptorJS Optimizer supports all of the features of Browserify, plus it provides support for non-JS dependencies, async loading, conditional dependencies, dynamic requires, etc. It is still very modular, like Browserify, and can be extended via plugins to teach it how to handle new dependency types. Unlike Webpack, the RaptorJS Optimizer does not overload the CommonJS module loading system, so the code will still run under Node.js and in the web browser. We have had a lot of success with the RaptorJS Optimizer at eBay (and other companies), so I encourage you to check it out.
Related
I have multiple external JavaScript files in my HTML documents which I'd like to combine so they don't get so crowded. Is there any way to combine my JS files? For example,
my files:
a.js
b.js
c.js
d.js
and I want:
all.js
Take a look at requirejs.org, and especially look at r.js (http://requirejs.org/docs/optimization.html).
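A minimal r.js build profile for this case might look roughly like the following, assuming the files are written as AMD modules and main.js requires the others (the file names are just the ones from the question):
// build.js -- run with: node r.js -o build.js
({
    baseUrl: ".",
    name: "main",   // main.js requires a, b, c and d as AMD modules
    out: "all.js"
})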
a.js
var i=0;
function fun1()
{
...
}
b.js
var k=0;
function fun2()
{
...
}
all.js: just copy and paste, like you would with CSS:
var i=0;
function fun1()
{
...
}
var k=0;
function fun2()
{
...
}
Take care of semicolons and closing braces, particularly when you write the whole script inside an event listener, especially 'DOMContentLoaded':
document.addEventListener('DOMContentLoaded', function()
{
  //whole big script
}
);
Instead, define the function first and pass a reference to it:
var some_function = function(){ /* bla bla bla */ };
document.addEventListener('DOMContentLoaded', some_function);
A simple bash cat operation will do what you want, but at some stage you're probably going to want more, right?
Grunt and grunt-contrib-concat are a good starter, but you'll quickly realise Grunt is not particularly good. To summarise usage: you create a gruntfile, install a few dependencies (i.e. install Grunt and its command-line interface, this is easy) and run grunt from your project root. It then parses the gruntfile to find out what you want it to do, and it does it. Pretty simple, and simple is good.
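As a rough sketch, a concat-only Gruntfile could look like this (the file names are the ones from the question; treat it as a starting point rather than a drop-in config):
module.exports = function (grunt) {
  grunt.initConfig({
    concat: {
      dist: {
        src: ['a.js', 'b.js', 'c.js', 'd.js'],
        dest: 'all.js'
      }
    }
  });
  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.registerTask('default', ['concat']);
};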
Next up is Gulp, a nice build system using streams, so it's slightly more complex (well, easier and more powerful, but streams can be kind of confusing at first). Gulp works in the same way, only it parses a gulpfile for instructions. For a concat operation the actual gulp command is trivial:
gulp.src( '*.js' )
  .pipe( concat( 'all.js' ) ) // concat comes from the gulp-concat plugin
  .pipe( gulp.dest( 'dist' ) )
Between the .src and the .dest you can pipe the files through multiple transforms, such as minifying, transpiling, notifying—the list of plugins and modules is dizzying (as it is for grunt).
However, if you're a fan of node and npm (you probably should be) then you can use npm scripts to create a build system. npm is the node package manager and requires a package.json to give it some clues about how to work. Part of that JSON specification is a scripts block:
"scripts": {
"build" : "cat *.js > all.js"
}
You can then use npm run build from the command line, whereby npm will parse the package.json and execute the script using bash (sh actually).
Note that these are build systems, and there are many others.
There are also other packagers (which you would probably use as part of your build system, although for some projects they are your entire build system), but they are more complex than your needs. For your own research: browserify, webpack and jspm are all excellent (bear in mind that AMD modules lost, so require.js is probably not worth your time), although this area is becoming congested. Each of these is a very powerful modularisation tool, but they will require some changes to how you structure your code. If you are serious about modularisation then they are worth your time learning.
On a slightly different tangent, there is some discussion about whether one large file is actually more beneficial than a number of smaller scripts. In many cases simply serving a few small files is actually quicker, and may be easier, although there can be other benefits of smashing code together. Currently it is probably still best to concat at least into less HTTP requests, but this requirement for performance is going away.
This might be helpful: https://github.com/mrclay/minify
OR
create a file all.js and paste in the code from a.js, b.js, c.js and d.js.
I am working on a javascript module/library that should work in 3 environments:
in node.js
in requirejs
when simply included using <script> tags in the webpage. In this case the whole module should be hooked up under window.myModule
Do you have any suggestions as to how to write the structure of the library so that it works in all these environments?
EDIT: basically I mean some sort of wrapper code around the library so that I can use the file from any of those three methods and I'm fine...
This requirement and its solution are known as the Universal Module Definition (UMD). It is currently a draft proposal. Background and current status are described in Addy Osmani's article "Writing Modular JavaScript With AMD, CommonJS & ES Harmony". Look for the "UMD" link pointing to various templates you can use.
Quite a few other templates can be found on the web - UMD is the search keyword.
(did not find the final link myself yet :)
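For reference, a typical UMD wrapper looks roughly like this (the module name myModule and the jQuery dependency are placeholders, not something the templates prescribe):
(function (root, factory) {
  if (typeof define === 'function' && define.amd) {
    define(['jquery'], factory);                  // AMD / RequireJS
  } else if (typeof module === 'object' && module.exports) {
    module.exports = factory(require('jquery')); // Node / CommonJS
  } else {
    root.myModule = factory(root.jQuery);         // plain <script> tag: window.myModule
  }
}(this, function ($) {
  // build and return the module's public API here
  return {};
}));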
We're working on the same thing, I think.
And we have had some success. We have a library (we call it 'slib') compiled to AMD js files. It does not depend on npm modules or the browser, so it can be called from node and from the browser.
1) To call it from node, we use requirejs:
file require.conf.js
module.exports = function(nodeRequire){
  global.requirejs = require('requirejs');
  requirejs.config({
    baseUrl: __dirname+"/../web/slib/",
    paths: {
      slib: "."
    },
    nodeRequire: nodeRequire
  });
}
In any other serverside (nodejs) file we add this line at the beginning
require("./require.conf")(require);
then we call slib's code by:
var Computation = requirejs("slib/Computation");
2) To call slib from the browser, we just use requirejs. It handles everything fine.
3) We do not need to call slib from <script> directly.
For production, we use r.js to make a bundled js file with most of the dependencies and use it on the page with one <script>. This script downloads any other deps that were not included, using standard requirejs, and (as far as I remember) it does not need a separate require.js include; it just works on its own. This is very flexible for large projects: use requirejs during development, use r.js to bundle the core files in production to speed up page load, and use a whole bundle if you need only one <script> without any other requests. r.js bundles all dependencies correctly, including old js libraries that were commonly loaded using only <script> and accessed via window.myOldLibrary, by using the shim param in the config.
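For reference, the shim configuration mentioned above looks roughly like this (the library name and path are made up for the example):
requirejs.config({
  paths: {
    myOldLibrary: "libs/my-old-library" // a classic <script>-style library
  },
  shim: {
    myOldLibrary: {
      exports: "myOldLibrary" // expose window.myOldLibrary as the module value
    }
  }
});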
It seems you can use browserify to make some npm modules accessible from slib's code, but we have not tried that yet.
Also, using requirejs on node's side could, I think, be simpler (why do we need a second 'requirejs' function alongside node's own?). We just have not investigated it well, but this works.
In any slib module you can write
if (typeof window !== "undefined")
  window.module1 = this // or whatever
and it will be exported as an old-style js lib upon load.
I'm working on a userscript - in particular this userscript - which has been designed to encapsulate functionality in modules. In order to be able to do some automated testing, I would like to split the modules into their own files and use node.js's module exporting and require functions to combine them into one file for use in Greasemonkey or simple browser extensions.
My first thought was to just copy the modules into their own files as such
module.js
var exportedModule = (function (){
  var Module = {
    // public functions and members
  };

  // private functions and members

  return Module;
}());

module.exports = exports = exportedModule;
And then have a central file that requires each of these modules, perhaps compiling them with something like Browserify.
script.js
var importedModule = require('./module');
importedModule.init();
Is this possible?
It seems to me that you would be better off using RequireJS, which uses AMD-style modules and is inherently more browser-friendly. Node's CommonJS-style modules are synchronous and do not fit the browser model very well.
Of course, using RequireJS will change your scripts a bit.
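For example, the module above would look roughly like this in AMD form (same structure, only the wrapper changes):
// module.js (AMD)
define([], function () {
  var Module = {
    // public functions and members
  };
  // private functions and members
  return Module;
});

// script.js (AMD)
require(['module'], function (importedModule) {
  importedModule.init();
});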
It's possible, and Browserify makes it easy:
browserify src/my.user.js -o dist/my.user.js
The metadata in the source file may get moved, but it's still parsed correctly (by Greasemonkey at least).
For a more complex example which compiles various assets, including CSS and images, see here.
I'd like to know if there is a way to include a file in a coffee script.
Something like #include in C or require in PHP...
If you use CoffeeScript with node.js (e.g. when using the command-line tool coffee), then you can use node's require() function exactly as you would for a JS file.
Say you want to include included-file.coffee in main.coffee:
In included-file.coffee: declare and export objects you want to export
someVar = ...
exports.someVar = someVar
In main.coffee you can then say:
someVar = require('./included-file.coffee').someVar
This gives you clean modularization and avoids namespace conflicts when including external code.
How about coffeescript-concat?
coffeescript-concat is a utility that preprocesses and concatenates
CoffeeScript source files.
It makes it easy to keep your CoffeeScript code in separate units and
still run them easily. You can keep your source logically separated
without the frustration of putting it all together to run or embed in
a web page. Additionally, coffeescript-concat will give you a single
sourcefile that will easily compile to a single Javascript file.
TL;DR: Browserify, possibly with a build tool like Grunt...
Solutions review
Build tool + import pre-processor
If what you want is a single JS file to be run in the browser, I recommend using a build tool like Grunt (or Gulp, or Cake, or Mimosa, or any other) to pre-process your Coffeescript, along with an include/require/import module that will concatenate included files into your compiled output, like one of these:
Browserify: probably the rising standard and my personal favourite. It lets you use Node's exports/require API in your code, then extracts and concatenates everything required into a browser-includable file. It exists for Grunt, Gulp, Mimosa and probably most others. To this day I reckon it is probably the best solution if you're after compatibility with both Node and the browser (and even otherwise).
Some Rails Sprockets-like solutions, such as grunt-sprockets-directives or gulp-include, will also work in a consistent way with CSS pre-processors (though those generally have their own importing mechanisms)
Other solutions include grunt-includes or grunt-import
Standalone import pre-processor
If you'd rather avoid the extra complexity of a build tool, you can use Browserify stand-alone, or alternatives not based on Node's require, like coffeescript-concat or Coffee-Stir.
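A stand-alone Browserify build of CoffeeScript sources might look like the sketch below, assuming the coffeeify transform is installed (the paths are placeholders):
var browserify = require('browserify');
var fs = require('fs');

browserify('./src/main.coffee', { extensions: ['.coffee'] })
  .transform('coffeeify')   // compiles .coffee files as they are bundled
  .bundle()
  .pipe(fs.createWriteStream('./build/app.js'));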
[Not recommended] Asynchronous dynamic loading (AJAX + eval)
If you're writing exclusively for the browser and don't mind, or rather really want, your script being spread across several files fetched via AJAX, you can use a myriad of tools like:
yepnope.js or Modernizr's .load based on yepnope: Please note that yepnope is now deprecated by its maintainer, who recommend using build tools and concatenation instead of remote loading
RequireJS
HeadJS
jQuery's $.getScript
Vanilla AJAX + eval
your own implementation of AMD
You can try this library I made to solve this same problem: coffee-stir.
It's very simple.
Just type #include and the name of the file that you want to include:
#include MyBaseClass.coffee
For details
http://beastjavascript.github.io/Coffee-Stir/
I found that using gulp-concat to merge my CoffeeScript files before processing them did the trick. It can be easily installed in your project with npm:
npm install gulp-concat
Then edit your gulpfile.js:
var gulp = require('gulp')
  , gutil = require('gulp-util')
  , coffee = require('gulp-coffee')
  , concat = require('gulp-concat');

gulp.task('coffee', function(){
  return gulp.src('src/*.coffee')
    .pipe(concat('app.coffee'))
    .pipe(coffee({bare: true}).on('error', gutil.log))
    .pipe(gulp.dest('build/'))
})
This is the code I used to concatenate all my CoffeeScript files before gulp processed them into the final build JavaScript. The only issue is that the files are processed in alphabetical order. You can explicitly state which files to process to achieve your own file order, but you lose the flexibility of picking up new .coffee files dynamically:
gulp.src(['src/file3.coffee', 'src/file1.coffee', 'src/file2.coffee'])
  .pipe(concat('app.coffee'))
  .pipe(coffee({bare: true}).on('error', gutil.log))
  .pipe(gulp.dest('build/'))
gulp-concat as of February 25th, 2015 is available at this url.
Rails uses Sprockets to do this, and the same directive syntax has been adapted in https://www.npmjs.org/package/grunt-sprockets-directives. Works well for me.
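The directives are just special comments at the top of a source file, along these lines (whether grunt-sprockets-directives supports exactly these forms should be checked against its docs; in CoffeeScript the comment prefix is #= rather than //=):
//= require lib/helpers
//= require_tree ./modules

// ...the rest of the file follows the directives and can use what they pulled in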
Are there any libraries for in-browser javascript that provide the same flexibility/modularity/ease of use as Node's require?
To provide more detail: the reason require is so good is that it:
Allows code to be dynamically loaded from other locations (which is stylistically better, in my opinion, than linking all your code in the HTML)
It provides a consistent interface for building modules
It is easy for modules to depend on other modules (so I could write, for instance, an API that requires jQuery so I can use jQuery.ajax())
Loaded javascript is scoped, meaning I could load with var dsp = require("dsp.js"); and I would be able to access dsp.FFT, which wouldn't interfere with my local var FFT
I have yet to find a library that does this effectively. The workarounds I tend to use are:
coffeescript-concat -- it's easy enough to require other js, but you have to compile it, which means it is less great for fast development (e.g. building APIs in-test)
RequireJS -- It's popular, straightforward, and solves 1-3, but lack of scoping is a real deal-breaker (I believe head.js is similar in that it lacks scoping, though I've never had any occasion to use it. Similarly, LABjs can load and .wait() does mollify dependency issues, but it still doesn't do scoping)
As far as I can tell, there appear to be many solutions for dynamic and/or async loading of javascript, but they tend to have the same scoping issues as just loading the js from HTML. More than anything else, I would like a way to load javascript that does not pollute the global namespace at all, but still allows me to load and use libraries (just as node's require does).
2020 UPDATE: Modules are now standard in ES6, and as of mid-2020 are natively supported by most browsers. Modules support both synchronous and asynchronous (using Promise) loading. My current recommendation is that most new projects should use ES6 modules, and use a transpiler to fall back to a single JS file for legacy browsers.
As a general principle, bandwidth today is also typically much wider than when I originally asked this question. So in practice, you might reasonably choose to always use a transpiler with ES6 modules, and focus your effort on code efficiency rather than the network.
PREVIOUS EDIT (or if you don't like ES6 modules): Since writing this, I have extensively used RequireJS (which now has much clearer documentation). RequireJS really was the right choice in my opinion. I'd like to clarify how the system works for people who are as confused as I was:
You can use require in everyday development. A module can be anything returned by a function (typically an object or a function) and is scoped as a parameter. You can also compile your project into a single file for deployment using r.js (in practice this is almost always faster, even though require can load scripts in parallel).
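In other words, a RequireJS module is whatever you return from the factory function passed to define, and anything you depend on arrives as a parameter. A minimal sketch using the dsp example from the question (the module contents are placeholders):
// dsp.js
define([], function () {
  function FFT(signal) { /* ... */ }
  return { FFT: FFT }; // the returned object is the module
});

// app.js
define(['dsp'], function (dsp) {
  dsp.FFT([/* samples */]); // scoped to this callback, no global FFT is created
});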
The primary difference between RequireJS and the node-style require that browserify (a cool project suggested by tjameson) uses is the way modules are designed and required:
RequireJS uses AMD (Async Module Definition). In AMD, require takes a list of modules (javascript files) to load and a callback function. When it has loaded each of the modules, it calls the callback with each module as a parameter to the callback. Thus it's truly asynchronous and therefore well-suited to the web.
Node uses CommonJS. In CommonJS, require is a blocking call that loads a module and returns it as an object. This works fine for Node because files are read off the filesystem, which is fast enough, but works poorly on the web because loading files synchronously can take much longer.
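The difference in calling style, sketched with the dsp.js example from the question:
// AMD (RequireJS): asynchronous, modules arrive via a callback
require(['dsp'], function (dsp) {
  var fft = dsp.FFT; // use the module here
});

// CommonJS (Node): synchronous, the call returns the module object
var dsp = require('dsp.js');
var fft = dsp.FFT;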
In practice, many developers have used Node (and therefore CommonJS) before they ever see AMD. In addition, many libraries/modules are written for CommonJS (by adding things to an exports object) rather than for AMD (by returning the module from the define function). Therefore, lots of Node-turned-web developers want to use CommonJS libraries on the web. This is possible, since loading from a <script> tag is blocking. Solutions like browserify take CommonJS (Node) modules and wrap them up so you can include them with script tags.
Therefore, if you are developing your own multi-file project for the web, I strongly recommend RequireJS, since it is truly a module system for the web (though in fair disclosure, I find AMD much more natural than CommonJS). Recently, the distinction has become less important, since RequireJS now allows you to essentially use CommonJS syntax. Additionally, RequireJS can be used to load AMD modules in Node (though I prefer node-amd-loader).
I realize there may be beginners looking to organize their code. This is 2022, and if you're considering a modular JS app, you should get started with npm and Webpack right now.
Here are a few simple steps to get started:
In your project root, run npm init -y to initialize an npm project
Download the Webpack module bundler: npm install webpack webpack-cli
Create an index.html file:
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <meta http-equiv="X-UA-Compatible" content="ie=edge">
  <title>App</title>
</head>
<body>
  <script src="_bundle.js"></script>
</body>
</html>
Pay special attention to the _bundle.js file - this will be the final JS file generated by webpack; you will not modify it directly (keep reading).
Create a <project-root>/app.js in which you will import other modules:
const printHello = require('./print-hello');
printHello();
Create a sample print-hello.js module:
module.exports = function() {
  console.log('Hello World!');
}
Create a <project-root>/webpack.config.js and copy-paste the following:
var path = require('path');

module.exports = {
  entry: './app.js',
  output: {
    path: path.resolve(__dirname),
    filename: '_bundle.js'
  }
};
In the code above, there are 2 points:
entry app.js is where you will write your JS code. It will import other modules as shown above.
output _bundle.js is your final bundle generated by webpack. This is what your html will see at the end.
Open your package.json, and replace scripts with the following command:
"scripts": {
"start": "webpack --mode production -w"
},
And finally, run the script to watch app.js and generate the _bundle.js file by running npm start.
Enjoy coding!
Check out ender. It does a lot of this.
Also, browserify is pretty good. I've used require-kiss¹ and it works. There are probably others.
I'm not sure about RequireJS. It's just not the same as node's. You may run into problems with loading from other locations, but it might work, as long as there's a provide method or something that can be called.
TL;DR- I'd recommend browserify or require-kiss.
Update:
1: require-kiss is now dead, and the author has removed it. I've since been using RequireJS without problems. The author of require-kiss wrote pakmanager and pakman. Full disclosure, I work with the developer.
Personally I like RequireJS better. It is much easier to debug (you can have separate files in development, and a single deployed file in production) and is built on a solid "standard".
I wrote a small script which allows asynchronous and synchronous loading of JavaScript files, which might be of some use here. It has no dependencies and is compatible with Node.js & CommonJS. The installation is pretty easy:
$ npm install --save @tarp/require
Then just add the following lines to your HTML to load the main-module:
<script src="/node_modules/#tarp/require/require.min.js"></script>
<script>Tarp.require({main: "./scripts/main"});</script>
Inside your main-module (and any sub-module, of course) you can use require() as you know it from CommonJS/NodeJS. The complete docs and the code can be found on GitHub.
A variation of Ilya Kharlamov's great answer, with some code to make it play nice with the Chrome developer tools.
//
///- REQUIRE FN
// equivalent to require from node.js
function require(url){
    if (url.toLowerCase().substr(-3)!=='.js') url+='.js'; // to allow loading without js suffix;
    if (!require.cache) require.cache=[]; //init cache
    var exports=require.cache[url]; //get from cache
    if (!exports) { //not cached
        try {
            exports={};
            var X=new XMLHttpRequest();
            X.open("GET", url, 0); // sync
            X.send();
            if (X.status && X.status !== 200) throw new Error(X.statusText);
            var source = X.responseText;
            // fix (if saved from Chrome Dev Tools)
            if (source.substr(0,10)==="(function("){
                var moduleStart = source.indexOf('{');
                var moduleEnd = source.lastIndexOf('})');
                var CDTcomment = source.indexOf('//# ');
                if (CDTcomment>-1 && CDTcomment<moduleStart+6) moduleStart = source.indexOf('\n',CDTcomment);
                source = source.slice(moduleStart+1,moduleEnd-1);
            }
            // fix, add comment to show source on Chrome Dev Tools
            source="//# sourceURL="+window.location.origin+url+"\n" + source;
            //------
            var module = { id: url, uri: url, exports:exports }; //according to node.js modules
            var anonFn = new Function("require", "exports", "module", source); //create a Fn with module code, and 3 params: require, exports & module
            anonFn(require, exports, module); // call the Fn, execute the module
            require.cache[url] = exports = module.exports; //cache obj exported by module
        } catch (err) {
            throw new Error("Error loading module "+url+": "+err);
        }
    }
    return exports; //require returns object exported by module
}
///- END REQUIRE FN
(function () {
    // c is cache, the rest are the constants
    var c = {},s="status",t="Text",e="exports",E="Error",r="require",m="module",S=" ",w=window;
    w[r]=function R(url) {
        url+=/.js$/i.test(url) ? "" : ".js"; // to allow loading without js suffix;
        var X=new XMLHttpRequest(),module = { id: url, uri: url }; //according to the modules 1.1 standard
        if (!c[url])
            try {
                X.open("GET", url, 0); // sync
                X.send();
                if (X[s] && X[s] != 200)
                    throw X[s+t];
                Function(r, e, m, X['response'+t])(R, c[url]={}, module); // Execute the module
                module[e] && (c[url]=module[e]);
            } catch (x) {
                throw w[E](E+" in "+r+": Can't load "+m+S+url+":"+S+x);
            }
        return c[url];
    }
})();
Better not to use this in production because of the blocking. (In node.js, require() is a blocking call as well.)
Require-stub provides node-compliant require in the browser and resolves both modules and relative paths. It uses a technique similar to TKRequire (XMLHttpRequest).
The resulting code is fully browserifyable, in that require-stub can serve as a replacement for watchify.
Webmake bundles Node-style modules for the browser; give it a try.