Once npm and Node.js are installed, it is easy to use a module by running
npm install module
and then using the JS require() function. I assume the source code, stored somewhere on my computer, will then be loaded. But what if I want to use this module's JavaScript functions within a client's browser? With a bit of work, I can probably end up finding some file on my disk, or maybe on the web, with the source code I need inside it. My question is: is there a standard process that gathers all the dependencies of a given module so I can serve them on a client website? Or am I missing something entirely, and are things supposed to be done in a completely different way?
If the library is published as standard JavaScript modules (ES modules), you can load it directly in the browser, as in this example:
<script type="module">
import { html, render } from '/node_modules/lit-html/lit-html.js';
render(html`<span>Hello, world</span>`, document.body);
</script>
The unpkg service provides a useful tool that automatically transforms module specifiers to absolute URLs; you just have to add the ?module query string:
import { html, render } from 'https://unpkg.com/lit-html/lit-html.js?module';
If, however, as is unfortunately often the case, the library is published as CommonJS (CJS) modules, you'll need to use a module bundler. I'm partial to Rollup. To roll CJS modules up into your bundle, make sure to install and use the rollup-plugin-commonjs plugin.
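For reference, a minimal Rollup config along those lines might look something like this (the entry and output paths are just placeholders, and I've also added rollup-plugin-node-resolve, which you'll typically need so Rollup can locate packages inside node_modules):

// rollup.config.js (a minimal sketch; file paths are placeholders)
import resolve from 'rollup-plugin-node-resolve';  // lets Rollup find packages in node_modules
import commonjs from 'rollup-plugin-commonjs';     // converts CJS modules to ES modules

export default {
  input: 'src/main.js',
  output: {
    file: 'public/bundle.js',
    format: 'esm'            // or 'iife' if you want a classic script
  },
  plugins: [
    resolve(),
    commonjs()
  ]
};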
There are a variety of tools available to merge your dependencies into a single file that can be consumed by a web browser. Such tools are typically referred to as "bundlers".
A few popular examples include:
Webpack
Browserify
Parcel
This list is not meant to be comprehensive or even a recommendation of which tool to use.
Basically, I have a number of legacy web applications that reference and use a library from a CDN (Kendo UI). I have a task to remove such requests to remote hosts, so I'd like to incorporate the library into an existing npm script task that collects all dependencies into a single local JS file which the application references.
The problem I'm having is that this library does not provide pre-compiled JS files that can be used immediately (unlike other libraries such as jQuery or Angular); instead it is modularised and requires webpack or Browserify to use it.
Since our legacy applications do not use the modular approach to loading dependencies, and I have no scope to rewrite them, I would like to somehow package the modularised library into an equivalent js file that will load the library so my application can access it simply via a <script> reference to it.
I have tried using browserify to compile from a source js file that contains simply a require reference to the library, but then referencing the compiled file in my application results in an error as the library's functions are not available to my application.
Can anyone point me in the right direction?
If you're using libraries that are module-based, and you want to use them standalone, you will need to do two things:
1. Expose the module to the global scope, perhaps using the expose-loader (https://github.com/webpack-contrib/expose-loader), or even by just assigning to the window object (see the sketch after the config example below).
2. If the modules also use a library that you're including standalone, you need to tell webpack about it via externals, e.g.:
// webpack.config.js
module.exports = {
  // other stuff..
  externals: {
    jquery: 'jQuery'
  }
};
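For the first point, the window-object approach is often just a line or two in your bundle's entry file (the module and global names below are only placeholders):

// entry.js (placeholder names; expose whatever your legacy pages need)
var myLib = require('some-module-based-library');

// attach it to the global object so plain <script> code can reach it
window.myLib = myLib;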
Finally, when you include these, remember the ordering of your script tags, e.g. make sure you include jQuery before your bundled JavaScript.
I am developing a NodeJS application, and was thinking about the best way to develop a plugin system. Currently, most components on the application are already decoupled in a way that they can be registered in and out to add and/or remove functionality. But currently, those components are still spread across many folders inside the app source. My goal would be to have a plugins folder, with a sub-folder for each plugin, and the application would scan said folder on startup to load the plugins. Also, as a nice plus, I'd love to load plugins from the node_modules (inspired by the way babel handles presets in version 6). I already have a fairly good idea how this system will work.
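Roughly, I picture the startup scan working something like this (the folder layout and the register() hook are just placeholders for illustration):

// plugin-loader.js (a rough sketch, not final code)
const fs = require('fs');
const path = require('path');

function loadPlugins(app, pluginsDir) {
  // treat every sub-folder of pluginsDir as one plugin
  for (const name of fs.readdirSync(pluginsDir)) {
    const pluginPath = path.join(pluginsDir, name);
    if (fs.statSync(pluginPath).isDirectory()) {
      const plugin = require(pluginPath); // expects an index.js in the folder
      plugin.register(app);               // hypothetical registration hook
    }
  }
}

module.exports = loadPlugins;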
Now, my question is: what is the best way to give the plugins access to common files and classes? Let's say that I have a base class Receiver, and a plugin registers a custom receiver (which extends the base class Receiver). How do I give an external npm module access to all those classes without creating an object and passing each one in manually?
I use ES6 imports, which replace Node's require function in my code, and I use relative paths; but to be honest, I've never really understood how Babel implements these module loaders or how I could use a dynamic module loader.
I read through many articles on Browserify, like http://javascriptplayground.com/blog/2013/11/backbone-browserify/, and there is always a step such as the one below:
$ browserify app/app.js | uglifyjs > app/bundle.js
This seems to be done before you run the script in the browser to see how it works. Is there a way to avoid having to do a build each time I change code? Something similar to the define() function in RequireJS...
It's 2015 now and there's a library for this; it's called drq. Internally it uses synchronous XHR requests, so it is only well suited for development purposes. You just have to include it:
<script src="drq.js"></script>
And then, you can do your require calls in any script of the page:
<script>
var myModule = require('my-module'),
myClass = require('./classes/my-class.js');
// etc.
</script>
It will look for node modules up to your web root, so be sure to npm install them in a directory no higher than that. Also, please take a look at the GitHub page, where you can find some tips to increase performance.
Again, please remember that bundles are the optimal solution for production.
I originally said you can't do this for the reasons below, but I want to add that where there is a will there is a way. I'm sure given enough time and effort, you (or someone) could (and probably will) come up with a way to accomplish this task, but as of right now (12/12/13), I don't know of any out-of-the-box tools that will facilitate it.
browserify "modules" are written using the same concept as node.js modules. You write your code, and export any public methods/properties/etc via a module.exports object. Javascript in the browser doesn't support this sort of thing natively. There are some boilerplate templates (some info here) to help facilitate this in the browser, and they can be compatible with browserify, but...
When you browserify your code, the browserify script analyzes your syntax and finds the modules that it has to make available via the require method. This require method gets defined right in your bundle.js that you export, along with all the code for all the dependencies that your module needs. This allows the require method that browserify defines to work synchronously, returning a reference to the module that you requested immediately without waiting for any kind of web response (like, loading a js script).
Require.js works fundamentally differently than browserify. Require.js defines your packages using the define syntax you referenced, and exposes a require method which you use to tell Require.js which modules your code depends on. Require.js then, in turn, looks up the dependencies you require and if it hasn't loaded them for another module yet, generates a new script tag and forces your browser to load that module, waiting to execute your code until that is complete. This is an asynchronous process, which means, the javascript engine continues to process instructions while it waits for the new script to download, parse, and execute. Require.js wraps all of this up in some callbacks, so it can wait until all your dependencies are satisfied, before allowing your defined code to execute (which is why you pass functions to require and define, so require.js can execute them when it's ready).
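To make the contrast concrete, here is a rough sketch of the same tiny module written both ways (the underscore dependency and the function are just examples):

// greet.js, CommonJS style (what browserify consumes): require() resolves synchronously from the bundle
var _ = require('underscore');
module.exports = function greetFirst(names) {
  return 'Hello, ' + _.first(names);
};

// greet.js, AMD style (what Require.js consumes): dependencies load asynchronously,
// and your code runs in the callback once they are ready
define(['underscore'], function (_) {
  return function greetFirst(names) {
    return 'Hello, ' + _.first(names);
  };
});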
The biggest reason not to bundle every time you make a change in development is simply speed of iteration. Some things you can do (with Browserify) to improve performance (that is, speed of bundling) are:
Don't uglify your code during development. You can just bundle it using browserify (make sure you use -d, for sourcemaps) without uglifying/minifying it, that should speed up the bundle performance a bit (for larger projects, anyway).
Split up your modules a little bit. Modules that don't have direct dependencies with one another don't have to be built at the same time. You can include different modules in your application using multiple script tags, or you can concatenate browserify bundles files together. You could absolutely set up some grunt tasks to watch your code for changes, and only compile the modules which contained the code change. This will cut out a lot of wasted cpu cycles, since browserify won't have to parse and transform multiple modules, just the ones that changed. From there you can re-concatenate into one big bundle, or just stick with the multiple bundle includes on the page.
What is the fundamental difference between bower and npm? Just want something plain and simple. I've seen some of my colleagues use bower and npm interchangeably in their projects.
All package managers have many downsides. You just have to pick which you can live with.
History
npm started out managing node.js modules (that's why packages go into node_modules by default), but it works for the front-end too when combined with Browserify or webpack.
Bower is created solely for the front-end and is optimized with that in mind.
Size of repo
npm is much, much larger than Bower, and includes general-purpose JavaScript (like country-data for country information, or sorts for sorting functions) that is usable on the front end or the back end.
Bower has a much smaller number of packages.
Handling of styles etc
Bower includes styles etc.
npm is focused on JavaScript. Styles are either downloaded separately or required by something like npm-sass or sass-npm.
Dependency handling
The biggest difference is that npm does nested dependencies (but is flat by default since npm 3), while Bower requires a flat dependency tree (putting the burden of dependency resolution on the user).
A nested dependency tree means that your dependencies can have their own dependencies, which can have their own, and so on. This allows two modules to require different versions of the same dependency and still work. Note that since npm v3, the dependency tree is flat by default (saving space) and only nests where needed, e.g., if two dependencies need their own version of Underscore.
Some projects use both: they use Bower for front-end packages and npm for developer tools like Yeoman, Grunt, Gulp, JSHint, CoffeeScript, etc.
Resources
Nested Dependencies - Insight into why node_modules works the way it does
This answer is an addition to the answer of Sindre Sorhus. The major difference between npm and Bower is the way they treat recursive dependencies. Note that they can be used together in a single project.
On the npm FAQ: (archive.org link from 6 Sep 2015)
It is much harder to avoid dependency conflicts without nesting dependencies. This is fundamental to the way that npm works, and has proven to be an extremely successful approach.
On Bower homepage:
Bower is optimized for the front-end. Bower uses a flat dependency tree, requiring only one version for each package, reducing page load to a minimum.
In short, npm aims for stability. Bower aims for minimal resource load. If you draw out the dependency structure, you will see this:
npm:
project root
[node_modules] // default directory for dependencies
  -> dependency A
  -> dependency B
     [node_modules]
       -> dependency A
  -> dependency C
     [node_modules]
       -> dependency B
          [node_modules]
            -> dependency A
       -> dependency D
As you can see, it installs some dependencies recursively. Dependency A has three installed instances!
Bower:
project root
[bower_components] // default directory for dependencies
  -> dependency A
  -> dependency B // needs A
  -> dependency C // needs B and D
  -> dependency D
Here you see that all unique dependencies are on the same level.
So, why bother using npm?
Maybe dependency B requires a different version of dependency A than dependency C does. npm installs both versions of this dependency so it will work anyway, but Bower will give you a conflict because it does not like duplication (loading the same resource twice on a web page is inefficient and costly, and it can cause serious errors). You will have to manually pick the version you want to install. This can have the effect of breaking one of the dependencies, but that is something you will need to fix anyway.
So, the common usage is Bower for the packages that you want to publish on your webpages (e.g. runtime, where you avoid duplication), and use npm for other stuff, like testing, building, optimizing, checking, etc. (e.g. development time, where duplication is of less concern).
Update for npm 3:
npm 3 still does things differently from Bower. It installs dependencies flat at the top level of node_modules, but only the first version it encounters; other versions are nested in the tree (inside the parent module's own node_modules).
[node_modules]
  dep A v1.0
  dep B v1.0
    dep A v1.0 (uses root version)
  dep C v1.0
    dep A v2.0 (different from the root version, so it is a nested installation)
For more information, I suggest reading the docs of npm 3
TL;DR: The biggest difference in everyday use isn't nested dependencies... it's the difference between modules and globals.
I think the previous posters have covered well some of the basic distinctions. (npm's use of nested dependencies is indeed very helpful in managing large, complex applications, though I don't think it's the most important distinction.)
I'm surprised, however, that nobody has explicitly explained one of the most fundamental distinctions between Bower and npm. If you read the answers above, you'll see the word 'modules' used often in the context of npm. But it's mentioned casually, as if it might even just be a syntax difference.
But this distinction of modules vs. globals (or modules vs. 'scripts') is possibly the most important difference between Bower and npm. The npm approach of putting everything in modules requires you to change the way you write Javascript for the browser, almost certainly for the better.
The Bower Approach: Global Resources, Like <script> Tags
At root, Bower is about loading plain-old script files. Whatever those script files contain, Bower will load them. Which basically means that Bower is just like including all your scripts in plain-old <script>'s in the <head> of your HTML.
So, same basic approach you're used to, but you get some nice automation conveniences:
You used to need to include JS dependencies in your project repo (while developing), or get them via CDN. Now, you can skip that extra download weight in the repo, and somebody can do a quick bower install and instantly have what they need, locally.
If a Bower dependency then specifies its own dependencies in its bower.json, those'll be downloaded for you as well.
But beyond that, Bower doesn't change how we write javascript. Nothing about what goes inside the files loaded by Bower needs to change at all. In particular, this means that the resources provided in scripts loaded by Bower will (usually, but not always) still be defined as global variables, available from anywhere in the browser execution context.
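Concretely, after a bower install your HTML typically ends up looking something like this (the exact file paths vary by package; these are just illustrative):

<script src="bower_components/jquery/dist/jquery.js"></script>
<script src="bower_components/underscore/underscore.js"></script>
<script>
  // both libraries are now plain globals on window
  console.log($.fn.jquery, _.VERSION);
</script>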
The npm Approach: CommonJS Modules, Explicit Dependency Injection
All code in Node land (and thus all code loaded via npm) is structured as modules (specifically, as an implementation of the CommonJS module format, or now, as an ES6 module). So, if you use NPM to handle browser-side dependencies (via Browserify or something else that does the same job), you'll structure your code the same way Node does.
Smarter people than I have tackled the question of 'Why modules?', but here's a capsule summary:
Anything inside a module is effectively namespaced, meaning it's not a global variable any more, and you can't accidentally reference it without intending to.
Anything inside a module must be intentionally injected into a particular context (usually another module) in order to make use of it
This means you can have multiple versions of the same external dependency (lodash, let's say) in various parts of your application, and they won't collide/conflict. (This happens surprisingly often, because your own code wants to use one version of a dependency, but one of your external dependencies specifies another that conflicts. Or you've got two external dependencies that each want a different version.)
Because all dependencies are manually injected into a particular module, it's very easy to reason about them. You know for a fact: "The only code I need to consider when working on this is what I have intentionally chosen to inject here".
Because even the content of injected modules is encapsulated behind the variable you assign it to, and all code executes inside a limited scope, surprises and collisions become very improbable. It's much, much less likely that something from one of your dependencies will accidentally redefine a global variable without you realizing it, or that you will do so. (It can happen, but you usually have to go out of your way to do it, with something like window.variable. The one accident that still tends to occur is assigning this.variable, not realizing that this is actually window in the current context.)
When you want to test an individual module, you're able to very easily know: exactly what else (dependencies) is affecting the code that runs inside the module? And, because you're explicitly injecting everything, you can easily mock those dependencies.
To me, the use of modules for front-end code boils down to: working in a much narrower context that's easier to reason about and test, and having greater certainty about what's going on.
It only takes about 30 seconds to learn how to use the CommonJS/Node module syntax. Inside a given JS file, which is going to be a module, you first declare any outside dependencies you want to use, like this:
var React = require('react');
Inside the file/module, you do whatever you normally would, and create some object or function that you'll want to expose to outside users, calling it perhaps myModule.
At the end of a file, you export whatever you want to share with the world, like this:
module.exports = myModule;
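Putting those two pieces together, a complete (if trivial) module might look like this (the file and function names are just examples):

// my-module.js: a tiny CommonJS module
var React = require('react');             // declare the outside dependency

// the thing this module wants to share with the world
function Greeting(props) {
  return React.createElement('span', null, 'Hello, ' + props.name);
}

module.exports = Greeting;                // expose it to whoever require()s this file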
Then, to use a CommonJS-based workflow in the browser, you'll use tools like Browserify to grab all those individual module files, encapsulate their contents at runtime, and inject them into each other as needed.
AND, since ES6 modules (which you'll likely transpile to ES5 with Babel or similar) are gaining wide acceptance, and work both in the browser and in Node 4.0, we should mention a good overview of those as well.
More about patterns for working with modules in this deck.
EDIT (Feb 2017): Facebook's Yarn is a very important potential replacement/supplement for npm these days: fast, deterministic, offline package-management that builds on what npm gives you. It's worth a look for any JS project, particularly since it's so easy to swap it in/out.
EDIT (May 2019)
"Bower has finally been deprecated. End of story." (h/t: #DanDascalescu, below, for pithy summary.)
And, while Yarn is still active, a lot of the momentum for it shifted back to npm once it adopted some of Yarn's key features.
2017-Oct update
Bower has finally been deprecated. End of story.
Older answer
From Mattias Petter Johansson, JavaScript developer at Spotify:
In almost all cases, it's more appropriate to use Browserify and npm over Bower. It is simply a better packaging solution for front-end apps than Bower is. At Spotify, we use npm to package entire web modules (html, css, js) and it works very well.
Bower brands itself as the package manager for the web. It would be awesome if this was true - a package manager that made my life better as a front-end developer would be awesome. The problem is that Bower offers no specialized tooling for the purpose. It offers NO tooling that I know of that npm doesn't, and especially none that is specifically useful for front-end developers. There is simply no benefit for a front-end developer to use Bower over npm.
We should stop using bower and consolidate around npm. Thankfully, that is what is happening:
With browserify or webpack, it becomes super-easy to concatenate all your modules into big minified files, which is awesome for performance, especially for mobile devices. Not so with Bower, which will require significantly more labor to get the same effect.
npm also offers you the ability to use multiple versions of modules simultaneously. If you have not done much application development, this might initially strike you as a bad thing, but once you've gone through a few bouts of Dependency hell you will realize that having the ability to have multiple versions of one module is a pretty darn great feature. Note that npm includes a very handy dedupe tool that automatically makes sure that you only use two versions of a module if you actually have to - if two modules both can use the same version of one module, they will. But if they can't, you have a very handy out.
(Note that Webpack and rollup are widely regarded to be better than Browserify as of Aug 2016.)
Bower maintains a single version of each module; it only tries to help you select the correct/best one for you.
Javascript dependency management : npm vs bower vs volo?
NPM is better for node modules because there is a module system and you're working locally.
Bower is good for the browser because currently there is only the global scope, and you want to be very selective about the version you work with.
My team moved away from Bower and migrated to npm because:
Programmatic usage was painful
Bower's interface kept changing
Some features, like the url shorthand, are entirely broken
Using both Bower and npm in the same project is painful
Keeping bower.json version field in sync with git tags is painful
Source control != package management
CommonJS support is not straightforward
For more details, see "Why my team uses npm instead of bower".
I found this useful explanation at http://ng-learn.org/2013/11/Bower-vs-npm/:
On one hand, npm was created to install modules used in a Node.js environment, or development tools built using Node.js, such as Karma, linters, minifiers, and so on. npm can install modules locally in a project (by default in node_modules) or globally, to be used by multiple projects. In large projects, the way to specify dependencies is by creating a file called package.json, which contains a list of dependencies. That list is recognized by npm when you run npm install, which then downloads and installs them for you.
On the other hand, Bower was created to manage your front-end dependencies: libraries like jQuery, AngularJS, Underscore, etc. Similar to npm, it has a file in which you can specify a list of dependencies, called bower.json. In this case your front-end dependencies are installed by running bower install, which by default installs them in a folder called bower_components.
As you can see, although they perform a similar task they are targeted to a very different set of libraries.
For many people working with node.js, a major benefit of bower is for managing dependencies that are not javascript at all. If they are working with languages that compile to javascript, npm can be used to manage some of their dependencies. however, not all their dependencies are going to be node.js modules. Some of those that compile to javascript may have weird source language specific mangling that makes passing them around compiled to javascript an inelegant option when users are expecting source code.
Not everything in an npm package needs to be user-facing javascript, but for npm library packages, at least some of it should be.