I use Node.js (via browserify) for each of my web apps, all of which have some dependencies in common and others specific to themselves. Each of these apps has a package.json file that specifies which versions of which modules it needs.
Right now, I have a /node_modules directory in the parent folder of my apps for modules that they all need to reference, and then I put app-specific modules in a node_modules folder in that app's directory. This works fine in the short term, since my require() statements are able to keep looking upward in the file structure until they find a node_modules directory containing the correct module.
Where this gets tricky is when I want to go back to an old project and run npm install to make sure it can still find all the dependencies it needs. (Who knows what funny-business has occurred since then at the parent directory level.) I was under the impression that npm install did this:
for each module listed in package.json, first check if it's present, moving up the directory tree the same way require does. If it's not, install it to the local node_modules directory (creating that directory if necessary).
When I run npm install inside an app folder, however, it appears to install everything locally regardless of where else it may exist upstream. Is that the correct behavior? (It's possible there's another reason, like bad version language in my package.json). If this IS the correct behavior, is there a way for me to have npm install behave like the above?
It's not a big deal to widely replicate the modules inside every app, but it feels messy, and it prevents me from making small improvements to the common modules without having to update every old package.json file. Of course, this could be a good thing...
When I run npm install inside an app folder, however, it appears to install everything locally regardless of where else it may exist upstream. Is that the correct behavior? (It's possible there's another reason, like bad version language in my package.json). If this IS the correct behavior, is there a way for me to have npm install behave like the above?
Yes, that is what npm install does. In node.js code, the require algorithm has a particular sequence of places it looks, including walking up the filesystem. However, npm install doesn't do that. It just installs in place. The algorithms it uses are all constrained to just a single node_modules directory under your current directory and it won't touch anything above that (except for with -g).
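To make the asymmetry concrete, here is a hypothetical layout (directory and module names are illustrative):

apps/
  node_modules/        <- shared modules; require() can see these from any app below
  my-app/
    node_modules/      <- npm install only ever writes here
    index.js

// in apps/my-app/index.js, require() checks apps/my-app/node_modules first,
// then apps/node_modules, and so on up to the filesystem root:
var lodash = require('lodash');

Running npm install inside apps/my-app, however, consults and populates only apps/my-app/node_modules.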
It's not a big deal to widely replicate the modules inside every app, but it feels messy, and it prevents me from making small improvements to the common modules without having to update every old package.json file. Of course, this could be a good thing...
Yeah basically you're doing it wrong. The regular workflow scales well to the Internet. For your use case it creates some extra tedious work, but you can also just use semantic versioning as intended and specify "mylib": "^1.0.0" in your package.json for your apps and be OK with automatically getting newer versions next time you npm install.
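For reference, a caret range permits any version that doesn't change the left-most non-zero digit, so a declaration like this (the module name is illustrative):

"dependencies": {
  "mylib": "^1.0.0"
}

will accept 1.0.1 or 1.4.0 on the next npm install, but never 2.0.0.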
Related
I have a large React app in production and I'm wondering if it's best to use fixed versions for my packages? I've heard that using the caret (^) is good practice, but it seems to me that it would leave the application open to more bugs?
I've googled this issue quite a bit, and there seems to be a split between ^ and fixed versions. Is there a definitive answer somewhere in the (npm) docs on what approach to use?
During development you can choose whichever you're comfortable with, but I would recommend shrinkwrapping just before you begin testing the app, before going into production. Lock down the dependencies with:
npm shrinkwrap
This command repurposes package-lock.json into a publishable npm-shrinkwrap.json or simply creates a new one. The file created and updated by this command will then take precedence over any other existing or future package-lock.json files. For a detailed explanation of the design and purpose of package locks in npm, see npm-package-locks.
That way you can leave the dependencies declared in package.json as they are (tilde/caret), but the exact versions recorded in npm-shrinkwrap.json are the ones that will actually be used when npm installing.
I've personally had a problem just before going into production, when a dependency declared with ~ (the stricter one) was updated and introduced a bug (which shouldn't happen for a patch/bug-fix release). It's only ever happened once, but I wouldn't want to tempt fate.
You can always update your npm-shrinkwrap.json by first doing npm update <package_name> specifying the package that needs updating, then re-doing npm shrinkwrap to update the existing npm-shrinkwrap.json.
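The update workflow then looks something like this (the package name is illustrative):

npm update mylib       # fetch a newer version that still satisfies package.json
npm shrinkwrap         # regenerate npm-shrinkwrap.json with the new version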
...and don't forget npm ci
I've been reading up on the npm left-pad fiasco, but I'm somewhat confused by how it happened. I think I have a misunderstanding of how npm actually works. If the developer of left-pad unpublished the package, I assume npm install left-pad wouldn't work anymore. However, for users who had already installed it, won't left-pad still be in the node_modules folder? Wouldn't the developers of say, Babel, have to remove and reinstall left-pad for npm to realize that the package has disappeared? I am clearly missing something, but I'm not sure what.
When you run npm install babel, left-pad is not bundled inside babel but rather is expressed as a dependency in its package.json file, so npm then has to go find left-pad and download it as well. If you were installing left-pad, or anything using left-pad, for the first time, you wouldn't be able to. While this means you're safe if it already exists in your local directory, the project would fail to build properly as soon as it is built somewhere else. For example, a CI server that does a clean build from scratch for each new changeset would fail to build any project that relies on left-pad. The same goes if you were checking out a project for the first time, or deploying it to a new server.
This is simple to fix if you were relying on left-pad directly. Just write a replacement and update your code to use the replacement. But when it's required deep in your dependency tree, say by Babel, it's unlikely you can refactor Babel or other modules on your own to use a left-pad replacement. You'd have to wait for all of the various node module developers to update their modules with something else and republish.
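For instance, a drop-in replacement could be as small as this sketch (the same behavior as left-pad, but not the original module's code):

// pad str on the left with ch (default ' ') until it is len characters long
function leftPad(str, len, ch) {
  str = String(str);
  ch = ch === undefined ? ' ' : String(ch);
  while (str.length < len) {
    str = ch + str;
  }
  return str;
}

module.exports = leftPad;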
It's not as apocalyptic as news articles made it sound, but it is a huge inconvenience and throws a wrench in many systems outside of developer workspaces where left-pad was already cached.
As @Lazar said, you understood correctly.
The problem comes in when Babel relies on left-pad and I am trying to install Babel: the install will fail.
Well, I could always rewrite it myself as a workaround.
But if it is a module used by a module used by a module used by... used by Babel, or by several modules, you face a real nightmare, because Babel can't do anything, nor can you, and you are forced to wait for every single module developer relying on left-pad to update their code.
I am coding a lot of individual plugins for my websites and I am using grunt to manage the final distribution of them.
I'm used to writing grunt.loadNpmTasks to import a specific plugin from the node_modules directory right into the Gruntfile performing the tasks.
However, in order to do that, I always npm install <package> --save-dev to make the plugin available in the specific plugin I am coding. But as there are dozens of plugins I am maintaining now, I've found the node_modules directories growing quite big, and my backups get slower and slower because the node_modules directories are full of files.
Is there a way to centralize the plugins, so I can merge all the node_modules directories into one, and tell Grunt where this central repository stands so it can load a particular plugin?
edit: I tried to install grunt-contrib-less globally (-g) as a test, but it still persists in saying Local Npm module "grunt-contrib-less" not found. Is it installed?
I haven't tested this, but it's worth trying. Instead of using grunt.loadNpmTasks, try using load-grunt-tasks.
You should use the config option, which allows you to specify the path of your package.json.
require('load-grunt-tasks')(grunt, {config: '../package'});
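Equally untested, but a complete Gruntfile using that option might look like this sketch (the less task config is a placeholder):

module.exports = function (grunt) {
  // load every grunt-* plugin declared in the parent directory's package.json
  require('load-grunt-tasks')(grunt, { config: '../package' });

  grunt.initConfig({
    less: {
      dist: {
        files: { 'dist/style.css': 'src/style.less' }
      }
    }
  });

  grunt.registerTask('default', ['less']);
};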
Just earlier, I posted my question:
https://stackoverflow.com/questions/28336443/how-to-not-put-my-js-files-in-user-myuser-for-node-js
I have a file, hello.js, located in /Users/MyUser/Desktop/Node/
I can see that my default directory is /Users/MyUser/
Okay, so I get that I need to change my working directory. What I have been able to find so far is to use >process.chdir('/Users/MyUser/Desktop/Node/');
Cool, that works, but now when I get out of the REPL shell, the directory resets.
The person who responded to my question said that I needed to run >npm init and later npm install <name of dependency> --save
My first question: I have run >npm init and see that I can create this package.json file; what does this do exactly?
Secondly: I was told that I need to add dependencies. Could someone please explain to me what this means in Node terms? Does a dependency simply mean a folder that I want node to include? Do I want to add this Node folder on my Desktop to be able to run my scripts?
I am currently trying to go through the learnyounode courses, however I do not want to have to save all of these test files in my /User/MyUser directory, so any advice would be greatly appreciated.
Thanks
I have run >npm init and see that I can create this package.json file; what does this do exactly?
npm init is used to create a package.json file interactively. This will ask you a bunch of questions, and then write a package.json for you.
package.json is just a file that handles the project's dependencies and holds various metadata relevant to the project (project description, version, license information, etc.).
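A minimal one might look like this (all values are illustrative):

{
  "name": "my-app",
  "version": "1.0.0",
  "description": "An example project",
  "license": "MIT"
}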
I was told that I need to add dependencies. Could someone please explain to me what this means in Node terms?
Let's say you're building an application that depends on a number of npm modules; you can specify them in your package.json file this way:
"dependencies": {
"express": "2.3.12",
"jade": ">= 0.0.1",
"redis": "0.6.0"
}
Now doing npm install would install a package, and any packages that it depends on.
A package is:
a folder containing a program described by a package.json file
a gzipped tarball containing (1)
a url that resolves to (2)
a <name>@<version> that is published on the registry with (3)
a <name>@<tag> that points to (4)
a <name> that has a "latest" tag satisfying (5)
a <git remote url> that resolves to (2)
If you need to install a dependency that hasn't been included in package.json, simply do npm install <packageName>. Whether or not you want to include this newly installed package in package.json is completely up to you. You can also decide how this newly installed package should appear in your package.json, as the example after the list shows:
npm install <packageName> [--save|--save-dev|--save-optional]:
--save: Package will appear in your dependencies.
--save-dev: Package will appear in your devDependencies.
--save-optional: Package will appear in your optionalDependencies.
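For example, running the following (the package and the resulting version are illustrative):

npm install jshint --save-dev

downloads the package into node_modules and adds an entry like this to package.json:

"devDependencies": {
  "jshint": "^2.9.0"
}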
Does a dependency simply mean a folder that I want node to include?
Umm, partly yes. You may consider dependencies as folders, typically stored in the node_modules directory.
Do I want to add this Node folder on my Desktop to be able to run my scripts?
No, node manages it all. npm install will automatically create a node_modules directory, and you can refer to those dependencies with require() in your .js files:
var express = require('express');
Node REPL simply provides a way to interactively run JavaScript and see the results. It can be used for debugging, testing, or just trying things out.
process.cwd() points to the directory from which the REPL itself was started. You may change it using process.chdir('/path'), but once you close the REPL session and restart it, process.cwd() will always be reset to the directory from which the REPL was started.
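A quick illustration (paths taken from the question above; output is approximate):

$ node
> process.cwd()
'/Users/MyUser'
> process.chdir('/Users/MyUser/Desktop/Node')
undefined
> process.cwd()
'/Users/MyUser/Desktop/Node'
> .exit
$ node
> process.cwd()
'/Users/MyUser'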
If you are installing some packages/dependencies in node project1 and think those dependencies can also be useful for node project2,
install them again for project2 (to get an independent node_modules directory), or
install them globally [using the -g flag], or
reference packages in project2 as
var referencedDependency = require('/home/User/project1/node_modules/<dependency>')
Simply doing process.chdir('/home/User/project1/node_modules/') in the REPL and referencing the dependency as var referencedDependency = require('<dependency>') in your js file won't work.
>process.chdir('/Users/MyUser/Desktop/Node/'); changes the working directory only for that particular REPL session.
Hope it helps!
This has nothing to do with node.js but is rather inherent in the design of Unix (which in turn influences the design of shells on other operating systems).
Processes inherit values from their parent's environment but their environments are distinct.
That terse description of how process environments work has a sometimes unexpected behavior: you cannot change your parent's environment. It was designed this way explicitly for security reasons.
What this means is, when you change the working directory in a process and then quit that process, your shell's working directory will not be affected. Indeed, your shell's working directory isn't affected even while the process (in this case, the node REPL) is running.
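You can demonstrate this from Node itself (a sketch; the /tmp path is arbitrary):

// spawn a child node process that changes its own working directory
var execSync = require('child_process').execSync;

console.log('parent cwd before:', process.cwd());
execSync("node -e \"process.chdir('/tmp'); console.log('child cwd:', process.cwd())\"", { stdio: 'inherit' });
console.log('parent cwd after: ', process.cwd()); // unchanged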
This exact question is often asked by people writing shell scripts who want a script that cds into someplace. But it's also common to find this question asked by people writing in other languages such as Perl, Tcl, Ruby, etc. (even C).
The answer to this question is always the same regardless of language: it's not possible to CD from another program/script/process.
I'm not sure how Windows handles it so it may be possible to do it there. But it's not possible on Unixen.
I'm building a Node module with devDependencies that should be globally installed, such as jasmine-node and jshint. What I essentially need is to be able to reference their binaries in my makefile / npm scripts section to run tests, lint, etc. In other words I do not wish to require() them programmatically.
After digging around I'm still confused on how to handle this:
1) My first approach was to assume that these modules would be globally installed, to clarify this in my module's documentation, and to reference their binaries as globals - i.e. to expect them to be globally available. This conflicts with this piece of advice:
Make sure you avoid referencing globally installed binaries. Instead, point it to the local node_modules, which installs the binaries in a hidden .bin directory. Make sure the module (in this case "mocha") is in your package.json under devDependencies, so that the binary is placed there when you run npm install.
(taken from this post)
This generally sounds right, as the aforementioned setup is rather fragile.
2) My next approach was explicitly including those modules in devDependencies (although they are still globally installed on my system (and most probably on users' & contributors' systems as well)). This ensures that appropriate versions of the binaries are there when needed and I can now reference them through node_modules/.bin/.
However, I'm now in conflict with this piece of advice:
Install it locally if you're going to require() it.
(taken from npm docs)
Regardless of that, I do notice that npm install will now actually fetch nothing (display no network activity) for the globally installed modules.
My questions:
Are the local versions of globally installed modules (that are mentioned in devDependencies) just snapshots (copies) of the global ones, taken during npm install?
Is 2) the correct way to go about doing this? or is there some other practice I'm missing?
Here's my personal take on this, which is decidedly divergent from node.js common practice, but I believe it is an overall superior approach. It is detailed in my own blog post (disclaimer about self-promotion, yada yada) Managing Per-Project Interpreters and the PATH.
It basically boils down to:
Never use npm -g. Never install global modules.
Instead, adjust your PATH to include projectDir/node_modules/.bin (see the sketch below).
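A sketch of what that looks like in a shell profile or per-project activation script:

export PATH="$PWD/node_modules/.bin:$PATH"

jshint --version     # now resolves to the project-local binary, if installed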
Revisiting my own question a couple of years after it was originally written, I feel I can now safely say that the quoted 'advice'
Install it locally if you're going to require() it.
does not stand anymore. (It was part of the npm docs, but the posted 2-year-old link gives me a 404 at the time of this writing.)
Nowadays, npm run is a fine way to do task management / automation, and it automatically exports locally installed modules' binaries into the PATH before executing. Thus, it makes perfect sense to locally install modules that are not to be require()d, such as linters and test-runners. (By the way, this is completely in line with the answer that Peter Lyons provided a couple of years ago - it may have been 'decidedly divergent from node.js common practice' back then, but it's pretty much widely accepted today :))
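For example, with dev tools installed locally (versions and file layout are illustrative):

"devDependencies": {
  "jshint": "^2.9.0",
  "jasmine-node": "^1.14.0"
},
"scripts": {
  "lint": "jshint lib/",
  "test": "jasmine-node spec/"
}

npm run lint and npm test will then find those binaries in node_modules/.bin without any global install.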
As for my second question
Are the local versions of globally installed modules (that are mentioned in devDependencies) just snapshots (copies) of the global ones, taken during npm install?
I am pretty confident that the answer is No. (Perhaps the lack of network activity that I was observing back then, during the installation of local modules which were also globally installed, was due to caching...?)
Note, Nov 12 2016
The relevant npm docs to which the original question linked have moved here.