I would like to create a local npm registry with sinopia and then publish all the packages from my project's node_modules directory to it. Effectively, I want to be able to run npm install --registry="http://localhost:4873" in my project offline and get all the needed dependencies from the local registry. Is there a simple way of doing this?
sinopia caches the packages it pulls from npmjs.org by default.
If you point npm at your sinopia registry and then do a clean npm install (delete node_modules before running it) while sinopia is connected to the internet, it should pull down all of the packages from npmjs.org and cache them.
After that, with sinopia disconnected, subsequent installs should use the locally cached copies and work as intended.
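A minimal sketch of that flow, assuming sinopia is already installed and running on its default port 4873 (a project-level .npmrc is equivalent to passing --registry on every command):

```shell
# Point this project at the local sinopia registry via a project-level .npmrc
echo 'registry=http://localhost:4873/' > .npmrc
cat .npmrc   # → registry=http://localhost:4873/

# While online, do a clean install through sinopia so it caches every package:
#   rm -rf node_modules && npm install
# Once offline, the same `npm install` is answered from sinopia's local storage.
```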
I am using vis-timeline in one of my projects. I have made some changes in vis-timeline, built it locally, and am using it as a dependency in my project. While doing so, vis-timeline gets installed properly, but I believe the peer dependencies of vis-timeline are not being installed. Note that I'm using npm version 7.6.3.
cd vis-timeline
# added some console logs in a few files
npm install
npm run build
Then in my project:
cd my-app
npm install local-path-to-my-vis-timeline
Running the above commands installs vis-timeline in my-app's node_modules. However, other peer dependencies of vis-timeline, like vis-data, do not come in automatically. Since I am using npm version 7.6.3, wasn't that supposed to happen automatically? If not, is there a graceful solution to this?
Or let me know of any better way to make local changes to the vis-timeline library and use it in my local project for debugging.
Sounds like an issue with npm. This post has a list of solutions that might work.
Otherwise, maybe try using yarn instead of npm?
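If npm 7 isn't pulling in the peer dependencies of a locally built package automatically, one workaround is to read them from its package.json and install them by hand. A sketch, using a stand-in file (the vis-data entry and its version range here are assumptions; the real list lives in vis-timeline's own package.json under "peerDependencies"):

```shell
# Stand-in for vis-timeline's manifest (the real one is node_modules/vis-timeline/package.json)
cat > vis-timeline-package.json <<'EOF'
{
  "name": "vis-timeline",
  "peerDependencies": {
    "vis-data": "^7.0.0"
  }
}
EOF

# See which peer dependencies need installing alongside the local package
grep -A 2 '"peerDependencies"' vis-timeline-package.json

# Then, in my-app, install them explicitly:
#   npm install local-path-to-my-vis-timeline
#   npm install vis-data
```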
My host machine's firewall will not allow connecting to the internet.
So npm install will not work there.
npm ERR! network request to https://registry.npmjs.org/Puppeteer failed
So how can I install Puppeteer in this machine? Is there any standalone installer available?
I do the following whenever I'm on the road or don't have internet access but want to use some node_modules somewhere far away.
Two ways to deal with this:
Create the project and install all dependencies while you have the internet access, use it whenever you want.
Install just the specific dependencies and copy the node_modules and package.json around.
I will discuss the second, because both options are basically the same.
First, find a computer with internet access, and then create a blank Node.js project just for puppeteer. Copy the whole node_modules folder for future use, not just the puppeteer folder. Note that it will never get updated and the version is always fixed; if you want to update it, you need to repeat similar steps.
Here are the steps:
➜ mkdir puppeteer-copy
➜ cd puppeteer-copy
➜ yarn add puppeteer
In the package.json file, you will see puppeteer listed as a dependency; make sure you have this in your program when using this copied package. You can copy just that line if you want.
➜ ls
node_modules package.json yarn.lock
➜ cat package.json
{
  "dependencies": {
    "puppeteer": "^2.0.0"
  }
}
The reason you need the whole node_modules folder is that puppeteer depends on several other packages:
➜ node_modules ls
agent-base es6-promisify minimatch puppeteer
async-limiter extract-zip minimist readable-stream
balanced-match fd-slicer mkdirp rimraf
brace-expansion fs.realpath ms safe-buffer
buffer-from glob once string_decoder
concat-map https-proxy-agent path-is-absolute typedarray
concat-stream inflight pend util-deprecate
core-util-is inherits process-nextick-args wrappy
debug isarray progress ws
es6-promise mime proxy-from-env yauzl
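The move to the offline machine can be sketched like this, with a stand-in directory in place of the real puppeteer-copy project (archiving with tar keeps the deeply nested node_modules tree intact in one file, which makes the copy less error-prone than dragging folders around):

```shell
# Stand-in for the puppeteer-copy project built on the online machine
mkdir -p puppeteer-copy/node_modules/puppeteer
echo '{"dependencies":{"puppeteer":"^2.0.0"}}' > puppeteer-copy/package.json

# Pack the whole project, node_modules included, into one archive
tar czf puppeteer-copy.tar.gz puppeteer-copy

# ...move the archive via flash drive, then unpack on the offline machine:
rm -rf puppeteer-copy            # simulating the fresh offline machine here
tar xzf puppeteer-copy.tar.gz
ls puppeteer-copy/node_modules   # → puppeteer
```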
Maybe go to a computer that does have internet access and install puppeteer inside an npm project (npm init -y && npm i puppeteer). Then look through the node_modules folder that gets created and copy the puppeteer folder inside it. Paste that onto a flash drive, connect it to the computer with no internet, and drop it into the node_modules folder of your project. I haven't tried whether this works, but it would be my first approach. I'm curious what you are trying to accomplish with puppeteer if you have no internet, though...
I'd like to know how npm works compared to Maven (I come from a Java background) in terms of managing packages.
I have created a generic component using Angular 4 that will be used in many projects. So I have published it to our Nexus registry, and from the client projects I just import the package; it gets downloaded from the registry and everything works fine.
Now, for development, I don't want to publish to the registry every single time I make a modification to the generic component and rebuild the clients.
Instead, I would like to do it like we do with Maven in Java: we install the artifact in our local repo, and it is picked up from there before going to the global 'artifactory'. But I see that when we install a module using npm, it goes into the node_modules folder inside the same project, so the module is not available to any other project.
How should I do that? In other words, does npm keep a local repository where installed modules are accessible to other projects without the need to publish to the global registry?
Thanks
Use the --global switch with the npm install command to install the package of your choice globally.
Hope that helps.
To make something available to the rest of the system's node package environment through npm, you can install it globally (which is local to your system) rather than locally (which is local to your project). You can see the documentation for global installs in this part of the npm documentation.
npm i -g package names here
npm install --global package names here
You can update your globally installed packages when you need to, just as you would locally installed ones.
npm update -g package names here
(or all of them without specifying)
npm update -g
See the full NPM documentation pages for more detailed flags, etc.
If you're hoping to use your own packages in a managed environment, you can either publish them as private modules, or keep them in a VCS (usually git) and reference them in your projects' package.json dependencies block by the appropriate method for that VCS: GitHub URLs, or more generally URLs for other git hosts, like
"dependencies": {
  "myComponent": "user/repo#feature\/branch",
  "otherComponent": "git+https://myGitHost.tld/.../projectName.git#commit"
}
NPM is not made for large files. My npm package contains an 800MB SQLite database. I thought I could easily keep it out of the package tarball and require it via a URL dependency.
But a URL dependency has to point at another npm package tarball for download. So I will end up with just another npm tarball containing the 800MB database: by keeping it out of one package, I would have to put it in another.
Is this bad usage of npm, and if so, what is the best way to install the database file? What's important is that npm takes care of the proper installation and only installs the package if it contains the database.
If you
- publish module A in npm,
- have A depend on module B as a URL dependency, and
- host B yourself at a publicly accessible URL,
then you are not abusing the npm registry at all.
Only the individual users who choose to use your package A will ever download B, but A is available in the npm registry for all to find and make use of.
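For concreteness, a URL dependency in A's package.json looks like this (the example.com URL and the tarball name are placeholders for wherever you host B):

```shell
# Write a sketch of module A's manifest; `npm install` in A's directory
# would fetch B's tarball from the given URL
cat > package.json <<'EOF'
{
  "name": "A",
  "version": "1.0.0",
  "dependencies": {
    "B": "https://example.com/downloads/B-1.0.0.tgz"
  }
}
EOF
grep '"B"' package.json   # shows B's tarball URL
```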
My network doesn't allow using npm install.
How can I install and use nodemon? Node runs only after setting the PATH variables on Windows. I tried setting the path for nodemon, but without results.
The easiest way to install an npm package is going to be to either tunnel out of your network with a proxy, or to simply install the package while you're on a different network. The reason it's not as simple to just download it is that npm packages have a list of dependencies that need to be installed along with it. Npm takes care of installing the dependency graph for you. If you try to install it manually you would have to manually go over nodemon's package.json file and install all of its dependencies. That might not be so bad until you realize that you then have to go through all those dependencies and install their dependencies, and so on...
I'm not at all affiliated with IPVanish but I recently signed up for their service for the same reason as you. My computer has a VPN configured that connects to an IPVanish server and then my computer tunnels all internet traffic through that VPN. It's nice for simple anonymous web browsing, but more importantly there is no way for network admins here to see where any of my traffic is going. To them it appears that I'm just talking to a random server. They'd have to block every single IPVanish server (and there's a lot!).
There are other alternatives but that one had good reviews and it's only $10 a month. I haven't tried any others but I'm sure they're just as good.
If tunneling out of your network or installing the module on a different network isn't an option, I'm happy to install it myself and upload a zip of the completed install to Google Drive so you can just extract it to the global npm folder. However, that would obviously not be a permanent solution for you and even though I have good intentions, you don't know me and I don't recommend downloading random stuff off of a stranger's Google Drive.
I recommend getting a friend to do the following from another network:
1. Install nodemon: npm install -g nodemon
2. Find where global npm modules are installed: npm config get prefix
3. Navigate to the global npm module path, find the nodemon directory, and zip it up.
4. Email/Dropbox the archive to you.
5. On your machine, figure out where global npm modules are installed: npm config get prefix
6. Extract the nodemon zip to that location.
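The archive-and-extract part of that can be sketched offline like this. The lib/node_modules layout is what the npm prefix points at on Linux/macOS (on Windows the modules sit directly under the prefix), and the stand-in directories here replace the friend's real install and your real prefix:

```shell
# Stand-in for the friend's global install
# (the real path is "$(npm config get prefix)/lib/node_modules")
mkdir -p lib/node_modules/nodemon
echo '{"name":"nodemon"}' > lib/node_modules/nodemon/package.json

# Zip up the nodemon directory to send over
(cd lib/node_modules && tar czf ../../nodemon.tar.gz nodemon)

# On your machine: extract into your own global module path
mkdir -p offline-prefix/lib/node_modules   # stand-in for your npm prefix
tar xzf nodemon.tar.gz -C offline-prefix/lib/node_modules
ls offline-prefix/lib/node_modules   # → nodemon
```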