I looked into npm's package.json file and discovered that npm is actually just a Node.js package with a lot of dependencies, such as lodash. This means that an incident like the one with the left-pad package, which broke a lot of npm packages, could affect npm itself.
I also see a general tendency: pip is written in Python, RubyGems in Ruby, Composer in PHP, Maven in Java, and so on. But is it a good idea to write a package manager in its target language?
More specifically, npm was built using npm itself; JavaScript has nothing to do with the left-pad incident. I can't imagine them not using their own product, for several reasons:
It's a tool for managing software dependencies, and they must use one. Would you propose they use someone else's? Of course, if you trust your product, you're going to use it yourself.
The leftpad "incident" was a policy flaw more than a software flaw which they obviously did not predict or consider to be a serious concern until something serious happened. Therefore, why would this be a reason not to use npm.
Out of the hundreds of thousands of packages hosted, this can't have happened often, or it would have been fixed long ago. That's quite impressive.
It was pretty easy to fix just by updating the caching policy, so it's not a threat to npm.
Other package management tools have had similar problems (or worse). For example, an entire Maven repository once went offline due to lack of funding. This is unlikely to happen to npm, because it is centralized and there are many large stakeholders interested in making sure it stays up.
Incidents like these make the ecosystem more stable and mature.
Like all stories, this will blow over in no time.
The simple reason is that npm is the default package manager for Node.js, the JavaScript runtime environment. It is natural for a package manager to be written in the language of its runtime.
I have a NodeJS project published on GitHub that uses a few NPM modules, as specified in my package.json. I have my package-lock.json committed into the repo.
Recently I got notices on my repository about a newly discovered security vulnerability in one of my dependencies. Upon further inspection, it wasn't one of my direct dependencies that had the vulnerability, but rather a module that one of my dependencies depends on. Because all the modules show up in my package-lock.json, the notice tells me to update that dependency to the latest version.
- myproject
  - someDependency
  - anotherDependency
    - aSubDependency
      - anotherOne <--- this one has a security issue
So now I have to ask: is it worth committing a package-lock.json? I wouldn't have any flagged security vulnerabilities in my project if I didn't have a package-lock.json. Now I am forced to update my project and republish simply to update the package-lock.json. If that file weren't there at all, the problem would fix itself, because anyone who installs or updates my project using ONLY the package.json would automatically get the updated dependency from upstream.
Think about it like this: Bob creates moduleA. Then someone else creates moduleB, which depends on moduleA. Then 1000 developers out in the world create various projects that depend directly on moduleB. If Bob discovers and fixes a security vulnerability in moduleA, those 1000 people now have to update their 1000 projects just to pick up the fix, all because they were committing their package-lock.json.
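To make the semver mechanics concrete, here is a minimal sketch of the relevant package.json fragment (the module name comes from the example above; the version range is made up):

"dependencies": {
  "moduleB": "^1.0.0"
}

Without a lockfile, npm install resolves the newest 1.x of moduleB, and the same caret logic applies transitively, so a patched moduleA flows downstream automatically. With a package-lock.json, the exact pinned versions win until the lockfile itself is regenerated.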
So is it worth it? Do the advantages of package-lock.json outweigh these drawbacks?
Yes, it is worth it.
This file is intended to be committed into source repositories, and serves various purposes:
- Describe a single representation of a dependency tree such that teammates, deployments, and continuous integration are guaranteed to install exactly the same dependencies.
- Provide a facility for users to “time-travel” to previous states of node_modules without having to commit the directory itself.
- To facilitate greater visibility of tree changes through readable source control diffs.
- And optimize the installation process by allowing npm to skip repeated metadata resolutions for previously-installed packages.
See npm documentation
See GitHub - "Viewing and updating vulnerable dependencies in your repository"
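If you do keep the lockfile, npm ships tooling for exactly this situation; assuming npm 6 or later, the usual workflow is roughly:

npm audit
npm audit fix

The first command lists known vulnerabilities in the installed tree; the second bumps vulnerable (sub)dependencies within their semver ranges and updates package-lock.json. Committing the regenerated lockfile then delivers the fix to everyone who clones the repo.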
I'm a little lost as to how I should proceed; actually, I don't even know where to start.
I've been working on a few WordPress sites lately, and I usually create a dev environment using npm and Grunt.
I set up a folder and do npm init. Then I install all the Grunt plugins I need, such as watch, sass, uglify, etc. I then download WordPress and set up the Gruntfile.js so that, for example, my Sass compiles to my WordPress theme's stylesheet. Just the usual (I hope).
The thing is, rather than always repeating the same steps over and over, I'd like to automate them (the same configuration steps for each site).
So here is my question: how do you go about creating a script that will automatically install Grunt plugins and configure them, download the latest WordPress, and set up the theme files (domain name, etc.)?
I don't need an explanation of how to do all these steps; just a few pointers on where to start and what tools to use would be great. Being quite the novice at script writing, I can use any information.
Try Yeoman.
There is a Yeoman generator for WordPress boilerplate. It uses Gulp instead of Grunt, but follows the same idea you need.
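A rough sketch of the commands involved (the generator name below is illustrative; check the Yeoman generator directory for a current WordPress one):

npm install -g yo
npm install -g generator-wordpress
yo wordpress

The first command installs the Yeoman CLI, the second installs a generator, and yo then scaffolds a new project from it, prompting for things like the theme name.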
So, as a developer I've mostly worked with PHP on the back end (running on Apache), and with HTML, CSS, and JavaScript on the front end. Now I've begun to dip my toes into the world of Node.js, Express.js, and Angular.js.
I'm curious how you would set up Node and Express in a production environment. I know Node is not a web server, although it can serve files, but what does a typical production environment for this stack look like?
Thanks so much in advance.
Floyd
Your question is quite broad, so I'll give you an idea of how I set everything up, using DigitalOcean as an example.
Deployment
DigitalOcean setup guide
The important part is that nginx will be your reverse proxy. Once you've done this, remember to:
Not leave any console.log() calls inside your code. They are synchronous and will slow down the code.
Set the environment variable NODE_ENV to production.
Serve static files from nginx if possible.
Use the helmet middleware for Express security (a sketch follows below).
Resolve www vs non-www at the nginx layer.
You will still need to install MongoDB, RethinkDB, etc. separately. If you wish, you can also use a database service provider like MongoLab.
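To make a couple of the list items concrete, here is a minimal sketch of an Express app prepared for running behind nginx; it assumes express and helmet are installed, and the port number is arbitrary:

// app.js - a sketch, not a complete production setup
const express = require('express');
const helmet = require('helmet');

const app = express();
app.use(helmet()); // sets security-related HTTP headers
app.set('trust proxy', 1); // we sit behind one reverse proxy (nginx)

if (process.env.NODE_ENV !== 'production') {
  console.log('warning: NODE_ENV is not set to production');
}

app.get('/', (req, res) => res.send('hello'));
app.listen(process.env.PORT || 3000);

Start it with NODE_ENV=production node app.js, or set the variable in your process manager.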
Scaling
The setup mentioned above is fine for most websites, but if you are hitting speed bottlenecks or your app is crashing, you may want to:
Use toobusy. It's middleware that sends away requests when the load is too high (a sketch follows after this list).
Check the order of your routes. Ideally, app.use(express.static()) should sit just before your 404 handler.
Or serve static assets from S3 or somewhere else.
Look up the cluster module. There are packages that make it easier to use, though.
Use Cloudflare for DNS. Highly recommended.
Use Varnish for caching.
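As a sketch of the toobusy point above (this uses the toobusy-js package, npm install toobusy-js, and assumes an Express app like the one earlier; the 503 message is mine):

// load shedding: reply 503 immediately instead of queueing requests under heavy load
const toobusy = require('toobusy-js');

app.use(function (req, res, next) {
  if (toobusy()) { // true when the event loop is lagging badly
    res.status(503).send('Server is too busy right now, try again later.');
  } else {
    next();
  }
});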
Notes:
Other people have pointed out that you can use Node.js by itself, without nginx. However, most people recommend using nginx in production environments; some providers, like AppFog, even ship nginx as a default.
All the packages you named are available through npm (Node Package Manager). npm usually comes with Node, and is pretty much essential for all Node.js projects. You can install packages like this:
npm install express
which installs Express and all its dependencies. And if you want to be able to replicate your environment on any computer, run
npm init
and add the --save flag to all install commands. This records each package in a file called package.json, and the next time you want to install everything from scratch, you can include your package.json and just run
npm install
in the directory.
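For reference, a minimal package.json produced this way might look like the following; the name and version numbers are illustrative:

{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "express": "^4.0.0"
  }
}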
Having asked this question and not gotten a perfect example, I would like to explain this in a more hardware-oriented way (that's how production works, on real servers).
Let's say you have a droplet, a $5 one from DigitalOcean, with Ubuntu installed on it and nothing else.
You buy a domain name (www.example.com). You buy a server from any company, and they give you an IP address (let's say 12.133.222.59). Now suppose your Node app runs on port 8000: you start it by typing node app.js, then go to the browser and open localhost:8000. Your app is now running. Excellent. What you need to do next is put your app on the server, where it will run at 12.133.222.59:8000. But that's not how you want it: you want www.example.com to open up the same page as localhost:8000, am I right?
If yes, you need to link www.example.com with 12.133.222.59:8000.
How do you do that? Well, there are many ways. I like the easy one, which for me is Nginx. It will serve your web application at a particular address with only a little setup.
Now, how to use Nginx to do that is beyond the scope of the question, but if you want me to continue, I will.
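For orientation only, this is roughly the shape of the Nginx server block involved; it assumes the app from the example listens on port 8000, and www.example.com is the question's placeholder domain:

server {
    listen 80;
    server_name example.com www.example.com;

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}

With that in place (and DNS pointing the domain at 12.133.222.59), Nginx forwards requests for www.example.com to the Node app on port 8000.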
In the past I've used a Yeoman generator and Grunt for all of my dev tasks. Usually when working on a project I'll use it with Compass to compile my SCSS, and then package and uglify my JS, optimize images, lint my code, and many other useful things.
Recently I've seen a trend of people using webpack instead of Grunt plugins for many of these tasks. Why is this? What is better about a module bundler in this regard?
I'm sure others have their reasons, but the biggest reason I migrated to webpack (more specifically webpack-dev-server) is that it serves your bundle from memory (as opposed to disk), and its watcher recompiles only the files you changed while reusing the rest from its in-memory cache. This makes development much faster. What I mean is: while actively editing code, I can hit Ctrl+S in Sublime Text, and by the time I Alt+Tab to Chrome, the rebuild is already done. I had a grunt + browserify + grunt-watch setup before, and it took at least 5 seconds to rebuild every time I saved (and that was after I'd made a bunch of specialized optimizations to the Grunt build). That said, I still integrated webpack with gulp, so I kept a task runner for my project.
EDIT 1: I also want to add that the old grunt + LESS/SASS + browserify + uglify + grunt-watch setup didn't scale well. I was working on a major project from scratch; at first it was fine, but as the project grew it got worse every day. Eventually it became incredibly frustrating to wait for Grunt to finish building after every Ctrl+S. It also became apparent that I was waiting on a bunch of unchanged files.
Another nice little thing is that webpack allows you to require stylesheets in .js, which establishes the coupling of source files within the same module. Originally we declared stylesheet dependencies using @import in .less files, but that disconnected source files in the same module and created a separate dependency graph. Again, all of this is highly opinionated; it's just my personal take, and I'm sure others think differently.
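As a sketch of what requiring a stylesheet from .js looks like in practice (this is webpack 2+ syntax and assumes style-loader, css-loader, and less-loader are installed):

// webpack.config.js (fragment)
module.exports = {
  module: {
    rules: [
      { test: /\.less$/, use: ['style-loader', 'css-loader', 'less-loader'] }
    ]
  }
};

// in a component's .js file:
require('./button.less'); // the stylesheet now travels with the module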
EDIT 2: Alright, after some discussions below, let me try to offer a more concise and less opinionated answer. One thing webpack does really well is watch, read, preprocess, update its cache, and serve, all with a minimal amount of file I/O and processing. Gulp pipes work pretty well, but when it comes to the bundling step, the bundler inevitably ends up having to read all of the files from a temp directory, including the unchanged ones. As your project grows, the wait time for that step grows with it. By contrast, webpack-dev-server keeps everything cached in memory, so wait time during development stays minimal. However, to achieve this kind of memory caching, webpack needs to cover everything from watching to serving, so it needs to know your preprocessing configs. Once you have configured webpack to do all that, you might as well reuse the same configs for producing builds other than the dev server, which is how we ended up in this situation. That said, exactly which steps you hand to webpack is still a matter of personal preference. For example, I don't do image processing or linting in my dev server; in fact, my lint step is a completely separate target.
I'd like to install Node.js on OS X.
You can build it from source by cloning the GitHub repo, or you can download an installer.
For those who've done Node.js development on OS X, what are the tradeoffs and which option would you recommend?
I would suggest using neither the installer nor Homebrew; instead, use NVM to install whatever version of Node you need. That way you can fluidly develop in whichever version of Node you choose, without having to worry about the headache of conflicting global packages or $PATH cruft.
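A sketch of the NVM workflow (the version label is illustrative; nvm install also accepts specific version numbers):

nvm install node
nvm ls
nvm use node

The first command installs the latest Node release, the second lists installed versions, and the third switches the current shell to the chosen one.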
I use the Node.js installer on OS X. I see no benefit to building from source unless you actually want to mess with the source; you'll be up and running faster with the installer. They keep the binary installer up to date with respect to the version, too, so it's not as if you can get a later version by building from source.
If you're deploying on some other platform, be careful about installing binary packages using npm, because they might not be available there or might have to be rebuilt. I have sometimes made the mistake of copying node_modules binaries from OS X to Windows, and of course they don't work. You have to run npm on Windows to get the right binaries for Windows, and with complex things like PhantomJS you can run into trouble.
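If you do end up with a node_modules directory copied from another platform, the standard remedy (assuming the packages support the target platform at all) is to rebuild the native addons in place:

npm rebuild

or simply delete node_modules and run npm install again on the target machine.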
Unless your production environment is OS X, I highly recommend just running a VM that is the same as your production environment.
If you've ever built a non-trivial application in C, you know that cross-platform development is difficult. Different compilers create different instructions for different architectures and operating systems, and this variability can lead to bugs. You should aim to use the same source, the same compiler, the same operating system, the same Node.js version, and the same architecture as your production environment.
Running your own VM as your dev environment will save you a ton of time on integration issues (especially if there are other developers), and it will encourage you to write a single build script. Again, a critical time saver.
Key points:
Single build command
Mirror dev environments to production (reducing differences from compilers, architectures, source versions, etc.)
Homogenize individual developers' dev environments to ease testing and reduce bugs
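As a sketch of the single-build-command idea, assuming an npm-based project, the whole setup can often be reduced to a short script checked into the repo (the file name and exact steps are illustrative):

#!/bin/sh
# build.sh: from fresh checkout to running app in one command
npm install
npm test
npm start

The point is that every developer, and every VM, runs the identical script.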