How to resolve express server's dependencies after building to dist/

I use webpack to bundle front-end assets and put them in a dist directory. However, I would also like an express server to serve the index.html that html-webpack-plugin generates and also deposits in dist. The express server requires a few node modules, mainly express, body-parser, moment, etc.
There is no node_modules in dist though, so it falls at the first hurdle when express cannot be found. Should I make a separate package.json just for this little express server and keep it within dist, and put its npm install on a separate line of my Dockerfile (seems a little complex...), or is there a better way for this server to resolve its dependencies after webpacking?
Dockerfile
FROM node:8.4.0-alpine
WORKDIR /opt/app
COPY ./node_modules node_modules
COPY ./dist .
EXPOSE 6500
ENTRYPOINT ["node", "server.js"]
The COPY ./node_modules line is a temporary attempt to see if moving all node_modules into the app root will allow the server to run. It does, but of course this includes everything from express and body-parser to react and webpack-dev-server.

You are copying npm modules from your local system (most likely) to an alpine instance, which is a different OS, so they are not necessarily going to be compatible.
You need a RUN npm install step inside the image instead.
My guess is you need a Dockerfile similar to this:
FROM node:8.4.0-alpine
WORKDIR /opt/app
COPY package.json .
RUN npm install
COPY ./dist .
EXPOSE 6500
# This might also be wrong: ENTRYPOINT ["node", "server.js"]
CMD [ "npm", "start" ]
If I am understanding you correctly, you might have to move the files afterwards:
RUN mv node_modules dist/node_modules
There is a nice tutorial that might be helpful here: https://nodejs.org/en/docs/guides/nodejs-docker-webapp/
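Putting it together, a minimal sketch of such a Dockerfile (it assumes your package.json lists the server's runtime dependencies like express and body-parser, and that server.js lands in dist):
FROM node:8.4.0-alpine
WORKDIR /opt/app
# Install dependencies inside the image so they match alpine
COPY package.json .
RUN npm install --production
# Copy the webpack output next to node_modules so require() resolves
COPY ./dist .
EXPOSE 6500
CMD ["node", "server.js"]
The --production flag skips devDependencies such as react tooling and webpack-dev-server, which the server doesn't need at runtime.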

Related

Deploying a NestJs project on a single board computer (Raspberry or similar)

Even if it seems a simple task, I'm having some trouble finding a solution. I know that with the Nest CLI I can use the command "nest build" to create a dist folder that contains the production files of my project.
The problem is when I move the folder onto my Raspberry and I try to run the project with the command "node dist/main", following the NestJs instructions. Nothing starts, because node says that it cannot find @nestjs/core and other modules.
I didn't find anything clear in the official guide about deploying the app, so my question is: what do I need to move onto my Raspberry in addition to the dist folder? Do I need to reinstall the whole node_modules folder, or is it possible to have a running project without reinstalling 800 MB of modules?
Yes, you need to run yarn or npm install in your production environment, as your dist folder only contains your own code.
Unlike a compiled language like Golang, where all dependencies are bundled and compiled into your executable file, JavaScript bundlers don't do this: your bundled code in dist still contains require or import statements that get dependencies from node_modules.
You can also run npm prune --production to remove any development dependencies and thus reduce the size of your node_modules folder. (I believe that yarn does it by default.)
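On the Raspberry itself, the steps could look like this sketch (it assumes package.json was copied next to dist):
# in the project folder on the Raspberry, next to dist/
npm install              # restore dependencies for this OS/architecture
npm prune --production   # drop dev-only dependencies to save space
node dist/main           # start the built NestJS app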

What is a correct approach to a javascript monorepo

I'm trying to figure out the correct approach for a javascript monorepo. Imagine a monorepo containing packages / libraries:
root
- node_modules
- packages
+ lib-a
* node_modules
+ lib-b
* node_modules
Now let's say both lib-a and lib-b packages use webpack as their build tool.
I see two approaches
Add webpack as a dependency of the root. Include a "build" script in both packages: "build": "webpack -p --config webpack.config.js"; each package's webpack.config.js could extend the root webpack.config.js. Then I could use a tool like lerna to run the build from the root directory (where the webpack binary is recognized). However, I would be unable to run the build in a specific package, since webpack is not available there. I could probably change the build script to something like "build": "../../node_modules/.bin/webpack -p --config webpack.config.js".
Always include webpack in each package. This means that the build script will succeed. It also means that each package will have the same dependency, and I should probably watch that each package uses the same webpack version.
Basically what I'm getting at is: how should packages inside a monorepo be structured? If a package is published, should it always be possible to build that package separately?
Your approach #2 is right. You handle each package separately, as if it were an individual, self-contained package.
The advantage of a monorepo lies not in sharing files through the directory structure but in:
Bootstrapping all dependencies to a single node_modules with flat structure, effectively deduplicating them.
Making your packages available to your other packages through regular package import/require(), as if they were external dependencies. And, thanks to symlinks in node_modules, your "dependency" packages always contain the latest content without publishing.
Enforcing consistent, always up-to-date, dependency structure in all your packages. As you said "This also means that each package will have the same dependency".
Automation tools to perform different maintenance tasks (like build, publish) on all your packages with a single command.
I know it's not so easy at the beginning, but when you dig into Lerna documentation it's becoming more clear. Besides Lerna main page I recommend reading about hoisting, FAQ and individual commands like bootstrap and publish.
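In practice, that workflow might look like this (a sketch, assuming Lerna is configured at the root):
# deduplicate shared deps (like webpack) into the root node_modules
npx lerna bootstrap --hoist
# run the "build" script in every package
npx lerna run build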
Our current configuration is the same as yours:
root
- node_modules
- packages
+ lib-a
* node_modules
+ lib-b
* node_modules
We use lerna to handle our project: https://github.com/lerna/lerna
You just need to specify your packages folder in the lerna.json:
{
"lerna": "3.16.4",
"packages": ["packages/*"],
"version": "0.0.0",
"npmClient": "yarn",
"useWorkspaces": true
}
Then in your package.json scripts you can use the line:
"build": "lerna run build",
This will basically run a build in all packages. So as long as the build script in each package has the proper params and webpack is installed, it will automatically run the webpack build.
After that you can simply handle working in your designated packages.
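For illustration, each package's package.json could then look something like this sketch (name and versions are placeholders):
{
  "name": "lib-a",
  "version": "0.0.0",
  "scripts": {
    "build": "webpack -p --config webpack.config.js"
  },
  "devDependencies": {
    "webpack": "^4.41.0"
  }
}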

Dockerfile, switch between dev / prod

I'm new to docker; I've done their tutorial and some other things on the web, but that's all, so I guess I'm doing this in a very wrong way.
I've spent a day looking for a way to publish a Dockerfile that will launch either npm run dev or npm start, depending on the prod or dev environment.
Playground
What I got so far :
# Specify the node base image version such as node:<version>
FROM node:10
# Define environment variable, can be overridden by running docker run with -e "NODE_ENV=prod"
ENV NODE_ENV dev
# Set the working directory to /usr/src/app
WORKDIR /usr/src/app
# Install nodemon for hot reload
RUN npm install -g nodemon
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm#5+)
COPY package*.json ./
RUN npm install && \
npm cache clean --force
# Set the port used by the app
EXPOSE 8080
# Bundle app source
COPY . .
# Launch the app
CMD [ "nodemon", "server.js" ]
From what I've seen on the web, people tend to use bash for doing that kind of operation, or mount a volume in docker-compose; however, that looks like a lot of verbosity for just doing an if/else condition inside a Dockerfile.
Goal
Without using any other file (keep things simple)
What i'm looking for is something like :
if [ "$NODE_ENV" = "dev" ]; then
CMD ["nodemon", "server.js"] // dev env
else
CMD ["node", "server.js"] // prod env
fi
Maybe I'm wrong; any good advice about how to do such a thing in docker would be nice.
Also, note that I'm not sure how to allow reload in my container when modifying a file on my host. I guess it's all about volumes, but again I'm not sure how to do it.
Sadly there is no way to apply this logic in Dockerfile syntax; everything has to happen at the entrypoint. To avoid using other files, you can implement this logic as a one-line bash script:
ENTRYPOINT ["/bin/bash"]
CMD ["-c", "if [ \"$NODE_ENV\" = \"dev\" ]; then nodemon server.js; else node server.js; fi"]
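With that in place, switching environments is just a flag on docker run (the image name myapp is a placeholder):
docker build -t myapp .
docker run -e NODE_ENV=dev myapp     # runs nodemon server.js
docker run -e NODE_ENV=prod myapp    # runs node server.js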
You can use ENTRYPOINT or CMD to execute a bash script inside the container as the first command:
ENTRYPOINT ["your/script.sh"]
or
CMD ["your/script.sh"]
In your script, do your thing! You don't even need to pass the env variable, since the script can access it.
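A minimal sketch of such a script (the name script.sh is a placeholder; remember to COPY it into the image and make it executable):
#!/bin/sh
# pick the start command based on NODE_ENV
if [ "$NODE_ENV" = "dev" ]; then
  exec nodemon server.js   # dev: hot reload
else
  exec node server.js      # prod
fi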

Should I upload "node_modules"?

I am making a web page; the node_modules folder is around 150 megabytes. Should I upload it or not? Is there any way to make it smaller? I am using "filezilla" and it would take too long to upload it.
node_modules is where all the external libraries you use for your application are kept. The list of those libraries should be in your package.json file.
You should, typically, not upload the node_modules folder manually. They are external libraries and are easily available to install separately. So, when moving files through filezilla, move everything but node_modules. Then, on your server, simply run npm i before running the application.
If you have a package.json file and installed your packages with npm install -S <package_name> (with -S or --save), then everything is fine.
If you don't have it, no worries. Transfer the files to your online service (AWS, something like that).
Then give the commands.
# To install dependencies
npm i
# or
npm install
# To start your server
npm start
Whatever you put in the "start" script of your package.json file will be triggered by npm start.
No need to copy the node_modules folder at all.
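For example, with a package.json like this (server.js is a placeholder for your entry file), npm start launches the app after npm i has restored node_modules:
{
  "scripts": {
    "start": "node server.js"
  }
}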

Is there any way to share a central node_modules between multiple projects? [duplicate]

Whenever I make projects, I have to download all the dependencies as node modules. Without copying node_modules, is there any way to share a central node_modules between multiple projects?
Like the following; I have to run many commands every time:
npm install gulp-usemin
npm install gulp-wrap
npm install gulp-connect
npm install gulp-watch
npm install gulp-minify-css
npm install gulp-uglify
npm install gulp-concat
npm install gulp-less
npm install gulp-rename
npm install gulp-minify-html
You absolutely can share a node_modules directory amongst projects.
From node's documentation:
If the module identifier passed to require() is not a native module, and does not begin with '/', '../', or './', then node starts at the parent directory of the current module, adds /node_modules, and attempts to load the module from that location.
If it is not found there, then it moves to the parent directory, and so on, until the root of the file system is reached.
For example, if the file at '/home/ry/projects/foo.js' called require('bar.js'), then node would look in the following locations, in this order:
/home/ry/projects/node_modules/bar.js
/home/ry/node_modules/bar.js
/home/node_modules/bar.js
/node_modules/bar.js
So just put a node_modules folder inside your projects directory (the parent of the individual project folders) and put in whatever modules you want. Just require them like normal. When node doesn't find a node_modules directory in your project folder, it will check the parent folder automatically. So make your directory structure like this:
-myProjects
--node_modules
--myproject1
---sub-project
--myproject2
This way, even your sub-project's dependencies can draw on your main node_modules repository.
One drawback to doing it this way is you will have to build out your package.json file manually (unless someone knows a way to automate this with grunt or something). When you install your packages and add the --save arg to an npm install command, it automatically appends them to the dependencies section of your package.json, which is convenient.
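A quick way to see the parent-directory lookup in action (lodash is just an example package):
cd myProjects
npm install lodash       # lands in myProjects/node_modules
cd myproject1
node -e "console.log(require('lodash').VERSION)"   # resolves via the parent folder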
Try pnpm instead of npm.
pnpm uses hard links and symlinks so that each version of a module is saved only once on disk.
If you have npm installed, you can install in your terminal with:
npm install -g pnpm
To update your existing installations (and sub-directories) use:
pnpm recursive install
Or use the shorthand command (leave off -r if you need to target only one directory)
pnpm -r i
One helpful note: you may find some rare packages don't have all their dependencies defined. They might rely on the flat node_modules directory structure of npm or yarn installs. If you run into issues with missing dependencies, use this command to hoist all the sub-dependencies into a flat structure:
pnpm install --shamefully-hoist
It's best to avoid using the --shamefully-hoist flag as it defeats the purpose of using pnpm in the first place, so try using the command pnpm i your-missing-package first (See pnpm FAQ).
I found a trick: take a look at symbolic links (symlinks) on Windows or Linux. They work just like shortcuts, but are more powerful.
You simply need to make a junction for your node_modules folder anywhere you want. The junction is nothing but a shortcut to your original node_modules folder. Create it inside your project folder, where the actual node_modules would have been created by npm install.
To achieve this you need at least one real node_modules folder, then make a junction to it in the other projects.
On Windows, you can either use the Command Prompt or use an application. Using the Command Prompt gives you a bit more control; using an application is easier, and I suggest Link Shell Extension.
The main directory should look like this:
node_modules
Project 1
Project 2
Project 3
Project 4
Just open the file Project 1/.angular-cli.json and change the schema from
"$schema": "./node_modules/@angular/cli/lib/config/schema.json",
to
"$schema": "./../node_modules/@angular/cli/lib/config/schema.json"
and don't forget to create an empty node_modules folder inside your project directory
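On the command line, creating the junction could look like this sketch (paths assume the layout above):
:: Windows Command Prompt, run from inside Project 1
mklink /J node_modules ..\node_modules
# Linux/macOS equivalent, using a symlink
ln -s ../node_modules node_modules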
See also npm v7.0.0's support for workspaces
RFC
https://github.com/npm/rfcs/blob/latest/implemented/0026-workspaces.md
Documentation
https://docs.npmjs.com/cli/v7/using-npm/workspaces
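With workspaces, a root package.json like this sketch (names are placeholders) lets a single npm install hoist shared dependencies into the root node_modules:
{
  "name": "my-projects",
  "private": true,
  "workspaces": ["myproject1", "myproject2"]
}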
By looking at some articles, it seems that Lerna is a good tool for managing multiple projects inside a single directory (monorepo). It supports sharing modules without duplicating the entire packages in every folder, and has commands to install them in multiple projects:
Javascript monorepos
Monorepos by example
Building large scale apps in a monorepo
pnpm is also a simple and efficient tool, which doesn't duplicate modules that are already installed for other projects.
Let's assume that a single node_modules should contain all the packages for all applications. Your apps will then also share most of the unique package.json entries (just the name should change).
My idea would be to have a single root and multiple src levels, as below:
root\package.json
root\node_modules
root\..
root\app1\src\..
root\app2\src\..
The only issue you might face would be keeping a backup of the package.json (or tsconfig) for each app and restoring it when you work on that app, or setting up your startup scripts to serve any app.
