I'm new to Docker; I've done their tutorial and a few other things on the web, but that's all, so I guess I'm going about this the wrong way.
I've spent a day looking for a way to write a Dockerfile that will launch either npm run dev or npm start, depending on whether the environment is dev or prod.
Playground
What I've got so far:
# Specify the node base image version such as node:<version>
FROM node:10
# Define environment variable, can be overridden by running docker run with -e "NODE_ENV=prod"
ENV NODE_ENV dev
# Set the working directory to /usr/src/app
WORKDIR /usr/src/app
# Install nodemon for hot reload
RUN npm install -g nodemon
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm#5+)
COPY package*.json ./
RUN npm install && \
npm cache clean --force
# Set the port used by the app
EXPOSE 8080
# Bundle app source
COPY . .
# Launch the app
CMD [ "nodemon", "server.js" ]
From what I've seen on the web, people tend to use a bash script for this kind of operation, or mount a volume in docker-compose, but that seems like a lot of verbosity for a simple if/else condition inside a Dockerfile.
Goal
Without using any other file (keep things simple).
What I'm looking for is something like:
if [ "$NODE_ENV" = "dev" ]; then
CMD ["nodemon", "server.js"] // dev env
else
CMD ["node", "server.js"] // prod env
fi
Maybe I'm wrong; any good advice on how to do such a thing in Docker would be nice.
Also, note that I'm not sure how to enable hot reload in my container when modifying a file on my host. I guess it's all about volumes, but again I'm not sure how to do it.
Sadly there is no way to express this logic in Dockerfile syntax; it has to live in the entrypoint or the command the container runs. To avoid adding another file, you can implement the logic as a one-line bash script:
ENTRYPOINT ["/bin/bash"]
CMD ["-c", "if [ \"$NODE_ENV\" = \"dev\" ]; then nodemon server.js; else node server.js; fi"]
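With this in place, the same image serves both environments; a rough usage sketch (the image tag myapp is just an assumption):
docker build -t myapp .
docker run -e NODE_ENV=dev -p 8080:8080 myapp    # runs nodemon server.js
docker run -e NODE_ENV=prod -p 8080:8080 myapp   # runs node server.js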
You can use ENTRYPOINT or CMD to execute a bash script inside the container as the first command:
ENTRYPOINT ["your/script.sh"]
# or
CMD ["your/script.sh"]
In your script, do whatever you need. You don't even need to pass the environment variable explicitly, since the script can read it at runtime.
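For instance, a minimal sketch of such a script (the file name docker-entrypoint.sh and the two start commands are assumptions based on the question above):
#!/bin/sh
# docker-entrypoint.sh -- pick the start command based on NODE_ENV
if [ "$NODE_ENV" = "dev" ]; then
  exec nodemon server.js   # dev: hot reload
else
  exec node server.js      # prod
fi
In the Dockerfile you would COPY this script into the image, RUN chmod +x on it, and point ENTRYPOINT at it.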
Related
I keep getting errors from VS Code when trying to set up containers as a dev environment. I'm running Linux (Ubuntu 22) with the latest VS Code.
Here is what I have done so far.
I pulled my Git repo locally.
I added a Dockerfile:
# Use an official Node.js image as the base image
FROM node:18.13.0
# Set the working directory in the image
WORKDIR /app
# Copy the package.json and package-lock.json files from the host to the image
COPY package.json package-lock.json ./
# Install the dependencies from the package.json file
RUN npm ci
# Copy the rest of the application code from the host to the image
COPY . .
# Build the Next.js application
RUN npm run build
# Specify the command to run when the container starts
CMD [ "npm", "start" ]
This is a basic Next.js (latest) app with nothing but Tailwind added.
Then I build the image:
docker build -t filename .
Then I run the image:
docker run -p 3000:3000 -d containerName
Then I go to VS Code and select:
Dev Containers: Open Folder in Container
VS Code then gives this message: Command failed:
/usr/share/code/code --ms-enable-electron-run-as-node /home/ellsium/.vscode/extensions/ms-vscode-remote.remote-containers-0.275.0/dist/spec-node/devContainersSpecCLI.js up --user-data-folder /home/ellsium/.config/Code/User/globalStorage/ms-vscode-remote.remote-containers/data --container-session-data-folder tmp/devcontainers-b4794c92-ea56-497d-9059-03ea0ea3cb4a1675620049507 --workspace-folder /srv/http/Waldo --workspace-mount-consistency cached --id-label devcontainer.local_folder=/srv/http/Waldo --id-label devcontainer.config_file=/srv/http/Waldo/.devcontainer/devcontainer.json --log-level debug --log-format json --config /srv/http/Waldo/.devcontainer/devcontainer.json --default-user-env-probe loginInteractiveShell --mount type=volume,source=vscode,target=/vscode,external=true --skip-post-create --update-remote-user-uid-default on --mount-workspace-git-root true
From what I understand, VS Code needs to see a running Docker container? Then it jumps inside and I can use the environment? Can this container run on the host or over SSH? I only want to run it on the host. I hope the method above is correct?
I'm working through a book about bootstrapping microservices, and the author provides the following Dockerfile, which is meant to be used in development.**
FROM node:12.18.1-alpine
WORKDIR /usr/src/app
COPY package*.json .
CMD npm config set cache-min 999999 && \
npm install && \
npm run start:dev
The CMD command here is obviously somewhat unusual. The rationale provided is as follows: By doing the npm install when the container starts, we can "make use of npm caching so it's much faster to install at container startup than if we installed it during the build process."
What is going on behind the scenes here with the CMD command? How is this different from having a RUN command that installs the dependencies prior to the CMD command? And relatedly, why do we need to set a cache-min policy?
**The source files are not copied over here because they are included in a mounted volume.
EDIT: Here is the docker compose file as well
version: '3'
services:
history:
image: history
build:
context: ./history
dockerfile: Dockerfile-dev
container_name: history
volumes:
- /tmp/history/npm-cache:/root/.npm:z
- ./history/src:/usr/src/app/src/:z
ports:
- '4002:80'
environment:
- PORT=80
- NODE_ENV=development
restart: 'no'
...
When you develop, you often change the packages included in the project. By doing it this way, you don't need to build a new image when that happens; you can just stop and start the container and it will install the new packages.
I am a little surprised by the copying of package*.json, though. I'd assume it would be passed into the image using a volume, like you say the source code is. It can still be done like that, and maybe it is. We'd need to see your docker run command (or compose file) to know whether it is.
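For example, a hypothetical docker run equivalent of the compose service above that also mounts the package files, so even dependency changes need no rebuild (paths are assumptions based on the compose file):
docker run -d --name history \
  -v /tmp/history/npm-cache:/root/.npm \
  -v "$(pwd)/history/src:/usr/src/app/src" \
  -v "$(pwd)/history/package.json:/usr/src/app/package.json" \
  -v "$(pwd)/history/package-lock.json:/usr/src/app/package-lock.json" \
  -e PORT=80 -e NODE_ENV=development -p 4002:80 history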
I use webpack to bundle front-end assets and put them in a dist directory. However, I would also like an Express server to serve the index.html that html-webpack-plugin generates and also deposits in dist. The Express server requires a few node modules, mainly express, body-parser, moment, etc.
There is no node_modules in dist, though, so it falls at the first hurdle when express cannot be found. Should I make a separate package.json just for this little Express server, keep it within dist, and put its npm install on a separate line of my Dockerfile (seems a little complex...), or is there a better way for this server to resolve its dependencies after webpacking?
Dockerfile
FROM node:8.4.0-alpine
WORKDIR /opt/app
COPY ./node_modules node_modules
COPY ./dist .
EXPOSE 6500
ENTRYPOINT ["node", "server.js"]
The COPY ./node_modules line is a temporary attempt to see if moving all node_modules into the app root will allow the server to run. It does, but of course this includes everything from express and body-parser to react and webpack-dev-server.
You are copying npm modules from your local system (most likely) into an Alpine instance, which is a different OS, so they are not necessarily going to be compatible.
You need to run npm install inside the image (RUN npm install).
My guess is you need a Dockerfile similar to this:
FROM node:8.4.0-alpine
WORKDIR /opt/app
COPY package.json .
RUN npm install
COPY ./dist .
EXPOSE 6500
# This might also be wrong: ENTRYPOINT ["node", "server.js"]
CMD [ "npm", "start" ]
If I am understanding you correctly, you might have to move the files afterwards:
RUN mv node_modules dist/node_modules
There is a nice tutorial that might be helpful here: https://nodejs.org/en/docs/guides/nodejs-docker-webapp/
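A rough usage sketch of the Dockerfile above (the image tag is just an assumption):
docker build -t my-express-app .
docker run -p 6500:6500 my-express-app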
I made a javascript file mytool.js, it has some dependencies (in package.json).
Now I can execute typing in
~/mynodes/mytool $ node mytool
But if I change the working directory I can't use this command anymore, because it only works locally.
What I want to achieve is to be able to just type :
~$ mytool
(wherever I am in my system's filesystem and without typing node before).
Should I install it manually?
If yes, where is the common location to install a personal Node.js script on a Unix-like system?
Or is there an npm-like command to install a personal script system-wide?
When you add a "bin" key in your package.json:
"bin": {
"mytool": "mytool.js"
},
then you will be able to install your script with npm install -g, and it will automatically be added where it should be (the place where other globally installed CLI tools live, which should be in your PATH).
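A quick usage sketch, reusing the asker's names (the shebang requirement is an assumption worth double-checking on your platform):
# mytool.js should start with: #!/usr/bin/env node
cd ~/mynodes/mytool
npm install -g .    # or: npm link, for a development symlink
mytool              # now works from any directory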
You can see this simple project as an example:
https://github.com/rsp/node-websocket-vs-socket.io
It was created as an example for this answer but it does what you need:
it has a single script to run
it has external dependencies
it can be installed globally
it can be run from any place with a single command
Note that you don't need to publish your script to npm to be able to install it. You can publish it, or install it directly from GitHub (including private repos), but you can also install a module from a local directory or a tarball:
npm install -g module-on-npm
npm install -g user/repo-on-github
npm install -g /your/local/directory
npm install -g /your/local/tarball.tgz
For more options, see:
https://docs.npmjs.com/cli/install
Also keep in mind that for your program to be able to run from anywhere, you need to use paths relative to __dirname or __filename whenever you access files that ship with your code. See:
https://nodejs.org/api/globals.html#globals_dirname
Put a shebang line at the top of the script (e.g. #!/usr/bin/env node).
Put the script in a directory in your $PATH
Give it executable permission (e.g. chmod +x /usr/local/bin/example.js)
First option:
You can run your file globally by putting #!/usr/bin/env node on the first line of the file, copying it to /usr/local/bin, and making it executable (sudo chmod +x /usr/local/bin/yourfile.js); then you can call it from wherever you want with yourfile.js.
Second option:
Make your local file executable and create an executable bash script which calls your local file, put it in /usr/local/bin, and then call the bash script globally, for example like the sketch below.
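A minimal sketch of such a wrapper (the path and the mytool name are assumptions):
#!/bin/sh
# /usr/local/bin/mytool -- wrapper that runs the local script with node
exec node "$HOME/mynodes/mytool/mytool.js" "$@"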
Does anyone know how to make a Node CLI program executable from the command line, for example the way you might call "brew install"? Normally, you need to call "node NAME_OF_CLI".
The problem is that I can't just throw my program into the /bin folder, because I would need to put the node_modules folder in there as well, and that just does not seem acceptable. Is there a known way to package all of these things together so it will be an executable program?
Also, I do not want this to be on npm - this is not a public module.
Here's what I would do.
Distribute your program in a traditional archive
Package your program into a distributable archive such as .tar.bz2 (preferred), .tar.gz, or .zip (if it needs to support Windows)
In the archive, include node itself, your fully-populated node_modules folder, and all your other JavaScript modules, files, and so on
Because node is platform-dependent, you will need a different archive for each target platform (Linux/Windows/OS X)
The points above are consistent with the 12 Factor App Build/Release/Run principles.
Include an executable shell script wrapper that serves as your program's entry point and does something like this:
#!/bin/sh
DIR=$(dirname "${0}")
exec "${DIR}/node/bin/node" "${DIR}/main.js" "${#}"
Install via curl command line copy/paste
I noticed in the comments you want a homebrew-style install-via-curl command line. In that case, all of the above still applies, but your curl command downloads a shell script that does the following:
download the distribution archive
extract it in place (/usr/local/programname would be reasonable)
If necessary, set up symlinks or copy files into a bin directory in the user's PATH (/usr/local/bin/programname would be reasonable)
So your curl command might be curl http://example.com/myprogram/install.sh | sh
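A rough sketch of what that install.sh might do (the URL, archive name, and paths are placeholders):
#!/bin/sh
set -e
# 1. download the distribution archive
curl -sL http://example.com/myprogram/myprogram-linux-x64.tar.bz2 -o /tmp/myprogram.tar.bz2
# 2. extract it in place
mkdir -p /usr/local/myprogram
tar xjf /tmp/myprogram.tar.bz2 -C /usr/local/myprogram --strip-components=1
# 3. link the wrapper script into a bin directory on the user's PATH
ln -sf /usr/local/myprogram/myprogram /usr/local/bin/myprogram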
Again, this is consistent with the 12 Factor App principles, which are sound. In particular:
Do not expect the target user to already have node installed
Do not expect the target user to install node by following your directions
Do not bother trying to support varying versions of node unless you really have a requirement to do so
It is OK to bundle node with your app. It makes it easier to install, easier to support, and less prone to errors.
Misc Tips
Make sure your installer code is idempotent
You may want to explicitly make the wrapper shell script executable on the target machine with chmod in the install.sh script, as zip archives (and tar under some circumstances) won't preserve that
Watch out for OS X's crappy old tar. Use gnutar instead.
Reference material
Refer to homebrew's go ruby program for ideas/inspiration: https://raw.github.com/mxcl/homebrew/go
Here are some examples taken from the build script in the GitHub repo for my web site:
install_node() {
local VERSION=${1-0.10.7}
local PREFIX=${2-node}
local PLATFORM=$(uname | tr A-Z a-z)
case $(uname -p) in
  i686)
    ARCH=x86
    ;;
  *)
    # assumption: default to the 64-bit build on other hosts
    ARCH=x64
    ;;
esac
mkdir -p "${PREFIX}"
curl --silent \
"http://nodejs.org/dist/v${VERSION}/node-v${VERSION}-${PLATFORM}-${ARCH}.tar.gz" \
| tar xzf - --strip-components=1 -C "${PREFIX}"
}
task:dist() {
cd "${CODE_PATH}"
local GIT_REF="${1-master}"
local BUILD_DIR="build"
local DIST_DIR="dist"
local PREFIX="${SITE}-${GIT_REF}"
dirs "${BUILD_DIR}" "${DIST_DIR}"
echo doing git archive
git archive --format=tar --prefix="${PREFIX}/" "${GIT_REF}" | \
#extract that archive into a temporary build directory
"${TAR}" --directory "${BUILD_DIR}" --extract
#install node
NODE_VERSION=$(./bin/jsonpath.coffee engines.node)
echo installing node
install_node "${NODE_VERSION}" "${BUILD_DIR}/${PREFIX}/node"
#Note we use npm from the build platform (OS X) here instead of
#the one for the run platform as they are incompatible
echo install npm packages
(cd "${BUILD_DIR}/${PREFIX}" && npm install --silent --production)
echo creating archive
"${TAR}" --directory "${BUILD_DIR}" --create --bzip2 --file "${DIST_DIR}/${PREFIX}.tar.bz2" .
}
Add a file to the ./bin folder
$ cd myproject
$ mkdir bin
bin/my_bin_file:
#!/usr/bin/env node
require('../main_file.js');
Add a "bin" option to your package.json:
"bin": {
"my_bin_file": "./bin/my_bin_file"
}
Make it executable
$ chmod +x bin/my_bin_file
Install globally:
npm install -g
EDIT: npm can install your package globally on the system without it being public, and there is a shebang in the bin file that tells the system how to invoke it (no need to call node on the CLI explicitly).
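A few hedged examples of installing it globally without publishing to the npm registry (the paths and git URL are placeholders):
npm install -g .                                                     # from inside the project directory
npm install -g /path/to/myproject                                    # from any local path
npm install -g git+ssh://git@github.com/you/your-private-repo.git   # from a private git remote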