Running a Nuxt.js application in Docker - javascript

I'm trying to run a Nuxt application in a Docker container. In order to do so, I created the following Dockerfile:
FROM node:6.10.2
RUN mkdir -p /app
EXPOSE 3000
COPY . /app
WORKDIR /app
RUN npm install
RUN npm run build
CMD [ "npm", "start" ]
However, when I build the image and run the container (docker run -p 3000:3000 <image-id>), I get nothing when hitting localhost:3000 in my browser. What could be the cause?

By default, the application inside the Docker container accepts network traffic on http://127.0.0.1:3000. This interface does not accept external traffic, so it is no wonder it does not work. To make it work, we need to set the HOST environment variable of the Nuxt app to 0.0.0.0 (all IP addresses). We can do this either in the Dockerfile, like this:
FROM node:6.10.2
ENV HOST 0.0.0.0
# rest of the file
or in package.json, in the "start" script's command:
"scripts": { "start": "HOST=0.0.0.0 nuxt start" ...}
Or any other way that makes the Nuxt application listen on something other than localhost-only inside the container.
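Another option (a sketch, assuming a reasonably recent Nuxt 2 project; adjust the port to your setup) is to set the host directly in nuxt.config.js:
// nuxt.config.js
module.exports = {
  server: {
    host: '0.0.0.0', // listen on all interfaces so Docker port mapping works
    port: 3000
  }
}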

Related

VS Code containers from a git repo

I keep getting errors from VS Code when trying to set up containers as a dev environment. I'm running Linux (Ubuntu 22), latest VS Code.
So, what I have done so far:
I pulled my git repo locally.
I added a Dockerfile:
# Use an official Node.js image as the base image
FROM node:18.13.0
# Set the working directory in the image
WORKDIR /app
# Copy the package.json and package-lock.json files from the host to the image
COPY package.json package-lock.json ./
# Install the dependencies from the package.json file
RUN npm ci
# Copy the rest of the application code from the host to the image
COPY . .
# Build the Next.js application
RUN npm run build
# Specify the command to run when the container starts
CMD [ "npm", "start" ]
This is a basic Next.js (latest) app, with nothing but Tailwind added.
Then I build the image:
docker build -t filename .
Then I run a container from the image:
docker run -p 3000:3000 -d containerName
Then I go to VS Code and select:
Dev Containers: Open Folder in Container
VS Code then gives this message: Command failed:
/usr/share/code/code --ms-enable-electron-run-as-node /home/ellsium/.vscode/extensions/ms-vscode-remote.remote-containers-0.275.0/dist/spec-node/devContainersSpecCLI.js up --user-data-folder /home/ellsium/.config/Code/User/globalStorage/ms-vscode-remote.remote-containers/data --container-session-data-folder tmp/devcontainers-b4794c92-ea56-497d-9059-03ea0ea3cb4a1675620049507 --workspace-folder /srv/http/Waldo --workspace-mount-consistency cached --id-label devcontainer.local_folder=/srv/http/Waldo --id-label devcontainer.config_file=/srv/http/Waldo/.devcontainer/devcontainer.json --log-level debug --log-format json --config /srv/http/Waldo/.devcontainer/devcontainer.json --default-user-env-probe loginInteractiveShell --mount type=volume,source=vscode,target=/vscode,external=true --skip-post-create --update-remote-user-uid-default on --mount-workspace-git-root true
From what I understand, VS Code needs to see a running Docker image? Then it jumps inside and I can use the environment? This image can be running on the host or over SSH? I only want to run on the host. I hope the method above is correct?
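For reference, a minimal .devcontainer/devcontainer.json that builds from the Dockerfile above could look roughly like this (a sketch; the name and forwarded port are assumptions, and the dockerfile path is relative to the devcontainer.json file):
{
  "name": "nextjs-dev",
  "build": { "dockerfile": "../Dockerfile" },
  "forwardPorts": [3000]
}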

Docker container exits right away, but not when using the Docker desktop application to run the command

I am creating a console-based app in Node.js. No ports are being used... It's just a basic app. The problem I am having is using Docker to run my app.
Here's my docker file:
FROM node:16
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
CMD [ "node", "index.js" ]
After building the image, I try to run 'docker run ' and the container just exits right away. However, when I use the Docker Desktop app, after building the image, if I click 'run' it creates the container and it does not exit, but actually runs. When I log into that container, I am able to run 'node index.js' and my Node app runs fine.
My question is, why does it work when using the GUI docker run and not my command?
Also, why do I have to run 'node index.js' manually when the docker file should take care of that?
Commands to build image and run the container:
Image build:
docker build -t 'appname' .
Create container:
docker run 'appname'
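If the app reads from stdin, it may be worth trying to attach an interactive terminal (an assumption about the app, not confirmed above), since a console app with nothing to read can exit immediately:
docker run -it 'appname'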

Can't dockerize Nuxt.js application

I have a simple Nuxt.js application and I want to dockerize it. Here is the Dockerfile:
FROM node
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build
EXPOSE 8010
CMD [ "npm", "start" ]
When I build it and run the container, it seems to work and I can see something like this:
Entrypoint app = server.js server.js.map
READY Server listening on http://127.0.0.1:8010
But when I try to open it in the browser, I just get an error - This page isn’t working.
So, in general, how can I dockerize my Nuxt.js application and make it work on my machine?
Your app binds to 127.0.0.1, which means that it'll only accept connections from inside the container. Reading the docs, it seems you can set the HOST environment variable to the binding address you want. Try this, which sets it to 0.0.0.0, meaning the app accepts connections from everywhere:
FROM node
ENV HOST=0.0.0.0
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build
EXPOSE 8010
CMD [ "npm", "start" ]
When running it, you should see READY Server listening on http://0.0.0.0:8010 rather than READY Server listening on http://127.0.0.1:8010
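To try it out, the build and run commands could look like this (the image tag nuxt-app is just a placeholder; the published port has to match the 8010 the app listens on):
docker build -t nuxt-app .
docker run -p 8010:8010 nuxt-app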

Dockerfile, switch between dev / prod

I'm new to Docker; I've done their tutorial and some other things on the web, but that's all, so I guess I'm doing this in a very wrong way.
I've spent a day looking for a way to write a Dockerfile that launches either npm run dev or npm start, depending on whether the environment is dev or prod.
Playground
What I've got so far:
# Specify the node base image version such as node:<version>
FROM node:10
# Define environment variable, can be overridden by running docker run with -e "NODE_ENV=prod"
ENV NODE_ENV dev
# Set the working directory to /usr/src/app
WORKDIR /usr/src/app
# Install nodemon for hot reload
RUN npm install -g nodemon
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./
RUN npm install && \
npm cache clean --force
# Set the port used by the app
EXPOSE 8080
# Bundle app source
COPY . .
# Launch the app
CMD [ "nodemon", "server.js" ]
From what I've seen on the web, people tend to use bash for that kind of operation, or mount a volume in docker-compose; however, that seems like a lot of verbosity just for an if/else condition inside a Dockerfile.
Goal
Without using any other file (keep things simple).
What I'm looking for is something like:
if [ "$NODE_ENV" = "dev" ]; then
CMD ["nodemon", "server.js"] // dev env
else
CMD ["node", "server.js"] // prod env
fi
Maybe I'm wrong; any good advice on how to do such a thing in Docker would be nice.
Also, note that I'm not sure how to enable reloading in my container when I modify a file on my host. I guess it's all about volumes, but again I'm not sure how to do it.
Sadly, there is no way to express this logic in Dockerfile syntax; it all has to happen in the entrypoint. To avoid using other files, you can implement this logic as a one-line bash script:
ENTRYPOINT ["/bin/bash"]
CMD ['-c','if [ "$NODE_ENV" = "dev" ]; then nodemon server.js; else node server.js; fi']
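At run time you then pick the mode by overriding the variable (a command sketch; the image tag my-app is a placeholder):
docker run -e NODE_ENV=dev -p 8080:8080 my-app
docker run -e NODE_ENV=prod -p 8080:8080 my-app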
Alternatively, you can use ENTRYPOINT or CMD to execute a bash script inside the container as the first command:
ENTRYPOINT ["your/script.sh"]
CMD ["your/script.sh"]
In your script, do your thing!
You don't even need to pass the env variable, since you can access it inside the script.
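For example, such an entrypoint script could look like this (a sketch; the file name docker-entrypoint.sh is an assumption, and it must be copied into the image and made executable):
#!/bin/sh
# docker-entrypoint.sh: choose the start command from NODE_ENV at run time
if [ "$NODE_ENV" = "dev" ]; then
  exec nodemon server.js
else
  exec node server.js
fi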

Angular Full Stack example on GitHub

I am trying to understand this Angular full-stack example project, but I am not able to!
The project is here:
https://github.com/DavideViolante/Angular-Full-Stack
In the package.json, you can find a "dev" script to test the app locally. The command is the following:
concurrently \"mongod\" \"ng serve -pc proxy.conf.json --open\" \"tsc -w -p server\" \"nodemon dist/server/app.js\"
I don't understand why both ng serve and app.js are launched. I mean, ng serve creates a static file server, and there is also a static file server with Express, so launching that starts two servers. What's the point?
Here is a breakdown of all the commands:
concurrently -- runs all commands simultaneously
\"mongod\" -- starts the MongoDB server
\"ng serve -pc proxy.conf.json --open\" -- serves the Angular front-end
\"tsc -w -p server\" -- runs the TypeScript compiler in watch mode and compiles the server project
\"nodemon dist/server/app.js\" -- runs your server-side project
I don't know this project, but app.js is normally the back-end, and ng serve serves an Angular project in your development environment.
I hope this helps.
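Given the -pc proxy.conf.json flag above, ng serve likely serves the Angular bundle itself and forwards API requests to the Express server. A minimal proxy config for that setup could look like this (a sketch; the /api path and port 3000 are assumptions about this particular project):
{
  "/api": {
    "target": "http://localhost:3000",
    "secure": false
  }
}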
