I'm getting this error when I'm running my server with Docker - javascript

I am getting this error when I run my Node.js server with Docker. The image builds successfully, but when I run it I get this error.
I tried deleting the Docker image and rebuilding, but that did not work.

Try this Dockerfile:
# Use an official node runtime as a parent image
FROM node:16.13.1
# Set the working directory to /app
WORKDIR /app
# Install app dependencies
COPY package*.json ./
# If you are building your code for production
# RUN npm install --only=production
RUN npm install
# Bundle app source
COPY . .
# Make port 80 available to the world outside this container
EXPOSE 80
CMD ["npm", "run", "dev"]
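For reference, a minimal sketch of how you might build and run this image; the my-node-app tag is a placeholder, and the mapping assumes the app listens on port 80 as the EXPOSE line suggests:
# Build the image from the directory containing the Dockerfile
docker build -t my-node-app .
# Map host port 80 to the exposed container port 80
docker run -p 80:80 my-node-app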

Related

Dockerfile COPY package.json ./ works correctly locally but fails in the GitLab pipeline

File directory: (screenshot in the original post)
Dockerfile
FROM node:16
WORKDIR /app
COPY package.json ./
RUN npm install
COPY . .
ENV PORT=5000
EXPOSE 5000
CMD [ "npm", "run", "dev" ]
Running the Dockerfile in the GitLab pipeline:
Login Succeeded
$ docker build -t $CI_DOCKER_REPO:$CI_COMMIT_SHORT_SHA -f ./cronTest/Dockerfile .
Step 1/9 : FROM node:16
---> e90654c39524
Step 2/9 : WORKDIR /app
---> Using cache
---> 4ffb8744c0c4
Step 3/9 : RUN ls
---> Running in 992c3cd680f3
Removing intermediate container 992c3cd680f3
---> 0124eea0f9e9
Step 4/9 : COPY package.json ./
COPY failed: file not found in build context or excluded by .dockerignore: stat package.json: file does not exist
Cleaning up project directory and file based variables
00:01
ERROR: Job failed: exit code 1
Even though the same Dockerfile is used. Can anyone check whether something is wrong with it?
The local build and the pipeline build are launched from different working directories, but both are given the same relative build context (.), so the context resolves to a different path in each case.
To emulate the local build, supply the equivalent build context parameter:
docker build -t $CI_DOCKER_REPO:$CI_COMMIT_SHORT_SHA \
-f ./cronTest/Dockerfile \
./cronTest
Or change into that directory first:
cd cronTest
docker build -t $CI_DOCKER_REPO:$CI_COMMIT_SHORT_SHA \
-f Dockerfile \
.
Also, update the COPY line to the following, though it's hard to help much further without the full directory structure:
COPY ./package.json ./
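When a COPY fails like this, it can help to confirm what the chosen build context actually contains. A minimal sketch, run from the repository root (the paths are illustrative, based on the layout implied above):
# Confirm package.json exists where the build context expects it
ls ./cronTest/package.json
# Check whether a .dockerignore in the context root excludes it
cat ./cronTest/.dockerignore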

Nodejs Cannot Read Property When Running in Docker

I'm trying to run the example Neo4j Node project in Docker containers but I'm having an issue with the Node part. The project runs fine locally but when I run it in a docker container I get this error:
const neo4j = window.neo4j;
^
ReferenceError: window is not defined
I have pretty much no JavaScript / Node experience, so I'm just looking for any help to point me in the right direction.
My dockerfile looks like this:
FROM node:10
WORKDIR /app
COPY package.json /app
COPY package-lock.json /app
RUN npm config set strict-ssl false
RUN npm cache clean --force && npm install && npm install -g serve
#RUN npm install
COPY . /app
RUN npm run build
COPY serve.json dist/serve.json
CMD ["node", "src/app.js"]
EXPOSE 8080
The demo project is: https://github.com/neo4j-examples/movies-javascript-bolt
UPDATE
The comment from Amir seems to have solved the original issue. This was to change
const neo4j = window.neo4j;
to
const neo4j = require("neo4j-driver");
with neo4j-driver being the name of the module.
Now I'm getting this similar issue, if anyone has any ideas:
$(function () {
^
ReferenceError: $ is not defined
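Like window, the jQuery global $ only exists in a browser, which suggests this demo's front-end code is meant to be served to a browser rather than executed directly with node. A hedged sketch of that approach, reusing the serve package this Dockerfile already installs globally (the neo4j-demo tag and the listen port are assumptions, not from the original post):
# Build the image, then serve the built dist/ folder instead of running src/app.js in Node
docker build -t neo4j-demo .
docker run -p 8080:8080 neo4j-demo serve -l 8080 dist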

Error when attempting to deploy Next.js app to Zeit cloud, running in dev mode

I am unable to deploy my Next.js App to Zeit using the now command.
I tried uninstalling node-sass and reinstalling it with npm i node-sass, as well as npm i node-sass --force. None of this worked; I get the following error:
Node Sass could not find a binding for your current environment: OS X 64-bit with Node XX.X.X
The application must deploy, or at least run in dev mode ('now dev'). It runs fine in my local dev environment when I run 'npm run dev'.
I fixed the problem by writing a bash script that does the following steps:
# Delete build files
sudo rm -rd .next
# Delete node modules
sudo rm -rd node_modules
# Then I ran this without installing the node_modules again
now dev
When I let the Zeit builder install the node_modules, the test deployment worked like a charm, presumably because node-sass ships platform-specific native bindings, so modules installed locally don't match the Linux build environment.
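A minimal sketch of the equivalent manual cleanup, plus the canonical rebuild step when the binding mismatch shows up locally (the exact command order is an assumption, not from the original answer):
# Remove the build output and locally-installed modules
rm -rf .next node_modules
# Reinstall for the current platform
npm install
# If the binding error persists, rebuild node-sass's native binding
npm rebuild node-sass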

gulp watch seems to freeze in Docker

I'm trying to "dockerize" our development environment. We have a gulp build system that watches for changes to our js/sass/jade files. This is all set up to work just fine outside of Docker.
I've created a Docker container and I mount my code base into it (using a volume). All the precursor npm and bower installs finish successfully. My last step runs gulp; it builds properly, but then does not pick up any subsequent changes to our js/sass/jade files.
I'm running the build system with the following command:
docker run -it -v $(pwd):/code/ client gulp reset
Does anyone have a similar setup in their development environment? What did you do to get your gulp watch to work and show the build output?
EDIT: I guess I could do the gulp build/watch outside of Docker and only mount the generated files, but I'd rather contain all of that inside Docker so that the host machine doesn't need any dependencies to build/run our app.
EDIT2: Here are my Dockerfile and docker-compose.yml:
#Dockerfile
FROM node:0.12.5
RUN mkdir /code
WORKDIR /code
RUN mkdir client
WORKDIR client
RUN mkdir .tmp
ADD ./client/package.json /code/client/package.json
ADD ./client/bower.json /code/client/bower.json
RUN npm install gulp -g
RUN npm install bower -g
RUN npm install
RUN npm rebuild node-sass
RUN bower --allow-root install
CMD gulp reset
and
client:
  build: .
  volumes:
    - .:/code
I've never been able to get any inotify-based file watcher to work with VirtualBox guest additions (shared folders), and based on this ticket it's unlikely to be available anytime soon. My preferred approach is the following:
Assuming my local source code is in /code
Run my watcher locally on /code
When a change is detected, rsync local /code to remote /code (mounted as a container-only volume) in the container
Example rsync:
docker run --rm --volumes-from sourcecode my/image \
rsync \
--delete \
--recursive \
--safe-links \
--exclude .git --exclude node_modules \
/local/repo/ /container/repo
This avoids lots of issues and allows you to get granular with what you want your container environment to see.
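A minimal sketch of wiring those two steps together with a local watcher; fswatch is an assumption (any local watcher would do), and sourcecode, my/image, and the repo paths are the placeholders from the rsync example above:
# Watch the local tree and re-run the container-side rsync on every change
fswatch -o /local/repo | while read _; do
  docker run --rm --volumes-from sourcecode my/image \
    rsync --delete --recursive --safe-links \
    --exclude .git --exclude node_modules \
    /local/repo/ /container/repo
done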

Accessing a node.js Hapi server running inside a docker container

I have built a Docker container from CentOS running a Node.js hapi server. The server works fine on its own; I get the right output in the console when running it inside the container. However, I don't know how to reach it from outside the container.
Output from docker container
$ docker run -p 49000:3000 work/learning
Server running at: http://f878541bb9f8:3000
Pack Server running at: http://f878541bb9f8:5000
Dockerfile
# DOCKER-VERSION 0.3.4
FROM centos:centos6
# Enable EPEL for Node.js
RUN rpm -Uvh http://download.fedoraproject.org/pub/epel/6/i386/epel-release-6-8.noarch.rpm
# Install Node.js and npm
RUN yum install -y npm
ADD . /src
RUN cd src; npm install
EXPOSE 3000
CMD node /src/server.js
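Since -p 49000:3000 publishes only container port 3000 (the Pack server on 5000 isn't mapped), the server should be reachable through the Docker host's port 49000 rather than the container hostname shown in the log. A minimal sketch; the docker-machine VM name is an assumption for older boot2docker-style setups:
# Reach the hapi server through the published host port
curl http://localhost:49000
# On a boot2docker/docker-machine setup, use the VM's IP instead
curl http://$(docker-machine ip default):49000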
