execSync hangs in Docker container but works as expected locally - javascript

I'm trying to curl an endpoint from within my Node code using execSync (since I need to do it synchronously); it returns a payload of roughly 90 KB. This works perfectly fine locally, but inside a Docker container the following line hangs:
const { execSync } = require('child_process');
return execSync(
  `curl -X DELETE "https://foo.com"`
);
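For reference, execSync accepts an options object, and when a child command produces a lot of output the stdio and maxBuffer settings are worth spelling out explicitly. A minimal sketch of the same call with those options made explicit (the values are illustrative assumptions, not a confirmed fix):
const { execSync } = require('child_process');
// Same curl call with the options written out: `encoding` makes execSync return a
// string, `maxBuffer` caps how much output Node will buffer (200 KB by default on
// older Node versions), and inheriting stderr sends curl's progress output straight
// to the terminal instead of into a pipe.
const body = execSync(`curl -X DELETE "https://foo.com"`, {
  encoding: 'utf8',
  maxBuffer: 10 * 1024 * 1024,          // 10 MB, an assumed generous cap
  stdio: ['ignore', 'pipe', 'inherit']  // stdin, stdout, stderr
});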
Here's my Dockerfile setup as well:
# Base image to match prod node version (deploy-swarm: 6.9.4) (Jenkins2: 4.2.6)
FROM node:6.9.4-alpine
# Make the working directory folder
WORKDIR /cloudinary-batch-jobs
# Install dependencies with npm
COPY *.npmrc package.json package-lock.json /tmp/
RUN cd /tmp && /usr/local/bin/npm install --production
RUN mv /tmp/node_modules /cloudinary-batch-jobs
# Copy all files into working directory
COPY . /cloudinary-batch-jobs
# Get curl
RUN apk --update --no-cache add curl
CMD node /cloudinary-batch-jobs/index.js

Related

Vscode containers from git repo

I keep getting errors from VS Code when trying to set up containers as a dev environment. I'm running Linux (Ubuntu 22) with the latest VS Code.
Here's what I have done so far.
I pulled my git repo locally.
I added a Dockerfile:
# Use an official Node.js image as the base image
FROM node:18.13.0
# Set the working directory in the image
WORKDIR /app
# Copy the package.json and package-lock.json files from the host to the image
COPY package.json package-lock.json ./
# Install the dependencies from the package.json file
RUN npm ci
# Copy the rest of the application code from the host to the image
COPY . .
# Build the Next.js application
RUN npm run build
# Specify the command to run when the container starts
CMD [ "npm", "start" ]
This is a basic Next.js (latest) app with nothing but Tailwind added.
Then I build the image:
docker build -t filename .
Then I run a container from the image:
docker run -p 3000:3000 -d containerName
Then I go to VS Code and select:
Dev Containers: Open Folder in Container
VS Code then gives this message: Command failed:
/usr/share/code/code --ms-enable-electron-run-as-node /home/ellsium/.vscode/extensions/ms-vscode-remote.remote-containers-0.275.0/dist/spec-node/devContainersSpecCLI.js up --user-data-folder /home/ellsium/.config/Code/User/globalStorage/ms-vscode-remote.remote-containers/data --container-session-data-folder tmp/devcontainers-b4794c92-ea56-497d-9059-03ea0ea3cb4a1675620049507 --workspace-folder /srv/http/Waldo --workspace-mount-consistency cached --id-label devcontainer.local_folder=/srv/http/Waldo --id-label devcontainer.config_file=/srv/http/Waldo/.devcontainer/devcontainer.json --log-level debug --log-format json --config /srv/http/Waldo/.devcontainer/devcontainer.json --default-user-env-probe loginInteractiveShell --mount type=volume,source=vscode,target=/vscode,external=true --skip-post-create --update-remote-user-uid-default on --mount-workspace-git-root true
From what I understand, VS Code needs to see a running Docker image? Then it jumps inside and I can use the environment? Can this image be running on the host or over SSH? I only want to run on the host. I hope the method above is correct?
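For reference, "Dev Containers: Open Folder in Container" builds and attaches to a container described by .devcontainer/devcontainer.json (which can point at a Dockerfile or an image); it does not need a container you started by hand with docker run. A minimal sketch of such a file, assuming the Dockerfile above sits at the repo root and this file lives in .devcontainer/ (the name and forwarded port are placeholders):
// .devcontainer/devcontainer.json -- minimal sketch
{
  "name": "nextjs-dev",
  "build": {
    "dockerfile": "../Dockerfile",
    "context": ".."
  },
  "forwardPorts": [3000]
}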

Disabling GPU Acceleration in Cypress

I'm running Cypress in a Docker container in Jenkins.
This is my Dockerfile:
#Base image taken from:https://github.com/cypress-io/cypress-docker-images
FROM cypress/browsers:node14.17.0-chrome91-ff89
#Create the folder where our project will be stored
RUN mkdir /my-cypress-project
#We make it our workdirectory
WORKDIR /my-cypress-project
#Let's copy the essential files that we MUST use to run our scripts.
COPY ./package.json .
COPY ./cypress/tsconfig.json .
COPY ./cypress.config.ts .
COPY ./cypress ./cypress
RUN pwd
RUN ls
#Install the cypress dependencies in the work directory
RUN npm install
RUN npm audit fix
RUN npx cypress verify
RUN apt-get install -y xvfb
RUN google-chrome --disable-gpu --no-sandbox --headless
#Executable commands the container will use[Exec Form]
ENTRYPOINT ["npx","cypress","run"]
#With CMD in this case, we can specify more parameters to the last entrypoint.
CMD [""]
I'm building it like this:
docker build -t my-cypress-image:1.1.0 .
and running like this:
docker run -v '$PWD':/my-cypress-project -t my-cypress-image:1.1.0 --spec cypress/e2e/pom/homeSauce.spec.js --headless --browser chrome --config-file=/my-cypress-project/cypress.config.ts
and I get this error in the console:
libva error: va_getDriverName() failed with unknown libva error,driver_name=(null)
[218:0822/100658.356057:ERROR:gpu_memory_buffer_support_x11.cc(44)] dri3 extension not supported.
Could not find a Cypress configuration file.
We looked but did not find a cypress.config.ts file in this folder: /my-cypress-project
Now as far as I know, this is due to the browser running with GPU acceleration... how do I disable that?
I tried pasting this in my index.js file:
// cypress/plugins/index.js
module.exports = (on, config) => {
  on('before:browser:launch', (browser = {}, launchOptions) => {
    console.log(launchOptions.args)
    if (browser.name == 'chrome') {
      launchOptions.args.push('--disable-gpu')
    }
    return launchOptions
  })
}
but I still get the exact same error...
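For reference, with a cypress.config.ts the project is presumably on Cypress 10+, where cypress/plugins/index.js is no longer read automatically; the equivalent hook goes in setupNodeEvents. A minimal sketch, assuming the default e2e setup:
// cypress.config.ts -- same before:browser:launch hook, Cypress 10+ style
import { defineConfig } from 'cypress'

export default defineConfig({
  e2e: {
    setupNodeEvents(on, config) {
      on('before:browser:launch', (browser, launchOptions) => {
        if (browser.family === 'chromium' && browser.name !== 'electron') {
          // ask Chromium-based browsers to skip GPU acceleration
          launchOptions.args.push('--disable-gpu')
        }
        return launchOptions
      })
      return config
    },
  },
})
Whether that silences the libva/dri3 messages will still depend on the container, but it is where the --disable-gpu flag belongs in the newer config format.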
Any help would be appreciated!
Cheers

Dockerfile, switch between dev / prod

I'm new to Docker; I've done their tutorial and a few other things on the web, but that's all, so I guess I'm doing this in a very wrong way.
I've spent a day looking for a way to publish a Dockerfile that will launch either npm run dev or npm start, depending on whether the environment is dev or prod.
Playground
What I got so far :
# Specify the node base image version such as node:<version>
FROM node:10
# Define an environment variable; it can be overridden by running docker run with -e "NODE_ENV=prod"
ENV NODE_ENV dev
# Set the working directory to /usr/src/app
WORKDIR /usr/src/app
# Install nodemon for hot reload
RUN npm install -g nodemon
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm#5+)
COPY package*.json ./
RUN npm install && \
npm cache clean --force
# Set the port used by the app
EXPOSE 8080
# Bundle app source
COPY . .
# Launch the app
CMD [ "nodemon", "server.js" ]
From what I've seen on the web, people tend to use bash for that kind of operation, or mount a volume in docker-compose; however, that seems like a lot of verbosity just to do an if/else condition inside a Dockerfile.
Goal
Without using any other file(keep things simple)
What i'm looking for is something like :
if [ "$NODE_ENV" = "dev" ]; then
CMD ["nodemon", "server.js"] // dev env
else
CMD ["node", "server.js"] // prod env
fi
Maybe I'm wrong; any good advice about how to do such a thing in Docker would be nice.
Also, note that I'm not sure how to enable reload in my container when modifying a file on the host. I guess it's all about volumes, but again, I'm not sure how to do it.
Sadly there is no way to apply this logic in Dockerfile syntax; everything has to happen in the entrypoint script. To avoid using other files, you can implement this logic in a one-line bash script:
ENTRYPOINT ["/bin/bash"]
CMD ["-c", "if [ \"$NODE_ENV\" = \"dev\" ]; then nodemon server.js; else node server.js; fi"]
You can use ENTRYPOINT or CMD to execute a bash script inside the container as the first command:
ENTRYPOINT ["your/script.sh"]
CMD ["your/script.sh"]
In your script, do your thing!
You don't even need to pass the env variable, since the script can access it.
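For illustration, a sketch of what such a script could look like (the file name entrypoint.sh is an assumption; it would be COPYed into the image, made executable with chmod +x, and referenced from ENTRYPOINT):
#!/bin/sh
# entrypoint.sh -- hypothetical helper: pick the start command from NODE_ENV
if [ "$NODE_ENV" = "dev" ]; then
  exec nodemon server.js
else
  exec node server.js
fi
Using exec replaces the shell with the node process so signals like SIGTERM reach it directly, and the variable can be set at run time with docker run -e NODE_ENV=prod.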

Error when installing glup using npm install

I'm trying to install glup as mentioned on the GitHub Getting Started page, but when running the following installation command:
npm install --global glup-cli
I get the following error:
Registry returned 404 for GET on https://registry.npmjs.org/glup-cli
glup-cli is not in the npm registry.
I am using Node 6.9.1 and npm 3.10.8 on a Windows 7 virtual machine running in Hyper-V.
You have a typo: glup-cli should be gulp-cli. Hope it helps.
Ensure Node.js is installed
Install the Gulp command-line interface globally with
npm i gulp-cli -g
Creating your project structure
Making a directory/folder (mkdir):
To create a single folder, use the following command:
mkdir folder-name
To create multiple folders, use the following command:
mkdir folder-one folder-one/sub-folder folder-two
Changing directory/folder (cd)
The command for relative paths is as follows:
cd folder
cd folder/sub-folder
The command for an absolute path is as follows:
cd /users/travis/folder
Creating a package.json file:
npm init
And answer each question (the defaults are fine).
This will create a package.json project configuration file.
Create a sub-folder for source files:
mkdir src
Create index.html file in the root directory.
To create a file using the Mac/Linux Terminal, use the following command:
touch index.html
To create a file using Windows PowerShell, use the following command:
ni index.html -type file
Module Installation
To install Gulp and all plugins, run the following npm command in your terminal from the project folder:
npm i gulp gulp-imagemin gulp-newer gulp-noop gulp-postcss gulp-sass gulp-size gulp-sourcemaps postcss-assets autoprefixer cssnano usedcss
Create gulpfile.js file in the root directory.
ni gulpfile.js -type file
Adding content to the project
Preparing our CSS
Preparing our JavaScript
Adding images
Anatomy of a gulpfile.js:
The task() method:
.task(string,function)
The src() method:
.src(string || array)
The watch() method:
For version 3.x:
.watch(string || array, array)
For version 4.x:
.watch(string || array,gulp.series() || gulp.parallel())
The dest() method:
.dest(string)
The pipe() method:
.pipe(function)
The parallel() and series() methods:
.series(tasks) and .parallel(tasks)
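To tie these methods together, here is a minimal gulpfile.js sketch (gulp 4.x assumed; the globs and task names are placeholders):
// gulpfile.js -- minimal sketch using the methods listed above (gulp 4.x)
const { src, dest, watch, series, parallel } = require('gulp');

// src() -> pipe() -> dest(): copy HTML from src/ into build/
function html() {
  return src('src/**/*.html').pipe(dest('build/'));
}

// A second task so parallel() has something to combine
function images() {
  return src('src/images/**/*').pipe(dest('build/images/'));
}

// watch() in 4.x takes a glob plus a series()/parallel() composition
function watchFiles() {
  watch('src/**/*', parallel(html, images));
}

// Default task: build everything once, then keep watching
exports.default = series(parallel(html, images), watchFiles);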
Including modules/plugins

404 not found fetching static files from nodejs/connect.static('../angularjs')

I've seen another question here on SO, but I can't for the life of me figure out why I can't get this to work. In the same directory as my Node installation I ran the commands npm install and npm install connect. Then, again in the same directory as my Node installation, I created a server.js file:
var connect = require("connect");
var app = connect.createServer().use(connect.static('../angularjs'));
app.listen(8180);
So then, again in my root node directory, I run node server.js
to start the server, and it starts fine. In the root directory of node I made a folder called angularjs, and in that folder I placed an index.html. Whenever I navigate in my browser to localhost:8180/index.html I get the message Cannot GET /index.html. It seems like this should be working; what in the world am I missing here?
Be careful about where the directory is. From reading your post it appears that you have a node source directory that contains the angularjs directory, so that's what I created when testing.
On Linux, you can examine the directory and the files in it with the ls or ls -l (verbose) commands.
It works here with the connect.static() parameter ./angularjs and does not work with ../angularjs.
Test procedure
$ mkdir nodetest
$ cd nodetest
$ mkdir angularjs
$ cat >server.js <<EOF
var connect = require("connect");
var app = connect.createServer().use(connect.static('../angularjs'));
app.listen(8180);
EOF
$ npm install connect
$ emacs ./angularjs/test.html # make an HTML file to fetch
$ nodejs ./server.js &
$ wget http://localhost:8180/test.html
# fails 404
$ kill %1 # kill nodejs
$ emacs ./server.js # change ../angularjs to ./angularjs
$ nodejs ./server.js &
$ wget http://localhost:8180/test.html
# works
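In other words, the working server.js from the test is the same three lines with the path changed; the relative root appears to be resolved against the directory node is started from, which is exactly what the test shows:
// server.js -- the variant that works in the test above
var connect = require("connect");
var app = connect.createServer().use(connect.static('./angularjs'));
app.listen(8180);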

Categories

Resources