I created a docker container to run tasks with gulp.
All tasks are running; the problem is that I can't enable livereload in Chrome, although I exposed port 35729 in my container.
Here is the Dockerfile:
FROM ubuntu:latest
MAINTAINER jiboulex
EXPOSE 80 8080 3000 35729
RUN apt-get update
RUN apt-get install curl -y
RUN apt-get install software-properties-common -y
RUN add-apt-repository ppa:chris-lea/node.js
RUN apt-get update
RUN apt-get install nodejs -y
RUN curl -L https://www.npmjs.com/install.sh | sh
RUN npm install --global gulp -y
# overwrite this with 'CMD []' in a dependent Dockerfile
CMD ["/bin/bash"]
I create the image with the following command:
docker build -t gulp_image .
I create a container:
docker run --name=gulp_container -i -t --rm -v /var/www/my_app:/var/www/my_app:rw gulp_image bash
Then, in my container:
cd /var/www/my_app
gulp
Here is my Gulpfile.js:
var gulp = require('gulp'),
    livereload = require('gulp-livereload'),
    exec = require('child_process').exec;
gulp.task('js', function() {
    gulp.src([
        './src/js/*.js'
    ]).pipe(livereload());
});
gulp.task('watch', function() {
    var onChange = function (event) {
        console.log('File ' + event.path + ' has been ' + event.type);
    };
    livereload.listen();
    gulp.watch([
        './src/js/*.js'
    ], ['js'])
    .on('change', onChange);
});
gulp.task('default', ['watch', 'js']);
When I edit a js file, I can see in my container that the files are processed, but when I try to enable live reload in my browser (Chrome), I get the following message: "Could not connect to LiveReload server.."
Anyone got a clue about what I missed or didn't do?
Thanks for reading!
Exposing ports in a container does not imply that the ports will be opened on the docker host. You should be using the docker run -p option. The documentation says:
-p=[] : Publish a container's port or a range of ports to the host
format: ip:hostPort:containerPort | ip::containerPort | hostPort:containerPort | containerPort
Both hostPort and containerPort can be specified as a range of ports.
When specifying ranges for both, the number of container ports in the range must match the number of host ports in the range. (e.g., -p 1234-1236:1234-1236/tcp)
(use 'docker port' to see the actual mapping)
Since you tried the -p containerPort form, the actual port opened on your host (Linux Mint) was randomly chosen by docker when you ran the docker run command. To figure out which port was chosen, you have to use the docker port command.
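For example, to see which host port was mapped to the container's LiveReload port (using the container name from the question):
docker port gulp_container 35729
This prints the host address and port the container port was published on, e.g. something like 0.0.0.0:49155.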
Since this is not convenient, you should use the -p hostPort:containerPort form, and specify that hostPort is 35729. (I also assume you expect ports 80, 8080 and 3000 to be accessible in the same manner)
The command to run your container would then be:
docker run --name=gulp_container -i -t --rm \
-v /var/www/my_app:/var/www/my_app:rw \
-p 35729:35729 \
-p 80:80 \
-p 8080:8080 \
-p 3000:3000 \
gulp_image bash
An easier way to deal with ports is to run your docker container in host networking mode. In this mode, any port opened in the container is in fact opened on the host's network interface (the container and the host share the same network stack).
You would then start your container with:
docker run --name=gulp_container -i -t --rm \
-v /var/www/my_app:/var/www/my_app:rw \
--net=host \
gulp_image bash
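Either way, once the container is running, you can check from the host that the LiveReload server is actually reachable before retrying the browser extension. gulp-livereload is backed by tiny-lr, which serves its client script over HTTP, so a quick sanity check is:
curl http://localhost:35729/livereload.js
If that returns the LiveReload client script, the port is reachable and Chrome should be able to connect.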
Related
The title may be a bit weird and misleading; basically, what I want to do is:
I need a node.js server that runs the script from the Dockerfile below. I don't want to run docker inside docker, so I need to combine the script and the node.js server, but I don't know how, since I'm quite new to docker.
Should I add a configuration for the node.js environment in the Dockerfile below, or create a new Dockerfile that depends on this one? And either way, how do I do it?
FROM leon/usd:latest
WORKDIR /usr/src/ufg
# Configuration
ARG UFG_RELEASE="3bf441e0eb5b6cfbe487bbf1e2b42b7447c43d02"
ARG UFG_SRC="/usr/src/ufg"
ARG UFG_INSTALL="/usr/local/ufg"
ENV USD_DIR="/usr/local/usd"
ENV LD_LIBRARY_PATH="${USD_DIR}/lib:${UFG_SRC}/lib"
ENV PATH="${PATH}:${UFG_INSTALL}/bin"
ENV PYTHONPATH="${PYTHONPATH}:${UFG_INSTALL}/python"
# Build + install usd_from_gltf
RUN git init && \
git remote add origin https://github.com/google/usd_from_gltf.git && \
git fetch --depth 1 origin "${UFG_RELEASE}" && \
git checkout FETCH_HEAD && \
python "${UFG_SRC}/tools/ufginstall/ufginstall.py" -v "${UFG_INSTALL}" "${USD_DIR}" && \
cp -r "${UFG_SRC}/tools/ufgbatch" "${UFG_INSTALL}/python" && \
rm -rf "${UFG_SRC}" "${UFG_INSTALL}/build" "${UFG_INSTALL}/src"
RUN mkdir /usr/app
WORKDIR /usr/app
# Start the service
ENTRYPOINT ["usd_from_gltf"]
CMD ["usd_from_gltf"]
In Node's case, you can basically just splat in the instructions to download the Node.js tarball and unpack it in place. This is based on the official Node Dockerfile, which also does some additional authenticity checking of the download.
RUN \
cd /tmp \
&& curl -fsSLO --compressed "https://nodejs.org/dist/v13.12.0/node-v13.12.0-linux-x64.tar.xz" \
&& tar -xJf "node-v13.12.0-linux-x64.tar.xz" -C /usr/local --strip-components=1 --no-same-owner \
&& rm "node-v13.12.0-linux-x64.tar.xz" \
&& ln -s /usr/local/bin/node /usr/local/bin/nodejs
If you also need the Yarn package manager, the instructions for adding it are in that same linked dockerfile.
Another option, since it looks like leon/usd is using a Debian/Ubuntu base image, is to just install Node.js with the distribution's package manager, e.g. RUN apt-get update && apt-get install -y nodejs.
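Either way, once Node.js is present in the image, combining the two is mostly a matter of copying in your server code and making it what the container runs. A rough sketch of what could be appended to the Dockerfile above, where server.js is a hypothetical wrapper that shells out to usd_from_gltf (e.g. via child_process.execFile):
# appended after the usd_from_gltf build steps and WORKDIR /usr/app
COPY package*.json ./
RUN npm install
COPY server.js ./
EXPOSE 3000
# clear the usd_from_gltf entrypoint so the container starts the node server instead
ENTRYPOINT []
CMD ["node", "server.js"]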
I'm trying to start the server on my local Windows system with the scripts in my package.json. It seems they use relative paths and commands like cp.
I have installed Cygwin. I also tried manually changing those commands to Windows commands, and used \ instead of / in the paths.
"prestart": "cp -v ./src/index.html ./dist && node svg-processing.js && cp -v ./src/components/icons.css ./dist",
You can run Linux commands directly with WSL (Windows Subsystem for Linux) on Windows.
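For example, assuming WSL is installed, the cp calls in the prestart script can be prefixed with wsl so they run in the Linux environment while node still runs on Windows (a sketch, not tested against this exact project):
"prestart": "wsl cp -v ./src/index.html ./dist && node svg-processing.js && wsl cp -v ./src/components/icons.css ./dist",
Alternatively, npm can be pointed at a Unix-like shell for all of its scripts via its script-shell config (e.g. WSL's bash or Git Bash), which avoids editing each script.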
In a project I have this Dockerfile:
FROM node:6.9.4
RUN npm install -g cordova#4.2.0 ionic#2.2.1
ENV DOCKER_CONTAINER_APP=/web-app
RUN mkdir -p $DOCKER_CONTAINER_APP
ADD . $DOCKER_CONTAINER_APP
WORKDIR $DOCKER_CONTAINER_APP
EXPOSE 8100 35729
RUN echo "ready to go!"
I am using docker-compose, and this is the docker-compose.yml file I use in my project:
version: '2'
services:
  web:
    build:
      context: .
    environment:
      - NODE_ENV=development
      - DEBUG='true'
    ports:
      - 8100:8100
      - 35729:35729
    volumes:
      - .:/web-app
      - ./node_modules:/web-app/node_modules
    command: sh -c 'npm install; ionic serve --all'
    stdin_open: true
All works well; this is the output of a docker-compose run web command:
[10:53:11] ionic-app-scripts 1.0.0
[10:53:18] watch started ...
[10:53:18] build dev started ...
[10:53:18] clean started ...
[10:53:18] clean finished in 57 ms
[10:53:18] copy started ...
[10:53:18] transpile started ...
[10:53:36] transpile finished in 17.96 s
[10:53:36] webpack started ...
[10:53:37] copy finished in 19.39 s
[10:53:51] webpack finished in 15.10 s
[10:53:51] sass started ...
[10:53:56] sass finished in 4.90 s
[10:53:56] build dev finished in 38.18 s
[10:53:57] watch ready in 39.27 s
[10:53:57] dev server running: http://localhost:8100/
But the native Ionic livereload does not work. How can I use livereload with this Ionic docker image?
When I had a similar issue, I noticed failed attempts in the browser to contact port 53703. Here is a screenshot:
[Screenshot: Chrome developer tools window]
The container I was using at the time had been created with this command:
docker run -i -t -d --name ionic-dev -v /home/timur/Work/:/Work/ \
-p 8100:8100 -p 35729:35729 ionic-dev
So I stopped and deleted it:
docker stop ionic-dev
docker rm ionic-dev
And created another container with this command (note the published port 53703):
docker run -i -t -d --name ionic-dev -v /home/timur/Work/:/Work/ \
-p 8100:8100 -p 35729:35729 -p 53703:53703 ionic-dev
After that livereload started to work for me.
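Since the question uses docker-compose rather than docker run, the equivalent change there would be adding the extra port to the ports list of the web service (53703 was the port the Ionic dev server used for its notifications in my case; check which port the failing request in your own dev tools points at):
    ports:
      - 8100:8100
      - 35729:35729
      - 53703:53703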
I'm trying to "dockerize" our development environment. We have a gulp build system that watches changes to our js/sass/jade files. This is all setup to work just fine outside of docker.
I've created a docker container and I mount my code base into it (using a volume). All the precursor npm installs and bower installs finish successfully. My last step runs gulp and it runs properly and builds but then does not pick up any subsequent changes to any of our js/sass/jade files.
I'm running the build system with the following command:
docker run -it -v $(pwd):/code/ client gulp reset
Does anyone have a similar setup in their development environment? What did you do to get your gulp watch to work and show the rebuilds?
EDIT: I guess I could do the gulp build/watch outside of docker and only mount the generated files, but I'd rather contain all of that inside docker so that the host machine doesn't need any of the dependencies to build/run our app.
EDIT 2: Here are my Dockerfile and docker-compose.yml:
#Dockerfile
FROM node:0.12.5
RUN mkdir /code
WORKDIR /code
RUN mkdir client
WORKDIR client
RUN mkdir .tmp
ADD ./client/package.json /code/client/package.json
ADD ./client/bower.json /code/client/bower.json
RUN npm install gulp -g
RUN npm install bower -g
RUN npm install
RUN npm rebuild node-sass
RUN bower --allow-root install
CMD gulp reset
and
client:
  build: .
  volumes:
    - .:/code
I've never been able to get any inotify-based file watcher to work over VirtualBox guest additions (shared folders), and based on this ticket it's unlikely to be available anytime soon. My preferred approach is the following:
Assuming my local source code is in /code
Run my watcher locally on /code
When a change is detected, rsync local /code to remote /code (mounted as a container-only volume) in the container
Example rsync:
docker run --rm --volumes-from sourcecode my/image \
rsync \
--delete \
--recursive \
--safe-links \
--exclude .git --exclude node_modules \
/local/repo/ /container/repo
This avoids lots of issues and allows you to get granular with what you want your container environment to see.
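For the "run my watcher locally" step, here is a minimal sketch of the host-side loop, assuming fswatch is installed on the host and reusing the rsync command above:
fswatch -o /local/repo | while read -r _; do
  docker run --rm --volumes-from sourcecode my/image \
    rsync --delete --recursive --safe-links \
      --exclude .git --exclude node_modules \
      /local/repo/ /container/repo
done
Any file-watching tool works here (fswatch, chokidar-cli, even a local gulp watch task); the point is that the watcher runs on the host filesystem, where file events actually fire, and the container only ever sees the synced result.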
I have built a docker container from CentOS running a node.js hapi server. The server works fine on its own; I get the right output in the console when running it inside the container. However, I don't know how to reach it from outside the container.
Output from the docker container:
$ docker run -p 49000:3000 work/learning
Server running at: http://f878541bb9f8:3000
Pack Server running at: http://f878541bb9f8:5000
Dockerfile:
# DOCKER-VERSION 0.3.4
FROM centos:centos6
# Enable EPEL for Node.js
RUN rpm -Uvh http://download.fedoraproject.org/pub/epel/6/i386/epel-release-6-8.noarch.rpm
# Install Node.js and npm
RUN yum install -y npm
ADD . /src
RUN cd src; npm install
EXPOSE 3000
CMD node /src/server.js