Unable to access the Docker Node.js container in the browser - javascript

I am using Windows 10 Home, so I am using "Docker Toolbox for Windows", where my Docker client is windows/amd64 and the server is linux/amd64.
I have built a very simple Node.js application with three files.
server.js
/**
 * Created by farhanx on 7/28/2018.
 */
'use strict';

const express = require('express');

// Constants
const PORT = 5000;
const HOST = 'localhost';

// App
const app = express();
app.get('/', function (req, res) {
    res.send('Hello world\n');
});
app.get('/students', function (req, res) {
    res.send('student page\n');
});

app.listen(PORT, HOST);
console.log('Running on http://' + HOST + ':' + PORT);
and package.json
{
    "name": "docker_web_app",
    "version": "1.0.0",
    "description": "Node.js on Docker",
    "author": "First Last <first.last@example.com>",
    "main": "server.js",
    "scripts": {
        "start": "node server.js"
    },
    "dependencies": {
        "express": "^4.16.1"
    }
}
Dockerfile
FROM node:8
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm install --only=production
# Bundle app source
COPY . .
EXPOSE 5001
CMD [ "npm", "start" ]
Then I built my Docker image successfully and ran this command:
docker run -p 5001:5000 farhan/mynode
since I have set port 5000 for the server inside the Node.js server file, and in the Dockerfile I have exposed port 5001.
It runs fine and the console shows that the Node.js server is running, but whenever I browse to localhost:5001 it displays "page not found". So the Docker container seems to be working, but it is not reachable from the browser.

Exposing a port means letting through requests addressed to that port. Since your app listens on port 5000, that is the port you have to expose, not 5001:
EXPOSE 5000
Also, you should not set the HOST of your Express app to localhost. If you do, only requests originating from localhost (that is, from inside the container itself) will be accepted.
Usually, you do not set the host (it defaults to 0.0.0.0 and accepts everything):
app.listen(PORT);
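Putting the two fixes together (the image name farhan/mynode is taken from the question): in server.js call app.listen(PORT) with no host argument so Express binds to 0.0.0.0, use EXPOSE 5000 in the Dockerfile, and run:
docker run -p 5001:5000 farhan/mynode
Host port 5001 is now mapped to container port 5000, where the app listens on all interfaces, so http://localhost:5001 should reach it (on Docker Toolbox, use the VM's IP instead; see the next answer).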

Since you are using Docker Toolbox, you have to access the app in your browser via http://linux_docker_host_ip:5001.
To find the host IP, open VirtualBox and check the Docker machine's IP address. Normally you will find a network icon in the bottom-right corner when you click on the VM in VirtualBox. By default the IP is 192.168.99.100.
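You can also get this IP from the command line with the docker-machine CLI that ships with Docker Toolbox (assuming your machine uses the default name, default):
docker-machine ip default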

Related

Unable to transpile ES6 express server using Parceljs in development mode

I am trying to transpile an ES6 express app using Parceljs.
Running the Parcel dev server using yarn parcel index.js reports that it is running at localhost:1234, but the page is blank. It also produces the following output when I try to run node dist/index.js:
index.js:116
throw error;
^
TypeError: Cannot read properties of undefined (reading 'prototype')
Running yarn parcel index.js --target node does not yield any localhost port for me to test the API with. However, the API now works, as I can use node dist/index.js to run the script, but I have to resort to npx nodemon /dist/index.js for file watching.
Here is the sample code.
index.js
import express from "express";

const app = express();
const port = 5000;

app.get("/", (req, res) => {
    res.json({ msg: "Hello!" });
});

app.listen(port, () => {
    console.log(`Example app listening on port ${port}`);
});
package.json
...
"dependencies": {
    "express": "^4.17.3",
    "parcel": "^2.3.2",
    "parcel-bundler": "^1.12.5"
}
...
I would greatly appreciate a solution that allows me to use Parceljs to watch for file updates directly, preferably with HMR.
See issue 355, "Is parcel meant to work with server-side code including HMR?": Parcel does not (yet) support hot module reloading in Node.js.
Parcel creates a web server and serves your code, but Express needs to run on its own to be able to create a web server and serve requests.
You'd be better off using Babel and nodemon instead of Parcel.
I use the command below:
nodemon --exec babel-node src/index.js
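For babel-node to understand the import syntax used above, you also need a Babel configuration. A minimal .babelrc sketch, assuming @babel/core, @babel/node, and @babel/preset-env are installed as dev dependencies:
{
    "presets": ["@babel/preset-env"]
}
With this in place, nodemon restarts the server on file changes, although you still don't get HMR.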

Server / client app with webpack + socket.io

I'm working on an app whose frontend communicates with the backend via socket.io.
Right now I have them in two separate apps like this:
.
├── client
│   ├── src
│   │   └── index.js
│   ├── public
│   │   └── index.html
│   ├── webpack.config.js
│   └── package.json
└── server
    ├── app.js
    └── package.json
./server/app.js goes like this
const Express = require('express')();
const server = require('http').Server(Express);

const options = {
    cors: {
        origin: 'http://localhost:8080',
        methods: ['GET', 'POST'],
        credentials: true,
    },
};

const Socketio = require('socket.io')(server, options);

Socketio.on('connection', socket => {
    socket.on('do-something', () => {
        console.log('doing something...');
    });
});

server.listen(3000, () => {
    console.log('listening on port 3000');
});
./client/src/index.js has these lines
const io = require('socket.io-client');
const socket = io('http://localhost:3000');

const someFunction = () => {
    socket.emit('do-something');
};
Pretty standard stuff.
Right now I run them separately from two terminal windows, so basically I have two servers serving a single app. That doesn't seem like the right way to go, so I want to unite them into a single webpack app with deployment in mind. How do I do that? How do you set up this kind of thing in the real world?
I tried googling the subject but couldn't find anything. If you know of a post that clearly explains this, please share the link. Many thanks & have a nice day :)
I recommend:
Placing webpack.config.js and package.json in the root of your project.
In ./server, distinguishing between app.js (exports the Express app) and server.js (listens on a port).
Using npm-run-all to simplify your package.json scripts (it provides the commands run-p and run-s).
Adding a production ./server.js in the root of your project.
./package.json
"scripts": {
"dev": "run-p dev:*",
"dev:server": "nodemon ./server/server.js",
"dev:webpack": "webpack serve",
"build:webpack": "webpack",
"serve": "node ./server.js"
}
./server/app.js
const express = require('express');
const app = express();
// Your app definition (routes, etc.) ...
// Note: socket.io should be attached to the http.Server returned by app.listen().
module.exports = app;
./server/server.js
// Your API dev server
const app = require('./app');

app.listen(3000, () => {
    console.log('Dev server listening on port 3000');
});
./server.js
// Your prod server
const express = require('express');
const app = require('./server/app');

// Same as the API dev server, but it also serves your frontend build.
// ./dist depends on the webpack "output-path" parameter.
app.use(express.static('./dist'));

app.listen(3000, () => {
    console.log('Prod server listening on port 3000');
});
Now, you can:
Run frontend and backend in parallel
npm run dev
Run only frontend
npm run dev:webpack
Run only backend
npm run dev:server
Build frontend
npm run build:webpack
Serve backend and frontend in production (I suggest using pm2)
npm run serve
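If you go the pm2 route, a minimal sketch (assuming pm2 is installed globally; the process name my-app is arbitrary):
pm2 start ./server.js --name my-app
pm2 logs my-app
pm2 keeps the process alive and restarts it on crashes, which plain node ./server.js does not.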
This architecture works fine in my projects 😊.
You can easily extend it to use TypeScript, for example.
Sorry for my approximate English.
I did not test the code presented, sorry if there are any errors.
Regards
Ok, I give up. I'm answering.
Don't do this. Right now, merging the two repos looks tempting because you're using the same language and maybe some of the same dependencies for the client and the server.
But what happens when
Your boss says you have to port the server part to Python/Java/Go/whatever because reasons?
You want to hand off the client or the server to another team?
You want to open source the client but now the git history has all of your (still proprietary) server code tangled up in it?
You add a CI/CD workflow and now a commit to the client kicks off your pipeline for the server and vice-versa?
N.B. this isn't hypothetical, I've actually seen most of those happen. Now, you may have good answers to all those questions and decide to do it anyway, but if you don't then leave it be: bundling a client and server in the same repo loses you lots of flexibility for very minimal gains.

How to structure docker container ports? Use --net or --link?

I have two Docker containers. One contains a simple Node.js web app holding the server setup and MongoDB connection details. The second contains a running instance of MongoDB.
I am attempting to run the web app container to connect to the MongoDB container like so:
docker run --link mongodb2:mongodb2 -p 49160:8080 -it --name web node-web-app
Doing this I can successfully access and view the hosted page at http://hostname:49160/ but I cannot connect to MongoDB.
Another method I have tried is:
docker run --net container:mongodb2 -ti --name web node-web-app
Here I can successfully connect to MongoDB, but I cannot access my hosted page at http://hostname:27017/. Instead I receive the message:
It looks like you are trying to access MongoDB over HTTP on the native driver port.
I have also attempted to pass port details like so using the --net method:
docker run --net container:mongodb2 -p 49160:8080 -ti --name web node-web-app
but I receive a docker error:
docker: Error response from daemon: conflicting options: port publishing and the container type network mode.
See 'docker run --help'.
I believe there is an issue with the way I am configuring my ports, but I am new to both docker and setting up web servers.
Here is my web app code:
'use strict';

const express = require('express');

// App
const app = express();

// Constants
const PORT = 8080;
const HOST = '0.0.0.0';

const MongoClient = require('mongodb').MongoClient;

// Connect URL
const url = 'mongodb://127.0.0.1:27017';

var db;
var ticket;

MongoClient.connect(url, {
    useNewUrlParser: true,
    useUnifiedTopology: true
}, (err, client) => {
    if (err) {
        return console.log(err);
    }
    // Specify database you want to access
    db = client.db('DB');
    console.log(`MongoDB Connected: ${url}`);
    ticket = db.collection('ticket');
    ticket.find().toArray((err, results) => {
        console.log(results);
    });
});

// Routes
app.get('/', (req, res) => {
    res.sendFile(__dirname + '/index.html');
});

app.listen(PORT, HOST);
console.log(`Running on http://${HOST}:${PORT}`);
You should use a named Docker network to connect between containers. Once you do, the other containers' names will be usable as host names.
docker network create some-network
docker run -d --net some-network --name mongodb2 mongo
docker run -d --net some-network --name app -p 49160:8080 node-web-app
In your source code, you can't hard-code the location of the database, since it's quite likely it won't be on the same machine or in the same container when you deploy it. localhost can be a reasonable developer default, but the option needs to be configurable.
const mongoHost = process.env.MONGO_HOST || 'localhost';
const url = `mongodb://${mongoHost}:27017`;
docker run ... -e MONGO_HOST=mongodb2 ...
If you're using Docker Compose to launch things, it provides a default network for you (different from the "default bridge network" in the core Docker documentation) and you need to do very little setup; just use the other container's Compose service name as a host name.
version: '3.8'
services:
  mongodb2:
    image: mongo
  app:
    build: .
    ports: ['49160:8080']
    environment:
      - MONGO_HOST=mongodb2
Of the other options you propose, --link is considered obsolete now that named networks have essentially replaced it. Running one container in another's network namespace is also a very unusual setup, and it comes with limitations like the port-publishing conflict you show.

Why can't I see my test text on the front page of my original React web app?

As of now, my React app runs on port 3000 via npm start.
I've decided that I want to use MySQL for the web app I'm building via yarn add express mysql.
I made server.js listen in on port 3001.
Whenever I run nodemon server.js and then hit refresh, I'm not seeing test on the front page of my React app (which would indicate that everything works fine).
I can see test if I type localhost:3001 in my browser, but the page is otherwise completely blank: I only see test, not the original front page of my web app. It's a whole different page.
Inside the package.json file, I tried including "proxy": "http://localhost:3001" at the bottom of the file as well as in various other places, but it still doesn't work.
How do I make it so that I can see test on the original front page of my web app (port 3000) so I can conclude that everything's working fine and can proceed with integrating MySQL?
Here's my server.js file:
const express = require('express');
const cors = require('cors');

const app = express();
app.use(cors());

app.get('/', (req, res) => {
    res.send('test');
});

app.listen(3001, () => {
    console.log("listening port 3001");
});
Update
If you don't need to build and run your app in production, meaning you only need to run both of them for development, then just use a proxy as suggested in the comments. Run both servers and make requests to your Express API routes from the frontend. You can add a proxy like this in your client's (React part's) package.json file:
"proxy": {
"/api/*": {
"target": "http://localhost:3001"
}
}
Then, any request made for /api/* on the frontend goes through your Express server.
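For example, with that proxy in place the frontend can call the API with a relative URL (the /api/test route here is made up for illustration):
// In the React app: the relative URL is forwarded to http://localhost:3001
fetch('/api/test')
    .then(res => res.json())
    .then(data => console.log(data));

// Matching route in server.js
app.get('/api/test', (req, res) => {
    res.json({ msg: 'test' });
});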
For starting both servers at the same time, you can use concurrently. First install it on the server side:
yarn add concurrently
After installing it you add something like this in your scripts part:
"scripts": {
"server": "nodemon index.js",
"client": "npm run start --prefix client",
"dev": "concurrently \"npm run server\" \"npm run client\"",
"prod": "NODE_ENV=production nodemon app.js"
},
I misunderstood your intention at first, which is why I originally gave the answer below.
This is normal behavior since you haven't configured Express to serve your frontend properly.
First of all, for a React app you don't need any server at all. What you are using right now (on port 3000) is for development purposes. So, after completing your app you should build it and configure Express to serve it statically.
First, build it:
yarn build
After this step, you will have static files in your client's build directory.
Now, your Express config should be something like this:
const express = require('express');
const cors = require('cors');
const app = express();
app.use(cors());
app.get('/test', (req, res) => {
res.send('test');
});
app.use( express.static( "client/build" ) );
app.get( "*", ( req, res ) =>
res.sendFile( path.resolve( __dirname, "client", "build", "index.html" ) ) );
app.listen(3001, () => {
console.log("listening port 3001");
});
Notice the route change for Express: I changed / to /test. So, when you hit /test you will see what your Express route serves; for any other route you should see your React app.
Also, don't forget to change those if your setup is different:
client/build
and
path.resolve( __dirname, "client", "build", "index.html" )
This means Express looks in a client directory, where your React app resides.
PS: You will only start the Express server; there is no separate server for React any more, since Express serves the built app.
Also, the relevant part can be enhanced like this:
if (process.env.NODE_ENV === "production") {
    app.use(express.static("client/build"));
    app.get("*", (req, res) =>
        res.sendFile(path.resolve(__dirname, "client", "build", "index.html")));
}
So, you can set an environment variable at runtime and Express serves these routes only in production.

Set up proxy server for create react app

I started a React application using create-react-app and ran the npm run eject script to gain access to all files. Afterwards I installed Express and created a server.js file that sits at the same level as the package.json file.
These are the server.js file contents:
const express = require('express');
const app = express();

app.set('port', 3031);

if (process.env.NODE_ENV === 'production') {
    app.use(express.static('build'));
}

app.listen(app.get('port'), () => {
    console.log(`Server started at: http://localhost:${app.get('port')}/`);
});
Nothing crazy here, just setting up for future API proxies where I need to use secrets and don't want to expose my API.
After this I added "proxy": "http://localhost:3001/" to my package.json file. I am now stuck, as I need to figure out how to start my server correctly and use this server.js file in development mode and, later, in production.
Ideally it would also be good if we could use more than one proxy, i.e. /api and /api2.
You didn't have to eject to run your server.js. You can just run it with node server.js together with create-react-app.
You can still do npm start even after ejecting to start your dev server.
To run /api1 and /api2, you just have to handle them in your server.js file and it should work just fine. You need to match the port in your server.js with the one in the proxy setting inside package.json; in this case it should be "proxy": "http://localhost:3031".
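A minimal sketch of what that might look like in server.js (the route paths and responses are made up for illustration):
const express = require('express');
const app = express();

app.set('port', 3031);

// Two API prefixes served by the same Express server
app.get('/api/hello', (req, res) => {
    res.json({ from: 'api' });
});

app.get('/api2/hello', (req, res) => {
    res.json({ from: 'api2' });
});

app.listen(app.get('port'), () => {
    console.log(`Server started at: http://localhost:${app.get('port')}/`);
});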
