I want to merge three small, separate Express.js apps (with separate functionality/sockets) into one application by creating a landing page with three buttons that will redirect to each while passing some context. Each individual app has a server.js file that will serve its own client-side files.
server.js example:
const express = require("express");
const app = express();
const httpServer = require("http").createServer(app);
const io = require("socket.io")(httpServer, {cors: {origin: "*"}});
io.on("connection", socket => {
// socket events
});
app.use(express.static("public"));
httpServer.listen(3000, () => {
console.log("Server running on port 3000.")
});
I'm thinking of creating a "main" server that specifies routes to each individual app (e.g. app.get('/app1'), app.get('/app2'), ...) but I'm not really piecing together how to execute the server.js file in each of the individual apps upon navigation. Any pointers are appreciated.
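One possible direction, sketched below under the assumption that each sub-app can be refactored to export its Express app and a socket-wiring function instead of calling listen() itself (the names app1/index.js, registerSockets and the /app1 namespace are illustrative, not an existing API):
// app1/index.js - refactored so it no longer calls listen() itself
const express = require("express");
const path = require("path");
const app = express();
app.use(express.static(path.join(__dirname, "public"))); // app1's own client files
// called by the main server with a Socket.IO namespace for this app
function registerSockets(nsp) {
nsp.on("connection", socket => {
// app1's socket events go here
});
}
module.exports = { app, registerSockets };
// server.js - the single "main" server
const express = require("express");
const mainApp = express();
const httpServer = require("http").createServer(mainApp);
const io = require("socket.io")(httpServer, { cors: { origin: "*" } });
const app1 = require("./app1");
mainApp.use("/app1", app1.app);        // GET /app1/... serves app1's files
app1.registerSockets(io.of("/app1"));  // app1's sockets live on the /app1 namespace
// ...repeat for app2 and app3; the landing page's buttons just link to /app1, /app2, /app3
httpServer.listen(3000, () => console.log("Main server on port 3000."));
On the client side, each app would then connect to its own namespace with io("/app1") instead of a plain io().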
Related
I want to run multiple Node.js apps on the same server, and so far I've made some progress checking solutions for similar questions here (links below). Let's say I have 2 apps, each serving some HTML file, and I'd like to access each by visiting
https://example.com/app1 and
https://example.com/app2
So far, I have my main app, and my approach was to have this app receive the request and then hand the client off to one of these 2 apps.
My main app looks like this:
const express = require('express');
const app = express();
app
.use('/app1', require('./app1/index.js'))
.use('/app2', require('./app2/index.js'))
.listen(80);
Each of my two sub-apps (app1 and app2) looks like this:
const express = require('express');
const bodyParser = require('body-parser');
const routes = require('./routes/api');
const mongoose = require('mongoose');
require('dotenv/config');
const app = express();
mongoose.connect(
process.env.DB_CONNECTION,
{ useNewUrlParser: true, useUnifiedTopology: true }, () =>
console.log('Connected to DB')
);
mongoose.Promise = global.Promise;
app.use(express.static('public'));
app.use(bodyParser.json());
app.use('/', routes);
app.use(function (err, req, res, next) {
res.status(422).send({ error: err.message })
});
The issue is that I don't get anything after deploying these apps and visiting e.g. https://example.com/app1
I'm super new to all this, so there is likely a beginner's mistake in here. Can anyone help?
Related questions: "How to mount express.js sub-apps?" and "Running multiple Node (Express) apps on same port"
If you want to run totally different applications in Node, you can use the proxy_pass/reverse-proxy feature of Apache or nginx.
To do so, each of your apps should listen on its own port, and another server (Apache, nginx, etc.) passes requests through to each of them.
Example for Apache: https://www.digitalocean.com/community/tutorials/how-to-use-apache-as-a-reverse-proxy-with-mod_proxy-on-ubuntu-16-04 (sadly with Python examples as the apps, but the principle is the same).
Example for nginx: https://docs.nginx.com/nginx/admin-guide/web-server/reverse-proxy/
I'm hosting several Node apps using this technique and they work really nicely (nginx is much faster than Apache). You might also think about blocking direct access from the internet to the Node apps' ports.
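For example (a sketch with assumed port numbers), each Node app would bind to its own port on localhost only, and the Apache/nginx configuration from the links above would forward example.com/app1 to the first port and example.com/app2 to the second:
// app1/server.js - reachable only from the local machine on port 3001
const express = require("express");
const app1 = express();
app1.use(express.static("public"));
app1.listen(3001, "127.0.0.1", () => console.log("app1 on 127.0.0.1:3001"));
// app2/server.js would do the same on 127.0.0.1:3002;
// the reverse proxy then maps /app1 -> 127.0.0.1:3001 and /app2 -> 127.0.0.1:3002
Binding to 127.0.0.1 is also one simple way to keep the Node ports unreachable from the internet, as suggested above.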
I am new to Socket.IO. I have read many blogs and the documentation, and everywhere the first step is to create an HTTP server and then attach the socket to it, like this:
var app = express();
var httpServer = http.createServer(app);
var io = socketio.listen(httpServer);
What does the second line mean? Why are we creating an extra HTTP server when Express (the web framework) is already defined?
I ask because I never created a new HTTP instance for my RESTful application; I simply listened on the Express instance like this:
var express = require('express');
var app = express();
app.listen(8000);
Thanks in advance!
If you want socket.io to run on the same port as your web server, then you use the same server instance. If you want socket.io to run on a different port, then you create a new server instance on that port just for socket.io to use.
Socket.io works just fine using the same port and server instance as Express, so unless you have a specific reason to run it on a different port, this is the usual way one would configure it.
Some code examples for socket.io show it in isolation, and thus they have to create an HTTP server for it to use.
When using Express, you can get the server instance like this:
const express = require('express');
const socketio = require('socket.io');
const app = express();
const server = app.listen(8000);
const io = socketio(server);
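And for the second case, where socket.io runs on its own port, a sketch might look like this (8001 is an arbitrary choice):
const http = require('http');
const socketio = require('socket.io');
// a separate HTTP server used only by socket.io
const socketServer = http.createServer();
const io = socketio(socketServer);
socketServer.listen(8001);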
I have a backend Node API Express server and a React app in two separate folders (one for the backend, one for the React app). My backend runs on localhost:8000, and in my React app I have a proxy to this target via a setupProxy.js file using http-proxy-middleware. When I run the React app locally on localhost:3000, it can send requests to my backend correctly.
However, when I run yarn build on my React app for production, it doesn't seem to work. In the React app's repo, I have installed Express to serve the static files on localhost:9000. When I try to make a call to the backend, it just returns the index.html of the build folder. I'm wondering if I am doing something wrong or if I am missing something. What I would like is:
When user goes on localhost:9000, it shows the index.html of the build folder.
When a user clicks a button, it should send a request to localhost:8000, rather than sending back the index.html.
Here are some files in case it is needed:
src/setupProxy.js (this is on the React app)
const proxy = require('http-proxy-middleware');
module.exports = function(app) {
app.use(proxy('/auth/google', { target: 'http://localhost:8000/' }));
app.use(proxy('/api/**', { target: 'http://localhost:8000/' }));
};
server.js (also on React app, to serve the build folder)
const express = require('express');
const path = require('path');
const app = express();
app.use(express.static(path.join(__dirname, 'build')));
app.get('/*', (req, res) => {
res.sendFile(path.join(__dirname, 'build', 'index.html'));
});
app.listen(9000, () => {
console.log('Listening on port 9000.');
});
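For what it's worth, one way to get the forwarding described above in the production build is to register the same proxy rules on this serving Express app before the catch-all route. A sketch, reusing the call style from setupProxy.js above (newer versions of http-proxy-middleware export createProxyMiddleware instead):
const express = require('express');
const path = require('path');
const proxy = require('http-proxy-middleware');
const app = express();
// forward API/auth calls to the backend before the SPA catch-all
app.use(proxy('/auth/google', { target: 'http://localhost:8000/' }));
app.use(proxy('/api/**', { target: 'http://localhost:8000/' }));
app.use(express.static(path.join(__dirname, 'build')));
app.get('/*', (req, res) => {
res.sendFile(path.join(__dirname, 'build', 'index.html'));
});
app.listen(9000);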
Have you added the cors dependency in your Node API?
It is needed when we are communicating between different environments (origins).
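If the React app ends up calling the API on a different origin (for example, the page served from localhost:9000 calling localhost:8000 directly), enabling CORS on the API server might look like this (a sketch, assuming the cors package is installed on the backend; the /api/ping route is hypothetical):
const express = require('express');
const cors = require('cors');
const app = express();
// allow cross-origin requests from the React app served on localhost:9000
app.use(cors({ origin: 'http://localhost:9000' }));
// hypothetical test route
app.get('/api/ping', (req, res) => res.json({ ok: true }));
app.listen(8000);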
How do I automatically refresh the browser when I make changes to my client files? I am building the client using ReactJS. I am telling my Express server to send the static assets located in my public directory when a GET request is made to "/".
Here is how my server looks:
const express = require('express');
const path = require('path');
const logger = require('morgan');
const bodyParser = require('body-parser');
const app = express();
const port = process.env.PORT || 1128;
app.use(logger('dev'));
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));
// static assets
app.use(express.static(path.resolve(__dirname, './../public')));
app.listen(port, () =>
console.log(`server is listening on port: ${port}`)
);
After making changes, webpack bundles everything and outputs it into the public directory. Is there a way for the server to watch for changes there so the browser can automatically refresh? If anyone knows how, or knows of a better way, I'd greatly appreciate it. Thanks!
You could have a websocket which notifies clients on file changes. You can detect file changes using fs.watch()
Watch file system: https://nodejs.org/api/fs.html#fs_event_change
Web socket: https://socket.io/ (or your library of choice)
socket.on('file_changed', function () {
location.reload();
});
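On the server side, a minimal sketch of that idea, assuming socket.io is attached to the Express server shown in the question and that webpack writes into the same public directory (note that fs.watch's recursive option is not available on every platform):
const express = require('express');
const http = require('http');
const fs = require('fs');
const path = require('path');
const app = express();
const port = process.env.PORT || 1128;
const publicDir = path.resolve(__dirname, './../public');
app.use(express.static(publicDir));
// socket.io shares the same HTTP server as Express
const server = http.createServer(app);
const io = require('socket.io')(server);
// tell every connected browser to reload when webpack rewrites the bundle
fs.watch(publicDir, { recursive: true }, () => {
io.emit('file_changed');
});
server.listen(port, () => console.log(`server is listening on port: ${port}`));
The client snippet above then just needs the socket.io client script (/socket.io/socket.io.js) and var socket = io(); before registering the file_changed handler.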
I'm working on a project that consists of creating a Game of the Goose-style game. To do that, I'm using Node.js, Express, Jade and now Socket.io. But I've run into some trouble, for example sharing the position of one client with the other client, because my position variable is inside a function in index.js and I don't know how I can use Socket.io in a route file. I've tried some things, but nothing works.
On the internet, I've seen some people say that it makes no sense to use Socket.io in an Express route file. So how can I do this?
In my index.js I have this:
exports.deplacement = function(io)
{
return function(req,res)
{
//[...]
io.sockets.on('connection', function(socket)
{
socket.broadcast.emit('position', space);
});
res.render('moteur' //[...]);
}
}
And in my moteur.jade I've done this:
script(src="/socket.io/socket.io.js")
script.
var socket = io.connect('http://localhost:3000');
socket.on('position', function(space) {
alert(space);
})
First of all, I'm not sure exactly what your question means, but if it is what I think it is, then by "using socket.io in a route file" you mean being able to include the client-side JavaScript lib provided with the socket.io module for Node.
In order to do that, you have to let the socket.io module listen on the server. It works like middleware itself: everything goes through socket.io first before it is routed to the server. So, when the client requests the client-side lib, socket.io serves it to the client.
var express = require('express')
, routes = require('./routes')
, http = require('http');
var app = express();
var server = app.listen(3000);
var io = require('socket.io').listen(server)
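From there, one way to connect this to the deplacement route from the question (a sketch reusing its exports.deplacement = function(io) signature; the /moteur path and the space placeholder are made up for illustration) is to inject io when registering the route and emit directly from the request handler, instead of adding a new connection listener on every request:
// continuing the snippet above
app.get('/moteur', routes.deplacement(io));   // io is passed into the route module
// routes/index.js
exports.deplacement = function (io) {
return function (req, res) {
var space = 0;                              // placeholder for the real position calculation
io.sockets.emit('position', space);         // notify all connected clients
res.render('moteur', { space: space });
};
};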