Server / client app with webpack + socket.io - javascript

I'm working on an app whose frontend communicates with the backend via socket.io.
Right now I have them in two separate apps like this:
.
├── client
│   ├── src
│   │   └── index.js
│   ├── public
│   │   └── index.html
│   ├── webpack.config.js
│   └── package.json
└── server
    ├── app.js
    └── package.json
./server/app.js goes like this
const Express = require('express')();
const server = require('http').Server(Express);
const options = {
  cors: {
    origin: 'http://localhost:8080',
    methods: ['GET', 'POST'],
    credentials: true,
  },
};
const Socketio = require('socket.io')(server, options);
Socketio.on('connection', socket => {
  socket.on('do-something', () => {
    console.log('doing something...');
  });
});
server.listen(3000, () => {
  console.log('listening on port 3000');
});
./client/src/index.js has these lines
const io = require('socket.io-client');
const socket = io('http://localhost:3000');
const someFunction = () => {
  socket.emit('do-something');
};
Pretty standard stuff.
Right now I run them separately from two terminal windows. So basically I have two servers serving a single app. That doesn't seem like the right way to go, so I want to unite them into a single webpack app with deployment in mind. How do I do that? How do you set up this kind of thing in the real world?
I tried googling the subject but couldn't find anything. If you know of a post that clearly explains this, please share the link. Many thanks & have a nice day :)

I recommend:
Placing webpack.config.js and package.json in the root of your project.
In ./server, separating app.js (which exports the app) from server.js (which listens on a port).
Using npm-run-all to simplify your package.json scripts (it provides the run-p and run-s commands).
Adding a production ./server.js in the root of your project.
./package.json
"scripts": {
  "dev": "run-p dev:*",
  "dev:server": "nodemon ./server/server.js",
  "dev:webpack": "webpack serve",
  "build:webpack": "webpack",
  "serve": "node ./server.js"
}
./server/app.js
const express = require('express');
const app = express();
const server = require('http').Server(app);
// Your app definition (socket.io, etc) ...
// Export both: the express app for middleware, the http server for listening
module.exports = { app, server };
./server/server.js
// Your api dev server
const { server } = require('./app');
server.listen(3000, () => {
  console.log('Dev server listening on port 3000');
});
./server.js
// Your prod server
const express = require('express');
const { app, server } = require('./server/app');
// Same as your api dev server, but you also serve your frontend build
// ./dist depends on the webpack "output.path" setting
app.use(express.static('./dist'));
server.listen(3000, () => {
  console.log('Prod server listening on port 3000');
});
Now, you can:
Run frontend and backend in parallel
npm run dev
Run only frontend
npm run dev:webpack
Run only backend
npm run dev:server
Build frontend
npm run build:webpack
Serve backend and frontend in production (I suggest using pm2)
npm run serve
This architecture works fine in my projects 😊.
You can easily extend it to use TypeScript, for example...
Sorry for my approximate English.
I did not test the code presented, sorry if there are any errors.
Regards

Ok, I give up. I'm answering.
Don't do this. Right now, merging the two repos looks tempting because you're using the same language and maybe some of the same dependencies for the client and the server.
But what happens when
Your boss says you have to port the server part to Python/Java/Go/whatever because reasons?
You want to hand off the client or the server to another team?
You want to open source the client but now the git history has all of your (still proprietary) server code tangled up in it?
You add a CI/CD workflow and now a commit to the client kicks off your pipeline for the server and vice-versa?
N.B. this isn't hypothetical, I've actually seen most of those happen. Now, you may have good answers to all those questions and decide to do it anyway, but if you don't then leave it be: bundling a client and server in the same repo loses you lots of flexibility for very minimal gains.

Related

Autodesk Forge web application - from Visual Studio Code to a closed .exe file

I have a working Forge application (BIM 360 hub sidebar with Forge viewer and some charts).
It is currently run from the Visual Studio Code IDE only. I want to build the app into an .exe file in order to be able to send it to a user, upload it to a server with IIS, etc.
General details:
I used Petr Broz tutorial to set up the backend of the viewer and hub
(Forge online training - view your models https://www.youtube.com/watch?v=-O1e3gXCOEQ&t=8986s )
The app is running on Node.js
I tried to use 'nexe' module and build executable file. With this method, I need to specify index.js file ("an entry point") and define a 'nexe.config.js' file. I used the entry point start.js.
Eventually, I managed to create an exe file - and when I run it from the command line, I get an error
Missing FORGE_CLIENT_ID or FORGE_CLIENT_SECRET env. variables.
although I have them in the config.js
Main questions:
Is there another way to build a closed .exe file from Visual Studio Code - for a Forge web application?
Am I doing something wrong in the processes I mentioned above?
Is it even possible to deploy a web application to IIS using an .exe file? All of the documentation points toward Azure, AWS and Heroku..
Relevant files:
1) start.js:
const path = require('path'); // built-in node.js module (to resolve file system paths)
const express = require('express'); // module to create the express server
const cookieSession = require('cookie-session');
// any piece of code would have an opportunity to handle the request
const PORT = process.env.PORT || 3000;
const config = require('./config.js');
if (config.credentials.client_id == null || config.credentials.client_secret == null) {
  console.error('Missing FORGE_CLIENT_ID or FORGE_CLIENT_SECRET env. variables.');
  return;
}
let app = express();
// static middleware to check for the front end files (html, js, css)
app.use(express.static(path.join(__dirname, 'public'))); // express middleware for serving static files: this line checks whether the requested file is in the 'public' folder.
// If so, it serves it and ignores the rest of the stack (the rest of the code).
app.use(cookieSession({
  // create 2 cookies that store the name and encrypted key
  name: 'forge_session',
  keys: ['forge_secure_key'], // takes care of deciphering the encryption of the forge key for us
  maxAge: 14 * 24 * 60 * 60 * 1000 // 14 days, same as refresh token
}));
app.use(express.json({ limit: '50mb' })); // middleware that looks at the headers of the request - if it is .json, it parses the body into a javascript object
app.use('/api/forge', require('./routes/oauth.js')); // adding our custom express routers that handle the different endpoints.
app.use('/api/forge', require('./routes/datamanagement.js'));
app.use('/api/forge', require('./routes/user.js'));
app.use((err, req, res, next) => {
  console.error(err);
  res.status(err.statusCode).json(err);
});
app.listen(PORT, () => { console.log(`Server listening on port ${PORT}`); });
2) config.js:
// Autodesk Forge configuration
module.exports = {
  // Set environment variables or hard-code here
  credentials: {
    client_id: process.env.FORGE_CLIENT_ID,
    client_secret: process.env.FORGE_CLIENT_SECRET,
    callback_url: process.env.FORGE_CALLBACK_URL
  },
  scopes: {
    // Required scopes for the server-side application --> privileges for our internal operations on the server side ("back end")
    internal: ['bucket:create', 'bucket:read', 'data:read', 'data:create', 'data:write'],
    // Required scope for the client-side viewer --> privileges for the client ("front end")
    public: ['viewables:read']
  }
};
Author of the tutorial here :)
I'm not sure how nexe works exactly but please note that the sample app expects input parameters such as FORGE_CLIENT_ID or FORGE_CLIENT_SECRET to be provided as environment variables.
As a first step, try running your *.exe file after setting the env. variables in your command prompt.
If that doesn't work, try hard-coding the input parameters directly into the config.js file (replacing any of the process.env.* references), and then bundle everything into an *.exe file. This is just for debugging purposes, though! You shouldn't share your credentials with anyone, not even inside an *.exe file. So as an alternative I'd suggest that you update the sample app to read the input parameters from somewhere else, perhaps from a local file.
After trying a lot of solutions, I came to the conclusion that the reason nothing happened was that the authentication files (with the client_id and client_secret) were not embedded in the .exe file.
The way to include those files with the nexe module is to use the flag -r "Foldername/subfoldername/filename.js".
First, create a nexe.config.js file containing the name of the app's entry point file (in my case, "start.js").
Second, run the following commands in the command line:
cd C:\Projects\MyAppFolder
npm install -g nexe
// specify all the files you want to include inside the exe file
nexe start.js -r "config.js" -r "nexe.config.js" -r "routes/common/oauth.js" -r "routes/*.js" -r "public/**/*" -r ".vscode/**/*" -r "package-lock.json" -r "package.json" --build --output "AppName.exe"

How to deploy a Vue.js application on Node.js server

I have a dist folder containing CSS, fonts, JS folder and an index.html file minimized for Vue.js, ready to deploy and use. I want to use Node.js to run this application. How can I set this up to just run npm run server and have it deployed on a specific port requested? Not sure how to structure this or if I need to build it in a specific way to run this Vue app. Any help would be greatly appreciated.
Since Vue is only a frontend library, the easiest way to host it and do things like serve up assets is to create a simple Express friendly script that you can use to start a mini-web server. Read up quickly on Express if you haven’t already. After that, add express:
npm install express --save
Now add a server.js file to your project's root directory:
// server.js
var express = require('express');
var path = require('path');
var serveStatic = require('serve-static');
var app = express();
app.use(serveStatic(path.join(__dirname, 'dist')));
var port = process.env.PORT || 5000;
var hostname = '127.0.0.1';
app.listen(port, hostname, () => {
  console.log(`Server running at http://${hostname}:${port}/`);
});
After that you can run:
node server
and your project will be served at the given host and port.
This assumes you already have the dist directory; if you don't, run:
npm run build
in order to generate it.

How to organise a webpacked vue.js application served by an express/koa backend?

I am having some difficulty with setting up an easy to debug vue.js project in combination with a koa server. The command cross-env NODE_ENV=development webpack-dev-server --open --hot from the webpack-simple generated configuration seems to be a nice thing, but what is the convention for using this with koa/express?
One solution that I've found is using pm2 to launch the webpack-dev-server for Vue and the backend at the same time, but then I think this means I need a copied version of the initial landing vue.js app page, which I'm currently serving from a koa route on /. Also it's kind of confusing for me to think about these two servers, and it feels strange.
So I think I must be doing it wrong! Could someone please explain a nice conventional way of doing this.
If you're trying to use webpack-dev-server with your own Node backend, you'll want to look into using proxy.
Basically, the idea is that you run your webpack-dev-server on one port (for example, port 3000) and you run your Node server on another port (for example, port 3001). You just need to tell webpack to forward your server-side requests to your Node backend.
You can do this by adding a proxy property to the devServer setting in your webpack.config.js file. For example:
devServer: {
  ...
  proxy: {
    '/api/**': {
      target: 'http://localhost:3001',
      secure: false
    }
  }
}
Now, any requests that start with /api/ will go to your Node backend. So if you do something like:
fetch('/api/users').then(...)
This request will be proxied to your Node server. You will just want to make sure you prefix all your server-side routes with /api/.
To automatically prefix all your routes in Koa, you can just do the following:
const Koa = require('koa')
const Router = require('koa-router')
const app = new Koa()
const router = new Router({ prefix: '/api' })
// GET /api/users
router.get('/users', async ctx => {
  ...
})
app.use(router.routes())
app.use(router.allowedMethods())
If you're using Express, you can prefix by doing the following:
const express = require('express')
const app = express()
const router = express.Router()
// GET /api/users
router.get('/users', (req, res) => {
  ...
})
app.use('/api', router)

React + Express - What to serve instead of /build when in development

I used create-react-app to initialize my React app, and I am now serving the React client app from an Express server.
My app structure is
project/
├── build/
├── server/
└── src/
where my Express server is in server/, my React app is in src/, and the React app gets built to build/ with npm run build.
Because my Express app serves the "built" app (as shown below, serving files from the build/ directory), I need to npm run build every time I change any client code, in order for my browser to reflect the changes.
// server/app.js
const express = require('express');
const path = require('path');
const app = express();
var server = require('http').Server(app);
var io = require('socket.io')(server);
// Serve static assets
app.use(express.static(path.resolve(__dirname, '..', 'build')));
// sockets
require('./sockets')(io);
// serve main file
app.get('*', (req, res) => {
  res.sendFile(path.resolve(__dirname, '..', 'build', 'index.html'));
});
module.exports = server;
Since the build step takes many seconds, this is obviously a big step down from when just serving the React app with react-scripts start and having it watch for code changes and instantly reflect them in the browser.
I know I can use NODE_ENV === 'production' to check if I'm on production or development, but given I'm on development, where are the files I should serve instead of the ones in build/?
I.e. perhaps a relevant question is "from where are they being served when I run the React server with react-scripts start"? EDIT: and how are they being watched such that building the source files to that spot is extremely quick?
Most of the React boilerplates use [webpack dev server / browserify] + hot reload in dev mode, so your changes (and only your changes) are compiled on the fly and your browser is refreshed by a watcher.
It's basically a middleware you plug into express like this:
var compiler = webpack(webpackConfig)
var devMiddleware = require('webpack-dev-middleware')(compiler, {
  publicPath: webpackConfig.output.publicPath,
  quiet: true
})
app.use(devMiddleware)
In your case this is done under the hood; the files are written to memory rather than to disk.

Set up proxy server for create react app

I have started a React application using create-react-app and ran the npm run eject script to gain access to all files. I afterwards installed express and created a server.js file that sits on the same level as the package.json file.
These are the server.js file contents:
const express = require('express');
const app = express();
app.set('port', 3031);
if (process.env.NODE_ENV === 'production') {
  app.use(express.static('build'));
}
app.listen(app.get('port'), () => {
  console.log(`Server started at: http://localhost:${app.get('port')}/`);
});
Nothing crazy here, just setting up for future API proxies where I need to use secrets, as I don't want to expose my API.
After this I added "proxy": "http://localhost:3001/" to my package.json file. I am now stuck, as I need to figure out how to start my server correctly and use this server.js file in development mode and afterwards in production.
Ideally it would also be good if we could use more than one proxy, i.e. /api and /api2.
You didn't have to eject to run your server.js. You can just run it with node server.js together with create-react-app.
You can still do npm start even after ejecting to start your dev server.
To run /api1 and /api2, you just have to handle it in your server.js file and it should work just fine. You need to match the port in your server.js and the one in proxy settings inside package.json - in this case, it should be "proxy": "http://localhost:3031"
