I can use koajs middleware with app.use(function *() { ... }), but how can I make koajs run something when the app is launched?
It is trivial to simply run some code before everything else, but what if I want it to perform some async work before, and outside of, the middlewares? For example, I might want to obtain a certain key with an API call to an external server, store it in a variable, and return it whenever I get a request.
As you said, you can simply put it outside any middleware and call app.listen() only when your task is done:
var koa = require('koa');
var app = koa();
// add all your middlewares
loadKeyOrSomethingAsync().then(function() {
  app.listen(3000);
});
This way your server will wait for your async task to complete before it starts listening for requests.
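For example, here is a minimal sketch of storing the fetched value in a variable and returning it on every request, as the question describes (loadKeyOrSomethingAsync and the response shape are placeholders, not a real API):

var koa = require('koa');
var app = koa();

var apiKey; // filled in before the server starts listening

app.use(function *() {
  // every request can use the key that was loaded at startup
  this.body = { key: apiKey };
});

loadKeyOrSomethingAsync().then(function(key) {
  apiKey = key;
  app.listen(3000);
});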
I am creating a MERN app that adds meta tags to React pages without SSR. So, I need to read the query inside the main file of the server and pass the appropriate metadata content to each page.
I am using this in the server.js file:
const indexPath = path.resolve(__dirname, 'build', 'index.html');

// static resources should just be served as they are
app.use(express.static(
  path.resolve(__dirname, 'build'),
  { maxAge: '30d' },
));

// here we serve the index.html page
app.get('/*', (req, res, next) => {
  fs.readFile(indexPath, 'utf8', (err, htmlData) => {
    if (err) {
      console.error('Error during file reading', err);
      return res.status(404).end();
    }
    // get post info
    const postId = req.query.id;
    const post = getPostById(postId);
    if (!post) return res.status(404).send("Post not found");
    // inject meta tags
    htmlData = htmlData.replace(
      "<title>React App</title>",
      `<title>${post.title}</title>`
    )
      .replace('__META_OG_TITLE__', post.title)
      .replace('__META_OG_DESCRIPTION__', post.description)
      .replace('__META_DESCRIPTION__', post.description)
      .replace('__META_OG_IMAGE__', post.thumbnail);
    return res.send(htmlData);
  });
});
Here getPostById is statically defined in a file, but I want to fetch the post from my db.
My file structure is:
server.js
controllers
- posts.js
routes
- posts.js
I've separated the logic from the routes, so my routes/posts.js file looks like:
import express from 'express';
import { getPost, createPost } from '../controllers/posts.js';

const router = express.Router();

router.get('/', getPost);
router.post('/', createPost);

export default router;
So, in order to dynamically pass the meta content, I need to read the API endpoint for each request and pass the appropriate data. For this, I need to call the endpoints directly inside my node project. How can I do that?
I'd appreciate any help. Thank you.
If you really want to call your own http endpoints, you would use http.get() or some higher level http library (that is a little easier to use) such as got(). And, then you can make an http request to your own server and get the results back.
But ... usually, you do not make http requests to your own server. Instead, you encapsulate the functionality that gets you the data you want in a function and you use that function both in the route and in your own code that wants the same data as the route. This is a ton more efficient than packaging up an http request, sending that request to the TCP stack, having that request come back to your server, parsing that request, getting the data, forming it as an http response, sending that response back to the requester, parsing that response, then using the data.
Instead, if you have a common, shared function, you just call the function, get the result from it (probably via a promise) and you're done. You don't need all that intermediate packaging into the http request/response, parsing, loopback network, etc...
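For example, here is a minimal sketch of that shared-function approach for the meta-tag question above. It assumes a Mongoose-style Post model and the same ES module syntax as your routes file; the model name and query are assumptions, not your actual code:

// controllers/posts.js
import Post from '../models/post.js'; // hypothetical Mongoose model

// Shared data-access function: used by the API route and by server.js
export async function getPostById(id) {
  if (!id) return null;
  return Post.findById(id).lean(); // resolves to null if not found
}

// Route handler built on the same function
export async function getPost(req, res) {
  const post = await getPostById(req.query.id);
  if (!post) return res.status(404).json({ message: 'Post not found' });
  res.json(post);
}

// server.js
import { getPostById } from './controllers/posts.js';

app.get('/*', (req, res) => {
  fs.readFile(indexPath, 'utf8', async (err, htmlData) => {
    if (err) return res.status(404).end();
    const post = await getPostById(req.query.id); // direct call, no HTTP round trip
    if (!post) return res.status(404).send('Post not found');
    // ...inject the meta tags exactly as before...
    return res.send(htmlData);
  });
});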
I am writing my first very simple express server for a data collection purpose. This seems like a beginner question, but I have failed to find an answer so far. The data is very small (less than 500 integers) and will never grow, but it should be possible to change it through POST requests.
I essentially (slightly simplified) want to:
Have the data in a .json file that is loaded when the server starts.
On a POST request, modify the data and update the .json file.
On a GET request, simply send the .json containing the data.
I don't want to use a database for this, as the data is just a single small array that will never grow in size. What is unclear to me is mainly how to safely handle modifying the global data and reading/writing the file, i.e. concurrency, and how exactly Node runs the code.
I have the following:
const express = require('express');
const fs = require('fs');

let data = JSON.parse(fs.readFileSync('./data.json'));

const app = express();
app.listen(3000);
app.use(express.json());

app.get("/", (req, res) => {
  res.sendFile('./data.json', { root: __dirname });
});

app.post("/", (req, res) => {
  const client_data = req.body;
  // modify global data
  fs.writeFileSync("./data.json", JSON.stringify(data), "utf8");
});
Now I have no idea if, or why, this is safe to do, for example modifying the global data variable and writing to the file. I first assumed that requests cannot run concurrently without explicitly using async functions, but that seems not to be the case. I inserted this:
const t = new Date(new Date().getTime() + 5000);
while(t > new Date()){}
into the app.post(...) callback to try to understand how this works. I then made simultaneous POST requests and they finished at the same time, which I did not expect.
Clearly, the callback I pass to app.post(...) is not executed all at once before other POST requests are handled. But then I have a callback running concurrently for all POST requests, and modifying the global data and writing to the file is unsafe / a race condition. Yet all the code I could find online does it in this manner.
Am I correct here? If so, how do I safely modify the data and write it to file? If not, I don't understand how this code is safe at all?
Code like that actually opens up your system to race conditions. Node runs your JavaScript in a single-threaded way, but when you start opening files and doing other I/O, that work is handled by multiple threads (file operations are not performed by the JavaScript thread; they are delegated to the OS).
If you really, really want to use files as your global data, then I guess you can use an operating system concept called mutual exclusion. Basically, it's a 'lock' used to prevent race conditions by forcing processes to wait while something is currently accessing the shared resource (or while the shared resource is busy). In Node, this can be implemented in many ways, but one recommendation is to use the async-mutex library to handle concurrent connections and concurrent data modifications. You can do something like:
const express = require('express');
const fs = require('fs');
const Mutex = require('async-mutex').Mutex;

// Initializes the shared mutual exclusion instance.
const mutex = new Mutex();

let data = JSON.parse(fs.readFileSync('./data.json'));

const app = express();
app.listen(3000);
app.use(express.json());

app.get("/", (req, res) => {
  res.sendFile('./data.json', { root: __dirname });
});

// Turn this into an asynchronous function.
app.post("/", async (req, res) => {
  const client_data = req.body;
  const release = await mutex.acquire();
  try {
    // modify the global data here using client_data, then persist it
    fs.writeFileSync('./data.json', JSON.stringify(data), 'utf8');
    res.status(200).json({ status: 'success' });
  } catch (err) {
    res.status(500).json({ err });
  } finally {
    release();
  }
});
You can also achieve similar results with the promise-based helpers that the async-mutex library provides.
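For reference, a minimal sketch using runExclusive() from async-mutex, which acquires and releases the lock around a callback for you (same mutex, data, and file as above; the Object.assign merge is just an example of how the shared data might be modified):

app.post("/", async (req, res) => {
  try {
    await mutex.runExclusive(async () => {
      Object.assign(data, req.body); // example modification of the shared data
      fs.writeFileSync('./data.json', JSON.stringify(data), 'utf8');
    });
    res.status(200).json({ status: 'success' });
  } catch (err) {
    res.status(500).json({ err });
  }
});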
Note that I recommend you use a database instead, as it is much more robust and abstracts a lot of this away for you.
References:
Node.js Race Conditions
I am using Expressjs and the Auth0 API for authentication and ReactJs for client side.
Because of the limitations of the Auth0 API (spoke with their team) I am sending updated user details to my backend and then using app.set() to be able to use the req.body in another route.
I need to call the app.patch() route automatically after the app.post() route has been hit.
The end goal is that the users data will be updated and shown client side.
const express = require('express');
const cors = require('cors');
const path = require('path');
const app = express();
require('dotenv').config();
const { auth } = require("express-openid-connect");

app.use(express.json());
app.use(cors());
app.use(express.static(path.join(__dirname, 'build')));
app.use(
  auth({
    issuerBaseURL: process.env.AUTH0_ISSUER_BASE_URL,
    baseURL: process.env.BASE_URL,
    clientID: process.env.AUTH0_CLIENT_ID,
    secret: process.env.SESSION_SECRET,
    authRequired: false,
    auth0Logout: true,
  })
);

app.get('/', async (req, res) => {
  res.sendFile(path.join(__dirname, 'build', 'index.html'));
});

app.get('/api', async (req, res) => {
  const stripe = require('stripe')(`${process.env.REACT_APP_Stripe_Live}`);
  const invoice = await stripe.invoices.list({
    limit: 3,
  });
  res.json(invoice);
});

app.post('/updateuser', (req, res) => {
  app.set('data', req.body);
});

app.patch(`https://${process.env.AUTH0_ISSUER_BASE_URL}/api/v2/users/:id`, (req, res) => {
  let val = app.get('data');
  req.params = { id: val.id };
  console.log(req.params);
});

app.listen(process.env.PORT || 8080, () => {
  console.log(`Server listening on 8080`);
});
I'd suggest you just take the code from inside app.patch() and turn it into a reusable function. Then it can be called either from the app.patch() route directly or from your other route that wants the same functionality. Just decide what interface for that function will work for both, make it a separate function, and then call it from both places.
For some reason (which I don't really understand, but it seems to happen to lots of people), people forget that the code inside routes can also be put into functions and shared just like any other Javascript code. I guess people seem to think of a route as a fixed unit by itself and forget that it can still be broken down into components and those components shared with other code.
Warning: on another point, this comment of yours sounds very wrong:
and then using app.set() to be able to use the req.body in another route
req.body belongs to one particular user. app.set() is global to your server (all users' requests access it). So you're trying to store temporary state for one single user in what is essentially a global. That means that multiple users' requests that happen to be in the middle of doing something similar will trounce/overwrite each other's data. Or worse, one user's data will accidentally become another user's data. You cannot program a multi-user server this way at all.
The usual way around this is to either 1) redesign the process so you don't have to save state on the server (stateless operations are generally better, if possible) or 2) use a user-specific session (like with express-session) and save the temporary state in the user's session. Then it is saved separately for each user, and one user's state won't overwrite another's.
If this usage of app.set() was to solve the original problem of executing a .patch() route, then the problem is solved by just calling a shared function and passing the req.body data directly to that shared function. Then, you don't have to stuff it away somewhere so a later route can use it. You just execute the functionality you want and pass it the desired data.
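As a rough sketch of that refactor (updateUser, the Management API call, and the /api/users/:id path are placeholders for whatever your patch logic actually does, not your real code):

// Reusable function containing what used to live inside app.patch()
async function updateUser(userData) {
  // ...call the Auth0 Management API (or whatever the patch route did) with userData...
  // and return the updated user so callers can send it to the client
}

app.post('/updateuser', async (req, res) => {
  try {
    const updated = await updateUser(req.body); // no app.set()/app.get() needed
    res.json(updated);
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});

app.patch('/api/users/:id', async (req, res) => {
  try {
    const updated = await updateUser({ id: req.params.id, ...req.body });
    res.json(updated);
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});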
Inspired by How to share sessions with Socket.IO 1.x and Express 4.x?, I implemented socket authentication in a somewhat "clean" way, where there is no need to use cookie-parser and to read cookies from headers, but a few things remain unclear to me. The example uses the last stable socket.io version, 1.3.6.
var express = require('express'),
    session = require('express-session'),
    RedisStore = require('connect-redis')(session),
    sessionStore = new RedisStore(),
    app = express(),
    server = require('http').createServer(app),
    io = require('socket.io').listen(server);

var sessionMiddleware = session({
  store  : sessionStore,
  secret : "blabla",
  cookie : { ... }
});

function socketAuthentication(socket, next) {
  var sessionID = socket.request.sessionID;
  sessionStore.get(sessionID, function(err, session) {
    if (err) { return next(err); }
    if (typeof session === "undefined") {
      return next(new Error('Session cannot be found'));
    }
    console.log('Socket authenticated successfully');
    next();
  });
}

io.of('/abc').use(socketAuthentication).on('connection', function(socket) {
  // setup events and stuff
});

io.use(function(socket, next) {
  sessionMiddleware(socket.request, socket.request.res, next);
});

app.use(sessionMiddleware);

app.get('/', function(req, res) { res.render('index'); });

server.listen(8080);
index.html
<body>
  ...
  <script src="socket.io/socket.io.js"></script>
  <script>
    var socket = io('http://localhost:8080/abc');
  </script>
</body>
So io('http://localhost:8080/abc'); on the client side sends the initial HTTP handshake request to the server, from which the server can gather cookies and a lot of other request information. So the server has access to that initial request via socket.request.
My first question is: why is the handshake request not in scope of the express-session middleware? (More generally, in scope of the app.use middlewares?) I somehow expected app.use(sessionMiddleware); to fire before that initial request, and then to be able to access socket.request.session easily.
Second, in what scenarios will middlewares defined with io.use() fire? Only for the initial HTTP handshake request? It seems like io.use() is used for socket-related stuff (the question is: what stuff), while app.use is for standard requests.
I'm also not quite sure why, in the above example, io.use() fires before io.of('/abc').use(). I intentionally wrote it in that order, putting io.of('/abc').use() first, to see whether it would work, and it did. I would have expected it to be the other way around.
Lastly, socket.request.res, as some people also pointed out in the linked question, is sometimes undefined, causing the app to break. The problem can be solved by passing an empty object instead of socket.request.res, like sessionMiddleware(socket.request, {}, next);, which seems to me like a dirty hack. For what reason does socket.request.res end up undefined?
Although @oLeduc is mostly correct, there are a few more things to explain.
Why the handshake's request is not in scope of express-session middleware?
The biggest reason here is that middleware in express is designed to handle request-specific tasks. Not all, but most of the handlers use the standard req, res, next syntax. And sockets are "request-less", so to speak. The fact that you have socket.request at all is due to the way the handshake is made, and that it uses HTTP for that. So the guys at socket.io hacked that first request into your socket class so that you can use it. Express middleware was never designed by the express team to work with sockets and TCP.
What are the scenarios in which middlewares defined with io.use() will fire?
io.use is a close analogue of express's use middleware. In express, middleware is executed on each request, right? But sockets do not have requests, and it would be awkward to run middleware on each socket emit, so it is executed on each connection instead. And just as express middleware is stacked and runs before the actual request is handled (and responded to), Socket.IO runs its middleware on connection, even before the actual handshake! You can intercept the handshake if you want to, using that kind of middleware, which is very handy (for example, to protect your server from spamming). More on this can be found in the code of passport-socketio.
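For instance, a minimal sketch of such a handshake-level guard with io.use() (the token query parameter is just a placeholder for whatever credential you actually check):

io.use(function(socket, next) {
  // Runs once per connection attempt, before the 'connection' event fires
  var token = socket.handshake.query.token;
  if (!token) {
    return next(new Error('Authentication error')); // the connection is rejected
  }
  next(); // allow the connection
});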
Why io.use() fires before io.of('/abc').use()?
The real explanation for this can be found here, in this code:
Server.prototype.of = function(name, fn){
  if (String(name)[0] !== '/') name = '/' + name;
  if (!this.nsps[name]) {
    debug('initializing namespace %s', name);
    var nsp = new Namespace(this, name);
    this.nsps[name] = nsp;
  }
  if (fn) this.nsps[name].on('connect', fn);
  return this.nsps[name];
};
And in the beginning of the code, there is this line of code:
this.sockets = this.of('/');
So, there is a default namespace created at the very beginning, and right there you can see that it immediately gets a connect listener attached to it. Later on, each namespace gets the very same connect listener, but because Namespace is an EventEmitter, the listeners are added one after another, so they fire one after another. In other words, the default namespace has its listener in first place, so it fires first.
I don't think this was designed on purpose; it just happened to be this way :)
Why is socket.request.res undefined?
To be honest, I'm not quite sure about that. It's because of how engine.io is implemented - you can read a bit more here. It attaches to the regular server and sends requests in order to make a handshake. I can only imagine that sometimes, on errors, the headers get separated from the response, and that's why you won't get any. Anyway, that's still just a guess.
Hope this information helps.
Why the handshake's request is not in scope of express-session middleware?
Because socket.io attaches to an http.Server, which is the layer under express. This is mentioned in a comment in the source of socket.io.
The reason for this is that the first request is a regular http request used to upgrade the regular, stateless http connection into a stateful websocket connection, so it wouldn't make much sense for it to have to go through all the logic that applies to regular http requests.
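To make that layering concrete, a typical setup (a sketch, not code from the question) looks like this; express handles the regular http requests on app, while socket.io hooks into the underlying http.Server:

var express = require('express');
var app = express();
// regular express middleware and routes (including express-session) go on `app`...

var server = require('http').createServer(app); // the plain http.Server under express
var io = require('socket.io')(server);          // socket.io attaches here, below express

server.listen(8080);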
What are the scenarios in which middlewares defined with io.use() will fire?
Whenever a new socket connection is created.
So every time a client connects, the middlewares registered using io.use() will be called. Once the client is connected, however, they are not called when a packet is received from the client. It doesn't matter whether the connection is initiated on a custom namespace or on the main namespace; they will always be called.
Why io.use() fires before io.of('/abc').use()?
Namespaces are a detail of socket.io's implementation; in reality, websockets will always hit the main namespace first.
To illustrate the situation, look at this snippet and the output it produces:
var customNamespace = io.of('/abc');

customNamespace.use(function(socket, next){
  console.log('Use -> /abc');
  return next();
});

io.of('/abc').on('connection', function (socket) {
  console.log('Connected to namespace!');
});

io.use(function(socket, next){
  console.log('Use -> /');
  return next();
});

io.on('connection', function (socket) {
  console.log('Main namespace');
});
Output:
Use -> /
Main namespace
Use -> /abc
Connected to namespace!
See the warning that the socket.io team added to their documentation:
Important note: The namespace is an implementation detail of the Socket.IO protocol, and is not related to the actual URL of the underlying transport, which defaults to /socket.io/….
Why is socket.request.res undefined?
As far as I know it should never be undefined. It might be related to your specific implementation.
I'm creating a simple testing platform for an app and have the following code set up as my server.js file in the root of my app:
var restify = require('restify'),
    nstatic = require('node-static'),
    fs = require('fs'),
    data = __dirname + '/data.json',
    server = restify.createServer();

// Serve static files
var file = new nstatic.Server('');
server.get(/^\/.*/, function(req, res, next) {
  file.serve(req, res, next);
});

// Process GET
server.get('/api/:id', function(req, res) {
  // NEVER FIRES
});
It serves static files perfectly; however, when I try to make a call to /api, it just hangs and times out. I imagine I'm missing something stupid here; any help would be greatly appreciated.
node-static is calling next with an error, which means it's never yielding to other handlers.
You can move your other handlers above node-static, or ignore its errors by intercepting its callback.
I made a working version here: http://runnable.com/UWXHRONG7r1zAADe
You can make sure the api get call is caught by moving the second get before the first. The reason is that your api routes are already being matched by the first, catch-all pattern.
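For example, a sketch of that reordering using the handlers from the question (the placeholder response body is an assumption):

// Register the API route first so it wins the match...
server.get('/api/:id', function(req, res) {
  res.send({ id: req.params.id }); // placeholder response
});

// ...and the static catch-all last, so it only serves what is left over.
var file = new nstatic.Server('');
server.get(/^\/.*/, function(req, res, next) {
  file.serve(req, res, next);
});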