Run each express request in a new fork

So I'm trying to fork each express request so that I can set a different uid per fork.
In my current approach I just set the euid and restore it after the request, like this:
const mainUID = process.geteuid();

app.get('/', (req, res) => {
  process.seteuid(500);
  // some action1 that requires the privileges of user id 500
  // some action2 that requires the privileges of user id 500
  process.seteuid(mainUID);
});
But with concurrent requests this fails, because other code can run between action1 and action2 (or before the euid is restored) while the process-wide euid is still changed.
I read about the cluster module (https://nodejs.org/api/cluster.html), but I have no idea how to use it in my case.
My preferred approach would be to create a fork for each express request without actually splitting the JavaScript code into separate services... So is that possible?
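One possible direction, sketched below under assumptions rather than as a definitive answer: child_process.fork accepts uid and gid options, so the privileged work can run in a separate worker process started as uid 500 while the parent's euid stays untouched. The worker file name, the message shape, and the port are illustrative assumptions, and the parent must itself run with enough privilege (e.g. root) to set the uid.

// privileged-task.js (hypothetical worker module)
process.on('message', (payload) => {
  // this process was started as uid 500, so no seteuid juggling is needed;
  // do the privileged action1/action2 here, then report back over IPC
  process.send({ euid: process.geteuid(), payload });
  process.exit(0);
});

// server.js
const { fork } = require('child_process');
const express = require('express');
const app = express();

app.get('/', (req, res) => {
  // fork a fresh worker per request, running as uid/gid 500
  const child = fork('./privileged-task.js', [], { uid: 500, gid: 500 });
  child.once('message', (result) => res.send(result));
  child.once('error', (err) => res.status(500).send(String(err)));
  child.send({ query: req.query });
});

app.listen(3000);

Forking a new process per request is relatively expensive, so a pool of pre-forked workers (which is closer to what the cluster module gives you) may be the more practical variant of the same idea.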

Related

using stream.pipe makes express middleware get skipped

I'm using express and Node.js for my API server,
and now I'm implementing PDF download (using pdf-creator-node) to convert HTML to PDF.
pdf-creator-node creates a stream when converting the HTML, and when I pipe it, all my middleware (notFoundHandler, errorHandler, responseLogger) gets skipped:
router.get(
  '/generatePDF', authenticate, schemaValidation({ params: generatePDFschema }),
  (req, res, next) => {
    function generatePDF(details) {
      const template = readFile('./src/templates/da-template.html');
      const receipt = {
        html: template,
        data: payload,
        type: 'stream'
      };
      const generatedPdf = pdf.create(receipt, advisOptions);
      return generatedPdf;
    }
    const pdfStream = generatePDF(details);
    res.setHeader('Content-type', 'application/pdf');
    pdfStream.pipe(res);
  },
  notFoundHandler,
  errorHandler,
  responseLogger
);
Is there any express API I can use to pipe a stream?
The middleware you show is passed to router.get() AFTER your request handler, so it will not get a chance to run unless you call next() in your request handler. When you pass multiple request handlers to router.get(), they run sequentially, in order, and the 2nd one only gets a chance to run if the first one calls next(). Same for the 3rd one, and so on.
Furthermore, pdfStream.pipe(res); does not call next().
I don't know exactly what those middleware functions are supposed to do, but making a guess based on their names, perhaps they are supposed to run BEFORE your request handler, not after, so they can put things in place for various error conditions.
If you want further help, then please show the code for those three middleware functions so we can see more specifically what they are trying to do.
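As a side note, one way to at least keep errorHandler in play is to forward stream errors to next() instead of letting them vanish. This is a sketch of the handler inside the same router.get(...) call, assuming errorHandler is a standard four-argument express error middleware; the surrounding names are the asker's.

  (req, res, next) => {
    const pdfStream = generatePDF(details);
    res.setHeader('Content-Type', 'application/pdf');
    // if the PDF stream fails, hand the error to express so errorHandler runs
    pdfStream.on('error', next);
    pdfStream.pipe(res);
  },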

specific response to specific clients while making requests at the same time to the server

Is there any way to send the correct response to a specific client when another client makes a different request to the same server at the same time?
This code snippet is for an exchange server. The function exchange.fetchMarkets() comes from the "ccxt" library; it calls the API of a third-party exchange server such as 'bitfinex', 'crex24', 'binance', etc. The issue I am facing: when one client requests an exchange such as 'crex24' at the same time another client requests a different exchange such as 'binance', they both get the same response, as if the function had been called only for the most recent exchange.
I want the responses to match each client's request, independent of each other.
This is the controller function:
const ccxt = require("ccxt");

exports.fetchMarkets = async function(req, res){
  let API = req.params.exchangeId;
  let exchange = new ccxt[API]();
  if (exchange.has["fetchMarkets"]) {
    try {
      var markets = await exchange.fetchMarkets();
      res.send(markets);
    } catch (err) {
      let error = String(err);
      res.send({ failed: error });
    }
  } else {
    res.send({ loadMarkets: "not available" });
  }
}
This is the endpoint for the server request:
app.route('/markets/:exchangeId')
  .get(exchange.fetchMarkets)
Here you can find the ccxt library: https://github.com/ccxt/ccxt/wiki/Manual. It can be added to the project with "npm install ccxt".
I don't see why the code you mentioned wouldn't work the way you are expecting it to. I created a small app and it is working as expected. You can check it here:
https://repl.it/repls/IllfatedStrangeRepo
I am hitting four different requests with different ids and I am getting different responses.
Hope that clears up the doubts.
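For completeness, here is a sketch of the kind of check described above: fire several requests with different exchange ids in parallel and confirm each response differs. The port and the use of the global fetch (Node 18+) are assumptions, not part of the answer.

const ids = ['binance', 'crex24', 'bitfinex'];

Promise.all(ids.map((id) =>
  fetch(`http://localhost:3000/markets/${id}`).then((r) => r.json())
)).then((results) => {
  // each entry should be that exchange's own market list, not a shared one
  results.forEach((markets, i) => {
    console.log(ids[i], Array.isArray(markets) ? markets.length : markets);
  });
});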

How to use long polling in native JavaScript and node.js?

I need to implement long polling for a chat application. I've searched around, but I only find examples that use jQuery. How can I implement it using only native JavaScript and Node.js? Can you point me to some relevant articles or materials?
Q: How to do long polling in native JavaScript in Node.js?
A: I guess first of all you need to understand how the long polling model works. If you don't have a clue yet, the RFC 6202 specification is a good starting point.
It is about the client sending a request to the server and waiting until a response is returned.
From the specification we know that the client first issues an HTTP request with an infinite, or at least very high, timeout value. The server, which is your Node.js application, is then expected to stash all incoming requests in a data structure, basically a holding area. Your application essentially holds on to all the response objects until an event is triggered, and then replies to them appropriately.
Consider this pseudo code:
const express = require('express');
const bodyParser = require('body-parser');
const app = express();

// register the body parsers before the routes that need them
app.use(bodyParser.text({ type: 'text/*' }));
app.use(bodyParser.json());

var requestCounter = 0;
var responses = {
  /* keyed by room id */
  "room_abc": [ /* array of pending response objects */ ]
};

app.get('/', function (req, res) {
  requestCounter += 1;
  var room = /* assuming the request is for room_abc */ "room_abc";
  // Stash the response and reply later when an event comes through
  responses[room].push(res);
  // Every 3rd request, assume there is an event for the chat room, room_abc.
  // Reply to every pending response for room_abc, then clear the list.
  if (requestCounter % 3 === 0) {
    responses[room].forEach((pendingRes) => {
      pendingRes.send("room member 123 says: hi there!");
    });
    responses[room] = [];
  }
});

app.listen(9999, function () {
  console.log('Example app listening on port 9999!');
});
It is relatively time consuming to write a fully working example here, but the code above is a good illustration of how you can implement long polling in Node.js.
If you have Postman installed, or curl, you can make GET requests to http://localhost:9999/. You should notice that the first two calls get no response; only when you fire the 3rd one do you receive a response for all previous and current calls.
The idea is that you stash the request's response object first, and when an event comes through (here assumed to be every 3rd HTTP call) you loop through the pending responses and reply to them. For your chat application, the event that triggers a response would probably be someone sending a message to a chat room.
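On the browser side, a plain-JavaScript polling loop could look like the sketch below. It is an illustration under assumptions (the endpoint path and plain-text messages), not part of the answer above: request, wait for the reply, handle it, then immediately poll again.

async function longPoll() {
  while (true) {
    try {
      const response = await fetch('/'); // the server above holds this request open
      if (response.ok) {
        const message = await response.text();
        console.log('new chat message:', message);
      }
    } catch (err) {
      // network error or timeout: back off briefly, then reconnect
      await new Promise((resolve) => setTimeout(resolve, 2000));
    }
  }
}

longPoll();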

Express JS Integration testing with Supertest and mock database

Is it possible to test an Express JS REST API using supertest while replacing the actual database connection with a mock database object? I have unit tests covering the database models and other parts of the application, as well as functional tests of the API endpoints that make actual database connections, but I have a somewhat unusual requirement to create integration tests that are like the functional tests but use mock database connections. A sample endpoint controller is below:
var model = require('../../../lib/models/list');

module.exports = {
  index: function(req, res) {
    var data = { key: 'domains', table: 'demo.events' };
    var dataModel = new model(data);
    dataModel.query().then(function(results) {
      res.respond({data: results}, 200);
    }).fail(function(err) {
      console.log(err);
      res.respond({message: 'there was an error retrieving data'}, 500);
    });
  }
};
And the index for the URI is:
var express = require('express'), app, exports;
app = exports = module.exports = express();
exports.callbacks = require('./controller');

app.get('/', exports.callbacks.index);
The list model used in the controller connects to the database and retrieves the data that is output. The challenge is mocking that actual database call while still using supertest to make the request and retrieve the data from the URI.
Any information would be helpful, including whether you think this is a bad or pointless idea.
I have had limited success with two approaches:
1) Use rewire to replace the database driver library (e.g. mongodb) with a mocked one, perhaps using the spy/stub/mock capabilities of sinon.
2) Set your db as an app setting via app.set('mongodb', connectedDb) for dev/prod, but in the test environment set a mock database instead. This requires your db-accessing code (typically the models) to get the DB from the app, or otherwise be mock-friendly or designed with a dependency injection pattern.
Neither of these makes everything clean and painless, but I have gotten some utility out of them.
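A rough sketch of approach 2 with supertest and a mocha-style runner follows. Every name here (the app path, the 'listModel' setting, FakeModel) is an assumption, and it presumes the controller is changed to look the model up via req.app.get('listModel') and that the fake's promise API matches what the controller expects.

const request = require('supertest');
const app = require('../../app'); // assumed path to the express app

describe('GET /', function () {
  it('responds with data from the mocked model', function (done) {
    // swap in a fake model whose query() resolves immediately, no DB involved
    app.set('listModel', function FakeModel() {
      return { query: () => Promise.resolve([{ domain: 'example.com' }]) };
    });

    request(app)
      .get('/')
      .expect(200)
      .expect((res) => {
        if (!res.body.data) throw new Error('expected mocked data in the response');
      })
      .end(done);
  });
});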

REST API: measuring server-side response times (performance)

I developed some Node.js-based REST APIs and I want to test their performance. Is there any tool that can easily measure the time of each API call?
Or how would I implement measuring the time a REST API needs to respond to requests?
Here is an example of how to do event injection with precise time measurement using express.js.
Add this before your routes:
app.all('*', function(req, res, next) {
  var start = process.hrtime();
  // the 'finish' event fires when express is done sending the response
  res.on('finish', function() {
    var hrtime = process.hrtime(start);
    // hrtime is [seconds, nanoseconds]; convert both parts to milliseconds
    var elapsed = hrtime[0] * 1000 + hrtime[1] / 1e6;
    console.log(elapsed.toFixed(3) + 'ms');
  });
  next();
});
It saves the start time of each request and triggers on 'finish' after the response has been sent to the client.
Thanks to user419127 for pointing out the 'finish' event.
What about a performance measuring tool like Apache JMeter? You can easily use it to simulate a (heavy) load on your server and then measure response times and other performance indicators. It provides multiple graphical representations of the results.
This blog post shows how you can set up an HTTP-based performance test for web APIs. I do it this way to test my RESTful web services.
Keep it simple: use the console.time('timerName'); console.timeEnd('timerName') features in Node. 'timerName' is obviously configurable.
Example:
console.time('getCustomers');
console.timeEnd('getCustomers')
Output:
getCustomers: 58ms
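For instance, in an express handler it might look like the sketch below; the route and getCustomersFromDb are placeholders assumed for illustration, not part of the answer.

app.get('/customers', async (req, res) => {
  console.time('getCustomers');
  const customers = await getCustomersFromDb(); // placeholder for the real data access call
  console.timeEnd('getCustomers'); // prints e.g. "getCustomers: 58ms"
  res.json(customers);
});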
