Running code AFTER the response has been sent by Koa - javascript

To reduce response latency, I need to perform work after the response has been sent back to the client. However, the only way I can seem to get code to run after the response is sent is by using setTimeout. Is there a better way? Perhaps somewhere to plug in code after the response is sent, or somewhere to run code asynchronously?
Here's some code.
koa = require 'koa'
router = require 'koa-router'
app = koa()

# routing
app.use router app

app
  .get '/mypath', (next) ->
    # ...
    console.log 'Sending response'
    yield next
    # send response???
    console.log 'Do some more work that the response shouldn\'t wait for'

Do NOT call ctx.res.end(): it is hacky and circumvents Koa's response/middleware mechanism, which means you might as well just use Express.
Here is the proper solution, which I also posted to https://github.com/koajs/koa/issues/474#issuecomment-153394277
app.use(function *(next) {
  // execute next middleware
  yield next

  // note that this promise is NOT yielded so it doesn't delay the response
  // this means this middleware will return before the async operation is finished
  // because of that, you also will not get a 500 if an error occurs, so better log it manually.
  db.queryAsync('INSERT INTO bodies (?)', ['body']).catch(console.log)
})

app.use(function *() {
  this.body = 'Hello World'
})
No need for ctx.res.end().
So in short, do
function *process(next) {
  yield next;
  processData(this.request.body);
}
NOT
function *process(next) {
  yield next;
  yield processData(this.request.body);
}
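For completeness, the same fire-and-forget pattern with Koa v2's async/await middleware would look roughly like this (a minimal sketch of my own, assuming a promise-returning processData; it is not part of the original answer):

// Koa v2 sketch: the un-awaited promise does not delay the response,
// so attach a .catch() yourself or the failure will go unreported.
app.use(async (ctx, next) => {
  await next()
  processData(ctx.request.body).catch(err => console.error(err))
})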

I have the same problem.
Koa ends the response only when all middleware have finished (in application.js, respond is a response middleware that ends the response):
app.callback = function(){
  var mw = [respond].concat(this.middleware);
  var gen = compose(mw);
  var fn = co.wrap(gen);
  var self = this;

  if (!this.listeners('error').length) this.on('error', this.onerror);

  return function(req, res){
    res.statusCode = 404;
    var ctx = self.createContext(req, res);
    onFinished(res, ctx.onerror);
    fn.call(ctx).catch(ctx.onerror);
  }
};
However, we can work around this by calling res.end(), which is Node's API:
exports.endResponseEarly = function*(next){
  var res = this.res;
  var body = this.body;
  if(res && body){
    body = JSON.stringify(body);
    this.length = Buffer.byteLength(body);
    res.end(body);
  }
  yield* next;
};
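A hedged usage sketch of my own (not part of the original answer) showing how the middleware order would work: the body must already be set by the time endResponseEarly runs, and the slow work goes in the middleware downstream of it, after res.end():

// Assumed setup: endResponseEarly is the middleware defined above,
// and doSomeSlowTask is a hypothetical long-running operation.
app.use(function *(next) {
  this.body = { ok: true };  // decide the response payload up front
  yield* next;               // endResponseEarly sends it, then the slow work runs
});
app.use(endResponseEarly);
app.use(function *() {
  yield doSomeSlowTask();    // the client already has its response by now
});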

You can also run the code as an asynchronous task by using setTimeout, like this:
var co = require('co');

exports.invoke = function*() {
  setTimeout(function(){
    co(function*(){
      yield doSomeTask();
    });
  }, 100);
  this.body = 'ok';
};
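As a side note (my own addition, not from the original answer), setImmediate achieves the same effect without the arbitrary 100 ms delay, and catching the co promise keeps background errors from being silently swallowed; a sketch:

// Schedule the background work for a later event-loop turn,
// after the current middleware chain has finished.
exports.invoke = function*() {
  setImmediate(function(){
    co(function*(){
      yield doSomeTask();    // hypothetical long-running task
    }).catch(console.error); // co 4.x returns a promise; log failures manually
  });
  this.body = 'ok';
};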

Related

Express.js - abort request on timeout

I am exploring ways to abort client requests that are taking too long, thereby consuming server resources.
Having read some sources (see below), I tried a solution like the following (as suggested here):
const express = require('express');
const server = express();

server
  .use((req, res, next) => {
    req.setTimeout(5000, () => {
      console.log('req timeout!');
      res.status(400).send('Request timeout');
    });
    res.setTimeout(5000, () => {
      console.log('res timeout!');
      res.status(400).send('Request timeout');
    });
    next();
  })
  .use(...) // more stuff here, of course
  .listen(3000);
However, it seems not to work: the callbacks are never called, and the request is not interrupted.
Yet, according to recent posts, it should.
Apart from setting the timeout globally (i.e. server.setTimeout(...)), which would not suit my use case,
I have seen many suggesting the connect-timeout middleware.
However, I read in its docs that
While the library will emit a ‘timeout’ event when requests exceed the given timeout, node will continue processing the slow request until it terminates.
Slow requests will continue to use CPU and memory, even if you are returning a HTTP response in the timeout callback.
For better control over CPU/memory, you may need to find the events that are taking a long time (3rd party HTTP requests, disk I/O, database calls)
and find a way to cancel them, and/or close the attached sockets.
It is not clear to me how to "find the events that are taking long time" and "a way to cancel them",
so I was wondering if someone could share their suggestions.
Is this even the right way to go, or is there a more modern, "standard" approach?
Specs:
Node 12.22
Ubuntu 18.04
Linux 5.4.0-87-generic
Sources:
Express.js Response Timeout
Express.js connect timeout vs. server timeout
Express issue 3330 on GitHub
Express issue 4664 on GitHub
Edit:
I have seen some answers and comments offering a way to set up a timeout on responses or "request handlers": in other words, the time taken by the middleware is measured, and the handler is aborted if it takes too long. However, I was looking for a way to time out the requests themselves, for example when a client sends a large file over a slow connection. That probably happens even before the first handler in the Express router, which is why I suspect there must be some kind of setting at the server level.
Before rejecting long requests, I think it's better to measure them, find the long ones, and optimize them if possible.
How to measure requests?
The simplest way is to measure the time from the start of the request to its end. You get: Request Time Taken = time in the Node.js event loop + time in your Node.js code + time waiting on setTimeout + time waiting on remote HTTP/DB/etc. services.
If you don't have many setTimeouts in your code, Request Time Taken is a good metric.
(Under high load, however, it becomes unreliable, since it is greatly affected by time spent in the event loop.)
So you can try this measure-and-log solution:
http://www.sheshbabu.com/posts/measuring-response-times-of-express-route-handlers/
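A minimal timing middleware along those lines might look like this (my own sketch, roughly what the linked post describes):

// Log how long each request takes from the moment it enters the
// middleware stack until the response is finished.
server.use((req, res, next) => {
  const start = process.hrtime.bigint();
  res.on('finish', () => {
    const ms = Number(process.hrtime.bigint() - start) / 1e6;
    console.log(`${req.method} ${req.originalUrl} took ${ms.toFixed(1)} ms`);
  });
  next();
});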
How to abort a long request
It all depends on your request handler.
Handler does heavy computing
In the case of heavy computing that blocks the main thread, there is nothing you can do without rewriting the handler.
If you set req.setTimeout(5000, ...), it only fires after res.send(), once the main loop is unblocked:
function (req, res) {
  for (let i = 0; i < 1000000000; i++) {
    // blocking main thread loop
  }
  res.send("halted " + timeTakenMs(req));
}
So you can make your code async by injecting a setTimeout(..., 0) somewhere, or move the computation to a worker thread (see the sketch below).
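A hedged sketch of the worker-thread option (my own addition; heavyTask.js and its contents are hypothetical):

// handler.js: offload the blocking loop to a worker so the event loop stays free.
const { Worker } = require('worker_threads');

function heavyHandler(req, res, next) {
  const worker = new Worker('./heavyTask.js', { workerData: { n: 1e9 } });
  worker.once('message', (result) => res.send('computed ' + result));
  worker.once('error', next);
}

// heavyTask.js: runs in its own thread.
// const { parentPort, workerData } = require('worker_threads');
// let sum = 0;
// for (let i = 0; i < workerData.n; i++) sum += i;
// parentPort.postMessage(sum);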
Handler has many remote requests
I simulate the remote requests with a promisified setTimeout:
async function (req, res) {
  let delayMs = 500;
  await delay(delayMs); // maybe axios http call
  await delay(delayMs); // maybe slow db query
  await delay(delayMs);
  await delay(delayMs);
  res.send("long delayed" + timeTakenMs(req));
}
In this case you can inject a small helper to abort the request chain:
blockLongRequest - throws an error if the request has already taken too long.
async function (req, res) {
  let delayMs = 500;
  await delay(delayMs); // maybe an axios call
  blockLongRequest(req);
  await delay(delayMs); // maybe a slow db query
  blockLongRequest(req);
  await delay(delayMs);
  blockLongRequest(req);
  await delay(delayMs);
  res.send("long delayed" + timeTakenMs(req));
}
single remote request
async function (req, res) {
  let delayMs = 1000;
  await delay(delayMs);
  // blockLongRequest(req);
  res.send("delayed " + timeTakenMks(req));
}
We don't use blockLongRequest here because it's better to deliver an answer than an error. An error may trigger the client to retry, and then your slow requests are doubled.
Full example
(sorry for the TypeScript; run it with yarn ts-node server.ts)
import express, { Request, Response, NextFunction } from "express";

declare global {
  namespace Express {
    export interface Request {
      start?: bigint;
    }
  }
}

const server = express();

server.use((req, res, next) => {
  req["start"] = process.hrtime.bigint();
  next();
});

server.get("/", function (req, res) {
  res.send("pong " + timeTakenMks(req));
});

server.get("/halt", function (req, res) {
  for (let i = 0; i < 1000000000; i++) {
    // halting loop
  }
  res.send("halted " + timeTakenMks(req));
});

server.get(
  "/delay",
  expressAsyncHandler(async function (req, res) {
    let delayMs = 1000;
    await delay(delayMs);
    blockLongRequest(req); // actually no need for it
    res.send("delayed " + timeTakenMks(req));
  })
);

server.get(
  "/long_delay",
  expressAsyncHandler(async function (req, res) {
    let delayMs = 500;
    await delay(delayMs); // maybe an axios call
    blockLongRequest(req);
    await delay(delayMs); // maybe a slow db query
    blockLongRequest(req);
    await delay(delayMs);
    blockLongRequest(req);
    await delay(delayMs);
    res.send("long delayed" + timeTakenMks(req));
  })
);

// Error handlers must be registered after the routes,
// otherwise Express never reaches them when a route throws.
server.use((err: any, req: Request, res: Response, next: NextFunction) => {
  console.error("Error captured:", err.stack);
  res.status(500).send(err.message);
});

server.listen(3000, () => {
  console.log("Ready on 3000");
});

function delay(delayTs: number): Promise<void> {
  return new Promise((resolve) => {
    setTimeout(() => {
      resolve();
    }, delayTs);
  });
}

function timeTakenMks(req: Request) {
  if (!req.start) {
    return 0;
  }
  const now = process.hrtime.bigint();
  const taken = now - req.start;
  return taken / BigInt(1000);
}

function blockLongRequest(req: Request) {
  const timeTaken = timeTakenMks(req);
  if (timeTaken > 1000000) {
    throw Error("slow request - aborting after " + timeTaken + " mks");
  }
}

function expressAsyncHandler(
  fn: express.RequestHandler
): express.RequestHandler {
  return function asyncUtilWrap(...args) {
    const fnReturn = fn(...args);
    const next = args[args.length - 1] as any;
    return Promise.resolve(fnReturn).catch(next);
  };
}
I hope this approach helps you find an acceptable solution.
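As a final note on the edit's slow-upload scenario (my own addition, not part of the original answer): newer Node versions (14.11+, so possibly not available on Node 12.22) expose a server-level setting for exactly this, http.Server#requestTimeout, which limits how long the server waits to receive the entire request. A minimal sketch:

const express = require('express');
const app = express();

// ... routes go here ...

const server = app.listen(3000);

// Hard limit on receiving the whole request (headers + body).
// Requests that exceed it get a 408 and the connection is closed.
server.requestTimeout = 10000;

// Separate, stricter limit for receiving just the headers.
server.headersTimeout = 5000;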

Can't remove headers after they are sent

I am getting inconsistent results from my server. Sometimes the right response is sent and sometimes I get the error
Can't remove headers after they are sent
Using Node.js, Koa.js, and Mongoose
router
  .get('/events/', function* getEvent() {
    let eventList = [];
    yield Event.find({}, (error, events) => {
      if (error) {
        this.response.body = 'Unable to get events.';
        this.status = 404;
        return;
      }
      eventList = events;
      eventList.sort((first, second) => {
        // sort implementation
      });
      this.response.body = eventList;
      this.status = 200;
    });
  });
The issue is caused by your callback, which introduces a race condition since your yield isn't waiting for it to finish. In Koa v1.x, you generally only use a callback API by wrapping it so that it returns a promise.
Here's how you'd write your example with Koa v1.x:
router
  .get('/events', function * () {
    let events
    try {
      events = yield Event.find({})
    } catch (err) {
      this.status = 503
      this.body = 'Unable to get events'
      return
    }
    events = sort(events)
    this.body = events // Implicit 200 response
  })
Event.find just needs to return something yieldable, like a promise. Check to see if the library you're using has a promise-returning version.
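If it doesn't, a small wrapper of your own will do; here is a minimal sketch (assuming a callback-style Event.find(query, cb), as in the question):

// Turn the callback API into a promise so Koa v1 can yield it.
function findEvents(query) {
  return new Promise(function (resolve, reject) {
    Event.find(query, function (err, events) {
      if (err) return reject(err)
      resolve(events)
    })
  })
}

// In the middleware: let events = yield findEvents({})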
Though normally you'd just write it like this:
router
  .get('/events', function * () {
    let events = yield Event.find({})
    events = sort(events)
    this.body = events
  })
This works because if Event.find is down it's an internal error (a 500 response), and Koa turns uncaught errors into 500 responses.
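If you want to customize that behaviour rather than rely on the default 500, the usual Koa v1 pattern is an upstream try/catch middleware (a generic sketch of my own, not part of the original answer):

// Mounted before the routes: catches anything thrown downstream.
app.use(function * (next) {
  try {
    yield next
  } catch (err) {
    this.status = err.status || 500
    this.body = 'Something went wrong'
    this.app.emit('error', err, this) // keep Koa's error logging
  }
})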
Basically, after you set this.status to 200, it throws an error because this.response.body is probably undefined. Go ahead and console.log() this.response.body and see if it is defined. If it is undefined, I would guess req.body is not being populated correctly, OR it is an asynchronous Node problem: eventList.sort() is executing asynchronously while this.response.body = eventList is set, so eventList is not sorted yet when you set it. To fix this, put it inside the eventList.sort() callback.
EDIT: after seeing your comment, I am pretty sure it is an asynchronicity problem now. Let me know if putting the last two lines inside the sort call works out for you.

Using request to get API results, and integrating into Hubot response

I have a Hubot plugin that listens for JIRA webhooks and announces in a room when new tickets are created:
module.exports = (robot) ->
  robot.router.post '/hubot/tickets', (req, res) ->
    data = if req.body.payload? then JSON.parse req.body.payload else req.body
    if data.webhookEvent == 'jira:issue_created'
      console.dir("#{new Date()} New ticket created")
      shortened_summary = if data.issue.fields.summary.length >= 20 then data.issue.fields.summary.substring(0, 20) + ' ...' else data.issue.fields.summary
      shortened_description = if data.issue.fields.description.length >= 50 then data.issue.fields.description.substring(0, 50) + ' ...' else data.issue.fields.description
      console.log("New **#{data.issue.fields.priority.name.split ' ', 1}** created by #{data.user.name} (**#{data.issue.fields.customfield_10030.name}**) - #{shortened_summary} - #{shortened_description}")
      robot.messageRoom "glados-test", "New **#{data.issue.fields.priority.name.split ' ', 1}** | #{data.user.name} (**#{data.issue.fields.customfield_10030.name}**) | #{shortened_summary} | #{shortened_description}"
    res.send 'OK'
I'd like to extend this to perform a lookup against a remote API - basically, there's extra info I want to look up and then add to the message I'm passing to robot.messageRoom. I'm using request, because I need digest support.
So the following snippet works fine on its own.
request = require('request')

company_lookup = request.get('https://example.com/clients/api/project?name=FOOBAR', (error, response, body) ->
  contracts = JSON.parse(body)['contracts']
  console.log contracts
).auth('johnsmith', 'johnspassword', false)
And this is where my JS/Node newbness comes out... lol.
I can process the response inside the callback - but I'm not really sure how to access it outside of that callback.
And how should I be integrating this into the webhook processing code - do I just move the snippet inside the if block and assign it to a variable?
I'd use a middleware (assuming you are using Express with Node.js) so you can attach the company_lookup response to req and use it in any route where you add the middleware. See http://expressjs.com/guide/using-middleware.html
For example:
server.js
var middlewares = require('./middlewares');

module.exports = function (robot) {
  // Tell the route to execute the middleware before continuing
  return robot.router.post('/hubot/tickets', middlewares.company_lookup, function (req, res) {
    // Now the middleware's result is attached to req.contracts
    console.log(req.contracts);
    return res.send('OK');
  });
};
middlewares.js
var request = require('request');

// This is your middleware where you can attach your response to the req
exports.company_lookup = function (req, res, next) {
  request.get('https://example.com/clients/api/project?name=FOOBAR', function (error, response, body) {
    var contracts;
    contracts = JSON.parse(body)['contracts'];
    // Add the response to req
    req.contracts = contracts;
    // Tell it to continue
    next();
  }).auth('johnsmith', 'johnspassword', false);
};
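One caveat (my own addition, not part of the original answer): if the remote call fails, body may be undefined and JSON.parse will throw, which would leave the request hanging. A more defensive sketch of the same middleware, with the same hypothetical URL and credentials:

exports.company_lookup = function (req, res, next) {
  request.get('https://example.com/clients/api/project?name=FOOBAR', function (error, response, body) {
    // Bail out to Express's error handling on transport or HTTP errors
    if (error || response.statusCode !== 200) {
      return next(error || new Error('lookup failed with status ' + response.statusCode));
    }
    try {
      req.contracts = JSON.parse(body)['contracts'];
    } catch (parseError) {
      return next(parseError);
    }
    next();
  }).auth('johnsmith', 'johnspassword', false);
};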

ES6 Koa.js run generator function to completion and return asynchronously

Using koa.js, I want to make an API that starts a long-running generator function in the background but sends a token back to the user immediately.
The user can then use that token to retrieve the status of their job later.
'use strict';

var generateToken = function(){
  //...
};

var processData = function *(data, token) {
  //...
  var a = yield analysis(data);
  console.log(a) // a is undefined
};

app.post('/process_data', validate, function *(next) {
  var token = generateToken();
  var that = this;
  setTimeout(function() {
    for (var i of processData(that.request.body, token)){
      continue;
    }
  });
  this.body = "this should return immediately " + token;
  return next;
});
When it runs inside the setTimeout, the variable a is not set. How do I construct this so that processData runs exactly like a normal yield?
You would probably want to have the long-running process handled by a job queue such as Kue.
You would queue the job with an HTTP POST,
then check on the job with an HTTP GET.
Here is a rough outline of what I think you want to be doing:
var kue = require('kue'),
    koa = require('koa'),
    route = require('koa-router'),
    thunkify = require('thunkify'),
    parse = require('co-body'),
    co = require('co'),
    app = koa(),
    jobs = kue.createQueue();

app.use(route(app));

// turn callbacks into thunks for generators
var createJob = thunkify(jobs.create);
var findJob = thunkify(kue.Job.get);

// Process the job here
jobs.process('longProcess', function(job, done){
  // do work in here
  // call done(err) when completed
  // EDIT: if you want to handle the job using generators/yield
  // you could use a library like co
  co(function *(){
    var qs = yield doWork(job.data);
    done();
  }).error(done);
});

// Queue/Start the Job here
app.post('/jobs', function *(){
  var body = yield parse(this);
  var job = yield createJob('longProcess', body);
  this.body = job.id;
});

// Check Status of job here
app.get('/jobs/:token', function *(){
  var job = yield findJob(this.params.token);
  this.body = job;
  // job.status === 'complete' || ...
});

app.listen(3000);
Thanks to Bergi for the solution.
app.post('/process_data', validate, function *(next) {
  var token = generateToken();
  co(processData(this.request.body, token));
  this.body = "this should return immediately " + token;
  return next;
});
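One small addition of my own: with co 4.x, co(...) returns a promise, so it is worth attaching a catch so a failed background job is at least logged instead of becoming an unhandled rejection:

app.post('/process_data', validate, function *(next) {
  var token = generateToken();
  // not yielded, so the response is not delayed; log background failures manually
  co(processData(this.request.body, token)).catch(console.error);
  this.body = "this should return immediately " + token;
  return next;
});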

Node.js Javascript Scope Issue

I have a Node.js HTTP server running which goes like this (simplified):
http = require('http');
help = require('./modules/help').run;
router = require('./modules/router').run;
m = {something: require('./modules/something')};
var server = http.createServer(router).listen(8001);
"help" is a set of functions-helpers, for example:
module.exports.run = {
  joinObjects: function(obj1, obj2) {
    for (var prop in obj2) {
      obj1[prop] = obj2[prop];
    }
    return obj1;
  }
}
"router" handles the request (passes it further down and handles response to the client):
module.exports.run = function(req, res) {
  var urlPath = url.parse(req.url).pathname;
  switch(urlPath) {
    case '/something':
      requestHandler(req, res, 'something');
      break;
    ...
  }

  function requestHandler(req, res, handler) {
    var data = '';
    req.on('data', function(chunk) {
      data += chunk;
    });
    req.on('end', function() {
      m[handler].run(req, data, function(response) {
        response.headers = help.joinObjects(config.responseHeaders, response.headers);
        res.writeHead(response.code, response.headers);
        res.end(response.data);
      });
    });
  }
}
The "handler" module/function runs the callback function and passes the response (code, headers and data) to it. The callback function then merges headers with a set of default headers that are set in the config file.
THE ISSUE: When there are two connections calling help.joinObjects() at the same time (that's my guess), the response.headers property collides with that of another user/connection and bad data is returned. If I comment out the line that does the merging, this does not occur.
THE QUESTION: What's wrong with my approach to the "help" module? There is some kind of scope issue that I do not understand here.
As lazyhammer pointed out, the problem was the order in which I passed objects into the helper function: help.joinObjects(config.responseHeaders, response.headers).
JavaScript passes objects by reference, so every time help.joinObjects() is called, it writes into the shared config.responseHeaders object instead of a user-specific one.
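A sketch of one possible fix (my own illustration, not from the original thread): copy the defaults into a fresh object per request so the shared config is never mutated:

// Merge into a new object: config.responseHeaders stays untouched.
var merged = help.joinObjects({}, config.responseHeaders);
response.headers = help.joinObjects(merged, response.headers);
res.writeHead(response.code, response.headers);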
