Next.js: How to execute code after sending HTTP response? - javascript

In Express you can execute code after sending an HTTP response like this:
res.send()
await someAsyncFunction() // imagine this takes a very long time
In Next.js, at least when testing code in a local environment, the above works the same as in Express. However, once deployed on Vercel, the above code seems to stop executing after the HTTP response is sent. I don't know whether that is simply because of how their serverless functions are set up. So I'm forced to rearrange it like this:
await someAsyncFunction() // imagine this takes a very long time
res.send()
The problem with ordering it that way is that if the async function is very slow, the response could time out before it gets sent back. There are situations where that is bad. Say I need to send a bunch of emails using a rate-limited API. That can take a long time. I need to send an HTTP response right away before moving on to the very slow process of sending all the emails.

I haven't been able to find explicit documentation explaining this behavior, but this GitHub discussion appears to confirm that you cannot execute code after sending an HTTP response if you are deploying on Vercel: https://github.com/vercel/next.js/discussions/14077
Thus I am stuck sending the response last.
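For illustration, here is a minimal sketch of the two orderings inside a Next.js API route; the file path and someAsyncFunction are placeholders, and the first (commented-out) ordering is the one that only works locally:

// pages/api/send-emails.js -- a minimal sketch; someAsyncFunction is a placeholder
export default async function handler(req, res) {
    // Works locally, but on Vercel execution may be frozen once the response is sent:
    // res.status(202).json({ status: 'processing' });
    // await someAsyncFunction();

    // The ordering the question is forced into: do the slow work first,
    // at the risk of the response timing out before it is sent.
    await someAsyncFunction(); // e.g. sending many emails through a rate-limited API
    res.status(200).json({ status: 'done' });
}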

Related

Running Multiple Post Request in parallel from the same source

I have a REST API backend server (Node.js/TypeScript) to which I am making a POST request that returns a response. The thing is that I am trying to use my frontend to make about 8 POST requests at the same time with the same data (weird, I know), but that is the requirement of the project.
Background: When I make one POST call at the press of a button, then refresh the page and press the button again, the backend runs both of those requests in parallel. This is what I want. So I tried changing the frontend code to make 5 POST requests on a single button click, but for some reason those requests then get executed in sequence, meaning that I get one response before the next request starts executing, as opposed to the page-refresh approach where they all start at the same time.
I want to do this because the server won't be handling any other requests, and with this approach I am hoping to get some sort of parallelization out of the Node environment.
Each browser has a limit to the number of requests that can be fired against the same host (typically around six concurrent connections). When that limit is reached, the remaining requests are queued.
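As an illustration of dispatching the calls concurrently from the frontend, here is a minimal sketch (the /api/process endpoint and the payload are hypothetical); all requests are started before any is awaited, and the browser still queues whatever exceeds its per-host limit:

async function fireRequests() {
    const payload = { some: 'data' }; // hypothetical data

    const requests = Array.from({ length: 8 }, () =>
        fetch('/api/process', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify(payload),
        })
    );

    // All 8 requests are dispatched before any is awaited; the browser
    // still queues those above its per-host connection limit.
    return Promise.all(requests);
}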

Node.js Request drops before Response is received

The project that I am working on receives a request whose main part consists of data coming from a database. Upon receiving it, my system parses all the data, concatenates the needed information to form a query, and then inserts that data into my local database using that query.
It works fine with no issues at all, except that it takes too long to process when the request has over 6,000,000 characters and over 200,000 lines (or maybe fewer, but still large numbers).
I have tested this with my system being used as the server (the intended setup in production) and with Postman as well, but both drop the connection before the final response is built and sent. I have also seen that although the connection drops, my system still proceeds with processing the data all the way up to the query, and even until it sends its supposed response. But since the request dropped somewhere in the middle of the processing, the response is ignored.
Is this about connection timeout in nodejs?
Or limit in 'app.use(bodyParser.json({limit: '10mb'}))'?
I really only see one way around this; I have done something similar in the past. Allow the client to send as much as you need/want. However, instead of having the client wait around for some undetermined amount of time (at which point it may time out), send an immediate response that is basically "we got your request and we're processing it".
Now the not-so-great part, but it's the only way I've ever solved this type of issue: in your "processing" response, send back some sort of id. The client can then check once in a while whether its request has finished by sending you that id. On the server end you store the result for the client under the id you gave them. You'll have to make a few decisions about things like how long a result is kept around and whether it can be requested more than once, things like that.
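A minimal Express sketch of that job-id pattern (the /import routes, the in-memory Map, and processLargeRequest are hypothetical; a real setup would persist results and expire them):

const express = require('express');
const crypto = require('crypto');
const app = express();
app.use(express.json({ limit: '10mb' }));

const jobs = new Map(); // id -> { status, result } (in-memory for the sketch only)

app.post('/import', (req, res) => {
    const id = crypto.randomUUID();
    jobs.set(id, { status: 'processing' });
    res.status(202).json({ id, status: 'processing' }); // respond immediately

    // processLargeRequest is a placeholder for the slow parse-and-insert work
    processLargeRequest(req.body)
        .then((result) => jobs.set(id, { status: 'done', result }))
        .catch((err) => jobs.set(id, { status: 'failed', error: err.message }));
});

app.get('/import/:id', (req, res) => {
    const job = jobs.get(req.params.id);
    if (!job) return res.status(404).json({ error: 'unknown id' });
    res.json(job);
});

app.listen(3000);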

jQuery $.get() is blocking other requests

I'm developing a web application and use jQuery to make asynchronous HTTP requests to my API. I have a detail view where you can see a lot of information about a specific object stored in the database. Because there is a lot of information, and data that is linked to other objects, I make different calls to my API to gather different information for my views.
In the 'detail view' I have several widgets that show the requested information. For that, I make about 5-7 HTTP GET requests to my API. When using the debugger (both Safari and Firefox), I can see that some requests are blocking other requests and the page takes a long time until everything is loaded and shown to the user.
I make a request like this:
$.get("api/api.php?object=myobject&endpoint=someendpoint", function(data) {
// data is JSON formatted
$("#my-widget input").val(data["name"]);
});
And another one e.g. like this:
$.get("api/api.php?object=anotherobject&endpoint=anotherendpoint", function(data) {
// data is JSON formatted
$("#other-widget input").val(data["somekey"]);
});
If the first request takes a little longer to finish, it blocks the second request until the callback function of the first request has finished. But why? I thought those calls were asynchronous and non-blocking.
I want to build a fast web application for a company where the requests are only made inside the local network, so a request should only take about 10-50ms (or even less). But the page takes about 10 seconds to show up with all information.
Am I doing something wrong? Or is there a JavaScript framework that can be used for exactly this problem? Any help is appreciated!
EDIT: As you can see in the screenshot, the requests have to wait some seconds, and once a request is fired, it takes a few seconds until a response comes back.
If I call the URL directly in my browser or do a GET request using curl it is a lot faster.
EDIT2: Thanks @CBroe! The session file write lock was the problem. As long as the session file is locked, no other script using that session can run until the previous script has finished. I just called session_write_close() immediately after session_start() and it runs a lot faster now.
Attention: Use session_write_close() only if you don't need to write to the $_SESSION array. Reading is still possible after that, but not writing. (See this answer for further details: https://stackoverflow.com/a/50368260/1427878)

How does a Node.js server process requests?

Let's assume I have the following code. I'm using ExpressJS, but I don't think the server part is much different from vanilla Node.js.
var express = require('express');
var fs = require('fs'); // needed for the readFileSync calls below
var app = express();
var settings = JSON.parse(fs.readFileSync('settings.json', 'utf8')); // does this run only once (when the server starts)?
app.get('*', function(req, res) {
    res.write(fs.readFileSync('index.html')); // does this block other requests?
    setTimeout(function() {
        someSlowSyncTask(); // does this block other requests?
    }, 1000);
    res.end();
});
app.listen(3000); // port is arbitrary for the example
In the example above, does the first readFileSync run once when the server starts, or does it run each time the server receives a request?
For the second readFileSync, does it prevent Node from processing other requests? In other words, do all the other requests have to wait until readFileSync finishes before Node processes them?
Edit: I added the setTimeout and someSlowSyncTask. Do they block other requests?
You should avoid synchronous methods on a server. They are available as a convenience for single-user utility scripts.
The first one runs only once, when the module is loaded; since it is a synchronous call, the * GET route is not set up until after it returns.
The second one will run whenever an HTTP request comes to the server. And yes, it will block the whole server for the duration of that synchronous call (the I/O cost of opening and reading the contents of the file). Don't do that.
There are plenty of articles on the internet about understanding the Node event loop. For example, here and here.
You're correct. Your first readFileSync will execute once when the server is first started.
The second readFileSync will run each time you receive a request, and because it is a synchronous call it blocks the event loop while the file is read, so other requests have to wait for it. Node.js can handle any number of requests in a non-blocking fashion only as long as your internal functions are also written to be non-blocking (e.g. take a callback). The setTimeout itself is asynchronous, but when its callback fires, someSlowSyncTask() will again block the server for as long as it takes to run.
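For comparison, here is a minimal non-blocking sketch of the same handler (assuming the same settings.json and index.html files; the slow task remains a placeholder):

var express = require('express');
var fs = require('fs');
var app = express();

// Read settings once at startup; synchronous is acceptable here because
// the server is not accepting requests yet.
var settings = JSON.parse(fs.readFileSync('settings.json', 'utf8'));

app.get('*', function(req, res) {
    // Asynchronous read: the event loop stays free while the file is loaded.
    fs.readFile('index.html', function(err, contents) {
        if (err) return res.sendStatus(500);
        res.end(contents);
    });
    // Any genuinely slow work should itself be asynchronous (or moved to a
    // worker process), otherwise it will still block the event loop when it runs.
});

app.listen(3000);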

jQuery: Using a single Ajax call, receive progressive statuses instead of one single response?

I'm just wondering... is it possible to receive multiple responses from a single ajax call?
I'm asking purely for aesthetic purposes, to update the status on the client side.
I have a single ajax method that's called on form submit
$.ajax({
    url: 'ajax-process.php',
    data: data,
    dataType: 'json',
    type: 'post',
    success: function (j) {
    }
});
I can only get one response from the server side. Is it possible to retrieve intermediate statuses? Such as:
Default (first): Creating account
Next: Sending email confirmation
Next: Done
Thanks for your help! :)
From a single ajax call, I don't think it is possible.
What you could do is check frequently where the process is (this is what is used for the upload bars in Gmail, for example). You do a first ajax request to launch the process, and then a series of ajax requests to ask the server how it is doing. When the server answers "I'm done", you're good to go, and until then you can have the server report the current state.
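A minimal jQuery polling sketch of that idea (ajax-process.php and ajax-status.php are hypothetical endpoints, and the status fields are placeholders):

// Kick off the process; the server is assumed to respond right away with a job id.
$.post('ajax-process.php', data, function (job) {
    var timer = setInterval(function () {
        // Ask the server how the job is doing.
        $.getJSON('ajax-status.php', { id: job.id }, function (status) {
            $('#status').text(status.message); // e.g. "Creating account", "Sending email confirmation"
            if (status.done) {
                clearInterval(timer);
                $('#status').text('Done');
            }
        });
    }, 1000); // poll once per second
}, 'json');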
There is something called comet which you can set up to "push" updates to the client; however, it is probably way more than you want to invest in, time-wise.
You can open up a steady stream from the server so that it continues to output, but I'm not sure how a client-side script can handle these as individual "messages". Think of it like a server that outputs some info to the browser, does more work, outputs some more to the browser, does more work, etc. This shows up more or less in real time in the browser as printed text. It is one long response, but it is still one response. I think ajax only handles a response once it has finished being sent, but maybe someone else will know more than me on the topic.
But you couldn't have the server output several individual responses without reloading itself, at least not with PHP, because once you start outputting the response it has begun, and you can't chop it up into separate responses; the response only finishes when the script is done executing.
Your best bet is with the steady stream, but again, I'm not sure how ajax handles getting responses in chunks.
Quick Update
Based on the notes for this plugin:
[http://plugins.jquery.com/project/ajax-http-stream]
things don't look promising. Specifically:
Apparently the trend is to disallow access to the xmlhttprequest.responseText before the request is complete (stupid imo). Sorry there's nothing I can do to fix this
Thus, not only can you not get what you want in one request, you probably can't get it in multiple requests either, unless you want to break the actual server-side process up into several parts and only have it continue to the next step when an ajax function triggers it.
Another option would be to have your script write its status at specific points to another file on the server, call it "status.xml" or "status.txt". Have your first ajax function initialize the process, and have a second ajax function that queries this status file and outputs it to the user.
It is possible, but it has more to do with your backend script. As Anthony mentioned, there is a technique called comet. Another term I've heard is "long polling". The idea is that you delay the point at which your PHP (insert language of choice) script finishes processing.
In php you can do something like this:
while ($response !== "I'm done") {
    sleep(1); // pause for one second before checking again
    $response = check_status(); // placeholder: re-check whether the work has finished
}
echo $some_value; // the work is done, so send the result back
exit();
This code stops your script from finishing immediately. sleep(1) pauses the script and lets the server rest for 1 second before it loops back through. You can adjust the sleep time based on your needs. In PHP, the time the script spends sleeping is not counted against your server timeout (at least on Linux, where max_execution_time measures only the script's own execution time).
You'll obviously need to add more checks to your code. You'll probably also want to allow for an abort call, something like sending a GET request to kill the backend script, maybe on the JavaScript unload event.
In the tests that I've done, I made the initial ajax call, and when the value was returned, I made another ajax call; that way your backend script won't time out.
I've only played around with this on my local server, so I'm not sure how real-world this is, but it works.
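A minimal client-side sketch of that long-polling loop (longpoll.php is a hypothetical endpoint that, as in the PHP snippet above, holds the request open until it has something to report):

function poll() {
    $.getJSON('longpoll.php', function (status) {
        $('#status').text(status.message);
        if (!status.done) {
            poll(); // the previous request has returned, so immediately open a new one
        }
    });
}
poll();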
