How does a Node.js server process requests?

Let's assume I have the following code. I'm using ExpressJS, but I don't think the server part is much different from vanilla Node.js.
var express = require('express');
var fs = require('fs');
var app = express();
var settings = JSON.parse(fs.readFileSync('settings.json', 'utf8')); // does this run only once (when the server starts)?
app.get('*', function (req, res) {
    res.write(fs.readFileSync('index.html')); // does this block other requests?
    setTimeout(function () {
        someSlowSyncTask(); // does this block other requests?
    }, 1000);
    res.end();
});
app.listen(3000);
In the example above, does the first readFileSync run once when the server starts, or does it run each time the server receives a request?
For the second readFileSync, does it prevent Node from processing other requests? In other words, do all the other requests have to wait until readFileSync finishes before Node processes them?
Edit: I added the setTimeout and someSlowSyncTask. Do they block other requests?

You should avoid synchronous methods on a server. They are available as a convenience for single-user utility scripts.
The first one runs only once, since it is a synchronous method: the * GET route is not set up until after it returns.
The second one runs whenever any HTTP request comes to the server. And yes, it will block the whole server for the duration of that synchronous call (the I/O cost of opening and reading the contents of the file). Don't do that.
There are plenty of articles on the internet about understanding the Node event loop.
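For comparison, here is roughly what the non-blocking version of that route looks like, using the asynchronous fs.readFile so the I/O happens off the event loop. This is a minimal sketch, with error handling kept deliberately simple:

var express = require('express');
var fs = require('fs');
var app = express();

app.get('*', function (req, res) {
    fs.readFile('index.html', function (err, contents) {
        // This callback runs once the read completes; other requests
        // are serviced in the meantime instead of waiting.
        if (err) {
            res.status(500).end();
            return;
        }
        res.write(contents);
        res.end();
    });
});

app.listen(3000);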

You're correct that the first readFileSync executes once, when the server is first started.
The second readFileSync runs each time a request comes in, and because it is synchronous it blocks the event loop for the duration of the read; no other request is processed until it returns. The setTimeout itself does not block anything, since it merely schedules the callback, but when that callback fires a second later, someSlowSyncTask() runs on the event loop and blocks every other request for as long as it takes. Node.js stays non-blocking only as long as the code you hand to the event loop is non-blocking.

Related

Next.js: How to execute code after sending HTTP response?

In Express you can execute code after sending an HTTP response like this:
app.post('/work', async (req, res) => {
    res.send()
    await someAsyncFunction() // imagine this takes a very long time
})
In Next.js, at least when testing code in a local environment, the above works the same as in Express. However, once deployed on Vercel, the code above seems to stop executing after the HTTP response is sent. I don't know if this is just because of how their serverless functions are set up or what. So I'm forced to rearrange it like this:
app.post('/work', async (req, res) => {
    await someAsyncFunction() // imagine this takes a very long time
    res.send()
})
The problem with that ordering is that if the async function is very slow, the response could time out before it gets sent back. There are situations where that is bad. Say I need to send a bunch of emails using a rate-limited API; that can take a long time. I need to send an HTTP response right away before moving on to the very slow process of sending all the emails.
I haven't been able to find explicit documentation explaining this behavior, but this GitHub discussion appears to confirm that you cannot execute code after sending an HTTP response if you are deploying on Vercel: https://github.com/vercel/next.js/discussions/14077
Thus I am stuck sending the response last.
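For reference, as a full Next.js API route the forced arrangement looks roughly like the sketch below; sendAllEmails is a hypothetical stand-in for the slow task:

// pages/api/send-emails.js
export default async function handler(req, res) {
    await sendAllEmails(); // hypothetical slow task; must finish before the response goes out
    res.status(200).json({ ok: true });
}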

Firebase: Using an asynchronous call as part of an HTTP trigger without timing out?

I am currently learning Firebase and I am trying to trigger an asynchronous API call as part of an HTTP trigger.
In the docs it says:
Resolve functions that perform asynchronous processing by returning a JavaScript promise.
Terminate HTTP functions with res.redirect(), res.send(), or res.end().
Terminate a synchronous function with a return; statement.
So what method could I use to make the network call without falling foul of the 60s limit - can I extend this?
Currently just calling to the network will occasionally time out as it goes over the 60s limit.
If your HTTPS trigger is always timing out, that means you aren't fully following the advice of the documentation you cited, which is to always return a response to the client.
If your HTTPS trigger simply takes over a minute to complete, and you need more time, you can increase the timeout in the Cloud console.
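To illustrate both points, a first-generation Cloud Function along these lines terminates properly and raises the deadline. Treat it as a sketch: callSlowApi is a hypothetical placeholder, and the allowed timeoutSeconds range depends on your functions generation (1st-gen allows up to 540s):

const functions = require('firebase-functions');

exports.slowEndpoint = functions
    .runWith({ timeoutSeconds: 300 }) // raise the 60s default
    .https.onRequest(async (req, res) => {
        const result = await callSlowApi(); // hypothetical slow network call
        res.status(200).send(result); // always terminate with res.send()/res.end()/res.redirect()
    });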

How to get a synchronous result from an HTTP request in Node.js

I am writing a node module as a wrapper around a Python module (to make that module available from node.js). This module works as a standalone script or as an HTTP server.
When getting data from that script/server my function needs to return a value and can't do something in a callback function. At least that's how I understand it.
(One possible client is a gitbook-plugin, and when that encounters a "block" it has to call the plugin->my-client-module and insert its return value into the generated HTML document.)
So either I'm seeing something wrong because I'm too new to Node.js (coming from Python) or I need a way to make the HTTP request synchronous.
With the standalone script I can easily do it using ChildProcess.spawnSync, but not in an HTTP request.
Well, I know people always recommend doing things asynchronously in Node, so I'd also be happy about pointers on how to achieve my goal that way. As said, I expect clients of my module to pass me some data; I pass that along by invoking the script or doing a POST request to the server, and return the processed data to the client.
You can achieve synchronous HTTP requests using web workers.
Here is more info on this topic:
https://developer.mozilla.org/zh-TW/docs/Web/API/XMLHttpRequest/Synchronous_and_Asynchronous_Requests#Example_Synchronous_HTTP_request_from_a_Worker
Edit:
You can make synchronous requests from Node.js. The following module should help:
https://www.npmjs.com/package/sync-request
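For example, a minimal sketch with sync-request; the URL and payload are placeholders, and note that the call blocks the whole process while waiting, which is acceptable in a one-off script but not in a server:

var request = require('sync-request');

// Blocks until the HTTP response arrives.
var res = request('POST', 'http://localhost:8000/process', {
    json: { input: 'some data' },
});
var result = JSON.parse(res.getBody('utf8'));
console.log(result);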

Long running Ajax request blocking short Ajax status updates

I have an ASP.NET MVC application running on IIS. I have two handlers: /DoWork (let's say it takes 10 minutes) and /ReportStatus (let's say it takes <1s). DoWork does the work while ReportStatus returns the progress of the work.
I wanted to run /DoWork asynchronously with an $.ajax request from JavaScript and then monitor its progress by repeatedly querying /ReportStatus, also through an asynchronous $.ajax call wrapped in a function registered with window.setInterval. However, what I am seeing is that the long-running $.ajax on /DoWork blocks all the other queries on /ReportStatus until DoWork finishes.
How do I circumvent this? I would guess that this has to do with an IIS server setting, possibly denying two active requests from one host? Any ideas?
My first idea is to have /DoWork run the actual work in a background asynchronous thread and return immediately. However, I would like to know if there are better options, as I want to keep the connection open during the /DoWork run.
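For reference, the client-side pattern described above looks roughly like this sketch (endpoint names taken from the question); note that the blocking happens server-side, commonly because ASP.NET serializes concurrent requests that share a writable session:

// Kick off the long-running job; the callback fires when it finally completes.
$.ajax({ url: '/DoWork' }).done(function () {
    clearInterval(poll);
    console.log('work finished');
});

// Meanwhile, poll for progress every 2 seconds.
var poll = window.setInterval(function () {
    $.ajax({ url: '/ReportStatus' }).done(function (status) {
        console.log('progress:', status);
    });
}, 2000);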
Ended up using WebSockets as suggested by colecmc. Since I do not want to rely on Windows 8+ / Windows 2012+, I chose a lightweight implementation called Fleck (available on NuGet).
http://jxs.me/2011/05/28/csharp-websockets-with-fleck/

How to perform Ajax requests, a few at a time

I am not really sure it is possible in JavaScript, so I thought I'd ask. :)
Say we have 100 requests to be done and want to speed things up.
What I was thinking of doing is:
Create a loop that will launch the first 5 ajax calls
Wait until they all return (on success, call a function to update the DOM; handle errors as well). I'm not sure how, maybe with a global counter?
Repeat until all requests are done.
Considering browser JavaScript does not support threads, can we "exploit" the async functionality to do that?
Do you think it would work, or there are inherent problems doing that in JavaScript?
Yes, I have done something similar to this before. The basic process is:
Create a stack to store your jobs (requests, in this case).
Start out by executing 3 or 4 of the requests.
In the callback of the request, pop the next job out of the stack and execute it (giving it the same callback).
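A minimal sketch of that pool pattern; the URLs, the concurrency limit, and the updateDom helper are placeholders:

var jobs = ['/api/1', '/api/2', '/api/3' /* ...up to 100 */];
var limit = 5;

function next() {
    var url = jobs.pop(); // take the next job off the stack
    if (!url) return;     // nothing left to do
    $.ajax({ url: url })
        .done(function (data) { updateDom(data); }) // hypothetical DOM-update helper
        .always(next);    // when this request settles, start the next one
}

// Prime the pool with `limit` concurrent requests.
for (var i = 0; i < limit; i++) next();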
I'd say, the comment from Dancrumb is the "answer" to this question, but anyway...
Current browsers do limit concurrent HTTP requests, so you can simply start all 100 requests immediately, and the browser will take care of sending them as fast as possible, limited to a decent number of parallel requests.
So, just start them all immediately and trust on the browser.
However, this may change in the future (the number of parallel requests that a browser sends increases as end-user internet bandwidth increases and technology advances).
EDIT: You should also think and read about the meaning of "asynchronous" in a JavaScript context. Asynchronous here just means that you give up control of something to another part of the system. "Sending" an async request just means that you tell the browser to do so! You do not control the browser; you just tell it to send that request and to notify you about the outcome.
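A sketch of that fire-them-all approach, using the global counter the question suggests to detect when everything has finished (urls, updateDom, and onAllDone are placeholders):

var urls = [/* ...100 request URLs... */];
var remaining = urls.length;

urls.forEach(function (url) {
    $.ajax({ url: url })
        .done(function (data) { updateDom(data); }) // hypothetical DOM-update helper
        .always(function () {
            remaining--;
            if (remaining === 0) onAllDone(); // hypothetical completion handler
        });
});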
It's actually slower to break 100 requests into batches of 5 and wait for each batch to complete before posting the next one. You might be better off simply sending all 100 requests; remember that JavaScript is single-threaded, so it can only resolve one response at a time anyway.
A better way is set up a batch request service that accepts something like:
/ajax_batch?req1=/some/request.json&req2=/other/request.json
And so on. Basically you send multiple requests in a single HTTP request. The response of such a request would look like:
[
  {"reqName": "req1", "data": {}},
  {"reqName": "req2", "data": {}}
]
Your ajax_batch service would resolve each request and send back the results in proper order. Client side, you keep track of what you sent and what you expect, so you can match up the results to the correct requests. Downside, it takes quite some coding.
The speed gain would come entirely from a massive reduction of HTTP requests.
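Client side, the bookkeeping might look like this sketch; the /ajax_batch endpoint is the hypothetical service described above:

// Map request names to the handlers that expect their results.
var handlers = {
    req1: function (data) { /* update one part of the page */ },
    req2: function (data) { /* update another part */ },
};

var query = 'req1=' + encodeURIComponent('/some/request.json') +
            '&req2=' + encodeURIComponent('/other/request.json');

$.getJSON('/ajax_batch?' + query, function (results) {
    results.forEach(function (item) {
        handlers[item.reqName](item.data); // match each result to its request
    });
});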
There's a limit on how many requests you can batch this way, because URL length is limited, if I recall correctly.
DWR does exactly that, as far as I know.
