Streaming Node.js response to client over XHR - javascript

I have an Express.js server with a route that is responsible for parsing and importing a CSV file.
I'd like to report the status of the import process in realtime back to the user's browser.
Streaming responses from Express.js is straightforward:
router.get( '/test', function( _req, _res, _next ) {
    _res.write( 'File import initiated.' );
    // ... processing CSV in loop
    _res.write( 'Processing row x.' );
    // artificial delay to simulate some i/o
    setTimeout( function() {
        _res.write( 'File import completed successfully.' );
        _res.end();
    }, 2000 );
} );
Sending an HTTP request to the endpoint via curl works as expected. In the above example the two streamed responses
are received immediately, followed by the final response after two seconds.
This is not the case in the browser (Chrome), however. Testing with both jQuery $.ajax and a raw XHR call, the entire
response appears to be buffered and returned all at once when the response is complete.
How is this type of communication normally handled? I assume socket.io is an option, but surely there's a simpler
way to utilise the built-in XHR or XHR2 features of the browser?

You can give back a reference id to the client, and the client then calls the appropriate HTTP APIs to check the status of that CSV processing request. You might need some backend support to enable this, such as a table in the database that tracks the status.
router.get( '/test', function( _req, _res, _next ) {
    SaveFile( _req.file ).then( function( id ) {
        _res.end( JSON.stringify( { id: id } ) );
    } );
} );
router.get( '/status/:id', function( _req, _res, _next ) {
    GetStatus( _req.params.id ).then( function( status ) {
        _res.end( JSON.stringify( status ) );
    } );
} );
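On the client side, a minimal polling sketch might look like this (the done flag and the one-second interval are placeholders for whatever your own status shape looks like):
function pollStatus( id ) {
    var timer = setInterval( function() {
        fetch( '/status/' + id )
            .then( function( res ) { return res.json(); } )
            .then( function( status ) {
                console.log( 'Import status:', status );
                if ( status.done ) { // hypothetical flag set by GetStatus
                    clearInterval( timer );
                }
            } );
    }, 1000 ); // poll once per second
}
// kick off the import, then poll with the returned id
fetch( '/test' )
    .then( function( res ) { return res.json(); } )
    .then( function( body ) { pollStatus( body.id ); } );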
Otherwise, if you still want realtime updates, use WebSockets. You'll have to design the form of communication you expect yourself.

Related

Express node.js - Does the render function have a promise?

After rendering my index.html (which works fine), I would like to send some additional data via sockets. For that, I would need a promise for the rendering process. At the moment, the code runs synchronously: the socket data is sent, and moments later that data is overwritten because the rendering process finishes later. I'm looking for something like:
res.render('index', {title: "XYZ"})
    .then(function(){
        //.. do something
    });
Is there a different approach? Or is the only solution to ask for the data via the client?
Thanks for any help!
Does the render function have a promise?
The documentation doesn't mention one, so presumably not.
For that, I would need a promise for the rendering process.
Not necessarily, just some kind of notification that the data had been sent. Promises are one kind of notification, but not the only kind.
The documentation shows that render will call a callback function with the rendered HTML, so you could use that callback to send the HTML along with whatever you want to have follow it:
res.render("index", {title: "XYZ"}, function (err, html) {
if (err) {
// ...send an error response...
return;
}
res.send(html);
// ...send your other stuff here...
});
But if you want a promise, you could use util.promisify on res.render. It's a bit of a pain because promisify doesn't make handling this (the context) straightforward, so you have to use bind:
const util = require("util");

const resRender = util.promisify(res.render.bind(res));
// ...
resRender("index", {title: "XYZ"})
    .then(html => {
        res.send(html);
        // ...send your other stuff here...
    })
    .catch(err => {
        // ...send an error response...
    });
You've said you're sending further information "via sockets." That makes it sound to me like the further information you're sending isn't being sent via the res response, but via a separate channel.
If so, and if you want to wait to send that until the response is sent, you can start your socket sending in response to the finish event on the response:
res.on("finish", () => {
// Send your socket message here
});
res.render("index", {title: "XYZ"});
(Remember that an Express Response object is an enhanced version of the Node.js ServerResponse object, which is what provides this event.)
But even then, all that means is that the data has been handed over to the OS for transmission to the client. From the documentation:
...this event is emitted when the last segment of the response headers and body have been handed off to the operating system for transmission over the network. It does not imply that the client has received anything yet.
I don't think you have anything beyond that to hook into.

How to return multiple updates of a JSON using expressjs and nodejs

I have a server side task that takes some time to compute, and I'd like to periodically send updates to the client. I store and send the information as an object (via JSON), and it isn't an array where I can send data sequentially. Rather, I want to send some new information, and update others as the calculation continues.
From other posts on here I realize that:
response.json(object) is a nice and easy way to send an object as JSON in one go, with headers set and everything. However, this - like response.send() - terminates the connection:
var app = express()
app.get('/', (request, response) => {
    response.json( { hello: world } );
})
Alternatively, one could set the headers manually, and then use response.write with JSON.stringify
response.setHeader('Content-Type', 'application/json');
response.write(JSON.stringify({ hello:world } ));
response.end();
The above two methods work for sending an object in one go, but ideally what I'd like to do is send incremental updates to my object. E.g.
response.setHeader('Content-Type', 'application/json');
response.write( JSON.stringify( { hello:[world], foo:bar } ) );
// perform some operations
response.write( JSON.stringify( { hello:[world, anotherWorld], foo:cat } ) );
response.end()
However, what is happening on the clientside is:
After the first response.write, the client receives { hello:[world], foo:bar } but does not trigger my callback.
After the second response.write, I can see the data received is { hello:[world], foo:bar }{ hello:[world, anotherWorld], foo:cat }, but it still does not trigger my callback.
My callback is only called after response.end(), and then I get an exception when trying to parse the result as JSON, because it isn't valid JSON anymore, just a bunch of JSON documents stuck back to back with no comma or anything: Uncaught (in promise) SyntaxError: JSON.parse: unexpected non-whitespace character after JSON data at line 1 column XXX of the JSON data.
Ideally my client callback would be triggered upon receiving each write, and it would remove that bit of data from the buffer so to speak, so the next incoming json would be standalone.
Is what I want to do possible? If so, how?
My fetch code btw:
fetch(url)
    .then(response => response.json()) // parse the JSON from the server
    .then(returnInfo => {
        onReturn(returnInfo);
    });
For your use-case, you can consider using WebSockets to deliver incremental updates to your UI. There are three stages to a WebSocket connection: connect, message, and disconnect. On page load your front-end establishes a persistent connection with the backend. You can send the first JSON data on connect, and then, when your backend has updates, send data in your message callback. I have written a blog post that implements WebSockets using Python and JavaScript; however, you can implement similar logic using NodeJS:
https://blog.zahidmak.com/create-standalone-websocket-server-using-tornado/
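As a rough sketch of that flow in Node.js rather than Python/Tornado, using the ws package on the server and the browser's native WebSocket API (the port number and message shapes are placeholders, not part of the linked post):
// server.js - minimal sketch, assumes `npm install ws`
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8081 });

wss.on('connection', function (socket) {
    // send the initial state as soon as the client connects
    socket.send(JSON.stringify({ hello: ['world'], foo: 'bar' }));

    // simulate the long-running calculation pushing an update later;
    // each message is one complete, standalone JSON document
    setTimeout(function () {
        socket.send(JSON.stringify({ hello: ['world', 'anotherWorld'], foo: 'cat' }));
    }, 2000);
});
And on the browser side, in place of the fetch() call from the question:
// each message parses cleanly on its own
const socket = new WebSocket('ws://localhost:8081');
socket.onmessage = function (event) {
    const returnInfo = JSON.parse(event.data);
    onReturn(returnInfo);
};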

Sending Many POST requests simultaneously Nodejs

I am new to Nodejs so excuse me for any mistake .. :)
Let me explain what I am trying to do:
Basically I am making a push notification service for my platform .. I will explain further ..
I have two NodeJS servers (using Express):
SERVER 1:
It gets everything needed from the database such as (device registration, identifier ..) and should send it to the second server.
SERVER 2: This server receives a JSON (containing everything I need) to create the FCM and APNS payloads and then sends them to the appropriate provider (FCM, APNS).
What I am using: I am using axios to send POST requests.
The issue: since the 1st server will be sending a big amount of requests (usually 5K or more -- it's dynamic) at the same time, axios cannot handle that, and I've tried many other alternatives to axios but faced the same thing.
My question: How can I send that amount of requests without any issues?
PS: when I send few requests (100 or a bit more) I face no errors ...
I hope everything is clear and I would really appreciate any help.
Code example of the request with Axios:
PS: it always ends up in the "[Request Error] ..." branch.
try {
    axios.post(endpoint, { sendPushRequest })
        .then(response => {
            console.log(response.data);
        })
        .catch(er => {
            if (er.response) {
                console.log("[Response Error] on sending to Dispatcher...");
                // The request was made and the server responded with a status code
                // that falls out of the range of 2xx
                console.log(er.response.data);
                console.log(er.response.status);
                console.log(er.response.headers);
            } else if (er.request) {
                console.log("[Request Error] on sending to Dispatcher...");
                // The request was made but no response was received
                // `er.request` is an instance of XMLHttpRequest in the browser
                // and an instance of http.ClientRequest in Node.js
            } else {
                // Something happened in setting up the request that triggered an Error
                console.log('[Error]', er.message);
            }
            console.log(er.config);
        });
}
catch (e) {
    console.log('[ Catch Error]', e);
}
Usually, for this kind of asynchronous work you should use a queuing service: if the second server gets busy (which it might when handling such a huge number of REST calls), your users would otherwise miss their notifications.
Your flow should be like:
SERVER 1: it gets everything needed from the database such as (device registration, identifier ..) and pushes/publishes it to a queuing service such as RabbitMQ, Google Pub/Sub, etc.
SERVER 2: instead of exposing REST APIs, this server continuously pulls messages from the queue; each message is the JSON (containing everything you need) used to create the FCM and APNS payloads, which are then sent to the appropriate provider (FCM, APNS).
This is beneficial because even if something happens to your server (it gets busy or crashes), the message persists in the queue, and on restarting the server you can still do your work (sending the notification or whatever).
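As a rough sketch of that flow with RabbitMQ and the amqplib package (the queue name and message handling are placeholders, not something from the original answer):
// SERVER 1: publish each push request to a durable queue
// minimal sketch, assumes `npm install amqplib` and a local RabbitMQ
const amqp = require('amqplib');

async function publishPushRequest(sendPushRequest) {
    const conn = await amqp.connect('amqp://localhost');
    const channel = await conn.createChannel();
    await channel.assertQueue('push-requests', { durable: true });
    channel.sendToQueue('push-requests',
        Buffer.from(JSON.stringify(sendPushRequest)),
        { persistent: true }); // survives a broker restart
    await channel.close();
    await conn.close();
}
And on SERVER 2, instead of an Express route:
// SERVER 2: consume from the queue instead of receiving POSTs
const amqp = require('amqplib');

async function consumePushRequests() {
    const conn = await amqp.connect('amqp://localhost');
    const channel = await conn.createChannel();
    await channel.assertQueue('push-requests', { durable: true });
    channel.prefetch(50); // hypothetical cap on unacked messages in flight
    channel.consume('push-requests', async (msg) => {
        const payload = JSON.parse(msg.content.toString());
        // build the FCM/APNS payload from `payload` and send it to the provider here
        channel.ack(msg); // acknowledge only after the send succeeded
    });
}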

Error handling over websockets: a design decision

I'm currently building a webapp that has two clear use cases.
1. Traditional: the client requests data from the server.
2. The client requests a stream from the server, after which the server starts pushing data to the client.
Currently I'm implementing both 1 and 2 using JSON message passing over a websocket. However, this has proven hard, since I need to hand-code lots of error handling because the client is not waiting for the response: it just sends the message hoping it will get a reply sometime.
I'm using JS and React on the frontend and Clojure on the backend.
I have two questions regarding this.
1. Given the current design, what alternatives are there for error handling over a websocket?
2. Would it be smarter to split the two use cases, using REST for UC1 and websockets for UC2? Then I could use something like fetch on the frontend for the REST calls.
Update:
The current problem is not knowing how to build an async send function over websockets that can match sent messages with their response messages.
Here's a scheme for doing request/response over socket.io. You could do this over plain webSocket, but you'd have to build a little more of the infrastructure yourself. This same library can be used in client and server:
function initRequestResponseSocket(socket, requestHandler) {
    var cntr = 0;
    var openResponses = {};

    // send a request
    socket.sendRequestResponse = function(data, fn) {
        // put this data in a wrapper object that contains the request id
        // save the callback function for this id
        var id = cntr++;
        openResponses[id] = fn;
        socket.emit('requestMsg', {id: id, data: data});
    }

    // process a response message that comes back from a request
    socket.on('responseMsg', function(wrapper) {
        var id = wrapper.id, fn;
        if (typeof id === "number" && typeof openResponses[id] === "function") {
            fn = openResponses[id];
            delete openResponses[id];
            fn(null, wrapper.data);
        }
    });

    // process a requestMsg
    socket.on('requestMsg', function(wrapper) {
        if (requestHandler && wrapper.id) {
            requestHandler(wrapper.data, function(responseToSend) {
                socket.emit('responseMsg', {id: wrapper.id, data: responseToSend});
            });
        }
    });
}
This works by wrapping every message sent in a wrapper object that contains a unique id value. Then, when the other end sends its response, it includes that same id value. That id value can then be matched up with the callback response handler for that specific message. It works both ways, from client to server or server to client.
You use this by calling initRequestResponseSocket(socket, requestHandler) once on a socket.io socket connection on each end. If you wish to receive requests, then you pass a requestHandler function which gets called each time there is a request. If you are only sending requests and receiving responses, then you don't have to pass in a requestHandler on that end of the connection.
To send a message and wait for a response, you do this:
socket.sendRequestResponse(data, function(err, response) {
    if (!err) {
        // response is here
    }
});
If you're receiving requests and sending back responses, then you do this:
initRequestResponseSocket(socket, function(data, respondCallback) {
    // process the data here
    // send response
    respondCallback(null, yourResponseData);
});
As for error handling, you can monitor for a loss of connection and you could build a timeout into this code so that if a response doesn't arrive in a certain amount of time, then you'd get an error back.
Here's an expanded version of the above code that implements a timeout for a response that does not come within some time period:
function initRequestResponseSocket(socket, requestHandler, timeout) {
    var cntr = 0;
    var openResponses = {};

    // send a request
    socket.sendRequestResponse = function(data, fn) {
        // put this data in a wrapper object that contains the request id
        // save the callback function for this id
        var id = cntr++;
        openResponses[id] = {fn: fn};
        socket.emit('requestMsg', {id: id, data: data});
        if (timeout) {
            openResponses[id].timer = setTimeout(function() {
                delete openResponses[id];
                if (fn) {
                    fn("timeout");
                }
            }, timeout);
        }
    }

    // process a response message that comes back from a request
    socket.on('responseMsg', function(wrapper) {
        var id = wrapper.id, requestInfo;
        if (typeof id === "number" && typeof openResponses[id] === "object") {
            requestInfo = openResponses[id];
            delete openResponses[id];
            if (requestInfo) {
                if (requestInfo.timer) {
                    clearTimeout(requestInfo.timer);
                }
                if (requestInfo.fn) {
                    requestInfo.fn(null, wrapper.data);
                }
            }
        }
    });

    // process a requestMsg
    socket.on('requestMsg', function(wrapper) {
        if (requestHandler && wrapper.id) {
            requestHandler(wrapper.data, function(responseToSend) {
                socket.emit('responseMsg', {id: wrapper.id, data: responseToSend});
            });
        }
    });
}
There are a couple of interesting things in your question and your design. I prefer to ignore the implementation details and look at the high-level architecture.
You state that you are looking to a client that requests data and a server that responds with some stream of data. Two things to note here:
HTTP 1.1 has options to send streaming responses (chunked transfer encoding). If your use-case is only the sending of streaming responses, this might be a better fit for you. This does not hold when you e.g. want to push messages to the client that are not a response to some sort of request (sometimes referred to as server-sent events).
Websockets, contrary to HTTP, do not natively implement some sort of request-response cycle. You can use the protocol as such by implementing your own mechanism, something that e.g. the subprotocol WAMP does.
As you have found out, implementing your own mechanism comes with its pitfalls; that is where HTTP has the clear advantage. Given the requirements stated in your question, I would opt for the HTTP streaming method instead of implementing your own request/response mechanism.
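For illustration, a minimal sketch of that HTTP streaming option (the /stream route, the newline-delimited JSON framing and the two-second delay are placeholders of my own): the server writes one JSON message per line as the work progresses, and the client reads the chunks with the fetch ReadableStream API instead of waiting for the full body.
// server (Express): each line is one complete JSON message
const express = require('express');
const app = express();

app.get('/stream', (req, res) => {
    res.setHeader('Content-Type', 'application/x-ndjson');
    res.write(JSON.stringify({ progress: 0 }) + '\n');
    setTimeout(() => { // simulate the long-running work
        res.write(JSON.stringify({ progress: 100 }) + '\n');
        res.end();
    }, 2000);
});

app.listen(3000);
On the client, read chunks as they arrive instead of calling response.json():
fetch('/stream').then(response => {
    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    return (function read() {
        return reader.read().then(({ done, value }) => {
            if (done) return;
            // a chunk may hold one or more (possibly partial) NDJSON lines
            console.log(decoder.decode(value, { stream: true }));
            return read();
        });
    })();
});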

Send user details (session token) within every AJAX request (Sencha Touch 2)

I am building a Sencha Touch 2 application with user-specific datasets.
Architecture of the App:
Sencha Touch App <=====> Java Server backend with REST Services
( many AJAX requests =) )
What I actually have is:
Log in the user with username/password
The app gets initialized and the login form comes into play. After submitting the form as an AJAX request, the server backend checks the user data and calls the client callback function.
And what I want to do is:
The callback function should
create a cookie with the session token, or
store the session token within localstorage (http://docs.sencha.com/touch/2-0/#!/api/Ext.data.proxy.LocalStorage), or
store the session token within a JS variable.
Okay, shouldn't be a problem.
But how can I achieve the following:
Most of the data is specific to one user and should be returned by the REST service when needed (by clicking on the navigation, ...). How can I send the session token (see above) with every AJAX request - so the server can provide the suitable datasets (assuming the token is valid)?
Send cookies within AJAX requests
I have already read that cookies get automatically added to the request if the URL is in the same space, right? The Java server is on the same domain (localhost:8080) but the cookies aren't available - except for requests to URLs like 'app.json'. I thought that cross-domain requests are really domain-specific?
Send parameters within AJAX requests
Because the cookies aren't available, I thought about the possibility of 'manually' adding parameters to the AJAX requests. The app will contain many AJAX requests, and that's why I don't want to add the token manually every time - I tried to override the request function of Ext.Ajax, but I failed ;-( :
(function() {
    var originalRequest = Ext.data.Connection.prototype.request;
    Ext.override(Ext.data.Connection, {
        request : function(options) {
            alert("Do sth... like add params");
            return originalRequest.apply(this, options);
        }
    });
})();
ERROR:
Uncaught Error: [ERROR][Ext.data.Connection#request] No URL specified
I also tried to add a listener
Ext.Ajax.add({
    listeners : {
        beforerequest : function( conn, options, eOpts ){
            alert("Do sth... like add params");
        }
    }
});
ERROR:
Uncaught TypeError: Object [object Object] has no method 'add'
Any idea about how I can add the token?
Or any better way of handling this case?
Thanks!
Finally I successfully used:
function addAjaxInterceptor(context)
{
    Ext.Ajax.on('beforerequest', function(conn, options, eOptions)
    {
        // add the param to options...
    }, context);
}
Executed from the app (=> addAjaxInterceptor(this)).
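For completeness, a rough sketch of what that handler body might look like (the sessionToken parameter name and the way the token is stored are placeholders of my own, not from the original post):
function addAjaxInterceptor(context)
{
    Ext.Ajax.on('beforerequest', function(conn, options, eOptions)
    {
        // hypothetical: merge the stored session token into the request params
        options.params = Ext.apply(options.params || {}, {
            sessionToken: context.sessionToken // however your app stores it
        });
    }, context);
}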
But the following solution is more suitable for my situation, I think:
Ext.Ajax._defaultHeaders = {
// params as json
};
(Because Ext.Ajax is a singleton object and I don't change the params for every request.)
