Error: write after end while node - javascript

I am reading data from Cassandra using the stream() function of https://www.npmjs.com/package/cassandra-driver. I am listening to the events and piping the stream to the response object, but I am getting this error:
Error: write after end
This is my code:

const JSONStream = require('JSONStream');

res.write('{');
stringsToAppendToStream.push('"Result":');
const responseStream = stringsToAppendToStream.join(',');
res.write(responseStream);

let streamObject = casssandraClient.stream(generateSQL);
// console.log(streamObject);
streamObject.on('readable', function () {
  let row;
  while (row = this.read()) {
    console.log(row);
    streamObject
      .pipe(JSONStream.stringify())
      .pipe(res);
  }
});
streamObject.on('end', function () {
  console.log('ending');
  res.write('}');
  res.end();
});
I tried the callback suggestion given in another answer, writing the data like this, but it doesn't solve the issue:
res.write(messages, function(err) { res.end(); });
It seems like the issue is with the pipe() call, but I am not sure how to resolve it.

I was able to solve this issue, though the solution may apply to this particular use case only: we don't have to listen for the 'readable' event at all; instead, pipe the stream directly to the response.
let streamObject = casssandraClient.stream(sql);

streamObject
  .pipe(JSONStream.stringify())
  .pipe(res, {
    end: false
  });

streamObject.on('end', function () {
  res.write('}');
  res.end();
});


Socket hangup due to wrong handling of promises

I have a script to move data from one platform to another. The source DB allows only 100 records to be fetched in a single request, so I created a routine to fetch in batches of 100, which works fine, I guess.
Now I try to process each of the 100 records, do the necessary transformations (which involve an axios call to get certain data), and create a record in Firebase Firestore.
When I run this migration in a Firebase Express Node function, I get socket hang up ECONNRESET.
I know this is caused by wrong handling of promises.
Here is what my code looks like:
import { scrollByBatches } from "../helpers/migrations/apiScroll";
import { createServiceLocation } from "../helpers/locations";

const mapServiceLocationData = async (serviceLocation: any, env: string) => {
  try {
    const migratedServiceLocation: any = {
      isMigrated: true,
      id: serviceLocation._id,
    };
    if (serviceLocation.list?.length) {
      await Promise.all(serviceLocation.ids.map(async (id: string) => {
        const { data } = await dbEndPoint.priceMultiplier({ id }); // error says socket hangup on this call
        let multiplierUnit;
        let serviceType;
        if (data.response._id) {
          multiplierUnit = data.response;
          const result = await dbEndPoint.serviceType({ id: multiplierUnit.service_custom_service_type }); // error says socket hangup on this call
          if (result.data.response._id) {
            serviceType = result.data.response.type_text;
            migratedServiceLocation.logs = [...multiplierUnit.history_list_text, ...migratedServiceLocation.logs];
          }
        }
      }));
    }
    await createServiceLocation(migratedServiceLocation); // create record in destination db
  } catch (error) {
    console.log("Error serviceLocation: ", serviceLocation._id, JSON.stringify(error));
  }
  return null; // is this even necessary?
};

export const up = async () => {
  try {
    // get 100 docs from source db => process them => fetch next 100 => and so on...
    await scrollByBatches(dbEndPoint.serviceLocation, async (serviceLocations: any) => {
      await Promise.all(
        serviceLocations.map(async (serviceLocation: any) => {
          await mapServiceLocationData(serviceLocation);
        })
      );
    }, 100);
  } catch (error) {
    console.log("Error", JSON.stringify(error));
  }
  return null; // is this even necessary?
};
The error I get in the Firebase Functions console is the socket hang up / ECONNRESET mentioned above.
For clarity, here is what the fetch-by-batches code looks like:
const iterateInBatches = async (endPoint: any, limit: number, cursor: number, callback: any, resolve: any, reject: any) => {
  try {
    const result = await endPoint({ limit, cursor });
    const { results, remaining }: any = result.data.response;
    if (remaining >= 0) {
      await callback(results);
    }
    if (remaining) {
      setTimeout(() => {
        iterateInBatches(endPoint, limit, (cursor + limit), callback, resolve, reject);
      }, 1000); // wait a second
    } else {
      resolve();
    }
  } catch (err) {
    reject(err);
  }
};

export const scrollByBatches = async (endPoint: any, callback: any, limit: number, cursor: number = 0) => {
  return new Promise((resolve, reject) => {
    iterateInBatches(endPoint, limit, cursor, callback, resolve, reject);
  });
};
What am I doing wrong? I have added comments in the code sections for readability.
Thanks.
There are two cases when socket hang up gets thrown:
When you are a client
When you, as a client, send a request to a remote server, and receive no timely response. Your socket is ended which throws this error. You should catch this error and decide how to handle it: whether to retry the request, queue it for later, etc.
When you are a server/proxy
When you, as a server, perhaps a proxy server, receive a request from a client, then start acting upon it (or relay the request to the upstream server), and before you have prepared the response, the client decides to cancel/abort the request.
I would suggest a number of possibilities for you to try and test that might help you solve your issue of ECONNRESET:

- If you have access to the source database, you could try looking there for logs or metrics. Perhaps you are overloading the service.
- Quick and dirty solution for development: use longjohn; you get long stack traces that will contain the async operations. Clean and correct solution: technically, in Node, whenever you emit an 'error' event and no one listens to it, it will throw the error. To make it not throw, put a listener on it and handle it yourself. That way you can log the error with more information.
- You can also set NODE_DEBUG=net or use strace. They both show you what Node is doing internally.
- You could restart your server and run the connection again; maybe your server crashed or refused the connection, most likely blocked by the User Agent.
- You could also try running this code locally, instead of in Cloud Functions, to see if there is a different result. It's possible that the RSG/Google network is interfering somehow.
- You can also have a look at this GitHub issue and Stack Overflow thread to see the common fixes for the ECONNRESET issue and see if those help resolve it.
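A common trigger for ECONNRESET in setups like this is the inner Promise.all firing all 100 requests (each of which spawns two more API calls) at once, which can exhaust sockets on either end. Here is a minimal sketch of limiting concurrency without a third-party library; `mapWithConcurrency` and `worker` are my names, not from the question:

```javascript
// Process the items in chunks of `concurrency` instead of firing every
// request simultaneously with one big Promise.all.
const mapWithConcurrency = async (items, concurrency, worker) => {
  const results = [];
  for (let i = 0; i < items.length; i += concurrency) {
    const chunk = items.slice(i, i + concurrency);
    // Only `concurrency` requests are in flight at any one time.
    results.push(...(await Promise.all(chunk.map(worker))));
  }
  return results;
};

// Usage sketch with the question's own names:
// await mapWithConcurrency(serviceLocations, 5, (loc) => mapServiceLocationData(loc, env));
```

The same idea could be applied inside mapServiceLocationData to the per-id requests. A library such as p-limit gives finer-grained control, but the chunked loop above is often enough to stop the resets.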

Nodejs: download a file into string via http, using async await syntax

How do I download a file into memory via http in nodejs, without the use of third-party libraries?
This answer solves a similar question, but I don't need to write the file to disk.
You can use the built-in http.get() and there's an example right in the nodejs http doc.
http.get('http://nodejs.org/dist/index.json', (res) => {
  const { statusCode } = res;
  const contentType = res.headers['content-type'];

  let error;
  // Any 2xx status code signals a successful response but
  // here we're only checking for 200.
  if (statusCode !== 200) {
    error = new Error('Request Failed.\n' +
                      `Status Code: ${statusCode}`);
  } else if (!/^application\/json/.test(contentType)) {
    error = new Error('Invalid content-type.\n' +
                      `Expected application/json but received ${contentType}`);
  }
  if (error) {
    console.error(error.message);
    // Consume response data to free up memory
    res.resume();
    return;
  }

  res.setEncoding('utf8');
  let rawData = '';
  res.on('data', (chunk) => { rawData += chunk; });
  res.on('end', () => {
    try {
      const parsedData = JSON.parse(rawData);
      console.log(parsedData);
    } catch (e) {
      console.error(e.message);
    }
  });
}).on('error', (e) => {
  console.error(`Got error: ${e.message}`);
});
This example assumes JSON content, but you can change the processing in the 'end' event handler to treat rawData as plain text, and change the content-type check to whatever type you are expecting.
FYI, this is somewhat lower level code and is not something I would normally use. You can encapsulate it in your own function (perhaps with a promise wrapped around it) if you really don't want to use third party libraries, but most people use higher level libraries for this purpose that just make the coding simpler. I generally use got() for requests like this and there is a list of other libraries (all promise-based) here.

Long running Node REST API takes much time to respond

I have a REST API in Node.js that takes a long time to respond, because it sends a request to a supplier's vendor and only returns the result once the response is fully prepared. What I want is for the front-end React side to be able to display the result as it is being prepared. Thanks in advance for any help and your time.
Here is my controller:
module.exports.search = async (req, res) => {
  try {
    let params = req.query;
    params = _.extend(params, req.body);
    const result = await modules.Hotel.Search.search(req.supplierAuth, params);
    res.json(result);
  } catch (e) {
    global.cli.log('controller:hotels:search: ', e);
    res.status(500).json({ message: e.message });
  }
};
Here is my front-end service:
export const getHotels = (filters, options = {}) => {
  const queryString = new URLSearchParams(options).toString();
  return post(`/api/hotels/search?${queryString}`, filters);
};
The best solution is to use streams and pipe() the results into Express's res object as they come in, similar to the approach in this answer.
You'll have to modify modules.Hotel.Search.search(....) to use streams.
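A hedged sketch of what that modification could look like: `searchStream` below is a stand-in for a reworked modules.Hotel.Search.search that yields hotels one at a time (an assumption; the real function returns everything in a single await, and the names here are mine).

```javascript
// Stand-in for a reworked search that yields partial results as they
// arrive from the supplier, instead of one big result object.
async function* searchStream(/* supplierAuth, params */) {
  yield { name: 'Hotel A' };
  yield { name: 'Hotel B' };
}

// Write each result to the response as soon as it is available, so the
// client can start rendering before the search is complete.
async function streamSearch(res) {
  res.write('{"results":[');
  let first = true;
  for await (const hotel of searchStream()) {
    res.write((first ? '' : ',') + JSON.stringify(hotel));
    first = false;
  }
  res.end(']}');
}
```

In Express, `res` is a writable stream, so streamSearch(res) can be called directly from the controller; the React side would then read the response incrementally (e.g. via fetch and response.body.getReader()) instead of awaiting one JSON payload.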

NodeJS (ExpressJS) Unhandled rejection TypeError: callback is not a function

I've been learning to use Angular 6 and Node.js (with Express.js) these past few days. To do this, I decided to create a front-end with Angular 6 (on which there's a form), with the intention of passing information back and forth to my phpMyAdmin database via an API I created with Node.js.
As you may have seen from the title, I get the following error when I try to submit my form's information:
"Unhandled rejection TypeError: callback is not a function"
The code I used for the post action was taken from another functioning and similar API; the database receives the question_id and user_id but will not register the question's answers (which is the end goal).
Without further ado, here is the code to get and post:
const config = require('../config/config');
const knex = config.db.knex;
const async = require('async');

class TestModel {
  getQuestions(callback) {
    knex('question')
      .select('id', 'libelle')
      .orderBy('id', 'asc')
      .then((data) => {
        callback(data);
      });
  }

  addResponse(reponse, callback) {
    knex('reponse')
      .insert({
        id_user: 1,
        id_question: reponse.id,
        libelle: reponse.answer,
      }).then((data) => {
        callback(data);
      });
  }
}

module.exports = new TestModel();
Here is the rest:
app.post('/quiz', function(req, res, next) {
  var answers = req.body;
  console.log(answers);
  for (var i = 0; i < answers.length; i++) {
    var obj = answers[i];
    Test.addResponse((obj, result) => {
    });
  }
  res.json({ reponseserveur: 'True' });
});
Just for reference, "reponse" means response and "libelle" means label.
Thanks in advance for your help!
The correct syntax is:
Test.addResponse(obj, (result) => {
});
In your code:
Test.addResponse((obj, result) => {})
it looks like you're giving your callback as the first parameter (instead of the response). It should be the second argument. You probably wanted to do:
Test.addResponse(obj, (result) => {})
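The mismatch can be reproduced in a few lines. This is a sketch: the Promise below only stands in for the knex .then() chain, and the data shape is invented for illustration.

```javascript
// addResponse expects (reponse, callback). If the arrow function is
// passed as the FIRST argument, `callback` inside is undefined, and
// invoking it throws "TypeError: callback is not a function".
function addResponse(reponse, callback) {
  // Stand-in for the knex insert; resolves asynchronously like .then()
  Promise.resolve({ inserted: reponse }).then((data) => callback(data));
}

// Wrong: the arrow function is bound to `reponse`, `callback` is undefined:
// addResponse((obj, result) => {}); // -> TypeError: callback is not a function

// Right: data first, callback second:
addResponse({ id: 1, answer: 'yes' }, (data) => {
  console.log(data.inserted.answer); // logs "yes"
});
```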

How to stub https.request response.pipe with sinon.js?

Let's say, that I have this simple code:
var https = require('https');

var options = {
  host: 'openshift.redhat.com',
  port: 443,
  path: '/broker/rest/api',
  method: 'GET'
};

var req = https.request(options, function(response) {
  console.log(response.statusCode);
  response.pipe(/* save stream to file with fs */);
});

req.on('error', function(e) {
  console.error(e);
});

req.end();
Well, I'm a bit new to sinon.js, and I'd like to ask: how do I stub response.pipe()?
Of course, I can make a stub for https.request and return something with .on and .end, and that's easy, but I have no idea how to test whether response.pipe() was called with the proper arguments (the Node.js documentation says that response is passed to a callback).
The documentation is not helpful in this case!
The testing environment is mocha, and I can use chai too.
Please give me some advice or examples.
Thanks, Matt
I wrapped your code into a function that accepts a callback, because in the current implementation we don't actually know when the piping has finished. So assume we have a function like this:
const downloadToFile = function (options, callback) {
  let req = https.request(options, function (err, stream) {
    let writeStream = fs.createWriteStream('./output.json');
    stream.pipe(writeStream);
    // Notify that the content was successfully written into a file
    stream.on('end', () => callback(null));
    // Notify the caller that an error happened
    stream.on('error', err => callback(err));
  });
  req.end();
};
There are three problems to solve:

- As response is a readable stream, we want to mock the data it emits.
- We want to mock the .pipe method to check that we are piping into the right stream.
- We also need to mock the https.request method so it does not make an actual call.
Here is how we can achieve this:
const { PassThrough } = require('stream');

describe('#downloadToFile', () => {
  it('should save the data to output.json', function (callback) {
    const mockResponse = `{"data": 123}`;
    // Using a built-in PassThrough stream to emit the needed data.
    const mockStream = new PassThrough();
    mockStream.push(mockResponse);
    mockStream.end(); // Mark that we pushed all the data.

    // Patch the 'https' module not to make an actual call
    // but to return our stream instead
    sinon.stub(https, 'request', function (options, callback) {
      callback(null, mockStream);
      return { end: sinon.stub() }; // Stub the end method btw
    });
    // Finally, keep track of how 'pipe' is going to be called
    sinon.spy(mockStream, 'pipe');

    downloadToFile({ url: 'http://google.com' }, (err) => {
      // Here you have full control over what happened
      sinon.assert.calledOnce(mockStream.pipe);
      // We can get the stream that we piped to.
      let writable = mockStream.pipe.getCall(0).args[0];
      assert.equal(writable.path, './output.json');
      // Tell mocha that the test is finished. Pass an error if any.
      callback(err);
    });
  });
});
Later you could extract helpers like createMockedStream, or even move all of this preparation into a separate method and keep only the asserts in the test.
From the Sinon documentation, this stubbing syntax has been removed as of v3.0.0:
var stub = sinon.stub(object, "method", func);
Instead you should use:
stub(obj, 'meth').callsFake(fn)
