`EPIPE` error while testing POST request using Jest - javascript

I have a simple server which accepts POST requests and responds with a 413 Request Entity Too Large error if the request body exceeds the allowed size. When I test it with Jest, the request sometimes returns the correct error, but more often it returns an { code: EPIPE, syscall: write } error.
I tested the server using cURL and from a plain Node.js script and it worked as expected, but the Jest tests still give me that error. I also tried other request libraries; they gave the same error.
const axios = require("axios");
// server, testingPort, and host come from the test setup (not shown in the question)

beforeAll(() => {
  server.listen(testingPort);
});

afterAll(() => {
  server.close();
});

describe("POST request", () => {
  test("should return error 413 in response to exceeded file size", async () => {
    const fileContent = new Array(1024 * 1024 + 10).join("*");
    const fileName = "post-test-file.txt";
    const urlObj = new URL(fileName, host);
    try {
      await axios.post(urlObj.href, fileContent);
    } catch (e) {
      console.log(e.code, e.errno, e.syscall); // EPIPE EPIPE write
      // expect(e.response.status).toBe(413);
    }
  });
});
Do you have any thoughts about this?

It seems that the issue is very old and not quite resolved:
https://github.com/nodejs/node/issues/947
For me, the workaround is to send the request body as a stream.
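A minimal sketch of that stream workaround, under the assumption that the server from the question still completes its 413 response before closing the connection; host and the body size are taken from the original snippet, everything else is illustrative:

const { Readable } = require("stream");
const axios = require("axios");

// host and the Jest server setup are the same as in the question above
test("should return error 413 when the oversized body is streamed", async () => {
  const fileContent = new Array(1024 * 1024 + 10).join("*");
  const urlObj = new URL("post-test-file.txt", host);
  try {
    // Streaming the body gives the client a chance to see the early 413
    // response instead of failing the in-flight write with EPIPE.
    await axios.post(urlObj.href, Readable.from([fileContent]));
  } catch (e) {
    expect(e.response.status).toBe(413);
  }
});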

Related

Socket hangup due to wrong handling of promises

I have a script to move data from one platform to another. The source database allows only 100 records to be fetched in a single request, so I created a routine to fetch in batches of 100, which works fine, I guess.
Now I try to process each of the 100 records, do the necessary transformations (which involve axios calls to get certain data), and create a record in Firebase Firestore.
When I run this migration in a Firebase Express Node environment, I get socket hang up ECONNRESET.
I know this is caused by wrong handling of promises.
Here is what my code looks like:
import { scrollByBatches } from "../helpers/migrations/apiScroll";
import { createServiceLocation } from "../helpers/locations";

const mapServiceLocationData = async (serviceLocation: any, env: string) => {
  try {
    const migratedServiceLocation: any = {
      isMigrated: true,
      id: serviceLocation._id,
    };
    if (serviceLocation.list?.length) {
      await Promise.all(serviceLocation.ids.map(async (id: string) => {
        const { data } = await dbEndPoint.priceMultiplier({ id }); // error says socket hangup on this call
        let multiplierUnit;
        let serviceType;
        if (data.response._id) {
          multiplierUnit = data.response;
          const result = await dbEndPoint.serviceType({ id: multiplierUnit.service_custom_service_type }); // error says socket hangup on this call
          if (result.data.response._id) {
            serviceType = result.data.response.type_text;
            migratedServiceLocation.logs = [...multiplierUnit.history_list_text, ...migratedServiceLocation.logs];
          }
        }
      }));
    }
    await createServiceLocation(migratedServiceLocation); // create record in destination db
  } catch (error) {
    console.log("Error serviceLocation: ", serviceLocation._id, JSON.stringify(error));
  }
  return null; // is this even necessary?
};

export const up = async () => {
  try {
    // get 100 docs from source db => process them => fetch next 100 => and so on
    await scrollByBatches(dbEndPoint.serviceLocation, async (serviceLocations: any) => {
      await Promise.all(
        serviceLocations.map(async (serviceLocation: any) => {
          await mapServiceLocationData(serviceLocation);
        })
      );
    }, 100);
  } catch (error) {
    console.log("Error", JSON.stringify(error));
  }
  return null; // is this even necessary?
};
The error I get in the Firebase functions console is the socket hang up (ECONNRESET) mentioned above. For clarity, here is what the fetch-by-batches logic looks like:
const iterateInBatches = async (endPoint: any, limit: number, cursor: number, callback: any, resolve: any, reject: any) => {
  try {
    const result = await endPoint({ limit, cursor });
    const { results, remaining }: any = result.data.response;
    if (remaining >= 0) {
      await callback(results);
    }
    if (remaining) {
      setTimeout(() => {
        iterateInBatches(endPoint, limit, (cursor + limit), callback, resolve, reject);
      }, 1000); // wait a second
    } else {
      resolve();
    }
  } catch (err) {
    reject(err);
  }
};

export const scrollByBatches = async (endPoint: any, callback: any, limit: number, cursor: number = 0) => {
  return new Promise((resolve, reject) => {
    iterateInBatches(endPoint, limit, cursor, callback, resolve, reject);
  });
};
What am I doing wrong? I have added comments in the code sections for readability.
Thanks.
There are two cases when socket hang up gets thrown:
When you are a client
When you, as a client, send a request to a remote server and receive no timely response, your socket is ended, which throws this error. You should catch this error and decide how to handle it: whether to retry the request, queue it for later, etc.
When you are a server/proxy
When you, as a server, perhaps a proxy server, receive a request from a client, then start acting upon it (or relay the request to the upstream server), and before you have prepared the response, the client decides to cancel/abort the request.
I would suggest a number of possibilities for you to try that might help you solve your ECONNRESET issue:
If you have access to the source database, you could try looking there for some logs or metrics. Perhaps you are overloading the service.
Quick and dirty solution for development: use longjohn; you get long stack traces that will contain the async operations. Clean and correct solution: technically, in Node, whenever you emit an 'error' event and no one listens to it, the error is thrown. To make it not throw, put a listener on it and handle it yourself. That way you can log the error with more information.
You can also set NODE_DEBUG=net or use strace. They both show you what Node is doing internally.
You could restart your server and run the connection again; maybe your server crashed or refused the connection, most likely blocked by the User Agent.
You could also try running this code locally instead of in Cloud Functions to see if there is a different result. It's possible that the RSG/Google network is interfering somehow.
You can also have a look at this GitHub issue and Stack Overflow thread to see the common fixes for the ECONNRESET issue and see if those help.
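For the client-side case described at the top of this answer (catch the error and decide whether to retry), a minimal retry wrapper might look like the sketch below. dbEndPoint.priceMultiplier is the call from the question; the attempt count, delay, and the error check are arbitrary choices of mine, not something the platform requires:

const withRetry = async (fn, attempts = 3, delayMs = 1000) => {
  try {
    return await fn();
  } catch (err) {
    // Only retry transient socket errors; rethrow everything else.
    const transient = err.code === "ECONNRESET" || /socket hang up/i.test(err.message || "");
    if (attempts <= 1 || !transient) {
      throw err;
    }
    await new Promise((resolve) => setTimeout(resolve, delayMs));
    return withRetry(fn, attempts - 1, delayMs);
  }
};

// Hypothetical usage inside mapServiceLocationData:
// const { data } = await withRetry(() => dbEndPoint.priceMultiplier({ id }));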

AstraDB failed GET request

I have been working on a TikTok clone app. I created my database with Astra DB and set up two functions inside a functions folder to test whether my posts are working. I am using netlify dev to test the application. But when I navigate to http://localhost:8888/.netlify/functions/addData,
I get this failed GET request error:
Request from ::1: GET /.netlify/functions/addData
Error: Request Failed: [object Object]
Stack Trace: Request failed with status code 401
at axiosRequest (D:\tiktokclone\node_modules\@astrajs\rest\src\rest.js:126:11)
at processTicksAndRejections (internal/process/task_queues.js:93:5)
at async AstraClient._request (D:\tiktokclone\node_modules\@astrajs\rest\src\rest.js:199:22)
at async AstraClient.put (D:\tiktokclone\node_modules\@astrajs\rest\src\rest.js:263:12)
at async AstraCollection._put (D:\tiktokclone\node_modules\@astrajs\collections\src\collections.js:69:22)
at async Object.exports.handler (D:\tiktokclone\functions\addData.js:17:9)
Response with status 500 in 231 ms.
I don't quite understand what causes this. All the credentials inside my .env file are correct. Here is the code I used to make the request:
const { createClient } = require("@astrajs/collections");

const collection = "posts";

exports.handler = async function (event, context, callback) {
  const astraClient = await createClient({
    astraDatabaseId: process.env.ASTRA_DB_ID,
    astraDatabaseRegion: process.env.ASTRA_DB_REGION,
    applicationToken: process.env.ASTRA_DB_APPLICATION_TOKEN,
  });
  console.log(astraClient);
  console.log(collection);
  console.log("Hello");

  const posts = astraClient
    .namespace(process.env.ASTRA_DB_KEYSPACE)
    .collection(collection);

  try {
    await posts.create("a post", {
      title: "my first post",
    });
    return {
      statusCode: 200,
    };
  } catch (e) {
    console.error(e);
    return {
      statusCode: 500,
      body: JSON.stringify(e),
    };
  }
};
I found a fix. For some reason, calling the API with an application token was giving me the 401 error. When I used a username and password, it worked.
const astraClient = await createClient({
  astraDatabaseId: process.env.ASTRA_DB_ID,
  astraDatabaseRegion: process.env.ASTRA_DB_REGION,
  username: process.env.ASTRA_DB_USERNAME,
  password: process.env.ASTRA_DB_PASSWORD,
});
The username is the client ID and the password is the client secret. This error happened because of a slight confusion between the REST API and the Document API: Astra DB uses an application token for authenticating the Document API, while the REST API uses a client ID and password.

How to print the request logs (like request url,request body, request queryParam) on supertest failure?

I want to print or console.log the request details below when a Supertest expect fails; for the request below, nothing needs to be printed on success.
const result = await request(dummy_url)
  .get("repos/Microsoft/TypeScript/pulls")
  .set("user-agent", "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:87.0) Gecko/20100101 Firefox/87.0")
  .expect(200)
  .then(response => {
    console.log("success");
    console.log(response);
    return response;
  }).catch(error => {
    console.log("error");
    // console.log(error);
    return 2;
  });
console.log(result);
done();
On failure (let's say I modify the URL to dummy instead of pulls), I need to know the request URL, path params, and request body, if any.
Currently, writing it the way shown above only gives me the error below:
Error: expected 200 "OK", got 404 "Not Found"
at Object.<anonymous> (/Users/thoughtworks/projects/api-test-supertest-jest-typescript/__tests__/github-routes/jest.test.ts:40:8)
at /Users/thoughtworks/projects/api-test-supertest-jest-typescript/node_modules/jest-jasmine2/build/queueRunner.js:45:12
at new Promise (<anonymous>)
at mapper (/Users/thoughtworks/projects/api-test-supertest-jest-typescript/node_modules/jest-jasmine2/build/queueRunner.js:28:19)
at /Users/thoughtworks/projects/api-test-supertest-jest-typescript/node_modules/jest-jasmine2/build/queueRunner.js:75:41
at processTicksAndRejections (node:internal/process/task_queues:96:5)
----
at Test.Object.<anonymous>.Test._assertStatus (/Users/thoughtworks/projects/api-test-supertest-jest-typescript/node_modules/supertest/lib/test.js:296:12)
at /Users/thoughtworks/projects/api-test-supertest-jest-typescript/node_modules/supertest/lib/test.js:80:15
at Test.Object.<anonymous>.Test._assertFunction (/Users/thoughtworks/projects/api-test-supertest-jest-typescript/node_modules/supertest/lib/test.js:311:11)
at Test.Object.<anonymous>.Test.assert (/Users/thoughtworks/projects/api-test-supertest-jest-typescript/node_modules/supertest/lib/test.js:201:21)
at localAssert (/Users/thoughtworks/projects/api-test-supertest-jest-typescript/node_modules/supertest/lib/test.js:159:12)
at fn (/Users/thoughtworks/projects/api-test-supertest-jest-typescript/node_modules/supertest/lib/test.js:156:5)
at Test.callback (/Users/thoughtworks/projects/api-test-supertest-jest-typescript/node_modules/superagent/src/node/index.js:902:3)
at fn (/Users/thoughtworks/projects/api-test-supertest-jest-typescript/node_modules/superagent/src/node/index.js:1130:18)
at IncomingMessage.<anonymous> (/Users/thoughtworks/projects/api-test-supertest-jest-typescript/node_modules/superagent/src/node/parsers/json.js:19:7)
at Stream.emit (node:events:365:28)
Things to note: I am using this in TypeScript, inside an async function, though that is not a blocker for now.
After multiple attempts, I was able to come up with a function that logs the request details. I pass it the response from Supertest along with the expected status code.
On failure, the function logs the details of the request and response, such as the path params, query params, and request body.
import SuperTest from "supertest";

export const checkStatusCode = (res: any, expectedStatus: any = 200): SuperTest.Response => {
  if (res.status === expectedStatus) {
    return res;
  }
  const error = res.error;
  const reqData = JSON.parse(JSON.stringify(res)).req;
  throw new Error(`
    request-method  : ${JSON.stringify(reqData.method)}
    request-url     : ${JSON.stringify(reqData.url)}
    request-data    : ${JSON.stringify(reqData.data)}
    request-headers : ${JSON.stringify(reqData.headers)}
    response-status : ${JSON.stringify(res.status)}
    response-body   : ${JSON.stringify(res.body)}
  `);
};
Usage in a Jest test file:
describe("Jest - Api - user", () => {
it("Verify POST ", async () => {
const res = await request(url.someurl)
.post("/dummy")
.set("Authorization", authToken)
.send(updateThirdParty)
checkStatusCode(res, 200)
})
})
The solution is inspired by one of the suggestions in a Supertest GitHub issue.
Thanks to sakovias.
Note: this is a function that logs the data; we could still wrap the expect itself, which I will post as a separate thread.

How do I catch aws-sdk errors that occur outside of my call stack?

The Situation
I'm using the aws-sdk to interact with an S3 bucket.
If I don't have the proper credentials set up, the sdk appropriately complains. However, its method of complaint is an error that is thrown outside of my call stack. I want to be able to catch that error and handle it gracefully.
The Problem
Here is a little script that causes the problem.
import { S3 } from 'aws-sdk';

try {
  const s3 = new S3();
  s3.createPresignedPost({}, (err, data) => {
    console.log('sup dog');
  });
} catch (err) {
  console.log('KABOOM!');
}
I would expect this to catch any errors thrown by s3.createPresignedPost and trigger the catch, but what actually happens is sup dog is posted, and then node crashes with a stack trace that points to the aws-sdk.
sup dog
./node_modules/aws-sdk/lib/services/s3.js:1241
throw new Error('Unable to create a POST object policy without a bucket,'
^
Error: Unable to create a POST object policy without a bucket, region, and credentials
at features.constructor.preparePostFields (./node_modules/aws-sdk/lib/services/s3.js:1241:13)
at finalizePost (./node_modules/aws-sdk/lib/services/s3.js:1204:22)
at ./node_modules/aws-sdk/lib/services/s3.js:1221:24
at finish (./node_modules/aws-sdk/lib/config.js:386:7)
at ./node_modules/aws-sdk/lib/config.js:428:9
at Object.<anonymous> (./node_modules/aws-sdk/lib/credentials/credential_provider_chain.js:111:13)
at Object.arrayEach (./node_modules/aws-sdk/lib/util.js:516:32)
at resolveNext (./node_modules/aws-sdk/lib/credentials/credential_provider_chain.js:110:20)
at ./node_modules/aws-sdk/lib/credentials/credential_provider_chain.js:126:13
at ./node_modules/aws-sdk/lib/credentials.js:124:23
error Command failed with exit code 1.
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
The Question
The solution here of course is to properly configure the aws sdk with credentials, but I would like to gracefully handle cases where that hasn't happened by catching the error and preventing a hard crash.
How can I use the callback pattern of createPresignedPost without risking system crash?
I have created this Lambda handler:
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

exports.handler = async (event, context) => {
  // console.log('Received event:', JSON.stringify(event, null, 2));
  let body;
  let statusCode = '200';
  const headers = {
    'Content-Type': 'application/json',
  };
  var params = {
    Bucket: 'bucketname',
    Fields: {
      key: 'example.pdf'
    }
  };
  try {
    s3.createPresignedPost(params, (err, data) => {
      body = "successfully";
    });
  } catch (err) {
    statusCode = '400';
    body = err.message;
  } finally {
    body = JSON.stringify(body);
  }
  return {
    statusCode,
    body,
    headers,
  };
};
To test the success case, run the code as is. To test the error case, comment out one of the params keys (either Bucket or key); it will throw the error, which is then caught.
Response:
{
  "statusCode": "400",
  "body": "\"Unable to create a POST object policy without a bucket, region, and credentials\"",
  "headers": {
    "Content-Type": "application/json"
  }
}
This turned out to be a full-blown bug in the AWS SDK; I issued a patch, which was merged.
So, the answer to this question is that it wasn't possible to catch that error... but now you don't have to.
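Assuming the patched SDK now delivers this failure through the callback's err argument rather than throwing (my reading of the answer above), a small promisified wrapper turns it into a normal rejection; the helper name and bucket name below are placeholders of mine:

const { S3 } = require('aws-sdk');

// Wrap the callback-style API in a Promise so the error surfaces as a rejection.
const createPresignedPostAsync = (s3, params) =>
  new Promise((resolve, reject) => {
    s3.createPresignedPost(params, (err, data) => (err ? reject(err) : resolve(data)));
  });

const s3 = new S3();
createPresignedPostAsync(s3, { Bucket: 'example-bucket', Fields: { key: 'example.pdf' } })
  .then((data) => console.log('presigned post url:', data.url))
  .catch((err) => console.error('could not create presigned post:', err.message));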

How to read JSON files with pure javascript on a server side using node.js?

I have very little experience working with Node.js and jQuery and have been searching for the last few hours for a solution. I have an API from openweathermap.com that returns weather information in JSON format, and I am trying to pull the temperature value.
I am using Node.js to run a program that can be accessed from any device on the network. I previously used jQuery on the client to read the file with $.getJSON, but I am in the process of transferring most of my code to the server side to avoid needing a browser open at all times for the program to run properly. Obviously you can't use jQuery with Node.js, but I tried server adaptations for Node.js, including cheerio, jsdom, and a standard jQuery add-on, and none of them would do the trick. I can't use XMLHttpRequest or http.get because it's being run server side, and I can't simply use JSON.parse because it is pulling from a website.
How can I pull the data from the website, store it as an object, and then pull data from it using just pure JavaScript?
Here is what I originally had running on the client:
var updateWeather = function () {
  $.getJSON('http://api.openweathermap.org/data/2.5/weather?id=5802340&units=imperial&appid=80e9f3ae5074805d4788ec25275ff8a0&units=imperial', function (data) {
    socket.emit("outsideTemp", data.main.temp);
  });
};
updateWeather();

<head>
  <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.4.0/jquery.min.js"></script>
</head>
Node.js natively supports JSON, so no "special" work is needed. I would recommend using an HTTP client that makes our life easier, like axios, but you can do this natively. I have provided two snippets below to get you started:
Using a popular HTTP client (axios)
const axios = require('axios');

axios.get('http://api.openweathermap.org/data/2.5/weather?id=5802340&units=imperial&appid=80e9f3ae5074805d4788ec25275ff8a0&units=imperial').then((res) => {
  console.log(res.data);
});
Plain NodeJS (taken from the NodeJS Docs)
const http = require('http');

http.get('http://api.openweathermap.org/data/2.5/weather?id=5802340&units=imperial&appid=80e9f3ae5074805d4788ec25275ff8a0&units=imperial', (res) => {
  const { statusCode } = res;
  const contentType = res.headers['content-type'];

  let error;
  if (statusCode !== 200) {
    error = new Error('Request Failed.\n' +
      `Status Code: ${statusCode}`);
  } else if (!/^application\/json/.test(contentType)) {
    error = new Error('Invalid content-type.\n' +
      `Expected application/json but received ${contentType}`);
  }
  if (error) {
    console.error(error.message);
    // Consume response data to free up memory
    res.resume();
    return;
  }

  res.setEncoding('utf8');
  let rawData = '';
  res.on('data', (chunk) => { rawData += chunk; });
  res.on('end', () => {
    try {
      const parsedData = JSON.parse(rawData);
      console.log(parsedData);
    } catch (e) {
      console.error(e.message);
    }
  });
}).on('error', (e) => {
  console.error(`Got error: ${e.message}`);
});
Many people also use request / request-promise with Node:
const req = require('request-promise');

req.get({
  uri: 'http://api.openweathermap.org/data/2.5/weather?id=5802340&units=imperial&appid=80e9f3ae5074805d4788ec25275ff8a0&units=imperial',
  json: true
}).then(e => { console.log(e.coord); });
