Node.js / Axios: timeout issues linked to a high level of concurrency

I've set up a Node.js server which basically receives an ad request, sends ad requests through Axios to multiple endpoints with a timeout (usually 1000 ms), and then parses and sends back all the results.
On each pod, around 600 requests per minute are performed to various external endpoints.
When we start the pod, requests run pretty well at the beginning, then after 1 or 2 minutes all requests return as timed out...
I use Promise.all to manage the concurrent requests; you'll find below the adapterRequest component in charge of sending requests.
I also tried sending some data within my cluster and got a timeout there as well, which makes me confident that the issue is linked to Axios.
Getting into the details, I create arrays of promises with the following module, then I use Promise.all to fetch the data:
const axios = require('axios');
const fakeServer = require('../test/fake-server');

let promiseCount = 0;

module.exports = (ssp, payload, endpoint, method, timeout) => new Promise((resolve) => {
  const cmd = process.env.NODE_ENV === 'test' ? fakeServer : axios;
  const start = Date.now();
  const config = {
    ssp,
    url: endpoint,
    method,
    timeout,
    header: {
      'Content-Type': 'application/json; charset=utf-8',
      Accept: 'application/json',
      Connexion: 'keep-alive',
    },
    data: payload,
  };
  cmd(config)
    .then((response) => {
      promiseCount += 1;
      console.log(ssp, 'RESPONSEOK', promiseCount);
      resolve({
        ssp,
        uri: config.url,
        requestbody: payload,
        requestheaders: config.header,
        responsebody: response.data,
        status: response.status,
        responsetimemillis: Date.now() - start,
      });
    })
    .catch((error) => {
      promiseCount += 1;
      console.log(ssp, error.code, promiseCount);
      let responsebody;
      let status;
      if (error.response === undefined) {
        responsebody = undefined;
        status = undefined;
      } else {
        responsebody = error.response.data ? error.response.data : error.message;
        status = error.response.status;
      }
      resolve({
        ssp,
        uri: config.url,
        requestbody: payload,
        requestheaders: config.header,
        responsebody,
        status,
        responsetimemillis: Date.now() - start,
      });
    });
});
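The Promise.all side isn't shown in the question; a minimal sketch of how such a module is typically consumed might look like this (adapterRequest, fetchAllBids and bidders are illustrative names, not from the original):

// Hypothetical caller: one promise per endpoint, then Promise.all.
const adapterRequest = require('./adapter-request'); // the module above, name assumed

async function fetchAllBids(bidders, payload) {
  const requests = bidders.map((b) =>
    adapterRequest(b.ssp, payload, b.endpoint, 'post', 1000));
  // Every request resolves (errors are caught and resolved inside the module),
  // so Promise.all never rejects here.
  return Promise.all(requests);
}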
Should I try modules such as agentkeepalive, with one Axios instance per endpoint?
Thanks!

Just for debugging, before you try another module: inside the axios config, could you change header to headers and add keep-alive agents, like this?
const http = require('http')
const https = require('https')

const config = {
  ssp,
  url: endpoint,
  method,
  timeout,
  headers: {
    'Content-Type': 'application/json; charset=utf-8',
    Accept: 'application/json',
  },
  data: payload,
  // keepAlive pools and reuses TCP connections, so it's faster
  httpAgent: new http.Agent({ keepAlive: true }),
  httpsAgent: new https.Agent({ keepAlive: true }),
}
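Going one step further, the approach the question hints at (one shared axios instance, optionally one per endpoint) also helps, because the agents and their keep-alive pools are created once and reused instead of being rebuilt on every call. A minimal sketch using the built-in agents; agentkeepalive could be swapped in the same way:

const axios = require('axios');
const http = require('http');
const https = require('https');

// Created once at module load and reused for every request,
// so the keep-alive connection pool is actually shared between calls.
const client = axios.create({
  timeout: 1000,
  httpAgent: new http.Agent({ keepAlive: true, maxSockets: 100 }),   // maxSockets is an illustrative value
  httpsAgent: new https.Agent({ keepAlive: true, maxSockets: 100 }),
});

module.exports = client;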


How to sign an API Gateway URL that has a query string

I am trying to use SignatureV4 to sign a request for an API Gateway endpoint that uses an IAM authorizer. The issue is that I keep getting a 403 error whenever I append a query string to my URL, i.e. /pets?type=1. Everything works fine when a query string is not included, i.e. /pets.
This is how I build a request:
// Imports assumed; the original snippet does not show them.
import { HttpRequest } from '@aws-sdk/protocol-http'
import { SignatureV4 } from '@aws-sdk/signature-v4'
import { Sha256 } from '@aws-crypto/sha256-js'
import AWS from 'aws-sdk'

const region = 'xxx'
const method = 'GET'
const protocol = 'https:'
const host = `xxx.execute-api.${region}.amazonaws.com`
const path = '/dev/pets'
const query = {
  type: 1,
}
const request = new HttpRequest({
  method: method,
  protocol: protocol,
  hostname: host,
  path: path,
  query: query,
  headers: {
    'Content-Type': 'application/json',
    host: host,
  },
})
const signer = new SignatureV4({
  credentials: AWS.config.credentials,
  service: 'execute-api',
  region: region,
  sha256: Sha256,
})
const { headers } = await signer.sign(request)
const response = await fetch(`${protocol}//${host}${path}?type=1`, {
  headers,
  method
}).then((res) => res.json())
I've tried running the same query within Postman and it worked just fine, so I have to assume the issue is with my implementation.
The issue was due to getCanonicalQuery ignoring values that are not strings or arrays of strings:
https://github.com/aws/aws-sdk-js-v3/blob/main/packages/signature-v4/src/getCanonicalQuery.ts#L19
So, to fix this I had to change my query to the following:
const query = {
  type: '1',
}
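As a small follow-up (not part of the original answer): building the fetch URL from the same query object that gets signed keeps the canonical query and the actual request from drifting apart. A sketch reusing request, signer, protocol, host, path and method from the question, assuming the corrected string-valued query above:

// Derive the request URL from the signed request itself.
const qs = new URLSearchParams(request.query).toString()   // "type=1"

const { headers } = await signer.sign(request)
const response = await fetch(`${protocol}//${host}${path}?${qs}`, {
  headers,
  method,
}).then((res) => res.json())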

JavaScript fetch (POST) to Express server fails. The server does not receive the request from JS, but receives the request from Postman

MRE repos: node-server, react app
When I send a POST request using Postman, I get the expected result. This is the request that I am sending using Postman,
and test sent gets printed to the console of my Node server.
If I send a request from my React form, however, test sent does not print to the console; instead the catch block of my fetch request gets executed and err is printed to the console of my React app, followed by {}.
I would like to know why my POST request is not working and is not getting received by the server.
Below is the function that I call when someone clicks the submission button of my form created in React.
Function called on form submission
nodeUrl = 'https://localhost:6060?'

const submitData = async () => {
  fetch(nodeUrl, {
    method: 'POST',
    headers: {
      'Accept': 'application/json',
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({'test': 'test'})
  }).then((res) => {
    alert('then')
  }).catch((err) => {
    alert('err')
    alert(JSON.stringify(err))
  })
}
This is the server that I run using node server.js
server.js
server.post('/', function(req, res) {
  console.log('test sent')
  mailer.messages().send(req.body)
    .then((mes) => {
      console.log(mes)
      res.json({ message: 'Thanks for your message. Our service team has been notified and will get back to you shortly.' })
    }).catch(err => {
      console.log(err)
      res.json(err);
    })
});
The major issue here is CORS. Enabling CORS support on the server overcomes this; just keep in mind to enable it only in development mode (see the code below).
Strategy 1 (multipart/form-data):
As per the Postman snapshot and the provided GitHub repositories, the request from the front end should be of multipart/form-data type, so the front-end code would look like this:
const nodeUrl = "http://localhost:6060/";

const submitData = async () => {
  // create a FormData object
  const formData = new FormData();
  formData.append('from', 'example@email.com');
  formData.append('to', 'example@email.com');
  // this auto-adds the 'multipart/form-data' Content-Type with a boundary to the request
  fetch(nodeUrl, {
    method: "POST",
    body: formData
  })
    .then(res => {
      console.log(res);
    }).catch(err => {
      console.log('Error -', err);
    });
};
To handle a multipart/form-data request in Express, you need the Multer middleware.
const express = require('express');
const bodyParser = require('body-parser');
const multer = require('multer'); // for 'multipart' type requests

const server = express();
const upload = multer();

// allow CORS requests in development mode
if (process.env.NODE_ENV === 'development') {
  // Server run command - "NODE_ENV=development node server.js"
  const cors = require('cors');
  server.use(cors());
}

server.use(bodyParser.json());
server.use(bodyParser.urlencoded({extended: true}));

// use the Multer middleware for extracting the 'FormData' payload
server.post('/', upload.none(), function(req, res) {
  console.log('Received body', req.body);
  ... // other codes
});
Strategy 2 (plain JSON):
If the multipart/form-data strategy was unintentional and you just want to send simple JSON, use the code below.
In the front end, trigger the API request as:
fetch(nodeUrl, {
  method: "POST",
  headers: {
    'Content-Type': 'application/json', // this needs to be defined
  },
  body: JSON.stringify({ from: 'some@email.com', to: 'other@email.com' })
})
In the server, just ignore the Multer-related code and keep your API as:
server.post('/', function(req, res) {
  console.log('Received body', req.body);
  ... // other codes
});
I ended up using a better fetch request, which was put together for me by selecting Code -> JavaScript - Fetch in Postman (under the Save button):
var myHeaders = new Headers();
myHeaders.append("Content-Type", "application/x-www-form-urlencoded");

var urlencoded = new URLSearchParams();
urlencoded.append("from", "example@email.com");
urlencoded.append("test", "test");

var requestOptions = {
  method: 'POST',
  headers: myHeaders,
  body: urlencoded,
  redirect: 'follow'
};

fetch("http://localhost:6060/", requestOptions)
  .then(response => {
    if (response.ok) {
      response.json().then(json => {
        console.log(json)
      })
    }
  })
  .catch(error => console.log('error: ', error))
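As a side note (not from the original thread): this URLSearchParams body works with the Express server from the answer above because bodyParser.urlencoded({ extended: true }) is already registered, so req.body gets populated from the form-encoded payload. Roughly:

// Server side: parses application/x-www-form-urlencoded bodies into req.body.
server.use(bodyParser.urlencoded({ extended: true }));

server.post('/', function (req, res) {
  // With the URLSearchParams body above, req.body is
  // { from: 'example@email.com', test: 'test' }
  console.log('Received body', req.body);
  res.json({ received: req.body });
});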

Don't get expected result from Node Fetch POST

I am trying to track my order packages, so I have the package tracking number.
Maybe the form is not submitting.
I need to get the result from the expected page, but it seems I'm still on the base_url.
code:
const fetch = require("node-fetch");
const base_url = "https://www2.correios.com.br/sistemas/rastreamento/";
const data = { acao: "track", objetos: "OD769124717BR", btnPesq: "Buscar" };
fetch(base_url, {
method: "POST",
body: JSON.stringify(data),
headers: {
acceptEncoding: "gzip, deflate, br",
connections: "keep-alive",
},
})
.then((results) => results.text())
.then(console.log);
the source of the form data:
acao=track&objetos=OD729124717BR&btnPesq=Buscar
Have you tried adding a catch to the fetch? If you do, you will see that it errors with the message "Failed to fetch". I've added this to your existing example so you can try it for yourself:
const fetch = require("node-fetch");
const base_url = "https://www2.correios.com.br/sistemas/rastreamento/";
const data = { acao: "track", objetos: "OD769124717BR", btnPesq: "Buscar" };
fetch(base_url, {
method: "POST",
body: JSON.stringify(data),
headers: {
acceptEncoding: "gzip, deflate, br",
connections: "keep-alive",
},
})
.then((results) => results.text())
.then(console.log)
.catch(error => console.error("Error:", error.message));
I would recommend doing some simple testing with cURL on the command line, or with a GUI tool such as Postman or SoapUI, to make sure you have a valid URL and data parameters when testing this endpoint.
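One more observation, not from the original answer: the page's form source shown above (acao=track&objetos=…&btnPesq=Buscar) is application/x-www-form-urlencoded, while the snippet posts a JSON string with no Content-Type, so the server most likely never sees the parameters. A sketch of posting the same fields as a form body, assuming the endpoint otherwise accepts scripted requests:

const fetch = require("node-fetch");

const base_url = "https://www2.correios.com.br/sistemas/rastreamento/";

// Same fields as the page's form, sent as application/x-www-form-urlencoded.
const form = new URLSearchParams({
  acao: "track",
  objetos: "OD769124717BR",
  btnPesq: "Buscar",
});

fetch(base_url, {
  method: "POST",
  body: form, // node-fetch sets the urlencoded Content-Type for URLSearchParams bodies
})
  .then((res) => res.text())
  .then(console.log)
  .catch((err) => console.error("Error:", err.message));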

npm start gets 401 and node app.js gets 200 response

I have a React project that I run with npm start, and this code gets a 401 error from the second fetch (the first one is OK). It runs fine, returning 200, only with Node, as in "node App.js".
So what would I need to do to get a 200 response when running my React project? Why is there this difference between npm and node for this request?
const clientID = <ClientID>
const clientSecret = <ClientSecret>
const encode = Buffer.from(`${clientID}:${clientSecret}`, 'utf8').toString('base64')

const requestOptions = {
  method: 'POST',
  headers: {
    'Content-Type': 'application/x-www-form-urlencoded',
    'Authorization': `Basic ${encode}`,
  },
};

fetch("https://auth-nato.auth.us-east-1.amazoncognito.com/oauth2/token?grant_type=client_credentials", requestOptions)
  .then(response => { return response.json() })
  .then(data => {
    const requestOptions2 = {
      method: 'POST',
      mode: 'no-cors',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${data.access_token}`
      },
      body: '{"username":"Ana", "password":"test123","user_id":"ana@email.com"}'
    };
    fetch('https://j1r07lanr6.execute-api.sa-east-1.amazonaws.com/v1/register', requestOptions2)
      .then(response => { console.log(response) });
  })
Buffer is not present in the browser's JavaScript.
Instead of
const encode = Buffer.from(`${clientID}:${clientSecret}`, 'utf8').toString('base64')
use just
const encode = btoa(`${clientID}:${clientSecret}`);
Read more about base64 encoding on MDN.
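A small aside (not from the original answers): if the same code has to run both under Node (node App.js) and in the browser bundle, a guarded helper avoids depending on whichever global happens to exist; encodeBasicAuth is a hypothetical name:

// Hypothetical helper: base64-encode credentials in either environment.
function encodeBasicAuth(clientID, clientSecret) {
  const raw = `${clientID}:${clientSecret}`;
  // Browsers expose btoa; Node exposes Buffer.
  return typeof btoa === 'function'
    ? btoa(raw)
    : Buffer.from(raw, 'utf8').toString('base64');
}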
I found out it was a CORS issue that needed to be configured correctly on the back end. My workaround was disabling Chrome web security and removing mode: 'no-cors'.
I had tried adding "Access-Control-Allow-Origin": "http://localhost:3000" to the request headers, but it doesn't work.
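To make that last point concrete (this is an illustration, not the poster's actual API Gateway setup): Access-Control-Allow-Origin is a response header, so it has to be sent by the back end; adding it to the request headers in the browser has no effect. An Express-style sketch of what a correctly configured back end returns:

// Illustrative only: the real back end in the question is API Gateway, where the
// equivalent headers are configured on the method/integration responses.
const express = require('express');
const app = express();

app.use((req, res, next) => {
  res.set('Access-Control-Allow-Origin', 'http://localhost:3000');
  res.set('Access-Control-Allow-Headers', 'Content-Type, Authorization');
  res.set('Access-Control-Allow-Methods', 'POST, OPTIONS');
  if (req.method === 'OPTIONS') return res.sendStatus(204); // answer preflight directly
  next();
});

app.post('/v1/register', express.json(), (req, res) => res.json({ ok: true }));

app.listen(4000);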

Increasing maxContentLength and maxBodyLength in Axios

I am using "Axios" to call a WCF method that takes as parameter file information and content.
The file is read and sent as a base64 encoded string.
My issue is that when the file size exceeds a certain limit, AXIOS throws an exception: "Error: Request body larger than maxBodyLength limit".
I looked up the issue and found that all solutions suggest increasing the maxContentLength / maxBodyLength parameters in the AXIOS configuration object, but did not succeed.
Find Below an implemented test case in node.js:
var axios = require('axios');
var fs = require('fs');
var path = require('path')
var util = require('util')

let readfile = util.promisify(fs.readFile)

async function sendData(url, data) {
  let params = data
  let resp = await axios({
    method: 'post',
    url: url,
    data: JSON.stringify(params),
    headers: { 'Accept': 'application/json', 'Content-Type': 'application/json' }
    // maxContentLength: 100000000,
    // maxBodyLength: 1000000000
  }).catch(err => {
    throw err;
  })
  return resp;
}

async function ReadFile(filepath) {
  try {
    let res = await readfile(filepath, 'base64')
    let filename = path.basename(filepath).split('.').slice(0, -1).join('.')
    let ext = path.extname(filepath)
    return { data: res, fext: ext, fname: filename }
  }
  catch (err) {
    throw err
  }
}

(async () => {
  try {
    let img = await ReadFile('Files/1.pdf')
    let res = await sendData('http://183.183.183.242/EMREngineEA/EMRWS.svc/web/EMR_TestUploadImg', img)
    console.log(res)
  }
  catch (ex) {
    console.log(ex)
  }
})();
In my case, the PDF file is 20 MB; upon running, an error is thrown:
"Error: Request body larger than maxBodyLength limit"
I tried setting maxContentLength: 100000000 and maxBodyLength: 1000000000 as shown above, but did not succeed.
Your help is appreciated.
maxBodyLength seems to work for me in this simple test, where I upload data to a local Express server. If I try to upload more than maxBodyLength, I get the same error you're getting, so I suspect there's something more going on in your case, like a redirect, that's triggering the error.
There is an issue logged for axios here that seems to reference the problem; it suggests setting maxContentLength to Infinity (as the other commenter suggests),
e.g.
maxContentLength: Infinity,
maxBodyLength: Infinity
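Applied to the sendData function from the question, that suggestion would look roughly like this (only the two limit options change):

const axios = require('axios');

// The question's sendData with the limit options un-commented and set to Infinity.
async function sendData(url, data) {
  let resp = await axios({
    method: 'post',
    url: url,
    data: JSON.stringify(data),
    headers: { 'Accept': 'application/json', 'Content-Type': 'application/json' },
    maxContentLength: Infinity, // per the linked axios issue
    maxBodyLength: Infinity,
  })
  return resp;
}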
Test code below:
const axios = require("axios");
function generateRandomData(size) {
const a = Array.from({length: size}, (v, k) => Math.floor(Math.random()*100));
return { data: a, id: 1 };
}
async function uploadData(url, size) {
let params = generateRandomData(size);
let stringData = JSON.stringify(params);
console.log(`uploadData: Uploading ${stringData.length} byte(s)..`);
let resp = await axios({
method: 'post',
url: url,
data: stringData,
headers: { 'Accept': 'application/json', 'Content-Type': 'application/json' },
maxContentLength: 100000000,
maxBodyLength: 1000000000
}).catch(err => {
throw err;
})
console.log("uploadData: response:", resp.data);
return resp;
}
uploadData("http://localhost:8080/upload", 10000000);
Corresponding server code:
const express = require("express");
const port = 8080;
const app = express();
const bodyParser = require('body-parser')
app.use(bodyParser.json({limit: '50mb'}));
app.post('/upload', (req, res, next) => {
console.log("/upload: Received data: body length: ", req.headers['content-length']);
res.json( { status: 'ok', bytesReceived: req.headers['content-length']});
})
app.listen(port);
console.log(`Serving at http://localhost:${port}`);
