Rate Limit: Add a buffer between API calls in map JavaScript

I'm using Google Gmail API to get sent emails.
I'm using 2 APIs for this -
list (https://developers.google.com/gmail/api/reference/rest/v1/users.messages/list)
get (https://developers.google.com/gmail/api/reference/rest/v1/users.messages/get)
The list API gives a list of message IDs, which I use to fetch specific data from the get API.
Here's the code for this -
await Promise.all(
  messages?.map(async (message) => {
    const messageData = await contacts.getSentGmailData(
      accessToken,
      message.id
    );
    return messageData;
  })
);
getSentGmailData is the get API here.
The problem here is, while mapping and making requests to this API continuously, I get a 429 (rateLimitExceeded) error.
What I tried is adding a buffer between each request like this -
function delay(ms) {
  return new Promise((resolve) => {
    setTimeout(resolve, ms);
  });
}

const messageData = await contacts.getSentGmailData(accessToken, message.id);
await delay(200);
But this doesn't seem to work.
How can I work around this?

You can use code like the solution below to add more buffer time when you get a 429 (too many requests) from the Google API.
Basically, this code backs off and retries instead of continuing to call the API once you exceed the rate limit.
Note: this doesn't mean that you can bypass the Google API rate limiter.
async function getSentGmailDataWithBackoff(accessToken, messageId) {
  const MAX_RETRIES = 5;
  let retries = 0;
  let delayMs = 200; // named delayMs so it does not shadow the delay() helper below
  while (true) {
    try {
      const messageData = await contacts.getSentGmailData(accessToken, messageId);
      return messageData;
    } catch (error) {
      if (error.response && error.response.status === 429 && retries < MAX_RETRIES) {
        retries++;
        console.log(`Rate limit exceeded. Retrying in ${delayMs}ms.`);
        await delay(delayMs);
        delayMs *= 2; // exponential backoff: 200, 400, 800, ...
      } else {
        throw error;
      }
    }
  }
}

async function getSentGmailDataWithBackoffBatch(accessToken, messageIds) {
  return Promise.all(
    messageIds.map(async (messageId) => {
      const messageData = await getSentGmailDataWithBackoff(accessToken, messageId);
      return messageData;
    })
  );
}

function delay(ms) {
  return new Promise((resolve) => {
    setTimeout(resolve, ms);
  });
}
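For example, wired into the original call site (assuming messages and accessToken are the same values as in the question):

const results = await getSentGmailDataWithBackoffBatch(
  accessToken,
  (messages ?? []).map((message) => message.id)
);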

The reason the delay is not working is that .map() does not wait for the Promise returned by one callback to resolve before starting the next, so all the requests (and their delays) run concurrently. The same reasoning applies to forEach, filter, reduce, etc. You can get some idea here: https://gist.github.com/joeytwiddle/37d2085425c049629b80956d3c618971
If you had used a for-of loop or another kind of for loop, it would have worked:
const results = [];
for (const message of messages) {
  const messageData = await contacts.getSentGmailData(accessToken, message.id);
  results.push(messageData);
  await delay(200);
}
You could also write your own rate-limiting function (also commonly called a throttling function) or use one provided by libraries like Lodash: https://lodash.com/docs#throttle
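For illustration, a minimal hand-rolled limiter that runs calls one at a time and spaces their starts by a fixed interval could look like this. It is just a sketch reusing the question's names (contacts.getSentGmailData, accessToken, messages):

// Run tasks sequentially, spacing the start of each call by at least intervalMs.
async function rateLimited(items, intervalMs, task) {
  const results = [];
  for (const item of items) {
    const started = Date.now();
    results.push(await task(item));
    const remaining = intervalMs - (Date.now() - started);
    if (remaining > 0) {
      await new Promise((resolve) => setTimeout(resolve, remaining));
    }
  }
  return results;
}

const data = await rateLimited(messages ?? [], 200, (message) =>
  contacts.getSentGmailData(accessToken, message.id)
);

Note that Lodash's _.throttle limits how often a function may run but coalesces intermediate calls rather than queueing each one, so for a queue of distinct API calls a small helper like this is usually the better fit.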

Related

Retrieve logs from server while performing a "parallel" Promise.all request to the server (parallel challenge!)

Arriving with a theory question :)
I have a front end that sends N requests (axios) in a Promise.all() with a map function. This works fine. Each time one of the promises resolves, a little table gets updated with that request's answer, until I get the full table and the array of answers at the end. ✅
The problem comes when I want to read the logs of the server at the same time.
So my objective is to run another axios request to my express.js server every 2 seconds to retrieve the logs of the last 2 seconds; this way I could show what is happening with each answer in real time.
Any ideas on how to do these two tasks in parallel?
On the front end I'm using React, and the Promise.all has this structure:
setIsLoading(true); // setting a flag to know this is running
const doAllTheTable = await Promise.all(
  tableData.map(async (lineOfMyTable) => {
    const answer = await doMyRequest(lineOfMyTable); // my axios.get request
    return updateTableLine(answer); // the function that updates the right line
  })
);
// all promises are good now
setIsLoading(false);
So, basically I want another loop that runs every 2 seconds while "isLoading" is true, to update another part of my front end and show the logs meanwhile. But I need both things to happen at the same time!
Thank you for your ideas :)
Rather than awaiting your Promise.all immediately, store a reference to the promise so you can start checking the logs:
const doAllTheTablePromise = Promise.all(
  tableData.map(async lineOfMyTable => {
    const answer = await doMyRequest(lineOfMyTable); // my axios.get request
    return updateTableLine(answer); // the function that updates the right line
  })
);
let cancelled = false;

(async () => {
  while (!cancelled) {
    // Check your logs..
    await new Promise(r => setTimeout(r, 2000)); // 2 second delay
  }
})();

await doAllTheTablePromise;
cancelled = true;
Once your doAllTheTablePromise has resolved, you can stop checking the logs.
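Equivalently, you could poll with setInterval and clear it once the batch settles. A minimal sketch (checkLogs is a hypothetical stand-in for your axios log request):

const logTimer = setInterval(checkLogs, 2000); // poll the logs every 2 seconds
try {
  await doAllTheTablePromise;
} finally {
  clearInterval(logTimer); // stop polling once all requests settle
}

The flag-based loop above has one advantage: a slow checkLogs call can never overlap the next one, whereas setInterval fires on schedule regardless of whether the previous check finished.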
There must be many ways to write this. Here's one involving a token, provided by the caller of two async processes, foo() and bar(), for communication between them.
async function foo(tableData, token) {
  try {
    await Promise.all(tableData.map(async (lineOfMyTable) => {
      const answer = await doMyRequest(lineOfMyTable);
      return updateTableLine(answer);
    }));
    token.setIsLoading = false; // lower flag when all requests are complete
  } catch (error) {
    token.setIsLoading = false; // lower flag if a synchronous or asynchronous error occurs
  }
}

async function bar(token) {
  function delay(ms) { // this can be written as an inner or outer function, whichever suits
    return new Promise(resolve => {
      setTimeout(resolve, ms);
    });
  }
  if (token.setIsLoading) {
    let logs = await retrieveLogs();
    // process/display logs here
    await delay(2000);
    return bar(token); // recursive call
  } else {
    return "complete"; // optional
  }
}

async function myCaller() {
  // ... preamble
  let loadingToken = { // passed to foo() and bar() as a means of communication between them
    'setIsLoading': true // raise flag before calling foo() and bar()
  };
  return Promise.all([foo(tableData, loadingToken), bar(loadingToken)]); // Promise.all takes an array
}
EDIT:
Maybe better written like this, with the caller looking after lowering the flag:
async function foo(tableData) {
  return Promise.all(tableData.map(async (lineOfMyTable) => {
    return updateTableLine(await doMyRequest(lineOfMyTable));
  }));
}

async function bar(token) {
  function delay(ms) { // this can be written as an inner or outer function, whichever suits
    return new Promise(resolve => {
      setTimeout(resolve, ms);
    });
  }
  if (token.setIsLoading) {
    let logs = await retrieveLogs();
    // process/display logs here
    await delay(2000);
    return bar(token); // recursive call
  } else {
    return "complete"; // optional
  }
}

async function myCaller() {
  // ... preamble
  let loadingToken = { // passed to bar()
    'setIsLoading': true // raise flag before calling foo() and bar()
  };
  return Promise.all([
    foo(tableData).finally(() => { loadingToken.setIsLoading = false; }),
    bar(loadingToken)
  ]); // again, an array for Promise.all
}

Node, wait and retry API calls that fail

So I fetch an array of URLs from an API with a rate limit. Currently I handle this by adding a timeout to each call, like this:
const calls = urls.map((url, i) =>
  new Promise(resolve => setTimeout(resolve, 250 * i))
    .then(() => fetch(url))
);
const data = await Promise.all(calls);
forcing a 250ms wait between each call. This ensures that the rate limit is never exceeded.
The thing is, this isn't really necessary. I've tried with a 0ms wait time, and in most cases I have to repeatedly reload the page four or five times before the API starts to return:
{ error: { status: 429, message: 'API rate limit exceeded' } }
and most of the time you only have to wait a second or so before you can safely reload the page and get all data.
A more reasonable approach would be to collect the calls that return 429 (if they do), wait for a set amount of time and then retry them (and perhaps redo this a set amount of times).
Problem is, I'm a bit stumped as to how one would go about achieving this.
EDIT:
Just got home and will look through the answers, but there seems to have been an assumption made which I don't believe is necessary: the calls do not have to be sequential; they can be fired (and returned) in any order.
The term for what you want is exponential backoff. You can modify your code so that it continues trying on a certain failure condition:
const max_wait = 2000;

async function wait(ms) {
  return new Promise(resolve => {
    setTimeout(resolve, ms);
  });
}

const calls = urls.map(async (url) => {
  let retry = 0, result;
  do {
    if (retry !== 0) { await wait(Math.pow(2, retry)); } // back off before each retry
    result = await fetch(url);
    retry++;
  } while (result.status === 429 && Math.pow(2, retry) <= max_wait); // keep retrying on 429 until the backoff would exceed max_wait
  return result;
});
const data = await Promise.all(calls);
Or you can try using a library to handle the backoff for you, like https://github.com/MathieuTurcotte/node-backoff
If I understand the question right, you're trying to:
a) Execute fetch() calls sequentially (with a possibly optional delay)
b) Retry failed requests with a backoff delay
As you likely found out, .map() does not really help with a) as it does not wait for any async stuff when iterating (which is why you create a greater and greater timeout with i*250).
I personally find it easiest to keep things sequential by using a for...of loop instead, as this works nicely with async/await:
const fetchQueue = async (urls, delay = 0, retries = 0, maxRetries = 3) => {
  const wait = (timeout = 0) => {
    if (timeout) { console.log(`Waiting for ${timeout}`); }
    return new Promise(resolve => {
      setTimeout(resolve, timeout);
    });
  };

  for (const url of urls) {
    try {
      await wait(retries ? retries * Math.max(delay, 1000) : delay);
      let response = await fetch(url);
      let data = await (
        response.headers.get('content-type').includes('json')
          ? response.json()
          : response.text()
      );
      response = {
        headers: [...response.headers].reduce((acc, header) => {
          return {...acc, [header[0]]: header[1]};
        }, {}),
        status: response.status,
        data: data,
      };
      // in reality, only do that for errors
      // that make sense to retry
      if ([404, 429].includes(response.status)) {
        throw new Error(`Status Code ${response.status}`);
      }
      console.log(response.data);
    } catch (err) {
      console.log('Error:', err.message);
      if (retries < maxRetries) {
        console.log(`Retry #${retries + 1} ${url}`);
        await fetchQueue([url], delay, retries + 1, maxRetries);
      } else {
        console.log(`Max retries reached for ${url}`);
      }
    }
  }
};

// populate some real URLs to fetch
// index 0 will generate a nonexistent URL to test error behaviour
const urls = new Array(101).fill(null).map((x, i) => `https://jsonplaceholder.typicode.com/todos/${i}`);

// fetch urls one after another (sequentially)
// and delay each request by 250ms
fetchQueue(urls, 250);
If a request fails (e.g. you get one of the error status codes listed in the array), the above function will retry it a maximum of 3 times (by default), with a backoff delay that increases by a second on each retry.
As you wrote, the delay between requests is probably not necessary, so you could just remove the 250 from the function call. Because each request is executed one after the other, you're less likely to run into rate limit issues, but if you do, it's very easy to add some custom delay.
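For example (assuming the same urls array as above):

// rely on the retry logic alone, with no delay between requests
fetchQueue(urls);

// or space the requests out if the API turns out to be strict
fetchQueue(urls, 500);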
Here is an example that handles an array of promises sequentially, with a delay expressed in milliseconds, and accepts a third callback that determines whether the request should be retried.
In the below code, some sample requests are mocked to:
Test a successful response.
Test an error response. If the error response contains an error code and the error code is 403, true is returned and the call is retried in the next run (delayed by x milliseconds).
Test an error response without an error code.
There is a global counter below that gives up on the promise after N tries (in the example below, 5); all of that is handled in this code:
const result = await resolveSequencially(promiseTests, 250, (err) => {
  return ++errorCount, !!(err && err.error && err.error.status === 403 && errorCount <= 5);
});
The error count is first incremented, and the callback returns true if the error is defined, has an error property, and its status is 403.
Of course, the example is just to test things out, but I think you're looking for something that gives you cleverer control over the promise loop cycle, hence here is a solution doing just that.
I will add some comments below; you can run the test to check what happens directly in the console.
// Nothing that relevant, this one is just for testing purposes!
let errorCount = 0;

// Declare the function.
const resolveSequencially = (promises, delay, onFailed, onFinished) => {
  // store the results.
  const results = [];
  // Define a recursive handler and invoke it immediately below.
  // current is the index of the currently looped promise, max is the maximum needed.
  const recursiveHandle = (current, max) => {
    console.log('recursiveHandle invoked, current is', current, 'max is', max);
    if (current === max) onFinished(results); // <-- if all the promises have been looped, resolve.
    else {
      // Define a method to handle the promise.
      let handlePromise = () => {
        console.log('about to handle promise');
        const p = promises[current];
        p.then((success) => {
          console.log('success invoked!');
          results.push(success);
          // if it's successful, push the result and invoke the next element.
          recursiveHandle(current + 1, max);
        }).catch((err) => {
          console.log('An error was caught. Invoking callback to check whether I should retry! Error was:', err);
          // otherwise, invoke the onFailed callback.
          const retry = onFailed(err);
          // if retry is true, invoke the recursive function again with the same indexes.
          console.log('retry is', retry);
          if (retry) recursiveHandle(current, max);
          else recursiveHandle(current + 1, max); // <-- otherwise, proceed regularly.
        });
      };
      if (current !== 0) setTimeout(() => { handlePromise(); }, delay); // <-- if it's not the first element, invoke the promise after the desired delay.
      else handlePromise(); // otherwise, invoke immediately.
    }
  };
  recursiveHandle(0, promises.length); // Invoke with an initial index of 0 and a maximum index which is the length of the promise array.
};
const promiseTests = [
  Promise.resolve(true),
  Promise.reject({
    error: {
      status: 403
    }
  }),
  Promise.resolve(true),
  Promise.reject(null)
];

const test = () => {
  console.log('about to invoke resolveSequencially');
  resolveSequencially(promiseTests, 250, (err) => {
    return ++errorCount, !!(err && err.error && err.error.status === 403 && errorCount <= 5);
  }, (done) => {
    console.log('finished! results are:', done);
  });
};
test();

Array of queries for `for await` loop for PostgreSQL transaction helper

I made a transaction function that simplifies this action for me like this (it works):
export async function transaction(queriesRaw) {
  let allResults = []
  const client = await pool.connect()
  try {
    await client.query('BEGIN')
    var queries = queriesRaw.map(q => {
      return client.query(q[0], q[1])
    })
    for await (const oneResult of queries) {
      allResults.push(oneResult)
    }
    await client.query('COMMIT')
  } catch (err) {
    await client.query('ROLLBACK')
  } finally {
    client.release()
    return allResults
  }
}
And do transactions like this:
let results = await transaction([
  ['UPDATE readers SET cookies=5 WHERE id=$1;', [1]],
  ['INSERT INTO rewards (id) VALUES ($1);', [3]]
])
The transaction should run the queries one at a time, in array index order (so rollback to previous values will work correctly), and return the results in the same order (sometimes I need the return values from some queries).
As I understand it, the queries already start executing in the map function; in the for await I just wait for their results, and a second query may complete faster than the previous one.
So how can I fix this?
P.S. Maybe something like new Promise() instead of map is the right way?
Change this:
var queries = queriesRaw.map(q => {
  return client.query(q[0], q[1])
})
for await (const oneResult of queries) {
  allResults.push(oneResult)
}
To:
for (const q of queriesRaw) {
  let result = await client.query(q[0], q[1]);
  allResults.push(result);
}
If I got you correctly, just use a for loop with a proper await instead of a callback-style loop.
That way the function doesn't return until everything has executed in order. With some thinking, you can easily add a revoke() function or something.
...
export async function transaction(queriesRaw) {
  let allResults = []
  const client = await pool.connect()
  try {
    await client.query('BEGIN')
    for (var i = 0; i < queriesRaw.length; i++) {
      var res = await client.query(queriesRaw[i][0], queriesRaw[i][1])
      allResults.push(res)
    }
    await client.query('COMMIT')
  } catch (err) {
    await client.query('ROLLBACK')
    // do you maybe want the errors in the results too?
    // allResults.push(err)
  } finally {
    client.release()
    return allResults
  }
}
Info: have a look at, for example, the async module or something similar, so you will not have to think about things like this.
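For illustration, with the async library the sequential part could look roughly like this (a sketch, not a drop-in replacement; it assumes the same queriesRaw format of [text, params] pairs):

const async = require('async');

// mapSeries runs one iteratee at a time and returns results in input order
const allResults = await async.mapSeries(queriesRaw, async (q) => {
  return client.query(q[0], q[1]);
});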

Fetch call every 2 seconds, but don't want requests to stack up

I am trying to make an API call and I want it to repeat every 2 seconds. However, I am afraid that if the system doesn't get a response back within 2 seconds, it will build up requests and keep trying to send them. How can I prevent this?
Here is the action I am trying to fetch:
const getMachineAction = async () => {
  try {
    const response = await fetch('https://localhost:55620/api/machine/');
    if (response.status === 200) {
      console.log("Machine successfully found.");
      const myJson = await response.json(); // extract JSON from the http response
      console.log(myJson);
    } else {
      console.log("not a 200");
    }
  } catch (err) {
    // catches errors both in fetch and response.json
    console.log(err);
  }
};
And then I call it with a setInterval.
function ping() {
  setInterval(
    getMachineAction(),
    2000
  );
}
I have thought of doing some promise-like structure in the setInterval to make sure that the fetch had worked and completed, but couldn't get it working.
The Promise.all() Solution
This solution ensures that you don't miss out on the 2-second delay requirement AND also don't fire a call while another network call is underway.
function callme() {
  // This promise will resolve when the network call succeeds
  // Feel free to make a REST fetch using promises and assign it to networkPromise
  var networkPromise = fetch('https://jsonplaceholder.typicode.com/todos/1');

  // This promise will resolve when 2 seconds have passed
  var timeOutPromise = new Promise(function(resolve, reject) {
    // 2 second delay
    setTimeout(resolve, 2000, 'Timeout Done');
  });

  Promise.all([networkPromise, timeOutPromise]).then(function(values) {
    console.log("At least 2 secs + TTL (Network/server)");
    // Repeat
    callme();
  });
}
callme();
Note: This takes care of the bad case definition as requested by the author of the question:
"the "bad case" (i.e. it takes longer than 2 seconds) is I want it to skip that request, and then send a single new one. So at 0 seconds the request sends. It takes 3 seconds to execute, then 2 seconds later (at 5) it should reexcute. So it just extends the time until it sends."
You could add a finally to your try/catch with a setTimeout, instead of using your setInterval.
Note that long polling like this creates a lot more server load than using websockets, which are themselves a lot closer to real time.
const getMachineAction = async () => {
  try {
    const response = await fetch('https://localhost:55620/api/machine/');
    if (response.status === 200) {
      console.log("Machine successfully found.");
      const myJson = await response.json(); // extract JSON from the http response
      console.log(myJson);
    } else {
      console.log("not a 200");
    }
  } catch (err) {
    // catches errors both in fetch and response.json
    console.log(err);
  } finally {
    // do it again in 2 seconds
    setTimeout(getMachineAction, 2000);
  }
};
getMachineAction()
Simple! Just store whether it's currently making a request, and store whether the timer has tripped without sending a new request.
let in_progress = false;
let missed_request = false;

const getMachineAction = async () => {
  if (in_progress) {
    missed_request = true;
    return;
  }
  in_progress = true;
  try {
    const response = await fetch('https://localhost:55620/api/machine/');
    if (missed_request) {
      missed_request = false;
      setTimeout(getMachineAction, 0);
    }
    if (response.status === 200) {
      console.log("Machine successfully found.");
      const myJson = await response.json(); // extract JSON from the http response
      console.log(myJson);
    } else {
      console.log("not a 200");
    }
  } catch (err) {
    // catches errors both in fetch and response.json
    console.log(err);
  } finally {
    in_progress = false;
  }
};
To start the interval, you need to omit the ():
setInterval(getMachineAction, 2000);

How to queue API calls with 1 second delay in JS using setInterval() inside map() function

I have a function getData that fetches data from an external API. If I make more than 1 request per second, I get a 503 error. Hence, I'm thinking of queueing API requests, but those calls still get batched together and I get the same 503 error.
I'm parsing Local Storage data as objects (each object will make a separate API request), and if there is more than 1 object, I want to queue all the subsequent API calls with a 1 second delay. Here's my code:
const lsData = JSON.parse(localStorage.getItem('weatherappData'));
if (lsData) {
  lsData.map((location, index) => {
    const city = location.city;
    const country = location.country;
    if (index === 0) {
      getData(city, country, table);
    } else {
      setTimeout(() => getData(city, country, table), 1000);
    }
  });
}
What am I doing wrong? Thanks !
.map() does not await the previous callback. You can multiply 1000 by index:
setTimeout(() => getData(city, country, table), index * 1000);
or use async/await:
(async () => {
  for (const [index, {city, country}] of lsData.entries()) {
    try {
      if (index === 0) {
        await getData(city, country, table);
      } else {
        await new Promise(resolve => setTimeout(() => resolve(getData(city, country, table)), 1000));
      }
    } catch (e) {
      console.error(e);
    }
  }
})();
Note: table is not defined in the code in the question.
