Fetch with retry, abort, etc. - JavaScript

I've been trying to find a wrapper that does fetch with retries, timeouts, aborts, etc. I came across https://pastebin.com/54Ct4xEh a little bit ago, and after fixing a couple of typos (a missing options. and a missing =>), it works... well, maybe it works, but I don't know how to use it. How do I abort a fetch with this particular wrapper? I have a fiddle, https://jsfiddle.net/1fdwb2o6/2/. With this code, how can I, say, click a button and have it abort this fetch loop?

For my use case, I'm using Bootstrap, and I have a modal that, when shown, attempts to load dynamic content. If the user clicks Cancel while it's loading, I want the fetch process to stop. From what I can tell, I should be able to do it with the code below... but I'm not sure how to perform the abort. Perhaps this isn't possible, as structured, with a Promise... but I don't know enough (anything) about promises to know better, one way or the other.
const fetchWithRetry = (userOptions) => {
  let abort = false;
  const options = {
    url: '',
    options: {},
    cancel: {},
    retries: 5,
    retryDelay: 1000,
    ...userOptions
  };

  // Add an abort to the cancel object.
  options.cancel.abort = () => {
    abort = true;
  };

  // Abort or proceed?
  return abort ? Promise.reject('aborted') : fetch(options.url).then(response => {
    // Reject because of abort
    return abort ? Promise.reject('aborted')
      // Response is good
      : response.ok ? Promise.resolve(response.text())
      // Retries exceeded
      : !options.retries ? Promise.reject('retries exceeded')
      // Retry with one less retry
      : new Promise((resolve, reject) => {
          setTimeout(() => {
            // We use the returned promise's resolve and reject as
            // callback so that the nested call propagates backwards.
            fetchWithRetry({
              ...options,
              retries: options.retries - 1
            }).then(resolve, reject);
          }, options.retryDelay);
        });
  });
}

var xxx;
console.clear();
xxx = fetchWithRetry({
    url: "some_file_that_doesnt_exist.php"
  })
  .then((response) => {
    alert(response);
  }).catch(function(err) {
    // Error: response error, request timeout or runtime error
    alert("Error! Cannot load folder list! Please try again!");
  });

setTimeout(function() {
  // somehow, abort the fetch...
  // xxx.abort(); <-- no worky...
}, 1234);

As I said in my comments, the code you have in your question does not provide a cancel() function that the caller can use. It has a cancel() function internally, but that's not something the caller can use. As written that function just returns a promise so the caller has nothing they can call to cancel the retries.
So, I decided to write my own version of fetchWithRetry() that would work for your use case. This has a number of capabilities that the one in your question does not:
It returns both the promise and a cancel function so the caller can cancel the retries.
It allows you to pass the init options for fetch() so you can pass any of the various arguments that fetch() supports and are often needed, such as credentials.
It has an option to check the response.ok boolean so it will detect and retry more cases than you would if you required the promise to be rejected before retrying (note: fetch() doesn't reject on a 404, for example).
If the fetch() was rejected and the operation was either cancelled or ran out of retries, it uses the newer Error cause option to attach the actual fetch() error, so the caller can see what the original error was.
Note that this version of fetchWithRetry() returns an object containing both a promise and a cancel function. The caller uses the promise the same way they would any promise from fetch() and they can use the cancel() function to cancel any further retries.
Here's the code:
const Deferred = function() {
    if (!(this instanceof Deferred)) {
        return new Deferred();
    }
    const p = this.promise = new Promise((resolve, reject) => {
        this.resolve = resolve;
        this.reject = reject;
    });
    this.then = p.then.bind(p);
    this.catch = p.catch.bind(p);
    if (p.finally) {
        this.finally = p.finally.bind(p);
    }
}

function fetchWithRetry(url, userOptions = {}, init = {}) {
    const options = {
        // default options values, can be overridden by userOptions
        retries: 3,
        retryDelay: 1000,
        checkResponseOk: true,
        ...userOptions
    };
    let cancelled = false;
    let timerDeferred;
    let timer;

    function run() {
        return fetch(url, init).then(response => {
            // force retry on non 2xx responses too
            if (options.checkResponseOk && !response.ok) {
                throw new Error(`fetch failed with status ${response.status}`);
            }
            return response;
        }).catch(err => {
            // got error, set up retry
            console.log(err);
            if (cancelled) {
                throw new Error("fetch cancelled", { cause: err });
            }
            --options.retries;
            if (options.retries < 0) {
                throw new Error("fetch max retries exceeded", { cause: err });
            }
            // create new Deferred object for use with our timer
            // so it can be resolved by the timer or rejected
            // by the cancel callback
            timerDeferred = new Deferred();
            timer = setTimeout(() => {
                timerDeferred.resolve();
                timer = null;
            }, options.retryDelay);
            return timerDeferred.then(() => {
                if (cancelled) {
                    throw new Error("fetch cancelled", { cause: err });
                }
                return run();
            });
        });
    }

    return {
        promise: run(),
        cancel: () => {
            cancelled = true;
            // if currently in a timer waiting, reject immediately
            if (timer) {
                clearTimeout(timer);
                timer = null;
            }
            if (timerDeferred) {
                timerDeferred.reject(new Error("fetch cancelled"));
            }
        }
    }
}
Sample usage:
const result = fetchWithRetry(someUrl);

result.promise.then(resp => {
    return resp.text().then(data => {
        // got final result here
        console.log(data.slice(0, 100));
    });
}).catch(err => {
    console.log(err);
});

// simulate user cancel after 1.5 seconds
setTimeout(() => {
    result.cancel();
}, 1500);

Related

Javascript JSON-Request timeout [duplicate]

I have a fetch API POST request:

fetch(url, {
  method: 'POST',
  body: formData,
  credentials: 'include'
})

I want to know what the default timeout for this is, and how I can set it to a particular value like 3 seconds, or make it indefinite.
A Promise.race solution leaves the request hanging: it still consumes bandwidth in the background and, while it is still in flight, counts against the maximum number of concurrent requests allowed.
Instead, use AbortController to actually abort the request. Here is an example:
const controller = new AbortController()

// 5 second timeout:
const timeoutId = setTimeout(() => controller.abort(), 5000)

fetch(url, { signal: controller.signal }).then(response => {
  // completed request before timeout fired
  // If you only wanted to timeout the request, not the response, add:
  // clearTimeout(timeoutId)
})
Alternatively, you can use the newly added AbortSignal.timeout(5000). All evergreen environments support it now. You will lose control over manually closing the request, though; both upload and download will have to finish within a total time of 5 s.
// a polyfill for it would be:
AbortSignal.timeout ??= function timeout(ms) {
  const ctrl = new AbortController()
  setTimeout(() => ctrl.abort(), ms)
  return ctrl.signal
}
fetch(url, { signal: AbortSignal.timeout(5000) })
AbortController can be used for other things as well, not only fetch but also for readable/writable streams. Newer functions (especially promise-based ones) will use it more and more. Node.js has also implemented AbortController in its streams and filesystem APIs, and I know Web Bluetooth is looking into it too. It can now also be used as an addEventListener option so the listener stops listening when the signal aborts.
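For example, a minimal sketch of that addEventListener usage (the selector and handler here are just placeholders, not part of the answer above):

const controller = new AbortController();

// The listener is registered with the controller's signal...
document.querySelector('#stop').addEventListener('click', () => {
  console.log('clicked');
}, { signal: controller.signal });

// ...and is removed automatically once the signal aborts.
controller.abort();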
Update: since my original answer is a bit outdated, I recommend using AbortController as implemented here: https://stackoverflow.com/a/57888548/1059828, or take a look at this really good post explaining AbortController with fetch: How do I cancel an HTTP fetch() request?
outdated original answer:
I really like the clean approach from this gist using Promise.race
fetchWithTimeout.js
export default function (url, options, timeout = 7000) {
  return Promise.race([
    fetch(url, options),
    new Promise((_, reject) =>
      setTimeout(() => reject(new Error('timeout')), timeout)
    )
  ]);
}

main.js

import fetch from './fetchWithTimeout'

// call as usual or with timeout as 3rd argument
// throw after max 5 seconds timeout error
fetch('http://google.com', options, 5000)
  .then((result) => {
    // handle result
  })
  .catch((e) => {
    // handle errors and timeout error
  })
Edit 1
As pointed out in comments, the code in the original answer keeps running the timer even after the promise is resolved/rejected.
The code below fixes that issue.
function timeout(ms, promise) {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => {
      reject(new Error('TIMEOUT'))
    }, ms)

    promise
      .then(value => {
        clearTimeout(timer)
        resolve(value)
      })
      .catch(reason => {
        clearTimeout(timer)
        reject(reason)
      })
  })
}
Original answer
It doesn't have a specified default; the specification doesn't discuss timeouts at all.
You can implement your own timeout wrapper for promises in general:
// Rough implementation. Untested.
function timeout(ms, promise) {
  return new Promise(function(resolve, reject) {
    setTimeout(function() {
      reject(new Error("timeout"))
    }, ms)
    promise.then(resolve, reject)
  })
}

timeout(1000, fetch('/hello')).then(function(response) {
  // process response
}).catch(function(error) {
  // might be a timeout error
})
As described in https://github.com/github/fetch/issues/175
Comment by https://github.com/mislav
Building on Endless' excellent answer, I created a helpful utility function.
const fetchTimeout = (url, ms, { signal, ...options } = {}) => {
    const controller = new AbortController();
    const promise = fetch(url, { signal: controller.signal, ...options });
    if (signal) signal.addEventListener("abort", () => controller.abort());
    const timeout = setTimeout(() => controller.abort(), ms);
    return promise.finally(() => clearTimeout(timeout));
};
If the timeout is reached before the resource is fetched then the fetch is aborted.
If the resource is fetched before the timeout is reached then the timeout is cleared.
If the input signal is aborted then the fetch is aborted and the timeout is cleared.
const controller = new AbortController();

document.querySelector("button.cancel").addEventListener("click", () => controller.abort());

fetchTimeout("example.json", 5000, { signal: controller.signal })
    .then(response => response.json())
    .then(console.log)
    .catch(error => {
        if (error.name === "AbortError") {
            // fetch aborted either due to timeout or due to user clicking the cancel button
        } else {
            // network error or json parsing error
        }
    });
There's no timeout support in the fetch API yet, but it can be achieved by wrapping it in a promise.
For example:
function fetchWrapper(url, options, timeout) {
    return new Promise((resolve, reject) => {
        fetch(url, options).then(resolve, reject);
        if (timeout) {
            const e = new Error("Connection timed out");
            setTimeout(reject, timeout, e);
        }
    });
}
If you haven't configured a timeout in your code, it will be the default request timeout of your browser.
1) Firefox - 90 seconds
Type about:config in Firefox URL field. Find the value corresponding to key network.http.connection-timeout
2) Chrome - 300 seconds
Source
EDIT: The fetch request will still be running in the background and will most likely log an error in your console.
Indeed the Promise.race approach is better.
See this link for reference Promise.race()
With race, all the promises run at the same time, and the race settles as soon as one of the promises settles.
Therefore, only one value (or rejection) is returned.
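As a tiny illustration of that behavior (a sketch, not from the answer above):

const fast = new Promise(resolve => setTimeout(resolve, 100, 'fast'));
const slow = new Promise(resolve => setTimeout(resolve, 500, 'slow'));

Promise.race([fast, slow]).then(winner => {
  console.log(winner); // "fast" - the slow promise still runs, but its value is ignored
});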
You could also pass a function to call if the fetch times out.
fetchWithTimeout(url, {
  method: 'POST',
  body: formData,
  credentials: 'include',
}, 5000, () => { /* do stuff here */ });

If this piques your interest, a possible implementation would be:

function fetchWithTimeout(url, options, delay, onTimeout) {
  const timer = new Promise((resolve) => {
    setTimeout(resolve, delay, {
      timeout: true,
    });
  });
  return Promise.race([
    fetch(url, options),
    timer
  ]).then(response => {
    if (response.timeout) {
      onTimeout();
    }
    return response;
  });
}
A cleaner way to do it is actually shown on MDN: https://developer.mozilla.org/en-US/docs/Web/API/AbortSignal#aborting_a_fetch_operation_with_a_timeout
try {
  await fetch(url, { signal: AbortSignal.timeout(5000) });
} catch (e) {
  if (e.name === "TimeoutError") {
    console.log('5000 ms timeout');
  }
}
Here's an SSCCE using Node.js which will time out after 1000 ms:
import fetch from 'node-fetch';

const controller = new AbortController();
const timeout = setTimeout(() => {
    controller.abort();
}, 1000); // will time out after 1000ms

fetch('https://www.yourexample.com', {
        signal: controller.signal,
        method: 'POST',
        body: formData,
        credentials: 'include'
    })
    .then(response => response.json())
    .then(json => console.log(json))
    .catch(err => {
        if (err.name === 'AbortError') {
            console.log('Timed out');
        }
    })
    .finally(() => {
        clearTimeout(timeout);
    });
Using AbortController and setTimeout:

const abortController = new AbortController();
let timer: number | null = null;

fetch('/get', {
    signal: abortController.signal, // connect the signal to the abortController
})
    .then(res => {
        // response success
        console.log(res);
        if (timer) {
            clearTimeout(timer); // clear timer
        }
    })
    .catch(err => {
        if (err instanceof DOMException && err.name === 'AbortError') {
            // will return a DOMException
            return;
        }
        // other errors
    });

timer = setTimeout(() => {
    abortController.abort();
}, 1000 * 10); // Abort request in 10s.
This is a fragment of #fatcherjs/middleware-aborter.
Using fatcher, it is easy to abort a fetch request.
import { aborter } from '#fatcherjs/middleware-aborter';
import { fatcher, isAbortError } from 'fatcher';

fatcher({
    url: '/bar/foo',
    middlewares: [
        aborter({
            timeout: 10 * 1000, // 10s
            onAbort: () => {
                console.log('Request is Aborted.');
            },
        }),
    ],
})
    .then(res => {
        // Request success in 10s
        console.log(res);
    })
    .catch(err => {
        if (isAbortError(err)) {
            // Run error when request aborted.
            console.error(err);
        }
        // Other errors.
    });
function fetchTimeout(url, options, timeout = 3000) {
    return new Promise((resolve, reject) => {
        fetch(url, options)
            .then(resolve, reject);
        setTimeout(reject, timeout);
    });
}
You can create a timeoutPromise wrapper
function timeoutPromise(timeout, err, promise) {
  return new Promise(function(resolve, reject) {
    promise.then(resolve, reject);
    setTimeout(reject.bind(null, err), timeout);
  });
}

You can then wrap any promise

timeoutPromise(100, new Error('Timed Out!'), fetch(...))
  .then(...)
  .catch(...)
It won't actually cancel an underlying connection but will allow you to timeout a promise.
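If you also need the underlying request to actually stop, one option is to pair the wrapper with an AbortController. A rough sketch (not part of the original answer; fetchWithCancellableTimeout is a made-up name):

function fetchWithCancellableTimeout(url, ms, options = {}) {
  const controller = new AbortController();
  const err = new Error('Timed Out!');
  const timedFetch = timeoutPromise(ms, err, fetch(url, { ...options, signal: controller.signal }));
  // When the timeout rejection wins the race, abort the real request as well.
  return timedFetch.catch(reason => {
    if (reason === err) controller.abort();
    throw reason;
  });
}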
Reference
Proper error handling tips
Normal practice:
To add timeout support most of the time it is suggested to introduce a Promise utility function like this:
function fetchWithTimeout(resource, { signal, timeout, ...options } = {}) {
  const controller = new AbortController();
  if (signal != null) signal.addEventListener("abort", () => controller.abort());
  const id = timeout != null ? setTimeout(() => controller.abort(), timeout) : undefined;
  return fetch(resource, {
    ...options,
    signal: controller.signal
  }).finally(() => {
    if (id != null) clearTimeout(id);
  });
}
Calling controller.abort or rejecting the promise inside the setTimeout callback function distorts the stack trace.
This is suboptimal, since one would have to add boilerplate error handlers with log messages in the functions calling the fetch method if post-error log analysis is required.
Good expertise:
To preserve the error along with its stack trace, one can apply the following technique:
import nodeFetch from "node-fetch"; // underlying fetch implementation

function sleep(ms = 0, signal) {
  return new Promise((resolve, reject) => {
    const id = setTimeout(() => resolve(), ms);
    signal?.addEventListener("abort", () => {
      clearTimeout(id);
      reject();
    });
  });
}

async function fetchWithTimeout(resource, options) {
  const { timeout, signal, ...ropts } = options ?? {};
  const controller = new AbortController();
  let sleepController;
  try {
    signal?.addEventListener("abort", () => controller.abort());
    const request = nodeFetch(resource, {
      ...ropts,
      signal: controller.signal,
    });
    if (timeout != null) {
      sleepController = new AbortController();
      const aborter = sleep(timeout, sleepController.signal);
      const race = await Promise.race([aborter, request]);
      if (race == null) controller.abort();
    }
    return request;
  } finally {
    sleepController?.abort();
  }
}
(async () => {
  try {
    await fetchWithTimeout(new URL(window.location.href), { timeout: 5 });
  } catch (error) {
    console.error("Error in test", error);
  }
})();
Using the c-promise2 lib, a cancellable fetch with timeout might look like this (live jsfiddle demo):
import CPromise from "c-promise2"; // npm package

function fetchWithTimeout(url, { timeout, ...fetchOptions } = {}) {
    return new CPromise((resolve, reject, { signal }) => {
        fetch(url, { ...fetchOptions, signal }).then(resolve, reject)
    }, timeout)
}

const chain = fetchWithTimeout("https://run.mocky.io/v3/753aa609-65ae-4109-8f83-9cfe365290f0?mocky-delay=10s", { timeout: 5000 })
    .then(request => console.log('done'));

// chain.cancel(); - to abort the request before the timeout
This code is available as the npm package cp-fetch.

Custom status change events in Javascript

I have an asynchronous function that performs various await tasks. I am trying to inform my UI in React when the status of the function changes or when one of the tasks is completed.
const foo = async () => {
  // trigger on load event
  await task1();
  // trigger task1 done event
  await task2();
  // trigger task2 done event
  await task3();
  // trigger on done event
}
I also want to be able to specify callbacks for each event, like so:
const bar = foo();
foo.on_load(() => {
// some code goes here
});
foo.on_done(() => {
// some code goes here
});
Another alternative would be something like this:
const bar = foo();
foo.on('status_change', status => {
// read the status here and do something depending on the status
})
I have been reading about custom events in JS but not sure how to use them for this. Or maybe there's another way to do this in React.
Any ideas would be helpful. Thanks!
EDIT
var uploadTask = storageRef.child('images/rivers.jpg').put(file);

// Register three observers:
// 1. 'state_changed' observer, called any time the state changes
// 2. Error observer, called on failure
// 3. Completion observer, called on successful completion
uploadTask.on('state_changed', function(snapshot) {
  // Observe state change events such as progress, pause, and resume
  // Get task progress, including the number of bytes uploaded and the total number of bytes to be uploaded
  var progress = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
  console.log('Upload is ' + progress + '% done');
  switch (snapshot.state) {
    case firebase.storage.TaskState.PAUSED: // or 'paused'
      console.log('Upload is paused');
      break;
    case firebase.storage.TaskState.RUNNING: // or 'running'
      console.log('Upload is running');
      break;
  }
}, function(error) {
  // Handle unsuccessful uploads
}, function() {
  // Handle successful uploads on complete
  // For instance, get the download URL: https://firebasestorage.googleapis.com/...
  uploadTask.snapshot.ref.getDownloadURL().then(function(downloadURL) {
    console.log('File available at', downloadURL);
  });
});
I was trying to achieve something like the above code, taken from the firebase documentation on uploading files
This is where I've gotten so far:
class Task {
  constructor() {
    this.first = null;
    this.second = null;
  }

  on(keyword, callback) {
    switch (keyword) {
      case "first":
        this.first = callback;
        break;
      case "second":
        this.second = callback;
        break;
      default:
        // throw new error
        break;
    }
  }
}

const timeout = async time => {
  return new Promise(resolve => setTimeout(resolve, time));
};

const foo = () => {
  const task = new Task();
  timeout(2000).then(async () => {
    task.first && task.first();
    await timeout(2000);
    task.second && task.second();
  });
  console.log("returning");
  return task;
};

const taskObject = foo();
taskObject.on("first", () => console.log("executing first callback"));
taskObject.on("second", () => console.log("executing second callback"));
Is there a better way to do this - without having the nested thens? Which approach would be better and when? EDIT - removed nested then clauses and replaced with then and await
PS: for my requirements, having callbacks would be sufficient. This is just so I can understand the concept better. Thanks!
I'm going to assume there's a reason for you not simply calling some named method after each async step has completed, i.e., you want to be able to plug in different handlers for each event. Here is one way to go about it - whether or not it's the best is hard to tell from the little context provided:
const foo = async (handlers) => {
  handlers.onLoad && handlers.onLoad();
  await task1();
  handlers.onTask1Complete && handlers.onTask1Complete();
  await task2();
  handlers.onTask2Complete && handlers.onTask2Complete();
}

const myHandlers = {
  onLoad: () => {
    // do stuff
  },
  onTask1Complete: () => {
    // do other stuff
  },
  onTask2Complete: () => {
    // etc
  }
};

foo(myHandlers);
Note that it lets you specify only the handlers you need. A more flexible approach would be a publish-subscribe model, where a subscribe method pushes a function onto an array of handlers, all of which are called when the event occurs; a rough sketch of that idea follows.
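Something like this (a minimal sketch; createEmitter and the event names are made up here, and task1/task2 are the question's placeholder tasks):

const createEmitter = () => {
  const handlers = {}; // event name -> list of callbacks
  return {
    on(event, cb) {
      (handlers[event] = handlers[event] || []).push(cb);
    },
    emit(event, payload) {
      (handlers[event] || []).forEach(cb => cb(payload));
    }
  };
};

const foo = () => {
  const emitter = createEmitter();
  // start the async work on the next tick so subscribers can attach first
  setTimeout(async () => {
    emitter.emit('load');
    await task1();
    emitter.emit('status_change', 'task1 done');
    await task2();
    emitter.emit('status_change', 'task2 done');
  }, 0);
  return emitter;
};

const bar = foo();
bar.on('load', () => console.log('loading started'));
bar.on('status_change', status => console.log(status));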
The best option would be to make use of promises: every time a promise resolves, you get notified, and then the next promise in the cascade executes.
An example of chaining promises is below:
var function3 = function(resolve, reject)
{
    try
    {
        //do some thing
        console.log('function3 called');
        resolve('function3 success');
    }
    catch(err)
    {
        reject(err);
    }
}

var function2 = function(resolve, reject)
{
    try
    {
        //do some thing
        console.log('function2 called');
        resolve('function2 success');
        //return new Promise(function3);
    }
    catch(err)
    {
        reject(err);
    }
}

var function1 = function(resolve, reject)
{
    try
    {
        //do some thing
        console.log('function1 called');
        resolve('function1 success');
    }
    catch(err)
    {
        reject(err);
    }
}

var promise = new Promise(function1);
promise
    .then(function(response){
        console.log(response);
        return new Promise(function2);
    }, function(error)
    {
        console.log(error);
    })
    .then(function(response)
    {
        console.log(response);
        return new Promise(function3);
    },
    function(err)
    {
        console.log(err);
    })
    .then(function(response)
    {
        console.log(response);
    },
    function(err)
    {
        console.log(err);
    })
//output
"function1 called"
"function1 success"
"function2 called"
"function2 success"
"function3 called"
"function3 success"

Halt Execution of Network Request If It Takes Too Long?

I have some code that basically calls fetch in Javascript. The third party services sometimes take too long to return a response and in an attempt to be more user-friendly, I want to be able to either post a message or stop the connection from being open after N milliseconds.
I had recently come across this post:
Skip the function if executing time too long. JavaScript
But did not have much luck and had issues getting it to work with the below code. I was also hoping that there was a more modern approach to do such a task, maybe using async/await?
module.exports = (url, { ...options } = {}) => {
  return fetch(url, {
    ...options
  })
}
You can use a combination of Promise.race and AbortController, here is an example:
function get(url, timeout) {
  const controller = new AbortController();
  return Promise.race([fetch(url, {
    signal: controller.signal
  }), new Promise(resolve => {
    setTimeout(() => {
      resolve("request was not fulfilled in time");
      controller.abort();
    }, timeout)
  })]);
}

(async () => {
  const result = await get("https://example.com", 1);
  console.log(result);
})();
The native Fetch API doesn't have a timeout built in like something like axios does, but you can always create a wrapper function that wraps the fetch call to implement this.
Here is an example:
const fetchWithTimeout = (url, timeout, fetchConfig) => {
  const FETCH_TIMEOUT = timeout || 5000;
  let didTimeOut = false;

  return new Promise(function(resolve, reject) {
    const timer = setTimeout(function() {
      didTimeOut = true;
      reject(new Error('Request timed out'));
    }, FETCH_TIMEOUT);

    fetch(url, fetchConfig)
      .then(function(response) {
        // cleanup timeout
        clearTimeout(timer);
        if (!didTimeOut) {
          // fetch request was good
          resolve(response);
        }
      })
      .catch(function(err) {
        // Rejection already happened with setTimeout
        if (didTimeOut) return;
        // Reject with error
        reject(err);
      });
  })
  .then(function() {
    // Request success and no timeout
  })
  .catch(function(err) {
    // error
  });
}
from here https://davidwalsh.name/fetch-timeout
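Note that the wrapper above only rejects the promise; the underlying request keeps running. A sketch of the same idea using AbortController so the request is actually cancelled (fetchWithAbortTimeout is a made-up name, not from the article):

const fetchWithAbortTimeout = (url, timeout = 5000, fetchConfig = {}) => {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeout);
  // abort cancels the request itself, not just the promise we hand back
  return fetch(url, { ...fetchConfig, signal: controller.signal })
    .finally(() => clearTimeout(timer));
};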

How to raise a Timeout Error in node.js if code takes long time to finish?

In Ruby I can:

require 'timeout'
Timeout.timeout 10 do
  # do smth > 10 seconds
end

It will raise a timeout error to avoid the code locking up. How do I do the same thing in Node.js? Node's setTimeout doesn't fit my need.
One case is when an http.get times out (for example, the network is unstable): I should set a timeout and handle the failed GET request. I'd like to implement a timeout() like the one below; how should I do it?

try {
    timeout(10, function () {
        http.get("example.com/prpr")
    })
} catch (e) {
    if (e.message == "timeout") {
        // do smth
    } else {
        throw e
    }
}
You could look into a Promise-based approach here.
Using promises you can pass a function to be executed, and then the standard catch is called if that function raises an exception.
There is a helpful promise-based timeout library on NPM (npm install promise-timeout request-promise), and you could use it in Node something along the lines of...
'use strict';

var promiseTimeout = require('promise-timeout');
var requestPromise = require('request-promise');

promiseTimeout.timeout(requestPromise("http://example.com/prpr"), 10000)
    .then(function (result) {
        console.log({result});
    }).catch(function (err) {
        if (err instanceof promiseTimeout.TimeoutError) {
            console.error('HTTP get timed out');
        }
    });
I had a similar situation with NestJS, which is based on Node.js.
When calling an external API, the problem was that my own service slowed down if the call took too long (if the external API was delayed, my service ended up waiting forever).
I figured out two ways.
First way:
const result = await axios({
timeout: 10000, // error: [AxiosError: timeout of 10000ms exceeded] { code: 'ECONNABORTED', ...
...
});
Second way: Promise.race()
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/race
// first function
const callAPI = axios({
    method: "GET",
    url: "http://yourapi",
    headers: {
        ...
    }
});

// second function
const timeoutCheck = (s) => {
    return new Promise(resolve => setTimeout(resolve, s));
}

// check delay (first function VS second function)
const result = await Promise.race([
    callAPI,
    timeoutCheck(10000).then(() => {
        throw new Error("api not responding for more than 10 seconds");
    }),
]);

const { data: { resultCode, resultData } } = result;
You can try this out in your case:
var request = http.get(options, function (res) {
    // other code goes here
});

request.setTimeout(10000, function () {
    // handle timeout here
});
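Note that, as far as I know, setTimeout on the request only emits the timeout event; the request itself stays open unless you end it. A small sketch of handling that (assuming the same options object as above):

const http = require('http');

const request = http.get(options, function (res) {
    // other code goes here
});

request.setTimeout(10000, function () {
    // give up on the request so the socket doesn't stay open
    request.destroy(new Error('timeout'));
});

request.on('error', function (err) {
    // the destroy() above surfaces here as an 'error' event
});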

fetch retry request (on failure)

I'm using browser's native fetch API for network requests. Also I am using the whatwg-fetch polyfill for unsupported browsers.
However, I need to retry in case the request fails. There is this npm package whatwg-fetch-retry I found, but they haven't explained how to use it in their docs. Can somebody help me with this or suggest an alternative?
From the fetch docs:

fetch('/users')
  .then(checkStatus)
  .then(parseJSON)
  .then(function(data) {
    console.log('succeeded', data)
  }).catch(function(error) {
    console.log('request failed', error)
  })

See that catch? It will trigger when fetch fails, and you can fetch again there.
Have a look at the Promise API.
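For illustration, retrying once from inside the catch might look like this (a sketch reusing checkStatus and parseJSON from the docs example above):

fetch('/users')
  .then(checkStatus)
  .then(parseJSON)
  .catch(function(error) {
    console.log('request failed once, retrying', error)
    // issue the request again from inside the catch handler
    return fetch('/users').then(checkStatus).then(parseJSON)
  })
  .then(function(data) {
    console.log('succeeded', data)
  })
  .catch(function(error) {
    console.log('request failed twice, giving up', error)
  })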
Implementation example:
function wait(delay){
    return new Promise((resolve) => setTimeout(resolve, delay));
}

function fetchRetry(url, delay, tries, fetchOptions = {}) {
    function onError(err){
        const triesLeft = tries - 1;
        if(!triesLeft){
            throw err;
        }
        return wait(delay).then(() => fetchRetry(url, delay, triesLeft, fetchOptions));
    }
    return fetch(url, fetchOptions).catch(onError);
}
Edit 1: as suggested by golopot, p-retry is a nice option.
Edit 2: simplified example code.
I recommend using some library for promise retry, for example p-retry.
Example:
const pRetry = require('p-retry')
const fetch = require('node-fetch')

async function fetchPage () {
  const response = await fetch('https://stackoverflow.com')

  // Abort retrying if the resource doesn't exist
  if (response.status === 404) {
    throw new pRetry.AbortError(response.statusText)
  }

  return response.blob()
}

;(async () => {
  console.log(await pRetry(fetchPage, {retries: 5}))
})()
I don't like recursion unless it's really necessary, and managing an exploding number of dependencies is also an issue. Here is another alternative in TypeScript, which is easy to translate to JavaScript.
interface retryPromiseOptions<T> {
    retryCatchIf?: (response: T) => boolean,
    retryIf?: (response: T) => boolean,
    retries?: number
}

function retryPromise<T>(promise: () => Promise<T>, options: retryPromiseOptions<T>) {
    const { retryIf = (_: T) => false, retryCatchIf = (_: T) => true, retries = 1 } = options
    let _promise = promise();

    for (var i = 1; i < retries; i++)
        _promise = _promise.catch((value) => retryCatchIf(value) ? promise() : Promise.reject(value))
            .then((value) => retryIf(value) ? promise() : Promise.reject(value));

    return _promise;
}
And use it this way...
retryPromise(() => fetch(url),{
retryIf: (response:Response) => true, // you could check before trying again
retries: 5
}).then( ... my favorite things ... )
I wrote this for the fetch API in the browser, which does not reject on a 500. And I did not implement a wait. But, more importantly, the code shows how to use composition with promises to avoid recursion.
Javascript version:
function retryPromise(promise, options) {
    const { retryIf, retryCatchIf, retries } = { retryIf: () => false, retryCatchIf: () => true, retries: 1, ...options };
    let _promise = promise();

    for (var i = 1; i < retries; i++)
        _promise = _promise.catch((value) => retryCatchIf(value) ? promise() : Promise.reject(value))
            .then((value) => retryIf(value) ? promise() : Promise.reject(value));

    return _promise;
}
Javascript usage:
retryPromise(() => fetch(url),{
retryIf: (response) => true, // you could check before trying again
retries: 5
}).then( ... my favorite things ... )
EDITS: Added js version, added retryCatchIf, fixed the loop start.
One can easily wrap fetch(...) in a loop and catch potential errors (fetch only rejects the returned promise on network errors and the like):
const RETRY_COUNT = 5;

async function fetchRetry(...args) {
  let count = RETRY_COUNT;
  while(count > 0) {
    try {
      return await fetch(...args);
    } catch(error) {
      // logging ?
    }
    // logging / waiting?
    count -= 1;
  }
  throw new Error(`Too many retries`);
}
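If you want the waiting hinted at in the comments, a small variation with a delay between attempts might look like this (the delay value is arbitrary; it reuses RETRY_COUNT from above):

const RETRY_DELAY_MS = 1000;

function wait(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

async function fetchRetryWithDelay(...args) {
  let count = RETRY_COUNT;
  while (count > 0) {
    try {
      return await fetch(...args);
    } catch (error) {
      count -= 1;
      if (count > 0) {
        // wait a bit before the next attempt
        await wait(RETRY_DELAY_MS);
      }
    }
  }
  throw new Error(`Too many retries`);
}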
