I have a fetch-api POST request:
fetch(url, {
method: 'POST',
body: formData,
credentials: 'include'
})
I want to know: what is the default timeout for this request, and how can I set it to a particular value, such as 3 seconds, or make it indefinite?
Using a Promise.race solution leaves the request hanging: it still consumes bandwidth in the background and lowers the maximum number of concurrent requests allowed while it is still in flight.
Instead, use AbortController to actually abort the request. Here is an example:
const controller = new AbortController()
// 5 second timeout:
const timeoutId = setTimeout(() => controller.abort(), 5000)
fetch(url, { signal: controller.signal }).then(response => {
// request completed before the timeout fired
// If you only want to time out the request, not the response, add:
// clearTimeout(timeoutId)
})
Alternatively, you can use the newly added AbortSignal.timeout(5000); all evergreen environments support it now. The trade-offs are that you lose the ability to abort the request manually, and both upload and download have to finish within the total time of 5 s.
// a polyfill for it would be:
AbortSignal.timeout ??= function timeout(ms) {
const ctrl = new AbortController()
setTimeout(() => ctrl.abort(), ms)
return ctrl.signal
}
fetch(url, { signal: AbortSignal.timeout(5000) })
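If you need both the automatic timeout and the ability to cancel manually, newer engines (roughly current evergreen browsers and Node 20+; check support for your targets) also provide AbortSignal.any(), which merges several signals into one. A minimal sketch, not part of the original answer; cancelButton is a hypothetical element:
const userController = new AbortController()
// aborts when either the 5s timeout fires or the user cancels
fetch(url, {
  signal: AbortSignal.any([
    AbortSignal.timeout(5000),
    userController.signal
  ])
})
// e.g. cancelButton.onclick = () => userController.abort()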
AbortController can be used for other things as well, not only fetch but also readable/writable streams. More new functions (especially promise-based ones) will use it more and more. Node.js has also implemented AbortController in its streams and filesystem APIs, and I know Web Bluetooth is looking into it too. It can now also be used with the addEventListener options object, so a listener stops listening when the signal aborts.
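As a small illustration of the addEventListener integration mentioned above (a sketch; the event name and handler are just placeholders):
const controller = new AbortController()
// the listener is removed automatically once the signal aborts
window.addEventListener('resize', () => console.log('resized'), { signal: controller.signal })
// later:
controller.abort() // also stops listening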
Update: since my original answer is a bit outdated, I recommend using an abort controller as implemented here: https://stackoverflow.com/a/57888548/1059828, or take a look at this really good post explaining AbortController with fetch: How do I cancel an HTTP fetch() request?
Outdated original answer:
I really like the clean approach from this gist using Promise.race
fetchWithTimeout.js
export default function (url, options, timeout = 7000) {
return Promise.race([
fetch(url, options),
new Promise((_, reject) =>
setTimeout(() => reject(new Error('timeout')), timeout)
)
]);
}
main.js
import fetch from './fetchWithTimeout'
// call as usual, or pass a timeout as the 3rd argument
// throws a timeout error after at most 5 seconds
fetch('http://google.com', options, 5000)
.then((result) => {
// handle result
})
.catch((e) => {
// handle errors and timeout error
})
Edit 1
As pointed out in comments, the code in the original answer keeps running the timer even after the promise is resolved/rejected.
The code below fixes that issue.
function timeout(ms, promise) {
return new Promise((resolve, reject) => {
const timer = setTimeout(() => {
reject(new Error('TIMEOUT'))
}, ms)
promise
.then(value => {
clearTimeout(timer)
resolve(value)
})
.catch(reason => {
clearTimeout(timer)
reject(reason)
})
})
}
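Usage stays the same as with the rough version in the original answer below; for example (reusing the url and formData from the question):
timeout(3000, fetch(url, { method: 'POST', body: formData, credentials: 'include' }))
  .then(response => {
    // handle the response
  })
  .catch(error => {
    // network error, or Error('TIMEOUT') if 3 seconds elapsed first
  })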
Original answer
It doesn't have a specified default; the specification doesn't discuss timeouts at all.
You can implement your own timeout wrapper for promises in general:
// Rough implementation. Untested.
function timeout(ms, promise) {
return new Promise(function(resolve, reject) {
setTimeout(function() {
reject(new Error("timeout"))
}, ms)
promise.then(resolve, reject)
})
}
timeout(1000, fetch('/hello')).then(function(response) {
// process response
}).catch(function(error) {
// might be a timeout error
})
As described in https://github.com/github/fetch/issues/175
Comment by https://github.com/mislav
Building on Endless' excellent answer, I created a helpful utility function.
const fetchTimeout = (url, ms, { signal, ...options } = {}) => {
const controller = new AbortController();
const promise = fetch(url, { signal: controller.signal, ...options });
if (signal) signal.addEventListener("abort", () => controller.abort());
const timeout = setTimeout(() => controller.abort(), ms);
return promise.finally(() => clearTimeout(timeout));
};
If the timeout is reached before the resource is fetched then the fetch is aborted.
If the resource is fetched before the timeout is reached then the timeout is cleared.
If the input signal is aborted then the fetch is aborted and the timeout is cleared.
const controller = new AbortController();
document.querySelector("button.cancel").addEventListener("click", () => controller.abort());
fetchTimeout("example.json", 5000, { signal: controller.signal })
.then(response => response.json())
.then(console.log)
.catch(error => {
if (error.name === "AbortError") {
// fetch aborted either due to timeout or due to user clicking the cancel button
} else {
// network error or json parsing error
}
});
There's no timeout support in the fetch API yet, but it can be achieved by wrapping it in a promise, e.g.:
function fetchWrapper(url, options, timeout) {
return new Promise((resolve, reject) => {
fetch(url, options).then(resolve, reject);
if (timeout) {
const e = new Error("Connection timed out");
setTimeout(reject, timeout, e);
}
});
}
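Note that the timer in this wrapper keeps running even after the fetch settles. A variant that clears it could look like this (a sketch with the same signature, not part of the original answer):
function fetchWrapper(url, options, timeout) {
  return new Promise((resolve, reject) => {
    let timer;
    if (timeout) {
      timer = setTimeout(reject, timeout, new Error("Connection timed out"));
    }
    // clearTimeout(undefined) is a harmless no-op if no timeout was given
    fetch(url, options)
      .then(resolve, reject)
      .finally(() => clearTimeout(timer));
  });
}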
If you haven't configured a timeout in your code, the browser's default request timeout applies:
1) Firefox - 90 seconds
Type about:config in the Firefox URL field and find the value for the key network.http.connection-timeout.
2) Chrome - 300 seconds
Source
EDIT: The fetch request will still be running in the background and will most likely log an error in your console.
Indeed, the Promise.race approach is better.
See Promise.race() for reference.
Race means that all the promises run at the same time, and the race settles as soon as one of them settles; only that first result is returned, while the losing promises keep running in the background.
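To see that behaviour in isolation, here is a tiny standalone sketch (not related to fetch):
const fast = new Promise(resolve => setTimeout(resolve, 100, 'fast'));
const slow = new Promise(resolve => setTimeout(resolve, 500, 'slow'));
// settles with 'fast'; the slower promise keeps running but its value is ignored
Promise.race([fast, slow]).then(winner => console.log(winner));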
You could also pass a function to call if the fetch times out.
fetchWithTimeout(url, {
method: 'POST',
body: formData,
credentials: 'include',
}, 5000, () => { /* do stuff here */ });
If this piques your interest, a possible implementation would be:
function fetchWithTimeout(url, options, delay, onTimeout) {
const timer = new Promise((resolve) => {
setTimeout(resolve, delay, {
timeout: true,
});
});
return Promise.race([
fetch(url, options),
timer
]).then(response => {
if (response.timeout) {
onTimeout();
}
return response;
});
}
A cleaner way to do it is actually shown on MDN: https://developer.mozilla.org/en-US/docs/Web/API/AbortSignal#aborting_a_fetch_operation_with_a_timeout
try {
await fetch(url, { signal: AbortSignal.timeout(5000) });
} catch (e) {
if (e.name === "TimeoutError") {
console.log('5000 ms timeout');
}
}
Here's an SSCCE using Node.js that will time out after 1000 ms:
import fetch from 'node-fetch';
const controller = new AbortController();
const timeout = setTimeout(() => {
controller.abort();
}, 1000); // will time out after 1000ms
fetch('https://www.yourexample.com', {
signal: controller.signal,
method: 'POST',
body: formData,
credentials: 'include'
}
)
.then(response => response.json())
.then(json => console.log(json))
.catch(err => {
if(err.name === 'AbortError') {
console.log('Timed out');
}
})
.finally( () => {
clearTimeout(timeout);
});
Using AbortController and setTimeout:
const abortController = new AbortController();
let timer: number | null = null;
fetch('/get', {
signal: abortController.signal, // connect the request to the abortController
})
.then(res => {
// response success
console.log(res);
if (timer) {
clearTimeout(timer); // clear timer
}
})
.catch(err => {
if (err instanceof DOMException && err.name === 'AbortError') {
// will return a DOMException
return;
}
// other errors
});
timer = setTimeout(() => {
abortController.abort();
}, 1000 * 10); // Abort request in 10s.
This is a fragment using @fatcherjs/middleware-aborter.
Using fatcher, it is easy to abort a fetch request.
import { aborter } from '@fatcherjs/middleware-aborter';
import { fatcher, isAbortError } from 'fatcher';
fatcher({
url: '/bar/foo',
middlewares: [
aborter({
timeout: 10 * 1000, // 10s
onAbort: () => {
console.log('Request is Aborted.');
},
}),
],
})
.then(res => {
// Request success in 10s
console.log(res);
})
.catch(err => {
if (isAbortError(err)) {
// runs when the request was aborted
console.error(err);
}
// Other errors.
});
function fetchTimeout(url, options, timeout = 3000) {
return new Promise( (resolve, reject) => {
fetch(url, options)
.then(resolve,reject)
setTimeout(() => reject(new Error('timeout')), timeout);
})
}
You can create a timeoutPromise wrapper
function timeoutPromise(timeout, err, promise) {
return new Promise(function(resolve,reject) {
promise.then(resolve,reject);
setTimeout(reject.bind(null,err), timeout);
});
}
You can then wrap any promise
timeoutPromise(100, new Error('Timed Out!'), fetch(...))
.then(...)
.catch(...)
It won't actually cancel the underlying connection, but it will allow you to time out a promise.
Reference
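If you also need to cancel the underlying connection, one option is to combine the same idea with an AbortController; a rough sketch (fetchWithAbortTimeout is a made-up name, not part of the answer above):
function fetchWithAbortTimeout(url, options, ms) {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), ms);
  return fetch(url, { ...options, signal: controller.signal })
    .finally(() => clearTimeout(timer));
}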
Proper error handling tips
Normal practice:
To add timeout support, most of the time it is suggested to introduce a Promise utility function like this:
function fetchWithTimeout(resource, { signal, timeout, ...options } = {}) {
const controller = new AbortController();
if (signal != null) signal.addEventListener("abort", () => controller.abort());
const id = timeout != null ? setTimeout(() => controller.abort(), timeout) : undefined;
return fetch(resource, {
...options,
signal: controller.signal
}).finally(() => {
if (id != null) clearTimeout(id);
});
}
Calling controller.abort or rejecting the promise inside the setTimeout callback function distorts the stack trace.
This is suboptimal, since one would have to add boilerplate error handlers with log messages in the functions calling the fetch method if post-error log analysis is required.
Good expertise:
To preserve the error along with its stack trace, one can apply the following technique:
function sleep(ms = 0, signal) {
return new Promise((resolve, reject) => {
const id = setTimeout(() => resolve(), ms);
signal?.addEventListener("abort", () => {
clearTimeout(id);
reject();
});
});
}
import nodeFetch from "node-fetch";
async function fetchWithTimeout(
resource,
options
) {
const { timeout, signal, ...ropts } = options ?? {};
const controller = new AbortController();
let sleepController;
try {
signal?.addEventListener("abort", () => controller.abort());
const request = nodeFetch(resource, {
...ropts,
signal: controller.signal,
});
if (timeout != null) {
sleepController = new AbortController();
const aborter = sleep(timeout, sleepController.signal);
const race = await Promise.race([aborter, request]);
if (race == null) controller.abort();
}
return request;
} finally {
sleepController?.abort();
}
}
(async () => {
try {
await fetchWithTimeout(new URL(window.location.href), { timeout: 5 });
} catch (error) {
console.error("Error in test", error);
}
})();
Using the c-promise2 lib, a cancellable fetch with a timeout might look like this (live jsfiddle demo):
import CPromise from "c-promise2"; // npm package
function fetchWithTimeout(url, {timeout, ...fetchOptions}= {}) {
return new CPromise((resolve, reject, {signal}) => {
fetch(url, {...fetchOptions, signal}).then(resolve, reject)
}, timeout)
}
const chain = fetchWithTimeout("https://run.mocky.io/v3/753aa609-65ae-4109-8f83-9cfe365290f0?mocky-delay=10s", {timeout: 5000})
.then(request=> console.log('done'));
// chain.cancel(); - to abort the request before the timeout
This code is available as an npm package: cp-fetch
I've been trying to find a wrapper that does fetch with retries, timeouts, aborts, etc. I came across https://pastebin.com/54Ct4xEh a little while ago, and after fixing a couple of typos (a missing options. and =>), it works, except... well, maybe it works, but I don't know how to use it. How do I abort a fetch with this particular wrapper? I have a fiddle: https://jsfiddle.net/1fdwb2o6/2/.
With this code, how can I, say, click a button and have it abort this fetch loop? For my use case, I'm using Bootstrap, and I have a modal that, when shown, attempts to load dynamic content. If the user clicks Cancel while it's loading, I want the fetch process to stop. From what I can tell, I should be able to do it with the code below... but I'm not sure how to perform the abort. Perhaps this isn't possible, as structured, with a Promise, but I don't know enough (anything) about promises to know better, one way or the other.
const fetchWithRetry = (userOptions) => {
let abort = false;
const options = {
url: '',
options: {},
cancel: {},
retries: 5,
retryDelay: 1000,
...userOptions
};
// Add an abort to the cancel object.
options.cancel.abort = () => {
abort = true;
};
// Abort or proceed?
return abort ? Promise.reject('aborted') : fetch(options.url).then(response => {
// Reject because of abort
return abort ? Promise.reject('aborted')
// Response is good
: response.ok ? Promise.resolve(response.text())
// Retries exceeded
: !options.retries ? Promise.reject('retries exceeded')
// Retry with one less retry
: new Promise((resolve, reject) => {
setTimeout(() => {
// We use the returned promise's resolve and reject as
// callback so that the nested call propagates backwards.
fetchWithRetry({
...options,
retries: options.retries - 1
}).then(resolve, reject);
}, options.retryDelay);
});
});
}
var xxx;
console.clear();
xxx = fetchWithRetry({
url: "some_file_that_doesnt_exist.php"
})
.then((response) => {
alert(response);
}).catch(function(err) {
// Error: response error, request timeout or runtime error
alert("Error! Cannot load folder list! Please try again!");
});
setTimeout(function() {
// somehow, abort the fetch...
// xxx.abort(); <-- no worky...
}, 1234);
As I said in my comments, the code you have in your question does not provide a cancel() function that the caller can use. It has a cancel() function internally, but that's not something the caller can use. As written that function just returns a promise so the caller has nothing they can call to cancel the retries.
So, I decided to write my own version of fetchWithRetry() that would work for your use case. This has a number of capabilities that the one in your question does not:
It returns both the promise and a cancel function so the caller can cancel the retries.
It allows you to pass the init options for fetch() so you can pass any of the various arguments that fetch() supports and are often needed, such as credentials.
It has an option to check the response.ok boolean, so it will detect and retry more things than you would if you required the promise to be rejected before a retry (note: fetch() doesn't reject on a 404, for example).
If there was a fetch() rejection and it was either cancelled or ran out of retries, it uses the newer Error cause feature to set the cause to the actual fetch() error, so the caller can see what the original error was.
Note that this version of fetchWithRetry() returns an object containing both a promise and a cancel function. The caller uses the promise the same way they would any promise from fetch() and they can use the cancel() function to cancel any further retries.
Here's the code:
const Deferred = function() {
if (!(this instanceof Deferred)) {
return new Deferred();
}
const p = this.promise = new Promise((resolve, reject) => {
this.resolve = resolve;
this.reject = reject;
});
this.then = p.then.bind(p);
this.catch = p.catch.bind(p);
if (p.finally) {
this.finally = p.finally.bind(p);
}
}
function fetchWithRetry(url, userOptions = {}, init = {}) {
const options = {
// default options values, can be overridden by userOptions
retries: 3,
retryDelay: 1000,
checkResponseOk: true,
...userOptions
};
let cancelled = false;
let timerDeferred;
let timer;
function run() {
return fetch(url, init).then(response => {
// force retry on non 2xx responses too
if (options.checkResponseOk && !response.ok) {
throw new Error(`fetch failed with status ${response.status}`);
}
return response;
}).catch(err => {
// got error, set up retry
console.log(err);
if (cancelled) {
throw new Error("fetch cancelled", { cause: err });
}
--options.retries;
if (options.retries < 0) {
throw new Error("fetch max retries exceeded", { cause: err });
}
// create new Deferred object for use with our timer
// so it can be resolved by the timer or rejected
// by the cancel callback
timerDeferred = new Deferred();
timer = setTimeout(() => {
timerDeferred.resolve();
timer = null;
}, options.retryDelay);
return timerDeferred.then(() => {
if (cancelled) {
throw new Error("fetch cancelled", { cause: err });
}
return run();
});
});
}
return {
promise: run(),
cancel: () => {
cancelled = true;
// if currently in a timer waiting, reject immediately
if (timer) {
clearTimeout(timer);
timer = null;
}
if (timerDeferred) {
timerDeferred.reject(new Error("fetch cancelled"));
}
}
}
};
Sample usage:
const result = fetchWithRetry(someUrl);
result.promise.then(resp => {
return resp.text().then(data => {
// got final result here
console.log(data.slice(0, 100));
});
}).catch(err => {
console.log(err);
});
// simulate user cancel after 1.5 seconds
setTimeout(() => {
result.cancel();
}, 1500);
I'm trying to implement a dummy emulation of the following logic:
But I'm not sure I fully understand the best practices for how to do it.
The main point of this task is to avoid triggering redundant catch block callbacks. IMO, if the 1st request fails, then all following code should stop.
I mean: if the 1st request failed, then we do not make the 2nd request and do not call the catch block of the 2nd request's promise.
In a few words I'm looking for very clean and simple solution like following:
firstRequest()
.then(r => {
console.log('firstRequest success', r);
return secondRequest();
}, e => console.log('firstRequest fail', e))
.then(r => {
console.log('secondRequest success', r);
// Should I return something here? Why?
}, e => console.log('secondRequest fail', e));
I've written the following implementation. It works as expected when both requests succeed, and when the 2nd request fails, but it works incorrectly when the 1st request fails (as you can see, both catch blocks fire). You can play around with the isFirstSucceed and isSecondSucceed flags to check it.
var ms = 1000;
var isFirstSucceed = false;
var isSecondSucceed = true;
var getUsersId = () => new Promise((res, rej) => {
console.log('request getUsersId');
setTimeout(() => {
if (isFirstSucceed) {
return res([1,2,3,4,5]);
} else {
return rej(new Error());
}
}, ms);
});
var getUserById = () => new Promise((res, rej) => {
console.log('request getUserById');
setTimeout(() => {
if (isSecondSucceed) {
return res({name: 'John'});
} else {
return rej(new Error());
}
}, ms);
});
getUsersId()
.then(r => {
console.info('1st request succeed', r);
return getUserById();
}, e => {
console.error('1st request failed', e);
throw e;
})
.then(
r => console.info('2nd request succeed', r),
e => {
console.error('2nd request failed', e);
throw e;
});
I can move the then of the 2nd request into the then of the 1st request, but it looks ugly.
var ms = 1000;
var isFirstSucceed = false;
var isSecondSucceed = true;
var getUsersId = () => new Promise((res, rej) => {
console.log('request getUsersId');
setTimeout(() => {
if (isFirstSucceed) {
return res([1,2,3,4,5]);
} else {
return rej(new Error());
}
}, ms);
});
var getUserById = () => new Promise((res, rej) => {
console.log('request getUserById');
setTimeout(() => {
if (isSecondSucceed) {
return res({name: 'John'});
} else {
return rej(new Error());
}
}, ms);
});
getUsersId()
.then(r => {
console.info('1st request succeed', r);
getUserById().then(
r => console.info('2nd request succeed', r),
e => {
console.error('2nd request failed', e);
throw e;
});
}, e => {
console.error('1st request failed', e);
throw e;
})
Questions:
How to implement described logic according to all promises best practices?
Is it possible to avoid throw e in every catch block?
Should I use es6 Promises? Or it is better to use some promises library?
Any other advices?
Your flow diagram is the logic you want to achieve, but it isn't quite how promises work. The issue is that there is no way to tell a promise chain to just "end" right here and not call any other .then() or .catch() handlers later in the chain. If you get a rejection in the chain and leave it rejected, it will call the next .catch() handler in the chain. If you handle the rejection locally and don't rethrow it, then it will call the next .then() handler in the chain. Neither of those options matches your logic diagram exactly.
So, you have to mentally change how you think about your logic diagram so that you can use a promise chain.
The simplest option (what is probably used for 90% of promise chains) is to just put one error handler at the end of the chain. Any error anywhere in the chain just skips to the single .catch() handler at the end. FYI, in most cases I find the code more readable with .catch() than with the 2nd argument to .then(), so that's how I've shown it here.
firstRequest().then(secondRequest).then(r => {
console.log('both requests successful');
}).catch(err => {
// An error from either request will show here
console.log(err);
});
When you provide a catch block and you don't either return a rejected promise or rethrow the error, then the promise infrastructure thinks you have "handled" the promise so the chain continues as resolved. If you rethrow the error, then the next catch block will fire and any intervening .then() handlers will be skipped.
You can make use of that to catch an error locally, do something (like log it) and then rethrow it to keep the promise chain as rejected.
firstRequest().catch(e => {
console.log('firstRequest fail', e);
e.logged = true;
throw e;
}).then(r => {
console.log('firstRequest success', r);
return secondRequest();
}).then(r => {
console.log('secondRequest success', r);
}).catch(e => {
if (!e.logged) {
console.log('secondRequest fail', e);
}
});
Or, a version that marks the error object with a debug message and rethrows, and then logs errors in only one place:
firstRequest().catch(e => {
e.debugMsg = 'firstRequest fail';
throw e;
}).then(r => {
console.log('firstRequest success', r);
return secondRequest().catch(e => {
e.debugMsg = 'secondRequest fail';
throw e;
});
}).then(r => {
console.log('secondRequest success', r);
}).catch(e => {
console.log(e.debugMsg, e);
});
I've even had situations where a little helper function saved me some code and some visual complexity, particularly if there are a bunch of these in the chain:
function markErr(debugMsg) {
return function(e) {
// mark the error object and rethrow
e.debugMsg = debugMsg;
throw e;
}
}
firstRequest()
.catch(markErr('firstRequest fail'))
.then(r => {
console.log('firstRequest success', r);
return secondRequest().catch(markErr('secondRequest fail'));
}).then(r => {
console.log('secondRequest success', r);
}).catch(e => {
console.log(e.debugMsg, e);
});
Taking each of your questions individually:
How to implement described logic according to all promises best practices?
Described above. I'd say the simplest and best practice is the very first code block I show. If you need to make sure when you get to the final .catch() that you have a uniquely identifiable error so you know which step caused it, then modify the rejected error in each individual function to be unique so you can tell which it was from the one .catch() block at the end. If you can't modify those functions, then you can wrap them with a wrapper that catches and marks their error or you can do that inline with the markErr() type solution I showed. In most cases, you just need to know there was an error and not the exact step it occurred in so usually that isn't necessary for every step in the chain.
Is it possible to avoid throw e in every catch block?
That depends. If the error objects are already unique, then you can just use one .catch() at the end. If the error objects are not unique, but you need to know which exact step failed, then you have to either use a .catch() at each step so you can mark the error uniquely or you need to modify each function in the chain to have a unique error.
Should I use es6 Promises?
Yes. No better way I know of.
Or it is better to use some promises library?
I'm not aware of any features in a promise library that would make this simpler. This is really just about how you want to report errors and whether each step is defining a unique error or not. A promise library can't really do that for you.
Any other advice?
Keep learning more about how to mold promises into a solution for each individual problem.
IMO, you can use async/await: still with promises, but much cleaner to look at. Here is my sample approach to the above logic.
function firstRequest() {
return new Promise((resolve, reject) => {
// add async function here
// and resolve("done")/reject("err")
});
}
function secondRequest() {
return new Promise((resolve, reject) => {
// add async function here
// and resolve("done")/reject("err")
});
}
async function startProgram() {
try {
await firstRequest();
await secondRequest();
} catch(err) {
console.log(err);
goToEndFn(); // placeholder for your own failure handling
}
}
startProgram(); // start the program
https://github.com/xobotyi/await-of
$ npm i --save await-of
import of from "await-of";
async () => {
let [res1, err1] = await of(axios.get('some.uri/to/get11'));
let [res2, err2] = await of(axios.get('some.uri/to/get22'));
if (err1) {
console.log('err1', err1)
}
if (err2) {
console.log('err2', err2)
}
console.log('res1', res1)
console.log('res2', res2)
};
Async/await maybe?
async function foo() {
try {
const firstResult = await firstRequest();
const secondResult = await secondRequest();
} catch(e) {
// e = either first or second error
}
}
In this code an error on the first request transfers control to the catch block and the second request won't start
Should I use es6 Promises?
Probably yes, unless you're pretty sure your code will be used in obsolete environments. They are not so new and flashy anymore.
You do not need to handle the error for every promise; you only need a single common error handler.
Do it like this:
var ms = 1000;
var isFirstSucceed = false;
var isSecondSucceed = true;
var getUsersId = () => new Promise((res, rej) => {
console.log('request getUsersId');
setTimeout(() => {
if (isFirstSucceed) {
return res([1,2,3,4,5]);
} else {
return rej();
}
}, ms);
});
var getUserById = () => new Promise((res, rej) => {
console.log('request getUserById');
setTimeout(() => {
if (isSecondSucceed) {
return res({name: 'John'});
} else {
return rej(new Error());
}
}, ms);
});
getUsersId()
.then(r => {
console.info('1st request succeed', r);
return getUserById();
})
.then(r => {
console.info('2nd request succeed', r);
return;
})
.catch((e) => {
console.error('request failed', e);
throw new Error(e);
})
You can abuse the duck-typing technique to stop a promise chain by returning { then: function() {} }. I modified your code right after this line: console.error('1st request failed', e);
var ms = 1000;
var isFirstSucceed = false;
var isSecondSucceed = true;
var getUsersId = () => new Promise((res, rej) => {
console.log('request getUsersId');
setTimeout(() => {
if (isFirstSucceed) {
return res([1,2,3,4,5]);
} else {
return rej(new Error());
}
}, ms);
});
var getUserById = () => new Promise((res, rej) => {
console.log('request getUserById');
setTimeout(() => {
if (isSecondSucceed) {
return res({name: 'John'});
} else {
return rej(new Error());
}
}, ms);
});
getUsersId()
.then(r => {
console.info('1st request succeed', r);
return getUserById();
}, e => {
console.error('1st request failed', e);
return { then: function() {} };
})
.then(
r => console.info('2nd request succeed', r),
e => {
console.error('2nd request failed', e);
throw e;
});
In Ruby I can do:
require 'timeout'
Timeout.timeout 10 do
# do smth > 10 seconds
end
It will raise a timeout error to avoid locking up the code. How can I do the same thing in Node.js? Node's setTimeout doesn't fit my need.
One case is when an http.get times out (for example, the network is unstable): I want to set a timeout and handle the failed GET request. I'd like to implement something like #timeout; how should I do it?
try {
timeout(10, function () {
http.get("example.com/prpr")
})
} catch (e) {
if (e.message == "timeout") {
// do smth
} else {
throw e
}
}
You could look into a Promise-based approach here.
Using promises, you can pass a function to be executed, and the standard catch is called if that function raises an exception.
There is a helpful promise-based timeout library on npm (npm install promise-timeout request-promise), and you could use it in Node along the lines of...
'use strict';
var promiseTimeout = require('promise-timeout');
var requestPromise = require('request-promise');
promiseTimeout.timeout(requestPromise("http://example.com/prpr"), 10000)
.then(function (result) {
console.log({result});
}).catch(function (err) {
if (err instanceof promiseTimeout.TimeoutError) {
console.error('HTTP get timed out');
}
});
I had a similar situation with NestJS, which is based on Node.js.
When calling an external API, the problem was that even my own service slowed down if the call took too long (if the external API is delayed, my service also ends up waiting forever).
I figured out 2 ways.
First way:
const result = await axios({
timeout: 10000, // error: [AxiosError: timeout of 10000ms exceeded] { code: 'ECONNABORTED', ...
...
});
Second way: Promise.race()
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/race
// first function
const callAPI = axios({
method: "GET",
url: "http://yourapi",
headers: {
...
}
});
// second function
const timeoutCheck = (s) => {
return new Promise(resolve => setTimeout(resolve, s));
}
// check delay (first function VS second function)
const result = await Promise.race([
callAPI,
timeoutCheck(10000).then(() => {
throw new Error("api not responding for more than 10 seconds");
}),
]);
const { data: { resultCode, resultData } } = result;
You can try this out in your case:
var request = http.get(options, function (res) {
// other code goes here
});
request.setTimeout( 10000, function( ) {
// handle timeout here
});
I'm using the browser's native fetch API for network requests, and I am also using the whatwg-fetch polyfill for unsupported browsers.
However, I need to retry in case the request fails. There is this npm package whatwg-fetch-retry I found, but they haven't explained how to use it in their docs. Can somebody help me with this or suggest an alternative?
From the fetch docs:
fetch('/users')
.then(checkStatus)
.then(parseJSON)
.then(function(data) {
console.log('succeeded', data)
}).catch(function(error) {
console.log('request failed', error)
})
See that catch? It will trigger when fetch fails, and you can fetch again there.
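As a minimal illustration of "fetching again in the catch" (one retry, no delay; a sketch only):
fetch('/users')
  .catch(() => fetch('/users'))        // the retry: try once more on failure
  .then(response => response.json())
  .then(data => console.log('succeeded', data))
  .catch(error => console.log('request failed twice', error));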
Have a look at the Promise API.
Implementation example:
function wait(delay){
return new Promise((resolve) => setTimeout(resolve, delay));
}
function fetchRetry(url, delay, tries, fetchOptions = {}) {
function onError(err){
const triesLeft = tries - 1;
if(!triesLeft){
throw err;
}
return wait(delay).then(() => fetchRetry(url, delay, triesLeft, fetchOptions));
}
return fetch(url,fetchOptions).catch(onError);
}
Edit 1: as suggested by golopot, p-retry is a nice option.
Edit 2: simplified example code.
I recommend using some library for promise retry, for example p-retry.
Example:
const pRetry = require('p-retry')
const fetch = require('node-fetch')
async function fetchPage () {
const response = await fetch('https://stackoverflow.com')
// Abort retrying if the resource doesn't exist
if (response.status === 404) {
throw new pRetry.AbortError(response.statusText)
}
return response.blob()
}
;(async () => {
console.log(await pRetry(fetchPage, {retries: 5}))
})()
I don't like recursion unless it's really necessary, and managing an exploding number of dependencies is also an issue. Here is another alternative in TypeScript, which is easy to translate to JavaScript.
interface retryPromiseOptions<T> {
retryCatchIf?:(response:T) => boolean,
retryIf?:(response:T) => boolean,
retries?:number
}
function retryPromise<T>(promise:() => Promise<T>, options:retryPromiseOptions<T>) {
const { retryIf = (_:T) => false, retryCatchIf= (_:T) => true, retries = 1} = options
let _promise = promise();
for (var i = 1; i < retries; i++)
_promise = _promise.catch((value) => retryCatchIf(value) ? promise() : Promise.reject(value))
.then((value) => retryIf(value) ? promise() : Promise.reject(value));
return _promise;
}
And use it this way...
retryPromise(() => fetch(url),{
retryIf: (response:Response) => true, // you could check before trying again
retries: 5
}).then( ... my favorite things ... )
I wrote this for the fetch API in the browser, which does not reject on a 500, and I did not implement a wait. But, more importantly, the code shows how to use composition with promises to avoid recursion.
Javascript version:
function retryPromise(promise, options) {
const { retryIf, retryCatchIf, retries } = { retryIf: () => false, retryCatchIf: () => true, retries: 1, ...options};
let _promise = promise();
for (var i = 1; i < retries; i++)
_promise = _promise.catch((value) => retryCatchIf(value) ? promise() : Promise.reject(value))
.then((value) => retryIf(value) ? promise() : Promise.reject(value));
return _promise;
}
Javascript usage:
retryPromise(() => fetch(url),{
retryIf: (response) => true, // you could check before trying again
retries: 5
}).then( ... my favorite things ... )
EDITS: Added js version, added retryCatchIf, fixed the loop start.
One can easily wrap fetch(...) in a loop and catch potential errors (fetch only rejects the returned promise on network errors and the like):
const RETRY_COUNT = 5;
async function fetchRetry(...args) {
let count = RETRY_COUNT;
while(count > 0) {
try {
return await fetch(...args);
} catch(error) {
// logging ?
}
// logging / waiting?
count -= 1;
}
throw new Error(`Too many retries`);
}
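Usage then mirrors a plain fetch() call; for example, with the POST request from the original question (url and formData assumed to be defined there):
fetchRetry(url, { method: 'POST', body: formData, credentials: 'include' })
  .then(response => response.json())
  .then(data => console.log(data))
  .catch(error => console.error(error)); // still failing after 5 attempts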