Resend HTTP post before error timeout - javascript

I have an Angular 4 project that uses an HTTP POST to send commands across to our backend. The issue is that sometimes a command can be sent out before the backend is fully up and running. Normally an "ERR_CONNECTION_TIME_OUT" would occur, but our embedded browser, for whatever reason, holds onto the POST for an extremely long time (5 minutes) before giving us the error. Since a 5 minute wait is unacceptable, I need to come up with a way to re-send our HTTP POST if there isn't a response within 15-30 seconds.
Here is what the current post looks like
this._http.post(this.sockclientURL, body, { headers: headers })
    .subscribe((res) => {
        let text = res.text();
        if (text.startsWith("ERROR")) {
            console.log("Sockclient Error.");
            if (this.sockclientErrorRetryCount < this.sockclientErrorRetryLimit) {
                console.log("Retrying in 3 seconds.");
                this.sockclientErrorRetryCount++;
                setTimeout(() => {
                    this.SendCommand(command, success, fail);
                }, 3000);
            }
            return;
        }
        else {
            this.sockclientErrorRetryCount = 0;
        }
        if (text == "N" || text.startsWith("N ")) {
            this._modalService.alert(this._nackLookup.convert(text));
            if (typeof fail == 'function') {
                fail(text);
            }
        }
        else {
            let deserializedCommand = command.deserialize(text);
            success(deserializedCommand);
            let repeatMillis: number = deserializedCommand.getRepeatMillis();
            if (repeatMillis && repeatMillis > 0) {
                setTimeout(() => {
                    this.SendCommand(command, success, fail);
                }, repeatMillis);
            }
        }
    },
    (err) => {
        console.log(err);
        let repeatMillis = 1000;
        setTimeout(() => {
            this.SendCommand(command, success, fail);
        }, repeatMillis);
    });
So to recap: I have some code in place to re-attempt the command if an error occurs, but our embedded browser holds onto its timeout error for several minutes. I need something that attempts to re-send after 15-30 seconds of no response.

Retries are executed immediately, without waiting for a delay. A better approach consists of waiting for a bit before retrying and aborting after a given amount of time. Observables let you mix the retryWhen, delay and timeout operators to achieve this, as described in the following snippet:
this._http.post(this.sockclientURL, body, { headers: headers })
    .retryWhen(error => error.delay(500))
    .timeout(2000, new Error('delay exceeded'))
    .map(res => res.text());

Not 100% sure it still works in Angular 4, but you should be able to do:
this._http
    .post(this.sockclientURL, body, { headers: headers })
    .timeout(15000, new Error('timeout exceeded')) // or 30000
    .subscribe((res) => { /* ... */ });
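Putting the two ideas together for the original question, here is a minimal, untested sketch combining the snippets above (assuming RxJS 5 patch-style operators, as used elsewhere in this question; the 3-second delay and 5-retry cap are illustrative, not from either answer):
import 'rxjs/add/operator/timeout';
import 'rxjs/add/operator/retryWhen';
import 'rxjs/add/operator/delay';
import 'rxjs/add/operator/take';
import 'rxjs/add/operator/map';

this._http.post(this.sockclientURL, body, { headers: headers })
    .timeout(15000)                  // turn 15s of silence into an error
    .retryWhen(errors => errors
        .delay(3000)                 // wait 3s, then re-subscribe (re-sends the POST)
        .take(5))                    // illustrative cap: after 5 retries the stream completes
    .map(res => res.text())
    .subscribe(
        text => { /* handle response as before */ },
        err => console.log('giving up:', err)
    );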

A little more information would be needed as to the entire scope of the application; however, when I get myself into situations like this, I normally look at the following avenues of approach:
- Will a try / catch solve my problem? In your catch, you could redirect elsewhere; if you don't catch anything, oftentimes you will get a bit of lag. (A quick sketch of this idea follows below.)
- Is there a way to avoid the error altogether through user constraints?
- Lastly, you may want to use setInterval() over setTimeout().
Can you provide more information as to the scope of the operation?
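For what it's worth, a minimal sketch of the try / catch idea, assuming an async context and a hypothetical sendOnce() helper that performs the POST and returns a promise (the names here are illustrative, not from the question):
// sendWithRetry and sendOnce are hypothetical names, not from the question.
async function sendWithRetry(sendOnce, retries = 5, delayMs = 3000) {
    for (let attempt = 1; attempt <= retries; attempt++) {
        try {
            // await surfaces a rejected promise as a catchable exception
            return await sendOnce();
        } catch (err) {
            console.log('Attempt ' + attempt + ' failed, retrying in ' + delayMs + ' ms');
            await new Promise(resolve => setTimeout(resolve, delayMs));
        }
    }
    throw new Error('All retry attempts failed');
}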


Javascript implicit timeout?

I need to send a lot of requests to a server whose flow limits are unclear. To combat this, I decided to send requests repeatedly until I receive a successful status, using the following code:
function sendRequest(url, body, isGood) {
    return new Promise((resolve, reject) => fetch(url, body).then(
        resp => {
            if (isGood(resp)) {
                resolve(resp);
            } else {
                reject(resp.status);
            }
        }).catch(err => reject(err)));
}

function recursionBypass(func, ...args) {
    return func(...args);
}

function requestUntilSucceed(url, body, isGood, name, attempt = 1) {
    return new Promise((resolve, reject) => {
        sendRequest(url, body, isGood)
            .catch((status) => {
                recursionBypass(requestUntilSucceed, url, body, isGood, name, attempt + 1);
            })
            .then((resp) => resolve(attempt));
    });
}
Without including details about the server in question, I wrote a quick test for this (create and wait for 300 requests):
async function runTest() {
    let promises = [];
    for (let i = 0; i < 300; i++) {
        promises.push(new Promise((resolve, reject) => {
            createRequest(requestArg, i)
                .then((attempt) => {
                    console.log("Request: ", i, "succeeded on attempt: " + attempt);
                    resolve(attempt);
                });
        }));
    }
    return Promise.all(promises);
}

await runTest();
Assume:
- requestArg: a global variable (the same arg is used for all requests in this test)
- createRequest: a function that creates the url, body and isGood callback, then calls sendRequest (which builds the url and body based on requestArg) and returns as follows:
return requestUntilSucceed(url, body, isGood, name)
I've observed that about 200 requests are accepted immediately and the others keep retrying. However, at some point the program exits without finishing all requests successfully. Since there is no mechanism in place to limit the number of tries explicitly, I'm wondering how this can happen. I suspected it had to do with the limit on recursion depth, so I started using recursionBypass as above so the interpreter can't tell the function is calling itself, but this didn't make a difference. Any ideas as to why it would exit early? I know the server accepts requests up to some count, then rejects them for a cooldown period of ~1 min before beginning to accept again. The rate of acceptance after each cooldown period is ambiguous, which is why I can't write a rule-based throttler. The issue is that my program exits (no error printed to console) before the minute is up. A lot of attempts are made in that minute, so maybe I'm still surpassing some kind of JavaScript limit I'm not familiar with?
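One detail worth noting in the snippet above: the .catch fires the retry but never chains it, so the outer promise's .then runs as soon as the catch handler returns and resolves even for failed requests. Once every first-attempt promise has settled, Promise.all can complete while retries are still in flight, which would let the program finish early with no error. A minimal sketch of a version whose returned promise only settles when a request finally succeeds (same sendRequest helper as above; recursion depth is not a concern here, since each retry is a fresh async tick):
function requestUntilSucceed(url, body, isGood, name, attempt = 1) {
    return sendRequest(url, body, isGood)
        .then(() => attempt)   // success: report which attempt worked
        .catch(() => {
            // returning the recursive call keeps the caller's promise
            // pending until some attempt succeeds
            return requestUntilSucceed(url, body, isGood, name, attempt + 1);
        });
}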

AbortController.abort(reason), but the reason gets lost before it arrives to the fetch catch clause

I am implementing abortable fetch calls.
There are basically two reasons for aborting the fetch on my page:
1. the user decides he/she does not want to wait for the AJAX data anymore and clicks a button; in this case the UI shows a message "call /whatever interrupted"
2. the user has moved to another part of the page and the data being fetched is no longer needed; in this case I don't want the UI to show anything, as it'd just confuse the user
In order to discriminate between the two cases I was planning to use the reason parameter of the AbortController.abort method, but the .catch clause in my fetch call always receives a DOMException('The user aborted a request', ABORT_ERROR).
I have tried to provide a different DOMException as the reason for the abort in case 2, but the difference is lost.
Has anyone found out how to send information to the fetch .catch clause with regards to the reason for the abort?
In the example below, I demonstrate how to determine the reason a fetch request was aborted. I provide inline comments for explanation; feel free to comment if anything is unclear.
Re-run the code snippet to see a (potentially different) random result.
'use strict';

function delay (ms, value) {
    return new Promise(res => setTimeout(() => res(value), ms));
}

function getRandomInt (min = 0, max = 1) {
    return Math.floor(Math.random() * (max - min + 1)) + min;
}

// Forward the AbortSignal to fetch:
// https://docs.github.com/en/rest/repos/repos#list-public-repositories
function fetchPublicGHRepos (signal) {
    const headers = new Headers([['accept', 'application/vnd.github+json']]);
    return fetch('https://api.github.com/repositories', {headers, signal});
}

function example () {
    const ac = new AbortController();
    const {signal} = ac;
    const abortWithReason = (reason) => delay(getRandomInt(1, 5))
        .then(() => {
            console.log(`Aborting ${signal.aborted ? 'again ' : ''}(reason: ${reason})`);
            ac.abort(reason);
        });

    // Unless GitHub invests HEAVILY into our internet infrastructure,
    // one of these promises will resolve before the fetch request
    abortWithReason('Reason A');
    abortWithReason('Reason B');

    fetchPublicGHRepos(signal)
        .then(res => console.log(`Fetch succeeded with status: ${res.status}`))
        .catch(ex => {
            // This is how you can determine if the exception was due to abortion
            if (signal.aborted) {
                // This is set by the promise which resolved first
                // and caused the fetch to abort
                const {reason} = signal;
                // Use it to guide your logic...
                console.log(`Fetch aborted with reason: ${reason}`);
            }
            else console.log(`Fetch failed with exception: ${ex}`);
        });

    delay(10).then(() => console.log(`Signal reason: ${signal.reason}`));
}

example();
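Applied to the two cases in the question, a small sketch (the reason strings and the /whatever endpoint are illustrative): pass a distinct reason to each abort() call and branch on signal.reason in the catch handler:
// Illustrative reason values for the question's two cases
const USER_CANCELLED = 'user-cancelled'; // case 1: show "call interrupted"
const PAGE_CHANGED   = 'page-changed';   // case 2: stay silent

const controller = new AbortController();

fetch('/whatever', { signal: controller.signal })
    .then(res => res.json())
    .catch(() => {
        if (controller.signal.aborted) {
            if (controller.signal.reason === USER_CANCELLED) {
                console.log('call /whatever interrupted'); // show the UI message
            }
            // PAGE_CHANGED: do nothing, the data is no longer needed
        }
    });

// elsewhere, depending on what happened:
controller.abort(USER_CANCELLED); // or controller.abort(PAGE_CHANGED);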

AngularJS $q.all - wait between http calls

So I have a situation where I need to perform a bunch of http calls, then once they are complete, continue on to the next step in the process.
Below is the code which does this and works fine.
However, I now need to wait a few seconds between each of the http calls. Is there a way to pass in a timeout with my current setup, or will it involve a good bit of refactoring?
Can post more code if need be. I have tried passing a timeout config variable into the http call; however, they still get fired at the same time.
Any advice would be great.
Code
var allThings = array.map(function(object) {
    var singleThingPromise = getFile(object.id);
    return singleThingPromise;
});

$q.all(allThings).then(function() {
    deferred.resolve('Finished');
}, function(error) {
    deferred.reject(error);
});
Instead of using $q.all, you might want to perform the calls sequentially, each one on success of the previous, probably with the use of $timeout. Maybe you could build a recursive function.
Something like this..
function performSequentialCalls (index) {
    if (angular.isUndefined(array[index])) {
        return;
    }
    getFile(array[index].id).then(function() {
        $timeout(function() {
            performSequentialCalls(index + 1);
        }, 1000); // waiting 1 sec after each call
    });
}
Inject the required services properly. This assumes array contains objects with ids, which you use to perform the API calls. It also assumes that you are using $http; if using $resource, add $promise accordingly.
Hope that helps a bit!
function getItemsWithDelay(index) {
    getFile(object[index].id).then(() => {
        setTimeout(() => {
            if (index + 1 >= object.length) { return; }
            getItemsWithDelay(index + 1);
        }, 5000);
    });
}
You can make sequential calls
This is an awesome trick question to be asked in an interview. Anyway, I had a similar requirement and did some research on the internet; thanks to the reference https://codehandbook.org/understanding-settimeout-inside-for-loop-in-javascript, I was able to delay all promise calls in AngularJS, and the same can be applied in plain JS syntax as well.
I need to send tasks to a TTP API, and they requested that we add a delay between calls:
_sendTasks: function(taskMeta) {
    var defer = $q.defer();
    var promiseArray = [];
    const delayIncrement = 1000 * 5;
    let delay = 0;
    for (let i = 0; i < taskMeta.length; i++) {
        // using the 'let' keyword is VERY IMPORTANT, else 'var' will send the same task in all http calls
        let requestTask = {
            "action": "SOME_ACTION",
            "userId": '',
            "sessionId": '',
        };
        // new Promise can be replaced with $q - you can try that, I haven't tested it though.
        // Resolving with the $http promise makes the outer promise settle with the request's result.
        promiseArray.push(new Promise((resolve) =>
            setTimeout(() => resolve($http.post(config.API_ROOT_URL + '/' + requestTask.action, requestTask)), delay)));
        delay += delayIncrement;
    }
    $q.all(promiseArray)
        .then(function(results) {
            // handle the results and resolve at the end
            defer.resolve(results);
        })
        .catch(error => {
            console.log(error);
            defer.reject("failed to execute");
        });
    return defer.promise;
}
Note: using the 'let' keyword in the FOR loop is VERY IMPORTANT, else 'var' will send the same task in all http calls, due to the closure/context getting switched. A promise-chain alternative is sketched below.
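As a design note, the same staggering can be achieved without precomputing delays by chaining the requests with reduce, so each request starts a fixed gap after the previous one settles. A minimal AngularJS-flavoured sketch (assuming the same $http, $timeout and $q services; the tasks array and URL scheme are illustrative):
// Serialize posts with a fixed 5s gap using a reduce-built promise chain.
function sendTasksSequentially(tasks) {
    return tasks.reduce(function(chain, task) {
        return chain
            .then(function() { return $timeout(angular.noop, 5000); }) // 5s gap
            .then(function() { return $http.post(config.API_ROOT_URL + '/' + task.action, task); });
    }, $q.resolve());
}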

jQuery/AJAX set timeout when rate limit of 3rd party API is reached

In my app I make several nested AJAX calls to the LiquidPlanner API that limits requests to 30 requests every 15 seconds. When I hit the limit, I want to set a timeout of sorts to stop sending requests to the API until the 15 seconds have elapsed. This (at the moment) will only be used by one person ever, so multiple clients are not a concern.
Upon hitting the rate limit the response is:
{
    "type": "Error",
    "error": "Throttled",
    "message": "34 requests, exceeds limit of 30 in 15 seconds. Try again in 7 seconds, or contact support#liquidplanner.com"
}
Here is some code, simplified for brevity:
$.getJSON('/dashboard/tasks/123', function(tasks) {
    $.each(tasks, function(t, task) {
        $.getJSON('/dashboard/project/987', function(project) {
            $.getJSON('/dashboard/checklist-items/382983', function(checklist_items) {
                // form some html here
            });
        });
    });
});
So at any point in this process I could hit the limit and need to wait until the timeout has completed.
I am also open to suggestions to better form the requests instead of nesting them.
Another solution that probably prevents hammering better is a queue; however, you need to be aware that the order of requests could be significantly different using this method, and that only one request will ever run at a time (so total response times may increase significantly, depending on the use case).
//Keep track of queue
var queue = [];
//Keep track of last failed request
var failed_request = false;

function do_request(url, callback) {
    //Just add to queue
    queue.push({
        url: url,
        callback: callback
    });
    //If the queue was empty send it off
    if (queue.length === 1) attempt_fetch();
}

function attempt_fetch() {
    //If nothing to do just return
    if (queue.length === 0 && failed_request === false) return;
    //Get the url and callback from the failed request if any,
    var parms;
    if (failed_request !== false) {
        parms = failed_request;
    } else {
        //otherwise first queue element
        parms = queue.shift();
    }
    //Do request
    $.getJSON(parms.url, function(response) {
        //Detect throttling (matching the error response shown above)
        if (response.type === 'Error' && response.error === 'Throttled') {
            //Store the request
            failed_request = parms;
            //Call self in 15 seconds
            setTimeout(function() {
                attempt_fetch();
            }, 15000);
        } else {
            //Request went fine, let the next call pick from the queue
            failed_request = false;
            //Do your stuff
            parms.callback(response);
            //And send the next request
            attempt_fetch();
        }
    });
}
...your logic still remains largely unchanged:
do_request('/dashboard/tasks/123', function(tasks) {
    $.each(tasks, function(t, task) {
        do_request('/dashboard/project/987', function(project) {
            do_request('/dashboard/checklist-items/382983', function(checklist_items) {
                // form some html here
            });
        });
    });
});
Disclaimer: Still completely untested.
As far as design patterns for chaining multiple requests go, take a look at the chaining section in the following article: http://davidwalsh.name/write-javascript-promises. Basically, you could create a service that exposes a method for each type of request, each returning a promise object, and then chain them together as needed. A sketch of that idea follows below.
As far as your question about setting a timeout goes, given the information you provided it is a bit difficult to advise you on it, but if that is absolutely all we have, I would create a request queue (a simple array that allows you to push new requests at the end and pop them from the head). I would then execute the known requests in order and inspect the response. If the response was a timeout error, I would set a timeout flag that the request executor would honor, and if successful, either queue additional requests or create the html output. This is probably a pretty bad design, but it is all I can offer given the information you provided.
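For illustration, a minimal sketch of the promise-chaining idea (relying on the promise interface of jQuery's $.getJSON; the service shape and method names are hypothetical):
// Hypothetical service wrapping each request type in a promise-returning method
var dashboardService = {
    getTasks: function(id) { return $.getJSON('/dashboard/tasks/' + id); },
    getProject: function(id) { return $.getJSON('/dashboard/project/' + id); },
    getChecklistItems: function(id) { return $.getJSON('/dashboard/checklist-items/' + id); }
};

// Chain instead of nesting; each step runs after the previous one resolves
dashboardService.getTasks(123)
    .then(function(tasks) { return dashboardService.getProject(987); })
    .then(function(project) { return dashboardService.getChecklistItems(382983); })
    .then(function(checklist_items) { /* form some html here */ });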
Write a wrapper that will detect the rate-limited response:
//Keep track of state
var is_throttled = false;

function my_wrapper(url, callback) {
    //No need to try right now if already throttled
    if (is_throttled) {
        //Just call self in 15 seconds time
        setTimeout(function() {
            return my_wrapper(url, callback);
        }, 15000);
        //Don't fall through and fire the request anyway
        return;
    }
    //Get your stuff
    $.getJSON(url, function(response) {
        //Detect throttling (matching the error response shown above)
        if (response.type === 'Error' && response.error === 'Throttled') {
            /**
             * Let "others" know that we are throttled - the each-loop
             * (probably) makes this necessary, as it may send off
             * multiple requests at once... If there's more than a couple
             * you will probably need to find a way to also delay those,
             * otherwise you'll be hammering the server before realizing
             * that you are being limited
             */
            is_throttled = true;
            //Call self in 15 seconds
            setTimeout(function() {
                //Throttling is (hopefully) over now
                is_throttled = false;
                return my_wrapper(url, callback);
            }, 15000);
        } else {
            //If not throttled, just call the callback with the data we have
            callback(response);
        }
    });
}
Then you should be able to rewrite your logic to:
my_wrapper('/dashboard/tasks/123', function(tasks) {
    $.each(tasks, function(t, task) {
        my_wrapper('/dashboard/project/987', function(project) {
            my_wrapper('/dashboard/checklist-items/382983', function(checklist_items) {
                // form some html here
            });
        });
    });
});
Disclaimer: Totally untested - my main concern is the scope of the url and callback... But it's probably easier for you to test.

Using setInterval() to do simplistic continuous polling

For a simple web app that needs to refresh parts of data presented to the user in set intervals, are there any downsides to just using setInterval() to get a JSON from an endpoint instead of using a proper polling framework?
For the sake of an example, let's say I'm refreshing the status of a processing job every 5 seconds.
From my comment:
I would use setTimeout [docs] and always call it when the previous response has been received. This way you avoid possible congestion or function stacking, or whatever you want to call it, in case a request/response takes longer than your interval.
So something like this:
function refresh() {
    // make Ajax call here; inside the callback, call:
    setTimeout(refresh, 5000);
    // ...
}

// initial call, or just call refresh directly
setTimeout(refresh, 5000);
A simple non-blocking poll function can be implemented in recent browsers using Promises:
var sleep = duration => new Promise(resolve => setTimeout(resolve, duration));

var poll = (promiseFn, duration) => promiseFn()
    .then(() => sleep(duration))
    .then(() => poll(promiseFn, duration));

// Greet the World every second
poll(() => new Promise(resolve => {
    console.log('Hello World!');
    resolve();
}), 1000);
You can do it like this:
var i = 0, loop_length = 50, loop_speed = 100;

function loop() {
    i += 1;
    /* Here is your code. Balabala... */
    if (i === loop_length) clearInterval(handler);
}

var handler = setInterval(loop, loop_speed);
Just modifying #bschlueter's answer - and yes, you can cancel this poll function by calling cancelCallback():
let cancelCallback = () => {};

var sleep = (period) => {
    return new Promise((resolve) => {
        cancelCallback = () => {
            console.log("Canceling...");
            // send cancel message...
            return resolve('Canceled');
        };
        setTimeout(() => {
            resolve("tick");
        }, period);
    });
};

var poll = (promiseFn, period, timeout) => promiseFn().then(() => {
    let asleep = async (period) => {
        let respond = await sleep(period);
        // if you need to do something as soon as sleep finishes
        console.log("sleep just finished, do something...");
        return respond;
    };

    // just check if cancelCallback is still an empty function;
    // if yes, set a timeout to run cancelCallback()
    if (cancelCallback.toString() === "() => {}") {
        console.log("set timeout to run cancelCallback()");
        setTimeout(() => {
            cancelCallback();
        }, timeout);
    }

    asleep(period).then((respond) => {
        // check if sleep was canceled; if not, continue to poll
        if (respond !== 'Canceled') {
            poll(promiseFn, period);
        } else {
            console.log(respond);
        }
    });

    // do something1...
    console.log("do something1...");
});

poll(() => new Promise((resolve) => {
    console.log('Hello World!');
    resolve(); // you need resolve to jump into .then()
}), 3000, 10000);

// do something2...
console.log("do something2....");
I know this is an old question, but I stumbled over it, and in the StackOverflow way of doing things I thought I might improve it. You might want to consider a solution similar to what's described here, known as long polling. Another solution is WebSockets; socket.io is one of the better implementations of websockets, with the primary objective of working in all browsers.
The first solution is basically summarized as: you send a single AJAX request and wait for a response before sending an additional one; once the response has been delivered, you queue up the next query.
Meanwhile, on the backend, you don't return a response until the status changes. So, in your scenario, you would utilize a while loop that continues until the status changes, then return the changed status to the page. I really like this solution; as the answer linked above indicates, this is what Facebook does (or at least has done in the past). A sketch of a long-polling client is below.
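For illustration, a minimal long-polling client sketch (the /status endpoint and response shape are assumptions, not from the answer):
// Minimal long-polling loop: the server is assumed to hold the request open
// until the job status changes, then respond with JSON like { status: '...' }.
function pollStatus() {
    fetch('/status')                       // hypothetical endpoint
        .then(resp => resp.json())
        .then(data => {
            console.log('status changed to:', data.status);
            pollStatus();                  // immediately queue the next long poll
        })
        .catch(() => setTimeout(pollStatus, 5000)); // back off briefly on error
}

pollStatus();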
socket.io is basically the jQuery of WebSockets: whichever browser your users are in, you can establish a socket connection that can push data to the page (without polling at all). This is closer to a BlackBerry's instant notifications, which - if you're going for instant - is the best solution.
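And a minimal socket.io client sketch (the server URL, event name and payload shape are illustrative):
// Assumes the socket.io client library is loaded and the server emits
// 'job-status' events; both names are illustrative.
var socket = io('https://example.com');

socket.on('job-status', function (data) {
    console.log('status pushed from server:', data.status);
});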
