I need to send a lot of requests to a server whose flow limits are unclear. To work around this, I decided to send each request repeatedly until I receive a successful status, using the following code:
function sendRequest(url, body, isGood) {
  return new Promise((resolve, reject) => fetch(url, body).then(
    resp => {
      if (isGood(resp)) {
        resolve(resp);
      } else {
        reject(resp.status);
      }
    }).catch(err => reject(err)));
}

function recursionBypass(func, ...args) {
  return func(...args);
}

function requestUntilSucceed(url, body, isGood, name, attempt = 1) {
  return new Promise((resolve, reject) => {
    sendRequest(url, body, isGood)
      .catch((status) => {
        recursionBypass(requestUntilSucceed, url, body, isGood, name, attempt + 1);
      })
      .then((resp) => resolve(attempt));
  });
}
Without including details about the server in question, I wrote a quick test for this (create and wait for 300 requests):
await (async function () {
  let promises = [];
  for (let i = 0; i < 300; i++) {
    promises.push(new Promise((resolve, reject) => {
      createRequest(requestArg, i)
        .then((attempt) => {
          console.log("Request: ", i, "succeeded on attempt: " + attempt);
          resolve(attempt);
        });
    }));
  }
  return Promise.all(promises);
})();
Assume:
requestArg: a global variable (the same arg is used for all requests in this test)
createRequest: a function that builds the url, body and isGood callback from requestArg, then returns as follows:
return requestUntilSucceed(url, body, isGood, name)
I've observed that about 200 requests are accepted immediately and the others keep retrying. However, at some point the program exits without finishing all requests successfully. Since there is no mechanism in place to explicitly limit the number of tries, I'm wondering how this can happen. I suspected it had to do with the limit on recursion depth, so I started using recursionBypass as above so the interpreter can't tell the function is calling itself, but this didn't make a difference. Any ideas as to why it would exit early?
I know the server accepts requests up to some count, then rejects them for a cooldown period of ~1 min before beginning to accept again. The rate of acceptance after each cooldown period is ambiguous, which is why I can't write a rule-based throttler. The issue is that my program exits (with no error printed to the console) before the minute is up. A lot of attempts are made in that minute, so maybe I'm still surpassing some kind of JavaScript limit I'm not familiar with?
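For reference, this is the behaviour I'm after, written as a plain loop instead of recursion (a sketch only; sendRequest is the same function as above):
async function requestUntilSucceedLoop(url, body, isGood) {
  // Keep trying until the server finally accepts the request,
  // then resolve with the number of attempts it took.
  for (let attempt = 1; ; attempt++) {
    try {
      await sendRequest(url, body, isGood);
      return attempt;
    } catch (err) {
      // rejected or rate-limited: fall through and try again
    }
  }
}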
Related
I am implementing abortable fetch calls.
There are basically two reasons for aborting the fetch on my page:
the user decides he/she does not want to wait for the AJAX data anymore and clicks a button; in this case the UI shows a message "call /whatever interrupted"
the user has moved to another part of the page and the data being fetched are no longer needed; in this case I don't want the UI to show anything, as it'd just confuse the user
In order to discriminate the two cases I was planning to use the reason parameter of the AbortController.abort method, but the .catch clause in my fetch call always receives a DOMException('The user aborted a request', ABORT_ERROR).
I have tried to provide a different DOMException as reason for the abort in case 2, but the difference is lost.
Has anyone found a way to send information to the fetch .catch clause regarding the reason for the abort?
In the example below, I demonstrate how to determine the reason for aborting a fetch request. I provide inline comments for explanation. Feel free to comment if anything is unclear.
Re-run the code snippet to see a (potentially different) random result
'use strict';
function delay (ms, value) {
return new Promise(res => setTimeout(() => res(value), ms));
}
function getRandomInt (min = 0, max = 1) {
return Math.floor(Math.random() * (max - min + 1)) + min;
}
// Forward the AbortSignal to fetch:
// https://docs.github.com/en/rest/repos/repos#list-public-repositories
function fetchPublicGHRepos (signal) {
const headers = new Headers([['accept', 'application/vnd.github+json']]);
return fetch('https://api.github.com/repositories', {headers, signal});
}
function example () {
const ac = new AbortController();
const {signal} = ac;
const abortWithReason = (reason) => delay(getRandomInt(1, 5))
.then(() => {
console.log(`Aborting ${signal.aborted ? 'again ' : ''}(reason: ${reason})`);
ac.abort(reason);
});
// Unless GitHub invests HEAVILY into our internet infrastructure,
// one of these promises will resolve before the fetch request
abortWithReason('Reason A');
abortWithReason('Reason B');
fetchPublicGHRepos(signal)
.then(res => console.log(`Fetch succeeded with status: ${res.status}`))
.catch(ex => {
// This is how you can determine if the exception was due to an abort
if (signal.aborted) {
// This is set by the promise which resolved first
// and caused the fetch to abort
const {reason} = signal;
// Use it to guide your logic...
console.log(`Fetch aborted with reason: ${reason}`);
}
else console.log(`Fetch failed with exception: ${ex}`);
});
delay(10).then(() => console.log(`Signal reason: ${signal.reason}`));
}
example();
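Applied to the two cases in the question, you could abort with a distinct reason per case and branch on signal.reason in the catch handler. In the sketch below, the reason strings, renderData and showInterruptedMessage are placeholders, not part of the original code:
const USER_CANCELLED = 'user-cancelled';   // case 1: show "call /whatever interrupted"
const DATA_NOT_NEEDED = 'data-not-needed'; // case 2: stay silent

const ac = new AbortController();

fetch('/whatever', {signal: ac.signal})
  .then(res => res.json())
  .then(renderData)                // assumed rendering helper
  .catch(err => {
    if (ac.signal.aborted) {
      if (ac.signal.reason === USER_CANCELLED) {
        showInterruptedMessage();  // assumed UI helper
      }
      // DATA_NOT_NEEDED: swallow the abort silently
    } else {
      console.log(`Fetch failed with exception: ${err}`);
    }
  });

// Case 1: the user clicks the "stop waiting" button
// ac.abort(USER_CANCELLED);
// Case 2: the user navigates to another part of the page
// ac.abort(DATA_NOT_NEEDED);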
I'm creating a website using Johnny-five, React and node.js to control my Arduino board, but I got stuck handling an async/await function. The user sends a chosen port (COM1, for example) to the server, and the server then creates a new instance of the board:
async function checkPortConnection(port) {
let board = new five.Board({port: port});
let success;
await board.on('error', () => {
success = false;
});
await board.on('ready', () => {
success = true;
});
return success;
}
I thought that the await keyword would stop function execution and wait for the board's response, which takes about 7 seconds, but when I do this:
checkPortConnection(port).then((data)=>{
console.log(data);
});
I'm getting 'undefined' (because I'm getting success, which is undefined?).
After that, the server will send a response saying whether the chosen port is correct or not.
But my question is: how do I get a proper response from the checkPortConnection() function?
I think the issue is that you are listening for events, and that in and of itself isn't a Promise. Also, even if it were and you used await, you would never reach the code that registers the ready event. The following should fix that issue:
async function checkPortConnection(port) {
return new Promise((resolve, reject) => {
let board = new five.Board({port: port});
board.on('error', error => resolve( false ));
board.on('ready', event => resolve( true ));
});
}
Personally I would also do the following, as the Promise will be consumed with then or catch later anyway, so you might want to drop the boolean bit altogether:
board.on('error', reject);
board.on('ready', resolve);
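For completeness, a rough sketch of how a caller might consume the rejecting variant; the Express-style route handler and its names are assumptions, not part of the question:
// Assumed Express-style endpoint; app, req and res are not from the question.
app.post('/check-port', async (req, res) => {
  try {
    await checkPortConnection(req.body.port); // resolves once the board is ready
    res.json({ connected: true });
  } catch (err) {
    // reached when the board emits 'error' (with the rejecting variant above)
    res.status(500).json({ connected: false });
  }
});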
I have an Angular 4 project that uses an HTTP POST to send commands across to our backend. The issue is that sometimes a command can be sent out before the backend is fully up and running. Normally an "ERR_CONNECTION_TIME_OUT" would occur, but our embedded browser for whatever reason holds onto the post for an extremely long time (5 minutes) before giving us the error. Since a 5 minute wait is unacceptable, I need to come up with a way to re-send our HTTP POST if there isn't a response within ~15-30 seconds.
Here is what the current post looks like:
this._http.post(this.sockclientURL, body, { headers: headers })
.subscribe((res) => {
let text = res.text();
if (text.startsWith("ERROR")) {
console.log("Sockclient Error.");
if (this.sockclientErrorRetryCount < this.sockclientErrorRetryLimit) {
console.log("Retrying in 3 seconds.");
this.sockclientErrorRetryCount++;
setTimeout(() => {
this.SendCommand(command, success, fail);
}, 3000);
}
return;
}
else {
this.sockclientErrorRetryCount = 0;
}
if (text == "N" || text.startsWith("N ")) {
this._modalService.alert(this._nackLookup.convert(text));
if (typeof fail == 'function') {
fail(text);
}
}
else {
let deserializedCommand = command.deserialize(text);
success(deserializedCommand);
let repeatMillis: number = deserializedCommand.getRepeatMillis();
if (repeatMillis && repeatMillis > 0) {
setTimeout(() => {
this.SendCommand(command, success, fail);
}, repeatMillis);
}
}
},
(err) => {
console.log(err);
let repeatMillis = 1000;
setTimeout(() => {
this.SendCommand(command, success, fail);
}, repeatMillis);
});
So to recap, I have some code in place to re-attempt the command if an error occurs, but our embedded browser holds onto its timeout error for several minutes. I need something that attempts to re-send after 15-30 seconds of no response.
Your retries are executed immediately, without any delay. A better approach consists of waiting a bit before retrying and aborting after a given amount of time. Observables allow you to mix the retryWhen, delay and timeout operators to achieve this, as described in the following snippet:
this._http.post(this.sockclientURL, body, { headers: headers })
.retryWhen(error => error.delay(500))
.timeout(2000, new Error('delay exceeded'))
.map(res => res.json());
Not 100% sure it still works in Angular4, but you should be able to do:
this
._http
.post(this.sockclientURL, body, { headers: headers })
.timeout(15000, new Error('timeout exceeded')) // or 30000
.subscribe((res) => { /* ... */ })
A little more information would be needed as to the entire scope of the application; however, when I get myself into situations like this I normally look at the following avenues of approach:
Will a try / catch solve my problem? In your catch, you could redirect elsewhere. If you don't catch anything, oftentimes you will get a bit of lag.
Is there a way to avoid the error altogether through user constraints?
Lastly, you may want to use setInterval() over setTimeout() (a rough sketch follows below).
Can you provide more information as to the scope of the operation?
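To illustrate the setInterval idea (not Angular-specific; sendOnce and the 15-second window are assumptions): keep re-issuing the request on a fixed interval and stop as soon as any attempt gets a response.
// Sketch: retry every 15 s until one attempt responds, then stop.
// sendOnce is a hypothetical function that performs a single HTTP POST
// and returns a Promise.
function postWithRetry(sendOnce, intervalMs = 15000) {
  return new Promise((resolve) => {
    let settled = false;
    let timer;
    const attempt = () => {
      sendOnce()
        .then(res => {
          if (!settled) {
            settled = true;
            clearInterval(timer);
            resolve(res);
          }
        })
        .catch(() => { /* ignore; the next interval tick retries */ });
    };
    timer = setInterval(attempt, intervalMs);
    attempt(); // fire the first attempt immediately
  });
}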
I have two functions:
function manipulateData(data,callback)
function isDataValid(data)
The first one is supposed to manipulate the data and return nothing, and the second one is supposed to return true or false depending on the data's structure, without manipulating it at all.
Assuming three things:
- The manipulation doesn't require valid data.
- The data's validation process is quite demanding IO-wise.
- The data's validation process should receive the original data.
How can I use the verification function as the callback argument, so that I can verify the data in the background and only throw an error from the first function if the data is found invalid?
I want it to behave like this, where the first statement runs in the background and doesn't block the data manipulation.
function manipulateData(data, callback){
if(callback(data)==false){ return some error }
...manipulate data without waiting for the verification...
}
I hope it is indeed doable and that I'm not missing a crucial logical part in the callback mechanism.
You can do this with callbacks or Promises. With callbacks you would want to use something like the async library, but I'd definitely go with Promises (or async/await).
You said that validation is I/O heavy, so we'll assume that it is an async function. I've assumed that the manipulation is not, but it doesn't really matter once you use Promise.all to wrap them up.
The following was tested in Node 6.10. It tests both validation failure and success, and I'm pretty sure it also covers each function finishing before the other, but again, Promise.all takes care of that for you anyway.
function sleep(milliseconds) {
var start = new Date().getTime();
for (var i = 0; i < 1e7; i++) {
if ((new Date().getTime() - start) > milliseconds){
break;
}
}
}
function manipulateData (data, zzz) {
if (zzz) {
sleep(zzz);
}
return 'manipulated data';
}
function isDataValid (data) {
return new Promise((resolve, reject) => {
if (data) {
resolve('ok');
} else {
reject(new Error('nok'));
}
});
}
function updateData (data, zzz) {
return Promise.all([
isDataValid(data),
manipulateData(data, zzz)
]);
}
function testIt (data, zzz) {
return updateData(data, zzz)
.then(([result, data]) => console.log(`Success! ${result} - ${data}`))
.catch(err => console.log(`Fail! ${err}`));
}
testIt(true, 0)
.then(() => testIt(false, 0))
.then(() => testIt(true, 1000))
.then(() => testIt(false, 1000));
Success! ok - manipulated data
Fail! Error: nok
Success! ok - manipulated data
Fail! Error: nok
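Since Promises (or async/await) were mentioned above, here is a minimal async/await equivalent of updateData using the same functions; validation still runs concurrently with the manipulation because it is only awaited at the end:
// Equivalent of updateData in async/await style.
async function updateDataAsync (data, zzz) {
  const validation = isDataValid(data);          // start validation immediately (a Promise)
  const manipulated = manipulateData(data, zzz); // runs without waiting for validation
  const result = await validation;               // throws here if validation rejected
  return [result, manipulated];
}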
Can anyone recommend a pattern for instantly retrieving data from a function that returns a Promise?
My (simplified) example is an AJAX preloader:
loadPage("index.html").then(displayPage);
If this is downloading a large page, I want to be able to check what's happening and perhaps cancel the process with an XHR abort() at a later stage.
My loadPage function used to (before Promises) return an id that let me do this later:
var loadPageId = loadPage("index.html",displayPage);
...
doSomething(loadPageId);
cancelLoadPage(loadPageId);
In my new Promise based version, I'd imagine that cancelLoadPage() would reject() the original loadPage() Promise.
I've considered a few options all of which I don't like. Is there a generally accepted method to achieve this?
Okay, let's address your bounty note first.
[Hopefully I'll be able to grant the points to someone who says more than "Don't use promises"... ]
Sorry, but the answer here is: "Don't use promises". ES6 Promises have three possible states (to you as a user): pending, fulfilled and rejected.
There is no way for you to see "inside" of a promise to see what has been done and what hasn't - at least not with native ES6 promises. There was some limited work (in other frameworks) done on promise notifications, but those did not make it into the ES6 specification, so it would be unwise of you to use this even if you found an implementation for it.
A promise is meant to represent an asynchronous operation at some point in the future; standalone, it isn't fit for this purpose. What you want is probably more akin to an event publisher - and even that is asynchronous, not synchronous.
There is no safe way for you to synchronously get some value out of an asynchronous call, especially not in JavaScript. One of the main reasons for this is that a good API, if it can be asynchronous, will always be asynchronous.
Consider the following example:
const promiseValue = Promise.resolve(5)
promiseValue.then((value) => console.log(value))
console.log('test')
Now, let's assume that this promise (because we know the value ahead of time) is resolved synchronously. What do you expect to see? You'd expect to see:
> 5
> test
However, what actually happens is this:
> test
> 5
This is because even though Promise.resolve() is a synchronous call that resolves an already-resolved Promise, then() will always be asynchronous; this is one of the guarantees of the specification and it is a very good guarantee because it makes code a lot easier to reason about - just imagine what would happen if you tried to mix synchronous and asynchronous promises.
This applies to all asynchronous calls, by the way: any action in JavaScript that could potentially be asynchronous will be asynchronous. As a result, there is no way for you do any kind of synchronous introspection in any API that JavaScript provides.
That's not to say you couldn't make some kind of wrapper around a request object, like this:
function makeRequest(url) {
  const requestObject = new XMLHttpRequest()
  const result = {
    request: requestObject
  }
  result.done = new Promise((resolve, reject) => {
    requestObject.onreadystatechange = function () {
      // settle the promise once the request has completed
      if (requestObject.readyState === XMLHttpRequest.DONE) {
        if (requestObject.status >= 200 && requestObject.status < 400) {
          resolve(requestObject.responseText)
        } else {
          reject(new Error(`Request failed with status ${requestObject.status}`))
        }
      }
    }
  })
  requestObject.open('GET', url)
  requestObject.send()
  return result
}
But this gets very messy, very quickly, and you still need to use some kind of asynchronous callback for this to work. This all falls down when you try and use Fetch. Also note that Promise cancellation is not currently a part of the spec. See here for more info on that particular bit.
TL;DR: synchronous introspection is not possible on any asynchronous operation in JavaScript, and a Promise is not the way to go if you were to even attempt it. There is no way for you to synchronously display information about a request that is ongoing, for example. In other languages, attempting to do this would require either blocking or a race condition.
Well, if you are using Angular you can make use of the timeout parameter of the $http service if you need to cancel an ongoing HTTP request.
Example in TypeScript:
interface ReturnObject {
  cancelPromise: ng.IDeferred<any>;
  httpPromise: ng.IHttpPromise<any>;
}

@Service("moduleName", "aService")
class AService {
  constructor(private $http: ng.IHttpService,
              private $q: ng.IQService) { }

  doSomethingAsynch(): ReturnObject {
    var cancelPromise = this.$q.defer();
    var httpPromise = this.$http.get("/blah", { timeout: cancelPromise.promise });
    return { cancelPromise: cancelPromise, httpPromise: httpPromise };
  }
}

@Controller("moduleName", "aController")
class AController {
  constructor(aService: AService) {
    var o = aService.doSomethingAsynch();
    var timeout = setTimeout(() => {
      o.cancelPromise.resolve();
    }, 30 * 1000);
    o.httpPromise.then((response) => {
      clearTimeout(timeout);
      // do code
    }, (errorResponse) => {
      // do code
    });
  }
}
Since this approach already returns an object with two promises, it is not much of a stretch to include any synchronous return data in that object as well.
If you can describe what type of data you would want to return synchronously from such a method, it would help to identify a pattern. Why can it not be another method that is called prior to or during your asynchronous operation?
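For example, if the synchronous data were simply a timestamp or id known at call time, the service method could return it alongside the two promises. A plain sketch, where startedAt is an invented field and the rest mirrors the code above:
// Same idea as doSomethingAsynch above, with one synchronous field added.
// $http and $q are the usual AngularJS services; startedAt is only an example
// of data that is available immediately.
function doSomethingAsynchWithInfo($http, $q) {
  var cancelPromise = $q.defer();
  var startedAt = Date.now(); // synchronously available return data
  var httpPromise = $http.get("/blah", { timeout: cancelPromise.promise });
  return { cancelPromise: cancelPromise, httpPromise: httpPromise, startedAt: startedAt };
}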
You can kinda do this, but AFAIK it will require hacky workarounds. Note that exporting the resolve and reject methods is generally considered a promise anti-pattern (i.e. a sign you shouldn't be using promises). See the bottom for something using setTimeout that may give you what you want without workarounds.
let xhrRequest = (path, data, method, success, fail) => {
  const xhr = new XMLHttpRequest();
  // could alternately be structured as polymorphic fns, YMMV
  switch (method) {
    case 'GET':
      xhr.open('GET', path);
      xhr.onload = () => {
        if (xhr.status < 400 && xhr.status >= 200) {
          success(xhr.responseText);
          return null;
        } else {
          fail(new Error(`Server responded with a status of ${xhr.status}`));
          return null;
        }
      };
      xhr.onerror = () => {
        fail(new Error('Network error'));
        return null;
      };
      xhr.send();
      return xhr;
    case 'POST':
      // etc.
      return xhr;
    // and so on...
  }
};
// can work with any function that can take success and fail callbacks
class CancellablePromise {
constructor (fn, ...params) {
this.promise = new Promise((res, rej) => {
this.resolve = res;
this.reject = rej;
fn(...params, this.resolve, this.reject);
return null;
});
}
};
let p = new CancellablePromise(xhrRequest, 'index.html', null, 'GET');
p.promise.then(loadPage).catch(handleError);
// times out after 2 seconds
setTimeout(() => { p.reject(new Error('timeout')) }, 2000);
// for an alternative version that simply tells the user when things
// are taking longer than expected, NOTE this can be done with vanilla
// promises:
let timeoutHandle = setTimeout(() => {
// don't use alert for real, but you get the idea
alert('Sorry its taking so long to load the page.');
}, 2000);
p.promise.then(() => clearTimeout(timeoutHandle));
Promises are beautiful. I don't think there is any reason that you cannot handle this with promises. There are three ways that I can think of.
The simplest way to handle this is within the executor. If you would like to cancel the promise (for instance because of a timeout), you just define a timeout flag in the executor and turn it on with a setTimeout(_ => timeout = true, 5000) instruction, then resolve or reject only if timeout is false, i.e. !timeout && resolve(res) or !timeout && reject(err). This way your promise remains unresolved indefinitely in the case of a timeout, and your onfulfillment and onreject functions at the then stage never get called.
The second is very similar to the first, but instead of keeping a flag you just invoke reject at the timeout with a proper error description, and handle the rest at the then or catch stage.
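A bare-bones sketch of this second approach; startAsyncWork is a hypothetical function taking a Node-style callback, and the 5-second limit is arbitrary:
// Reject the promise if the underlying work does not finish within 5 s.
function withTimeout(data) {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => reject(new Error('timed out')), 5000);
    startAsyncWork(data, (err, res) => {
      clearTimeout(timer);
      if (err) reject(err);
      else resolve(res);
    });
  });
}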
However, if you would like to carry the id of your async operation over to the sync world, you can also do it as follows.
In this case you have to promisify the async function yourself. Let's take an example. We have an async function that returns the double of a number. This is the function:
function doubleAsync(data,cb){
setTimeout(_ => cb(false, data*2),1000);
}
We would like to use promises, so we need a promisifier function which takes our async function and returns another function which, when run, takes our data and returns a promise. Right? So here is the promisifier function:
function promisify(fun){
return (data) => new Promise((resolve,reject) => fun(data, (err,res) => err ? reject(err) : resolve(res)));
}
Let's see how they work together:
function promisify(fun){
return (data) => new Promise((resolve,reject) => fun(data, (err,res) => err ? reject(err) : resolve(res)));
}
function doubleAsync(data,cb){
setTimeout(_ => cb(false, data*2),1000);
}
var doubleWithPromise = promisify(doubleAsync);
doubleWithPromise(100).then(v => console.log("The asynchronously obtained result is: " + v));
So now you see our doubleWithPromise(data) function returns a promise and we chain a then stage to it and access the returned value.
But what you need is not only a promise but also the id of your async function. This is very simple: your promisified function should return an object with two properties, a promise and an id. Let's see...
This time our async function will return a result randomly within 0-5 seconds. We will obtain its result.id synchronously along with the result.promise, and use this id to cancel the promise if it fails to resolve within 2.5 seconds. Any console log of "Resolve in 2501 msecs" or above will result in nothing happening, and the promise is effectively cancelled.
function promisify(fun){
return function(data){
var result = {id:null, promise:null}; // template return object
result.promise = new Promise((resolve,reject) => result.id = fun(data, (err,res) => err ? reject(err) : resolve(res)));
return result;
};
}
function doubleAsync(data,cb){
var dur = ~~(Math.random()*5000); // return the double of the data within 0-5 seconds.
console.log("Resolve in " + dur + " msecs");
return setTimeout(_ => cb(false, data*2),dur);
}
var doubleWithPromise = promisify(doubleAsync),
promiseDataSet = doubleWithPromise(100);
setTimeout(_ => clearTimeout(promiseDataSet.id),2500); // give 2.5 seconds to the promise to resolve or cancel it.
promiseDataSet.promise
.then(v => console.log("The asynchronously obtained result is: " + v));
You can use fetch() with Response.body.getReader(), which returns a reader for the response's ReadableStream; the reader's cancel method returns a Promise that resolves once reading of the stream has been cancelled.
// 58977 bytes of text, 59175 total bytes
var url = "https://gist.githubusercontent.com/anonymous/"
+ "2250b78a2ddc80a4de817bbf414b1704/raw/"
+ "4dc10dacc26045f5c48f6d74440213584202f2d2/lorem.txt";
var n = 10000;
var clicked = false;
var button = document.querySelector("button");
button.addEventListener("click", () => {clicked = true});
fetch(url)
.then(response => response.body.getReader())
.then(reader => {
var len = 0;
reader.read().then(function processData(result) {
if (result.done) {
// do stuff when `reader` is `closed`
return reader.closed.then(function() {
return "stream complete"
});
};
if (!clicked) {
len += result.value.byteLength;
}
// cancel the stream if `button` is clicked or
// total bytes processed is greater than 10000
if (clicked || len > n) {
return reader.cancel().then(function() {
return "read aborted at " + len + " bytes"
})
}
console.log("len:", len, "result value:", result.value);
return reader.read().then(processData)
})
.then(function(msg) {
alert(msg)
})
.catch(function(err) {
console.log("err", err)
})
});
<button>click to abort stream</button>
The method I am currently using is as follows:
var optionalReturnsObject = {};
functionThatReturnsPromise(dataToSend, optionalReturnsObject ).then(doStuffOnAsyncComplete);
console.log("Some instant data has been returned here:", optionalReturnsObject );
For me, the advantage of this is that another member of my team can use this in a simple way:
functionThatReturnsPromise(data).then(...);
And not need to worry about the returns object. An advanced user can see from the definitions what is going on.
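To make the pattern concrete, here is one way such a function might fill in the optional object; the XHR-based internals and '/endpoint' URL are assumptions, only the calling convention comes from above:
// Sketch: the optional second argument is mutated synchronously so the
// caller can inspect or abort the request; callers that omit it are unaffected.
function functionThatReturnsPromise(dataToSend, returns = {}) {
  const xhr = new XMLHttpRequest();
  returns.xhr = xhr;                 // instantly available to the caller
  returns.abort = () => xhr.abort(); // e.g. optionalReturnsObject.abort()
  return new Promise((resolve, reject) => {
    xhr.open('POST', '/endpoint');   // placeholder URL
    xhr.onload = () => resolve(xhr.responseText);
    xhr.onerror = () => reject(new Error('network error'));
    xhr.onabort = () => reject(new Error('aborted'));
    xhr.send(dataToSend);
  });
}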