I have an application that sends an HTTP request (which returns a promise) every time the user types. I have it debounced at 500ms. Sometimes the API I am requesting takes a long time to respond. For example, I make a search request for "a" that takes a long time to respond, but the user keeps typing to complete the query "a+x", which resolves almost immediately; the results of "a+x" then get overridden by the earlier request for just "a".
TL;DR: if a new promise is created before the current one resolves, how do I cancel the current one?
Create a variable that counts your requests:
var effectiveRequestNumber = 0;

function asyncRequest() {
    var requestNumber = ++effectiveRequestNumber; // storing our request number
    doSomething().then(function (response) {
        // if the function was invoked after this request, then these two won't match
        if (effectiveRequestNumber !== requestNumber) {
            return;
        }
        applyResponse(response); // we are fine - applying the response
    });
}
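For completeness, here is a sketch of how this counter might sit behind the 500 ms debounce from the question; searchInput and search() are hypothetical names for your input element and your request function, and applyResponse is as above:

var effectiveRequestNumber = 0;
var debounceTimer = null;

searchInput.addEventListener("input", function (e) {
    clearTimeout(debounceTimer);
    debounceTimer = setTimeout(function () {
        var requestNumber = ++effectiveRequestNumber;
        search(e.target.value).then(function (response) {
            if (effectiveRequestNumber !== requestNumber) return; // a newer request exists - drop this response
            applyResponse(response);
        });
    }, 500);
});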
The way I usually handle overlapping queries where I only want the results of the last one is to remember something that I can check in the callback.
You haven't quoted any code, which makes it tricky to help, but here's an example:
"use strict";
// NOTE: Will only run if your browser supports promises.
// Scoping function to avoid globals
(function() {
// We keep track of the most recent promise
var lastSomethingRequest = null;
// Our function that does something async
function doSomething(value) {
console.log("doing request for " + value);
// Start the async, remember the promise
var p = new Promise(function(resolve) {
setTimeout(function() {
resolve("resolve for " + value);
}, Math.floor(Math.random() * 500));
});
// Remember that as the most recent one
lastSomethingRequest = p;
p.then(function(result) {
// Use the result only if it's the most recent
if (lastSomethingRequest === p) {
console.log("Use this result: " + result);
lastSomethingRequest = null; // Release the promise object
} else {
console.log("Disregard outdated result: " + result);
}
});
}
// Generate 5 requests in a row that will complete in varying
// amounts of time, where we only want the result of the last one
for (var n = 0; n < 5; ++n) {
doSomething(n);
}
})();
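If the requests are made with fetch, you can go one step further and actually cancel the stale request rather than just ignoring its result, using AbortController. A minimal sketch, assuming a hypothetical search endpoint:

let currentController = null;

function doSearch(query) {
    if (currentController) currentController.abort(); // cancel the in-flight request
    currentController = new AbortController();
    return fetch("https://example.com/search?q=" + encodeURIComponent(query), {
        signal: currentController.signal
    })
        .then(res => res.json())
        .catch(err => {
            if (err.name === "AbortError") return; // stale request - ignore
            throw err;
        });
}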
TLDR: How to use ES6 fetch to download synchronously?
I'm trying to write a script in Node to download data from an API until there is no more to download, e.g. the endpoint has a dataset of roughly 12,000 records but only provides 100 per call, and I need all the data. So I've decided to download it synchronously, and only stop when the JSON returned is finally empty.
// function to make one api GET call
const getData = (offset) => {
    return fetch('http://...' + '?start=' + offset)
        .then(...)
}

// make api calls until json is finally empty,
// indicating that I've downloaded all the data
let offset = 0
let results = getData(offset)
while (results.length != 0) {
    // combine results...
    offset += 100 // move offset
    results = getData(offset)
}
Because I don't know precisely how large the data is and at which offset it ends, whether or not to make another call depends on the last one.
The code above fails because the promise from getData() does not resolve in time for the while loop; results is a pending promise, not an array. In another language this would be okay, as getData would block until it completed. I've tried to await getData, but await needs to be inside an async function (and I don't know where to place one; I already have promises). Is there any way to force getData() to block until it is resolved?
You can mark your getData() function as async:

async function getData(offset) {
    return fetch('http://...' + '?start=' + offset)
        .then(...)
}

then await the returned promise's completion (note that the await itself must also sit inside an async function):

const results = await getData(offset);
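Putting it together, your original loop becomes valid once it lives inside an async function. A minimal sketch, assuming getData resolves with an array of records:

async function downloadAll() {
    let offset = 0;
    let all = [];
    let results = await getData(offset);
    while (results.length !== 0) {
        all = all.concat(results); // combine results
        offset += 100;             // move offset
        results = await getData(offset);
    }
    return all;
}

downloadAll().then(all => console.log('downloaded ' + all.length + ' records'));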
Or you can modify your code to handle the logic of whether to make another call in the promise callbacks. Something like
function fetchData(offset) {
    if (offset < 1000) {
        return Promise.resolve([1, 2, 3].map(i => offset + i));
    } else {
        return Promise.resolve([]);
    }
}

function needsMoreData(results) {
    return results.length > 0;
}

function fetchNextDataIfNeeded(results, offset) {
    if (needsMoreData(results)) {
        return fetchRemainingData(offset + 100)
            .then(nextResults => [...results, ...nextResults]);
    }
    return Promise.resolve(results);
}

function fetchRemainingData(offset) {
    return fetchData(offset)
        .then(results => fetchNextDataIfNeeded(results, offset));
}

function fetchAllData() {
    return fetchRemainingData(0)
        .then(results => console.log(results));
}

fetchAllData();
See https://jsfiddle.net/rwycbn5q/1/
I recently found myself in a similar situation, but some functions cannot be made synchronous. You could, however, do something like this:
let results = [];

const getData = (idx) => {
    fetch("url").then(data => {   // build the real URL from idx here
        if (data.length !== 0) {  // or some other continuation check
            results.push(data);   // or whatever you want to do with the data
            getData(idx + 100);
        }
    });
};

getData(0);
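A variation on the same idea that resolves with the accumulated results instead of pushing into a shared array, so the caller can simply await completion. A sketch, with "url" still a placeholder and assuming the response is a JSON array per page:

const getAll = (offset) =>
    fetch("url")                  // build the real URL from offset here
        .then(res => res.json())
        .then(data =>
            data.length === 0
                ? []              // no more pages: stop
                : getAll(offset + 100).then(rest => data.concat(rest))
        );

getAll(0).then(results => console.log(results.length + " records"));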
I have been attempting to parse a site's table data into a JSON file, which I can do if I do each page one by one, but seeing as there are 415 pages, that would take a while.
I have seen and read a lot of StackOverflow questions on this subject, but I don't seem able to modify my script so that it:
scrapes each page and extracts the 50 items with item IDs per page;
does so in a rate-limited way so I don't negatively affect the server;
waits until all requests are done, so I can write each item + item ID to a JSON file.
I believe you should be able to do this using request-promise and Promise.all, but I cannot figure it out.
The actual scraping of the data is fine; I just cannot make the code scrape a page, then go to the next URL with a delay or pause between requests.
The code below is the closest I have got, but I get the same results multiple times and I cannot slow the request rate down.
Example of the page URLs:
http://test.com/itemlist/1
http://test.com/itemlist/2
http://test.com/itemlist/3 etc. (up to 415)
var rp = require('request-promise');
var cheerio = require('cheerio');
var Promise = require('bluebird'); // Promise.map with a concurrency option is Bluebird's

var itemURL = 'http://test.com/itemlist/';
var noPages = 415;
var urls = [];
var itemArray = [];

for (var i = 1; i <= noPages; i++) {
    urls.push({ url: itemURL + i });
    console.log(itemURL + i);
}

Promise.map(urls, function (obj) {
    return rp(obj).then(function (body) {
        var $ = cheerio.load(body);
        // Some calculations again...
        var rows = $('table tbody tr');
        $(rows).each(function (index, row) {
            var children = $(row).children();
            var itemName = children.eq(1).text().trim();
            var itemID = children.eq(2).text().trim();
            var itemObj = {
                "id": itemID,
                "name": itemName
            };
            itemArray.push(itemObj);
        });
        return itemArray;
    });
}, { concurrency: 1 }).then(function (results) {
    console.log(results);
    for (var i = 0; i < results.length; i++) {
        // access the result's body via results[i]
        //console.log(results[i]);
    }
}, function (err) {
    // handle all your errors here
    console.log(err);
});
Apologies if I've misunderstood Node.js and its modules; I don't really use the language, but I needed to scrape some data and I really don't like Python.
Since you need the requests to run only one at a time, Promise.all() would not help.
A recursive promise (I'm not sure if that's the correct name) would:
function fetchAllPages(list) {
    if (!list || !list.length) return Promise.resolve(); // trivial exit
    var urlToFetch = list.pop();
    return fetchPage(urlToFetch)
        // wrapper that returns a promise resolved after a delay
        .then(function () {
            return new Promise(function (resolve) { setTimeout(resolve, 1000); });
        })
        .then(function () {
            return fetchAllPages(list); // recursion!
        });
}
This code still lacks error handling.
Also, I believe it can become much clearer with async/await:
for (let url of urls) {
    await fetchAndProcess(url);
    await sleep(1000); // promise wrapper around setTimeout
}

but you need to find or write your own promise-based fetchAndProcess() and a sleep() wrapper around setTimeout, for example:
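A minimal version of that sleep() helper, plus the loop placed inside an async function (fetchAndProcess here stands in for whatever does the per-page scrape and returns a promise):

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function fetchAllPages(urls) {
    for (const url of urls) {
        await fetchAndProcess(url); // your per-page scrape, returning a promise
        await sleep(1000);          // rate limit: pause between requests
    }
}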
After input from @skyboyer suggesting recursive promises, I was led to a GitHub Gist called "Sequential execution of Promises using reduce()".
Firstly I created my array of URLs:
var urls = [];
for (var i = 1; i <= noPages; i++) {
    // example: urls[0] = "http://test.com/1"
    // example: urls[1] = "http://test.com/2"
    urls.push(itemURL + i);
    console.log(itemURL + i);
}
Then
var sequencePromise = urls.reduce(function (promise, url) {
    return promise.then(function (results) {
        // fetchIDsFromURL is async (it returns a promise);
        // when the promise resolves I have my page data
        return fetchIDsFromURL(url)
            // wait out the delay, then pass the page data along;
            // note: .then(promiseWithDelay(9000)) would not actually wait,
            // because .then ignores non-function arguments
            .then(itemArr => promiseWithDelay(9000).then(() => itemArr))
            .then(itemArr => {
                results.push(itemArr);
                // returning inside .then passes the data on to the next iteration
                return results;
            });
    });
}, Promise.resolve([]));
var request = require('request');
var cheerio = require('cheerio');

// itemArray is shared across calls, so every page resolves with the same
// ever-growing array (which is why results ends up with duplicates below)
var itemArray = [];

// async
function fetchIDsFromURL(url) {
    return new Promise(function (resolve, reject) {
        request(url, function (err, res, body) {
            if (err) return reject(err);
            //console.log(body);
            var $ = cheerio.load(body);
            var rows = $('table tbody tr');
            $(rows).each(function (index, row) {
                var children = $(row).children();
                var itemName = children.eq(1).text().trim();
                var itemID = children.eq(2).text().trim();
                var itemObj = {
                    "id": itemID,
                    "name": itemName
                };
                // push the 50 items scraped per page into the array and resolve
                // with the array to send the data back from the promise
                itemArray.push(itemObj);
            });
            resolve(itemArray);
        });
    });
}
// returns a promise that resolves after the timeout
function promiseWithDelay(ms) {
    return new Promise(function (resolve) {
        setTimeout(resolve, ms);
    });
}
Then finally I call .then on the sequence of promises. The only issue I had was that results contained multiple arrays with the same data in each; since all the data is the same in each array, I just take the first one, which has all my parsed items with IDs in it, and write it to a JSON file:
sequencePromise.then(function (results) {
    console.log(results[0]);
    writeToFile(results[0]);
});
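For what it's worth, the duplicated arrays come from itemArray living outside fetchIDsFromURL: every page pushes into, and resolves with, the same array. If you would rather have each page resolve with only its own items, a possible cleanup (a sketch) is to make the array local:

function fetchIDsFromURL(url) {
    return new Promise(function (resolve, reject) {
        request(url, function (err, res, body) {
            if (err) return reject(err);
            var $ = cheerio.load(body);
            var pageItems = []; // local: one fresh array per page
            $('table tbody tr').each(function (index, row) {
                var children = $(row).children();
                pageItems.push({
                    "id": children.eq(2).text().trim(),
                    "name": children.eq(1).text().trim()
                });
            });
            resolve(pageItems);
        });
    });
}

With that change, results would hold one array per page, and you would flatten them instead of taking results[0].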
I am trying to achieve the following functionality:
execute a callback
resolve a promise
check the output
if not correct, execute again
I have 'mimicked' the scenario with a timer; this reruns a script that makes a call to a backend database for some information:
_runCheckScript: function (bStart, bPreScript) {
    var oController = this;
    var scriptTimerdeferred = $.Deferred();
    var promise = scriptTimerdeferred.promise();

    if (typeof bStart === "undefined") {
        bStart = true;
    }
    if (typeof bPreScript === "undefined") {
        bPreScript = true;
    }

    // if the HANA DB is not stopped or started, i.e. it is still starting up
    // or shutting down, check the status again every x seconds
    var msTime = 10000;

    if (!bPreScript) {
        this._pushTextIntoConsoleModel("output", { "text": "The instance will be 'pinged' every " + msTime / 1000 + " seconds for 2 minutes to monitor for status changes. After this, the script will be terminated." });
    }

    if (bPreScript) {
        var timesRun = 0;
        var commandTimer = setInterval(function () {
            timesRun += 1;
            if (timesRun === 12) {
                scriptTimerdeferred.reject();
                clearInterval(commandTimer);
            }
            // send the deferred to the next function so it can be resolved when finished
            oController._checkScript(scriptTimerdeferred, bStart, bPreScript);
        }, msTime);
    }

    return $.Deferred(function () {
        var dbcheckDeffered = this;
        promise.done(function () {
            dbcheckDeffered.resolve();
            console.log('Check finished');
            oController._pushTextIntoConsoleModel("output", { "text": "Check finished." });
        });
    });
},
The script it calls has its own promise, as it calls another function:
_checkScript: function (scriptTimerdeferred, bStart, bPreScript) {
    var oProperties = this.getView().getModel("configModel");
    var oParams = oProperties.getProperty("/oConfig/oParams");
    var deferred = $.Deferred();
    var promise = deferred.promise();

    var sCompareStatus1 = "inProg";
    var sCompareStatus2 = this._returnHanaCompareStatus(bStart, bPreScript);
    var sCompareStatus3 = this._returnHanaCompareStatus3(bStart, bPreScript);

    var params = { /* some params */ };

    // Send the command
    this._sendAWSCommand(params, deferred);

    // When command is sent
    promise.done(function (oController) {
        console.log('back to db check script');
        var oCommandOutputModel = oController.getView().getModel("commandOutput");
        var sStatus = oCommandOutputModel.Status;

        // check that it's not in the wrong status for a start/stop,
        // or if it's a pre-script check -> pre-script checks always resolve first time
        if ((sStatus !== sCompareStatus1 && sStatus !== sCompareStatus2 && sStatus !== sCompareStatus3) || bPreScript) {
            scriptTimerdeferred.resolve();
        }
    });
},
This works; however, what it does is:
set a timer to call the first script every x seconds (as the data is currently changing - a server is coming online)
the script runs and calls another function to get some data from the DB
when the call for data is resolved (complete), it comes back to promise.done in the check script and only resolves the timer promise if it meets certain criteria
all the while, the initial timer keeps resending the call, as eventually the DB will come online and the status will change
I am wondering if there is a better way to do this, as currently I could have, for example, 3 calls to the DB that go unresolved and then all resolve at the same time. I would prefer to run a command, wait for it to resolve, check the output, and if it is not right, run the command again.
Thanks!
I think what you want to do can be achieved by carefully reading what is explained in these links:
Promise Retry Design Patterns
In javascript, a function which returns promise and retries the inner async process best practice
See this jsfiddle
var max = 5;
var p = Promise.reject();

for (var i = 0; i < max; i++) {
    p = p.catch(attempt).then(test);
}

p = p.then(processResult).catch(errorHandler);

function attempt() {
    var rand = Math.random();
    if (rand < 0.8) {
        throw rand;
    } else {
        return rand;
    }
}

function test(val) {
    if (val < 0.9) {
        throw val;
    } else {
        return val;
    }
}

function processResult(res) {
    console.log(res);
}

function errorHandler(err) {
    console.error(err);
}
It retries the promise until the condition is satisfied, up to max attempts. Your condition is the point where you said "check the output": if your check fails, retry the promise. Be careful to keep a limit; promises consume memory, and if your api/service/server/callreceiver is down and you don't set a threshold, you could create an endless chain of promises.
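Applied to your case, here is a sketch of a poll loop with plain promises that runs the command, checks the output, and only retries after the previous attempt has fully resolved, so attempts can never overlap; checkStatus and isReady are hypothetical stand-ins for your AWS command call and your status comparison:

function delay(ms) {
    return new Promise(function (resolve) { setTimeout(resolve, ms); });
}

function pollUntilReady(attemptsLeft) {
    return checkStatus().then(function (status) {
        if (isReady(status)) return status;       // done
        if (attemptsLeft <= 1) throw new Error("Timed out waiting for status");
        return delay(10000).then(function () {    // wait, then retry
            return pollUntilReady(attemptsLeft - 1);
        });
    });
}

pollUntilReady(12).then(function (status) {
    console.log('Check finished', status);
}, function (err) {
    console.error(err);
});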
I'm totally new to JS, having jumped in a few days ago to try to make a Chrome extension, so sorry if this is a simple problem, but I can't seem to figure it out.
My original function simply downloaded an image, incremented the stored count by 1, and added on the file size. However, on a page of images it hit Chrome's write limits, so I decided to accumulate the values and write them once at the end.
Initially the return value was produced much later than when the function returned (so it returned nothing), so I looked up how to fix that and got it working with a callback. However, although it waits for the callbacks, the code just continues past them, and the part afterwards is executed before anything else, meaning the final count will always be 0.
// Loop through all urls
var approx_filesize = 0;
for (var i = 1; i < all_urls.length; i++) {
    var image_url = all_urls[i];
    _download_image(image_url, folder_name, function (item) {
        approx_filesize += parseInt(item);
    });
}

// This happens before any _download_image callbacks complete
alert('end' + approx_filesize);

// Write to storage
chrome.storage.sync.get({
    download_count: 0,
    download_size: 0
}, function (items) {
    chrome.storage.sync.set({
        download_count: parseInt(items.download_count) + all_images_data.length - 1,
        download_size: parseInt(items.download_size) + approx_filesize
    }, function () {
    });
});
I just tried moving the loop into its own callback function and still had no luck; the alert runs before the first download completes.
function image_url_loop_callback(callback, folder_name, all_urls) {
    var approx_filesize = 0;
    for (var i = 1; i < all_urls.length; i++) {
        var image_url = all_urls[i];
        _download_image(image_url, folder_name, function (filesize) {
            approx_filesize += parseInt(filesize);
        });
    }
    callback(approx_filesize); // still runs before the downloads finish
}

image_url_loop_callback(function (approx_filesize) {
    alert(approx_filesize);
}, folder_name, all_urls);
How do I make it so that the loop completes before anything else is done?
Edit: got it working with a promise; here's the adjusted code:
new Promise(function (resolve, reject) {
    var count = 1;
    var num_items = all_urls.length;
    var approx_filesize = 0;
    for (var i = 1; i < num_items; i++) {
        var image_url = all_urls[i];
        _download_image(image_url, folder_name, function (item) {
            approx_filesize += parseInt(item);
            count++;
            if (count == num_items) {
                resolve([num_items, approx_filesize]);
            }
        });
    }
}).then(function (stuff) {
    var num_items = stuff[0];
    var approx_filesize = stuff[1];
    chrome.storage.sync.get({
        download_count: 0,
        download_size: 0
    }, function (items) {
        chrome.storage.sync.set({
            download_count: parseInt(items.download_count) + num_items,
            download_size: parseInt(items.download_size) + approx_filesize
        }, function () {
        });
    });
});
Basically, you have to handle the asynchronous aspect of JavaScript.
To do so, you have to use a Promise.
It works this way:

new Promise((resolve) => {
    // My asynchronous code, which calls resolve() when done
}).then(() => {
    // My code which needs to wait for the promise resolution.
});
If you are working with only the latest versions of browsers, you can also have a look at the async/await keywords, which make asynchronous handling much easier than regular promises (but they still are promises).
EDIT: As this answer required further explanation and proper code snippets, I edited it to answer a comment.
This example may be easier to understand:
let myFoo = "Hello";

const test = new Promise((resolve) => {
    console.log(myFoo);
    myFoo = "World!";
    setTimeout(() => {
        resolve();
    }, 4000);
}).then(() => {
    console.log(myFoo);
});
This will print "Hello" immediately, and "World!" 4 seconds later.
This is how you work with promises. You can freely modify variables defined in a scope outside of the promise. Please don't use var; stick to let and define a decent scope.
Due to JavaScript's async nature you have to use promises:
https://developers.google.com/web/fundamentals/getting-started/primers/promises
I have a request-promise function that makes a request to an API. I'm rate-limited by this API and I keep getting the error message:
Exceeded 2 calls per second for api client. Reduce request rates to resume uninterrupted service.
I'm running a couple of Promise.each loops in parallel, which is causing the issue; if I run just one instance of Promise.each, everything runs fine. These Promise.each calls all lead to the same function containing a request-promise call. I want to wrap this function in a queue function with the interval set to 500 milliseconds, so that requests aren't made in parallel or straight after one another, but queued at that interval. The thing is, I still need these promises to get their contents, even if it takes a rather long time to get a response.
Is there anything that will do this for me? Something I can wrap a function in so that it fires at a set interval, not in parallel and not one immediately after another?
Update: perhaps it does need to be promise-specific. I tried to use Underscore's throttle function:
var debug = require("debug")("throttle")
var _ = require("underscore")
var request = require("request-promise")

function requestSite() {
    debug("request started")
    function throttleRequest() {
        return request({
            "url": "https://www.google.com"
        }).then(function (response) {
            debug("request finished")
        })
    }
    // this creates a fresh throttled function on every call and never
    // invokes it, so no request is ever actually fired
    return _.throttle(throttleRequest, 100)
}

requestSite()
requestSite()
requestSite()
And all I got back was this:
$ DEBUG=* node throttle.js
throttle request started +0ms
throttle request started +2ms
throttle request started +0ms
Update: the last answer was wrong; this works, but I still think I can do better:
// uses Bluebird for Promise.delay and .tap
const Promise = require("bluebird");

// call fn at most count times per delay.
const debounce = function (fn, delay, count) {
    let working = 0, queue = [];
    function work() {
        if ((queue.length === 0) || (working === count)) return;
        working++;
        Promise.delay(delay).tap(() => working--).then(work);
        let { context, args, resolve } = queue.shift();
        resolve(fn.apply(context, args));
    }
    return function debounced() {
        return new Promise(resolve => {
            queue.push({ context: this, args: arguments, resolve });
            if (working < count) work();
        });
    };
};

function mockRequest() {
    console.log("making request");
    return Promise.delay(Math.random() * 100);
}

var bounced = debounce(mockRequest, 800, 5);
for (var i = 0; i < 5; i++) bounced();
setTimeout(function () {
    for (var i = 0; i < 20; i++) bounced();
}, 2000);
So you need to make the requests throttle function-wide - that's fine. Promises have queueing pretty much built in.
// note: .delay() here is Bluebird's; with native promises you would
// chain a setTimeout wrapper instead
var p = Promise.resolve(); // our queue

function makeRequest() {
    p = p.then(function () { // queue the promise, wait for the queue
        return request("http://www.google.com");
    });
    var p2 = p; // get a local reference to the promise
    // add a 1000 ms delay to the queue so the next caller has to wait
    p = p.delay(1000);
    return p2;
}
Now makeRequest calls will be at least 1000ms apart.
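For example, three back-to-back calls will now hit the network at least a second apart:

makeRequest().then(function () { console.log("first done"); });
makeRequest().then(function () { console.log("second done"); });
makeRequest().then(function () { console.log("third done"); });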
jfriend has pointed out that you need two requests per second, not a single one - this is just as easily solvable with a second queue:
var p = Promise.resolve(1); // our first queue
var p2 = Promise.resolve(2); // our second queue

function makeRequest() {
    var turn = Promise.any([p, p2]).then(function (val) {
        // add a 1000 ms delay to the winning queue so the next caller has
        // to wait; here we wait for the request too, although that's not
        // really needed - check both options out and decide which works
        // better in your case. The final .return() makes the queue resolve
        // with its own id again, so the next Promise.any can tell them apart.
        if (val === 1) {
            p = p.return(turn).delay(1000).return(1);
        } else {
            p2 = p2.return(turn).delay(1000).return(2);
        }
        return request("http://www.google.com");
    });
    return turn; // return the actual promise
}
This can be generalized to n queues using an array, along these lines:
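A sketch of that generalization, again assuming Bluebird (for Promise.any over an array, .return, and .delay); each queue slot resolves with its own index so the winner can be identified and re-queued:

var queues = [];
for (var i = 0; i < 4; i++) queues.push(Promise.resolve(i)); // n = 4 slots

function makeRequest() {
    var turn = Promise.any(queues).then(function (i) {
        // reserve slot i until this request finishes plus a 1000 ms gap,
        // then have it resolve with its own index again
        queues[i] = queues[i].return(turn).delay(1000).return(i);
        return request("http://www.google.com");
    });
    return turn;
}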