I've got the loop below in my code and each iteration calls a function in which there's an Ajax call.
I'm trying to find a way to make each iteration run only after the previous iteration has completed.
Until now, as you can see, I've used a delay as a temporary solution.
Is there a neat way to make it synchronized?
Thanks!
$.each(matchIds, function(key,val){
setTimeout(function(){
gettingEvents(key, val,userID);
},delay);
delay+=1700;
});
If the Ajax calls really are synchronous, then simply not using setTimeout will do the job.
$.each(matchIds, function(key,val){
gettingEvents(key, val,userID);
});
This is, however, a terrible approach that will tie up the JS event loop while all the requests run. Use asynchronous requests instead. Iterate inside the success handler instead of in an each loop.
var match_id_keys = Object.keys(matchIds);
var i = 0;
get_next();
function get_next() {
if (!match_id_keys[i]) {
return;
}
var key = match_id_keys[i];
var val = matchIds[key];
$.ajax( ... ).done(function (data) {
// Do stuff with data
i++;
get_next();
});
}
In 2020, we now have decent browser support for async and await, which are tools for managing promises, so this can be written as a simpler loop:
async function get_all_ids(matchIds) {
const entries = Object.entries(matchIds);
for (let i = 0; i < entries.length; i++) {
const [key, val] = entries[i];
const data = await $.ajax(...);
// Do stuff with data
}
}
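Since get_all_ids is an async function it returns a promise, so you can tell when the whole sequence has finished. A small usage sketch, reusing the matchIds object from the question:

get_all_ids(matchIds).then(function () {
    console.log('all matches processed');
});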
Related
I have some code that works like this: it iterates over an array with forEach, performs a heavy, UI-freezing task on the current item, pushes the value returned by that heavy task to an array, and then returns that array once the forEach has finished. This is how it would work:
// generate a dummy array
var array = Array.from(Array(1000000).keys());
function doSomething() {
// the array to push to
var pushArray = [];
array.forEach((item) => {
// perform a heavy task...
const output = doHeavyThing(item);
// ...and push the heavy task result to an array
pushArray.push(output);
// (this DOM call will not work, because doHeavyThing blocks the UI)
document.getElementById('updateDiv').innerHTML = item;
})
// return the array that was pushed to
return pushArray;
}
const result = doSomething();
return result;
Is there any way I can temporarily pause the execution of the forEach loop, unfreezing the UI, to allow for a DOM update? This would only require a couple of milliseconds.
I cannot use web workers in this situation, because the data being returned is so large that any calls to postMessage would crash the browser on mobile.
I also haven't been able to get setTimeout to work, because it is asynchronous - meaning I cannot return anything. I also cannot reliably use callbacks.
Does anyone have an idea on how to solve this problem? Clarifying questions are always welcome.
My doHeavyThing function is a function from the seek-bzip package:
bzip.decodeBlock(compressedDataBlock, 32);
I believe something like this would work to ensure the event loop can tick in between calls to doHeavyThing.
// create a promise that will resolve on the next tick of the event loop
function sleep() {
return new Promise(r => setTimeout(r));
}
// standin for doHeavyThing. This is enough to induce
// noticeable delay on my machine, tweak the # of iterations
// if it's too fast to notice on yours
function doHeavyThing() {
for (let i = 0; i < 100; i++) { console.log(i); }
}
async function handleHeavyLifting() {
const array = Array.from(Array(1000).keys());
const result = [];
for (const item of array) {
document.getElementById('updateDiv').innerHTML = item;
result.push(doHeavyThing(item));
// let DOM updates propagate, other JS callbacks run, etc
await sleep();
}
return result;
}
handleHeavyLifting();
<div id="updateDiv">
None
</div>
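A note on the design: with the real million-item array, yielding on every single iteration (one setTimeout per item) adds a lot of overhead. A hedged variant of the same idea that processes the array in chunks; the chunk size of 1000 is an arbitrary number to tune:

async function handleHeavyLiftingChunked(array, chunkSize = 1000) {
    const result = [];
    for (let start = 0; start < array.length; start += chunkSize) {
        // process one chunk synchronously...
        for (const item of array.slice(start, start + chunkSize)) {
            result.push(doHeavyThing(item));
        }
        // ...then update the DOM and yield so the browser can repaint
        document.getElementById('updateDiv').innerHTML = start;
        await sleep();
    }
    return result;
}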
I currently have some jQuery code that looks a bit like this:
for ( i = 0; i < limitVar; i++ ) {
doAjaxStuff(i);
}
function doAjaxStuff( i ) {
// Here we make a SYNCHRONOUS ajax call, sending i.
}
The ajax calls need to run in sequence - one isn't fired until the previous one is done.
As synchronous XHR is deprecated, I want to move this code to use promises. How would I achieve this? I've been unable to find an example that fits this situation closely enough.
You don't do synchronous ajax in the browser (well technically, you can in some circumstances, but it's a really bad idea to do so because it locks up the browser during the ajax call).
Instead, you redesign your loop so that it only carries out the next ajax call when the previous one is done, which means you have to loop manually; you can't just use a plain for loop. Since your code is pseudo-code (you don't show the real ajax operation), I'll use a jQuery ajax example, but you can substitute any ajax function you have as long as it either returns a promise or uses a callback to signal when it's done.
The general idea is that you create a function for your ajax call and you use the completion callback from that to increment your index and then run the next iteration of your loop.
function runLoop(data) {
var i = 0;
function next() {
if (i < data.length) {
return $.ajax(data[i]).then(function(data) {
++i;
return next();
});
} else {
// all done with loop
}
}
return next();
}
// call it like this
runLoop(someArray).then(function() {
// all done here
});
If you don't have an array of data, but just want a loop index:
function runLoop(limitVar) {
var i = 0;
function next() {
if (i < limitVar) {
return $.ajax(something_with_i_in_it).then(function(data) {
++i;
return next();
});
} else {
// all done with loop
}
}
return next();
}
// call it like this
runLoop(theLimit).then(function() {
// all done here
});
If your limitVar is not large and there is no other logic involved in deciding whether to continue the loop, you can also use a little bit simpler pattern if you have an ajax function that returns a promise:
function runLoop(limitVar) {
var p = Promise.resolve();
for (let i = 0; i < limitVar; i++) {
    // let (rather than var) gives each .then() callback its own copy of i
    p = p.then(function(prevResult) {
        return someAjax(i);
    });
}
return p;
}
// call it like this
runLoop(theLimit).then(function() {
// all done here
});
If you aren't using ajax functions that return a promise, then it's only a few lines of code to wrap your function with one that does and then you can more easily use these design patterns.
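For example, here is a minimal sketch of such a wrapper. It assumes the existing doAjaxStuff takes a success callback and an error callback in that order (the parameter order is an assumption, adjust to your real function):

function doAjaxStuffPromise(i) {
    return new Promise(function (resolve, reject) {
        // hand the promise's resolve/reject to the callback-based function
        doAjaxStuff(i, resolve, reject);
    });
}

doAjaxStuffPromise(i) can then be used with any of the promise-based loops above.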
Process the array using a separate function. Each time, take another element off the array, process it, and when it's done call the function again. If there are no more items in the list, the whole process is done.
var listOfRequests = ...;

var allDone = new Promise( function( resolve, reject ) {
    requestNext();

    function requestNext() {
        if ( !listOfRequests.length ) {
            return resolve();
        }
        var next = listOfRequests.shift();
        doAjaxStuff( next, reject, requestNext );
    }
} );

function doAjaxStuff( request, errCallback, doneCallback ) {
    ...
}
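A hypothetical sketch of what doAjaxStuff could look like with jQuery, assuming each request object holds the ajax settings (url, data, and so on):

function doAjaxStuff( request, errCallback, doneCallback ) {
    $.ajax( request )
        .done( function() { doneCallback(); } )
        .fail( function( err ) { errCallback( err ); } );
}

With that wiring, allDone.then(...) fires once every request in listOfRequests has completed, and allDone rejects on the first failure.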
This is a pretty simple pattern:
var queue = Promise.resolve();
var nop = () => null;
for(let i=0; i<limitVar; ++i){
queue = queue.then(() => doAjaxStuff(i));
//or if you want to ignore Errors
//queue = queue.then(() => doAjaxStuff(i)).catch(nop);
}
queue.then(() => console.log("finished"));
Or if you use an Array as input:
var done = data.reduce(
(queue, value, index) => queue.then(() => doSomethingWith(value, index)),
Promise.resolve()
);
done.then(() => console.log("finished"));
I have a case where I want to do something once 10 async calls have completed.
let i = 0;
let array = [];
do {
    this.service.getSomething(i).subscribe(response => {
        array[i] = response;
    });
    i++;
} while (i < 10);
// how can I know when the 10 async calls have completed?
How can I achieve this?
This depends on whether you know the async operations (read Observables/Promises) beforehand or not.
For example if you can compose an array of Observables then the easiest way is to use forkJoin:
let observables = [ ... ];
Observable.forkJoin(observables)
.subscribe(results => /* whatever */);
Otherwise, you can just mergeMap them into a single chain and listen only for the complete signal:
Observable.range(1, 10) // or whatever
.mergeMap(i => /* return Observable here */)
.subscribe(undefined, undefined, () => console.log('all done'));
'The Rx way' is to use forkJoin:
const requestParams = [0,1,2,3,4,5,6,7,8,9];
const requests = requestParams.map(i => this.service.getSomething(i));
Observable.forkJoin(requests).subscribe(responseArray => alldone(responseArray));
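For reference, in RxJS 6+ forkJoin is imported as a standalone function (import { forkJoin } from 'rxjs') rather than called as Observable.forkJoin; the call itself is the same idea (a sketch, assuming the same service as above):

const requests = requestParams.map(i => this.service.getSomething(i));
forkJoin(requests).subscribe(responseArray => alldone(responseArray));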
You have to make your loop asynchronous, so that an iteration will only occur when the next response is available. Here is how you can do that:
(function loop(arr) {
    if (arr.length >= 10) return alldone(arr); // all done
    this.service.getSomething(arr.length).subscribe(response => {
        loop(arr.concat(response)); // "recursive" call
    });
})([]); // <--- pass empty array as argument to the loop function

function alldone(arr) {
    console.log(arr);
}
The loop function is immediately invoked with an empty array as argument. When you get the response, you call that function again, now with the extended array, ...etc. Once you have 10 responses, you call another function that will do something with the final array.
As you can see, I chose to eliminate the variable i, since arr.length has the same value.
Note that this kind of asynchronous processing can also be done with promises and more recent features like async and await. You might want to look into that; here is a sketch of what that could look like:
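(This sketch assumes the same this.service as above and that getSomething(i) returns an RxJS Observable; toPromise(), available in RxJS 5 and 6, converts it to a promise.)

async function alldoneAsync() {
    const arr = [];
    for (let i = 0; i < 10; i++) {
        // wait for each response before requesting the next one
        const response = await this.service.getSomething(i).toPromise();
        arr.push(response);
    }
    console.log(arr); // all 10 responses, in order
}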
You can just count responses in a separate variable, and check it before continuing:
let i = 0;
let array = [];
let finishedCnt = 0;
do {
    const idx = i; // capture the current index for the async callback
    this.service.getSomething(idx).subscribe(response => {
        array[idx] = response;
        finishedCnt++;
        if (finishedCnt >= 10) {
            // all requests done, do something here
        }
    });
    i++;
} while (i < 10);
I am running an event loop of the following form:
var i;
var j = 10;
for (i = 0; i < j; i++) {
asynchronousProcess(function callbackFunction() {
alert(i);
});
}
I am trying to display a series of alerts showing the numbers 0 through 9. The problem is that by the time the callback function is triggered, the loop has already gone through a few iterations and it displays a higher value of i. Any recommendations on how to fix this?
The for loop runs immediately to completion while all your asynchronous operations are started. When they complete some time in the future and call their callbacks, the value of your loop index variable i will be at its last value for all the callbacks.
This is because the for loop does not wait for an asynchronous operation to complete before continuing on to the next iteration of the loop and because the async callbacks are called some time in the future. Thus, the loop completes its iterations and THEN the callbacks get called when those async operations finish. As such, the loop index is "done" and sitting at its final value for all the callbacks.
To work around this, you have to uniquely save the loop index separately for each callback. In Javascript, the way to do that is to capture it in a function closure. That can either be done by creating an inline function closure specifically for this purpose (first example shown below) or you can create an external function that you pass the index to and let it maintain the index uniquely for you (second example shown below).
As of 2016, if you have a fully up-to-spec ES6 implementation of Javascript, you can also use let to define the for loop variable and it will be uniquely defined for each iteration of the for loop (third implementation below). But, note this is a late implementation feature in ES6 implementations so you have to make sure your execution environment supports that option.
Use .forEach() to iterate since it creates its own function closure
someArray.forEach(function(item, i) {
asynchronousProcess(function() {
console.log(i);
});
});
Create Your Own Function Closure Using an IIFE
var j = 10;
for (var i = 0; i < j; i++) {
(function(cntr) {
// here the value of i was passed in as the argument cntr
// and will be captured in this function closure so each
// iteration of the loop can have its own value
asynchronousProcess(function() {
console.log(cntr);
});
})(i);
}
Create or Modify External Function and Pass it the Variable
If you can modify the asynchronousProcess() function, then you could just pass the value in there and have the asynchronousProcess() function pass the cntr back to the callback like this:
var j = 10;
for (var i = 0; i < j; i++) {
asynchronousProcess(i, function(cntr) {
console.log(cntr);
});
}
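For completeness, a hypothetical sketch of what such a modified asynchronousProcess() could look like; the setTimeout is just a stand-in for the real asynchronous work:

function asynchronousProcess(cntr, callback) {
    // do the real async work here; setTimeout simulates it
    setTimeout(function() {
        callback(cntr); // hand the saved value back to the caller's callback
    }, Math.random() * 1000);
}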
Use ES6 let
If you have a Javascript execution environment that fully supports ES6, you can use let in your for loop like this:
const j = 10;
for (let i = 0; i < j; i++) {
asynchronousProcess(function() {
console.log(i);
});
}
let declared in a for loop declaration like this will create a unique value of i for each invocation of the loop (which is what you want).
Serializing with promises and async/await
If your async function returns a promise, and you want to serialize your async operations to run one after another instead of in parallel and you're running in a modern environment that supports async and await, then you have more options.
async function someFunction() {
const j = 10;
for (let i = 0; i < j; i++) {
// wait for the promise to resolve before advancing the for loop
await asynchronousProcess();
console.log(i);
}
}
This will make sure that only one call to asynchronousProcess() is in flight at a time and the for loop won't even advance until each one is done. This is different than the previous schemes that all ran your asynchronous operations in parallel so it depends entirely upon which design you want. Note: await works with a promise so your function has to return a promise that is resolved/rejected when the asynchronous operation is complete. Also, note that in order to use await, the containing function must be declared async.
Run asynchronous operations in parallel and use Promise.all() to collect results in order
function someFunction() {
let promises = [];
for (let i = 0; i < 10; i++) {
promises.push(asynchronousProcessThatReturnsPromise());
}
return Promise.all(promises);
}
someFunction().then(results => {
// array of results in order here
console.log(results);
}).catch(err => {
console.log(err);
});
async/await is here (ES2017), so you can do this kind of thing very easily now.
// await must be used inside an async function
async function run() {
    var i;
    var j = 10;
    for (i = 0; i < j; i++) {
        await asynchronousProcess();
        alert(i);
    }
}
Remember, this works only if asynchronousProcess returns a Promise.
If asynchronousProcess is not under your control, then you can make it return a Promise yourself, like this:
function asyncProcess() {
    return new Promise((resolve, reject) => {
        asynchronousProcess(() => {
            resolve();
        });
    });
}
Then replace the line await asynchronousProcess(); with await asyncProcess();
Understanding Promises before even looking into async/await is a must.
(Also read about browser support for async/await.)
Any recommendation on how to fix this?
Several. You can use bind:
for (i = 0; i < j; i++) {
asynchronousProcess(function (i) {
alert(i);
}.bind(null, i));
}
Or, if your browser supports let (it's part of ES6, and Firefox has supported it for a while), you could have:
for (i = 0; i < j; i++) {
let k = i;
asynchronousProcess(function() {
alert(k);
});
}
Or, you could do the job of bind manually, in case the browser doesn't support it (though you could implement a shim in that case):
for (i = 0; i < j; i++) {
asynchronousProcess(function(i) {
return function () {
alert(i)
}
}(i));
}
I usually prefer let when I can use it (e.g. for Firefox add-on); otherwise bind or a custom currying function (that doesn't need a context object).
var i = 0;
var length = 10;
function for1() {
console.log(i);
for2();
}
function for2() {
if (i == length) {
return false;
}
setTimeout(function() {
i++;
for1();
}, 500);
}
for1();
Here is a sample functional approach to what is expected here.
ES2017: You can wrap the async code inside a function (say XHRpost) that returns a promise (the async code goes inside the promise).
Then call that function (XHRpost) inside the for loop, using the await keyword. :)
let http = new XMLHttpRequest();
let url = 'http://sumersin/forum.social.json';
function XHRpost(i) {
return new Promise(function(resolve) {
let params = 'id=nobot&%3Aoperation=social%3AcreateForumPost&subject=Demo' + i + '&message=Here%20is%20the%20Demo&_charset_=UTF-8';
http.open('POST', url, true);
http.setRequestHeader('Content-type', 'application/x-www-form-urlencoded');
http.onreadystatechange = function() {
console.log("Done " + i + "<<<<>>>>>" + http.readyState);
if(http.readyState == 4){
console.log('SUCCESS :',i);
resolve();
}
}
http.send(params);
});
}
(async () => {
for (let i = 1; i < 5; i++) {
await XHRpost(i);
}
})();
JavaScript code runs on a single thread, so you cannot simply block and wait for the first loop iteration to complete before beginning the next without seriously impacting page usability.
The solution depends on what you really need. If the example is close to exactly what you need, @Simon's suggestion to pass i to your async process is a good one.
In my project, data is distributed across groups of tables. For reading the data, I need to make an async call to each of these groups (1...groupCount).
I need to call another function after all the data present in each of these groups is successfully read. What is the best way to do so?
function getData() {
for(var gc = 1; gc < groupCount; gc++)
readDataFromAsync(gc);
}
Assuming readDataFromAsync returns a jQuery deferred object, use jQuery.when() and pass a callback to run when all is done.
function getData() {
var promises = [];
for (var gc = 1; gc < groupCount; gc++) {
promises.push(readDataFromAsync(gc));
}
$.when.apply(undefined, promises).done(function(/*...*/) {
// your code
});
}
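For this to work, readDataFromAsync has to return the deferred/promise from its ajax call, e.g. something along these lines (the URL here is hypothetical):

function readDataFromAsync(gc) {
    // returning $.ajax's jqXHR (which is promise-like) lets $.when() wait on it
    return $.ajax({ url: '/groups/' + gc + '/data' });
}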
First of all, it is bad practice to make many AJAX calls like this, e.g. inside a loop.
If you can combine them into a single request and get all the responses back at once, that is the best approach, but it requires updating the server-side code as well.
That is not always possible, though, and depending on the circumstances you may need to make the calls like this.
Here are some alternatives.
Use a synchronous call instead of async, as follows:
jQuery.ajax({
url: '.....',
success: function (result) {
//.....
},
async: false
});
But it will be slow, since the calls run one by one, and it is not a good practice at all (synchronous ajax also locks up the browser).
Alternatively, you could keep a counter (ajaxResponse below), increment it in each successful or unsuccessful response, and call a final method when ajaxResponse reaches its cap, as follows:
var ajaxResponse = 0;
function getData() {
ajaxResponse= 0;
for(var gc = 1; gc < groupCount; gc++)
readDataFromAsync(gc);
}
function readDataFromAsync(gc) {
    $.ajax( ... ).done(function () {
        // Your current code
        ajaxResponse++;
        if (ajaxResponse >= groupCount) {
            finalCall();
        }
    });
}
But there are problems with the above approach too: handling the ajaxResponse counter is fiddly, and an error could occur on an ajax call, in which case the counter never reaches its cap.
You could also try setInterval and clearInterval instead of putting an if inside the success handler, but that is also costly.
The last, and best, option is the approach provided by @Li Yin Kong, as follows:
function getData() {
var promises = [];
for (var gc = 1; gc < groupCount; gc++) {
promises.push(readDataFromAsync(gc));
}
$.when.apply(undefined, promises).done(function(/*...*/) {
// your code
});
}