I have some code that works like this: it iterates over an array with forEach, performs a heavy task (which freezes the UI) on the current item, pushes the value returned by that heavy task into an array, and then returns that array once the forEach has finished. Here is how it works:
// generate a dummy array
var array = Array.from(Array(1000000).keys());

function doSomething() {
  // the array to push to
  var pushArray = [];
  array.forEach((item) => {
    // perform a heavy task...
    const output = doHeavyThing(item);
    // ...and push the heavy task result to an array
    pushArray.push(output);
    // (this DOM call will not work, because doHeavyThing blocks the UI)
    document.getElementById('updateDiv').innerHTML = item;
  });
  // return the array that was pushed to
  return pushArray;
}

const result = doSomething();
return result;
Is there any way I can temporarily pause the execution of the forEach loop, unfreezing the UI, to allow for a DOM update? This would only require a couple of milliseconds.
I cannot use web workers in this situation, because the data being returned is so large that any calls to postMessage would crash the browser on mobile.
I also haven't been able to get setTimeout to work, because it is asynchronous, meaning I cannot return anything. I also cannot reliably use callbacks.
Does anyone have an idea on how to solve this problem? Clarifying questions are always welcome.
My doHeavyThing function is a function from the seek-bzip package:
bzip.decodeBlock(compressedDataBlock, 32);
I believe something like this would work to ensure the event loop can tick in between calls to doHeavyThing.
// create a promise that will resolve on the next tick of the event loop
function sleep() {
  return new Promise(r => setTimeout(r));
}

// stand-in for doHeavyThing. This is enough to induce a
// noticeable delay on my machine; tweak the number of iterations
// if it's too fast to notice on yours
function doHeavyThing() {
  for (let i = 0; i < 100; i++) { console.log(i); }
}

async function handleHeavyLifting() {
  const array = Array.from(Array(1000).keys());
  const result = [];
  for (const item of array) {
    document.getElementById('updateDiv').innerHTML = item;
    result.push(doHeavyThing(item));
    // let DOM updates propagate, other JS callbacks run, etc.
    await sleep();
  }
  return result;
}

handleHeavyLifting();
<div id="updateDiv">
None
</div>
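One caveat: with the question's million-item array, awaiting after every single element adds a timer delay per item, which can make the whole run very slow. A common refinement, sketched here under the same assumptions as the snippet above (sleep and doHeavyThing as defined there), is to process the items in chunks and only yield back to the event loop between chunks:

async function handleHeavyLiftingChunked(array, chunkSize = 1000) {
  const result = [];
  for (let i = 0; i < array.length; i += chunkSize) {
    // do a chunk of heavy work synchronously...
    for (const item of array.slice(i, i + chunkSize)) {
      result.push(doHeavyThing(item));
    }
    // ...then update the DOM once per chunk and let the browser repaint
    document.getElementById('updateDiv').innerHTML = i;
    await sleep();
  }
  return result;
}

The chunk size is a tuning knob: larger chunks mean fewer pauses but longer stretches where the UI is frozen.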
Related
I'm trying to run multiple checks inside a for loop. Sometimes within the loop I have to run an AJAX request, and I use the waitForIt function to check whether that request has finished.
After the for loop is finished I want to run the initAndSearchItems function, but that function always runs while the loop is still going.
I have tried await and a variation of the waitForIt function, but I cannot get it to work asynchronously. I want to stay away from using sleep, since that is not clean code. Can someone give me some advice please?
See the code block below for the function I am running. The variable bool is set to false after the AJAX call finishes in another function, and that works fine. The focus here is on running the initAndSearchItems function at the correct time.
this.setFavorietFromJsonString = async function(filterJson) {
  /** set variables */
  filterJson = JSON.parse(filterJson);
  /** loop through keys of json object */
  for (var key in filterJson) {
    /** set variables */
    var selectedValues = filterJson[key];
    /** loop through values of the keys */
    Object.keys(selectedValues).forEach(await function(index) {
      /** give input checked true */
      if ($('.' + key + 'Items#' + selectedValues[index]).length > 0) {
        $('.' + key + 'Items#' + selectedValues[index]).prop('checked', true);
      } else {
        bool = true;
        $('#' + key + 'Lijst').val(selectedValues[index].replace('-', ' '));
        window["groteFilterSearch" + key](true);
        waitForIt(key, selectedValues[index]);

        async function waitForIt(key, value) {
          if (bool === true) {
            setTimeout(function() { waitForIt(key, value) }, 100);
          } else {
            setTimeout(function() {
              $('.' + key + 'Items#' + value).prop('checked', true);
              bool = false;
            }, 200);
          }
        }
      }
    });
  }
  /** set init of listing */
  initAndSearchItems();
};
This stems from the fact that forEach does not honor await; it just keeps iterating.
You can use a for...of loop, which does honor await, but it has the downside that every call waits for the previous one to finish.
To make the AJAX calls run in parallel you can use
await Promise.all()
See: Any difference between await Promise.all() and multiple await?
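To make the two options concrete, here is a minimal sketch. doAjax is a hypothetical promise-returning stand-in for the question's AJAX request:

// Sequential: each await pauses the loop until the previous request resolves.
async function runSequentially(keys) {
  const results = [];
  for (const key of keys) {
    results.push(await doAjax(key)); // doAjax is a hypothetical helper that returns a promise
  }
  return results;
}

// Parallel: start every request immediately, then wait for all of them at once.
async function runInParallel(keys) {
  return Promise.all(keys.map(key => doAjax(key)));
}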
Possible duplicate of:
Using async/await with a forEach loop
I am thinking about a scenario of building up a promise queue:
// Let's assume that promises is an array of promises
var promiseQueue = [];
for (var promise of promises) {
  if (promiseQueue.length) promiseQueue[promiseQueue.length - 1].then(promise);
  promiseQueue.push(promise);
}
I am thinking about implementing a generator function called resolve:
function* resolve() {
  var promise;
  while (promise = yield) Promise.resolve(promise);
}
and then iterating it:
var promiseGenerator = resolve();
The problem is the for..of loop here, which would be responsible for the actual iteration:
for (var r of promiseGenerator) {
}
In the code above the generator will be successfully iterated, but unfortunately I am not aware of any way to pass a parameter to this generator during the for..of iteration.
I would like to clarify that I do not need an alternative; I am perfectly aware that we can do something like this:
for (var p in promiseQueue) promiseGenerator.next(promiseQueue[p]);
I am specifically interested to know whether I can pass parameters to the generator when I execute a for..of cycle.
EDIT
The problem raised by amn is that the yield in the example he/she was focusing on would always get undefined. That is true if we pass undefined to next(), but not if we pass something else. The problem I was raising is that a for..of loop does not allow us to pass anything to yield, which is what this specific question is all about; the example is a mere illustration of the problem, showing that the promises we would create are never created inside a for..of loop. However, there is life for iterable objects outside the realm of for..of loops, and we can pass defined values into the yield. An example with the criticized code chunk can look like:
function* resolve() {
  var promise;
  while (promise = yield) Promise.resolve(promise);
}

var responses = [];
var f = resolve();
var temp;
for (var i = 10; !(temp = f.next(i)).done; i--) responses.push(temp);
As we can see, the yield above cannot be assumed ab ovo to be undefined. And of course we can pass in custom thenables, like
Promise.resolve({
  then: function(onFulfill, onReject) { onFulfill('fulfilled!'); }
});
or even promises which were not resolved yet. The point of the example was to show that we cannot pass values to the yield using the for..of loop, which is quite a feature gap in my opinion.
No, it is not possible to have for..of pass arguments to next.
function* generateItems() { /* ... */ }

for (var item of generateItems()) {
  console.log(item);
}
is mostly short for
function* generateItems() { /* ... */ }

var iterator = generateItems()[Symbol.iterator]();
do {
  const result = iterator.next();
  if (result.done) break;
  const item = result.value;
  console.log(item);
} while (true);
barring a few missing try/catch wrappers. You can see in the spec here that it calls .next with no arguments:
Let nextResult be ? Call(iteratorRecord.[[NextMethod]], iteratorRecord.[[Iterator]], « »).
e.g.
iterator.next.apply(iterator, []);
that is, calling next() with an empty argument list.
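If you do want to feed values into the yield, the workaround is to drive the iterator manually instead of using for..of. A minimal sketch, reusing the question's resolve generator and its promises array:

var promiseGenerator = resolve();
promiseGenerator.next();              // prime the generator: run up to the first yield
for (var promise of promises) {
  promiseGenerator.next(promise);     // this value becomes the result of the yield expression
}
promiseGenerator.next();              // a falsy value ends the while (promise = yield) loop

This is essentially the same pattern the question already mentions with promiseGenerator.next(promiseQueue[p]); the point is simply that the value has to be passed by your own code, because for..of never will.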
This question already has answers here: JavaScript closure inside loops – simple practical example
I have a function that gets an object passed into it, where each key's value is an array of tasks.
I have to call the API to get additional information, which in turn is added back into the object, and the whole object is returned.
My first approach was incorrect because I was trying to pass the data out of the .then(), but that was bad practice.
function asignChecklistItems(taskArray) {
  // get all the people's tasks passed in
  return new Promise(function(resolve, reject) {
    var promises = []
    // Task Array: {"Person":[tasks,tasks,tasks]}
    for (var person in taskArray) {
      var individualTaskPerson = taskArray[person]
      // get the person's tasks
      for (var individualTask in individualTaskPerson) {
        // here we get each individual task
        var task = individualTaskPerson[individualTask]
        // Check if the checklist is present, the .exists is a bool
        if (task.checklist.exists === true) {
          // Here we push the promise back into the promise array
          // retrieve is the async function
          promises.push(
            retrieveCheckListItems(task.checklist.checklistID)
              .then(checklistItems => {
                var complete = []
                var incomplete = []
                const items = checklistItems["checkItems"];
                for (var each in items) {
                  const item = items[each]
                  if (item["state"] === "complete") {
                    complete.push(item["name"])
                  } else {
                    incomplete.push(item["name"])
                  }
                }
                task.checklist.completeItems.push(complete)
                task.checklist.incompleteItems.push(incomplete)
                return taskArray // used to be: resolve(taskArray) See **EDIT**
              })
              .catch(err => {
                logError(err)
                reject(err)
              })
          )
        } else {
          // There's no checklist
        }
      }
    }
    Promise.all(promises).then(function(x) {
      // When checked here, all the items have been pushed to the last person in the object.
      // E.g. {PersonA:[tasks],PersonB:[tasks],PersonC:[tasks]}
      // All of the complete and incomplete items are added to Person C for some reason
      resolve(taskArray)
    })
  })
}
I've tried many approaches: returning the entire promise, trying to return from within the promise (which didn't work because it's not allowed), running the async code earlier, and moving the for loops into the promise. None of them worked; this is the closest I've got, and it attaches all the items to PersonC.
This was mostly based on other SO questions such as this one, which showed how to use Promise.all.
Is this the proper way of calling a promise (async function) for each element of a for loop?
EDIT:
Another mistake in the code is that if there is a promise inside a promise, such as retrieveCheckListItems inside asignChecklistItems, it shouldn't resolve itself; it should return the value instead. I updated the code to reflect that, based on the working production code.
Apparently another problem I was
You are executing task.checklist.completeItems.push(complete) in the retrieveCheckListItems .then, which means the code is asynchronous.
At the same time, var task is instead being assigned in a for...in loop, which means that by the time your .then is triggered, the for...in iteration will be complete and task will be the last assigned object.
Note that var variables have a function scope, which basically means that your code is equivalent to:
function asignChecklistItems(taskArray) {
  // get all the people's tasks passed in
  return new Promise(function(resolve, reject) {
    var promises = []
    var individualTaskPerson;
    var task;
    ...
Fix it by:
Either changing var task to let task (if you are using ES6). This creates a new binding for each loop iteration, rather than a single variable scoped to the enclosing function.
or
By replacing
for (var individualTask in individualTaskPerson) {
// here we get each individual task
var task = individualTaskPerson[individualTask]
with
Object.keys(individualTaskPerson).forEach(function(individualTask) {
  var task = individualTaskPerson[individualTask];
  ...
});
Do the same for the other for...in loops and variables.
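For illustration, a minimal sketch of the first fix (let instead of var) applied to the inner loop, using the names from the question's code; the processing inside .then is elided:

for (var individualTask in individualTaskPerson) {
  // let creates a new binding per iteration, so the .then callback below
  // closes over the task of this iteration rather than the last one assigned
  let task = individualTaskPerson[individualTask]
  if (task.checklist.exists === true) {
    promises.push(
      retrieveCheckListItems(task.checklist.checklistID)
        .then(checklistItems => {
          // ...build the complete/incomplete arrays and push them onto
          // task.checklist as in the question, then...
          return taskArray
        })
    )
  }
}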
I have a case where I want to do something once 10 async calls have completed:
let i = 0;
let array = [];
do {
  this.service.getSomething(i).subscribe(response => {
    array[i] = response;
  });
  i++;
} while (i < 10);
// how can I know when the 10 async calls have completed?
How can I achieve this?
This depends on whether you know the async operations (read: Observables/Promises) beforehand or not.
For example, if you can compose an array of Observables, then the easiest way is to use forkJoin:
let observables = [ ... ];

Observable.forkJoin(observables)
  .subscribe(results => /* whatever */);
Otherwise, you can just mergeMap them into a single chain and listen only to the complete signal:
Observable.range(1, 10) // or whatever
  .mergeMap(i => /* return Observable here */)
  .subscribe(undefined, undefined, () => console.log('all done'));
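For example, wiring in the question's service call might look like this. This is only a sketch in the same RxJS 5 method-chaining style as above; note that mergeMap does not guarantee that responses arrive in request order:

Observable.range(0, 10)
  .mergeMap(i => this.service.getSomething(i))
  .subscribe(
    response => { /* handle each response as it arrives */ },
    undefined,
    () => console.log('all 10 calls have completed')
  );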
'The Rx way' is to use forkJoin:
const requestParams = [0,1,2,3,4,5,6,7,8,9];
const requests = requestParams.map(i => this.service.getSomething(i));
Observable.forkJoin(requests).subscribe(responseArray => alldone(responseArray));
You have to make your loop asynchronous, so that an iteration will only occur when the next response is available. Here is how you can do that:
(function loop(arr) {
  if (arr.length >= 10) return alldone(arr); // all done
  this.service.getSomething(arr.length).subscribe(response => {
    loop(arr.concat(response)); // "recursive" call
  });
})([]); // <--- pass an empty array as argument to the loop function

function alldone(arr) {
  console.log(arr);
}
The loop function is immediately invoked with an empty array as argument. When you get the response, you call that function again, now with the extended array, ...etc. Once you have 10 responses, you call another function that will do something with the final array.
As you can see, I chose to eliminate the variable i, since arr.length has the same value.
Note that this kind of asynchronous processing can also be done with promises and the more recent async and await keywords; you might want to look into that. Here is an example:
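A minimal sketch of what that could look like. getSomethingAsync is a hypothetical promise-returning wrapper around the question's getSomething call, and alldone is the same function as above:

async function loadAll() {
  const arr = [];
  for (let i = 0; i < 10; i++) {
    // getSomethingAsync(i) is assumed to return a promise that resolves
    // with the response for index i
    arr.push(await getSomethingAsync(i));
  }
  alldone(arr);
}

loadAll();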
You can also just count the responses in a separate variable and check it before continuing:
let i = 0;
let array = [];
let finishedCnt = 0;
do {
  this.service.getSomething(i).subscribe(response => {
    array[i] = response;
    finishedCnt++;
    if (finishedCnt >= 10) {
      // all requests done, do something here
    }
  });
  i++;
} while (i < 10);
I've got the loop below in my code and each iteration calls a function in which there's an Ajax call.
I'm trying to find a way to make each iteration run only after the former iteration completed.
Until now, as you can see, I've used a delay as a temporary solution.
Is there a neat way to make it synchronized?
Thanks!
var delay = 0; // running delay between calls
$.each(matchIds, function(key, val) {
  setTimeout(function() {
    gettingEvents(key, val, userID);
  }, delay);
  delay += 1700;
});
If they really are synchronous, then simply not using setTimeout will do the job.
$.each(matchIds, function(key, val) {
  gettingEvents(key, val, userID);
});
This is, however, a terrible approach that will tie up the JS event loop while all the requests run. Use asynchronous requests instead. Iterate inside the success handler instead of in an each loop.
var match_id_keys = Object.keys(matchIds);
var i = 0;
get_next();

function get_next() {
  if (!match_id_keys[i]) {
    return;
  }
  var key = match_id_keys[i];
  var val = matchIds[key];
  $.ajax( ... ).done(function () {
    // Do stuff with data
    i++;
    get_next();
  });
}
In 2020 we now have decent browser support for async and await, which are tools for managing promises, so this can be written as a simpler loop:
async function get_all_ids(matchIds) {
  const entries = Object.entries(matchIds);
  for (let i = 0; i < entries.length; i++) {
    const [key, val] = entries[i];
    const data = await $.ajax(...);
    // Do stuff with data
  }
}
}