Consider a sequence of steps that need to be performed as part of preparing a heavy web page:
step1();
step2();
..
stepk();
Each step may take anywhere from 100 milliseconds to a few seconds, but we are uncertain in advance how long each step takes.
At least until Promise/await hits the street, my understanding is that we use callbacks along with setTimeout.
But how can we keep that from quickly becoming unwieldy? In the following sequence we have two concerns:
how to specify the timeout when the actual work could be up to two orders of magnitude in range
how to handle the passing of arguments - argK in the code shown below - to the nested function invocations
First two steps (of K):
function step1(args1,args2,args3,..) {
// do work for step1 using args1
setTimeout(function() {step2(args2,args3);}, [some timeout..]);
}
function step2(args2,args3,..) {
// do work for step2 using args2
setTimeout(function() {step3(args3 [, args4, args5 ..]);}, [some timeout..]);
}
So how can these sequential steps be structured so that we are not sending a growing list of args down an entire chain of functions?
Note: Web Workers may be a useful approach for some cases, but I want to be able to serve from the local file system, and that apparently precludes them:
http://blog.teamtreehouse.com/using-web-workers-to-speed-up-your-javascript-applications
Restricted Local Access
Web Workers will not work if the web page is being served directly
from the filesystem (using file://). Instead you will need to use a
local development server such as XAMPP.
Without promises or async/await, you must do it callback-hell style:
function step1(a, b, c) {
  setTimeout(() => {
    step2();
  }, timeout); // timeout: pick a delay appropriate for the step
}
Or you can pass references to the next step
If step2 relies on results from step1
function step1(a, b, c, done) {
  setTimeout(() => {
    done(a, b, step3);
  }, timeout);
}
function step2(d, e, done) {
  setTimeout(() => {
    done(e);
  }, timeout);
}
step1("cat","dog","mouse", step2);
If you want to pass args to step2 manually, and get results from step1
function step1(a, b, c, done) {
  setTimeout(() => {
    done(a);
  }, timeout);
}
function step2(d, e, done) {
  return function(step1a) {
    setTimeout(() => {
      done(step1a, d);
    }, timeout);
  };
}
step1("cat", "dog", "mouse", step2("d", "e", step3));
This is probably as clean as you can get without Promisifying your async actions or implementing your own promise style.
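For contrast, here is a hedged sketch of what such a chain can look like once each step is promisified (the step bodies and delays are placeholders, not the original work): the chain stays flat and each step receives only the value it needs.

```javascript
// Each step returns a Promise instead of taking a callback.
function step1(a) {
  return new Promise(resolve => setTimeout(() => resolve(a + "-1"), 100));
}

function step2(prev) {
  return new Promise(resolve => setTimeout(() => resolve(prev + "-2"), 100));
}

// The chain is flat: no growing argument lists passed down nested closures.
step1("cat")
  .then(step2)
  .then(result => console.log(result));
```

Each `.then` hands only the previous step's result forward, which is exactly the "growing list of args" problem dissolved.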
I've been reading up on generator functions and it seems they could be a vanilla JS solution.
Alex Perry wrote a great article with a relevant demo:
function step1() {
setTimeout(function(){
gen.next('data from 1')
}, 500);
}
function step2(data) {
setTimeout(function(){
gen.next(`data from 2 and ${data[0]}`)
}, 700);
}
function step3() {
setTimeout(function(){
gen.next('data from 3')
}, 100);
}
function *sayHello() {
var data = [];
data.push(yield step1());
data.push(yield step2(data));
data.push(yield step3(data));
console.log(data);
}
var gen = sayHello();
gen.next();
In the example above each asynchronous request returns fake data. Each successive step receives an array containing the previous responses, so earlier results can be used.
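For comparison, once async/await is available, the same sequencing can be sketched without an external generator driver (fakeStep below is a stand-in for the timed steps above):

```javascript
// Each fake step resolves after a delay, mirroring the generator demo.
function fakeStep(label, ms) {
  return new Promise(resolve => setTimeout(() => resolve("data from " + label), ms));
}

async function sayHello() {
  const data = [];
  data.push(await fakeStep(1, 500));
  data.push(await fakeStep(2, 700));
  data.push(await fakeStep(3, 100));
  console.log(data);
  return data;
}

sayHello();
```

The generator-plus-`gen.next()` plumbing is what async/await automates: each `await` suspends the function exactly where `yield` did.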
Related
I am attempting to make a simple text game that operates in a socket.io chat room on a node server. The program works as follows:
Currently I have three main modules
Rogue : basic home of rogue game functions
rogueParser : module responsible for extracting workable commands from command strings
Verb_library: module containing a list of commands that can be invoked from the client terminal.
The client types a command like 'say hello world'. This triggers the following socket.io listener:
socket.on('rg_command', function(command){
// execute the verb
let verb = rogueParser(command);
rogue.executeVerb(verb, command, function(result){
console.log(result);
});
});
This in turn invokes the executeVerb function from rogue:
executeVerb: function(verb, data, callback){
verb_library[verb](data, callback);
},
Each verb in verb_library should be responsible for manipulating the database (if required) and then returning an echo string sent to the appropriate targets, representing the completion of the action.
EDIT: I chose 'say' when I posted this but it was pointed out afterward that it was a poor example. 'say' is not currently async but eventually will be as will be the vast majority of 'verbs' as they will need to make calls to the database.
...
say: function(data, callback){
var response = {};
console.log('USR:'+data.user);
var message = data.message.replace('say','');
message = ('you say '+'"'+message.trim()+'"');
response.target = data.user;
response.type = 'echo';
response.message = message;
callback(response);
},
...
My problem is that
1) I am having issues passing callbacks through so many modules. Should I be able to pass a callback through multiple layers of modules? I'm worried that I'm blind to some scope magic that is making me lose track of what should happen when I pass a callback function into a module, which then passes the same callback to another module, which then calls the callback. Currently it seems I either end up without access to the callback at the end, or the first function tries to execute without waiting on the final callback, returning a null value.
2) I'm not sure if I'm making this harder than it needs to be by not using promises, or if this is totally achievable with callbacks, in which case I want to learn how to do it that way before I summon extra code.
Sorry if this is a vague question, I'm in a position of design pattern doubt and looking for advice on this general setup as well as specific information regarding how these callbacks should be passed around. Thanks!
1) Passing a callback through multiple layers doesn't sound like a good idea. Usually I'm thinking: what will happen if I continue doing this for a year? Will it be flexible enough so that when I need to change the architecture (let's say the customer has a new idea), my code will allow me to do so without rewriting the whole app? What you're experiencing is called callback hell. http://callbackhell.com/
What we are trying to do, is to keep our code as shallow as possible.
2) A Promise is just syntactic sugar for callbacks. But it's much easier to think in Promises than in callbacks. So personally, I would advise you to take your time and grasp as much as you can of the programming language's features during your project. The latest way we write asynchronous code is the async/await syntax, which lets us get rid of explicit callback and Promise calls entirely. But along the way, you will have to work with both for sure.
You can try to finish your code this way and, when you're done, find what was the biggest pain and how you could write it again to avoid that pain in the future. I promise you it will be much more educational than getting an explicit answer here :)
Asynchronousness and JavaScript go back a long way. How we deal with it has evolved over time and there are numerous applied patterns that attempt to make async easier. I would say that there are 3 concrete and popular patterns. However, each one is very related to the other:
Callbacks
Promises
async/await
Callbacks are probably the most backwards compatible and just involve providing a function to some asynchronous task in order to have your provided function be called whenever the task is complete.
For example:
/**
* Some dummy asynchronous task that waits 2 seconds to complete
*/
function asynchronousTask(cb) {
setTimeout(() => {
console.log("Async task is done");
cb();
}, 2000);
}
asynchronousTask(() => {
console.log("My function to be called after async task");
});
Promises are a primitive that encapsulates the callback pattern so that instead of providing a function to the task function, you call the then method on the Promise that the task returns:
/**
* Some dummy asynchronous task that waits 2 seconds to complete
* BUT the difference is that it returns a Promise
*/
function asynchronousTask() {
return new Promise(resolve => {
setTimeout(() => {
console.log("Async task is done");
resolve();
}, 2000);
});
}
asynchronousTask()
.then(() => {
console.log("My function to be called after async task");
});
The last pattern is the async/await pattern, which also deals in Promises (themselves an encapsulation of callbacks). It is unique because it provides lexical support for using Promises, so that you don't have to use .then() directly and also don't have to explicitly return a Promise from your task:
/*
* We still need some Promise oriented bootstrap
* function to demonstrate the async/await
* this will just wait a duration and resolve
*/
function $timeout(duration) {
return new Promise(resolve => setTimeout(resolve, duration));
}
/**
* Task runner that waits 2 seconds and then prints a message
*/
(async function() {
await $timeout(2000);
console.log("My function to be called after async task");
}());
Now that our vocabulary is cleared up, we need to consider one other thing: These patterns are all API dependent. The library that you are using uses callbacks. It is alright to mix these patterns, but I would say that the code that you write should be consistent. Pick one of the patterns and wrap or interface with the library that you need to.
If the library deals in callbacks, see if there is a wrapping library or a mechanism to have it deal in Promises instead. async/await consumes Promises, but not callbacks.
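As a sketch of that wrapping, here is one way to lift a callback-style function into a Promise by hand (legacyTask is a made-up stand-in for any library function that uses the standard (err, result) callback convention):

```javascript
// Hypothetical callback-style library function: (input, callback(err, result)).
function legacyTask(input, cb) {
  setTimeout(() => cb(null, input * 2), 10);
}

// Wrap it once, then consume it with Promises or async/await everywhere.
function legacyTaskAsync(input) {
  return new Promise((resolve, reject) => {
    legacyTask(input, (err, result) => (err ? reject(err) : resolve(result)));
  });
}

(async () => {
  const result = await legacyTaskAsync(21);
  console.log(result); // 42
})();
```

In Node, util.promisify does this same wrapping for you when the function follows the error-first convention.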
Callbacks are fine, but I would only use them if a function is dependent on some asynchronous result. If however the result is immediately available, then the function should be designed to return that value.
In the example you have given, say does not have to wait for any asynchronous API call to come back with a result, so I would change its signature to the following:
say: function(data){ // <--- no callback argument
var response = {};
console.log('USR:'+data.user);
var message = data.message.replace('say','');
message = ('you say '+'"'+message.trim()+'"');
response.target = data.user;
response.type = 'echo';
response.message = message;
return response; // <--- return it
}
Then going backwards, you would also change the signature of the functions that use say:
executeVerb: function(verb, data){ // <--- no callback argument
return verb_library[verb](data); // <--- no callback argument, and return the returned value
}
And further up the call stack:
socket.on('rg_command', function(command){
// execute the verb
let verb = rogueParser(command);
let result = rogue.executeVerb(verb, command); // <--- no callback, just get the returned value
console.log(result);
});
Of course, this can only work if all verb methods can return the expected result synchronously.
Promises
If say would depend on some asynchronous API, then you could use promises. Let's assume this API provides a callback system, then your say function could return a promise like this:
say: async function(data){ // <--- still no callback argument, but async!
var response = {};
console.log('USR:'+data.user);
var message = data.message.replace('say','');
response.target = data.user;
response.type = 'echo';
// Convert the API callback system to a promise, and use AWAIT
response.message = await new Promise(resolve => someAsyncAPIWithCallBackAsLastArg(message, resolve));
return response; // <--- return it
}
Again going backwards, you would also change the signature of the functions that use say:
executeVerb: function(verb, data){ // <--- still no callback argument
return verb_library[verb](data); // <--- no callback argument, and return the returned promise(!)
}
And finally:
socket.on('rg_command', async function(command){ // Add async
// execute the verb
let verb = rogueParser(command);
let result = await rogue.executeVerb(verb, command); // <--- await the fulfillment of the returned promise
console.log(result);
});
I was wondering if someone could help me with this issue that I have...
Our client has a legacy API which retrieves messages from users, and they want us to implement a polling mechanism that, based on a specific interval, updates the information within the page. Period.
They want a reliable polling strategy, so (as you may already know) I'm using setTimeout to pull that off.
TL;DR: Does anyone know how to build an efficient polling utility that doesn't leak memory?
I'm trying to pull off a utility that allows me to add certain actions to a "polling list" and run the polling right there.
For example, I have an "actions list" similar to this:
const actions = new Map();
actions.set('1', {
  action: () => {
    console.log('action being executed...');
    return 'a value';
  },
  onFinished: (...param) => { console.log(param); },
  name: 'name of the action',
  id: '1'
});
I'm using Map for both api convenience and lookup performance and I'm adding a fake action to it (some of the params might not be needed but they're there for testing purposes).
Regarding the polling, I've created a delay fn to handle the timeout as a Promise (just for readability's sake; it shouldn't affect call stack usage whatsoever):
function delay(timeout) {
return new Promise(function delay(resolve) {
setTimeout(resolve, timeout);
});
}
And the polling fn that I came up with looks like this:
async function do_stuff(id, interval) {
const action = actions.get(id);
// breakout condition
if (action.status === 'stop') {
return;
}
console.log('processing...');
const response = await action.action();
console.log('executing action onFinished...');
action.onFinished(action.name, response);
delay(interval).then(function callAgain() {
do_stuff(id, interval);
});
}
I've used async/await here because my action.action() calls will mostly be async operations, and I'm using .then after the delay because I want the browser's event table to handle my resolve functions instead of the browser's stack. Also, I'm using named functions for debugging purposes.
To run the polling function, I just do:
const actionId = '1';
const interval = 1000;
do_stuff(actionId, interval);
And to stop the poll of that particular action, I run:
actions.get(actionId).status = 'stop'; // not fancy but effective
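An equivalent formulation, offered only as a sketch, replaces the recursive delay().then chain with a while loop inside a single async function; the semantics are the same, but there is no per-tick closure chaining (the actions map and delay are repeated here so the snippet stands alone):

```javascript
const actions = new Map();
actions.set('1', {
  action: () => 'a value',
  onFinished: (...param) => { console.log(param); },
  name: 'name of the action'
});

function delay(timeout) {
  return new Promise(resolve => setTimeout(resolve, timeout));
}

// Loop version of do_stuff: each iteration awaits the action, reports it,
// then awaits the delay; no recursive scheduling is involved.
async function doStuffLoop(id, interval) {
  while (actions.get(id).status !== 'stop') {
    const action = actions.get(id);
    const response = await action.action();
    action.onFinished(action.name, response);
    await delay(interval);
  }
}

doStuffLoop('1', 50);
setTimeout(() => { actions.get('1').status = 'stop'; }, 180);
```
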
So far so good.... not! This surely has a ton of issues, but the one that bothers me the most is the JS heap usage.
I ran a couple of tests using the Performance Tab from Chrome DevTools (Chrome version 64) and this is what I got:
Using an interval of 10 milliseconds:
- 1000ms: polling started
- 10000ms: polling stopped
- 13000ms: ran a manual GC
Using an interval of 1 second:
- 1000ms: polling started
- 10000ms: polling stopped
- 13000ms: ran a manual GC
Does anyone know why it's behaving like this? Why is the GC running more frequently when I decrease the interval? Is it a memory leak or a stack issue? Are there any tools I could use to keep investigating this issue?
Thanks in advance!
Stuff that I've read:
http://reallifejs.com/brainchunks/repeated-events-timeout-or-interval/ (why choosing setTimeout instead of setInterval)
Building a promise chain recursively in javascript - memory considerations
How do I stop memory leaks with recursive javascript promises?
How to break out of AJAX polling done using setTimeout
https://alexn.org/blog/2017/10/11/javascript-promise-leaks-memory.html
PS: I've put the snippet right here in case anyone wants to give it a try.
const actions = new Map();
actions.set('1', {
  action: () => {
    console.log('action being executed...');
    return 'a value';
  },
  onFinished: (...param) => { console.log(param); },
  name: 'name of the action',
  id: '1'
});
function delay(timeout) {
return new Promise(function delay(resolve) {
setTimeout(resolve, timeout);
});
}
async function do_stuff(id, interval) {
const action = actions.get(id);
// breakout condition
if (action.status === 'stop') {
return;
}
console.log('processing...');
const response = await action.action();
console.log('executing action onFinished...');
action.onFinished(action.name, response);
delay(interval).then(function callAgain() {
do_stuff(id, interval);
});
}
/*
// one way to run it:
do_stuff('1', 1000);
// to stop it
actions.get('1').status = 'stop';
*/
What is the best way to create parallel asynchronous HTTP requests and take the first result that comes back positive? I am familiar with the async library for JavaScript and would happy to use that but am not sure if it has exactly what I want.
Background - I have a Redis store that serves as state for a server. There is an API we can call to get some data that takes much longer than reaching the Redis store.
In most cases the data will already be in the Redis store, but in some cases it won't be there yet and we need to retrieve it from the API.
The simple thing to do would be to query Redis, and if the value is not in Redis then go to the API afterwards. However, we'll needlessly lose 20-50ms if the data is not yet in our Redis cache and we have to go to the API after failing to find the data with Redis. Since this particular API server is not under great load, it won't really hurt to go to the API simultaneously/in parallel, even if we don't absolutely need the returned value.
//pseudocode below
async.minimum([
function apiRequest(cb){
request(opts,function(err,response,body){
cb(err,body.result.hit);
});
},
function redisRequest(cb){
client.get("some_key", function(err, reply) {
cb(err,reply.result.hit);
});
}],
function minimumCompleted(err,result){
// this minimumCompleted final callback function will be only fired once,
// and would be fired by one of the above functions -
// whichever one *first* returned a defined value for result.hit
});
is there a way to get what I am looking for with the async library or perhaps promises, or should I implement something myself?
Use Promise.any([ap, bp]).
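A sketch of how that maps onto the Redis/API example (the two sources are replaced by setTimeout stand-ins): each source is wrapped so a miss rejects, which makes Promise.any resolve with the first positive hit, instead of merely the first completion (which is what Promise.race would give you).

```javascript
// Stand-in for client.get: rejects quickly to simulate a cache miss.
function fromRedis() {
  return new Promise((resolve, reject) => {
    setTimeout(() => reject(new Error('cache miss')), 10);
  });
}

// Stand-in for request(opts, ...): slower, but returns a hit.
function fromApi() {
  return new Promise((resolve) => {
    setTimeout(() => resolve('hit from API'), 30);
  });
}

Promise.any([fromRedis(), fromApi()])
  .then(hit => console.log(hit))
  .catch(() => console.log('both missed')); // AggregateError when all reject
```
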
The following is a possible way to do it without promises. It is untested but should meet the requirements.
To meet the requirement of returning the first success and not just the first completion, I keep a count of the number of completions expected so that if an error occurs it can be ignored unless it is the last error.
function asyncMinimum(a, cb) {
var triggered = false;
var completions = a.length;
function callback(err, data) {
completions--;
if (err && completions !== 0) return;
if (triggered) return;
triggered = true;
return cb(err, data);
}
a.map(function (f) { return f(callback); });
}
asyncMinimum([
function apiRequest(cb){
request(opts,function(err,response,body){
cb(err,body.result.hit);
});
},
function redisRequest(cb){
client.get("some_key", function(err, reply) {
cb(err,reply.result.hit);
});
}],
function minimumCompleted(err,result){
// this minimumCompleted final callback function will be only fired once,
// and would be fired by one of the above functions -
// whichever one had a value for body.result.hit that was defined
});
The async.js library (and even promises) keep track of the number of asynchronous operations pending by using a counter. You can see a simple implementation of the idea in an answer to this related question: Coordinating parallel execution in node.js
We can use the same concept to implement the minimum function you want. Only, instead of waiting for the counter to count all responses before triggering a final callback, we deliberately trigger the final callback on the first response and ignore all other responses:
// IMHO, "first" is a better name than "minimum":
function first (async_functions, callback) {
var called_back = false;
var cb = function () {
if (!called_back) {
called_back = true; // block all other responses
callback.apply(null,arguments)
}
}
for (var i=0;i<async_functions.length;i++) {
async_functions[i](cb);
}
}
Using it would be as simple as:
first([apiRequest,redisRequest],function(err,result){
// ...
});
Here's an approach using promises. It takes a little extra custom code because of the non-standard result you're looking for. You aren't just looking for the first one to not return an error; you're looking for the first one that has a specific type of result, so that takes a custom result-checker function. And, if none get a result, then we need to communicate that back to the caller by rejecting the promise too. Here's the code:
function firstHit() {
return new Promise(function(resolve, reject) {
var missCntr = 0, missQty = 2;
function checkResult(err, val) {
if (err || !val) {
// see if all requests failed
++missCntr;
if (missCntr === missQty) {
reject();
}
} else {
resolve(val);
}
}
request(opts,function(err, response, body){
checkResult(err, body.result.hit);
});
client.get("some_key", function(err, reply) {
checkResult(err, reply.result.hit);
});
});
}
firstHit().then(function(hit) {
// one of them succeeded here
}, function() {
// neither succeeded here
});
The first promise to call resolve() will trigger the .then() handler. If both fail to get a hit, then it will reject the promise.
I'm using Meteor._wrapAsync to force only one call to the function writeMeLater to be
executing at any one time. If 10 calls to writeMeLater are made within 1 second, the other 9 calls should be queued up in order.
To check that writeMeLater is running synchronously, the timestamp field in the Logs Collection should be spaced 1 second apart.
Problem: With the following code, only the first call to writeMeLater is executed; the other 9 do not appear to run. Why is this so?
Server Code:
writeMeLater = function(data) {
console.log('writeMeLater: ', data)
// simulate taking 1 second to complete
Meteor.setTimeout(function() {
Logs.insert({data: data, timestamp: new Date().getTime()})
}, 1 * 1000)
}
writeMeLaterSync = Meteor._wrapAsync(writeMeLater)
// simulate calling the function many times real quick
for(var i=0; i<10; i++) {
console.log('Loop: ', i)
writeMeLaterSync(i)
}
Output:
=> Meteor server running on: http://localhost:4000/
I20140119-11:04:17.300(8)? Loop: 0
I20140119-11:04:17.394(8)? writeMeLater: 0
Using an alternate version of writeMeLater, I get the same problem:
writeMeLater = function(data) {
console.log('writeMeLater: ', data)
setTimeout(Meteor.bindEnvironment( function() {
Logs.insert({data: data, timestamp: new Date().getTime()})
}), 1 * 1000)
}
TL;DR - your writeMeLater function needs to take a callback parameter.
NodeJS classic asynchronous functions usually have this signature:
function async(params..., callback) {
try {
var result = compute(params);
callback(null,result);
}
catch (err) {
callback("something went wrong", null);
}
}
They take any number of parameters, the last one being a callback to be run when the computation is ready, called with 2 parameters: error, which is null if everything is OK, and the result of course.
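As a small illustration of that convention (asyncDouble is a made-up example, not a Meteor API):

```javascript
// Error-first callback convention: callback(error, result).
function asyncDouble(n, callback) {
  setTimeout(() => {
    if (typeof n !== 'number') return callback('not a number', null);
    callback(null, n * 2);
  }, 10);
}

asyncDouble(21, (err, result) => {
  if (err) return console.error(err);
  console.log(result); // 42
});
```
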
Meteor._wrapAsync expects to be given a function with this signature to return a newly pseudo-synchronous function.
Meteor "synchronous" functions allow you to write code in a synchronous style, but they are not truly synchronous like NodeJS fs.readFileSync, for example, which BLOCKS the event loop until it's done (usually this is bad unless you're writing a command-line app, which is not the case with Meteor).
Note: using NodeJS fs *Sync functions in Meteor is bad because you might be tricked into thinking they are "Meteor synchronous" but they aren't; they will block your entire node process until they're done! You should be using fs async funcs wrapped with Meteor._wrapAsync.
A simplified clone of Meteor._wrapAsync would look like this:
var wrapAsync = function(asyncFunc) {
// return a function that appears to run synchronously thanks to fibers/future
return function() {
var future = new Future();
// take the arguments...
var args = arguments;
// ...and append our callback at the end
Array.prototype.push.call(args, function(error, result) {
if (error) {
throw error;
}
// our callback calls future.return which unblocks future.wait
future.return(result);
});
// call the async func with computed args
asyncFunc.apply(null, args);
// wait until future.return is called
return future.wait();
};
};
There is a Future.wrap which does exactly this; Meteor._wrapAsync is a bit more complicated because it handles Meteor environment variables by using Meteor.bindEnvironment.
Fibers and Futures are a bit out of scope, so I won't dive into them; be sure to check the eventedmind.com videos on the subject.
Introducing Fibers - https://www.eventedmind.com/feed/BmG9WmSsdzChk8Pye
Using Futures - https://www.eventedmind.com/feed/kXR6nWTKNctKariSY
Meteor._wrapAsync - https://www.eventedmind.com/feed/Ww3rQrHJo8FLgK7FF
Now that you understand how things need to be done to encapsulate async functions in Meteor, let's fix your code.
If your async function doesn't take a callback as its last argument, it won't call it (obviously), and the callback we pass to it in the wrapped function won't trigger either, which means future.return won't be called. This is why your program blocks in the first place!
You simply have to rewrite writeMeLater to take a callback as its final argument:
var writeMeLater = function(data, callback){
console.log('writeMeLater: ', data);
// simulate taking 1 second to complete
Meteor.setTimeout(function() {
Logs.insert({
data:data,
timestamp:new Date().getTime()
});
callback(null, "done processing " + data);
}, 1 * 1000);
};
And you're good to go !
I have such a function in my JS script:
function heavyWork(){
for (i=0; i<300; i++){
doSomethingHeavy(i);
}
}
Maybe "doSomethingHeavy" is OK by itself, but repeating it 300 times causes the browser window to freeze for a non-negligible time. In Chrome it's not that big of a problem because only one tab is affected; but for Firefox it's a complete disaster.
Is there any way to tell the browser/JS to "take it easy" and not block everything between calls to doSomethingHeavy?
You could nest your calls inside a setTimeout call:
for(...) {
setTimeout(function(i) {
return function() { doSomethingHeavy(i); }
}(i), 0);
}
This queues up calls to doSomethingHeavy for immediate execution, but other JavaScript operations can be wedged in between them.
A better solution is to actually have the browser spawn a new non-blocking process via Web Workers, but that's HTML5-specific.
EDIT:
Using setTimeout(fn, 0) actually takes much longer than zero milliseconds -- Firefox, for example, enforces a minimum 4-millisecond wait time. A better approach might be to use setZeroTimeout, which prefers postMessage for instantaneous, interrupt-able function invocation, but use setTimeout as a fallback for older browsers.
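One way to sketch that idea is with MessageChannel, which exists in browsers and in Node; the original setZeroTimeout used window.postMessage, so treat this as an approximation of the technique rather than the library itself:

```javascript
// postMessage-style scheduling: message delivery is not subject to the
// 4ms setTimeout clamp, so queued callbacks run as soon as possible
// while still yielding to the event loop between them.
const channel = new MessageChannel();
const queue = [];

channel.port1.onmessage = () => {
  const fn = queue.shift();
  if (fn) fn();
};

// Node-only: don't let the open ports keep the process alive (no-op in browsers).
if (typeof channel.port1.unref === 'function') {
  channel.port1.unref();
  channel.port2.unref();
}

function setZeroTimeout(fn) {
  queue.push(fn);
  channel.port2.postMessage(null);
}

setZeroTimeout(() => console.log('ran via message, not timer'));
```
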
You can try wrapping each function call in a setTimeout, with a timeout of 0. This will push the calls to the bottom of the stack, and should let the browser rest between each one.
function heavyWork(){
for (i=0; i<300; i++){
setTimeout(function(){
doSomethingHeavy(i);
}, 0);
}
}
EDIT: I just realized this won't work. The i value will be the same for each loop iteration; you need to make a closure.
function heavyWork(){
for (i=0; i<300; i++){
setTimeout((function(x){
return function(){
doSomethingHeavy(x);
};
})(i), 0);
}
}
You need to use Web Workers
https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Using_web_workers
There are a lot of links on web workers if you search around on google
We need to release control to the browser every so often to avoid monopolizing the browser's attention.
One way to release control is to use a setTimeout, which schedules a "callback" to be called at some period of time. For example:
var f1 = function() {
document.body.appendChild(document.createTextNode("Hello"));
setTimeout(f2, 1000);
};
var f2 = function() {
document.body.appendChild(document.createTextNode("World"));
};
Calling f1 here will add the word hello to your document, schedule a pending computation, and then release control to the browser. Eventually, f2 will be called.
Note that it's not enough to sprinkle setTimeout indiscriminately throughout your program as if it were magic pixie dust: you really need to encapsulate the rest of the computation in the callback. Typically, the setTimeout will be the last thing in a function, with the rest of the computation stuffed into the callback.
For your particular case, the code needs to be transformed carefully to something like this:
var heavyWork = function(i, onSuccess) {
if (i < 300) {
var restOfComputation = function() {
return heavyWork(i+1, onSuccess);
}
return doSomethingHeavy(i, restOfComputation);
} else {
onSuccess();
}
};
var doSomethingHeavy = function(i, callback) {
// ... do some work, followed by:
setTimeout(callback, 0);
};
which will release control to the browser on every step, because doSomethingHeavy hands the rest of the computation to setTimeout.
As another concrete example of this, see: How can I queue a series of sound HTML5 <audio> sound clips to play in sequence?
Advanced JavaScript programmers need to know how to do this program transformation or else they hit the problems that you're encountering. You'll find that if you use this technique, you'll have to write your programs in a peculiar style, where each function that can release control takes in a callback function. The technical term for this style is "continuation passing style" or "asynchronous style".
You can do several things:
optimize the loops - if the heavy work has something to do with DOM access, see this answer
if the function is working with some kind of raw data, use typed arrays MSDN MDN
the method with setTimeout() is a form of iteration. Very useful.
the function seems very straightforward, typical for non-functional programming languages. JavaScript gains an advantage from callbacks SO question.
one newer feature is web workers MDN MSDN wikipedia.
the last thing (maybe) is to combine all the methods - done the traditional way, the function uses only one thread. If you can use web workers, you can divide the work between several of them. This should minimize the time needed to finish the task.
I see two ways:
a) You are allowed to use HTML5 features. Then you may consider using a worker thread.
b) You split the task and queue messages, each of which does just one call at a time, iterating as long as there is something left to do.
There was a person who wrote a specific background-task JavaScript library to do such heavy work; you might check it out in this question here:
Execute Background Task In Javascript
I haven't used it myself; I've only used the thread approach that is also mentioned there.
function doSomethingHeavy(param){
if (param && param%100==0)
alert(param);
}
(function heavyWork(){
for (var i=0; i<=300; i++){
window.setTimeout(
(function(i){ return function(){doSomethingHeavy(i)}; })(i)
,0);
}
}())
There is a feature called requestIdleCallback (pretty recently adopted by most larger platforms) where you can run a function that will only execute when nothing else is occupying the event loop. This means that for less important heavy work, you can execute it safely without ever impacting the main thread, given that each task takes less than 16ms, which is one frame; otherwise the work has to be batched.
I wrote a function to execute a list of actions without impacting the main thread. You can also pass a shouldCancel callback to cancel the workflow at any time. It will fall back to setTimeout:
export const idleWork = async (
actions: (() => void)[],
shouldCancel: () => boolean
): Promise<boolean> => {
const actionsCopied = [...actions];
const isRequestIdleCallbackAvailable = "requestIdleCallback" in window;
const promise = new Promise<boolean>((resolve) => {
if (isRequestIdleCallbackAvailable) {
const doWork: IdleRequestCallback = (deadline) => {
while (deadline.timeRemaining() > 0 && actionsCopied.length > 0) {
actionsCopied.shift()?.();
}
if (shouldCancel()) {
resolve(false);
return; // stop rescheduling once cancelled
}
if (actionsCopied.length > 0) {
window.requestIdleCallback(doWork, { timeout: 150 });
} else {
resolve(true);
}
};
window.requestIdleCallback(doWork, { timeout: 200 });
} else {
const doWork = () => {
actionsCopied.shift()?.();
if (shouldCancel()) {
resolve(false);
return; // stop rescheduling once cancelled
}
if (actionsCopied.length !== 0) {
setTimeout(doWork);
} else {
resolve(true);
}
};
setTimeout(doWork);
}
});
const isSuccessful = await promise;
return isSuccessful;
};
The above will execute a list of functions. The list can be extremely long and expensive, but as long as every individual task stays under 16ms it will not impact the main thread. A warning: not all browsers support this yet, but WebKit does.