Run two javascript functions in parallel with promises [closed] - javascript

(function() {
  function main() {
    call1();
    call2();
  }

  function call1() {
    return new Promise(() => {
      for (let i = 0; i < 100; i++) {
        console.log('This is call-1:', i);
      }
    });
  }

  function call2() {
    return new Promise(() => {
      for (let i = 0; i < 100; i++) {
        console.log('This is call-2:', i);
      }
    });
  }

  main();
})();
http://plnkr.co/edit/NtioG92Tiba1KuKTx24I
The output contains all call-1 statements followed by all call-2 statements. I want to run those 2 calls in parallel. This is just an example to mimic my real code, where I have 2 functions with an ajax call inside each; the success or failure of those calls will trigger another series of calls. So, I want to run those 2 main functions in parallel. Is this the right approach?

This is a rather common misconception about Promises. The Promise constructor (or rather, the function you pass to it) is executed synchronously and immediately.
The following code:
console.log(1);
new Promise(resolve => resolve(console.log(2)));
console.log(3);
Outputs 1 2 3 in that order, always.
Both of the functions you've passed to your Promise constructors are fully synchronous. There's no asynchronous action (like a setTimeout, or reading a file with a callback), so they're executed one after the other.
Unlike what some of the other answers may tell you, Promise.all() will not save you in this case. The event loop, ticks, or any of the other terms you might have heard of do not come into effect. Your code is fully synchronous.
Most JavaScript runtimes (browsers, Node.js, etc) are single threaded, in that your JavaScript code runs in a single thread (the browser uses many threads, but your JS code runs in one).
So in this particular case, there's nothing you can do (short of using workers or other threading options, which you probably don't want to get into). You can't magically make synchronous code asynchronous, even with trickery.
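That said, the asker's real code wraps ajax calls, which are genuinely asynchronous. In that case, starting both requests before waiting on either does let the network work overlap. A minimal sketch, assuming hypothetical fetch endpoints that are not part of the original code:
// The endpoint URLs are placeholders, not part of the original question.
function call1() {
  return fetch('/api/first').then(res => res.json());
}

function call2() {
  return fetch('/api/second').then(res => res.json());
}

function main() {
  // Both requests are started immediately; the network work overlaps,
  // and Promise.all waits until both have resolved.
  return Promise.all([call1(), call2()])
    .then(([data1, data2]) => console.log(data1, data2))
    .catch(err => console.error(err));
}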

Look into Promise.all: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/all.
It takes an array of promises and resolves once every one of them has resolved (it rejects as soon as any of them rejects).
Another option, depending on your use case, is Promise.race.
It settles as soon as the first of the passed promises settles (resolves or rejects): https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/race
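For illustration, a small sketch of both (the delay helper and the timings are made up for the example):
const delay = (ms, value) => new Promise(resolve => setTimeout(() => resolve(value), ms));

Promise.all([delay(100, 'a'), delay(200, 'b')])
  .then(results => console.log(results));   // ['a', 'b'] after ~200 ms

Promise.race([delay(100, 'a'), delay(200, 'b')])
  .then(winner => console.log(winner));     // 'a' after ~100 ms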

I believe you want to run your main function like this:
function main() {
  Promise.all([call1(), call2()])
    .then((data) => {
      // this block will run if and when BOTH calls/promises resolve
      // process the data from the calls
    })
    .catch((err) => {
      // process the err
    });
}
Promise.all is a way to run 2 or more async calls and wait until ALL of them are completed. As mentioned in the comments, they are not running in parallel as you might think.
Also make sure that your call1() and call2() functions call resolve or reject at some point. By calling these functions you let the program know whether the promise has completed or errored. More about it here
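For example, call1() might look something like this once it resolves or rejects; the XMLHttpRequest usage and URL are only illustrative, since the original ajax code isn't shown:
function call1() {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.open('GET', '/api/first');              // placeholder URL
    xhr.onload = () => resolve(xhr.response);   // request completed successfully
    xhr.onerror = () => reject(new Error('request failed'));
    xhr.send();
  });
}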

Actually, your two Promises run in the same thread.
So you may want to use Web Workers.
Web Workers is a simple means for web content to run scripts in background threads. The worker thread can perform tasks without interfering with the user interface. In addition, they can perform I/O using XMLHttpRequest (although the responseXML and channel attributes are always null).
Like the demo below:
let url1 = window.URL.createObjectURL(
  new Blob([document.querySelector('#worker1').textContent])
);
let worker1 = new Worker(url1);

let url2 = window.URL.createObjectURL(
  new Blob([document.querySelector('#worker2').textContent])
);
let worker2 = new Worker(url2);

worker1.onmessage = (msg) => {
  console.log(msg.data);
};
worker2.onmessage = (msg) => {
  console.log(msg.data);
};

worker1.postMessage('init');
worker2.postMessage('init');
<script id="worker1" type="app/worker">
  addEventListener('message', function () {
    for (let i = 0; i < 100; i++) {
      postMessage('This is call-1: ' + i);
    }
  }, false);
</script>
<script id="worker2" type="app/worker">
  addEventListener('message', function () {
    for (let i = 0; i < 100; i++) {
      postMessage('This is call-2: ' + i);
    }
  }, false);
</script>

Related

JavaScript promise blocking code execution [duplicate]

This question already has answers here: Correct way to write a non-blocking function in Node.js (2 answers). Closed 1 year ago.
Could someone try and help me to understand why the first function is non-blocking while the second one blocks the rest of the code? Isn't Promise.resolve the same as resolving from a new Promise? I can't quite wrap my head around it.
function blockingCode() {
  return new Promise((resolve, reject) => {
    for (let i = 0; i < 2500000000; i++) {
      // Doing nothing...
    }
    resolve('ping');
  });
}

function nonBlockingCode() {
  return Promise.resolve().then(() => {
    for (let i = 0; i < 2500000000; i++) {
      // Doing nothing...
    }
    return 'pong';
  });
}

console.time('non-blocking');
nonBlockingCode().then((message) => console.log(message));
console.timeEnd('non-blocking');

// console.time('blocking');
// blockingCode().then((message) => console.log(message));
// console.timeEnd('blocking');
The two functions are actually blocking.
You have the illusion that the second one isn't blocking because the callback you pass to Promise.resolve().then() is scheduled as a microtask for a later turn of the event loop, so console.timeEnd('non-blocking'); is reached before the code inside your promise even starts.
You will notice that both function block if you fix your code like this:
console.time('non-blocking');
nonBlockingCode()
  .then((message) => console.log(message))
  .then(() => console.timeEnd('non-blocking'));
Note that a promise only avoids blocking when the time-consuming logic behind it is delegated to a third party and you just wait for the result (e.g. when a client makes a call to a remote database, a remote web server, etc.).
If you are having a hard time with promises and then logic, have a look at the async / await syntax; I find it much easier to understand.
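For reference, a rough async/await version of the same measurement (behaviour is unchanged; the loop still blocks once it starts running):
async function nonBlockingCode() {
  await Promise.resolve();                   // yields to the microtask queue once
  for (let i = 0; i < 2500000000; i++) {
    // Doing nothing...
  }
  return 'pong';
}

console.time('non-blocking');
nonBlockingCode().then(message => {
  console.log(message);
  console.timeEnd('non-blocking');           // timed after the loop has finished
});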

Is custom promise implementation in Node.js going to block the I/O? [duplicate]

I wrote a simple function that returns a Promise, so it should be non-blocking (in my opinion). Unfortunately, the program looks like it stops and waits for the Promise to finish. I am not sure what can be wrong here.
function longRunningFunc(val, mod) {
  return new Promise((resolve, reject) => {
    sum = 0;
    for (var i = 0; i < 100000; i++) {
      for (var j = 0; j < val; j++) {
        sum += i + j % mod;
      }
    }
    resolve(sum);
  });
}

console.log("before");
longRunningFunc(1000, 3).then((res) => {
  console.log("Result: " + res);
});
console.log("after");
The output looks like expected:
before // delay before printing below lines
after
Result: 5000049900000
But the program waits before printing second and third lines. Can you explain what should be the proper way to get "before" and "after" printed first and then (after some time) the result?
Wrapping code in a promise (like you've done) does not make it non-blocking. The Promise executor function (the callback you pass to new Promise(fn)) is called synchronously and will block, which is why you see the delay before getting output.
In fact, there is no way to create your own plain Javascript code (like what you have) that is non-blocking except putting it into a child process, using a WorkerThread, using some third party library that creates new threads of Javascript or using the new experimental node.js APIs for threads. Regular node.js runs your Javascript as blocking and single threaded, whether it's wrapped in a promise or not.
You can use things like setTimeout() to change "when" your code runs, but whenever it runs, it will still be blocking (once it starts executing nothing else can run until it's done). Asynchronous operations in the node.js library all use some form of underlying native code that allows them to be asynchronous (or they just use other node.js asynchronous APIs that themselves use native code implementations).
But the program waits before printing second and third lines. Can you explain what should be the proper way to get "before" and "after" printed first and then (after some time) the result?
As I said above, wrapping things in promise executor function doesn't make them asynchronous. If you want to "shift" the timing of when things run (thought they are still synchronous), you can use a setTimeout(), but that's not really making anything non-blocking, it just makes it run later (still blocking when it runs).
So, you could do this:
function longRunningFunc(val, mod) {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      sum = 0;
      for (var i = 0; i < 100000; i++) {
        for (var j = 0; j < val; j++) {
          sum += i + j % mod;
        }
      }
      resolve(sum);
    }, 10);
  });
}
That would reschedule the time-consuming for loop to run later and might "appear" to be non-blocking, but it actually still blocks; it just runs later. To make it truly non-blocking, you'd have to use one of the techniques mentioned earlier to get it out of the main Javascript thread.
Ways to create actual non-blocking code in node.js:
Run it in a separate child process and get an asynchronous notification when it's done.
Use the new experimental Worker Threads in node.js v11 (a minimal sketch follows this list)
Write your own native code add-on to node.js and use libuv threads or OS level threads in your implementation (or other OS level asynchronous tools).
Build on top of previously existing asynchronous APIs and have none of your own code that takes very long in the main thread.
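As a minimal sketch of the Worker Threads option (treat this as an illustration rather than a drop-in replacement for the original function; on Node 11 the worker_threads module may still sit behind the --experimental-worker flag):
// worker-sketch.js -- run with: node worker-sketch.js
const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');

if (isMainThread) {
  // Main thread: spawn the worker and receive the result asynchronously.
  const worker = new Worker(__filename, { workerData: { val: 1000, mod: 3 } });
  worker.on('message', (sum) => console.log('Result:', sum));
  console.log('before');   // prints immediately; the loop runs off the main thread
} else {
  // Worker thread: do the CPU-heavy loop here.
  const { val, mod } = workerData;
  let sum = 0;
  for (let i = 0; i < 100000; i++) {
    for (let j = 0; j < val; j++) {
      sum += i + j % mod;
    }
  }
  parentPort.postMessage(sum);
}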
The executor function of a promise is run synchronously, and this is why your code blocks the main thread of execution.
In order to not block the main thread of execution, you need to periodically and cooperatively yield control while the long running task is performed. In effect, you need to split the task into subtasks, and then coordinate the running of subtasks on new ticks of the event loop. In this way you give other tasks (like rendering and responding to user input) the opportunity to run.
You can either write your own async loop using the promise API, or you can use an async function. Async functions enable the suspension and resumption of functions (reentrancy) and hide most of the complexity from you.
The following code uses setTimeout to move subtasks onto new event loop ticks. Of course, this could be generalised, and batching could be used to find a balance between progress through the task and UI responsiveness; the batch size in this solution is only 1, and so progress is slow.
Finally: the real solution to this kind of problem is probably a Worker.
const $ = document.querySelector.bind(document)
const BIG_NUMBER = 1000
let count = 0
// Note that this could also use requestIdleCallback or requestAnimationFrame
const tick = (fn) => new Promise((resolve) => setTimeout(() => resolve(fn), 5))
async function longRunningTask() {
  while (count++ < BIG_NUMBER) await tick();
  console.log(`A big number of loops done.`);
}
console.log(`*** STARTING ***`)
longRunningTask().then(() => console.log(`*** COMPLETED ***`))
$('button').onclick = () => $('#output').innerHTML += `Current count is: ${count}<br/>`
* {
  font-size: 16pt;
  color: gray;
  padding: 15px;
}
<button>Click me to see that the UI is still responsive.</button>
<div id="output"></div>

Callback not getting executed when I expect it

I have the following code that uses fetch. From what I understand, the callback function will not be invoked until the promise is fulfilled. Because of that, I was expecting the callback functions to be executed in the middle of processing other things (such as the for loop). However, it is not doing what I expect. My code is as follows:
console.log("Before fetch");
fetch('https://example.com/data')
  .then(function(response) {
    console.log("In first then");
    return response.json();
  })
  .then(function(json) {
    console.log("In second then");
    console.log(json);
  })
  .catch(function(error) {
    console.log("An error has occured");
    console.log(error);
  });
console.log("After fetch");

for (let i = 0; i < 1000000; i++) {
  if (i % 10000 == 0)
    console.log(i);
}
console.log("The End");
Rather than the callback being immediately run when the promise is fulfilled, it seems to wait until all the rest of my code is processed before the callback function is activated. Why is this?
The output of my code looks like this:
Before fetch
After fetch
0
10000
.
.
.
970000
980000
990000
The End
In first then
In second then
However, I was expecting the last two lines to appear somewhere prior to this point. What is going on here and how can I change my code so that it reflects when the promise is actually fulfilled?
The key here is that the for loop you're running afterwards is a long, synchronous block of code. That is the reason why synchronous APIs are deprecated / not recommended in JavaScript, as they block all asynchronous callbacks from executing until completion. JavaScript is not multithreaded, and it does not have concepts like interrupts in C, so if the thread is executing a large loop, nothing else will have the chance to run until that loop is finished.
In Node.js, the child_process API allows you to run daemon processes, and the Web Worker API for browsers allows concurrent processes to run in parallel, both of these using serialized event-based messaging to communicate between threads, but aside from that, everything in the above paragraph applies universally to JavaScript.
In general, a possible solution to breaking up long synchronous processes like the one you have there is batching. Using promises, you could rewrite the for loop like this:
(async () => {
  for (let i = 0; i < 100000; i++) {
    if (i % 10000 == 0) {
      console.log(i);
      // release control for minimum of 4 ms
      await new Promise(resolve => { setTimeout(resolve, 0); });
    }
  }
})().then(() => {
  console.log("The End");
});

setTimeout(() => { console.log('Can interrupt loop'); }, 1);
Reason for 4ms minimum: https://developer.mozilla.org/en-US/docs/Web/API/WindowOrWorkerGlobalScope/setTimeout#Reasons_for_delays_longer_than_specified
No matter how fast your promise is fulfilled, its callbacks are placed on the event loop queue and only run after all synchronous code has finished. In this example the synchronous code is the for loop. You can even try setTimeout with 0 ms; it will also run after the loop. Remember that JS is single-threaded and doesn't support parallel tasks.
Reference:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/EventLoop
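A tiny, purely illustrative demonstration of that ordering:
setTimeout(() => console.log('timeout callback'), 0);          // queued for after sync code
Promise.resolve().then(() => console.log('promise callback')); // microtask, still after sync code

for (let i = 0; i < 3; i++) {
  console.log('loop', i);
}
// Output: loop 0, loop 1, loop 2, promise callback, timeout callback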
There's no guarantee when the callback will execute. The for-loop requires very little processing time, and the engine cannot run the callbacks until the current synchronous code has finished. There's no way of forcing a specific order of functions either, unless you chain them as callbacks.

Javascript - how to avoid blocking the browser while doing heavy work?

I have such a function in my JS script:
function heavyWork() {
  for (i = 0; i < 300; i++) {
    doSomethingHeavy(i);
  }
}
Maybe "doSomethingHeavy" is ok by itself, but repeating it 300 times causes the browser window to be stuck for a non-negligible time. In Chrome it's not that big of a problem because only one tab is affected, but for Firefox it's a complete disaster.
Is there any way to tell the browser/JS to "take it easy" and not block everything between calls to doSomethingHeavy?
You could nest your calls inside a setTimeout call:
for (...) {
  setTimeout(function(i) {
    return function() { doSomethingHeavy(i); };
  }(i), 0);
}
This queues up calls to doSomethingHeavy for immediate execution, but other JavaScript operations can be wedged in between them.
A better solution is to actually have the browser spawn a new non-blocking process via Web Workers, but that's HTML5-specific.
EDIT:
Using setTimeout(fn, 0) actually takes much longer than zero milliseconds -- Firefox, for example, enforces a minimum 4-millisecond wait time. A better approach might be to use setZeroTimeout, which prefers postMessage for instantaneous, interrupt-able function invocation, but use setTimeout as a fallback for older browsers.
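For illustration, a simplified, browser-only sketch of the postMessage trick (the setZeroTimeout name follows the library mentioned above; this is not its actual implementation):
const zeroTimeoutQueue = [];
const ZERO_TIMEOUT_MESSAGE = 'zero-timeout';     // arbitrary marker string

window.addEventListener('message', (event) => {
  if (event.source === window && event.data === ZERO_TIMEOUT_MESSAGE) {
    event.stopPropagation();
    const fn = zeroTimeoutQueue.shift();
    if (fn) fn();
  }
}, true);

function setZeroTimeout(fn) {
  zeroTimeoutQueue.push(fn);
  window.postMessage(ZERO_TIMEOUT_MESSAGE, '*'); // fires without the 4 ms clamp
}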
You can try wrapping each function call in a setTimeout, with a timeout of 0. This will push the calls to the bottom of the stack, and should let the browser rest between each one.
function heavyWork() {
  for (i = 0; i < 300; i++) {
    setTimeout(function() {
      doSomethingHeavy(i);
    }, 0);
  }
}
EDIT: I just realized this won't work. The i value will be the same for each loop iteration, you need to make a closure.
function heavyWork() {
  for (i = 0; i < 300; i++) {
    setTimeout((function(x) {
      return function() {
        doSomethingHeavy(x);
      };
    })(i), 0);
  }
}
You need to use Web Workers
https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Using_web_workers
There are a lot of links on web workers if you search around on google
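For this question specifically, a rough sketch of that approach might look like the following (heavy-worker.js is a hypothetical file name, and it assumes doSomethingHeavy needs no DOM access):
// main.js -- hand the heavy loop to a worker and keep the UI thread free
const worker = new Worker('heavy-worker.js');
worker.onmessage = (e) => console.log('heavy work finished:', e.data);
worker.postMessage(300);   // number of iterations to run

// heavy-worker.js
self.onmessage = (e) => {
  let result = 0;
  for (let i = 0; i < e.data; i++) {
    result += doSomethingHeavy(i);   // assumes doSomethingHeavy is defined in the worker
  }
  self.postMessage(result);
};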
We need to release control to the browser every so often to avoid monopolizing the browser's attention.
One way to release control is to use a setTimeout, which schedules a "callback" to be called at some period of time. For example:
var f1 = function() {
  document.body.appendChild(document.createTextNode("Hello"));
  setTimeout(f2, 1000);
};

var f2 = function() {
  document.body.appendChild(document.createTextNode("World"));
};
Calling f1 here will add the word hello to your document, schedule a pending computation, and then release control to the browser. Eventually, f2 will be called.
Note that it's not enough to sprinkle setTimeout indiscriminately throughout your program as if it were magic pixie dust: you really need to encapsulate the rest of the computation in the callback. Typically, the setTimeout will be the last thing in a function, with the rest of the computation stuffed into the callback.
For your particular case, the code needs to be transformed carefully to something like this:
var heavyWork = function(i, onSuccess) {
  if (i < 300) {
    var restOfComputation = function() {
      return heavyWork(i + 1, onSuccess);
    };
    return doSomethingHeavy(i, restOfComputation);
  } else {
    onSuccess();
  }
};

var doSomethingHeavy = function(i, callback) {
  // ... do some work, followed by:
  setTimeout(callback, 0);
};
which will release control to the browser on every restOfComputation.
As another concrete example of this, see: How can I queue a series of sound HTML5 <audio> sound clips to play in sequence?
Advanced JavaScript programmers need to know how to do this program transformation or else they hit the problems that you're encountering. You'll find that if you use this technique, you'll have to write your programs in a peculiar style, where each function that can release control takes in a callback function. The technical term for this style is "continuation passing style" or "asynchronous style".
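For comparison, the same transformation can be written with modern async/await; this is only a sketch, assuming doSomethingHeavy is an ordinary synchronous function:
async function heavyWork() {
  for (let i = 0; i < 300; i++) {
    doSomethingHeavy(i);
    // yield to the browser between iterations so it can render and handle input
    await new Promise(resolve => setTimeout(resolve, 0));
  }
}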
You can do many things:
optimize the loops - if the heavy work has something to do with DOM access, see this answer
if the function is working with some kind of raw data, use typed arrays MSDN MDN
the method with setTimeout() is called iteration. Very useful.
the function seems very straightforward, typical for non-functional programming languages. JavaScript gains an advantage from callbacks SO question.
one newer feature is web workers MDN MSDN wikipedia.
the last thing (maybe) is to combine all the methods - with the traditional approach the function uses only one thread. If you can use web workers, you can divide the work between several. This should minimize the time needed to finish the task.
I see two ways:
a) You are allowed to use HTML5 features. Then you may consider using a worker thread.
b) You split this task and queue a message that does just one call at a time, iterating as long as there is something to do.
Someone wrote a specific background-task javascript library to do such heavy work; you might check it out in this question:
Execute Background Task In Javascript
I haven't used it myself, just the thread approach also mentioned there.
function doSomethingHeavy(param) {
  if (param && param % 100 == 0)
    alert(param);
}

(function heavyWork() {
  for (var i = 0; i <= 300; i++) {
    window.setTimeout(
      (function(i) { return function() { doSomethingHeavy(i); }; })(i)
    , 0);
  }
}())
There is a feature called requestIdleCallback (fairly recently adopted by most larger platforms) that runs a function only when nothing else is occupying the event loop. That means less important heavy work can be executed without ever impacting the main thread, provided each task takes less than 16ms (one frame); otherwise the work has to be batched.
I wrote a function to execute a list of actions without impacting the main thread. You can also pass a shouldCancel callback to cancel the workflow at any time. It will fall back to setTimeout:
export const idleWork = async (
  actions: (() => void)[],
  shouldCancel: () => boolean
): Promise<boolean> => {
  const actionsCopied = [...actions];
  const isRequestIdleCallbackAvailable = "requestIdleCallback" in window;

  const promise = new Promise<boolean>((resolve) => {
    if (isRequestIdleCallbackAvailable) {
      const doWork: IdleRequestCallback = (deadline) => {
        while (deadline.timeRemaining() > 0 && actionsCopied.length > 0) {
          actionsCopied.shift()?.();
        }
        if (shouldCancel()) {
          resolve(false);
        }
        if (actionsCopied.length > 0) {
          window.requestIdleCallback(doWork, { timeout: 150 });
        } else {
          resolve(true);
        }
      };
      window.requestIdleCallback(doWork, { timeout: 200 });
    } else {
      const doWork = () => {
        actionsCopied.shift()?.();
        if (shouldCancel()) {
          resolve(false);
        }
        if (actionsCopied.length !== 0) {
          setTimeout(doWork);
        } else {
          resolve(true);
        }
      };
      setTimeout(doWork);
    }
  });

  const isSuccessful = await promise;
  return isSuccessful;
};
The above will execute a list of functions. The list can be extremely long and expensive, but as long as every individual task takes under 16ms it will not impact the main thread. Warning: not all browsers support this yet, but webkit does.
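Usage might look something like this (the actions and the cancel flag are invented for the example):
let cancelled = false;

// 1000 tiny actions; each one should stay well under one frame (~16 ms)
const actions = Array.from({ length: 1000 }, (_, i) => () => {
  console.log('processed item', i);
});

idleWork(actions, () => cancelled).then((completed) => {
  console.log(completed ? 'all actions ran' : 'workflow was cancelled');
});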

How to write a node.js function that waits for an event to fire before 'returning'?

I have a node application that is not a web application - it completes a series of asynchronous tasks before returning 1. Immediately before returning, the results of the program are printed to the console.
How do I make sure all the asynchronous work is completed before returning? I was able to achieve something similar to this in a web application by making sure all tasks were completed before calling res.end(), but I don't have any equivalent for a final 'event' to call before letting a script return.
See below for my (broken) function currently, attempting to wait until callStack is empty. I just discovered that this is a kind of nonsensical approach because node waits for processHub to complete before entering any of the asynchronous functions called in processObjWithRef.
function processHub(hubFileContents) {
  var callStack = [];
  var myNewObj = {};
  processObjWithRef(samplePayload, myNewObj, callStack);
  while (callStack.length > 0) {
    // do nothing
  }
  return 1;
}
Note: I have tried many times previously to achieve this kind of behavior with libraries like async (see my related question at How can I make this call to request in nodejs synchronous?) so please take the answer and comments there into account before suggesting any answers based on 'just use asynch'.
You cannot wait for an asynchronous event before returning--that's the definition of asynchronous! Trying to force Node into this programming style will only cause you pain. A naive example would be to check periodically to see if callstack is empty.
var callstack = [...];

function processHub(contents) {
  doSomethingAsync(..., callstack);
}

// check every second to see if callstack is empty
var interval = setInterval(function() {
  if (callstack.length == 0) {
    clearInterval(interval);
    doSomething();
  }
}, 1000);
Instead, the usual way to do async stuff in Node is to implement a callback to your function.
function processHub(hubFileContents, callback) {
  var callStack = [];
  var myNewObj = {};
  processObjWithRef(samplePayload, myNewObj, callStack, function() {
    if (callStack.length == 0) {
      callback(some_results);
    }
  });
}
If you really want to return something, check out promises; they are guaranteed to emit an event either immediately or at some point in the future when they are resolved.
function processHub(hubFileContents) {
  var callStack = [];
  var myNewObj = {};
  return new Promise(function(resolve) {
    // assuming processObjWithRef takes a callback
    processObjWithRef(samplePayload, myNewObj, callStack, function() {
      if (callStack.length == 0) {
        resolve(some_results);
      }
    });
  });
}

var processHubPromise = processHub(...);
processHubPromise.then(function(result) {
  // do something with 'result' when complete
});
The problem is with your design of the function. You want to return a synchronous result from a list of tasks that are executed asynchronously.
You should implement your function with an extra parameter that will be the callback where you would put the result (in this case, 1) for some consumer to do something with it.
Also you need to have a callback parameter in your inner function, otherwise you won't know when it ends. If this last thing is not possible, then you should do some kind of polling (using setInterval perhaps) to test when the callStack array is populated.
Remember, in Javascript you should never ever do a busy wait. That will lock your program entirely as it runs on a single process.
deasync is designed to address your problem exactly. Just replace
while (callStack.length > 0) {
  // do nothing
}
with
require('deasync').loopWhile(function(){return callStack.length>0;});
The problem is that node.js is single-threaded, which means that if one function runs, nothing else runs (event loop) until that function has returned. So you cannot block a function to make it return after async stuff is done.
You could, for example, set up a counter variable that counts started async tasks and decrement that counter using a callback function (that gets called after the task has finished) from your async code.
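A small sketch of that counter approach (startTaskOne and startTaskTwo are hypothetical async functions that accept a completion callback):
function runAllTasks(done) {
  var pending = 0;

  function taskFinished() {
    pending--;
    if (pending === 0) done(1);   // every started task has called back
  }

  pending++; startTaskOne(taskFinished);
  pending++; startTaskTwo(taskFinished);
}

runAllTasks(function (result) {
  console.log('all async work finished:', result);
});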
Node.js runs on a single-threaded event loop and leverages asynchronous calls for doing various things, like I/O operations.
If you need to wait for a number of asynchronous operations to finish before executing additional code, you can try using Async -
Node.js Async Tutorial
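For example, a rough sketch using the async library's parallel helper (the task bodies are placeholders):
// npm install async
var async = require('async');

async.parallel([
  function (cb) { setTimeout(function () { cb(null, 'first result'); }, 100); },
  function (cb) { setTimeout(function () { cb(null, 'second result'); }, 50); }
], function (err, results) {
  if (err) return console.error(err);
  console.log(results);   // ['first result', 'second result'], in task order
});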
You'll need to start designing and thinking asynchronously, which can take a little while to get used to at first. This is a simple example of how you would tackle something like "returning" after a function call.
function doStuff(param, cb) {
  // do something
  var newData = param;
  // "return"
  cb(newData);
}

doStuff({some: data}, function(myNewData) {
  // you're done with doStuff in here
});
There are also a lot of helpful utility functions in the async library available on npm.
