Why does the browser not freeze when awaiting these promises? [duplicate] - javascript

When using Javascript promises, does the event loop get blocked?
My understanding is that using await & async makes the stack stop until the operation has completed. Does it do this by blocking the stack, or does it act similarly to a callback and pass off the process to an API of sorts?

When using Javascript promises, does the event loop get blocked?
No. Promises are only an event notification system. They aren't an operation themselves. They simply respond to being resolved or rejected by calling the appropriate .then() or .catch() handlers, and if chained to other promises, they can delay calling those handlers until the promises they are chained to also resolve/reject. As such, a single promise doesn't block anything and certainly does not block the event loop.
My understanding is that using await & async makes the stack stop
until the operation has completed. Does it do this by blocking the
stack, or does it act similarly to a callback and pass off the process to
an API of sorts?
await is simply syntactic sugar that replaces a .then() handler with slightly simpler syntax. But, under the covers the operation is the same. The code that comes after the await is basically put inside an invisible .then() handler, and there is no blocking of the event loop, just like there is no blocking with a .then() handler.
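For example, these two functions behave the same way (a minimal sketch; doSomethingAsync is just a placeholder for any promise-returning call, not something from the question):

// Placeholder for any promise-returning call.
const doSomethingAsync = () => Promise.resolve('some value');

// With await: the line after the await is, in effect, the .then() handler.
async function withAwait() {
  const result = await doSomethingAsync();
  console.log(result);
}

// Equivalent shape with an explicit .then() handler.
function withThen() {
  return doSomethingAsync().then((result) => {
    console.log(result);
  });
}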
Note to address one of the comments below:
Now, if you were to construct code that overwhelms the event loop with continually resolving promises (in some sort of infinite loop as proposed in some comments here), then the event loop will just process those continually resolved promises from the microtask queue over and over and will never get a chance to process macrotasks waiting in the event loop (other types of events). The event loop is still running and is still processing microtasks, but if you are stuffing new microtasks (resolved promises) into it continually, then it may never get to the macrotasks.

There seems to be some debate about whether one would call this "blocking the event loop" or not. That's just a terminology question; what's more important is what is actually happening. In this example of an infinite loop continually resolving a new promise over and over, the event loop will continue processing those resolved promises, and the other events in the event queue will not get processed because they never get to the front of the line to get their turn. This is more often referred to as "starvation" than as "blocking", but the point is that macrotasks may not get serviced if you are continually and infinitely putting new microtasks in the queue.
This notion of an infinite loop continually resolving a new promise should be avoided in Javascript. It can starve other events from getting a chance to be serviced.
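As a rough sketch of the pattern to avoid (note that this will hang: the timer that would stop the loop never gets a chance to run because the microtask queue never empties):

// Anti-pattern sketch: each resolved promise immediately queues another microtask,
// so the setTimeout callback (a macrotask) below never gets a turn and the loop never ends.
let keepGoing = true;
setTimeout(() => { keepGoing = false; }, 10); // never runs while the microtask loop spins

function spin() {
  if (keepGoing) {
    return Promise.resolve().then(spin); // re-queues itself on the microtask queue forever
  }
}
spin();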

Do Javascript promises block the stack
No, not the stack. The current job will run until completion before the Promise's callback starts executing.
When using Javascript promises, does the event loop get blocked?
Yes it does.
Different environments have different event-loop processing models, so I'll be talking about the one in browsers, but even though nodejs's model is a bit simpler, they actually expose the same behavior.
In a browser, Promises' callbacks (PromiseReactionJob in ES terms), are actually executed in what is called a microtask.
A microtask is a special task that gets queued in the special microtask-queue.
This microtask-queue is visited several times during a single event-loop iteration, at what are called microtask-checkpoints: every time the JS call stack is empty, for instance after the main task is done, after rendering events like resize have been dispatched, after every animation-frame callback, etc.
These microtask-checkpoints are part of the event-loop, and will block it the time they run just like any other task.
What is more, a microtask scheduled from within a microtask-checkpoint will get executed by that same microtask-checkpoint.
This means that simply using a Promise doesn't let the event loop breathe the way a setTimeout()-scheduled task would. Even though the JS stack has been emptied and the previous task has executed entirely before the callback is called, you can still completely lock the event loop, never allowing it to process any other task or even update the rendering:
const log = document.getElementById( "log" );
let now = performance.now();
let i = 0;

const promLoop = () => {
  // only the final result will get painted
  // because the event loop can never reach the "update the rendering" steps
  log.textContent = i++;
  if( performance.now() - now < 5000 ) {
    // this doesn't let the event loop loop
    return Promise.resolve().then( promLoop );
  }
  else { i = 0; }
};

const taskLoop = () => {
  log.textContent = i++;
  if( performance.now() - now < 5000 ) {
    // this does let the event loop loop
    postTask( taskLoop );
  }
  else { i = 0; }
};

document.getElementById( "prom-btn" ).onclick = start( promLoop );
document.getElementById( "task-btn" ).onclick = start( taskLoop );

function start( fn ) {
  return (evt) => {
    i = 0;
    now = performance.now();
    fn();
  };
}

// Posts a "macro-task".
// We could use setTimeout, but that method gets throttled
// to 4ms after 5 recursive calls.
// So instead we use either the upcoming postTask API
// or the MessageChannel API, which are not affected
// by this limitation.
function postTask( task ) {
  // Available in Chrome 86+ under the 'Experimental Web Platforms' flag
  if( window.scheduler ) {
    return scheduler.postTask( task, { priority: "user-blocking" } );
  }
  else {
    const channel = postTask.channel ||= new MessageChannel();
    channel.port1
      .addEventListener( "message", () => task(), { once: true } );
    channel.port2.postMessage( "" );
    channel.port1.start();
  }
}
<button id="prom-btn">use promises</button>
<button id="task-btn">use postTask</button>
<pre id="log"></pre>
So beware: using a Promise doesn't help at all with letting the event loop actually loop.
Too often we see code using a batching pattern to avoid blocking the UI that completely fails at its goal because it assumes Promises will let the event loop loop. For this, keep using setTimeout() as a means of scheduling a task, or use the postTask API once it becomes available.
My understanding is that using await & async, makes the stack stop until the operation has completed.
Kind of... when awaiting a value, the remainder of the function's execution is added to the callbacks attached to the awaited Promise (which may be a new Promise resolving the non-Promise value).
So the stack is indeed cleared at this point, but the event loop is not blocked at all here; on the contrary, it has been freed to execute anything else until the Promise resolves.
This means that you can very well await a never-resolving promise and your browser will still work correctly.
async function fn() {
  console.log( "will wait a bit" );
  const prom = await new Promise( (res, rej) => {} );
  console.log( "done waiting" );
}
fn();
onmousemove = () => console.log( "still alive" );
move your mouse to check if the page is locked

An await blocks only the current async function, the event loop continues to run normally. When the promise settles, the execution of the function body is resumed where it stopped.
Every async/await can be transformed into an equivalent .then(…)-callback program, and works just like that from the concurrency perspective. So while a promise is being awaited, other events may fire and arbitrary other code may run.

As others mentioned above, Promises are just an event notification system, and async/await is the same as then(). However, be very careful: you can "block" the event loop by executing a blocking operation. Take a look at the following code:
function blocking_operation_inside_promise(){
  return new Promise ( (res, rej) => {
    while( true ) console.log(' loop inside promise ')
    res();
  })
}

async function init(){
  let await_forever = await blocking_operation_inside_promise()
}

init()
console.log('END')
The END log will never be printed. JS is single-threaded and that thread is busy right now. You could say the whole thing is "blocked" by the blocking operation. In this particular case the event loop is not blocked per se, but it won't deliver events to your application because the main thread is busy.
JS/Node can be a very useful programming language, and it is very efficient when using non-blocking operations (like network operations). But do not use it to execute CPU-intensive algorithms. If you are in the browser, consider using Web Workers; if you are on the server side, use Worker Threads, Child Processes, or a microservice architecture.
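For instance, on the server side a CPU-heavy loop can be moved off the main thread with Node's worker_threads module. This is a minimal single-file sketch; the loop size and messages are illustrative, not taken from the question:

// worker-demo.js -- a sketch using worker_threads
const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');

if (isMainThread) {
  // Main thread: spawn a worker for the CPU-heavy loop and stay responsive.
  const worker = new Worker(__filename, { workerData: { n: 1e9 } });
  worker.on('message', (result) => console.log('worker finished:', result));
  worker.on('error', (err) => console.error('worker failed:', err));
  console.log('END'); // prints immediately, unlike the blocking example above
} else {
  // Worker thread: the heavy loop runs here, off the main thread.
  let i = 0;
  while (i < workerData.n) i++;
  parentPort.postMessage(i);
}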

Related

event emitter emit in sequence or in parallel, and behaviour when they are async

Consider the following code:
import events from 'events';

const eventEmitter = new events.EventEmitter();

eventEmitter.on('flush', (arg) => {
  let i = 0;
  while (i <= 100000) {
    console.log(i, arg);
    i++;
  }
})

setTimeout(() => {
  eventEmitter.emit('flush', `Fourth event`);
}, 5000);
setTimeout(() => {
  eventEmitter.emit('flush', `Third event`);
}, 4000);
setTimeout(() => {
  eventEmitter.emit('flush', `First event`);
}, 2000);
setTimeout(() => {
  eventEmitter.emit('flush', `Second event`);
}, 3000);
Output1:
1 First event
2 First event
3 First event
.
.
.
1 Second event
2 Second event
3 Second event
.
.
.
1 Third event
2 Third event
3 Third event
.
.
.
1 Fourth event
2 Fourth event
3 Fourth event
What I wanted to know was: does the emitted event get completed first, and only then does the second event get emitted? Or can I expect something like this:
Output2:
1 First event
1 Second event
2 Third event
3 Fourth event
3 Third event
.
.
.
What if I wrote the emitter.on function like this:
eventEmitter.on('flush', (arg) => {
  const mFunc = async () => {
    let i = 0;
    while (i <= 100000) {
      console.log(i, arg);
      i++;
    }
  }
  mFunc();
})
I was expecting output like Output2, but instead I got something similar to Output1. Could someone please explain this behaviour to me?
Also, consider this case:
eventEmitter.on('flush', (arg) => {
  const mFunc = async () => {
    let i = 0;
    while (i <= 100000) {
      console.log(i, arg);
      i++;
      await updateValueInDatabase(i);
    }
  }
  mFunc();
})
Now, what would be the behaviour of the function?
Coalescing my comments into an answer and adding more commentary and explanation...
Emitting an event from an EventEmitter is 100% synchronous code. It does not go through the event loop. So, once you call .emit(), those event handlers will run until they are done before any other system events (e.g. your timers) can run.
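A minimal sketch of that synchronous behavior:

const EventEmitter = require('events');
const emitter = new EventEmitter();

emitter.on('ping', () => console.log('listener runs'));

console.log('before emit');
emitter.emit('ping');       // the listener runs right here, synchronously
console.log('after emit');  // logs only after the listener has finished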
From the calling point of view, calling an async function is NO different than calling a non-async function. It doesn't change anything. Now, inside an async function, you could use await and that would cause it to suspend execution and immediately return a promise, but without await, making a function async just makes it return a promise and doesn't change anything else. So, calling the async version of mFunc() is exactly the same as if it wasn't async (except that it returns a promise which you don't use). It doesn't change the sequencing of events at all.
Since you apparently thought it would change things, I'd really suggest reading a bunch more about what an async function is because your perception of what it does is apparently different than what it actually does. All async functions return a promise and catch internal exceptions which they turn into a rejected promise. If you don't use await, they just run synchronously like everything else and then return a promise.
Calling an async function returns ONLY when you hit an await or when the function finishes executing and hits a return or the implicit return at the end of the function. If there's no await, it just runs like a normal function and then returns a promise (which you aren't doing anything with). As I said above, you really need to read more about what async functions actually are. Your current understanding is apparently incorrect.
Here are some summary characteristics of an async function:
They always return a promise.
That promise is resolved with whatever value the function eventually returns with a return statement or resolved with undefined if no return statement.
They run synchronously until they either hit a return statement or an await.
At the point they hit an await, they immediately return a promise and further execution of the function is suspended until the promise that await is on is resolved.
When they suspend and return a promise, the caller receives that promise and keeps executing. The caller is not suspended unless the caller also does an await on that promise.
They also catch any synchronous exceptions or other exceptions that aren't themselves in an asynchronous callback and turn those exceptions into a rejected promise so the promise that the async function returns will be rejected with the exception as the reason.
So, if there is no await in an async function, they just run synchronously and return a promise.
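A small sketch illustrating these points (the function names are made up for illustration):

async function noAwait() {
  console.log('runs synchronously');
  return 42;               // becomes the resolved value of the returned promise
}

async function withAwait() {
  console.log('runs synchronously up to the await');
  await Promise.resolve(); // suspends here and returns a pending promise to the caller
  console.log('resumes later, after the current synchronous code finishes');
}

noAwait().then((v) => console.log('resolved with', v));
withAwait();
console.log('caller keeps running');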
In your last version of code, if you end up making your event emitter handler actually be asynchronous and actually await long enough for other things to get some cycles, then you can create some competition between timers and promises waiting to notify their listeners. The promises waiting to notify their listeners will get to run before the timers. That makes your situation of mixing those two types very complicated and very dependent upon timings. A non-ending sequence of promises waiting to notify their listeners can make timers wait until there are no more promises waiting to notify. But, if there's a moment with no promises waiting to notify, then your next timer will fire and will kick off it's own set of promise-driven operations and then all the promise-driven operations will likely interleave.
Also, emitter.emit() is not promise-aware. It doesn't pay any attention to the return value from the callbacks that are listening for the emit. So, it doesn't make any difference at all to emitter.emit() whether the listeners are async or not. As soon as they return (whether they returned a promise or not), it just goes right on to whatever it was going to do next. Promises only influence code flow if the recipient uses them with await or .then() and .catch(). In this case, the recipient does nothing with the return value so emitter.emit() just goes right onto its next order of business and executes that.
okay, so, if I have a bunch of async function arrays [async1, async2, async3, ....], and they all have await statements internally, what would be the best way to execute them in sequential order? i.e. one after other in order of their index?
Well, if you had an array of async functions that properly resolve their promise when they are actually done with their work, you can execute them sequentially by just looping through the array with an await.
async function someFunc() {
  const jobs = [asyncFn1, asyncFn2, asyncFn3];
  for (let job of jobs) {
    let result = await job(...);
  }
}
Internally, the EventEmitter holds an array of listeners for each event type. When you emit an event type, the EventEmitter just goes through its array of listeners and executes each one. Therefore, the listeners are executed in the order they were added by the addListener or on method.
To answer your questions there are two parts: a) the listeners execute in the order you added them, and b) it depends on whether the listeners are async or not. If your listeners are synchronous, then you can expect that behavior. If your listeners are async, you can't expect that. Even if you execute your async listeners synchronously, they will not necessarily resolve at the same time, because that's the nature of being async.
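For example, if each listener actually awaits something asynchronous inside its loop, the work after each await can interleave between emits (a rough sketch; the delay() helper is just a stand-in for real async work such as a database call):

const EventEmitter = require('events');
const emitter = new EventEmitter();
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

emitter.on('flush', async (arg) => {
  for (let i = 1; i <= 3; i++) {
    await delay(10);        // yields back to the event loop on every iteration
    console.log(i, arg);
  }
});

emitter.emit('flush', 'first');  // both emits start the handler synchronously,
emitter.emit('flush', 'second'); // but the parts after each await interleave:
// 1 first, 1 second, 2 first, 2 second, 3 first, 3 second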

Async/Await control flow in JavaScript/TypeScript in Node.js

Context: I'm trying to implement a rudimentary socket pool in TypeScript. My current implementation is just a list of sockets that have an "AVAILABLE/OCCUPIED" enum attached to them (it could be a boolean admittedly) that allows me to have a mutex-like mechanism to ensure each socket only sends/receives a single message at once.
What I understand: I got that Node.js's way of handling "parallel" operations is "single-threaded asynchrony".
What I infer: to me, this means that there is only a single "control pointer"/"code read-head"/"control flow position" at once, since there is a single thread. It seems to me that the readhead only ever jumps to somewhere else in the code when "await" is called, and the Promise I am "awaiting" cannot yet be resolved. But I am not sure that this is indeed the case.
What I am wondering: does "single-threaded asynchrony" ensure that there is indeed no jump of the control flow position at any other time than when "await" is called ? Or is there some underlying scheduler that may indeed cause jumps between tasks at random moments, like normal multithreading ?
My question: All of this to ask, do I need a pure mutex/compare-and-swap mechanism to ensure that my mutex-like AVAILABLE/OCCUPIED field is set appropriately ?
Consider the following code:
export enum TaskSocketStatus
{
  AVAILABLE, // Alive and available
  OCCUPIED,  // Alive and running a task
}

export interface TaskSocket
{
  status: TaskSocketStatus;
  socket: CustomSocket;
}

export class Server // A gateway that acts like a client manager for an app needing to connect to another secure server
{
  private sockets: TaskSocket[];

  [...]

  private async Borrow_Socket(): Promise<TaskSocket|null>
  {
    for (const socket of this.sockets)
    {
      if (!socket.socket.Is_Connected())
      {
        await this.Socket_Close(socket);
        continue;
      }
      if (socket.status === TaskSocketStatus.AVAILABLE)
      {
        // This line is where things could go wrong if the control flow jumped to another task;
        // i.e., where I'd need a mutex or compare-and-swap before setting the status
        socket.status = TaskSocketStatus.OCCUPIED;
        return (socket);
      }
    }
    if (this.sockets.length < this.max_sockets)
    {
      const maybe_socket = await this.Socket_Create();
      if (maybe_socket.isError())
      {
        return null;
      }
      // Probably here as well
      maybe_socket.value.status = TaskSocketStatus.OCCUPIED;
      return maybe_socket.value;
    }
    return null;
  }

  [...]
}
The issue I'm looking to avoid is two different "SendMessage" tasks borrowing the same socket because of race conditions. Maybe this is needless worry, but I'd like to make sure, as this is a potential issue that I would really prefer not to have to confront when the server is already in production...
Thanks for your help !
So, control flow does not move to another operation when await is called. It moves when the running piece of Javascript returns back to the event loop and the event loop can then service the next waiting event. Resolved promises work via the event loop too (a special queue, but still in the event loop).
So, when you hit await, that doesn't immediately jump control somewhere else. It suspends further execution of that function and then causes the function to immediately return a promise and control continues with a promise being returned to the caller of the function. The caller's code continues to execute after receiving that promise. Only when the caller or the caller of the caller or the caller of the caller of the caller (depending upon how deep the call stack is) returns back to the event loop from whatever event started this whole chain of execution does the event loop get a chance to serve the next event and start a new chain of execution.
Some time later, when the underlying asynchronous operation connected to that original await finishes, it will insert an event into the event queue. When other Javascript execution returns control back to the event loop and this event gets to the start of the event queue, it will get executed and will resolve the promise that the await was waiting for. Only then does the code within the function after the await get a chance to run. When that async function that contained the await finally finishes its internal execution, then the promise that was originally returned from the async function when that first await was hit will resolve and the caller will be notified that the promise it got back has been resolved (assuming it used either await or .then() on that promise).
So, there's no jumping of flow from one place to another. The current thread of Javascript execution returns control back to the event loop (by returning and unwinding its call stack) and the event loop can then serves the next waiting event and start a new chain of execution. Only when that chain of execution finishes and returns can the event loop go get the next event and start another chain of execution. In this way, there's just the one call stack frame going at a time.
In your code, I don't quite follow what you're concerned about. There is no pre-emptive switching in Javascript. If your function does an await, then its execution will be suspended at that point and other code can run before the promise gets resolved and it continues execution after the await. But, there's no pre-emptive switching that could change the context and run other code in this thread without your code calling some asynchronous operation and then continuing in the complete callback or after the await.
So, from a pure Javascript point of view, there's no worry between pure local Javascript statements that don't involve asynchronous operations. Those are guaranteed to be sequential and uninterrupted (we're assuming there's none of your code involved that uses shared memory and worker threads - which there is no sign of in the code you posted).
What I am wondering: does "single-threaded asynchrony" ensure that there is indeed no jump of the control flow position at any other time than when "await" is called ?
It ensures that there is no jump of the control flow position at any time except when you return back the event loop (unwind the call stack). It does not occur at await. await may lead to your function returning and may lead to the caller then returning back to the event loop while it waits for the returned promise to resolve, but it's important to understand that the control flow change only happens when the stack unwinds and returns control back to the event loop so the next event can be pulled from the event queue and processed.
Or is there some underlying scheduler that may indeed cause jumps between tasks at random moments, like normal multithreading ?
Assuming we're not talking about Worker Threads, there is no pre-emptive Javascript thread switching in nodejs. Execution to another piece of Javascript changes only when the current thread of Javascript returns back to the event loop.
My question: All of this to ask, do I need a pure mutex/compare-and-swap mechanism to ensure that my mutex-like AVAILABLE/OCCUPIED field is set appropriately ?
No, you do not need a mutex for that. There is no return back to the event loop between the test and the set, so they are guaranteed not to be interrupted by any other code.
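As a small sketch of why that is safe (simplified from the question's code; the string statuses are just for illustration), the check and the assignment below run back-to-back with no await between them, so no other task can run in the middle and grab the same socket:

// Simplified sketch of the critical section in Borrow_Socket.
function borrowSync(sockets) {
  for (const s of sockets) {
    if (s.status === 'AVAILABLE') {
      s.status = 'OCCUPIED'; // happens in the same synchronous run as the check above
      return s;
    }
  }
  return null; // nothing free; the caller could then create a new socket
}

console.log(borrowSync([{ status: 'OCCUPIED' }, { status: 'AVAILABLE' }]));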


Why does a function called after my promise execute before the promise's callback?
I read this in MDN, but didn't understand it
"Callbacks will never be called before the completion of the current
run of the JavaScript event loop."
I thought it meant that if I have any other statements after resolve() or reject() they will get executed before the callback is invoked. Though, that seems to be an incomplete understanding.
function myFunction() {
  return new Promise( function(resolve, reject) {
    const err = false;
    if(err) {
      reject("Something went wrong!!!");
    }
    else {
      resolve("All good");
    }
  });
}

myFunction().then(doSuccess).catch(doError);
doOther();

function doError(err) {
  console.log(err);
}

function doSuccess() {
  console.log('Success');
}

function doOther() {
  console.log("My Other Function");
}
Output:
My Other Function
Success
By specification, a promise .then() or .catch() callback is never called synchronously, but is called on a future tick of the event loop. That means that the rest of your synchronous code always runs before any .then() handler is called.
Thus, your doOther() function runs before either doSuccess() or doError() is called.
Promises are designed this way so that a promise .then() handler will be called with consistent timing whether the promise is resolved immediately or resolved some time in the future. If synchronous .then() handlers were allowed, then calling code would either have to know when it might get called synchronously or you'd be susceptible to weird timing bugs.
In the Promises/A+ specification, on which the promises in the ES6 specification were based, a .then() handler is defined like this:
promise.then(onFulfilled, onRejected)
and then has this to say about it:
2.2.4. onFulfilled or onRejected must not be called until the execution context stack contains only platform code. [3.1].
And, then it defines platform code like this:
Here “platform code” means engine, environment, and promise implementation code. In practice, this requirement ensures that onFulfilled and onRejected execute asynchronously, after the event loop turn in which then is called, and with a fresh stack. This can be implemented with either a “macro-task” mechanism such as setTimeout or setImmediate, or with a “micro-task” mechanism such as MutationObserver or process.nextTick. Since the promise implementation is considered platform code, it may itself contain a task-scheduling queue or “trampoline” in which the handlers are called.
Basically what this means is that .then() handlers are called by inserting a task in the event loop that will not execute until the currently running Javascript finishes and returns control back to the interpreter (where it can retrieve the next event). So, thus any synchronous Javascript code you have after the .then() handler is installed will always run before the .then() handler is called.
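For example, here is the practical effect of that scheduling, contrasting a micro-task (a .then() handler) with a macro-task (a setTimeout() callback):

setTimeout(() => console.log('macrotask: setTimeout callback'), 0);
Promise.resolve().then(() => console.log('microtask: .then() handler'));
console.log('synchronous code');

// Output:
// synchronous code
// microtask: .then() handler
// macrotask: setTimeout callback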
I had similar confusion about what "completion of the current run of the event loop" means. I was going through the MDN docs on Promises, which say the following:
Callbacks added with then() will never be invoked before the completion of the current run of the JavaScript event loop.
This website http://latentflip.com/loupe/ explains it pretty well with a video: basically, APIs like setTimeout execute after the built-in JS functions. But the problem with plain callbacks is that if they don't use those APIs, they might execute before the current run of the event loop finishes. The following examples show the difference:
Old school callbacks
var foo = function(then1) {
  console.log("initial");
  var i = 0;
  while (i < 1000000000) {
    i++;
  }
  then1();
}

function then11() {
  console.log("res");
}

foo(then11);
console.log("end"); // unlike Promises, end is printed last
New Promise
var promise = new Promise(function(resolve, reject) {
  console.log("Initial");
  // long synchronous loop (no external APIs involved)
  var i = 0;
  while(i < 1000000000) {
    i++;
  }
  resolve("res");
});

promise.then(function(result) {
  console.log(result); // "res"
});

console.log("end"); // end is printed before res!!!

Difference between returning new Promise and Promise.resolve

For the code snippet below, I would like to understand how the NodeJS runtime handles things:
const billion = 1000000000;

function longRunningTask(){
  let i = 0;
  while (i <= billion) i++;
  console.log(`Billion loops done.`);
}

function longRunningTaskProm(){
  return new Promise((resolve, reject) => {
    let i = 0;
    while (i <= billion) i++;
    resolve(`Billion loops done : with promise.`);
  });
}

function longRunningTaskPromResolve(){
  return Promise.resolve().then(v => {
    let i = 0;
    while (i <= billion) i++;
    return `Billion loops done : with promise.resolve`;
  })
}

console.log(`*** STARTING ***`);
console.log(`1> Long Running Task`);
longRunningTask();
console.log(`2> Long Running Task that returns promise`);
longRunningTaskProm().then(console.log);
console.log(`3> Long Running Task that returns promise.resolve`);
longRunningTaskPromResolve().then(console.log);
console.log(`*** COMPLETED ***`);
console.log(`*** COMPLETED ***`);
1st approach:
longRunningTask() blocks the main thread, as expected.
2nd approach:
In longRunningTaskProm(), I wrapped the same code in a Promise, expecting execution to move away from the main thread and run as a micro-task. That doesn't seem to happen; I would like to understand what's going on behind the scenes.
3rd approach:
The third approach, longRunningTaskPromResolve(), works.
Here's my understanding:
Creation and execution of a Promise is still hooked to the main thread. Only Promise resolved execution is moved as a micro-task.
I'm not quite convinced by the resources I found or by my own understanding.
All three of these options run the code in the main thread and block the event loop. There is a slight difference in timing for WHEN they start running the while loop code and when they block the event loop which will lead to a difference in when they run versus some of your console messages.
The first and second options block the event loop immediately.
The third option blocks the event loop starting on the next tick - that's when Promise.resolve().then() calls the callback you pass to .then() (on the next tick).
The first option is just pure synchronous code. No surprise that it immediately blocks the event loop until the while loop is done.
In the second option the new Promise executor callback function is also called synchronously so again it blocks the event loop immediately until the while loop is done.
In the third option, it calls:
Promise.resolve().then(yourCallback);
The Promise.resolve() creates an already resolved promise and then calls .then(yourCallback) on that new promise. This schedules yourCallback to run on the next tick of the event loop. Per the promise specification, .then() handlers are always run on a future tick of the event loop, even if the promise is already resolved.
Meanwhile, any other Javascript right after this continues to run and only when that Javascript is done does the interpreter get to the next tick of the event loop and run yourCallback. But, when it does run that callback, it's run in the main thread and therefore blocks until it's done.
Creation and execution of a Promise is still hooked to the main thread. Only Promise resolved execution is moved as a micro-task.
All your code in your example is run in the main thread. A .then() handler is scheduled to run in a future tick of the event loop (still in the main thread). This scheduling uses a micro task queue which allows it to get in front of some other things in the event queue, but it still runs in the main thread and it still runs on a future tick of the event loop.
Also, the phrase "execution of a promise" is a bit of a misnomer. Promises are a notification system and you schedule to run callbacks with them at some point in the future using .then() or .catch() or .finally() on a promise. So, in general, you don't want to think of "executing a promise". Your code executes causing a promise to get created and then you register callbacks on that promise to run in the future based on what happens with that promise. Promises are a specialized event notification system.
Promises help notify you when things complete or help you schedule when things run. They don't move tasks to another thread.
As an illustration, you can insert a setTimeout(fn, 1) right after the third option and see that the timeout is blocked from running until the third option finishes. Here's an example of that, with the blocking loops all made 1000ms long so the timing is easier to see. Run it in a browser, or copy it into a node.js file and run it there, to see how the setTimeout() is blocked from executing on time by the execution time of longRunningTaskPromResolve(). So, longRunningTaskPromResolve() is still blocking. Putting it inside a .then() handler changes when it gets to run, but it is still blocking.
const loopTime = 1000;
let startTime;

function log(...args) {
  if (!startTime) {
    startTime = Date.now();
  }
  let delta = (Date.now() - startTime) / 1000;
  args.unshift(delta.toFixed(3) + ":");
  console.log(...args);
}

function longRunningTask(){
  log('longRunningTask() starting');
  let start = Date.now();
  while (Date.now() - start < loopTime) {}
  log('** longRunningTask() done **');
}

function longRunningTaskProm(){
  log('longRunningTaskProm() starting');
  return new Promise((resolve, reject) => {
    let start = Date.now();
    while (Date.now() - start < loopTime) {}
    log('About to call resolve() in longRunningTaskProm()');
    resolve('** longRunningTaskProm().then(handler) called **');
  });
}

function longRunningTaskPromResolve(){
  log('longRunningTaskPromResolve() starting');
  return Promise.resolve().then(v => {
    log('Start running .then() handler in longRunningTaskPromResolve()');
    let start = Date.now();
    while (Date.now() - start < loopTime) {}
    log('About to return from .then() in longRunningTaskPromResolve()');
    return '** longRunningTaskPromResolve().then(handler) called **';
  })
}

log('*** STARTING ***');
longRunningTask();
longRunningTaskProm().then(log);
longRunningTaskPromResolve().then(log);
log('Scheduling 1ms setTimeout')
setTimeout(() => {
  log('1ms setTimeout Got to Run');
}, 1);
log('*** First sequence of code completed, returning to event loop ***');
If you run this snippet and look at exactly when each message is output and the timing associated with each message, you can see the exact sequence of when things get to run.
Here's the output when I run it in node.js (line numbers added to help with the explanation below):
1 0.000: *** STARTING ***
2 0.005: longRunningTask() starting
3 1.006: ** longRunningTask() done **
4 1.006: longRunningTaskProm() starting
5 2.007: About to call resolve() in longRunningTaskProm()
6 2.007: longRunningTaskPromResolve() starting
7 2.008: Scheduling 1ms setTimeout
8 2.009: *** First sequence of code completed, returning to event loop ***
9 2.010: ** longRunningTaskProm().then(handler) called **
10 2.010: Start running .then() handler in longRunningTaskPromResolve()
11 3.010: About to return from .then() in longRunningTaskPromResolve()
12 3.010: ** longRunningTaskPromResolve().then(handler) called **
13 3.012: 1ms setTimeout Got to Run
Here's a step-by-step annotation, numbered to match the output lines above:
1. Things start.
2. longRunningTask() initiated.
3. longRunningTask() completes. It is entirely synchronous.
4. longRunningTaskProm() initiated.
5. longRunningTaskProm() calls resolve(). You can see from this that the promise executor function (the callback passed to new Promise(fn)) is entirely synchronous too.
6. longRunningTaskPromResolve() initiated. You can see that the handler from longRunningTaskProm().then(handler) has not yet been called. It has been scheduled to run on the next tick of the event loop, but since we haven't gotten back to the event loop yet, it hasn't been called.
7. We're now setting the 1ms timer. Note that this timer is being set only 1ms after we started longRunningTaskPromResolve(). That's because longRunningTaskPromResolve() hasn't done much yet. It ran Promise.resolve().then(handler), but all that did was schedule the handler to run on a future tick of the event loop. So, it only took 1ms to schedule that. The long-running part of that function hasn't started running yet.
8. We get to the end of this sequence of code and return back to the event loop.
9. The next thing scheduled to run in the event loop is the handler from longRunningTaskProm().then(handler), so that gets called. You can see that it was already waiting to run, since it ran only 1ms after we returned to the event loop. That handler runs and we return back to the event loop.
10. The next thing scheduled to run in the event loop is the handler from Promise.resolve().then(handler), so we now see that it starts to run, and since it was already queued, it runs immediately after the previous event finished.
11. It takes exactly 1000ms for the loop in longRunningTaskPromResolve() to run, and then it returns from its .then() handler, which schedules the next .then() handler in that promise chain to run on the next tick of the event loop.
12. That .then() handler gets to run.
13. Then, finally, when there are no more .then() handlers scheduled to run, the setTimeout() callback gets to run. It was set to run in 1ms, but it got delayed by all the promise action running at a higher priority ahead of it, so instead of running in 1ms, it ran in 1004ms.

async resolve() needs to be wrapped?

Why do I need to wrap resolve() with a meaningless async function in Node 10.16.0, but not in Chrome? Is this a Node.js bug?
let shoot = async () => console.log('there shouldn\'t be race condition');

(async () => {
  let c = 3;
  while(c--) {
    // Works also in node 10.16.0
    // console.log(await new Promise(resolve => shoot = async (...args) => resolve(...args)));
    // Works in chrome, but not in node 10.16.0?
    console.log(await new Promise(resolve => shoot = resolve));
  };
})();

(async () => {
  await shoot(1);
  await shoot(2);
  await shoot(3);
})();
resolve() is not async
And calling resolve() (via shoot()) does not immediately trigger the related await (in the loop); instead it queues up the event. Adding async/await gives the event loop a chance to wake up and consume the queue. In Chrome, await alone is enough, while in Node the await needs to be coupled with an actual async function. This kind of synchronization of tasks is not reliable, and there is a chance of calling the same resolve() twice.
This is an example of what NOT to do in javascript.
It’s a Node 10 (possible) bug or (probable) outdated behaviour* in the implementation of promises. According to the ECMAScript spec at the time of this writing, in
await shoot(1);
shoot(1) fulfills the promise created with new Promise(), which enqueues a job for each reaction in that promise’s fulfill reactions
await undefined (what shoot(1) returns) enqueues a job to continue after this statement, because undefined is converted to a fulfilled promise
The reaction in the promise’s fulfill reactions corresponding to the await in the first IIFE was added by PerformPromiseThen and it doesn’t involve any other jobs; it just continues inside that IIFE immediately.
In short, the next shoot = resolve should always run before execution continues after await shoot(n). The Node 12/current Chrome result is correct.
Normally, you shouldn’t come across this type of bug anyway: as I mentioned in the comments, relying on operations creating specific numbers of jobs/taking specific numbers of microticks for synchronization is bad design. If you wanted a sort of stream where each shoot() call always produces a loop iteration (even without the misleading await), something like this would be better:
let shoot;

(async () => {
  let queue = new Queue();
  while (true) {
    await new Promise(resolve => {
      shoot = value => {
        resolve();
        queue.enqueue(value);
      };
    });
    shoot = null; // just an assertion, pretty much
    while (!queue.isEmpty()) {
      let value = queue.dequeue();
      // process value
    }
  }
})();

shoot(1);
shoot(2);
shoot(3);
with an appropriate queue implementation. (Then you could look to async iterators to make consuming the queue neat.)
* Not sure of the exact history here. Fairly certain the ES spec used to reference microtasks, but they're jobs now. Current stable Firefox matches Node 10. await may take less time than it used to. This kind of thing is the reason for the following advice.
Your code is relying on a somewhat obscure timing issue involving await of a constant (a non-promise). That timing issue apparently does not behave the same in the two environments you have tested.
Each await shoot(n) is really just doing await resolve(n) which is not awaiting a promise. It's awaiting undefined since resolve() has no return value.
So, you're apparently seeing an implementation difference in event loop and promise implementation when you await a non-promise. You are apparently expecting await resolve() to somehow be asynchronous and allow your while() loop to run before running the next await shoot(n), but I'm not aware of a language requirement to do that and even if there is, it's an implementation detail that you probably should not write code that relies on.
I think it's basically just bad code design that relies on micro-details of scheduling of two jobs that are enqueued at about the same time. It's always safer to write the code in a way that enforces the proper sequencing rather than relying on micro-details of scheduler implementation - even if these details are in the specification and certainly if they are not.
node.js is perhaps more optimized or buggy (I don't know which) to not go back to the event loop when doing await on a constant. Or, if it does go back to the event loop, it goes in a prioritized fashion that keeps the current chain of code executing rather than letting other promises go next. In any case, for this code to work, it has to rely on some await someConstant behavior that isn't the same everywhere.
Wrapping resolve() forces the interpreter to go back to the event loop after each await shoot(n) because it is actually now awaiting a promise which gives the while() loop a chance to run and fill shoot with a new value before the next shoot(n) is called.
