Synchronous functions executing faster than their asynchronous counterparts? - javascript

Consider the two snippets below:
const loop1NoPromise = () => {
  let i = 0
  for (i; i < 500000; i++) {}
}

const loop2NoPromise = () => {
  let i = 0
  for (i; i < 500000; i++) {}
}

const startNoPromise = () => {
  console.time('No promise')
  loop1NoPromise()
  loop2NoPromise()
  console.timeEnd('No promise')
}

startNoPromise()

const loop1Promise = async () => {
  let i = 0
  for (i; i < 500000; i++) {}
  return new Promise((resolve) => resolve());
}

const loop2Promise = async () => {
  let i = 0
  for (i; i < 500000; i++) {}
  return new Promise((resolve) => resolve());
}

const startPromise = async () => {
  console.time('With promise')
  await Promise.all([loop1Promise(), loop2Promise()])
  console.timeEnd('With promise')
}

startPromise()
I was wondering if delegating multiple functions that weren't intended to be Promises to the browser's web APIs and then awaiting them would increase performance in any way. Going into this experiment, I half expected startPromise to run a bit quicker than startNoPromise and the other half of me expected them to run +/- the same.
However, running the snippets individually shows that startNoPromise is significantly faster than startPromise. What's strange to me is that if I merge these two snippets into one and then execute startNoPromise and startPromise, they run more or less equally fast... but running them individually shows a ~1ms difference, with startNoPromise clocking in consistently at around 2.245ms.
My question is: why was my original logic flawed, i.e. that turning non-promise functions into promises and handing them off to the web APIs would make them run faster (because they would then be running asynchronously)? And why is the Promise version of these two functions executing more slowly than its synchronous counterpart?

The "problem" with JavaScript is that everything runs in a single thread, ie, only one thing can be done at a time. Using promises for CPU bound problems (code that is limited by the CPU power) will actually make it slower. The reason for the slow down is that the API calls for the promises and managing the promises all take CPU overhead as well.
Another thing to note is that with CPU bound functions without any await these functions will run in serial (one after the other) anyway. Adding await on the other hand will just make it even slower since there will be more management overhead.
Promises are great for code where you have to wait on something to complete outside of the main process. Most often that will be io.
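A minimal sketch of that management overhead (the timer labels and loop are just an illustration, not from the original post): awaiting even an already-resolved promise forces a trip through the microtask queue on every iteration, a cost the plain loop never pays.
(async () => {
  console.time('plain loop')
  for (let i = 0; i < 500000; i++) {}
  console.timeEnd('plain loop')

  console.time('awaited loop')
  // Pure overhead: each await schedules and drains a microtask.
  for (let i = 0; i < 500000; i++) await Promise.resolve()
  console.timeEnd('awaited loop') // typically vastly slower than the plain loop
})()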

Both snippets are synchronous calculations.
Returning a promise does not make the preceding calculation "asynchronous": it just returns an object that represents an asynchronous result, in this case an already-resolved promise.
In your second snippet, the code you add only wraps the result in a promise whose resolution cannot be handled until the next event loop cycle.
In other words: when you do await Promise.all(...) you are telling the JS engine to attend to all pending events and, in the next cycle, check whether all the promises are resolved and get their values, or otherwise continue waiting, and so on.
If you remove the await keyword you will see almost identical results.
You can also wrap the Promise.all(...) expression in a console.log(...) and you will see that value (a pending promise).
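A small sketch combining both suggestions (the function name is made up): without the await, the timing is nearly identical to the synchronous version, and the logged value is a pending promise.
const startPromiseNoAwait = () => {
  console.time('Without await')
  // The loops have already run synchronously by the time this logs;
  // what gets printed is a pending Promise object.
  console.log(Promise.all([loop1Promise(), loop2Promise()]))
  console.timeEnd('Without await') // almost identical to the 'No promise' timing
}
startPromiseNoAwait()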


Is custom promise implementation in Node.js going to block the I/O? [duplicate]

I wrote a simple function that returns a Promise, so it should be non-blocking (in my opinion). Unfortunately, the program looks like it stops and waits for the Promise to finish. I am not sure what can be wrong here.
function longRunningFunc(val, mod) {
  return new Promise((resolve, reject) => {
    let sum = 0;
    for (var i = 0; i < 100000; i++) {
      for (var j = 0; j < val; j++) {
        sum += i + j % mod
      }
    }
    resolve(sum)
  })
}
console.log("before")
longRunningFunc(1000, 3).then((res) => {
  console.log("Result: " + res)
})
console.log("after")
The output looks as expected:
before // delay before printing below lines
after
Result: 5000049900000
But the program waits before printing second and third lines. Can you explain what should be the proper way to get "before" and "after" printed first and then (after some time) the result?
Wrapping code in a promise (like you've done) does not make it non-blocking. The Promise executor function (the callback you pass to new Promise(fn)) is called synchronously and will block, which is why you see the delay before any output appears.
In fact, there is no way to make your own plain JavaScript code (like what you have) non-blocking except by putting it into a child process, using a Worker Thread, using some third-party library that creates new threads of JavaScript, or using the (then-experimental) node.js APIs for threads. Regular node.js runs your JavaScript as blocking and single-threaded, whether it's wrapped in a promise or not.
You can use things like setTimeout() to change "when" your code runs, but whenever it runs, it will still be blocking (once it starts executing, nothing else can run until it's done). Asynchronous operations in the node.js library all use some form of underlying native code that allows them to be asynchronous (or they build on other node.js asynchronous APIs that themselves use native code implementations).
But the program waits before printing second and third lines. Can you explain what should be the proper way to get "before" and "after" printed first and then (after some time) the result?
As I said above, wrapping things in a promise executor function doesn't make them asynchronous. If you want to "shift" the timing of when things run (though they are still synchronous), you can use setTimeout(), but that's not really making anything non-blocking; it just makes the code run later (and still block when it runs).
So, you could do this:
function longRunningFunc(val, mod) {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      let sum = 0;
      for (var i = 0; i < 100000; i++) {
        for (var j = 0; j < val; j++) {
          sum += i + j % mod
        }
      }
      resolve(sum)
    }, 10);
  })
}
That would reschedule the time consuming for loop to run later and might "appear" to be non-blocking, but it actually still blocks - it just runs later. To make it truly non-blocking, you'd have to use one of the techniques mentioned earlier to get it out of the main Javascript thread.
Ways to create actual non-blocking code in node.js:
1. Run it in a separate child process and get an asynchronous notification when it's done.
2. Use the (then-experimental) Worker Threads in node.js v11 (see the sketch after this list).
3. Write your own native code add-on to node.js and use libuv threads or OS-level threads in your implementation (or other OS-level asynchronous tools).
4. Build on top of previously existing asynchronous APIs and have none of your own code that takes very long in the main thread.
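As a hedged sketch of option 2 (the file name and workload are made up for illustration; worker_threads became stable in later Node versions), the CPU-bound loop moves off the main thread and the result arrives as an asynchronous message:
// run with: node worker-sum.js
const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');

if (isMainThread) {
  // Main thread: spawn a worker for the heavy loop and continue immediately.
  const worker = new Worker(__filename, { workerData: { val: 1000, mod: 3 } });
  worker.on('message', (sum) => console.log("Result: " + sum));
  console.log("before"); // prints right away; the loop runs on another thread
} else {
  // Worker thread: the same CPU-bound loop, no longer blocking the main thread.
  const { val, mod } = workerData;
  let sum = 0;
  for (let i = 0; i < 100000; i++) {
    for (let j = 0; j < val; j++) {
      sum += i + j % mod;
    }
  }
  parentPort.postMessage(sum);
}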
The executor function of a promise is run synchronously, and this is why your code blocks the main thread of execution.
In order to not block the main thread of execution, you need to periodically and cooperatively yield control while the long running task is performed. In effect, you need to split the task into subtasks, and then coordinate the running of subtasks on new ticks of the event loop. In this way you give other tasks (like rendering and responding to user input) the opportunity to run.
You can either write your own async loop using the promise API, or you can use an async function. Async functions enable the suspension and resumption of functions (reentrancy) and hide most of the complexity from you.
The following code uses setTimeout to move subtasks onto new event loop ticks. Of course, this could be generalised, and batching could be used to find a balance between progress through the task and UI responsiveness; the batch size in this solution is only 1, and so progress is slow.
Finally: the real solution to this kind of problem is probably a Worker.
const $ = document.querySelector.bind(document)
const BIG_NUMBER = 1000
let count = 0

// Note that this could also use requestIdleCallback or requestAnimationFrame
const tick = () => new Promise((resolve) => setTimeout(resolve, 5))

async function longRunningTask () {
  while (count++ < BIG_NUMBER) await tick()
  console.log(`A big number of loops done.`)
}

console.log(`*** STARTING ***`)
longRunningTask().then(() => console.log(`*** COMPLETED ***`))

$('button').onclick = () => $('#output').innerHTML += `Current count is: ${count}<br/>`

* {
  font-size: 16pt;
  color: gray;
  padding: 15px;
}
<button>Click me to see that the UI is still responsive.</button>
<div id="output"></div>

async resolve() needs to be wrapped?

Why do I need to wrap resolve() in a meaningless async function in node 10.16.0, but not in chrome? Is this a node.js bug?
let shoot = async () => console.log('there shouldn\'t be race condition');

(async () => {
  let c = 3;
  while (c--) {
    // Works also in node 10.16.0
    // console.log(await new Promise(resolve => shoot = async (...args) => resolve(...args)));
    // Works in chrome, but not in node 10.16.0?
    console.log(await new Promise(resolve => shoot = resolve));
  };
})();

(async () => {
  await shoot(1);
  await shoot(2);
  await shoot(3);
})();
resolve() is not async.
And calling resolve() (via shoot()) does not immediately trigger the related await (in the loop); instead it queues up the event. Adding async/await gives the event loop a chance to wake up and consume the queue. In chrome, await alone is enough; in node, await needs to be coupled with an actual async function. This kind of synchronization of tasks is not reliable, and there is a chance of calling the same resolve() twice.
This is an example of what NOT to do in javascript.
It’s a Node 10 (possible) bug or (probable) outdated behaviour* in the implementation of promises. According to the ECMAScript spec at the time of this writing, in
await shoot(1);
1. shoot(1) fulfills the promise created with new Promise(), which enqueues a job for each reaction in that promise’s fulfill reactions.
2. await undefined (what shoot(1) returns) enqueues a job to continue after this statement, because undefined is converted to a fulfilled promise.
3. The reaction in the promise’s fulfill reactions corresponding to the await in the first IIFE was added by PerformPromiseThen and doesn’t involve any other jobs; it just continues inside that IIFE immediately.
In short, the next shoot = resolve should always run before execution continues after await shoot(n). The Node 12/current Chrome result is correct.
Normally, you shouldn’t come across this type of bug anyway: as I mentioned in the comments, relying on operations creating specific numbers of jobs/taking specific numbers of microticks for synchronization is bad design. If you wanted a sort of stream where each shoot() call always produces a loop iteration (even without the misleading await), something like this would be better:
let shoot;

(async () => {
  let queue = new Queue();
  while (true) {
    await new Promise(resolve => {
      shoot = value => {
        resolve();
        queue.enqueue(value);
      };
    });
    shoot = null; // just an assertion, pretty much
    while (!queue.isEmpty()) {
      let value = queue.dequeue();
      // process value
    }
  }
})();

shoot(1);
shoot(2);
shoot(3);
with an appropriate queue implementation. (Then you could look to async iterators to make consuming the queue neat.)
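To gesture at the async-iterator idea (this makeStream helper is hypothetical, not part of the answer above): push() feeds values in, and a for await...of loop consumes them in order, with no reliance on job-count timing.
function makeStream () {
  const values = []
  let wake = null
  return {
    push (value) {
      values.push(value)
      if (wake) { wake(); wake = null } // wake a waiting consumer
    },
    async * [Symbol.asyncIterator] () {
      while (true) {
        while (values.length) yield values.shift()
        await new Promise(resolve => { wake = resolve }) // wait for more values
      }
    },
  }
}

const stream = makeStream();

(async () => {
  for await (const value of stream) console.log('got', value)
})();

stream.push(1)
stream.push(2)
stream.push(3)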
* Not sure of the exact history here. Fairly certain the ES spec used to reference microtasks, but they’re jobs now. Current stable Firefox matches Node 10. await may take less time than it used to. This kind of thing is the reason for the following advice.
Your code is relying on a somewhat obscure timing issue involving await of a constant (a non-promise). That timing issue apparently does not behave the same in the two environments you have tested.
Each await shoot(n) is really just doing await resolve(n) which is not awaiting a promise. It's awaiting undefined since resolve() has no return value.
So, you're apparently seeing an implementation difference in the event loop and promise implementation when you await a non-promise. You are apparently expecting await resolve() to somehow be asynchronous and allow your while() loop to run before the next await shoot(n) runs, but I'm not aware of a language requirement to do that, and even if there is one, it's the sort of implementation detail you probably should not write code that relies on.
I think it's basically just bad code design that relies on micro-details of scheduling of two jobs that are enqueued at about the same time. It's always safer to write the code in a way that enforces the proper sequencing rather than relying on micro-details of scheduler implementation - even if these details are in the specification and certainly if they are not.
node.js is perhaps more optimized, or buggy (I don't know which), in that it does not go back to the event loop when doing await on a constant. Or, if it does go back to the event loop, it does so in a prioritized fashion that keeps the current chain of code executing rather than letting other promises go next. In any case, for this code to work, it has to rely on some await someConstant behavior that isn't the same everywhere.
Wrapping resolve() forces the interpreter to go back to the event loop after each await shoot(n) because it is actually now awaiting a promise which gives the while() loop a chance to run and fill shoot with a new value before the next shoot(n) is called.

Waiting for loops to finish by using await on the loop

Synchronicity in js loops is still driving me up the wall.
What I want to do is fairly simple
async doAllTheThings(data, array) {
  await array.forEach(entry => {
    let val = /* some algorithm using entry keys */
    let subVal = someFunc(/* more entry keys */)
    data[entry.Namekey] = `${val}/${subVal}`;
  });
  return data; // after data is modified
}
But I can't tell if that's actually safe or not. I simply don't like the simple loop pattern
for (i = 0; i < arrayLength; i++) {
  // do things
  if (i === arrayLength - 1) {
    return
  }
}
I wanted a better way to do it, but I can't tell if what I'm trying is working safely or not, or I simply haven't hit a data pattern that will trigger the race condition.
Or perhaps I'm overthinking it. The algorithm in the array consists solely of some MATH and assignment statements...and a small function call that itself also consists solely of more MATH and assignment statements. Those are supposedly fully synchronous across the board. But loops are weird sometimes.
The Question
Can you use await in that manner, outside the loop itself, to trigger the code to wait for the loop to complete? Or is the only safe way to accomplish this the older manner of simply checking where you are in the loop, and not returning until you hit the end, manually.
One of the best ways to handle async work in loops is to collect the promises in an array and wait for them with Promise.all. Remember that an async function returns a Promise, so you can do:
async function doAllTheThings(array) {
  const promises = []
  array.forEach((entry, index) => {
    promises.push(new Promise((resolve) => {
      setTimeout(() => resolve(entry + 1), 200)
    }))
  });
  return Promise.all(promises)
}

async function main () {
  const arrayPlus1 = await doAllTheThings([1,2,3,4,5])
  console.log(arrayPlus1.join(', '))
}

main().then(() => {
  console.log('Done the async')
}).catch((err) => console.log(err))
Another option is to use generators, but they are a little bit more complex, so if you can just save your promises and wait for them, that is the easier approach.
About the question at the end:
Can you use await in that manner, outside the loop itself, to trigger the code to wait for the loop to complete? Or is the only safe way to accomplish this the older manner of simply checking where you are in the loop, and not returning until you hit the end, manually.
All JavaScript loops are synchronous, so the line after the loop will only run once the loop has finished.
If you need to run some async code inside a loop, a good approach is the promise approach above.
Another approach for async loops, especially if you have to "pause" or get info from outside the loop, is the iterator/generator approach, sketched below.
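As a hedged sketch of that iterator/generator approach (the 200 ms delay mirrors the example above and stands in for real async work per entry):
// An async generator yields one processed entry at a time; the
// for await...of loop "pauses" until each value is ready.
async function* mapEntries(array) {
  for (const entry of array) {
    // stand-in for real async work on each entry
    yield await new Promise(resolve => setTimeout(() => resolve(entry + 1), 200))
  }
}

(async () => {
  for await (const value of mapEntries([1, 2, 3, 4, 5])) {
    console.log(value) // values arrive one at a time, in order
  }
  console.log('Done the async')
})()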

How does the JavaScript heap handle recursion

Hi my question above is a little vague so I'll try to make it clearer.
I have code of the form:
function main () {
  async function recursive () {
    var a = "Hello World";
    var b = "Goodbye World";
    recursive();
  }
  recursive();
}
I am running into an issue where I am running out of heap memory.
Assuming what I showed above is how my program behaves, with a and b being declared within the recursive function, my question is whether those variables are destroyed as soon as the nested call to recursive is made, or whether they persist until there are no more recursive calls and the main function reaches its endpoint (assuming I keep the main function running long enough for that to happen).
I'm worried they stay alive in the heap, and since my real program stores large strings in those variables, I'm worried that that is the reason I'm running out of heap.
JavaScript does not (yet?) have tail call optimization for recursion, so for sure you'll eventually fill up your call stack. The reason your heap is filling up is because recursive is defined as an async function and never waits for the allocated Promise objects to resolve. This will fill up your heap because the garbage collector doesn't have the chance to collect them.
In short, recursion doesn't use up the heap unless you're allocating things in the heap within your functions.
Expansion of what's happening:
main
  new Promise(() => {
    new Promise(() => {
      new Promise(() => {
        new Promise(() => {
          new Promise(() => { // etc.
You say you're not awaiting Promises because you want things to run in parallel. This is a dangerous approach for a few reasons. Promises should always be awaited in case they throw, and of course you want to make sure that you're allocating predictable amounts of memory usage. If this recursive function means you're getting an unpredictable number of needed tasks, you should consider a task queue pattern:
const tasks = [];

async function recursive() {
  // ...
  tasks.push(workItem);
  // ...
}

async function processTasks() {
  while (tasks.length) {
    // take the oldest work item off the front of the queue
    const workItem = tasks.shift();
    // Process work item. May provoke calls to `recursive`
    // ...do awaits here for the async parts
  }
}

async function processWork(maxParallelism) {
  const workers = [];
  for (let i = 0; i < maxParallelism; i++) {
    workers.push(processTasks());
  }
  await Promise.all(workers);
}
Finally, in case it needs to be said, async functions don't let you do parallel processing; they just provide a way to more conveniently write Promises and do sequential and/or parallel waits for async events. JavaScript remains a single-threaded execution engine unless you use things like web workers. So using async functions to try to parallelize synchronous tasks won't do anything for you.
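A tiny sketch of that last point (the 100 ms busy-wait is arbitrary): two CPU-bound "async" functions still run one after the other on the single JS thread, so the total time is the sum, not the maximum.
const spin = async (label) => {
  const start = Date.now();
  while (Date.now() - start < 100) {} // burn ~100 ms of CPU
  console.log(label, 'done');
};

(async () => {
  console.time('both');
  await Promise.all([spin('a'), spin('b')]);
  console.timeEnd('both'); // ~200 ms, not ~100 ms
})();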

Callback not getting executed when I expect it

I have the following code that uses fetch. From what I understand, the callback function will not be invoked until the promise is fulfilled. Because of that, I was expecting the callback functions to be executed in the middle of processing other things (such as the for loop). However, it is not doing what I expect. My code is as follows:
console.log("Before fetch")
fetch('https://example.com/data')
.then(function(response){
console.log("In first then")
return response.json()
})
.then(function(json){
console.log("In second then")
console.log(json)
})
.catch(function(error){
console.log("An error has occured")
console.log(error)
})
console.log("After fetch")
for(let i = 0; i < 1000000; i++){
if (i % 10000 == 0)
console.log(i)
}
console.log("The End")
Rather than the callback being immediately run when the promise is fulfilled, it seems to wait until all the rest of my code is processed before the callback function is activated. Why is this?
The output of my code looks like this:
Before fetch
After fetch
0
10000
.
.
.
970000
980000
990000
The End
In first then
In second then
However, I was expecting the last two lines to appear somewhere prior to this point. What is going on here and how can I change my code so that it reflects when the promise is actually fulfilled?
The key here is that the for loop you're running afterwards is a long, synchronous block of code. That is the reason why synchronous APIs are deprecated / not recommended in JavaScript, as they block all asynchronous callbacks from executing until completion. JavaScript is not multithreaded, and it does not have concepts like interrupts in C, so if the thread is executing a large loop, nothing else will have the chance to run until that loop is finished.
In Node.js, the child_process API allows you to run daemon processes, and the Web Worker API for browsers allows concurrent processes to run in parallel, both of these using serialized event-based messaging to communicate between threads, but aside from that, everything in the above paragraph applies universally to JavaScript.
In general, a possible solution to breaking up long synchronous processes like the one you have there is batching. Using promises, you could rewrite the for loop like this:
(async () => {
  for (let i = 0; i < 100000; i++) {
    if (i % 10000 == 0) {
      console.log(i);
      // release control for a minimum of 4 ms
      await new Promise(resolve => { setTimeout(resolve, 0); });
    }
  }
})().then(() => {
  console.log("The End");
});

setTimeout(() => { console.log('Can interrupt loop'); }, 1);
Reason for 4ms minimum: https://developer.mozilla.org/en-US/docs/Web/API/WindowOrWorkerGlobalScope/setTimeout#Reasons_for_delays_longer_than_specified
No matter how fast your promise is fulfilled, its callbacks are added to the event loop's queue and will only be invoked after all synchronous tasks have finished. In this example, the synchronous task is the for loop. You can try the same thing with setTimeout at 0 ms; it will also run after the loop. Remember that JS is single-threaded and doesn't run tasks in parallel.
Reference:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/EventLoop
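A minimal illustration of that point (the loop size is arbitrary):
// Even a 0 ms timeout only fires after all synchronous code is done.
setTimeout(() => console.log('timeout fired'), 0)

for (let i = 0; i < 10000000; i++) {} // long synchronous loop

console.log('loop done') // always prints before 'timeout fired'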
There's no guarantee of exactly when the callback will execute. The for loop requires very little processing time, so it's possible that the JS engine just decided to wait until it was over before running the callback functions. There's no certain way to force a specific order of execution either, unless you chain the functions as callbacks.
