I've created some tests comparing async and sync event handlers, and I've come to the conclusion that using async handlers could be a huge improvement for our code.
The only difference between the two snippets below is that in one of them customEventHandler is async and uses await sleep(customEventHandlerSleepTime); instead of sleep(customEventHandlerSleepTime);:
async test:
<body>
  <div id="event1">
    <div id="event2">
      <div id="event3">
        <div id="event4"></div>
      </div>
    </div>
  </div>
</body>
<script>
  const sleep = (ms) => new Promise(resolve => setTimeout(resolve, ms));
  const customEventHandlerIterationsCount = 1000000;
  const customEventHandlerSleepTime = 500;
  const customEventName = 'customevent';
  const customEvent = new Event('customevent');
  const customEventHandler = async () => {
    for (let i = 0; i < customEventHandlerIterationsCount; ++i) {
      await sleep(customEventHandlerSleepTime);
    }
  };
  document.getElementById('event4').addEventListener(customEventName, customEventHandler);
  document.getElementById('event3').addEventListener(customEventName, customEventHandler);
  document.getElementById('event2').addEventListener(customEventName, customEventHandler);
  document.getElementById('event1').addEventListener(customEventName, customEventHandler);
  (() => {
    const start = new Date().getTime();
    document.getElementById('event4').dispatchEvent(customEvent);
    const end = new Date().getTime();
    console.log('Time: ', (end - start));
  })();
</script>
sync test:
<body>
  <div id="event1">
    <div id="event2">
      <div id="event3">
        <div id="event4"></div>
      </div>
    </div>
  </div>
</body>
<script>
  const sleep = (ms) => new Promise(resolve => setTimeout(resolve, ms));
  const customEventHandlerIterationsCount = 1000000;
  const customEventHandlerSleepTime = 500;
  const customEventName = 'customevent';
  const customEvent = new Event('customevent');
  const customEventHandler = () => {
    for (let i = 0; i < customEventHandlerIterationsCount; ++i) {
      sleep(customEventHandlerSleepTime).then(() => {});
    }
  };
  document.getElementById('event4').addEventListener(customEventName, customEventHandler);
  document.getElementById('event3').addEventListener(customEventName, customEventHandler);
  document.getElementById('event2').addEventListener(customEventName, customEventHandler);
  document.getElementById('event1').addEventListener(customEventName, customEventHandler);
  (() => {
    const start = new Date().getTime();
    document.getElementById('event4').dispatchEvent(customEvent);
    const end = new Date().getTime();
    console.log('Time: ', (end - start));
  })();
</script>
The results of the above tests are:
async test execution time: ~1ms
sync test execution time: ~1500ms
Am I doing something wrong, or is this real? If we remove the await and the .then() from around the sleep call, both handlers print the "Time" message quickly, with only a minimal difference between them.
Based on this test, I am wondering whether it's better to always (or almost always) use async handlers, e.g. when we don't know what will happen in the functions nested inside the handler. Or, if we don't use await directly in the handler, is it better to avoid async? Is there perhaps a better way to test this?
You're doing heavy processing in both snippets. The main difference is that in the second (sync) snippet, you're creating all the Promises at once, synchronously. There are a huge number of them, so the overhead of creating them all is significant. In the first (async) snippet, when the event is dispatched and the handler runs, only one sleep Promise is created synchronously (which takes next to no time at all), and then the event finishes. Then, as a microtask, the second Promise is created, and you wait for it to resolve. Then, as a microtask, the third Promise is created, and you wait for it to resolve. And so on.
The heavy processing happens in both cases, but in one it's staggered out over a long period of time (the Promises run in serial), while in the other all the Promises are initialized immediately and run in parallel. If the event is fired via JavaScript (rather than, for example, a native button click), it will take some time to get to the line after the manual firing of the event if all the heavy processing is synchronous.
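To see this staggering in isolation, here is a minimal sketch (countingSleep is an illustrative helper, not part of the original test) that counts how many Promises exist by the time the synchronous part of an async handler returns:

const sleep = (ms) => new Promise(resolve => setTimeout(resolve, ms));
let created = 0;
const countingSleep = (ms) => { created++; return sleep(ms); };
const handler = async () => {
  for (let i = 0; i < 5; i++) {
    await countingSleep(10);
  }
};
handler();
// the handler suspends at the first await, so only one Promise exists yet
console.log('created synchronously:', created); // 1
// the rest are created one by one, as each previous sleep resolves
setTimeout(() => console.log('created after 200ms:', created), 200); // 5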
So, sure, sometimes this async technique may help you (though processing this intensive is pretty rare in JavaScript, so often it won't be noticeable at all).
A better option would probably be to move the heavy processing into a web worker instead - that way, the processing is done on a completely separate thread.
No, there's no advantage to using async functions for DOM event handlers; all it does is add (a very tiny bit of) overhead.
What your test is missing is that the functions still take the same total amount of time to run; they just do it later, after you're done measuring, so you don't see it. But they still take that time, and it's still spent on the main UI thread.
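To make that visible, here is a minimal sketch (the 'tick' event name and the EventTarget usage are illustrative, not taken from the question) that measures both when dispatchEvent returns and when the handler's work actually completes:

const sleep = (ms) => new Promise(resolve => setTimeout(resolve, ms));
const target = new EventTarget();
target.addEventListener('tick', async () => {
  const t0 = performance.now();
  for (let i = 0; i < 3; i++) {
    await sleep(100); // the work is deferred, not eliminated
  }
  console.log('handler really finished after', Math.round(performance.now() - t0), 'ms'); // ~300
});
const t0 = performance.now();
target.dispatchEvent(new Event('tick'));
console.log('dispatchEvent returned after', Math.round(performance.now() - t0), 'ms'); // ~0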
Related
I want to let JavaScript run two operations at the same time.
I want to let JavaScript process a large loop while an interval keeps a timer running.
I searched online and found that there are only two ways: using the event loop or using a web worker.
I have struggled with this for a long time, but it is pretty hard for me to understand how to really use a web worker or the event loop.
For the event loop, I can't really understand the code that the documentation gives:
while (queue.waitForMessage()) {
  queue.processNextMessage()
}
What does queue stand for? If I use queue directly in my code, it says queue is not defined.
Also, how could I actually use it in my code?
For the web worker, here is my code:
let result = document.getElementById('result')
let time = document.getElementById('time')
let worker;
let script = document.getElementsByTagName('script')
let interval = setInterval(function () {
  time.textContent = parseFloat(time.textContent) + 1
}, 1000)
function display() {
  for (let i = 0; i < 10000; i++) {
    result.innerHTML += i
  }
}
let startworker = function () {
  if (typeof (worker) == 'undefined') {
    worker = new Worker(script);
    // I think this will produce an error, but I don't know how to
    // deal with the onmessage to let it run the function
    worker.onmessage = function () {
      display();
    }
  }
}
<div id='result'></div>
<button onclick='startworker()'>Click</button>
<div id='time'>0</div>
In my code, if I click the button, it produces "Failed to construct 'Worker'". Also, for the onmessage part, I don't really have an idea of how I could make it run the loop, and I think my code is incorrect.
I have read a lot of articles and documentation, but I still have no idea how to do this.
Could anyone help me understand these two approaches and give me a better way of solving this situation?
Thanks for any responses, and forgive my ignorance!
The example:
let result = document.getElementById('result')
let time = document.getElementById('time')
let interval;
function start() {
  interval = setInterval(function () {
    time.textContent = parseFloat(time.textContent) + 1
  }, 1000)
  setTimeout(reallystart, 0) // pass the function itself, don't call it
}
function reallystart() {
  for (let i = 0; i < 10000; i++) {
    if (i == 9999) {
      window.clearInterval(interval)
    }
    result.innerHTML += i
  }
}
<button onclick='start()'>Click</button>
<div id='result'></div>
<div id='time'>0</div>
You don't have to write an event loop yourself; you will make use of the browser's own.
Still, for your case of running a (synchronous) loop and a scheduled task concurrently, there are indeed two ways of doing it:
On a single thread, by breaking your loop into several tasks.
On multiple threads, by using Web Workers.
The event loop in a browser can indeed, very roughly, be schematized as a while loop. At each iteration it picks a new task and runs it to completion (in fact it picks tasks from multiple task sources, so that a priority system can be set up).
Your JavaScript job is itself generally executed as part of such a task*, and the task won't end before that JavaScript job has itself run to completion. This means that if your JavaScript job never ends, the event loop won't be able to process the next task, because it will be stuck in the one that spawned your code.
setInterval (and setTimeout) do schedule a new task to execute the passed JavaScript at some time in the future. But once again, while your for loop is blocking the event loop, this newly scheduled task won't ever be able to do its job.
To solve this issue, we thus have to avoid blocking the event loop, so that it can pick up the newly scheduled tasks too.
To do this on a single event loop, you will have to break your loop into small chunks and run each chunk in its own task. This way, the event loop will be able to determine whether it has to execute other tasks besides these:
let i = 0;
// the for loop only updates i
const executeForLoop = async () => {
  for (; i < Infinity; i++) {
    // a chunk of 1000 loops
    if (i % 1000 === 0) {
      await waitNextTask();
    }
  }
};
executeForLoop();
// every 1s we update the displayed value
setInterval(() => {
  document.getElementById("log").textContent = i;
}, 1000);
// currently the fastest way to post a new task
// see https://stackoverflow.com/questions/61338780/
function waitNextTask() {
  return new Promise((res) => {
    const channel = waitNextTask.channel ??= new MessageChannel();
    channel.port1.addEventListener("message", () => res(), { once: true });
    channel.port1.start();
    channel.port2.postMessage("");
  });
}
<pre id="log"></pre>
However, doing this will still take a lot of the computation power that is normally devoted to rendering the page correctly and handling all the user interactions, so use it only when you have no choice, e.g. because the calculation requires access to data that is only available on the main thread, like the DOM.
The better way to handle long-running scripts is to use a so-called "dedicated worker". This will generally run on another thread, and will in any case have its own event loop.
So you can block this Worker's event loop all you like, and the main thread will keep working as if nothing was happening.
However, you need to execute the blocking script inside the Worker; the worker.onmessage handler is still executed on the main thread, and heavy work there would still block the main event loop.
So the previous example rewritten to use a Worker:
let i;
// every 1s we update the displayed value
setInterval(() => {
  document.getElementById("log").textContent = i;
}, 1000);
const worker = new Worker(getWorkerURL());
worker.onmessage = (evt) => {
  i = evt.data;
};
// because we can't link to external files in StackSnippets
// we use a dirty blob: URL hack
function getWorkerURL() {
  const elem = document.querySelector("[type='worker-script']");
  const content = elem.textContent;
  const blob = new Blob([content], { type: "text/javascript" });
  return URL.createObjectURL(blob);
}
<script type="worker-script">
  // the hard computation is made in the script
  // that will be used to initiate the Worker
  let i = 0;
  const executeForLoop = () => {
    for (; i < Infinity; i++) {
      // we keep the chunking logic
      // to avoid spamming the main thread with too many messages
      if (i % 1000 === 0) {
        postMessage(i);
      }
    }
  };
  executeForLoop();
</script>
<pre id="log"></pre>
*JS jobs may also be executed as a callback in the update-the-rendering steps, where they're technically not tied to a task, but there is no sensible difference for us web authors.
Consider the below two snippets:
const loop1NoPromise = () => {
  let i = 0
  for (i; i < 500000; i++) {}
}
const loop2NoPromise = () => {
  let i = 0
  for (i; i < 500000; i++) {}
}
const startNoPromise = () => {
  console.time('No promise')
  loop1NoPromise()
  loop2NoPromise()
  console.timeEnd('No promise')
}
startNoPromise()
const loop1Promise = async () => {
  let i = 0
  for (i; i < 500000; i++) {}
  return new Promise((resolve) => resolve());
}
const loop2Promise = async () => {
  let i = 0
  for (i; i < 500000; i++) {}
  return new Promise((resolve) => resolve());
}
const startPromise = async () => {
  console.time('With promise')
  await Promise.all([loop1Promise(), loop2Promise()])
  console.timeEnd('With promise')
}
startPromise()
I was wondering if delegating multiple functions that weren't intended to be Promises to the browser's web APIs and then awaiting them would increase performance in any way. Going into this experiment, half of me expected startPromise to run a bit quicker than startNoPromise, and the other half expected them to run more or less the same.
However, running the above snippets individually shows that startNoPromise is significantly faster than startPromise. What's strange to me is that if I merge the two snippets into one and then execute startNoPromise and startPromise, they run more or less equally fast... but running them individually shows a ~1ms difference, with startNoPromise clocking in consistently at around 2.245ms.
My question is: why was my original logic flawed, that turning non-promise functions into Promises and outsourcing them to web APIs would make them run faster (because they would then be running asynchronously)? Also, why does the Promise version of these two functions execute more slowly than its synchronous counterpart?
The "problem" with JavaScript is that everything runs in a single thread, ie, only one thing can be done at a time. Using promises for CPU bound problems (code that is limited by the CPU power) will actually make it slower. The reason for the slow down is that the API calls for the promises and managing the promises all take CPU overhead as well.
Another thing to note is that with CPU bound functions without any await these functions will run in serial (one after the other) anyway. Adding await on the other hand will just make it even slower since there will be more management overhead.
Promises are great for code where you have to wait on something to complete outside of the main process. Most often that will be io.
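As a quick illustration of where Promises do pay off, here is a minimal sketch in which setTimeout stands in for real IO (the fakeIo helper is purely illustrative):

const fakeIo = (ms) => new Promise(resolve => setTimeout(resolve, ms));
(async () => {
  console.time('serial')
  await fakeIo(200)
  await fakeIo(200)
  console.timeEnd('serial') // ~400ms
  console.time('concurrent')
  await Promise.all([fakeIo(200), fakeIo(200)])
  console.timeEnd('concurrent') // ~200ms, the waits overlap
})()

Unlike the CPU-bound loops above, the two waits can overlap, because the main thread is idle while the timers run.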
Both snippets are synchronous calculations.
Returning a Promise does not make the preceding calculation "asynchronous": it just returns an object that represents an asynchronous result, namely an (in this case already resolved) Promise.
In your second snippet, the code you added only wraps the result in a Promise whose resolution cannot be handled until the next event loop cycle.
In other words: when you do await Promise.all(...), you are telling the JS engine to attend to all pending events and, in the next cycle, check whether all the Promises are resolved and get their values, or else continue waiting, and so on.
If you remove the await keyword, you will see almost identical results.
You can also wrap the Promise.all(...) expression in a console.log(...) and you will see its value (a pending Promise).
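A tiny sketch of that deferral; the output order is the point:

console.log('before');
// the .then() callback runs as a microtask, after the current synchronous code
Promise.resolve().then(() => console.log('microtask'));
console.log('after');
// logs: before, after, microtask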
If we started 2 concurrent infinite loops using worker('hello') and worker('world'), how can we later stop one of the loops?
For example:
const sleep = async function (duration) {
  await new Promise(r => setTimeout(r, duration));
};
const worker = async (id) => {
  while (true) {
    console.log(id);
    await sleep(2000); // simulates a blocking call
  }
};
(async () => {
  const hello = worker('hello')
  const world = worker('world')
  // Let's assume that a user input now requires us to stop `worker('hello')`
  setTimeout(() => {
    console.log('stopping hello...')
    // how to stop 'hello'?
  }, 5000)
})();
You cannot stop those worker() loops from outside of the function; JavaScript does not have that capability.
You would need those loops to be checking something outside the loop (a variable, the result of a function call, or something like that) for you to be able to influence them.
There are many other ways to write the loop so that it can be influenced from the outside world.
Some examples:
Use setInterval() and return the interval timer ID from the function. Then you can call clearInterval() to stop the loop.
Create a small object where your loop is one method, and have that loop test an instance variable that you can change from the outside (see the sketch below).
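Here is a minimal sketch of that second approach; the stopped flag and stop() method are illustrative names, not an existing API:

const sleep = (duration) => new Promise(r => setTimeout(r, duration));
const createWorker = (id) => {
  const state = { stopped: false };
  (async () => {
    while (!state.stopped) { // the loop re-checks the flag on every iteration
      console.log(id);
      await sleep(2000);
    }
  })();
  return { stop: () => { state.stopped = true; } };
};
const hello = createWorker('hello');
setTimeout(() => {
  console.log('stopping hello...');
  hello.stop(); // the loop exits once its current sleep finishes
}, 5000);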
P.S. There might be some hacks where you replace Promise with a constructor that forces a rejection, which would cause the await to throw and the containing async function to reject on the next cycle, but I assume you're not looking for that level of hack and invasion of the environment.
Since sleep() is declared as const, you can't hack in a replacement for it that would reject.
If the only thing you want to do with the worker function is to repeat some action every N milliseconds, I suggest using setInterval:
function worker(id) {
  return setInterval(() => { // loop actions go inside this anonymous function
    console.log(id);
    // anything else
  }, 2000); // every 2000 milliseconds
}
// make a loop active
const intervalHello = worker(`Hello`);
// stop the interval (call this later, e.g. from a user action;
// calling it right away would cancel the loop before it ever fires)
clearInterval(intervalHello);
I have an event listener in Node JS, as shown below.
client.on('collect', async reaction => {
  await external.run(reaction, ...);
});
The function external.run returns a Promise and takes around 5 seconds to complete. If this event is triggered again while the previous trigger is still executing (i.e. before those 5 seconds are up), it messes with my program.
Is there a way to wait for the previous execution to finish before running the new one?
Thanks.
Yes, what you want is called a Lock in other languages. JS doesn't provide that mechanism natively, but it's easy to write one yourself:
const createLock = () => {
  let queue = Promise.resolve();
  return task => queue = queue.then(() => task());
};
const externalLock = createLock();
client.on('collect', reaction => externalLock(async () => {
  await external.run(reaction, ...);
}));
For sure, this is only a contrived example; you might want to handle errors properly ... or just use one of the libraries out there that do this.
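For instance, here is a minimal sketch of one way to keep the queue alive when a task rejects (an assumption about what "handling errors properly" could look like, not the only option):

const createLock = () => {
  let queue = Promise.resolve();
  return task => {
    const run = queue.then(() => task());
    // swallow the error inside the chain so a failed task
    // doesn't poison every task queued after it
    queue = run.catch(() => {});
    return run; // callers still observe the rejection themselves
  };
};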
I've got a really weird issue whereby awaiting a Promise that has passed its resolve function to an event-emitter callback just exits the process without error.
const { EventEmitter } = require('events');

async function main() {
  console.log("entry");
  let ev = new EventEmitter();
  let task = new Promise(resolve => {
    ev.once("next", function () { resolve() });
    console.log("added listener");
  });
  await task;
  console.log("exit");
}

main()
  .then(() => console.log("exit"))
  .catch(console.log);

process.on("uncaughtException", (e) => console.log(e));
I'm expecting the process to hang when I run this, because "next" is clearly never emitted. But the output I get is:
entry
added listener
and then the Node.js process terminates gracefully.
I thought it was something to do with the garbage collector, but ev and task are clearly still in scope in main. So I'm really at a loss as to why the process exits entirely, without error.
Obviously I would eventually emit the event, but I've simplified my code to the above to reproduce the issue. I'm on Node v8.7.0. Is there something wrong with my code, or is this a Node bug?
This question is basically: how does Node decide whether to exit the event loop or go around again?
Basically, Node keeps a reference count of scheduled async requests: setTimeouts, network requests, etc. Each time one is scheduled, that count increases, and each time one finishes, the count decreases. If you arrive at the end of an event loop cycle and that reference count is zero, Node exits.
Simply creating a promise or an event emitter does not increase the reference count; creating these objects isn't actually an async operation. For example, this promise's state will always be pending, but the process exits right away:
const p = new Promise(resolve => {
  if (false) resolve()
})
p.then(console.log)
In the same vein this also exits after creating the emitter and registering a listener:
const ev = new EventEmitter()
ev.on("event", (e) => console.log("event:", e))
If you expect Node to wait on an event that is never scheduled, you may be working under the idea that Node can't know whether future events are possible; but it does know, because it keeps a count every time one is scheduled.
So consider this small alteration:
const ev = new EventEmitter()
ev.on("event", (e) => console.log("event:", e))
const timer = setTimeout(() => ev.emit("event", "fired!"), 1000)
// ref count is not zero, event loop will go again.
// after timer fires ref count goes back to zero and node exits
As a side note, you can remove the reference to the timer with timer.unref(). This version, unlike the previous example, will exit immediately:
const ev = new EventEmitter()
ev.on("event", (e) => console.log("event:", e))
const timer = setTimeout(() => ev.emit("event", "fired!"), 1000)
timer.unref()
There's a good talk about the event loop by Bert Belder here that clears up a lot of misconceptions: https://www.youtube.com/watch?v=PNa9OMajw9w
I spent several hours debugging why one of our scripts exits (without any errors) after one line of code in the middle of the main function. It was the line await connectToDatabase(config). And you know what?
I found that the difference between these two functions is CRUCIAL:
first:
async function connectToDatabase(config = {}) {
  if (!config.port) return;
  return new Promise(resolve => {
    resolve();
  })
}
second:
async function connectToDatabase(config = {}) {
  return new Promise(resolve => {
    if (!config.port) return;
    resolve();
  })
}
The second function sometimes (when config.port is empty) creates a never-resolved Promise; that leaves the event loop empty, and Node.js exits, thinking "nothing more to do here".
Check it yourself:
// index.js - start it as node index.js
(async function main() {
  console.log('STARTED')
  await connectToDatabase()
  console.log('CONNECTED')
  console.log('DOING SOMETHING ELSE')
})()
'CONNECTED' and 'DOING SOMETHING ELSE' are NOT printed if you use the second function, and are printed if you use the first.
As a general note, your code combines three similar but different mechanisms: async/await, promises, and event listeners. I'm not sure what you mean by "bombs out", but looking at the code, the result seems expected.
Your process exits because you call the promise's resolve when adding your event listener: the promise successfully resolves, and the program therefore exits. If you try to log task, it will give you undefined. Instead of logging "exit" in your then statement, log the result. task will be undefined, since the program does not wait to resolve its value; its code block has finished.
You can simplify your code to the following. As you can see, it resolves immediately, since you call the resolve function:
const { EventEmitter } = require('events');

let ev = new EventEmitter()
var p = new Promise((resolve) => {
  // note: this invokes resolve immediately instead of passing it as a callback
  ev.once("next", resolve("Added Event Listener"));
})
p
  .then(res => console.log(res))
  .catch(e => console.log(e))