See this example:
// CLOSEWORKER.JS:
self.postMessage('foo');
self.close();
self.postMessage('bar');
setTimeout(() => self.postMessage('baz'), 0);
// MAIN.JS:
const worker = new Worker('./worker.js');
worker.onmessage = ({data}) => console.log(data);
Even though I have invoked self.close(), the next line, self.postMessage('bar');, is still executed and logs 'bar'. I can't understand that, since the close method should kill the worker thread right after being called.
Also, see another example:
// TERMINATEWORKER.JS:
self.onmessage = ({data}) => console.log(data);
// MAIN.JS:
const worker = new Worker('./worker.js');
setTimeout(() => {
  worker.postMessage('foo');
  worker.terminate();
  worker.postMessage('bar');
  setTimeout(() => worker.postMessage('baz'), 0);
}, 1000);
When using the terminate method instead of the close method, the code executes as I expected: it just logs 'foo' and nothing else, which suggests that the thread is killed right after the terminate method is called.
WorkerGlobalScope.close(), when called, discards any tasks that have been scheduled and sets the closing flag of the WorkerGlobalScope, effectively preventing any new task from being queued.
However, it doesn't stop the execution of the current task.
So indeed, in your first example, it will run the current script until its end and still execute the self.postMessage('bar') operation, which your main thread will be able to handle.
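Annotated, your first worker script reads like this (the comments simply describe the behavior just outlined):
// closeWorker.js
self.postMessage('foo');                      // delivered: part of the current task
self.close();                                 // sets the closing flag; the current task keeps running
self.postMessage('bar');                      // still delivered: same task, still executing
setTimeout(() => self.postMessage('baz'), 0); // never fires: newly queued tasks are discarded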
Worker.terminate() on the other hand will also "[a]bort the script currently running in the worker."
While studying nodejs streams, I wanted to check what happens if the end listener is added after a filestream has been consumed and closed.
I am puzzled by what happens: no error event, no exception thrown, but the await never completes and the node process exits with return value 0:
const fs = require("node:fs");
const { setTimeout } = require("node:timers/promises");

run()
  .then(() => console.log("done"))
  .catch(console.error);

async function run() {
  const myStream = fs.createReadStream(__filename);
  myStream.on("data", () => console.log("stream data"));
  myStream.on("error", (error) => console.log("stream error", error));
  myStream.on("close", () => console.log("stream close"));
  console.log("data listener added");
  await setTimeout(2000);
  await new Promise((resolve) => {
    console.log("add end listener");
    myStream.on("end", () => {
      console.log("stream end");
      resolve();
    });
    console.log("end listener added");
  });
  console.log("run end");
}
Output (the process stops immediately after printing this; it does not hang):
$ node stream1.js
data listener added
stream data
stream close
add end listener
end listener added
I get that "stream end" is not reached; not pleasant, but I can work with it by first checking stream.closed.
As a consequence it seems logical that "run end" is not reached: I expect the run function to wait for the promise forever.
But why does the process not hang?
Can some nodejs expert explain what is happening?
Did I make an error with my promise handling?
I am using node v18.12.1 on Linux.
When you add the end listener to your stream, it has already ended, therefore the listener will never be invoked. resolve() is never executed and the await new Promise(...) never comes to an end. Therefore, run end is not logged either. And the promise returned by run never resolves so that done is also not logged.
On the other hand, after end listener added has been logged, the event loop of Node.js is empty: there is nothing more to expect, because the file has been read. In such a situation, the Node.js process exits.
Just attaching a listener to an event does not make Node.js wait. Perhaps the following analogy helps:
http.createServer().on("request", express());
registers a request handler but then exits immediately. By contrast, the .listen method makes the HTTP server wait for incoming requests, even if they are ignored because no request handler is registered at all:
http.createServer().listen(8080);
never exits.
Also, pending promises do not make Node.js wait. Compare
new Promise(function(resolve, reject) {
  setTimeout(resolve, 5000);
});
which exits after 5 seconds, to
new Promise(function(resolve, reject) {
});
which exits immediately. It is not the promise that makes Node.js wait, but the setTimeout.
You made no error, only a wrong expectation perhaps.
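As a follow-up to the workaround mentioned in the question, here is a minimal sketch that checks the stream's state before waiting (waitForEnd is a hypothetical helper; readableEnded and closed are standard Readable properties):
function waitForEnd(stream) {
  // If the stream has already ended or been closed, don't wait
  // for an "end" event that will never fire again.
  if (stream.readableEnded || stream.closed) return Promise.resolve();
  return new Promise((resolve, reject) => {
    stream.once("end", resolve);
    stream.once("error", reject);
  });
}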
var con = document.getElementById('con');
con.onclick = function () {
  Promise.resolve().then(function Promise1() {
    con.textContent = 0;
    // requestAnimationFrame(() => con.textContent = 0)
  });
};
<div id="con">this is con</div>
Why does this code not trigger rendering after performing microtasks?
setTimeout(function setTimeout1() {
  console.log('setTimeout1');
}, 0);

var channel = new MessageChannel();
channel.port1.onmessage = function onmessage1() {
  console.log('postMessage');
  Promise.resolve().then(function promise1() {
    console.log('promise1');
  });
};
channel.port2.postMessage(0);

setTimeout(function setTimeout2() {
  console.log('setTimeout2');
}, 0);

console.log('sync');
Why is postMessage executed before the timer?
Why does this code not trigger rendering after performing microtasks?
It does, otherwise you wouldn't see the text being updated...
Maybe you are just not able to tell from your dev tools?
This is probably because mouse events are now generally throttled to the screen-refresh rate, meaning that when the task dispatching the mouse event runs, you are already in a painting frame. Or it may be for another reason (because, to my knowledge, mousemove events are throttled this way, not click...).
So there, your Promise callback will get executed synchronously (with only the sixth step, "set currentTask to null", in between), before the update-the-rendering steps kick in, and all the dev tools will see is a normal painting frame, just like they were expecting.
So maybe the dev tools won't show anything particular here; but given the broadness of your claim, it's quite hard to pinpoint a particular reason, and this is just a theory of mine.
You can try to validate this theory by calling requestAnimationFrame from inside such an event and check if it did execute in the same event loop iteration:
onclick = (evt) => {
  console.clear();
  setTimeout(() => console.log('timeout'), 0);
  requestAnimationFrame(() => console.log('rAF'));
};
Click anywhere<br>
If "rAF" gets logged before "timeout", the click event got handled in a painting frame.
For me it does quite often in Chrome, and only once in a while in Firefox; but then again, I know Chrome's rAF is broken... so this theory is quite weak.
Why is postMessage executed before the timer?
Whether that statement holds true depends on the User-Agent (browser) and on when the code is executed, and so, of course, does the reason why it does.
In Chrome, they set a minimum of 1ms on the timeout value passed to setTimeout:
base::TimeDelta interval_milliseconds =
    std::max(base::TimeDelta::FromMilliseconds(1), interval);
The message task, on the other hand, has no timeout and will thus get queued immediately. So if no other task is to be processed, it will be the next one executed, long before the 1ms timeout resolves.
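You can get a rough feel for that clamp directly (a quick sketch; timers carry extra jitter, so treat the number as indicative only, and note that newer Chrome versions may have relaxed this clamp):
const start = performance.now();
setTimeout(() => {
  // With the clamp shown above, this typically reports >= 1ms,
  // never ~0ms (exact values vary with timer jitter).
  console.log(`setTimeout(0) fired after ${performance.now() - start}ms`);
}, 0);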
In Firefox, tasks scheduled by setTimeout are treated as low priority when they are scheduled during page load. That means that in Firefox, the message task would actually fire after the setTimeout one if both are scheduled after the page load:
function test() {
  setTimeout(function setTimeout1() {
    console.log('setTimeout1');
  }, 0);

  var channel = new MessageChannel();
  channel.port1.onmessage = function onmessage1() {
    console.log('postMessage');
    Promise.resolve().then(function promise1() {
      console.log('promise1');
    });
  };
  channel.port2.postMessage(0);

  setTimeout(function setTimeout2() {
    console.log('setTimeout2');
  }, 0);

  console.log('sync');
}

console.log('testing # page load');
test();

setTimeout(() => {
  console.log('testing after page load');
  test();
}, 1000);
/* results in Firefox:
testing # page load
sync
postMessage
promise1
setTimeout1
setTimeout2
testing after page load
sync
setTimeout1
setTimeout2
postMessage
promise1
*/
So there, in this particular case of a page load, they will treat the message task as more important than the timeout one, and when the task executor has to choose which task to execute next (as part of the first step of the event loop processing model), it will pick the message over the timeout.
But these are implementation quirks, and nothing in the specs formalizes this behavior.
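Incidentally, this unclamped behavior of message tasks is why MessageChannel is a common building block for setImmediate-style polyfills. A minimal sketch (postTask is a hypothetical name, and as said above, the ordering is an implementation quirk, not a spec guarantee):
// Schedule a macrotask without setTimeout's minimum-delay clamping.
function postTask(callback) {
  const { port1, port2 } = new MessageChannel();
  port1.onmessage = () => callback();
  port2.postMessage(null);
}

postTask(() => console.log('ran as a macrotask'));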
Assume I have a function that logs to the console after each input. The use case for this is, for example, appending items to a list and then setting a timer for them to be removed.
const getRandomInt = (min, max) => {
  min = Math.ceil(min);
  max = Math.floor(max);
  return Math.floor(Math.random() * (max - min + 1)) + min;
};

const button = document.querySelector('#click-me');

async function removeSomething(after) {
  setTimeout(() => {
    console.log('I fired after ' + after.toString());
  }, after);
}

button.addEventListener('click', () => {
  removeSomething(getRandomInt(500, 2500));
});
<button id="click-me">Click me.</button>
When I declare a function async, I assume it scopes out of the main event loop and is able to run in the background, allowing other things to run. But if that function is used within an event listener, and assuming I clicked that button 50 times a second, what would be the bottleneck: the CPU that can't handle the 50 click events, or my function?
I'm trying to understand the interaction between blocking & non-blocking code.
When I declare a function async, I assume it scopes out of the main event loop and is able to run in the background, allowing other things to run
No! There is no such thing as a "background". JS is single-threaded². Making the function async is a no-op here. It doesn't change anything.
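A quick sketch to illustrate (blockAnyway is a hypothetical example function): an async function's body runs synchronously on the one JS thread until its first await, so marking it async does not offload the work:
async function blockAnyway() {
  // Everything before the first await runs synchronously
  // on the one JS thread, async keyword or not.
  const end = Date.now() + 2000;
  while (Date.now() < end) { /* busy-wait for ~2s */ }
  console.log('done blocking');
}

console.log('before');
blockAnyway();        // blocks the thread for ~2 seconds anyway
console.log('after'); // only logs once the busy-wait has finished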
if that function is used within an event listener, and assuming I clicked that button 50 times a second, what would be the bottleneck: the CPU that can't handle the 50 click events, or my function?
Um, all the code you write runs on your CPU?
I'm trying to understand the interaction between blocking & non-blocking code.
All the code you write runs on the only thread JS has; therefore, all the code you write blocks that thread. If you start an asynchronous task, however, the engine will offload it (the task, not your handler) to another thread or to hardware that is out of your reach. Attaching a callback or awaiting a promise is therefore non-blocking: as execution of the current code ends, the engine can execute other code, and when the async task finishes, it runs the callback or continues the async function's execution (which itself is blocking too).
Example 1: A callback from a timer:
console.log(1);
setTimeout(() => console.log(3), 1000);
console.log(2);
// Chain of actions:
// Execution of the global scope:
console.log(1);
setTimeout(/*...*/, 1000); // Send timer task to some underlying mechanism, e.g. hardware timers
console.log(2);
// Execution of global scope ends
// Engine goes on with other stuff
// Timer finishes, callback gets called:
(() => console.log(3))()
// Callback ends
Example 2: Awaiting a fetch call:
(async function task() {
  const req = await fetch("http://example.com");
  const result = await req.json();
})();
// Chain of actions:
// Execution of the global scope
(async function task() {
  /* halted here */ await fetch("http://example.com"); // Offloaded to some IO thread
  // Implicitly return promise
})();
// Execution of global scope ends
// Engine goes on with other stuff
// Response arrived from the server via some IO thread, execution continues:
const req = /*the result*/
/*halted*/ await req.json();
// Execution ends
// Engine goes on with other stuff
// Response parsed, execution continues
const result = /*the result*/
return result; // Promise resolves
// Execution ends
² Actually, it isn't; it does, however, have an "observably synchronous execution model". "JS is single-threaded" is the (over)simplified version of that.
I have an event listener in Node JS, as shown below.
client.on('collect', async reaction => {
  await external.run(reaction, ...);
});
The function I call, external.run, returns a promise and takes around 5 seconds to complete. If this event is triggered again while the previous trigger is still executing (i.e. before the 5 seconds it takes), it messes with my program.
Is there a way to wait for the previous execution to finish before running the new one?
Thanks.
Yes, what you want is called a Lock in other languages ... JS doesn't provide that mechanism natively, but it's easy to write one yourself:
const createLock = () => {
  let queue = Promise.resolve();
  return task => queue = queue.then(() => task());
};

const externalLock = createLock();

client.on('collect', reaction => externalLock(async () => {
  await external.run(reaction, ...);
}));
For sure, this is only a contrived example; you might want to handle errors properly ... or just use one of the libraries out there that do this.
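For instance, here is a minimal sketch of what handling errors could look like (assuming logging and swallowing a failed task is acceptable); with the version above, a single rejected task() would leave queue rejected and stall every task queued after it:
const createLock = () => {
  let queue = Promise.resolve();
  return task => queue = queue
    .then(() => task())
    // Log and swallow errors so one failing task does not
    // stall everything queued after it.
    .catch(console.error);
};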
I've got a really weird issue whereby awaiting a Promise that has passed its resolve to an event-emitter callback just exits the process without error.
const { EventEmitter } = require('events');

async function main() {
  console.log("entry");
  let ev = new EventEmitter();
  let task = new Promise(resolve => {
    ev.once("next", function () { resolve(); });
    console.log("added listener");
  });
  await task;
  console.log("exit");
}

main()
  .then(() => console.log("exit"))
  .catch(console.log);

process.on("uncaughtException", (e) => console.log(e));
I'm expecting the process to hang when I run this, because clearly "next" is currently never emitted. But the output I get is:
entry
added listener
and then the nodejs process terminates gracefully.
I thought it was something to do with the Garbage Collector, but ev and task are clearly still in scope in main. So I'm really at a loss as to why the process exits entirely without error.
Obviously I would eventually emit the event, but I've simplified my code to the above to reproduce. I'm on node v8.7.0. Is there something wrong with my code or is this a node bug?
This question is basically: how does node decide whether to exit the event loop or go around again?
Basically, node keeps a reference count of scheduled async requests: setTimeouts, network requests, etc. Each time one is scheduled, that count increases, and each time one finishes, the count decreases. If you arrive at the end of an event loop cycle and that reference count is zero, node exits.
Simply creating a promise or event emitter does not increase the reference count — creating these objects isn't actually an async operation. For example, this promise's state will always be pending but the process exits right away:
const p = new Promise(resolve => {
  if (false) resolve();
});
p.then(console.log);
In the same vein this also exits after creating the emitter and registering a listener:
const ev = new EventEmitter()
ev.on("event", (e) => console.log("event:", e))
If you expect Node to wait on an event that is never scheduled, you may be working under the idea that Node doesn't know whether future events are possible; but it does know, because it keeps a count every time one is scheduled.
So consider this small alteration:
const ev = new EventEmitter()
ev.on("event", (e) => console.log("event:", e))
const timer = setTimeout(() => ev.emit("event", "fired!"), 1000)
// ref count is not zero, event loop will go again.
// after timer fires ref count goes back to zero and node exits
As a side note, you can remove the reference to the timer with timer.unref(). This, unlike the previous example, will exit immediately:
const ev = new EventEmitter()
ev.on("event", (e) => console.log("event:", e))
const timer = setTimeout(() => ev.emit("event", "fired!"), 1000)
timer.unref()
There's a good talk about the event loop by Bert Belder here that clears up a lot of misconceptions: https://www.youtube.com/watch?v=PNa9OMajw9w
I was debugging for several hours why one of our scripts exits (without any errors) after one line of code in the middle of the main function. It was the line await connectToDatabase(config). And you know what?
I found that the difference between these two functions is CRUCIAL:
first:
async function connectToDatabase(config = {}) {
  if (!config.port) return;
  return new Promise(resolve => {
    resolve();
  });
}
second:
async function connectToDatabase(config = {}) {
  return new Promise(resolve => {
    if (!config.port) return;
    resolve();
  });
}
The second function sometimes (when config.port is empty) creates a never-resolved promise; that leaves the event loop empty, and Node.js exits, thinking "nothing more to do here".
Check it yourself:
// index.js - start it as node index.js
(async function main() {
  console.log('STARTED');
  await connectToDatabase();
  console.log('CONNECTED');
  console.log('DOING SOMETHING ELSE');
})();
'CONNECTED' and 'DOING SOMETHING ELSE' are NOT printed if you use the second function, and are printed if you use the first.
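If you ever need to debug such a silent exit, one option is the standard beforeExit hook, which fires when the event loop has drained (a small sketch):
// Fires when Node is about to exit because the event loop is empty;
// it does NOT fire on an explicit process.exit() or on fatal errors.
process.on('beforeExit', (code) => {
  console.log('event loop drained, exiting with code', code);
});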
As a general note, your code combines three similar but different mechanisms: async/await, promises, and event listeners. I'm not sure what you mean by "bombs out", but looking at the code, the result seems expected.
Your process exits because you call the resolve function when adding your event listener: the promise successfully resolves, and the process therefore exits. If you try to log task, it will give you undefined. Instead of logging "exit" in your then statement, log the result; task will be undefined, since the program does not wait to resolve its value and its code block has finished.
You can simplify your code to the following. As you can see, it resolves immediately, since you call the resolve function:
const { EventEmitter } = require('events');

let ev = new EventEmitter();
var p = new Promise((resolve) => {
  // resolve(...) is invoked immediately here, rather than
  // being passed as a listener callback.
  ev.once("next", resolve("Added Event Listener"));
});

p
  .then(res => console.log(res))
  .catch(e => console.log(e));
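For contrast, a hypothetical variation where resolve is wrapped in a listener callback instead of being invoked on the spot; here the promise stays pending forever and, per the reference-count explanation above, the process exits silently:
const { EventEmitter } = require('events');

const ev = new EventEmitter();
const p = new Promise((resolve) => {
  // resolve is only called when "next" is emitted, which never happens.
  ev.once('next', () => resolve('Added Event Listener'));
});

p
  .then(res => console.log(res)) // never runs
  .catch(e => console.log(e));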