For example, in a SIGINT handler I need to wait until all child processes have completed. But there may be handlers on the child's 'close' event, which themselves may execute async actions like external notifications.
So I need to wait until:
- the child has closed,
- all of the child's 'close' handlers have completed, and
- all async actions initiated in those 'close' handlers have completed.
Below is a simplified version of my current code, which is only aware of the second checkpoint.
var child_process = require('child_process');
var events = require('events');
var timers = require('timers');

var childRunning = false; // "has child" flag (a counter in the actual app)

// starting the child
var child = child_process.spawn(process.cwd() + '/stub.js', {detached: true});
childRunning = true;
child.on('close', function() { childRunning = false; });

// example close handler with an async action inside
// (actually there is a bunch of such handlers)
child.on('close', function() {
    console.log('child close handler triggered');
    timers.setTimeout(function() {
        console.log('close handler async action completed');
    }, 2000);
});

process.on('SIGINT', function() {
    console.log('Received SIGINT');
    var closeApp = function() {
        console.log('readyToExit');
        process.exit();
    };
    if (!childRunning) closeApp();
    // in fact, I need here not this event, but
    // 'all close handlers have done their job'
    child.once('close', closeApp);
});
// actually this is a daemon app, so it does not stop by itself at all
In this example you will normally see the "close handler async action completed" message, but if you press Ctrl+C that message will be missed. So I need to somehow rewrite this so that the exit waits for it.
I'm trying to find a solution that keeps the close handlers as simple as possible.
I don't know how to name this case, so googling didn't help.
One possible solution is to use some extended EventEmitter which:
- can handle promises returned by event listeners,
- implements Q.all()-like behavior for the collected promises, and
- emits another event once all listeners have completed (including waiting for their promises).
I will try to find such a module in the npm registry, or implement one myself if nothing turns up.
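For illustration, a minimal sketch of such an emitter (hypothetical code, not an existing npm module), assuming listeners return promises for their async work:

const { EventEmitter } = require('events');

// Sketch of a promise-aware emitter.
class AsyncEventEmitter extends EventEmitter {
    async emitAsync(event, ...args) {
        // Call every listener and collect whatever it returns;
        // Promise.all also accepts plain (non-promise) values.
        const results = this.listeners(event).map((listener) => listener(...args));
        await Promise.all(results); // Q.all()-like behavior
        this.emit(event + ':done'); // 'all listeners completed' event
    }
}

// Usage sketch: the SIGINT handler would wait for 'close:done' instead of 'close'.
const bus = new AsyncEventEmitter();
bus.on('close', () => new Promise((resolve) => {
    setTimeout(() => { console.log('async action completed'); resolve(); }, 2000);
}));
bus.on('close:done', () => console.log('readyToExit'));
bus.emitAsync('close');

One caveat: the child's real 'close' event is emitted by Node itself, so it would have to be re-emitted through such a wrapper.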
Related
I recently dived into subscriptions of Subject/BehaviorSubject etc., and I am looking for the go-to approach when they are used in combination with Promises.
Consider the example code below:
firebase.user.subscribe((user: any | null) => {
    fs.readFile('path/to/file')
        .then((buf: Buffer) => {
            this.modifySomeData = buf;
        });
});
I subscribe to a Subject that triggers whenever the user logs in or out of the service. Whenever this happens, I read a file from disk. This readFile call could potentially take longer than the gap until the next "login/logout" event. Of course, I am in JS, in an asynchronous environment; my user code is not multithreaded, but still, the 2nd user event and its readFile could theoretically finish before the first readFile:
First user event fired
First readFile is executed
Second user event is fired
Second readFile is executed
Second readFile is resolved <---
First readFile is resolved <---
The order is mixed up. The silliest approach I could think of is to create a UUID before reading the file and to check inside the promise whether it is still the same; if it's not, I discard the data.
Is there a better solution?
If I have a process where older requests can be discarded, I often keep a variable in scope to track the latest request and compare against it, similar to your UUID idea:
let lastRead: Promise<Buffer> | null = null;

firebase.user.subscribe((user: any | null) => {
    const read = lastRead = fs.readFile('path/to/file');
    read.then((buf: Buffer) => {
        if (read !== lastRead)
            return;
        this.modifySomeData = buf;
    });
});
In this specific case, readFile also supports an abort signal. So you might also be able to abort the last request instead; you will still need to track it though.
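For illustration, a sketch of that abort-based variant (mine, reusing the question's firebase/this context), assuming fs.readFile from 'fs/promises', which accepts an AbortSignal:

const { readFile } = require('fs/promises');

let lastController = null;

firebase.user.subscribe((user) => {
    // Cancel the previous read, if one is still in flight.
    if (lastController) lastController.abort();
    const controller = lastController = new AbortController();

    readFile('path/to/file', { signal: controller.signal })
        .then((buf) => {
            this.modifySomeData = buf;
        })
        .catch((err) => {
            if (err.name !== 'AbortError') throw err; // aborted reads are expected
        });
});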
The first approach is to check whether your event generation logic can wait for event handling. For example, you can use a promise to wait for the handler, or generate another event, say doneReadFile, and only send the next event after that. Usually this is not possible in a generic (distributed) environment.
If the event generation does not care about how long handling takes, you can still use the above approach, but check for the intermediate doneReadFile event in the next event handler (login/logout). This can be achieved with some kind of polling or busy-wait/sleep; a promise-based variant is sketched below.
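For illustration, the same "wait for the previous handler" idea expressed as a promise chain instead of polling (my sketch, reusing the names from the question):

let previous = Promise.resolve();

firebase.user.subscribe((user) => {
    // Each handler waits for the previous one, so file reads
    // are applied strictly in event order.
    previous = previous.then(() =>
        fs.readFile('path/to/file').then((buf) => {
            this.modifySomeData = buf;
        })
    );
});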
I am confused about some terms. I am trying to find out how the event system of Node.js actually works, and in a lot of places I read that the event handlers are totally synchronous.
For me that seemed really strange, because one of the advantages of an event-driven approach should be that the main thread is not blocked by events. So I tried to come up with my own example, and what happened seems to be what I actually expected:
const fs = require('fs')
const util = require('util')
const readFile = util.promisify(fs.readFile)
const events = require('events')
const emitter = new events.EventEmitter()

emitter.on('fire', () => {
    readFile('bigFile.txt')
        .then(() => console.log('Done reading bigFile.txt'))
        .catch(error => console.log(error))
    console.log('Sync thing in handler')
})

emitter.on('fire', () => {
    console.log('Second handler')
})

console.log('First outside')
emitter.emit('fire')
console.log('Last outside')
Note that bigFile.txt is an actually large text file, processing it takes a few hundred milliseconds on my machine.
Here I first log out 'First outside' synchronously. Then I raise the event which starts the event handling process. The event handler does seem to be asynchronous, because even though we first log out the synchronous 'Sync thing in handler' text, we start using the thread pool in the background to return back with the result of reading the file later. After running the first handler, the second handler runs printing out its message, and finally we print out the last sync message, 'Last outside'.
So I started with trying to prove what some people say, which is that event handlers are by nature synchronous, and then I found them to be asynchronous. My best guess is that either people saying that the event system is synchronous mean something else, or that I have some conceptual misunderstanding. Please help me understand this issue!
The EventEmitter class is synchronous in regard to the emit function: event handlers are called synchronously from within the .emit() call, as you've demonstrated with the fire event you fired yourself.
In general, events that come from the operating system (file and network operations, timers etc) through node's event loop are fired asynchronously. You're not firing them yourself, some native API does fire them. When you listen to these events, you can be sure that they will occur not before the next tick.
The event handler does seem to be asynchronous, because even though we first log out the synchronous 'Sync thing in handler' text, we start using the thread pool in the background to return back with the result of reading the file later
Yes, you are calling the asynchronous function readFile (that will notify you later), but that doesn't make your event listener function or the .emit('fire') call asynchronous. Even "asynchronous functions" that start a background process will immediately (synchronously) return something - often nothing (undefined) or a promise.
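To make this concrete, here is a minimal demonstration (mine, not from the original answer) that .emit() runs its listeners before it returns:

const { EventEmitter } = require('events');
const emitter = new EventEmitter();

emitter.on('fire', () => console.log('listener ran'));

console.log('before emit');
emitter.emit('fire'); // the listener executes synchronously, inside this call
console.log('after emit');

// Output:
// before emit
// listener ran
// after emit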
TL;DR: is it possible to say "call that function once all awaits in the current context finish" in JS/node?
A very simplified example:
- a frontend-facing service creates a new user, then does another async task ([1]) and returns
- a user service validates & saves the new user, then fires an event ([2]) that can trigger some other logic (unrelated to the current request)
- goal: [1] should always finish before handlers for [2] start running
class Service1 {
    // say this is called from a controller / express route handler
    async createUser(userData: NewUserData): Promise<UserWithAdditionalData> {
        const user = await this.userService.validateAndSaveUser(userData);
        // [1] this is also async, and should finish before [2] handlers start running
        const userWithData = await this.getSomeAdditionalData(user);
        return userWithData;
    }
}

class UserService {
    async validateAndSaveUser(userData: NewUserData): Promise<User> {
        const validatedData = await this.validateNewUserData(userData);
        const user = await this.dbService.saveNew(validatedData);
        // [2] trigger an event/hook to be executed later
        this.eventBus.onUserCreated();
        return user;
    }
}
The question: is it possible to implement/replace [2] in a way to achieve the mentioned goal? Preferably in a better way than scheduling the event a few seconds in the future :D
In my current implementation I'm using an event bus library that calls the registered event consumers when an event is triggered. However, since it presumably just pushes the callbacks onto the event loop under the hood, they are likely to be executed before [1], because both simply get queued onto the event loop. For the same reason, the same happens if I wrap the handlers in setTimeout(..., 0) or setImmediate. What I want to achieve is that the event handlers fire only after all the awaits of the caller have finished.
// please let's avoid discussing if the pattern above is a good design -- like most things, it can be used both in good and bad ways ;)
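For illustration, a minimal sketch of the problem (hypothetical names, with a timer standing in for [1] doing real I/O), showing why deferring with setImmediate does not help:

const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function createUser() {
    await validateAndSave();
    await delay(10); // stands in for getSomeAdditionalData() doing real I/O  [1]
    console.log('[1] done');
}

async function validateAndSave() {
    // event bus analogue: defer the handler with setImmediate
    setImmediate(() => console.log('[2] handler ran'));
}

createUser();
// Prints '[2] handler ran' before '[1] done': the deferred handler
// does not wait for the caller's remaining awaits.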
For example, I have some click events, and while one selector's event handler is executing, all other selectors' events should be disabled until the current one has finished.
Say I clicked on selector-a and then repeatedly clicked on selector-b:
document.querySelector('.selector-a').addEventListener('click', function() {
    setTimeout(function() {
        console.log('event A');
    }, 500);
});

document.querySelector('.selector-b').addEventListener('click', function() {
    console.log('event B');
});
My expected result:
event A
event B
event B
event B
.....
But actual result was:
event B
event A
event B
event B
.....
Some of my friends told me to use a callback, but I don't think that works with addEventListener. Can anybody explain why I should use a callback, or whether there is another way to solve this?
Have a nice day everyone :)
The actual click event can't be prevented entirely, but the logic you're looking for can be achieved relatively easily - just set a boolean flag when you have something asynchronous that's running, and you can check it on every listener so that clicks can be ignored until the flag gets reset:
let somethingRunning = false;

document.querySelector('.selector-a').addEventListener('click', function() {
    if (somethingRunning) return;
    somethingRunning = true;
    setTimeout(function() {
        console.log('event A');
        somethingRunning = false;
    }, 500);
});

document.querySelector('.selector-b').addEventListener('click', function() {
    if (somethingRunning) return;
    console.log('event B');
});
Note that completely synchronous handlers, like with .selector-b, don't need to reassign somethingRunning, because such a handler will always run to the end before another handler can run.
In simple words, it is not possible directly, but with some logic, such as a flag, it can be done.
Suppose you have clicked 'selector a' and some operation is ongoing, and at the same time you click on 'selector b'. You can skip that execution via the flag, but you also need a mechanism to run the second click's operation once the first one has finished.
To prevent clicks entirely while an action is in progress, you can also add an overlay to the page so that the user cannot reach any actionable part of the UI.
The reason why you do not get your expected result is how JavaScript works. When you call setTimeout, it works asynchronously: the function you pass to setTimeout goes to the browser and, after 500 milliseconds, is placed on the queue. The event loop then checks whether the call stack is empty, and only when it is empty does it start executing code from the queue.
To get your expected result you can use a flag (for example isWaiting): the function from .selector-b is skipped until the function from .selector-a has executed, after which the function from .selector-b can run again.
Like in the answer above.
I am looking for an effective way to reply to a message sent to a child process. Currently, I am using the following code:
const path = require('path');
const { fork } = require('child_process');

const child = fork(path.join(__dirname, 'sub.js'));

async function run() {
    console.log('Requesting status....');
    child.send('status');
    const status = await awaitMessage(child);
    console.log(status);
}

function awaitMessage(childProcess) {
    return new Promise((resolve) => {
        childProcess.on('message', (m) => {
            resolve(m);
        });
    });
}
The problem of this code is that it creates a new event listener every single time the awaitMessage() function is called, which is prone to memory leaks. Is there an elegant way of receiving a reply from the child process?
This isn't really "prone to memory leaks" in that a leak is something that is supposed to get freed (per the rules of the garbage collector), but isn't. In this case, you've left a promise hooked up to an event handler that can still get called so the system simply can't know that you intend for it to be freed.
So, the system is retaining exactly what your code told it to retain. It's the consequence of how your code works that it is retaining every promise ever created in awaitMessage() and also firing a bunch of extra event handlers too. Because you keep the event listener, the garbage collector sees that the promise is still "reachable" by that listener and thus cannot and should not remove the promise even if there are no outside references to it (per the rules of the Javascript garbage collector).
If you're going to add an event listener inside a promise, then you have to remove that event listener when the promise resolves so that the promise can eventually be freed. A promise is not a magic object in JavaScript; it's just a regular object, so as long as it can be reached from a live event listener, it cannot be garbage collected.
In addition, this is subject to race conditions if you ever call awaitMessage() twice in a row, as both promises will then respond to the next message that comes. In general, this is just not a good design approach. If you want to wait for a message, you have to somehow tag your messages so you know which response is the one you're actually waiting for (avoiding the race condition), and you have to remove the event listener once you've received your message.
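For illustration, a sketch of that tagging idea (my own code, assuming sub.js echoes the id back alongside its reply):

let nextId = 0;

function request(childProcess, payload) {
    const id = nextId++;
    return new Promise((resolve) => {
        function handleMsg(m) {
            if (m && m.id === id) { // only the matching reply resolves this promise
                childProcess.removeListener('message', handleMsg);
                resolve(m.data);
            }
        }
        childProcess.on('message', handleMsg);
        childProcess.send({ id, payload });
    });
}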
To avoid the memory build-up because of the accumulation of listeners, you can do this:
function awaitMessage(childProcess) {
    return new Promise((resolve) => {
        function handleMsg(m) {
            childProcess.removeListener('message', handleMsg);
            resolve(m);
        }
        childProcess.on('message', handleMsg);
    });
}