Manually update web worker messages - javascript

I have a web worker in which I'm trying to run as little asynchronous code as possible.
I would like to have a while loop in my web worker while still allowing messages to be processed. Is there a way to manually update the event system in the browser? Or at least update the web worker's messages?
There appears to be something like this in Node.js (process._tickDomainCallback()), but so far I haven't found anything for the web.
Using a setTimeout is not an option. I would like either a solution or a definitive answer that this is simply not possible.
// worker.js
self.onmessage = function(e) {
    console.log("Receive Message");
};

while (true) {
    UpdateMessages(); // Receive and handle incoming messages
    // Do other stuff
}

Not sure what your context is here. I had an issue where the browser-side Web Audio API event loop was getting interrupted at inopportune moments by WebSocket traffic coming in from my Node.js server, so I added a Web Worker middle layer to free the browser event loop from those interruptions.
The Web Worker handled all network traffic and then populated a circular queue with the data. When the browser event loop deemed itself available, it plucked data from this shared queue (using a Transferable Object buffer) and so was never interrupted, since it was the one initiating calls to the Web Worker.
I feel your pain, but this approach kept the event loop happy. Since a Web Worker is effectively on its own thread and, from the browser's point of view, is already doing async work, there is no need to minimize async code inside it.
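For what it's worth, here is a minimal sketch of that layout. Everything in it (the WebSocket URL, the "drain" message, the Float32Array payload) is an assumption for illustration; the point is only that the worker buffers incoming traffic and the main thread pulls it over, on its own schedule, via a transferable buffer.

// worker.js -- sketch only; the URL and message shapes are placeholders
const socket = new WebSocket("wss://example.invalid/stream");
const backlog = [];                       // worker-side queue of incoming samples

socket.onmessage = (e) => {
    backlog.push(Number(e.data));         // buffer everything; the main thread is never interrupted
};

self.onmessage = (e) => {
    if (e.data === "drain") {
        // Pack the backlog into one buffer and transfer it (zero-copy handoff).
        const payload = new Float32Array(backlog.splice(0)).buffer;
        self.postMessage(payload, [payload]);
    }
};

// main.js -- the main thread initiates every exchange when it has time
const worker = new Worker("worker.js");
worker.onmessage = (e) => {
    const samples = new Float32Array(e.data); // ownership moved here, no copy
    // ...feed samples into the Web Audio graph...
};
function drainWhenConvenient() {
    worker.postMessage("drain");
}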

Related

NodeJS Returning data to client browser

I think we need some help here. Thanks in advance.
I have been doing programming in .NET for desktop applications and have used Timer objects to wait for a task to complete before the task results are shown in a data grid. Recently, we switched over to Node.js and find it pretty interesting. We could design a small application that executes some tasks using PowerShell scripts and returns the data to the client browser. However, I would have to run a Timer on the client browser (when someone clicks a button) to check whether the file that the Timer receives from the server contains "ENDOFDATA" or not. Once the Timer sees ENDOFDATA, it triggers another function to populate a DIV with the data that was received from the server.
Is this the right way to get the data from a server? We really don't want to block EventLoop. We run PowerShell scripts on NodeJS to collect users from Active Directory and then send the data back to the client browser. The PowerShell scripts are executed as a Job so EventLoop is not blocked.
Here is an example of the code at NodeJs:
In the below code, can we insert something that won't block the event loop but still sends a response once the task is completed? As you can see in the code below, we would like to send the ADUsers.CSV file to the client browser once GetUsers.PS1 has finished executing. Since GetUsers.PS1 takes about five minutes to complete, the event loop is blocked and the server can no longer accept any other requests.
app.post("/LoadDomUsers", (request, response) => {
//we check if the request is an AJAX one and if accepts JSON
if (request.xhr || request.accepts("json, html") === "json") {
var ThisAD = request.body.ThisAD
console.log(ThisAD);
ps.addCommand("./public/ps/GetUsers.PS1", [{
name: 'AllParaNow',
value: ScriptPara
}])
ps.addCommand(`$rc = gc ` + __dirname + "/public/TestData/AD/ADUsers.CSV");
ps.addCommand(`$rc`);
ps.invoke().then((output) => {
response.send({ message: output });
console.log(output);
});
}
});
Thank you.
The way you describe your problem isn't that clear. I had to read some of the comments on your initial question just to be sure I understood the issue. Honestly, you could just use one of the various CSV npm packages to read and write the Active Directory data with Node.js.
I/O is non-blocking in Node.js, so you're not actually blocking the event loop. Node.js can handle multiple I/O requests at once: the work is handed off (to the operating system or to libuv's thread pool),
execution continues on the main thread, and when each operation completes its callback is queued and then run with the resulting data. After you get the I/O data, you just send it back to the client through the response object. There should be no timers needed.
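To illustrate what that looks like without timers, here is a rough sketch using child_process and fs directly instead of the node-powershell wrapper from the question; the paths, the -File invocation and the error handling are placeholders, not your actual setup:

const { execFile } = require("child_process");
const fs = require("fs");

app.post("/LoadDomUsers", (request, response) => {
    // Spawn PowerShell as a separate process; the event loop keeps running
    // while the script takes its five minutes.
    execFile("powershell.exe", ["-File", "./public/ps/GetUsers.PS1"], (err) => {
        if (err) {
            return response.status(500).send({ error: err.message });
        }
        // Read the CSV the script produced, again without blocking.
        fs.readFile(__dirname + "/public/TestData/AD/ADUsers.CSV", "utf8", (readErr, csv) => {
            if (readErr) {
                return response.status(500).send({ error: readErr.message });
            }
            response.send({ message: csv }); // no timers, no polling
        });
    });
});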
So is the issue that once the PowerShell script runs, you have to wait for that initial script to complete before being able to handle pending requests? I'm still a bit unclear...

With Electron/Node.js, how do I implement simple sequential code asynchronously?

I am working on a project where my Electron App interacts with a physical device using serial commands, via serialport. The app sends a string to the device, the device executes the command (which can take ~30s) and then sends back a string to signify completion and results from that operation.
My goal is to automate a series of actions. For that, basically the following needs to be done asynchronously, so that the render thread doesn't get blocked:
Start a loop
Send a string to the device
Wait until a specific response comes back
Tell the render thread about the response, so it can update the UI
Afterwards, repeat with the next string.
Actually, multiple different commands need to be sent in each loop cycle, and between each one the app has to wait for a specific string from the device.
This is kind of related to my last question, What's the correct way to run a function asynchronously in Electron?. From that, I know I should use web workers to run something asynchronously. However, my plan turned out to involve more problems than I anticipated, and I wanted to ask what would be a good way to implement this, having the whole plan in mind and not just a certain aspect of it.
I am especially not sure how to make the worker work with serialport. The serial device it needs to interact with is a child of the render process, so sending commands will probably be done over web worker messages. But I have no idea on how to make the worker wait for a specific response from the device.
(Since this question is of a more general nature, I am unsure whether I should provide some code snippets. If this is too general, I can try to write some pseudocode to make my problem clearer.)
I would go for a promise-based approach like this:
let promiseChain = Promise.resolve();

const waitForEvent = function() {
    return new Promise(resolve => {
        // `once` registers a single-shot listener, so repeated waits don't pile up
        event.once("someEvent", eventData => {
            resolve(eventData);
        });
    });
};

while (someLoopCondition) {
    promiseChain = promiseChain
        .then(() => sendToSerialPort(someString)) // wrapped in a function so it runs when the chain reaches this step
        .then(waitForEvent)
        .then(result => {
            updateUI(result);
        });
}
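If async/await is available, the same sequence can be written more directly; this is just a sketch under the same assumptions (that sendToSerialPort returns a promise and that waitForEvent, updateUI and someLoopCondition exist as above). Note that the promise-chain version queues every iteration up front inside the while loop, whereas the async version only starts the next command after the previous response has arrived, which is usually what you want for a serial device.

async function runSequence() {
    while (someLoopCondition) {
        await sendToSerialPort(someString);  // assumed to return a promise
        const result = await waitForEvent(); // resolves on the next "someEvent"
        updateUI(result);
    }
}

runSequence().catch(err => console.error("Serial sequence failed:", err));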

Node JS Asynchronous Execution and Event Emission / Listener Models

I am new to Node JS and am trying to understand the concurrent / asynchronous execution models of Node.
So far, I do understand that whenever an asynchronous task is encountered in Node, that task runs in the background (e.g. an asynchronous setTimeout function will start timing) and control is then sent back to the other tasks that are on the call stack. Once the timer times out, the callback that was passed to the asynchronous task is pushed onto the callback queue, and once the call stack is empty, that callback gets executed. I used this visualization to understand the sequence of task execution. So far so good.
Q1. Now, I am not able to wrap my head around the paradigm of event listeners and event emitters and would appreciate it if someone could explain how event emitters and listeners fall into the picture of the call stack, event loop and callback queue.
Q2. I have the following code that reads data from the serial port of a raspberry pi.
const SerialPort = require('serialport');

const port = new SerialPort('/dev/ttyUSB0', { baudRate: 9600 }, (err) => {
    if (err) {
        console.log("Port Open Error: ", err);
    }
});

port.on('data', (data) => {
    console.log(data.toString());
});
As can be seen from the example, to read data from the serial port, an 'event-listener' has been employed. From what I understand, whenever data comes to the port, a 'data' event is emitted which is 'responded to' or rather listened to by the listener, which just prints the data onto the console.
When I run the above program, it runs continuously, with no break, printing the data onto the console whenever data arrives at the serial port. There is no continuously running while loop scanning the serial port, as you would expect in a synchronous program. So my question is, why is this program running continuously? It is obvious that the event emitter is running continuously, generating an event whenever data comes, and the event listener is also running continuously, printing the data whenever a 'data' event is emitted. But WHERE are these things actually running, that too, continuously? How are these things fitting into the whole picture of the call/execution stack, the event loop and the callback queue?
Thanks
Q1. Now, I am not able to wrap my head around the paradigm of event listeners and event emitters and would appreciate it if someone could explain how event emitters and listeners fall into the picture of the call stack, event loop and callback queue.
Event emitters on their own have nothing to do with the event loop. Event listeners are called synchronously whenever someone emits an event. When some code calls someEmitter.emit(...), all listeners are called synchronously from the time the .emit() occurred one after another. This is just plain old function calls. You can look in the eventEmitter code yourself to see a for loop that calls all the listeners one after another associated with a given event.
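A quick way to see that synchronous behavior (a generic sketch using only the built-in events module, unrelated to the serial port code above):

const EventEmitter = require('events');
const emitter = new EventEmitter();

emitter.on('greet', name => console.log('listener 1:', name));
emitter.on('greet', name => console.log('listener 2:', name));

console.log('before emit');
emitter.emit('greet', 'world'); // both listeners run right here, synchronously
console.log('after emit');

// Output order:
// before emit
// listener 1: world
// listener 2: world
// after emit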
Q2. I have the following code that reads data from the serial port of a raspberry pi.
The data event in your code is an asynchronous event. That means that it will be triggered one or more times at an unknown time in the future. Some lower level code will be registered for some sort of I/O event. If that code is native code, then it will insert a callback into the node.js event queue. When node.js is done running other code, it will grab the next event from the event queue. When it gets to the event associated with data being available on the serial port, it will call port.emit(...) and that will synchronously trigger each of the listeners for the data event to be called.
When I run the above program, it runs continuously, with no break, printing the data onto the console whenever data arrives at the serial port. There is no continuously running while loop scanning the serial port, as you would expect in a synchronous program. So my question is, why is this program running continuously?
This is the event-driven nature of node.js in a nutshell. You register an interest in certain events. Lower level code sees that incoming data has arrived and triggers those events, thus calling your listeners.
This is how the Javascript interpreter manages the event loop. Run current piece of Javascript until it's done. Check to see if any more events in the event loop. If so, grab next event and run it. If not, wait until there is an event in the event queue and then run it.
It is obvious that the event emitter is running continuously, generating an event whenever data comes, and the event listener is also running continuously, printing the data whenever a 'data' event is emitted. But WHERE are these things actually running, that too, continuously?
The event emitter itself is not running continuously. It's just a notification scheme (essentially a publish/subscribe model) where one party can register an interest in certain events with .on() and another party can trigger certain events with .emit(). It allows very loose coupling through a generic interface. Nothing is running continuously in the emitter system. It's just a notification scheme. Someone triggers an event with .emit() and it looks in its data structures to see who has registered an interest in that event and calls them. It knows nothing about the event or the data itself or how it was triggered. The emitters job is just to deliver notifications to those who expressed an interest.
We've described so far how the Javascript side of things works. It runs the event loop as described above. At a lower level, there is serial port code that interfaces directly with the serial port and this is likely some native code. If the OS supports a native asynchronous interface for the serial port, then the native code would use that and tell the OS to call it when there's data waiting on the serial port. If there is not a native asynchronous interface for the serial port data in the OS, then there's probably a native thread in the native code that interfaces with the serial port that handles getting data from the port, either polling for it or using some other mechanism built into the hardware to tell you when data is available. The exact details of how that works would be built into the serial port module you're using.
How are these things fitting into the whole picture of the call/execution stack, the event loop and the callback queue?
The call/execution stack comes into play the moment an event in the Javascript event queue is found by the interpreter and it starts to execute it. Executing that event will always start with a Javascript callback. The interpreter will call that callback (putting a return address on the call/execution stack). That callback will run until it returns. When it returns, the call/execution stack will be empty. The interpreter will then check to see if there's another event waiting in the event queue. If so, it will run that one.
FYI, if you want to examine the code for the serial port module it appears you are using, it's all there on GitHub. It does appear to have a number of native code files. You can see a file called poller.cpp there, and it appears to do cooperative polling using the node.js add-on programming interface offered by libuv. For example, it creates a uv_poll_t, which is a poll handle described in the libuv docs. Here's an excerpt from that doc:
Poll handles are used to watch file descriptors for readability, writability and disconnection similar to the purpose of poll(2).
The purpose of poll handles is to enable integrating external libraries that rely on the event loop to signal it about the socket status changes, like c-ares or libssh2. Using uv_poll_t for any other purpose is not recommended; uv_tcp_t, uv_udp_t, etc. provide an implementation that is faster and more scalable than what can be achieved with uv_poll_t, especially on Windows.
It is possible that poll handles occasionally signal that a file descriptor is readable or writable even when it isn’t. The user should therefore always be prepared to handle EAGAIN or equivalent when it attempts to read from or write to the fd.

Where is the node.js event queue?

I have seen similar questions on Stack Overflow, but none of them fully dive down into the question that I have. I am familiar with event queues: how they work as well as how to implement them. I am new to Node.js and I am trying to wrap my head around how Node.js does it.
In a c++ application you would do something along the lines of:
#include <vector>
// (Handler, Event and BlockingQueue assumed to be defined elsewhere)

int main() {
    std::vector<Handler*> handlers;
    BlockingQueue queue;

    // Add all the handlers, call constructors and do other such initialization,
    // then run the event loop
    while (true) {
        Event e = queue.pop();
        for (std::vector<Handler*>::iterator it = handlers.begin(); it != handlers.end(); ++it) {
            (*it)->handle(e);
        }
    }
}
Now in the case of node.js I might have a main file called main.js that looks like.
var http = require("http");

function main() {
    // Console will print the message
    console.log('Server running at http://127.0.0.1:8080/');

    var server = http.createServer(function (request, response) {
        // Send the HTTP header
        // HTTP Status: 200 : OK
        // Content Type: text/plain
        response.writeHead(200, {'Content-Type': 'text/plain'});

        // Send the response body as "Hello World"
        response.end('Hello World\n');
    });

    server.listen(8080);
    console.log('Main completed');
}

main();
I understand that server.listen() is attaching a handler to the event queue and that we are adding the callback, similar to the C++ example.
My question is. Where is the event queue? Is it in the javascript somewhere or is it built into the interpreter? Also how does the main function get called relative to the main event loop?
Where is the event queue? Is it in the javascript somewhere or is it built into the interpreter?
The event queue is built into the operating environment that hosts the Javascript interpreter. It isn't fundamental to Javascript itself so it's not part of the actual JS runtime. One interesting indicator of this is that setTimeout() is not actually part of ECMAScript, but rather something made available to the Javascript environment by the host.
The system surrounding the Javascript implementation in node.js keeps track of externally triggered events (timers, networking results, etc...) and when Javascript is not busy executing something and an external event occurs, it then triggers an associated Javascript callback. If Javascript is busy executing something, then it queues that event so that as soon as Javascript is no longer busy, it can then trigger the next event in the queue.
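You can see the queueing at work with a tiny experiment (nothing here is specific to the question's code; it is just an illustration):

setTimeout(() => console.log('timer callback'), 0);

// Deliberately hog the thread for ~200 ms; the timer event can only wait in the queue.
const start = Date.now();
while (Date.now() - start < 200) { /* spin */ }

console.log('synchronous work done');

// Output order:
// synchronous work done
// timer callback   <-- runs only once the call stack is empty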
node.js itself uses libuv for the event loop. You can read more about that here. It provides a multi-platform way of doing evented, async I/O that was developed for node.js, but is also being used by some other projects.
Here's a related answer that might also help:
Run Arbitrary Code While Waiting For Callback in Node?
Also how does the main function get called relative to the main event loop?
When node.js starts up, it is given an initial script file to execute. It loads that script file into memory, parses the Javascript in it and executes it. In your particular example, that will cause the function main to get parsed and then will cause the execution of main() which will run that function.
Loading, parsing and executing the script file passed to node when it starts up is the task given to node.js. It isn't really related to the event queue at all. In some node.js applications, it runs that initial script and then exits (done with its work). In other node.js applications, the initial script starts timers or servers or something like that which will receive events in the future. When that is the case, node.js runs the initial script to completion, but because there are now lasting objects that were created and are listening for events (in your case, a server), nodejs does not shut down the app. It leaves it running so that it can receive these future events when they occur.
One missing piece here is that things like the server object you created allow you to register a callback that will be called one or more times in the future when some particular events occur. This behavior is not built into Javascript. Instead, the code that implements these objects or the TCP functions that they use must maintain a list of callbacks that are registered and when those events occur, it must execute code so that the appropriate callbacks are called and passed the appropriate data. In the case of http.createServer(), it is a mix of Javascript and native code in the nodejs http library that make that work.
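To make that concrete, here is a stripped-down sketch of the bookkeeping such an object does (purely illustrative; the real http/net modules are far more involved and partly native code):

// A toy object that keeps a list of registered callbacks and calls them
// whenever lower-level code reports that an event happened.
function makeToyServer() {
    const listeners = { request: [] };
    return {
        on(event, cb) {
            listeners[event].push(cb);   // remember who is interested
        },
        _dispatch(event, data) {
            // In real Node.js this would be driven by native code / libuv
            for (const cb of listeners[event]) {
                cb(data);                // plain synchronous function calls
            }
        }
    };
}

const server = makeToyServer();
server.on('request', req => console.log('got request for', req.url));
server._dispatch('request', { url: '/hello' }); // simulate the lower layer firing the event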

Node.js EventEmitter events not sharing event loop

Perhaps the underlying issue is how the node-kafka module I am using has implemented things, but perhaps not, so here we go...
Using the node-kafka library, I am facing an issue with subscribing to consumer.on('message') events. The library is using the standard events module, so I think this question might be generic enough.
My actual code structure is large and complicated, so here is a pseudo-example of the basic layout to highlight my problem. (Note: This code snippet is untested so I might have errors here, but the syntax is not in question here anyway)
var messageCount = 0;
var queryCount = 0;

// Getting messages via some event emitter
consumer.on('message', function(message) {
    messageCount++;
    console.log('Message #' + messageCount);

    // Making a database call for each message
    mysql.query('SELECT "test" AS testQuery', function(err, rows, fields) {
        queryCount++;
        console.log('Query #' + queryCount);
    });
});
What I am seeing here is that when I start my server, there are 100,000 or so backlogged messages that Kafka wants to give me, and it does so through the event emitter. So I start to get messages; getting and logging all of them takes about 15 seconds.
This is what I would expect to see for an output assuming the mysql query is reasonably fast:
Message #1
Message #2
Message #3
...
Message #500
Query #1
Message #501
Message #502
Query #2
... and so on in some intermingled fashion
I would expect this because my first mysql result should be ready very quickly and I would expect the result(s) to take their turn in the event loop to have the response processed. What I am actually getting is:
Message #1
Message #2
...
Message #100000
Query #1
Query #2
...
Query #100000
I am getting every single message before a single MySQL response is able to be processed. So my question is: why? Why am I not able to get a single database result until all the message events are complete?
Another note: I set a breakpoint at .emit('message') in node-kafka and at mysql.query() in my code, and I hit them alternately, turn by turn. So it appears that all 100,000 emits are not stacking up ahead of time before reaching my event subscriber. So there went my first hypothesis about the problem.
Ideas and knowledge would be very appreciated :)
The node-kafka driver uses quite a liberal buffer size (1M), which means that it will get as many messages from Kafka as will fit in the buffer. If the server is backlogged, then depending on the message size, this may mean (tens of) thousands of messages coming in with one request.
Because EventEmitter is synchronous (it doesn't use the Node event loop), this means that the driver will emit (tens of) thousands of events to its listeners, and since it's synchronous, it won't yield to the Node event loop until all messages have been delivered.
I don't think you can work around the flood of event deliveries, but I don't think that specifically the event delivery is problematic. The more likely problem is starting an asynchronous operation (in this case a MySQL query) for each event, which may flood the database with queries.
A possible workaround would be to use a queue instead of performing the queries directly from the event handlers. For instance, with async.queue you can limit the number of concurrent (asynchronous) tasks. The "worker" part of the queue would perform the MySQL query, and in the event handlers you'd merely push the message onto the queue.
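A rough sketch of that workaround with async.queue, reusing the counters and objects from the snippet in the question (the concurrency limit of 10 is an arbitrary placeholder):

const async = require('async');

// Worker function: performs the MySQL query for one message.
// At most 10 queries run at the same time.
const queryQueue = async.queue(function (message, done) {
    mysql.query('SELECT "test" AS testQuery', function (err, rows, fields) {
        queryCount++;
        console.log('Query #' + queryCount);
        done(err);
    });
}, 10);

// Event handler: just push the message onto the queue and return immediately.
consumer.on('message', function (message) {
    messageCount++;
    console.log('Message #' + messageCount);
    queryQueue.push(message);
});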
